Friday, December 31, 2010

A paradigm mismatch?

Here are my comments on an oft-repeated topic of conversation about 6th semester projects in the Masters of Computer Applications (MCA) course. A lot of people are considering programming web-based applications, aka websites, for their last semester projects in MCA. Since this involves a whole stack of technologies, one can observe that the complexity in turn helps a person, or more importantly his mentor, to hide a poor design behind it.
Coming back to the topic, a lot of my friends and people from various coaching centers certainly have preconceived notions regarding which technology to use as their server-side programming language. One can observe the rumblings of a faint but still existent religious fervor of 'my language is better than yours'.
To start off, students consider and then reject Java because it is too complex (it is a big pain, but it has its benefits) and too heavyweight for their web applications. This fear can be allayed by the fact that one can choose from a wide variety of frameworks (MVC-style ones like Struts and Spring MVC, or component-based ones like Tapestry and JSF) and technologies (a plethora of middleware) within the Java ecosystem itself. PHP is touted as a compelling alternative, but for me the scripting approach only works for demonstrative purposes; anything larger needs either enormous patience or frameworks, which are still immature on this platform (I am really not impressed by the likes of Zend, to name one). Microsoft .NET deserves a special mention because everything is already baked in (IDE and app server ready, for instance) and it has a meaningful architecture (through code-behind). However, the very strength of this framework becomes its nemesis, as students are not encouraged to question or hack into the innards of any tool they use. An upcoming framework like ASP.NET MVC, too, looks like a copied and chaotic exercise to deliver something that is arriving too late.
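To make the MVC part concrete, here is a minimal sketch of a Spring MVC controller; the class, mapping, and view names are my own illustration, not from any project discussed here:

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;

    // The controller is the "C" in MVC: it handles the request,
    // prepares model data, and names a view to render it.
    @Controller
    public class ProjectController {

        @RequestMapping("/projects")
        public String listProjects(Model model) {
            model.addAttribute("projects",
                    java.util.Arrays.asList("Library system", "ERP module"));
            return "projectList"; // resolved to a template (e.g. a JSP) by the ViewResolver
        }
    }

The point of such a framework is exactly this separation: request handling, data, and markup each live in their own layer instead of being tangled up in one script.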
My personal take on the whole situation is that no single technology rules the roost. If you ask me, it is Java for building middleware-heavy (business-intensive) and expandable websites (ones where you can fit in a lot of components and features). PHP is really cute, as you get the time to play around with CSS- and JavaScript-based functionality around your site (after all, data is king and your site functions as its glorified frontend). If I require a lot of versatility as well as ease of development, I'll go for .NET, as it really hits that sweet spot, provided you are willing to work within the confines of the framework.
I had this discussion with myself a few days back, in a moment of clarity, while contemplating my own and others' 6th semester projects for my MCA course. I specifically ignored Rails (and the frameworks inspired by it) because one generally treads a safe line in our course: researching and hacking on a shiny new technology and submitting a brilliant but undocumented and incomplete project would only lead towards its cancellation.
BTW, I had time to write all this on New Year's Eve because tomorrow is my 5th semester ERP examination. So good luck to me, and a very happy new year 2011 AD to my readers!

Monday, December 6, 2010

The stuff behind the fluff – Is Green IT an exercise in vain?

Cloud computing is generally thought to be efficient, but this study changes it all. According to a recent UK study reported by DW World, cloud computing has a higher environmental impact than was previously thought. The sheer number of data centers, together with the number of consumers, would collectively create this problem, which would be further intensified by the use of richer digital media. What surprised me about this study was the finding that it would be casual home users who lead the excessive usage, through rich media sharing and duplication.
In the past, Information Storage and Management focused solely on enterprise-specific needs. For handling binary media formats, technologies like CAS (Content Addressable Storage) are already in place. But according to this study, there will be a demand of 3200 megabytes per person per day in just a few years' time. Given the pace of Internet proliferation and advancement, this is not a vague guess but a worrisome one.
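As a rough illustration of the content-addressing idea behind CAS (a toy in-memory sketch; the class and method names are mine, not any vendor's API), each object is stored under a hash of its own bytes, so identical content occupies a single slot no matter how many people save it:

    import java.security.MessageDigest;
    import java.util.HashMap;
    import java.util.Map;

    // Minimal in-memory content-addressable store: the "address" of an
    // object is the hex SHA-256 digest of its bytes.
    public class ContentStore {
        private final Map<String, byte[]> blobs = new HashMap<>();

        public String put(byte[] data) throws Exception {
            String address = hash(data);
            blobs.putIfAbsent(address, data); // identical content is stored only once
            return address;
        }

        public byte[] get(String address) {
            return blobs.get(address);
        }

        private static String hash(byte[] data) throws Exception {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        }
    }

The same video uploaded from two different locations yields the same signature and hence the same address, which is exactly the deduplication property I come back to below.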
The article pointed out the following courses of action:
Creation of better/faster computers – This is not really feasible, as stated, because of the lag between demand and Moore's Law.
Green data centers – Here too, politics plays an important role; the BRIC nations in particular have a cavalier attitude towards implementing clean energy sources.

What was missing here, in my opinion, was that the research addressed the solution of the problem rather than its cause. Instead of addressing the data storage issues alone, we need to rigorously guard against redundancy in the data itself. The Semantic Web is one such approach, since machine-readable links let applications reference data instead of copying it. Other approaches, such as tagging digital media and keeping a single copy of identical media (different locations with the same signature, essentially the content-addressing sketched above), could also solve this problem. And instead of consolidation, work on distributed computing should be promoted as well.
Distributed computing can easily be done using map/reduce algorithms, or through application development platforms like Apache Hadoop. The data distribution issue can be addressed using peer-to-peer exchange, as in BitTorrent. These approaches would not create problems of their own, given the ever-decreasing cost of online bandwidth and its ever-improving performance.
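For readers who have not met map/reduce before, here is a toy, single-machine rendition of the idea in plain Java (Hadoop runs the same two phases across a cluster; everything below is my own illustrative code, not the Hadoop API):

    import java.util.*;

    public class MiniMapReduce {
        public static void main(String[] args) {
            List<String> docs = Arrays.asList("the data is the king",
                                              "data centers store data");

            // Map phase: emit a (word, 1) pair for every word in every document.
            List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
            for (String doc : docs)
                for (String word : doc.split("\\s+"))
                    pairs.add(new AbstractMap.SimpleEntry<>(word, 1));

            // Shuffle phase: group the emitted values by key.
            Map<String, List<Integer>> grouped = new TreeMap<>();
            for (Map.Entry<String, Integer> p : pairs)
                grouped.computeIfAbsent(p.getKey(), k -> new ArrayList<>())
                       .add(p.getValue());

            // Reduce phase: sum the counts for each word.
            for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
                int sum = 0;
                for (int v : e.getValue()) sum += v;
                System.out.println(e.getKey() + "\t" + sum);
            }
        }
    }

The charm is that the map and reduce steps are independent per document and per key, so a framework can scatter them across thousands of machines without changing the program's logic.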