
Monday, September 4, 2017

Addressing a uniform language based app stack

In the past I've been a proponent of polyglot development - wherein one project contains a host of different languages - such as JavaScript on the frontend backed by a mix of server-side languages like Java/.NET making up the application and business layers, which in turn interact with a domain-driven tool like SAP ABAP/ERP that could, theoretically, be coded in an altogether different language.
Having said that, developing and maintaining such an application in the real world quickly escalates into a proverbial fireball that entangles the different parts of the application. Let me explain: for instance, one part of the team writes a web service in Java while another writes a different service in Python. This leaves no scope for interoperability in the future (if one needs to mix and match, say, two functions of these two services). I can also cite Ruby on Rails as an example, where fairly immaculate Ruby code gets interspersed with bloated HTML, which leaves a bad taste when it comes to maintainability.

A wonderful counter-example is Node.js, which has shown how a single consistent language alleviates the issues of managing different languages in one application, producing fewer bugs and reducing overall development coordination and time.

I've found Scala.js to be an equally good tool for Play/Scala-based web applications and services that require complex frontend development as well. Hopefully, I'll be able to write some applications to try and assess its impact over conventional apps.

Thursday, February 19, 2015

Database Migrations for Java Application

One of the tools that I've used in the recent days that merits a blogpost is Flyway.
Flyway is a database migration tool that provides database state migration in applications that do not have a built in migration.
Rails and (some) Rails-based frameworks already have this built in, but it is a pain to implement in, say, a Java application. Flyway alleviates the need to build it yourself, as it has native support for many popular databases (including cloud-based ones) as well as a Java-based API with support for build tools like Ant, Maven and Gradle.
It has only six basic commands: Migrate, Clean, Info, Validate, Baseline and Repair.
Unlike Rails, you are not restricted to timestamp values (which are a pain if you rename a file and leave out a digit, making ordering difficult to trace). Here, you name a migration file as V<version>__<description>.sql, and the numeric version prefix determines the order.
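A small illustrative sketch (my own code, not Flyway's) of why the numeric prefix matters: a plain lexicographic sort would misorder V10 before V2, while parsing the version gives the order migrations are applied in.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class MigrationOrder {
    // Extracts the numeric version between 'V' and '__',
    // e.g. "V10__add_orders.sql" -> 10
    static int versionOf(String file) {
        return Integer.parseInt(file.substring(1, file.indexOf("__")));
    }

    public static void main(String[] args) {
        List<String> files = new ArrayList<>(Arrays.asList(
                "V10__add_orders.sql", "V2__add_users.sql", "V1__init.sql"));
        // Sorting by parsed version puts V10 after V2, unlike a string sort.
        files.sort(Comparator.comparingInt(MigrationOrder::versionOf));
        System.out.println(files); // [V1__init.sql, V2__add_users.sql, V10__add_orders.sql]
    }
}
```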

While a Rails developer is bound to miss the up and down sections of a migration file (Rails-like backward migration is missing in the current version), the rest of the tool works as desired and is indeed handy.

The project has significant traction and is used by many teams, and it does not require much overhead or configuration to work with Hibernate/Spring-based applications either. The project is hosted on GitHub, so you can easily head over there to check out its source as well.

Friday, November 21, 2014

Custom Java Metrics in Eclipse

Recently, a relative inquired if I could provide a tool to calculate custom metrics for Java code, so I began searching and found an old but open-source Eclipse plugin that does exactly that - calculate metrics without any fluff.
I took its source code and put it up on GitHub.


I had to make minor changes to get it working quickly: using only the metrics table perspective and hiding the views not in use.

Here, a metric needs two essential things:
  1. The name of the metric in the plugin.xml file (which the plugin reads in its list of metrics to calculate)
  2. The name of the calculator class, which contains the logic used to calculate that metric's value

The required custom metrics were outlined in an academic paper discussing the quality of object-oriented design, so the metrics created are quite simple in nature.

For instance, the metric 'Inheritance Ratio' took the form of the following tag in plugin.xml:

    <metric
        name="Inheritance Ratio"
        id="INHR"
        level="type">


and also mapped its logic via the calculator tag:
    <calculator
        name="Inheritance Ratio"
        calculatorClass="net.sourceforge.metrics.calculators.InheritanceRatio"
        level="type">



Next, one simply needs to create this calculator class and override the calculate method. The other step is to create constants like INHR (in this example) through which the metric and its calculator communicate at runtime.
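Plugin API details aside, the core of such a calculator is a small computation. Here is a hedged, plain-Java sketch: the class name mirrors the calculatorClass above, but the method signature and the ratio's definition are my own simplification, not the plugin's actual API or the paper's exact formula.

```java
public class InheritanceRatio {
    // Hypothetical definition: inherited methods divided by total methods
    // of a type. The real calculator would obtain these counts from the
    // plugin's AST/metric source objects rather than plain parameters.
    static double calculate(int inheritedMethods, int totalMethods) {
        if (totalMethods == 0) return 0.0; // guard against empty types
        return (double) inheritedMethods / totalMethods;
    }

    public static void main(String[] args) {
        System.out.println(InheritanceRatio.calculate(3, 12)); // 0.25
    }
}
```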

So, by piggybacking on this application, I was easily able to provide the metrics tool without worrying about the UI and source-code management. I plan to finish this quickly and then improve it as per suggestions, which are always welcome!
https://github.com/SumitBisht/Metrics

Wednesday, October 29, 2014

Using Hybris for e-commerce

Recently, I had a chance to work on Hybris, a private company (now acquired by SAP) offering an application framework of the same name: a customizable platform for handling B2B and B2C e-commerce needs. I was initially skeptical of the framework, since it is not immediately clear how to create an online store the way you would with Magento or Spree, but it turns out to be much more than a CMS that creates a shopping front-end for the user.
Technically, Hybris is a framework built on top of Spring that runs on a customized Tomcat or SpringSource dm Server and uses Maven for build automation. While this architecture is carefully thought out, the problem lies with its openness: the community is quite limited, and as learners we are restricted to the Hybris wiki and forums. Given the evolving nature of the product (new versions come frequently and fast), opening it up would make a lot of sense. As SAP recently acquired Hybris, it may provide integration with its tools and databases in the future, as well as forums and community support like other mainstream software.

Coming back to uniqueness in this framework, here are a few notable observations:
Open-Close model: While the software is not open source, the way of creating an application on top of the framework comes with the least surprises, and there is a lot of flexibility.
User oriented: Cockpits are specialized interfaces that power users/admins of the software can use to quickly access the information present in their application.
Thought out: Like a mature product, the entire gamut of e-commerce functionality is present, and one can not only provide an HTTP-based web solution but also plug into an existing application.
Scalable: As a JVM-based approach is followed, it is quite scalable - though it requires upfront resource allocation during development, it pays back by ensuring the scalability of the website.
Performance: The customization comes at a cost in computing resources as well as slow/complex process workflows. Compared against other e-commerce solutions, the overall performance of the application is worse.
XML based: The configuration is mainly in XML, and parts like the UI are built on a custom framework, which is pretty restrictive. These might not be real issues, but they stuck out like a sore thumb to me, which brings us to the next point.
Old standard architecture: Application development on this framework seems archaic, as the choice of technologies used and provided seems a decade old - and regardless of what is mentioned on the website, integration with popular technologies is quite hard.

As I continue my exploration of this framework while working on real-world projects, what I've gained so far will help me compare it against both open-source and corporate alternatives, and to contrast the differences more accurately.


Thursday, June 5, 2014

Book Review: Client Server Web Apps with Javascript and Java



The book, 'Client-Server Web Apps with JavaScript and Java' by Casimir Saternos, aptly delivers on its punchline: 'Rich, Scalable, and RESTful'. These words not only capture the essence of the book, but also describe the adoption of JavaScript-based frameworks and technologies on the user-interface/frontend of today's enterprise Java applications.

The book introduces the reader to the term 'client-server' to signify that the client side of an application is as important as its server side, and that the programming effort required on the client is as substantial as what is managed on the server. This is typical of how topics are introduced in this book - completely from scratch - which makes revisiting familiar concepts like JavaScript refreshing.
Even for an experienced developer, there are a lot of things to watch out for, like in chapter 2, where excerpts from 'JavaScript: The Good Parts' by Douglas Crockford are cited for concise learning. Similarly, the next chapter, detailing REST and JSON, explains the absence of URLs/syndication in JSON and the related debate surrounding HATEOAS (Hypermedia As The Engine Of Application State). The JVM-specific languages are mentioned by highlighting the build tools related to them, which can be confusing if the reader is not familiar with build, versioning and test tools.
The next part of the book, starting with Chapter 5, deals with the client side of the web application and quickly introduces the reader to finer points like asset pipelining; the following chapter introduces different JVM-based servers to run and deploy the web application. Lightweight Java servers and developer productivity tools are listed in the couple of chapters that follow, which I think do not add much value to the overall premise of the book. The next chapter then covers the design and principles of RESTful web services and demonstrates one created in Jersey, which is then followed up with jQuery.

However, Chapter 10 covers Angular and Sinatra (a mini web framework in Ruby), which is a letdown, as Java 8 provides a JavaScript runtime on the JVM through the Nashorn engine, and it would have been interesting to see Angular used in the full MEAN stack (MongoDB, Express.js, Angular.js and Node.js), since Express.js is the de facto framework in the Node.js land and Angular and Express share quite a lot in common. That said, beginners to Angular should use this chapter to get a feel for it, and not worry much about the choice of server-side framework used to provide the RESTful service to the client-side application in question. The chapter covers Angular.js in sufficient detail and addresses the actual theme of the book, but at the end of it, as a reader, I am left wanting more - especially given the multitude of client-side frameworks available today. I will definitely keep an eye open for improvements to this chapter in future revisions of this book.

The final three chapters deal with packaging and deployment, touching on these areas briefly - they offer just the starting pointers, and readers can choose which tools to learn further as they need.
Another plus I found with this book was a well-balanced appendix: on one hand, practical examples of using different lightweight databases are given, and on the other, various facts and trivia regarding REST are detailed.

Overall, this book is a gem of knowledge for existing and new programmers who are starting to look into the exciting world of client-side JavaScript-based web apps that interact mainly with lightweight web services.
In a few sections where a simple RESTful service is needed, it is demonstrated with Sinatra running on JRuby, which is fine but can potentially confuse Java programmers not familiar with the Ruby/JRuby landscape; a Sinatra-like library written in Java could have been used instead. This, apart from the thin chapter 10 and the fact that other client-side frameworks and tools like Grunt/Bower get no mention, is my main grouse with an otherwise stellar book that deserves a read from those starting up on new-age web apps.

Disclaimer: I was provided a free copy of this book by O'Reilly under their blogger review program.

Tuesday, March 18, 2014

8 new features for Java 8

JDK 1.8, aka Java 8, launched today, meaning that its General Availability release is out in the open and developers can switch from Early Access builds to a tested release for production use. But what does it mean for you, the busy Java developer? Well, here are some points I condensed to mark this release:

1. Lambda Expressions

I started with lambda expressions, as this is probably the most sought-after language feature since generics/annotations in Java 5.
Here's the syntax:

(argtype arg...) -> { return some expression.. probably using these arguments }

What it does is reduce code where it is obvious, such as in an anonymous inner class. (Swing action handlers just got sexy, yay!)
So, a thread can be changed as:
Runnable oldRunner = new Runnable(){
    public void run(){
        System.out.println("I am running");
    }
};
Runnable java8Runner = () -> {
    System.out.println("I am running");
};

Similar to Scala, type inference is also possible in lambdas. Consider the following example:
Comparator<String> c = (a, b) -> Integer.compare(a.length(), b.length());

Here, the types of a and b (in this case String, from the Comparator interface) are inferred as the compare method is implemented.

The symbol used to separate arguments from the block, ->, is quite similar to the => already used in Scala. If you are well versed in Scala, there is not much reason to switch, as you will find the way lambdas are implemented in Java inadequate (and verbose), but for a good 'ol Java programmer, this is the way to go.

2. Generic Type Changes and Improvements

Taking cues from lambdas, calls to generic methods can also infer their type arguments to an extent, so a method using a generic collection need not specify explicit generic types. Hence, the following call with an explicit type witness
SomeClass.<String>method();
can be written simply, leaving out the type information:
SomeClass.method();
The type can be inferred from the method signature and the call's target type, which is helpful in chained calls like myCollection.sort().removeUseless().beautify();
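To make the improvement concrete, here is a small self-contained sketch (class and helper names are my own) of the classic target-typing case: in Java 7 the second call required an explicit type witness, while Java 8 infers it from the parameter type.

```java
import java.util.Collections;
import java.util.List;

public class InferenceDemo {
    static int countAll(List<String> items) {
        return items.size();
    }

    public static void main(String[] args) {
        // Java 7 style: explicit type witness on the generic method call
        int n7 = countAll(Collections.<String>emptyList());
        // Java 8: improved target typing infers <String> from countAll's parameter
        int n8 = countAll(Collections.emptyList());
        System.out.println(n7 + " " + n8); // 0 0
    }
}
```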


3. Stream Collection Types (java.util.stream)

A stream is an iterator that allows a single run over the collection it is called on. Along with lambdas, this is another noteworthy feature to watch out for. You can use streams to perform functional operations like filter or map/reduce over collections, whose individual elements are streamed via Stream objects. Streams can run sequentially or in parallel as desired; the parallel mode makes use of the fork/join framework and can leverage the power of multiple cores.
Example:
List<String> guys = list.stream().collect(Collectors.toList());
can also be run in parallel as
List<String> guys = list.parallelStream().collect(Collectors.toList());

Another nice example, which reduces the collection to a single value, is calling the reduce operation:
int sum = numberList.stream().reduce(0, (x, y) -> x+y);
or,
int sum = numberList.stream().reduce(0, Integer::sum);
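Putting these pieces together, here is a complete, runnable sketch (the sample data is my own) combining filter, map, collect and reduce:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamDemo {
    // squares of the even numbers, via filter + map + collect
    static List<Integer> evenSquares(List<Integer> nums) {
        return nums.stream()
                   .filter(n -> n % 2 == 0)
                   .map(n -> n * n)
                   .collect(Collectors.toList());
    }

    // sum of all elements, via reduce with an identity value
    static int sum(List<Integer> nums) {
        return nums.stream().reduce(0, Integer::sum);
    }

    public static void main(String[] args) {
        List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6);
        System.out.println(evenSquares(numbers)); // [4, 16, 36]
        System.out.println(sum(numbers));         // 21
    }
}
```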


4. Functional Interfaces (java.util.function)

These interfaces can contain default methods that need not be implemented and run directly from the interface. This helps with existing code: changing an interface need not force every class implementing it to implement the new methods. This is similar to traits in Scala, and functional interfaces are compatible with lambdas.

5. Nashorn - The Node.js on JVM

This is the JavaScript engine that enables JavaScript to run on the JVM. It is similar to the V8 engine that Chrome provides and over which Node.js runs. It aims for compatibility with Node.js applications while also allowing actual Java libraries to be called by JavaScript code running on the server. This is exciting, to say the least, as it marries the scalability and asynchronous nature of Node.js with safe and widespread server-side Java middleware.


6. Date/Time changes (java.time)

http://download.java.net/jdk8/docs/api/java/time/package-summary.html
The date/time API has moved to the java.time package and follows the Joda-Time model. Another goodie is that most classes are thread-safe and immutable.
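A brief sketch of the new API (dates chosen arbitrarily), highlighting the immutability just mentioned:

```java
import java.time.LocalDate;
import java.time.Month;
import java.time.Period;

public class TimeDemo {
    public static void main(String[] args) {
        // Instances are immutable: plusDays returns a new object,
        // which is what makes these classes safe to share across threads.
        LocalDate release = LocalDate.of(2014, Month.MARCH, 18);
        LocalDate later = release.plusDays(10);
        System.out.println(release); // 2014-03-18 (unchanged)
        System.out.println(later);   // 2014-03-28

        // Joda-style period arithmetic
        Period p = Period.between(release, later);
        System.out.println(p.getDays()); // 10
    }
}
```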

7. Type Annotations

Now annotations can be used to decorate generic type arguments themselves.
E.g. List<@Nullable String>, which is not always desired, but can prove useful in certain circumstances. Apart from decorating generic types, they can also be used on constructors and casts.

  new @NonEmpty @Readonly List(myNonEmptyStringSet)
  new @Interned MyObject()

  myString = (@NonNull String) myObject;
Even array types can be annotated:
  @NotNull String[] arr;

The new RuntimeVisibleTypeAnnotations and RuntimeInvisibleTypeAnnotations attributes cause the .class file to retain this annotation information.

8. Other (Nice to Have) Changes

The reflection API is slightly expanded, with support for getTypeName, toGenericString, etc.
The String.join() method is a welcome addition, as a lot of hand-rolled utility classes were created for this instead. So, the following expression
String abc = String.join(" ", "Java", "8");
will evaluate to "Java 8".
In the collections package, the Comparator interface has been revamped, and methods like reversed, comparing and thenComparing have been added, which allow easy customization of comparison over multiple fields. Other libraries like concurrency and NIO have also been updated, but nothing there is noteworthy enough to follow up on; the changes simply keep pace with the rest of the API.
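String.join and the composable Comparator methods can be seen together in a short sketch (sample data is my own):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class ComparatorDemo {
    public static void main(String[] args) {
        // String.join replaces hand-rolled joiner utilities
        String joined = String.join(" ", "Java", "8");
        System.out.println(joined); // Java 8

        // comparing/thenComparing compose multi-field comparisons:
        // sort by length first, then alphabetically among equal lengths
        List<String> words = Arrays.asList("pear", "fig", "apple", "kiwi");
        words.sort(Comparator.comparingInt(String::length)
                .thenComparing(Comparator.naturalOrder()));
        System.out.println(words); // [fig, kiwi, pear, apple]
    }
}
```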


Overall, Java 8 is well thought out: it makes mainstream Java more concise, picking some good parts of Scala/Clojure to improve its syntax and address much-sought features.

Thursday, November 28, 2013

Book Review: Java EE 7 Essentials

I have recently finished reviewing the book 'Java EE 7 Essentials' by Arun Gupta, which took me a considerable amount of time, as I was on and off in my study and have not done any major development in the enterprise Java space recently.
Since I have prior experience with J2EE, I was keener on knowing what has changed in the enterprise Java specification since Java EE 6. One noticeable observation is that this book does not follow a single sample application throughout (like the famous Pet Store example). Depending upon your reading habits, this may or may not bother you.
To understand topics like dependency injection, I had to use the internet as a resource as well - for example, to understand how its approach differs between enterprise Java and Spring. However, compared against the standard J2EE resources, this book is more fun to read and does not bog you down in every little detail (which can be obtained from the API documentation itself).
The book covers the entire enterprise java stack which can be classified under the following points:

  • Servlets, JSF
  • RESTful and SOAP based Web Services
  • JSON processing
  • Websockets & server endpoints
  • EJB & JPA specification
  • Context & Dependency Injection in Java EE
  • Concurrency
  • Bean Validation
  • JTA and JMS
  • Batch Processing
It is helpful to have a basic idea of these concepts, as the book dives right into what's new and noteworthy in the Java EE 7 release.
Finally, the creation of a sample application with a three-tiered architecture is explained, using the enterprise Java features detailed in the book.
For an enterprise Java expert or beginner, this is the current go-to book for getting started with the Java EE 7 specification, and it makes the learning somewhat less painful than it used to be.
Although setting up application servers is not dealt with, basic instructions to install and use the NetBeans IDE (Java EE bundle), which contains the Tomcat application server itself, are provided.
Disclaimer: I received my copy of the book in its beta version through O'Reilly's bloggers review program.

Wednesday, June 5, 2013

Musings on refactoring



I happened to come across the Refactoring in Ruby book, which is the adaptation of Fowler's refactoring book from Java into Ruby. On reading it, one thing that strikes you is that refactoring is persistent across languages and platforms. Java is verbose and allows various coding styles and idiosyncrasies, which is evident in why a lot of Java programmers churn out code that is unnecessarily complex. The proliferation of IDEs has also encouraged this malpractice of creating not just sub-optimal but unnecessarily complex code.

However, finding the same in Ruby has traditionally been difficult, as Ruby, like its Japanese inventor, has been neat and concise in its approach, offering a lot of bang for a small amount of code. As the community grows, though, there is an acute need to maintain standards in Ruby as well, since its developer base is exploding exponentially. In my opinion, refactoring is a practice every developer needs to apply and enforce within their team as a cardinal rule of programming. Applying this context to the updated refactoring book, I feel that with Ruby the answers have become more to the point, but at the cost of grossly sloppy example code. It could have been better demonstrated in a language that is equally neat yet offers variety, like Java. Python obviously springs to mind, where at one end there is C-like procedural code and at the other end there are Python experts proclaiming the use of Pythonic code.

Having the same refactoring in another language feels alien for a while, but then it is bliss to realize that the same problems exist in the other language as well; it was just that a learner like me got carried away by the syntactic differences.

At the end of the day, it is the learner who needs to decide what to and more importantly what not to include in the changes to be made in the codebase.

Thursday, May 23, 2013

Book Review: Learning Play Framework!

This is my review of Learning Play! Framework 2 by Andy Petrella. The book focuses on Scala-based development on the Play framework. While it targets existing users of Play, it gives a fast-paced introduction to Scala, and its level of complexity sits slightly above the developer documentation, though not all the way up to serious deployment. There is a chapter for the uninitiated in Scala, which I feel is redundant, as a section alone would do, given the inroads Scala has made into the Java community. Also, it is easy to get lost while following the samples, so I'd recommend keeping the sample sources handy while using this book.

The book starts with the setup of Play and a bit about its ecosystem, which felt redundant. Next, it gives a fast-paced review of Scala (it is not complete, and I'd recommend the freely available 'Scala for the Impatient' book as a learning resource). The book then demonstrates the Scala-based templating mechanism in Play and also plays with CSS configuration using LESS. It then delves into data management, where data-related views and routes are explained. This chapter also covers the changes made in Play 2.1.1 and provides information about Ebean. The next couple of chapters cover some of the exciting new features that were not present earlier, dealing with multipart content types as well as realtime client interaction through WebSockets.
A chat application is demonstrated, and Akka is used to provide non-blocking multi-threaded access for clients, which is quite neat.

Interacting with web services is dealt with next, where a Twitter integration into an existing web application is demonstrated, and the book caps off with chapters on testing and deployment. While testing is a chapter I feel should have come earlier, to bring TDD further into development, the deployment chapter is thoroughly up to date. I was expecting the likes of Jenkins and metrics tools in CI deployment, but instead various cloud vendors and their offerings were waiting to be demonstrated.
Overall, this is a great book to start with Play Framework 2, and it does not disappoint in covering the ecosystem surrounding Play either.
Disclaimer: I received a copy of this book through Packt Publishing.
You can check out more about the book via http://www.packtpub.com/learning-play-framework-2/book

Thursday, March 28, 2013

Demystifying DevOps


There has been a lot of noise recently about the 'DevOps' movement. In this blogpost, I present my take on the topic and, hopefully, help the reader understand it.

Wikipedia puts this as:
 "DevOps (a portmanteau of development and operations) is a software development method that stresses communication, collaboration and integration between software developers and information technology (IT) professionals."

Image: Wikimedia Commons
However, owing to pressures within large corporations with plentiful resources at their disposal, it is easy to create hype around a process that is, in general, common sense and, at the very best, automation.
Speaking of agile delivery, it is easier to deliver a product in small teams than in big ones. In my present company, the product gets a new release every evening (India time), prior to which the changes in the release are tested by their committers before the build decision is taken. After the release, basic testing as well as testing of the new features is carried out. While this is a painstaking methodology, it works well because the team size is relatively small, leading to fewer overall changes. As an astute reader would have pointed out by now, what is missing is automation as well as a demonstration of scalability - what to do when the product goes live. Some uneasy questions like these are answered by DevOps, which:

  • Focuses on the convergence of development and operations.
  • Minimizes the gulf/barrier between development and production.
  • Brings QA into development.
  • Is agile, and strives to keep the processes after development agile/flexible.
  • Is required, as rapid scaling & uptime of applications are necessary.
  • Focuses on automation, reducing repetitive work.

So, where does a software developer like me come in?
Here's what I typically deal with while building web applications: Java, Ruby, and combinations of these, thrown together with the odd PHP- or Python-based application. In this recession-prone economy there is a push to cut costs, which can be effectively met by having developers also stay in charge of the deployment process. As this lean process gains acceptance, development becomes easier for operations people and, similarly for developers, the deployment/production environment becomes less complex.

Where does it affect us?

For us Java developers, the heavy-duty work gets 'magically' done by the enterprise servers we deploy on. As Java is built with performance in mind, it is easy to ramp up the scale of operations (Tomcat is insufficient, deploy on GlassFish; too many queries, use memcached; database connections are piling up, use connection pooling).
However, the (physical) servers on which these applications are deployed in production also need to be profiled and studied.

To run load/stress tests, one needs a production-like environment. These are easily available as Amazon Machine Images, Rackspace, or any of a myriad other cloud providers; however, this costs money, and given increasing hardware capabilities, one can perform similar tasks without using the cloud (and without paying for such testing).

VirtualBox needs no introduction and is a free virtualization tool from Oracle (it was a Sun offering earlier). However, for our example, even VirtualBox falls short out of the box, as it offers no easy way to manage multiple VMs. To enhance its capability, one can use Vagrant, a tool for building and distributing virtual development environments. In other words, it is what rvm is to Ruby: a manager that gives the user a clean state (of the server) as well as an abstracted setup (different server configurations from the same set of VMs). Vagrant is primarily a driver for VirtualBox VMs.

One can work with it on a per-project basis (similar to version control) and configure it through a plain-text file, the Vagrantfile, which Vagrant uses as a DSL. The nicest thing about it is that it does everything you would otherwise need the VirtualBox UI for, which is really beneficial when running numerous headless servers.
For example:
Vagrant::Config.run do |config|
  config.vm.define :app do |app_serv_config|
    app_serv_config.vm.customize ["modifyvm", :id, "--name", "app", "--memory", "512"]
  end
  config.vm.define :app2 do |app_serv_config|
    app_serv_config.vm.customize ["modifyvm", :id, "--name", "app2", "--memory", "512"]
  end
  config.vm.define :db do |db_config|
    db_config.vm.customize ["modifyvm", :id, "--name", "db", "--memory", "1024"]
  end
end

This is just getting started with different servers; in real life, you'd also need to provision the VMs - set them up with useful software and services, then deploy and run your programs on them. Two widely popular tools that deserve a mention are Puppet and Chef.
While the documentation and discussions around them typically involve rails, there is no reason why java, and for that matter any other technology stack can not be used.
Puppet chiefly helps with setting things up; it is not big on automation, but it saves us the work of manually installing different pieces of software in a VM.
For instance, the following Puppet config sets up Apache and ensures that it runs as a service during VM startup:

Exec {
  path => "/usr/local/sbin:/usr/local/bin:/bin:/opt:/usr/sbin"
}
package {
  "apache2":
    ensure => present
}
service {
  "apache2":
    ensure => running,
    enable => true
}

Once we've set up our environment, only half the battle is over, as we also need to monitor and report the status of the environment to evaluate areas of concern in the setup.
Nagios is a wonderful monitoring tool that displays the health of the servers - which is needed in a headless environment. To use it, we build a Puppet module to enable its service, then use the Nagios web interface on the VM itself to glean the details of the VM, displayed as services, as well as to reschedule checks when needed. To monitor the Nagios instance itself, one can use a service like Pingdom.

There are various other tools related to setting up servers, but covering them would stretch the discussion far from the concept of DevOps in practice. The point is to resist the urge of going straight to the cloud, and instead keep the production environment close to the development environment itself. This will not eliminate the cloud, as production environments are now predominantly cloud based, but it will go a long way toward ensuring fewer pains when the product goes 'live'.

Wednesday, October 31, 2012

An insight into sonar plugin development


In recent months, I've been involved in developing a language plugin for Sonar that displays different metrics for a specified language. I am writing this post as there is not much content available on this topic, even though Sonar is a widely popular tool.

Below are tips to avoid some of the common pitfalls in developing Sonar plugins:
  • Read the official documentation carefully, even though there isn't much of it.
  • Download the sources of the open-source plugins and look through their code for a proper understanding of plugin development.
  • Do read the books available on the topic in addition to blogs.
  • Use Maven - it is a real lifesaver - and focus on TDD while coding, as you cannot hope to debug the process otherwise.

If you've followed these steps diligently, you'll have a basic understanding of the innards of Sonar plugin development.
In brief, a Sonar plugin comprises a Java program that does the heavy lifting of loading and populating the code metrics, and a Ruby-based UI (embedded Ruby pages, or erb) that displays this data nicely, using view helpers to create eye-catching widgets. There is also a provision for a complete Rails application in place of this, but that's a different idea altogether.

In my case, I needed to populate the data from a database, so a class implementing the Sensor interface contained the following method:

  public void analyse(Project project, SensorContext sensorContext)

Inside this method, the Project object gives access to project properties and other inputs to my process, while the SensorContext object serves as the output channel in the form of measures:

sensorContext.saveMeasure(new Measure(MyMetricAnalysisClass.MetricName, Double.valueOf(metricVal)));
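To show the overall shape end-to-end, here is a self-contained sketch of such a sensor. Note the Measure, SensorContext and Project classes below are simplified stand-ins for the real sonar-plugin-api types (so the sketch compiles on its own), and the property key, metric key and database-loading step are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for sonar-plugin-api types; a real plugin implements
// org.sonar.api.batch.Sensor and receives the real Project and SensorContext.
class Measure {
    final String metricKey;
    final Double value;
    Measure(String metricKey, Double value) { this.metricKey = metricKey; this.value = value; }
}

class SensorContext {
    final List<Measure> saved = new ArrayList<Measure>();
    void saveMeasure(Measure m) { saved.add(m); }
}

class Project {
    // Stand-in for reading a property from the analysed project's configuration.
    String getProperty(String key) { return "jdbc:as400://host/mydb"; }
}

// The shape of a Sensor: read inputs from the Project,
// emit results to the SensorContext as measures.
class DbMetricsSensor {
    public void analyse(Project project, SensorContext sensorContext) {
        // Hypothetical property key telling the sensor where to load metrics from.
        String dataSource = project.getProperty("sonar.mymetrics.datasource");
        double metricVal = loadMetricFromDatabase(dataSource);
        sensorContext.saveMeasure(new Measure("my_metric", Double.valueOf(metricVal)));
    }

    // Placeholder for the actual database query that loads the metric value.
    private double loadMetricFromDatabase(String dataSource) {
        return 42.0;
    }
}
```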

One of the nice features of working with erb templates in the UI for widgets is that you can use a lot of helper methods to create slick effects with your data. One such example is a piechart:
<%= piechart( 'Field1 = '+var1.to_s+'; Field2 = '+var2.to_s+';'  , { :size => "500x200"}) -%>

One caveat in transferring data from the server to the views is that only text data works well; the information on passing richer data structures is sketchy at best, so we are left with data delimited by ';' (see the first argument to the piechart helper above). Surprisingly, other kinds of data cause widgets to fail with no descriptive error messages. For larger data sets, it is more intuitive to use a GWT page instead of a widget, as users can focus on the details provided in a page as opposed to the concise totals on a widget.
Hope this post gives users developing plugins on this platform something to get started with.

Wednesday, August 22, 2012

Problem with as/400 jdbc driver

While iterating over a ResultSet obtained from an AS/400 data source today, I kept getting this error:
descriptor index not valid
Upon searching for this error, I found a solution that is worth sharing.
JDBC numbers ResultSet columns starting from 1, not 0, so a call such as resultSet.getString(1) fetches the first column, not the second as it might seem, and passing 0 (or an index past the last column) triggers the error above.
It would be nice if the error message explained the problem in a less cryptic manner. This is one of those nagging problems that recurs from time to time, since this part of the API doesn't follow the usual Java zero-based convention.
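To make the off-by-one explicit in code, a hypothetical helper (my own naming, not part of any driver API) can translate a familiar 0-based position into the 1-based index JDBC expects, failing with a readable message instead of the cryptic one above:

```java
final class JdbcIndex {
    private JdbcIndex() {}

    // Translate a 0-based column position into the 1-based index that
    // ResultSet.getString(int) and friends expect.
    static int fromZeroBased(int zeroBased, int columnCount) {
        if (zeroBased < 0 || zeroBased >= columnCount) {
            // A readable message, unlike "descriptor index not valid".
            throw new IllegalArgumentException("column " + zeroBased
                    + " out of range 0.." + (columnCount - 1));
        }
        return zeroBased + 1;
    }
}
```

A call site would then read, for example, resultSet.getString(JdbcIndex.fromZeroBased(0, metaData.getColumnCount())) to fetch the first column.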

Tuesday, August 21, 2012

Book Review: Hadoop: The Definitive Guide, Second Edition by Tom White


Anyone interested in big data management today has at least a passing familiarity with Hadoop, the open source MapReduce implementation. Here's my review of the second edition of one of the most comprehensive books on the topic.
As a longtime Hadoop enthusiast who had already read the first edition, I was interested in finding out what this second edition has in store for readers.

The book builds on its predecessor: apart from chapters on Hive and Sqoop, a case study covering graph visualization in social networks has been added. The covered Hadoop version has been updated too; as a developer, I'd recommend the latest stable release, since Hadoop is an active project. And because Tom White is himself a committer on the project, various project insights are woven in along the way, as in the original edition.

From a first-time Hadoop adopter's point of view, too, this text is easy to follow, and it flattens Hadoop's learning curve to a great extent.
The book starts by building context, presenting the history and ecosystem of Hadoop, and gives the reader a high-level overview. The underpinnings of Hadoop, namely the MapReduce algorithm and its implementation, are covered in the next few chapters, including the practical aspects of running any Hadoop application: HDFS file manipulation and MapReduce operations in detail. An exhaustive list of MapReduce techniques, along with examples, follows, covering what comes up in everyday development when using the Hadoop API to interface with big data.
Another highlight of this book is its comprehensive treatment of running and deploying Hadoop in various configurations. The closely knit data management tools in the Hadoop ecosystem (its sub-projects such as Pig, Hive, HBase, ZooKeeper and Sqoop) are covered as well.
This is followed by various case studies that make for interesting reading, though it was disheartening to see no major updates to them compared to the previous edition.

For someone who already owns the original edition, the second edition does not add much, but for a reader new to the book, it is a comprehensive reference.
Note: This book was provided to me for review under the O'Reilly Blogger Review Program.

Wednesday, May 30, 2012

Reducing the repetitive java code

Just came across this neat hack for reducing boilerplate Java code. Project Lombok is a micro-framework that injects commonly occurring code based on annotations. Strictly speaking, it hooks into the compiler as an annotation processor and manipulates the AST, and its installer patches the IDE so that the generated members show up on the fly.
After downloading, run the jar and point it at your Eclipse installation, then restart Eclipse for the changes to take effect. One of the most used annotations is @Data, which wraps various other annotations in itself.
In case the @Data annotation's name creates conflicts, you can opt for the member-specific annotations (@Getter, @Setter, and so on) instead.
e.g.:

import lombok.Data;

@Data
public class Pojo {
    private String name;
    private int id;
    private float weight;
}
In this case, we only need to declare the member fields; all the getters/setters, as well as commonly overridden methods like toString, equals and hashCode, are generated for us at compile time.
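To appreciate the saving, here is roughly the equivalent hand-written version of such a POJO. This is a sketch only: Lombok's actual generated code differs in details (for instance, its equals also uses a canEqual hook), and the class name here is my own:

```java
import java.util.Objects;

// Roughly the boilerplate that @Data saves us from writing by hand.
class PojoByHand {
    private String name;
    private int id;
    private float weight;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getId() { return id; }
    public void setId(int id) { this.id = id; }
    public float getWeight() { return weight; }
    public void setWeight(float weight) { this.weight = weight; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PojoByHand)) return false;
        PojoByHand other = (PojoByHand) o;
        return id == other.id
                && Float.compare(weight, other.weight) == 0
                && Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, id, weight);
    }

    @Override
    public String toString() {
        return "PojoByHand(name=" + name + ", id=" + id + ", weight=" + weight + ")";
    }
}
```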
There are provisions for other IDEs, as well as command-line execution for non-Eclipse users, which make this a nifty tool to use.