Monday, October 19, 2009

My experience with Google AppEngine - Java (Part 2)

This post is a continuation of My experience with Google AppEngine - Java (Part 1).

After uploading the same DTA file that worked locally to Google AppEngine, I received a big fat Internal Server Error message. I diligently checked the logs available through the GAE admin console and found a suspicious error. I no longer have the exception information (GAE rolled over the logs), but it was quite obvious the problem was due to the file upload code. The example in the Apache Commons FileUpload User Guide does not work out of the box on GAE.

The answer was not too hard to find after some googling. GAE/J does not support writing to the file system. The proper example is available right in the GAE/J FAQ: How do I handle multipart form data? or How do I handle file uploads to my app?
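The gist of the fix is to drop DiskFileItemFactory and use Commons FileUpload's streaming API, which never touches the file system. A minimal sketch along those lines (the servlet and field handling are mine, not the FAQ's exact code):

import java.io.IOException;
import java.io.InputStream;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class UploadServlet extends HttpServlet {
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        try {
            // No factory, no temp files: iterate over the multipart
            // stream directly, which is all GAE/J allows.
            ServletFileUpload upload = new ServletFileUpload();
            FileItemIterator iter = upload.getItemIterator(req);
            while (iter.hasNext()) {
                FileItemStream item = iter.next();
                if (!item.isFormField()) {
                    InputStream stream = item.openStream();
                    // hand the stream straight to the DTA parser
                }
            }
        } catch (FileUploadException e) {
            throw new ServletException(e);
        }
    }
}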

After updating my code according to the example in the FAQ, I no longer got an Internal Server Error, but my page would not refresh with the new information. It worked fine locally, but would not work on GAE/J!

This was one of the tougher problems: the symptom was not specific, and there was no error message to google for. One of the best approaches I have found for resolving this kind of issue is to think like the system. It is similar to the good old saying about putting yourself in someone else's shoes; however, the someone else this time is actually Google App Engine.

Going through that thought experiment, combined with code inspection, was a powerful way to troubleshoot this kind of problem. It does require a good understanding of the different layers of abstraction in a distributed system. Good thing the early days of tinkering with PC parts + reading random computer articles/books + education did pay off. It all came down to this small segment of code.

Object myObject = request.getSession().getAttribute("mySessionObject");
// do something with myObject (mutating it in place)
return;

All I had to do was:
Object myObject = request.getSession().getAttribute("mySessionObject");
// do something with myObject
// write the mutated object back: GAE only persists session changes
// across its distributed nodes when setAttribute() is called, whereas
// the local single-JVM sandbox hands you the live object
request.getSession().setAttribute("mySessionObject", myObject);
return;

Lo and behold, the code now works properly.

After going through the issue above, I am convinced that writing stateful code on top of an inherently stateless technology (HTTP) is a difficult task. With the computing ecosystem getting increasingly complex, layered with abstraction on top of abstraction, it is getting harder and harder to find people who understand the whole stack of technologies.

With the advancement of browsers, and libraries that normalize the differences between them, I think it is a good time to re-examine moving state back into the browser (except for anything security-sensitive). No more 50+ MB session objects.

The plotter code stores the raw data of what to plot in the J2EE session, and I would like to move it out and maintain that information on the client side. In fact, I would like to re-architect the page so that no function requires a page load.

With that goal in mind, the only real reason the page requires a reload is the file upload form post. After some googling around, I was excited to find that DWR3 is moving to support file upload.

After many tries, I was still unable to get DWR3 file upload to work on GAE. Merely including the DWR3 libraries in the GAE project made it fail to start up properly in the local sandbox environment. It is actually a good thing this failed locally, instead of my finding out after hours of writing code that it would not work on GAE.

The error I was getting was:

javax.servlet.ServletException: org.directwebremoting.extend.ContainerConfigurationException: java.security.AccessControlException: access denied (java.lang.RuntimePermission modifyThreadGroup)
at org.directwebremoting.servlet.DwrServlet.init(DwrServlet.java:77)

To give DWR3 credit, it is still in a pre-release stage, and the code-base I worked with was RC1 and RC2. Although DWR3 was explicitly listed as "Compatible" on the "Will it play on App Engine" list from Google, it looks like I am not the only one who is out of luck.

Issue #376 has been opened against DWR to hopefully get it to support GAE and other platforms.

There are many postings online from different people on different mailing lists with the same problem.

The most useful post online I found was this. Basically, the problem has to do with DWR3 spawning threads during start-up. These threads are responsible for some housekeeping within the DWR3 library, but spawning threads is a big no-no in GAE/J. There are workarounds on the mailing list to override the housekeeping code, but is it really worth investing all this energy?

Maybe DWR3 is moving towards supporting the GAE/J platform; however, at the time of my investigation, only DWR RC1 and RC2 were available.

Inspecting ContainerUtil version 1.35 tagged for RC2, the problematic code that registers DefaultScriptSessionManager in the setupDefaults() method during initialization is still there. (DefaultScriptSessionManager is the one spawning threads.)

container.addParameter(ScriptSessionManager.class.getName(), DefaultScriptSessionManager.class.getName());

There are later versions of the file, and the setupDefaults() method in ContainerUtil has changed, but I don't think investing the energy at this stage just to get DWR3 to start up on GAE/J is a wise decision. There may be a lot more compatibility issues between DWR3 and GAE/J down the road.

So with all this rambling, what is the moral of the story?

GAE/J is a great platform which promises to reduce a lot of infrastructure and system administration cost. However, there's a catch. The power of Java is its ubiquity. There are Java libraries for many, many purposes, from game engines to integration with mainframes. But GAE/J does not fully support the J2SE/J2EE standards, and a lot of the libraries which you or your enterprise rely on may not work on the GAE/J platform.

Secondly, writing truly distributed server-side code on J2EE is not easy. I suspect a lot of the code out there running on the J2EE platform works either because it runs in a single JVM, or because it relies on load-balancer session stickiness to make non-distributable code work in a "cluster". You lose that "luxury" moving to the GAE/J platform. The above example of remembering to set your session attribute back into the session is only the tip of the iceberg of the challenges of distributed code on J2EE. There are many applications out there with enormous session objects which will grind to a halt if the session object needs to replicate across nodes, as is the case on GAE/J.

Saturday, October 3, 2009

My experience with Google AppEngine - Java (Part 1)

Recently in the Information Technology space, cloud computing has been the latest buzzword. One of the players in town is Google. Their offering, Google App Engine (GAE), provides Platform as a Service (PaaS). Python and Java are the two platforms provided by GAE, aka GAE/P and GAE/J.

Recently, I had a chance to work on a mini web application project. The application is simple: a user uploads a file captured by one of these nifty devices, the iButton, and the information is charted out similar to the OneWireViewer desktop application.

A few years back when Google first came out with GAE/P, I toyed with it a bit, writing a simple (and ugly) bulletin board application. It was a good introductory experience, and the PaaS idea looked promising. Unfortunately, my *real* work project went into crunch mode and I could no longer devote my time to studying Python and the Django framework on GAE/P.

Then Google came up with the Java edition of GAE, which reinvigorated my interest in PaaS. With my background in Java, GAE/J was a natural choice for building my mini web charting application.

Although GAE/J was still a "Preview Edition", it already came with a pretty decent Eclipse plugin. I still remember the early days with GAE/P, where I had to memorize command lines in order to build, compile and run the local GAE/P environment for testing. This Eclipse plugin freed up some of my brain cells to focus on the actual application development.

Armed with a sample data file, the first thing to do was to build a parser to read in the file and transform it into a POJO data model which can be manipulated easily. Simple enough, no issues there.
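The parsing itself is nothing fancy. As a rough illustration only (assuming, purely for the sketch, one "timestamp,temperature" pair per line; the real iButton DTA format has more to it than this):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

// Hypothetical POJO for one temperature reading.
public class Reading {
    private final Date timestamp;
    private final double temperature;

    public Reading(Date timestamp, double temperature) {
        this.timestamp = timestamp;
        this.temperature = temperature;
    }

    public Date getTimestamp() { return timestamp; }
    public double getTemperature() { return temperature; }

    // Parse the uploaded file into a list of readings.
    public static List<Reading> parse(InputStream stream)
            throws IOException, ParseException {
        List<Reading> readings = new ArrayList<Reading>();
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        BufferedReader in = new BufferedReader(new InputStreamReader(stream));
        String line;
        while ((line = in.readLine()) != null) {
            String[] parts = line.split(",");
            readings.add(new Reading(fmt.parse(parts[0].trim()),
                                     Double.parseDouble(parts[1].trim())));
        }
        return readings;
    }
}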

Next up was the upload functionality. Apache Commons FileUpload came in handy. I followed the instructions available on their website, and so far so good. The functionality worked seamlessly on the local GAE environment (built on Jetty). After integrating the upload functionality and the parser, I had half the application completed in a few hours.

Lastly, the charting part. With my new-found love for all things Google, I gave the Google Chart API a spin. The concept of Google Chart is simple and elegant. It is a RESTful API: you supply all the input through the src attribute of the <img> tag, and voila, you get a chart image back. Elegant, yes, but not without its flaws. Simply looking at the API, an inherent limitation of the design becomes apparent: since the <img> tag basically issues an HTTP GET request, the payload of the API is limited by the practical length of a URL!

http://chart.apis.google.com/chart?cht=p3&chd=t:60,40&chs=250x100&chl=Hello|World
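To see how quickly that limit bites, consider how the plotter has to build the chart URL from the parsed readings (a sketch using the Reading POJO from the parser sketch above; the exact URL ceiling varies by browser and server, but it is on the order of 2K characters):

// Build a line-chart URL with one data point per reading (text encoding;
// real code would also scale values into the chart's expected range).
static String chartUrl(List<Reading> readings) {
    StringBuilder url = new StringBuilder(
            "http://chart.apis.google.com/chart?cht=lc&chs=600x300&chd=t:");
    for (int i = 0; i < readings.size(); i++) {
        if (i > 0) url.append(',');
        url.append(readings.get(i).getTemperature());
    }
    // A day of one-minute samples is 1,440 points; at 4-5 characters
    // each, the URL runs to several kilobytes and the GET request fails.
    return url.toString();
}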

You know how people say "Love is blind"? It is true, even in the nerdy world of software development. Damn Google with its slick API and sexy-looking charts. It is so hard to resist! So I ignored my intuition and soldiered on.

Another hour or two later, I had a working local prototype. Time to celebrate? ... Not yet. Let's upload the application and test it on the cloud!

As with all things new and shiny, of course it did NOT work on the actual GAE/J cloud. Upon uploading the test file, I got a nice big:

Internal Server Error

To be continued ...

Wednesday, September 23, 2009

TTC losing money again?

TTC is in the news again. http://www.thestar.com/news/gta/article/699522. I love the first line: "The TTC's success has helped create a $17.4 million deficit". I can be very successful at spending/losing money too. If only I could find a job where success is measured by one's ability to lose money. However, considering the TTC has a $1.2B operating budget, missing the mark by less than 1.5% is not that bad. The chronic budget shortfall is the bigger issue. Perhaps management should pad their budget with more contingency :)

The article itself, and the shenanigans around a budget shortfall blamed on the Metropass, is really not that interesting. Just some political/media verbiage to justify a fare hike. With the complexity of balancing a $1.2B operation, I am sure the shortfall is a lot more complicated than "because of our success with the Metropass". It is kind of like answering the "what is your weakness" job interview question with "I sometimes work too hard" or the like. Just plain silly.

Far more interesting are the comments and unverified facts in the readers' comments section of the article:
  • Overpaid union, fire them all, privatize it, $100K ticket collectors, blah, blah, blah ...
  • The TTC's ~70% cost recovery from users is better than most cities' in North America
  • The TTC drivers' contract has a "GTA clause" under which TTC drivers are guaranteed the best-paid job in the GTA.
  • The TTC claims the fare evasion rate is low, approximately 5%. Anyone who has used the system will question how on earth they came up with that number with their stone-age equipment. A survey?

Saturday, September 19, 2009

Previous PATH system variables

After using the SETX command on Windows, I accidentally set my system PATH to empty! The good news: your previous/other system PATH is available from the Windows registry.

Simply run "regedit" and go to "HKLM\SYSTEM\ControlSetXXX\Control\Session Manager\Environment\Path" (where XXX is the control set number, e.g. ControlSet001) to retrieve the path from previous/other configurations.
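If you prefer to stay on the command line, the same value can be pulled with reg query (substitute the control set you are after):

reg query "HKLM\SYSTEM\ControlSet001\Control\Session Manager\Environment" /v Path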

Thursday, September 17, 2009

MS Windows' SETX command

I was sick of clicking through Control Panel > System > Advanced properties > Environment Variables just to update the system PATH variable. Fortunately, the new Vista build comes with the SETX command.

To add a directory to the system path, now all I have to do is run the following command in an administrator-mode command prompt:
SETX -m PATH "%PATH%;C:\path\to\add"
I believe it is also available to XP users by downloading the Windows XP Service Pack 2 Support Tools

Sunday, September 13, 2009

Freeundelete from OfficeRecovery.com

I got into the bad habit of doing straight file deletes on Windows, skipping the Recycle Bin altogether. To do this, simply hold down the Shift key when you press Delete. Explorer will still prompt you for confirmation, but anyone who has done this enough times will have their muscle memory take over and hit the Enter key right after.

Unfortunately, I did this to one of my pet projects, which I had been working on for a few days. Pet project also means no SCM and no backup. Lesson learned.

The good old undelete is no longer available from DOS! In a frantic search to recover the files, I came across the "FreeUndelete" tool available from OfficeRecovery.com, and it worked like a charm.

Although there were a few corrupted files, most of the files were restored without any problems. I hope I never have to use this tool again.

Sunday, September 6, 2009

RPX: Aggregating Identity Services as a Service

I recently listened to a podcast by Brian Ellin and Doug Kaye on IT Conversations. The topic of the day was a relatively new SaaS offering by JanRain. The service offers the ability to integrate with existing authentication and identity services, for example Facebook, Google, Windows Live, etc. Why ask your users to sign up for yet another account when user identity is not part of your core business?

There have been lots of attempts to contain the proliferation of user names, passwords and online accounts, but none that I know of really took the approach of integrating with existing services and then packaging that as a service. I think the guys at JanRain are onto something.

Check out RPX by JanRain @ https://rpxnow.com/

Friday, September 4, 2009

Digital Hardware Project

After some random browsing on the internet, I discovered that my digital hardware project is publicly out there.


I had ambitious dreams back then. I convinced my partner that we were going to build a VOIP application on an FPGA using merely the laboratory time in one course. I even managed to secure a $2,000 piece of hardware from Xilinx, as well as the expensive programming and simulation software. At the end of the course we didn't have an operational prototype; nonetheless, it was a valuable learning experience.

Without further ado, here is the write-up of the project.

Wednesday, August 26, 2009

Compaq 6910p close lid and freeze laptop problem

As the title suggests, the Compaq 6910p laptop from HP my company gave me freezes every time I close and re-open the lid. I finally had time to look into the problem.

Apparently, HP has a fix out for this issue. BIOS patch version F.17 (4 Nov 2008) addresses it.

The description of the fix is:

"- Fixes an issue where closing the notebook lid and leaving the notebook idle for several minutes causes the LCD display to be blank when the notebook lid is re-opened."


To check what version of BIOS you have, run "msinfo32.exe" from the command line.


Just download the executable from the link above and run it. You will have to disable BitLocker encryption if you are using Vista.

Otherwise, just leave your laptop alone and let the program flash your BIOS.

Your laptop will reboot itself. Run "msinfo32.exe" again to make sure you have the latest BIOS.

Friday, July 31, 2009

The Mythical Man-Month

I recently finished reading "The Mythical Man-Month" by Frederick P. Brooks, Jr. This book about software development was first published in 1975, yet more than 30 years on, a lot of the observations Mr. Brooks made are still very relevant. One of the takeaways for me was the importance of conceptual integrity. Nowadays, software is so complex and the turnover of developers so great that nobody has a good idea of how the application works anymore.

Combine the complexity of software with the get-rich-quick, self-entitled generation, and very often the conceptual integrity wasn't even sound in the first place.

Often I see people rotate on and off projects. Perhaps people take the whole joke about "cogs in a machine" too literally. It is assumed that a few weeks of "Knowledge Transfer" sessions suffice for the project to continue humming along.

This is why the recent paper by Tom DeMarco really struck a chord with me. With today's metric-centric approach to managing software development projects, Tom DeMarco came out to clarify his often mis-quoted phrase "You can't control what you can't measure". In fact, last week I came back from an internal week-long training. The entire training rallied around the quantitative portion of project management and de-emphasized the qualitative part.

Unfortunately, the discipline of project management seems to remain centered on workplans and metrics. There are droves of "managers" with glossy PMP certificates who ask their "subordinates" to create an MS Project plan. Then they put those plans together into a "master workplan" and chase people for their ETC. Where the heck is the "management"? These are merely administrative tasks. Yet companies continue to pay "Project Managers" big money to do nothing more than administrative tasks.

When the "project managers" who "manage" your software development project proudly proclaim they have never written a line of code, it is time to get out of that project.

Thursday, July 30, 2009

Wolfram|Alpha

The folks at Wolfram have an ambitious goal:
Wolfram|Alpha's long-term goal is to make all systematic knowledge immediately computable and accessible to everyone.
After trying it out a few times, it has proved to be a valuable tool. For example, a couple of weeks ago, my friends and I were talking about Y2K and the IT boom of the late 1990s and early 2000s. The topic of when the next IT boom would be (if one is coming) inevitably came up. Being a nerdy bunch, we all knew about the limitations of time representation in a lot of computer systems. Since many systems define time as the number of seconds since the Epoch (January 1, 1970), stored as a 32-bit signed integer, it is easy to see that eventually the number of seconds since the Epoch will exceed what a 32-bit signed integer can hold.

So when exactly is that time?

This is where Wolfram|Alpha comes in handy. Simply type "1 January 1970 + (2^31-1) seconds" into the input field.
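You can also confirm it with a couple of lines of Java (output shown as if the default time zone were UTC):

// 2^31 - 1 = 2,147,483,647 seconds past the Epoch, in milliseconds.
long maxSeconds = Integer.MAX_VALUE;
System.out.println(new java.util.Date(maxSeconds * 1000L));
// prints: Tue Jan 19 03:14:07 UTC 2038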

So it is January 19, 2038. That's still a long way away. Maybe all systems will have moved to 64-bit platforms by then, or maybe not?

Monday, July 20, 2009

The Scala Bandwagon

Learning a new language is not as easy as I thought, especially learning a whole new language paradigm. I am going to collect a list of useful Scala resources in this post and see where the rabbit hole leads me:

Official Scala Language website:
  • Learning Scala
  • Scala IDEs - I am so accustomed to IDEs that I can no longer go back to the vi days. This might actually be a bad thing, though. After all, it is a good idea to learn all the fundamental tools of a language before moving on to tools aimed at productivity.
Other Articles:
  • The Seduction of Scala - the first in a series of articles where Dean Wampler ventures into Scala. This is probably a good place for me to start, as his motivation and background for venturing into Scala are very similar to mine.
  • Mixing Java and Scala - A practical approach to introducing Scala into an existing Java project.
Other non-learning readings:
Update:
  • There has got to be an easier way than cutting and pasting links into a blog post. Lo and behold, delicious came to mind. I am ashamed to be so late to the delicious game. Here are my Scala bookmarks.

'IAMRICH' Mustang

On a nice sunny day on my way to a golf course, I came across a white convertible Mustang with an "IAMRICH" license plate.

Seriously, with a white Mustang?!

Friday, July 17, 2009

That's a wrap

After more than 2 years at THE project, it's a wrap. It has been a great experience for me, from meeting with executives to talk about vision and strategy, to staying up all night for production cut-over, to fixing code defects shoulder-to-shoulder with developers. It was an end-to-end experience.

After 2 releases, it was complete. As a proud nerd, I had to make a lot of sacrifices; sometimes technical elegance is not what the business needs. The difficulty of being an Architect is balancing all aspects of the project: weighing technical challenges, budget, business needs, ROI, and team mix and capabilities. It is truly a work of art which no methodology can replace.

Quality Center and Vista

My laptop was recently updated with the latest corporate image. The OS is Vista. All the fancy graphics came with a price.

As a consultant working on enterprisy projects, HP's Quality Center was inevitably part of the mix. The URL no longer works for me!

Well, it took some serious googling and investigation to find out what the problem actually is. It stems from the new security feature whereby processes no longer run as "Admin", while the ActiveX controls in QC require "Admin" access.

To get around this, I downloaded the standalone QC Explorer and made sure I ran it as an "administrator". Voila, back in business.

The URL for QCExplorer is at http://updates.merc-int.com/qualitycenter/qc90/others/qcexplorer/index.html

Thursday, July 16, 2009

Year over year improvement


Thanks to a friend of mine who introduced me to the game, I started playing golf seriously 4 years ago. I can't believe it has been 4 years! Adhering to the characteristics of a true nerd, I have gathered some statistics on my game to chart my progress. So far, progress looks good, but I am beginning to see the curve leveling off (not good!). Oh well, it is hard to keep up a linear improvement while only being a weekend golfer with a full-time job!

Unfortunately, I was not rigorous enough in keeping other important statistics such as fairway hits, putts and greens in regulation. I should really start keeping these stats to help chart my improvement, and to do a high-level analysis of which areas of my game I could focus on.


Wednesday, June 3, 2009

Micro-economy within a company

I had this idea of a corporate currency used internally within a company to "reward" people. This virtual currency could directly or loosely correlate to a person's bonus or salary. It looks like this concept is used at Linden Lab, the maker of Second Life.

Listen to Philip Rosedale's podcast, which touches on this topic, here:

Sunday, April 19, 2009

Useful SAP Transactions

I worked on a consulting project laying down the roadmap for consolidating a client's customer identity and access management.  Like a lot of large enterprises, they use SAP.  Through various interviews and meetings, plus some common sense, it became pretty clear that a lot of the client employees' knowledge is either:
  • Siloed to their particular department or function
  • Out-of-date or plain wrong
So instead of just asking around some more, I went straight to "the system".  Before this project, I had kind of heard that SAP is notorious for its user-unfriendliness.  Now I have experienced it first hand.

After a few months of reverse engineering the client's organization through interviews and system analysis, I have come up with a list of useful SAP transactions (some for R3, some for CRM, some for both):

XD01-03 - View/Edit Customer Master Record
PIDE - View/Assign classification to Account Group.  Use for synchronization between R3/CRM
SE38 - View/Edit Programs
SE37 - View/Edit Function & Modules
XDN1 - Customer # Range
OBAR - Assign Customer Account Group to Number Range
SE16 - Browse SAP tables (useful CRM tables include BUT000, BUT051)
SE11 - Maintain SAP Tables
SU53 - Display authorization data (of your login account)
WE02 - IDOC List
SM59 - RFC configuration
SM58 - Transactional RFC error log
BD87 - Status monitor for ALE messages
SE93 - Maintain transactions
SALE - Display IMG (contains stuff about ALE)
SM04 - User List (display a list of connections to SAP)

Useful program
RBDMOIND - Update status in WE02

Monday, April 13, 2009

Functional Programming

In my Google Reader subscriptions, the words "Functional Programming" kept re-appearing every so often.  I finally took the plunge to look into what all the buzz is about.

After a few laps around the internet, an answer on Stack Overflow gave a pretty good elevator pitch about all the fuss around FP.

http://stackoverflow.com/questions/411290/why-do-people-think-functional-programming-will-catch-on
Just as graphical user interfaces and "code as a model of the business" were concepts that helped OO become more widely appreciated, I believe that increased use of immutability and simpler (massive) parallelism will help more programmers see the benefits that the functional approach offers. 
Since my background is in Java, rather than starting from scratch with a purely functional language like Haskell, Scala seems a good place to start.  Also, joel.neely made a good juxtaposition of FP and pure FP languages like Haskell against OO and pure OO languages like Smalltalk:
However, languages that enforce a functional style are getting lots of virtual ink these days, and whether those languages will become dominant in the future is an open question. My own suspicion is that hybrid, multi-paradigm languages such as Scala or OCaml will likely dominate over "purist" functional languages in the same way that pure OO language (Smalltalk, Beta, etc.) have influenced mainstream programming but haven't ended up as the most widely-used notations.

TinyURL

I recently came across this question on Stack Overflow about someone wanting to write a URL shortener service like TinyURL.


Upon further reading, it occurred to me that TinyURL probably uses a similar table look-up implementation.  I.e.:
  1. In "http://tinyurl.com/dbeeod", "dbeeod" is the key, and the looked-up value is "http://www.infoq.com/presentations/Making-Roles-Explicit-Udi-Dahan"
  2. TinyURL then simply redirects from the key to the looked-up URL.
Seems like a logical way to implement a solution to this problem; a sketch of the key scheme follows.
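For fun, here is a minimal sketch of how the key half might work. The usual trick is to base-62 encode an auto-incrementing row ID (the alphabet and scheme here are my guess at the technique, not TinyURL's actual code):

public class ShortUrlCodec {
    private static final String ALPHABET =
            "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";

    // Database row ID -> short key, e.g. 1234567 -> "flkx".
    public static String encode(long id) {
        StringBuilder key = new StringBuilder();
        do {
            key.insert(0, ALPHABET.charAt((int) (id % 62)));
            id /= 62;
        } while (id > 0);
        return key.toString();
    }

    // Short key -> row ID, used to look up the long URL for the redirect.
    public static long decode(String key) {
        long id = 0;
        for (int i = 0; i < key.length(); i++) {
            id = id * 62 + ALPHABET.indexOf(key.charAt(i));
        }
        return id;
    }
}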

I have been using TinyURL for links in my tweets on Twitter. Then it dawned on me: if TinyURL goes down or goes out of service, all the TinyURL links will be broken.

Saturday, April 11, 2009

Weak type vs Strong type Languages

Or is it short-term programmer productivity vs long-term code maintainability?

Facebook is a good example: using PHP, they achieved the up-front benefit of programmer productivity, which served their initial primary goal of pushing the product out the door.

In the long term, though, as Aditya Agarwal, Director of Engineering at Facebook, puts it, the weakly typed PHP in which Facebook is written is making its large code-base difficult to maintain and analyze.


The way Facebook addresses this issue is an interesting one.  While PHP remains the front-end web programming language, a good chunk of their services is written in strongly typed languages such as C++.

Perhaps web programming should follow this evolution: weakly typed languages such as Ruby, PHP and Python are used for version 1.0 to push the product out the door; then, as the product matures and the code-base grows, the services are refactored into a strongly typed language?

OSGi

I have heard a lot about OSGi, but never really got a chance to look into what it is all about.  Today I finally got that chance to do some digging, and found a very good introductory article/tutorial from JavaWorld:


A very powerful concept for large enterprisy projects.

The second part of the JavaWorld OSGi series covers Spring DM.  Another great article to learn the concepts!

Friday, April 10, 2009

Custom Spring Security Authentication

I recently answered a question on Stack Overflow on how to create custom authentication in Spring Security.  I think it is a pretty good quick-start guide.
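For future reference, the gist of the answer is to implement your own AuthenticationProvider and plug it into the provider manager. A rough sketch (Spring Security moved these classes between packages across 2.x and 3.x, so treat the imports as indicative; the user-store check is a stand-in for your own look-up):

import java.util.Collections;

import org.springframework.security.authentication.AuthenticationProvider;
import org.springframework.security.authentication.BadCredentialsException;
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.AuthenticationException;
import org.springframework.security.core.GrantedAuthority;

public class MyAuthenticationProvider implements AuthenticationProvider {

    public Authentication authenticate(Authentication auth)
            throws AuthenticationException {
        String username = auth.getName();
        String password = String.valueOf(auth.getCredentials());
        if (!checkAgainstMyUserStore(username, password)) {
            throw new BadCredentialsException("Bad credentials");
        }
        // Return a fully authenticated token; real code would grant roles here.
        return new UsernamePasswordAuthenticationToken(
                username, password, Collections.<GrantedAuthority>emptyList());
    }

    public boolean supports(Class<?> authentication) {
        return UsernamePasswordAuthenticationToken.class
                .isAssignableFrom(authentication);
    }

    // Stand-in for your own user store look-up.
    private boolean checkAgainstMyUserStore(String username, String password) {
        return "demo".equals(username) && "demo".equals(password);
    }
}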

Monday, March 30, 2009

Orthogonality

Google Reader.  One of my favourite tools, which lets me aggregate some of my favourite sources of information.  It has also expanded my horizons to the vast knowledge and information out there on the Internet.

One of the recent additions to my Google Reader subscriptions is Jeff Atwood's Coding Horror. Through his blog I also picked up some very interesting blogs, such as Eric Lippert's Fabulous Adventures In Coding.

An interesting read I got recently from these blogs originates from Eric Lippert's Five Dollar Programming Words: Orthogonality.

My first encounter with this word was probably in an introduction to geometry, where it means "at right angles".  But this word, in the programming or IT solution sense, really rang a bell with me.

Recently, on my consulting gig, I was working on a solution to a business problem which was very "difficult".  Difficult not in the sense that the solution was difficult to understand, nor was it difficult to implement.  I didn't quite get to the bottom of the root cause until this five-dollar word so succinctly put it: the solution to the problem lacks orthogonality.

In other words, a small change to a requirement or approach often leads to changes in other use-cases or other areas of the solution.  Unfortunately, I think I am past the point of no return, i.e. there is no time to go back and re-work the approach.

Nonetheless, this word has had a profound impact on my approach to problem solving and solution design.  Next time around, I will put much more time and emphasis into improving the orthogonality of a solution.

Introduction to the Credit Crisis

A great introduction to the credit crisis: The Crisis of Credit Visualized.  Probably an over-simplification, but nonetheless a great introduction.

Joel's Big Macs vs. The Naked Chef

I have been working at one of those IT consulting companies that Joel commented on in his article Big Macs vs. The Naked Chef, and I couldn't help nodding my head in agreement while reading it.

Most of the time, I whole-heartedly agree with Joel's points.  This time around, however, I have to disagree with him on the whole notion of "Beware of Methodologies".  I worked at one of those companies where these methodologies are created and taught. They are a great way to standardize the lingo across geographies and a diverse workforce.  After studying the methodologies and the tools around them, it is pretty clear to me they were written by some very smart people.  One of the tools I am glad I got a chance to use is the estimators.

Even though the estimators are written in, god forbid, Excel and VBScript, the outcome is an excellent framework within which to estimate projects.  Each field has its own estimator, and each estimator is different.  No doubt some are better than others; nonetheless, it is one way to put structure into the art of project estimation, which I argue even the most talented programmer, project manager or genius can benefit from.

I want to emphasize this one more time: these methodologies and tools are merely frameworks or guidelines.  I.e., if a methodology lists a deliverable which doesn't make any sense for your particular project, document exactly why it doesn't make sense to do it, and don't do it.  The estimator is an even better example of this.  The custom application development estimator lists every activity in the custom application development methodology, which in turn lists every activity in custom application development known to mankind.  By default, after you enter all your inputs, the estimate is going to be insanely huge!  This is normal.  The next step in the estimation phase is to go through the tasks, assess the client's needs, and rationalize those needs against the tasks.  Scale back a task's estimate if it is too high; drop the task if it is not needed.  Bottom line: work with the client, and document the decisions on scaled-back or dropped tasks.

More often than not, even the smartest person cannot foresee everything, or he/she assumes something does not need to be done.  It is often the aggregation of a missed task here and a wrong assumption there which causes consulting projects to fail.  In other words, these methodologies enable people to learn from others' past experience.

Don't get me wrong, these methodologies are by no means perfect, but in my opinion they are a great framework to work within.  The problem is not the tools or methodologies themselves, but the people using them.  Not-so-talented people often treat these tools or methodologies as the "silver bullet".  Instead, they are merely guidelines or frameworks within which one can work.  They are not an end, but merely a means to achieve the desired end result.

After all, it is garbage-in, garbage-out, and there is no black magic or cure for that.

Sunday, March 29, 2009

Immutability vs Readability - The Spring bean injection method question

I have worked with Spring for a while on the side, in pet projects and experiments, but not until recently did I get a chance to use it in an "Enterprisy" environment. I didn't quite expect this discussion to ever come up, and I was genuinely impressed when a manager actually brought up the topic of the approach to Spring bean configuration.

SpringSource recommends configuring Spring beans using constructor injection. This allows the beans to be immutable, and immutable objects improve the orthogonality of the code.

The manager on the project said the company recommends bean configuration through properties. The reasoning she provided is improved readability. This is true, since the Spring configuration of beans injected through the constructor does not indicate which "property" is being set. However, given the integration of today's IDEs with Spring, one can quickly look up the source code to determine the property being set via the constructor. If a third-party library is being used, IDEs these days are also tightly integrated with javadoc, where you can quickly look up the parameter you are setting.
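To make the trade-off concrete, here is a hypothetical bean wired both ways (the bean and its dependencies are made up for the example):

import javax.sql.DataSource;

// Constructor injection: fields can be final, so the bean is immutable
// and can never be observed half-configured.
public class ReportService {
    private final DataSource dataSource;
    private final int batchSize;

    public ReportService(DataSource dataSource, int batchSize) {
        this.dataSource = dataSource;
        this.batchSize = batchSize;
    }
}

// The matching XML says nothing about which field each argument sets,
// which is exactly the readability complaint:
//
//   <bean id="reportService" class="example.ReportService">
//     <constructor-arg ref="dataSource"/>
//     <constructor-arg value="100"/>
//   </bean>
//
// Property injection reads better ("batchSize" is right there), at the
// cost of a setter and a mutable bean:
//
//   <bean id="reportService" class="example.ReportService">
//     <property name="dataSource" ref="dataSource"/>
//     <property name="batchSize" value="100"/>
//   </bean>

Note that Spring does let you disambiguate constructor arguments with the index= or type= attributes on constructor-arg, which recovers some of the readability.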

Given all the advantages of immutable objects and the improved orthogonality of the code, I would argue strongly that they outweigh the "inconvenience" or "decrease in readability" of the configuration file. I am a big fan of highly orthogonal code and systems, but that is a topic all by itself for another post. For now, see Eric Lippert's blog on orthogonality in the context of programming @ http://blogs.msdn.com/ericlippert/archive/2005/10/28/483905.aspx

Like all things in life, I think constructor injection is a rule of thumb; it should by no means be followed blindly. Constructor injection is hard to read because arguments are matched by order or type; once you have a certain number of properties to set, and a lot of them are of the same type, it gets really confusing. When you reach that point, it is important to re-evaluate the original position: is making the bean immutable worth the price paid in readability?