Java

Atlanta Java DevCon 2004

Finally, an inexpensive Java symposium. With conferences such as JavaOne ($1800) and TheServerSide Symposium ($1200) pushing the limits of IT training budgets, and even local events like the NFJS conferences ($500) priced so that companies can only send a fraction of their development team, I’m excited that AJUG has decided to present AJUG Java DevCon 2004 at a bargain price of $50.

The $50 price tag makes it possible for companies to send their entire team, or even for individuals to realistically pay their own way. Of course, there isn’t the full roster of nationally renowned speakers like at JavaOne or TSS Symposium, but you do get the Pragmatic Andy Hunt, Art of Java Web Development author Neal Ford, Sun expert Richard Manning, and a few other folks.

On a side note, I have been accorded the honor of speaking at this event. The session is called “Leveraging J2EE Security”, and the focus will be on concepts similar to “J2EE Security is superior, but you aren’t using it”. It’s not the sexiest, most cutting-edge topic. Rather, much like this weblog, it aims to address a real meat-and-potatoes issue that can be put to use in many applications with an immediate, positive impact. Whether or not you come to my session, I hope you’ll attend the event if you’re in the Atlanta area – I believe it will be one of the best value-for-money events available for deepening your Java skills.

Java

Easy Hibernate Transactions

Trying to manage transactions can get ugly. Sure, you can gloss over the issue, treating each insert/update/delete as an atomic transaction – if you like your database in an inconsistent state, or have a trivial database. At some point, though, you’re likely to decide you need transactions, or else your logic attempting to keep the DB consistent will become unmanageable. At first, transactions in Hibernate seemed cumbersome to me – I needed to pass the Hibernate Session around to every persistence method that I wanted to participate in the transaction. Then I discovered a better way.

In using Hibernate for persistence, I often happened upon discussions of ThreadLocal, but frankly found the term a bit unintuitive and intimidating. Let me distill it so you don’t have to wade through it all – for Hibernate’s purposes, a ThreadLocal allows you to create a singleton per thread. In most web applications, a single request follows a single thread of execution until it returns a response to the user. By using this kind of ThreadLocal object, we can access the same session throughout the web request without having to pass it around.
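
If the term is still fuzzy, here is a toy example (nothing to do with Hibernate) that shows the singleton-per-thread behavior – both threads read the same static field, but each sees a value private to itself:

public class ThreadLocalDemo {

   // Each thread that calls get() lazily receives its own copy.
   private static final ThreadLocal holder = new ThreadLocal() {
      protected Object initialValue() {
         return "created for " + Thread.currentThread().getName();
      }
   };

   public static void main(String[] args) {
      Runnable task = new Runnable() {
         public void run() {
            // Same field, different value per thread.
            System.out.println(holder.get());
         }
      };
      new Thread(task, "thread-A").start();
      new Thread(task, "thread-B").start();
   }
}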

Here are the general pieces of a suggested approach:

  • Servlet Filter – the servlet filter intercepts each request and, once the response is complete, closes the session that was opened for the request. The session itself lives in a companion Persistence class, which stores it in the ThreadLocal and exposes a getSession() method that gets or creates the session for the current thread. A sketch of both appears after this list.
  • Data Service Classes – the classes you call to get your data. These classes request their session using Persistence.getSession().
  • Business Logic Classes – these classes encapsulate the business logic, including the logic surrounding multiple database interactions.
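
Here is a minimal sketch of what the Persistence class might look like. I’m assuming the Hibernate 2.x package names (net.sf.hibernate) and a hibernate.cfg.xml on the classpath; the rollback() helper is just a convenience for the business logic shown later:

import net.sf.hibernate.HibernateException;
import net.sf.hibernate.Session;
import net.sf.hibernate.SessionFactory;
import net.sf.hibernate.Transaction;
import net.sf.hibernate.cfg.Configuration;

public class Persistence {

   private static final SessionFactory sessionFactory;
   private static final ThreadLocal threadSession = new ThreadLocal();

   static {
      try {
         // Reads hibernate.cfg.xml from the classpath.
         sessionFactory = new Configuration().configure().buildSessionFactory();
      } catch (HibernateException e) {
         throw new ExceptionInInitializerError(e);
      }
   }

   /** Get the session bound to this thread, creating it on first use. */
   public static Session getSession() throws HibernateException {
      Session s = (Session) threadSession.get();
      if (s == null) {
         s = sessionFactory.openSession();
         threadSession.set(s);
      }
      return s;
   }

   /** Close and unbind this thread's session, if one was opened. */
   public static void closeSession() throws HibernateException {
      Session s = (Session) threadSession.get();
      threadSession.set(null);
      if (s != null) {
         s.close();
      }
   }

   /** Roll back quietly, so the original exception stays the interesting one. */
   public static void rollback(Transaction tx) {
      if (tx != null) {
         try {
            tx.rollback();
         } catch (HibernateException e) {
            // log it; there is nothing better to do here
         }
      }
   }
}

The filter itself stays tiny – it simply guarantees the session is closed when the request finishes, no matter what happened downstream:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

public class HibernateSessionFilter implements Filter {

   public void doFilter(ServletRequest request, ServletResponse response,
                        FilterChain chain) throws IOException, ServletException {
      try {
         chain.doFilter(request, response);
      } finally {
         try {
            Persistence.closeSession();
         } catch (Exception e) {
            // log it; the response has already been handled
         }
      }
   }

   public void init(FilterConfig config) throws ServletException { }

   public void destroy() { }
}

Map the filter to /* in web.xml and every request gets this behavior for free.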

So a method of your Data Service Class might look like this:

public static void update(User u) throws HibernateException {
   Session sess = Persistence.getSession();
   sess.update(u);
}

A method in your business logic could look like this:

public void updateBoth( User user1, User user2 ) throws HibernateException {
   Transaction tx = Persistence.getSession().beginTransaction();
   try {
      UserService.update(user1);
      UserService.update(user2);
      tx.commit();
   } catch (HibernateException e) {
      Persistence.rollback(tx);
      throw e;  // surface the failure instead of swallowing it
   }
}

Instead of writing cumbersome open/close session logic in our persistence classes, or passing the Session into each method, this approach lets us cleanly share a session across multiple persistence calls in a simple, thread-safe manner. This discovery has made my persistence code much cleaner, and I hope that you find it beneficial as well!

Java

JDO 2.0 – A Usable Spec?

The JDO 2.0 JSR has been submitted with the following major goals:

1. Make JDO easier to use.
2. More closely align JDO with J2EE.
3. Standardize JDO’s support of relational databases.
4. Broaden the scope of JDO to include more vendor persistence architectures.

#3 is perhaps the most critical. In current JDO implementations, the O-R mapping approach is completely defined by the vendor. Although the vendors’ approaches are probably similar, and migration won’t be completely brutal, the lack of a standard is probably a key factor discouraging adoption of JDO.

Other key features in the JSR include:

  • Disconnected operation – allow users to extract objects from the database, disconnect, modify, and resubmit changes. It sounds like it tracks a unique identifier for disconnected objects and uses this to determine what data to update (see the sketch after this list).
  • Interface persistence – in JDO 1.0, interfaces couldn’t be persisted. Just classes. Fair enough.
  • Binary compatibility not required – in JDO 1.0, there was apparently a contract the vendor had to conform to when they enhanced the bytecode. I have no real idea why this was required, so it’s good that it’s going.
  • J2EE alignment – the main thrust of this is a cleaner transaction API.
  • JDOQL enhancements – adding aggregates, string manipulation, better range of return values. This is evidence of the maturing of the technology.
  • Enhanced relationships – Bidirectional and composition relationships MAY be addressed.
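
To make the disconnected-operation idea concrete, here is a rough sketch of how it might look in code. The spec is still being written, so treat this as a guess: detachCopy() and the reattach-on-makePersistent() behavior reflect the proposal as I understand it, and User/UserId are hypothetical classes:

import java.io.FileInputStream;
import java.util.Properties;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;

public class DisconnectedDemo {

   public static void main(String[] args) throws Exception {
      Properties props = new Properties();
      props.load(new FileInputStream("jdo.properties"));
      PersistenceManagerFactory pmf =
            JDOHelper.getPersistenceManagerFactory(props);

      // Extract an object, then disconnect from the datastore.
      PersistenceManager pm = pmf.getPersistenceManager();
      pm.currentTransaction().begin();
      User user = (User) pm.getObjectById(new UserId(42), true);
      User detached = (User) pm.detachCopy(user);  // proposed JDO 2.0 call
      pm.currentTransaction().commit();
      pm.close();

      // Modify offline -- no PersistenceManager anywhere in sight.
      detached.setEmail("new@example.com");

      // Resubmit the changes later on a fresh PersistenceManager;
      // the tracked identity tells JDO which row to update.
      PersistenceManager pm2 = pmf.getPersistenceManager();
      pm2.currentTransaction().begin();
      pm2.makePersistent(detached);  // proposed reattach semantics
      pm2.currentTransaction().commit();
      pm2.close();
   }
}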

Of course, other minor tweaks will creep in. In any case, a recent SolarMetric presentation on JDO led me to conclude that JDO 1.0 hasn’t quite hit its stride, but JDO 2.0 will have enough of the candy that O-R mapping junkies are used to that it can start to pick up some momentum. Thankfully, the expert group has kept the spec manageable up to this point, so migration to JDO remains a realistic prospect (for an anti-pattern, look at JSF).

Java

Is UML the problem with MDA?

I’ve struggled again and again with MDA – something just doesn’t feel right. Last night at AJUG, Jon Kern of Agile Manifesto fame presented his view of MDA, and it was at least refreshing – if Compuware’s marketing continues along the lines of his approach, they have a chance. His take was very pragmatic: there is no silver bullet, and MDA certainly isn’t one, but it can be a valuable tool in the process. It is NOT about just drawing a picture and pushing a button to generate the app; rather, it is a tool that can be used to ensure a common architecture, enforce separation of concerns, and enable rapid development.

Now none of these things sounds inherently bad, and my use of XDoclet suggests that I don’t have a problem with code generation in general, so what’s the problem? Then it struck me that perhaps my problem with MDA is that UML is the starting point. MDA tools seem like a natural fit for development teams already using strict UML as the starting point of their development process. For them, the MDA tool is, at worst, a different tool with which to create their UML diagrams, and at best, a tremendous time saver – the choice is theirs. There is still the question of whether you really want your senior developers’ primary job to be monkeying with the tool to make it generate the right architecture, and of how many expert developers will embrace that role, but that’s another discussion.

The problem comes with teams that don’t use formal UML as their starting point. UML is used to varying degrees by different teams – in addition to the strict approach and the obvious not-using-it-at-all approach, many practitioners I know prefer to use “loose” UML to convey ideas. At my previous employer, we all had Rational Rose AND Visio Enterprise Edition. The bulk of the time, developers chose Visio over Rose. The reason? Rose was preoccupied with making sure all of the rules of UML were followed, and spawned all manner of warnings. In Visio, developers could express the idea they were trying to convey without worrying about whether a line had the appropriate diamond at the end of it – the point was that there was a line. The audience for these diagrams was human, capable of inferring meaning and asking simple questions in face-to-face meetings to fill in the gaps. An MDA tool, by contrast, must have all of the particulars spelled out for it in order to generate code correctly, or it will surely frustrate users with its incorrect inferences and assumptions.

Vendors will suggest that any team can benefit from MDA. The cynical side of me responds by savoring the mock acronym posted in a TSS discussion thread: “Massively Documented Advertising”. An honest assessment seems to indicate that the usefulness of an MDA tool to a team is closely tied to its degree of UML adoption. To claim that MDA is one-size-fits-all, you must also argue that adopting strict UML makes sense for every team. MDA can be viewed as code generation where UML serves as a powerful metadata language. If you already have suitable UML, fantastic. If you don’t want to use UML in that formal sense, other forms of metadata may suffice – see the sketch below. In some cases, even that’s overkill.
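
To make “other forms of metadata” concrete, this is the XDoclet style I have in mind – plain javadoc tags on an ordinary class drive the generation of Hibernate mapping files, with no UML in sight. The User class here is purely illustrative; the tag names come from XDoclet’s Hibernate module:

/**
 * @hibernate.class table="USERS"
 */
public class User {

   private Long id;
   private String name;

   /**
    * @hibernate.id generator-class="native"
    */
   public Long getId() { return id; }
   public void setId(Long id) { this.id = id; }

   /**
    * @hibernate.property column="USER_NAME"
    */
   public String getName() { return name; }
   public void setName(String name) { this.name = name; }
}

Run XDoclet’s Ant task over this and out comes a User.hbm.xml – metadata-driven code generation, no diagrams required.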

General

Georgia Tech and the Ripoff Loss

My Georgia Tech Yellow Jackets had a great, unexpected run deep into the NCAA tournament, the best run in school history. Coach Paul Hewitt is a class act, and I’m glad we’ve locked him down for many years to come. Next year looks solid, with only 2 of the 9 players in our regular rotation graduating (Clarence Moore and Marvin Lewis).

I’m still in mourning over that game, and quite irritated to boot. On the one hand, there are the things you can’t help. In each game, a player stepped up and led the team – whether it was Will Bynum, Marvin Lewis, or Jarrett Jack, someone was on. Last night, nobody had his shot. It was as if the rim had been put under a voodoo curse, with every shot clanging off the iron and into the hands of a Connecticut player. While UConn played a role in that, a lot of it was well-set-up, well-aimed shots that just wouldn’t roll right on this night.

What could have been helped was the completely inept refereeing that has been prominent throughout the tournament, but took center stage in this game. From start to finish, light taps and even no-contact shots were called as fouls against the Yellow Jackets. Tech players would get shoved, charged, and body checked for a no-call. Over the back on rebounds, body fouls on shots, and hacks on the dribble were ignored, causing several losses of possession.

The Huskies would barrel through defenders, completely out of control, and not get called for the player control foul. Meanwhile, Luke Schenscher (who has a posse) would get a clean block or a hand up in front of a shooter and get whistled for a foul. Georgia Tech plays a physical game, and if the officials want to call a game to minimize physicality, that’s fine, as long as it’s consistent. The “activist officiating” (tongue-in-cheek) witnessed last night was one-sided, allowing UConn to play a fierce, physical game, while disrupting the Yellow Jackets’ tempo regularly with trivial touch fouls.

Georgia Tech was clearly outplayed last night, but it’s tough to say what might have been if the refs had swallowed their whistles instead of allowing Tech players to get mugged, losing rebounds and opportunities because of it, while giving UConn a free pass to the free throw line at every opportunity. Perhaps B.J. Elder could have found his shot, Ish could have defended more strongly, Bynum and Jack could have found some driving room. We’ll never know. But I still give great respect to both teams – UConn won, and the reffing was not their fault, and this Tech team beat all expectations to get to this point.