Eclipse Galileo Review

It's that time of the year again... here's my short review of Eclipse Galileo, which I've been using since M6.

While reading this review, keep my usage pattern in mind: programming for a large portal written in Java using frameworks such as Spring and Hibernate. Domain objects are mostly annotated, with some Hibernate XML thrown in. Spring configuration is still using XML for now.

Here are the things I like about this release so far:

  • HTML tags now auto-close. Same for XML tags; if you type "/" at the end of an open tag, the close tag is removed. Not a huge feature, but it's the kind of polish you find in IntelliJ that's starting to make its way into Eclipse.
  • Fully qualified class names are hyperlinked in XML files! This is a HUGE help for our team, as navigating those Spring XML files is always tedious.
  • Go to implementation. When you hover on a method call, you get a small pop-up that lets you choose between going to the declaration or the implementation. I'm still catching myself going to the declaration and pressing CTRL-T, but I'll get myself re-trained eventually. It would be nice if this feature had a shortcut key.
  • Switching editors with CTRL-PageUp and CTRL-PageDown. This used to switch between an editor's internal tabs when it had them, to my great annoyance when I had XML files open. I never use the "design" view for XML, so having to hit CTRL-F6 or reach for the mouse just because of a design view I never use got old fast.
  • Rectangular/Column/Block selection mode. This used to require a plug-in, but it's nice to have it built in. This was one of the few things that had me fire up vi for text editing once in a while...
  • The compare editor is now a much more complete Java language editor, with completions and so on. Since I often have to resolve annoying conflicts by hand when merging old branches, this is a welcome addition.
  • Hyperlinks in JavaDoc pop-ups. Man, this is great! How often I've wasted time closing the JavaDoc popup and navigating to the parent class or to "see also" links by hand... Or even given up and gone to the web browser to browse the online HTML doc. In my mind, anything that saves me from switching to a different app just to navigate documentation is a Good Thing; it's one less hurdle to staying in the Zone.
  • The previous version of WTP had an annoying bug in JSPs: malformed tags would eventually lead to a stack overflow on complex pages, on every keystroke, until you closed and reopened the file. I haven't encountered this problem with Galileo.

But the major plus in my mind is the new plugin installation UI, known as p2. It's still not up to par with the Debian package manager front-ends, especially when explaining conflicts or missing dependencies. But it's much less confusing than previous efforts. I may actually use it to upgrade Eclipse next time around, something I never dared to do before.

What's to dislike? Well, nothing much. I still wish they'd implement proper themes for syntax highlighting, since the Export Preferences system exports all preferences. It's possible to edit the resulting file to strip non-highlighting related data, but it's a pain. Don't underestimate the power of syntax highlighting themes! Programmers like to share those, especially when the default theme is dark on light instead of the much less eyestrain-inducing light on dark.
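In the meantime, the stripping can be scripted. Here's a minimal Python sketch, assuming the exported .epf is a properties-style file of key=value lines; the substrings I match on are guesses, so check them against your own export:

```python
# Minimal sketch: keep only the syntax-coloring entries from an exported
# Eclipse preferences (.epf) file. The key substrings below are guesses;
# inspect your own export to find the right ones.
COLOR_KEY_HINTS = ("color", "Color", "bold", "italic", "underline")

def filter_highlighting(lines):
    """Keep header lines ('#'/'!') and lines whose key looks color-related."""
    kept = []
    for line in lines:
        if line.startswith(("#", "!")):
            kept.append(line)
            continue
        key = line.split("=", 1)[0]
        if any(hint in key for hint in COLOR_KEY_HINTS):
            kept.append(line)
    return kept
```

Feed it the exported file's lines and write the result out as your "theme" file.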

Also, I wish installing Subversion integration wouldn't involve adding third-party connectors that sit in a different update site. I know it's related to licensing, but this is somewhat annoying. SVN support should be built-in, given that CVS is largely deprecated now.

Oh, and this isn't part of the Eclipse Galileo release, but the Maven plugin for Eclipse is really shaping up nicely. We've converted all our active projects to Maven, and it's really convenient to be able to just grab a project out of the repository and start working immediately. Not to mention how nice the new archetype UI is.


Again with more gadgets

Well, my daughter was born this September, and she's still being a bit grouchy, so you'll forgive the lack of posts.

First of all, while she was sleeping in the first months and I was exiled to the living room as the little one was co-sleeping (before we discovered that she really hates it and would rather be in her own bed... She's strange, that one), I found time to update cbrPager somewhat. The main gain was the addition of background threads to pre-load pages. Oddly enough, page switching, while twice as fast, is still slow; it looks like the N800's relatively puny processor is having trouble just scaling the bitmap. I can't really scale it at load time, because the user may ask for a different size later. Since then, very few changes; there's one pending feature request which I'll get around to implementing someday.
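For the curious, the pre-loading idea itself is simple (cbrPager is C; this Python sketch, with invented names, just illustrates the technique): a worker thread loads the next page into a cache so that a page flip mostly finds its data already decoded.

```python
# Illustration of the pre-loading idea (cbrPager itself is C; all names
# here are invented). A worker thread loads the page after the current
# one so that page-flips mostly hit the cache.
import threading

class Preloader:
    def __init__(self, load_page):
        self.load_page = load_page        # the expensive part: unzip + decode
        self.cache = {}
        self.lock = threading.Lock()

    def prefetch(self, index):
        """Start loading a page in the background; returns the thread."""
        def worker():
            data = self.load_page(index)  # done off the UI thread
            with self.lock:
                self.cache[index] = data
        t = threading.Thread(target=worker, daemon=True)
        t.start()
        return t

    def get(self, index):
        """Return a page, from the cache if the prefetch finished in time."""
        with self.lock:
            if index in self.cache:
                return self.cache.pop(index)
        return self.load_page(index)      # cache miss: load synchronously
```

The scaling step stays on the critical path, which matches what I'm seeing: even with pages pre-loaded, the flip still waits on the bitmap resize.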

Second, I decided to take the plunge after much consultation with online price comparison engines and reading reviews, and, basically, the result is that I'm writing this from an Acer Aspire One AOA110-1698 netbook. It's going to replace my desktop completely. In doing this, I get two advantages:

  1. I hope to free the office to become the little one's room as she grows older. I'll only require a desk that can hold the 22" LCD, the netbook in front, the external hard drive and DVD-ROM, and a keyboard drawer. This means not much depth is required, which gives us a lot of flexibility on the size of desk I can live with.
  2. I can actually write this while watching the Dear Daughter sleep.

The latter was not part of the original plan, but I've got to admit, the Atom is a pretty amazing chip. It's faster than my old Athlon XP 2000+, and the GMA945 graphics, while not really the speediest in the world, are still quite a bit faster than my old Radeon 9000. Now, I don't recommend using that machine as a primary machine unless you're upgrading from an old clunker like mine; if right now you have a quad-core bomb, this little netbook won't impress you too much. But it's plenty fast enough for my needs; I even got some programming done on it.

Acer Aspire One Mini Review

This is turning into a mini review, so let's just get it done so I can gush about my daughter at the end of the article :) First, let's cover the external appearance of the device.

When you get that machine out of the box, it can't help but turn heads. Everybody at the office wanted to have a look at the thing.

The shiny body, as often mentioned, attracts fingerprints more than anything else, so it doesn't impress me that much. Build quality is good, feeling as solid as my Dell corporate lappy.

The touchpad is quite small, and this can be annoying; unless you give it very high acceleration, you'll have trouble traversing the screen in a single swipe. I highly recommend using an external mouse for any kind of serious work. Still, it's built-in, and I still use it quite a bit when not at the desk.

The keyboard is the main reason I chose the One over, say, the Asus EeePC 900A or the Dell Mini 9. It's simply amazing. It's as usable to me as a full laptop keyboard, and the key stiffness is even better. Key travel is great. Acer can do really nice hardware when they care, so it's too bad their stuff is usually so crappy. But the One feels more solid than the cheaper 14" laptops you'll find in brick-and-mortar stores (such as the really crappy 14" eMachine laptop that sells for the same price--trust me, unless you have to have a 64bit machine, the One is a better box). I only have one complaint about the keyboard: the left control key doesn't work very well, sometimes not registering. I solved that by remapping my caps lock key to control, which I do on regular PCs anyway. If the unit had been bought in a store, I would have returned it, but it's not worth the hassle since I have an alternate solution. One day, I'll take the keyboard apart and try to figure out if it's just a particle stuck under the keycap or something.

The screen is fantastic. At lowest brightness, it's very clear and sharp. You can plug in an external monitor. The graphic card appeared to give a fuzzy image at first, but after messing around with modelines in X, I got it to be pretty sharp. Too bad there's no DVI connector, but there's no room for both VGA and DVI, and VGA is still more widespread, so I understand the decision.

After having to crawl under my desk to reach rear-facing USB ports on my desktop, the side USB ports (3 of 'em, just enough for two external drives + the keyboard/mouse connected to a USB/PS/2 adapter) are really a blessing, and an unanticipated benefit to switching to a netbook from a desktop.

The 1698 comes with a 6-cell battery. It adds a silly protrusion at the back, and I'm sure they could have come up with a better design. But do get a 6-cell. This thing has better battery life than my N800, and it shows... I don't use the N800 for web surfing much anymore (it's still getting much use as an e-book/comic book reader/time waster game machine). With the 6-cell, it's still pretty light, slightly less heavy than a thick hardcover book.


Unless you don't know Linux at all, ditch the Linpus that comes with it. It's not that bad, but it's not that interesting either. There isn't that much software available for it compared to, say, Debian, Ubuntu or the latest Fedora (Linpus is based on an old Fedora release).

I initially considered installing Ubuntu, but in the end, I opted for Debian. The SSD in the 1698 is only 16 GB (it's big for an SSD, but it's far from my roomy 80GB I got on my desktop--it's now in an enclosure serving as the drive for multimedia stuff). I'm very satisfied with the boot time (about 20 seconds, not visibly much more than Linpus) and hardware support (full, after some monkeying around).

To install Debian on the AAO, the best hints are on the Debian Wiki, these guys' blog (it was actually very, very helpful), and the kernel built by this guy (follow the link in the message). I had to monkey around with the kernel sources because I wanted to use madwifi to get the LED working, and the headers package wasn't working right, but besides that, everything was very smooth.

To get the external screen working at full resolution, I added the modeline as explained somewhere (I can't find the link). The modeline has to be added to the Monitor section. For 1680x1050@60Hz (Samsung SyncMaster 2253w--it was a steal at 200 CAD), it's

        Modeline "1680x1050" 149.00 1680 1760 1944 2280  1050 1050 1052 1089

And then xrandr will see the right resolutions. I think it reads the resolutions from the built-in panel and ignores the external monitor; looks like a bug in the Intel video driver.
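As a sanity check, the refresh rate implied by a modeline is just the pixel clock divided by the total pixels per frame (the last horizontal timing value times the last vertical one):

```python
# Refresh rate implied by a modeline: pixel clock / (htotal * vtotal).
# Numbers taken from the modeline above.
pixel_clock_hz = 149.00e6
htotal = 2280   # last horizontal timing value
vtotal = 1089   # last vertical timing value

refresh_hz = pixel_clock_hz / (htotal * vtotal)
print(round(refresh_hz, 2))  # prints 60.01
```

So the numbers do land right on the 60 Hz the monitor wants.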

To get the screen switching functionality, I used arandr to create the scripts, and tweaked them a bit. In xfce, it looks like xfwm4 is a bit dumb when the monitor is added after X is started. So, the script to switch to the dual-screen configuration looks like this:

killall xfwm4
xrandr --output TV --off --output LVDS --mode 1024x600 --pos 0x1050 --rotate normal --output VGA --mode 1680x1050 --pos 0x0 --rotate normal
xfwm4 --daemon    # restart the window manager killed above

And everything works fine. It's best to boot X without the external monitor plugged in, or Thunar has very tiny fonts when you unplug the VGA and switch to the internal panel only. Don't ask me why; there might still be some glitches in the RandR extension. Still, RandR is really, really cool, and it looks even more flexible than many Windows XP display drivers (dual-screen support there is sometimes spotty--the worst part is, it's really driver-dependent...).

I also installed XP on a small 4GB partition, just for the odd game that doesn't run in WINE or Dosbox. Install was OK, but installing the updates was really, really slow. I thought it was the SSD that was slow, and maybe that's part of it... but maybe it's just Windows Update. Someone at the office had to install SP3 + updates, and it took him as much time on his relatively fast desktop with a good 5400 RPM hard disk as on my paltry Atom 1.6 GHz with a really slow SSD. So, YMMV. If you're a heavy XP user, the hard disk may be a better choice. The external drive is a good compromise; I'll use it for heavy data crunching. But still, Windows really likes writing all over the disk. Use FAT32 and no swap.


I still haven't tested the camera or the card readers. I'll test the card readers soon, moving the /home to a Class 6 8GB card; this should be faster than the SSD. When SSDs become cheaper, I'll swap the built-in one. In general, it's not that slow, but once in a while the whole system locks up waiting for the disk, and that's really annoying. Still, I like the SSD--it stays cool and boot times are really, really good with it. Heck, it boots faster than the fancy Core 2 Duo laptop the office provided me, and that uses a 7200 RPM disk drive!

It's still somewhat fiddly to install other OSs on that thing. But it's workable. Linpus is OK, just not as nice as .deb based distros (why they chose Fedora as a base is beyond me).

I haven't installed Notebook remix, because it depends on GNOME. Why it's not a plain GTK+ program is beyond my understanding. Maybe I'll hack it up a bit, it looks like I've become quite good at porting GNOME programs to plain GTK+.

In Canada, you get the weirdo keyboard with an extra key. However, I don't find it that bad, it's actually quite workable. I just hit the backslash instead of the Enter key once in a while. Nothing too bad, really. The Page Up/Page Down keys around the up arrow are really placed in an ideal way.

There's this fan control thing that is suggested online, but it doesn't seem to work too well with the latest BIOS (fan keeps turning on and off). The fan isn't that loud anyhow; it's not really noticeable. Maybe I lucked out and got a version with a nice fan.


Overall, I'm very happy with this purchase (400$, taxes and shipping included, at Swift Gamers--the guy ships fast, he shipped same day at 19:00 and it got home the following morning!). Mobile computing is in, folks. We now have enough power at low enough prices that you'll see small, light laptops everywhere. And I cannot emphasize enough how small and light the device really is. The only thing that might stop it is the CPU makers' fear that it'll cannibalize the sales of their high-power processors. But I think that'll only delay the inevitable. Low power CPUs are now very powerful, and make no mistake, people want low power. Being free of the power cable was already very cool with the N800, but now, I have my full desktop's power in my hands, and it's just... really cool.

The future of computing may not be determined by trying to beat the physical limits so Moore's law can still be applied. I think it'll be about finding more power-efficient ways of eking out a lot of computing power in small packages, at least for the end users. The big CPUs will be relegated to server racks, where they'll serve the computing power remotely. Star Trek TNG's vision of a big mainframe with small tablets serving as terminals may be the true future, with ubiquitous WiFi or 3G/WiMAX giving fast access to centralized resources.

Or maybe not. I'm no tech pundit, and I don't pretend to know the future. But it's a really nice way to organize one's work, in any case.

And now, for something completely different...

This is usually a technology oriented blog, but I'd be a very poor father if technology were my only concern.

Stéphanie Nguyen-Emond came into my wife's life and mine this September, and despite the sleep deprivation (that little rascal was sleeping only 10 hours a day at first, and crying all the remaining hours of the day--with very little reason, believe me!), the worries about the little one not drinking well, the diapers, daddy getting sprayed with the little one's waste products several times, the Dear Daughter twisting around whenever she's being picked up (she wants to see stuff, and hates it when she's not facing outwards), etc. etc. etc... she's given my life meaning and purpose.

She's a robust baby, with very, very good lungs (we can attest to that every time she wakes up), drinks too fast sometimes, and has a wonderful smile when I change her diaper.

As a parent, you have to learn to be patient. Sometimes, I get very frustrated, and then feel bad about it because I feel I should have more patience, more skill calming her down. But the minute after she gives a really wide smile and coos, and then I feel like everything is all right with the world. Despite the crazy economic times and the uncertainty about the whole financial system, I find that I'm still optimistic. I'll work hard to leave her a nice world, and the difficulties we live right now will no doubt yield some good in the end. Hopefully a world that is more just, less crassly materialistic, and focused on protecting the middle class rather than the rich and powerful. One can only hope. Society is a pendulum, and it has swung too far to the advantage of scoundrels and con artists. It's only a matter of time before it swings back, and I hope I'll live to see it so that's the world my daughter will inherit.

Regardless, I'll be there for her, always. And that has to be worth something. Stéphanie, ton père t'aime et sera toujours là pour toi.


Again with the N800 :-)

Well, I'm at it again. I wanted to read mangas on the N800, but the only working application, Comix (a Python application), was somewhat slow. Not the Python code itself, which was actually quite fast, but rather the overall approach. You see, to speed up processing, Comix would always unzip the whole comic archive file (cbz or cbr) into a temporary directory and read from there. It does make page flipping faster.

It also fills the N800's main drive as fast as you can say, "out of disk space" :-)

The approach is acceptable on the desktop, where at worst, we're talking about 50 MB of temp disk space for really large archives. Not an issue, even on my relatively small (by today's standards) 80 GB drive.

On the tablet, though, not only is it problematic (you get 256 MB of drive space and that's it!), but even writing to external memory cards is really, really slow. Expect 2-5 minutes before you see the first page, and meanwhile Comix is hung. And woe to you if it crashes; you'll have to recover all this space at the command line.

The crash thing didn't bother me too much, but the slow boot time did. I read fast, especially manga which tend to have relatively little dialog per image. So I started looking for an alternate solution that ran on GTK+ and that I could potentially port.

After swimming through numerous Java applications (useless on the tablet), there were two candidates: cbrPager and Comical.

Choosing the One

Comical initially looked more interesting. It embeds libunzip and unrar (through a special exception to the GPL in its license) and does all image decoding in memory, which I thought could be faster. cbrPager, in contrast, had a similar approach to Comix (unzip to a temporary directory), except that it unzipped only one image at a time. I worried that this would be slow. Also, cbrPager used the Gnome libraries, some of which are not available on Maemo, and I didn't exactly relish having to rewrite parts of the UI layer. But unfortunately, Comical is written on top of wxGTK, which is a bit problematic on Maemo (there's a port, but it's hard to get from a repository). So I decided to take a look at cbrPager and do an initial "straight port" using the Gnome canvas and disabling parts of the interface, just to see whether it was fast enough to be a workable base and not force me to deal with wxGTK.

It was quite fast, thank you very much.

I immediately knew I had a winner.

Modifications: Front end

I Hildonized the toolbar and main window, which was relatively quick. Unfortunately, one of the features that's really useful for tablet-based comic book readers is rotation. Comix had it, together with very high quality antialiasing. A small modification to the cbrPager source gave me antialiasing and rotation, but not both at the same time; rotated images looked really, really bad.

Since I wasn't too happy with using the Gnome canvas anyway, and since it was the last Gnome dependency left (I had had to remove all the others), I decided to try to find a nice way to replace it. Initially, I was quite ambitious, thinking I'd do raw updates to the main widget. Looking over the gdk-pixbuf library, it looked possible, as that library had all the bells and whistles I needed: antialiasing with a selectable algorithm, automatic rotation, image scaling, etc.

But then I started to think about how to embed the image rendering in the main view, and my head started hurting at the thought of having to write a GTK+ widget from scratch (it's not that hard, exactly, as I learned later, but I had never done anything like that before...). In the end, I decided to look for a plain image viewer (initially I thought of gqview, since it has the features I was looking for) and see if I could scavenge its widget.

In the end, I found GtkImageView, which had a ready-made widget and was only lacking rotation. I added rotation and was a happy camper.

Well, almost :-) I had introduced a memory leak that ate all the RAM of the tablet really, really quickly. Fixing it was really simple, but that didn't happen before I crashed my tablet several times in an effort to figure out what exactly was going on...

From the front-end point of view, I then added a few tidbits: scrolling through direct pen usage, auto-rotate, bookmarks (my wife asked for that one really fast...), persistent zoom settings, fullscreen... All of those were pretty easy to add, which speaks volumes about the initial cbrPager code's organisation (hint: it's pretty nice).

Modifications: back end

I then got annoyed that users were forced to install unzip. One nice feature of Comical is that it embeds the unzip code. So I wanted to do the same. For RAR files, I chose not to do it right away because of licensing problems.

To do that, I needed to extract the archive reading logic, which was entangled with the command-line handling. I chose to encapsulate it in a gobject. This was probably a bit overkill, but it made sense with respect to the use of the GtkImageView gobject.

I first moved the code of the fork/exec logic into a gobject and got it to work that way. Once that was done, it was really easy to get it to work with direct calls to the unzip library.

I had wanted to extract the image bits in memory and decode completely in memory, thus removing the need for a temporary file. I didn't do it that way right then because there were reports of really slow decoding when doing that through pixbuf loaders. Since image decoding was actually slower than archive reading, I decided to pass for now.

Writing a gobject was kinda cool, but those things are really quite heavy... there are a lot of macros and presumably some overhead. Since the same 3 objects are reused throughout the application (one object per supported archive type), the overhead is negligible, but if I had to do it again I'd just go with a struct and function pointers, although it feels kind of silly to do that given that there are already gobjects in the application.
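For illustration, the struct-and-function-pointers alternative is basically a record of callbacks, one per archive type. Here's the shape of it sketched in Python (the real code is C, and every name here is invented):

```python
# The "struct + function pointers" shape, in Python terms. The real
# code is C with gobjects; all names here are invented for illustration.
from collections import namedtuple

# One "struct" type holding the operations every backend must provide.
ArchiveOps = namedtuple("ArchiveOps", ["list_pages", "extract_page"])

def make_backend(extension):
    """One backend per supported archive type, picked by file extension."""
    backends = {
        ".cbz": ArchiveOps(list_pages=lambda path: ["zip pages"],
                           extract_page=lambda path, name: b"zip data"),
        ".cbr": ArchiveOps(list_pages=lambda path: ["rar pages"],
                           extract_page=lambda path, name: b"rar data"),
    }
    return backends[extension]
```

The caller just calls through the record without caring which archive type is behind it, which is all the gobject machinery was buying me here anyway.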

The result

Well, just go download it at https://garage.maemo.org/frs/?group_id=777&release_id=2075.

You can see the announcement on the Internet Tablet Talk forum. There's screenshots.

Just beware, you need a memory card in the internal slot for this to work... I'll work on a fix tomorrow.

Credits, thanks, etc.

Many thanks to the original author of cbrPager for writing the initial code, and for putting up with me :-)

And thanks to the ITT community, who will be my beta testers, as usual ;-)


N800 Update

Well, after a couple of months with the N800, I must say this was one of the better electronics purchases I've made since getting my first computer. It's an extremely useful device.

I got a second one for my wife, and it came with an iGo foldable Bluetooth keyboard, which was a pretty good deal. The keyboard is a nice addition, and next time I go to a convention, I'm not bringing the laptop, or at least, not for anything except the tutorials. The tablet is much nicer to carry, its battery can last much longer, and I look like a total geek when I unfold the keyboard and start typing away. Given that I'm now comfortable with my geekiness, this is actually an advantage :-)

Looking at the price I paid for both tablets, it was actually cheaper than many MP3 players, even when factoring in the price of the additional SDHC cards.

So, here's my appreciation of the device after two months of use:

  • For e-books, total success. I'm nearly done with the Honorverse stuff, and I'll move on to other books soon.
  • For tech books, I haven't used it much so far because I've been commuting by car lately. But my wife is using it to study for the SCWCD, and although in her opinion it's better to look at her desktop LCD, the tablet is good enough for reading in public transit. And much less heavy than the SCWCD book...
  • For the Internet, it's really incomparable. No device that size can do it all like this one. I wish they'd upgrade to Firefox 3's final engine instead of an alpha 1 derived version, though. No doubt that's coming soon.
  • For videos, it works very well, although sometimes hitting the optimum encoding can be tricky.
  • For games, it's better than I thought in the end. iNES has excellent performance, so old NES games play extremely well. At least iNES has a saner on-screen button system than FCEUltra. Star Control II (The Ur-Quan Masters) is also extremely playable, but you need a keyboard. I use a small PlayStation 2 USB keyboard with an adapter for this, and it works kind of like a mini console controller.

Also, good news: the latest version of Maemo, Diablo, has been officially released to the public. I can report that it was a pretty painless upgrade. The browser and Flash support seem somewhat snappier, and I'd say the OS feels more solid overall. Except for the weather desktop applet, everything from Chinook (the previous release) worked very well.

So Nokia can count me amongst their satisfied customers.


Behold my new toy!

Much to my wife's chagrin, I've acquired a Nokia N800 Internet Tablet. It's been a long, long time since a piece of hardware interested me, so let me tell you about this one.

First, though, some background on why I got one. I got fed up with the huge boxes of books (mostly mine, and mostly old sci-fi or computer books) that take up a lot of space in our closets. Also, I'm getting worried that my main bookshelf will buckle and fall under the load; the shelves have nearly 3/4 of an inch of deflection at the center. Doubling up the number of books on them was probably not very smart...

So, I decided I'd get an e-book reader.

After a lot of research on the Kindle, the Sony PRS-500 and 505, the eBookwise 1100 and a few rare and too expensive other options (Cybook, etc), I settled on a previous-generation Sony. Unfortunately, it's not really a common device in Canada, so I went to eBay.

The price was rather high in my view, for a previous generation device, and all sellers were shipping by UPS (I really don't like getting packages by UPS, the distribution office is at the other end of the city and they never deliver the package at a decent hour). So, I started looking for other options, such as PDAs, with the philosophy that if I paid more than 200$ for something, it better do more than one thing.

Research on this took a lot of time. Did you guys know that PDAs are a disappearing species? Which is bizarre, in my view, because they are useful beasts, especially with WiFi. But smart phones are killing them. Unfortunately, for e-book reading, a smart phone is not an option... especially since I already have a cell phone (the cheapest I could find) and have no desire to change providers/pay a lot just for getting extra PDA functions.

But after a while, I found a site that mentioned the Nokia 770 Internet Tablet. Being curious (what's an Internet Tablet? Since when did Nokia do something other than a phone?), I researched a bit, and here's a capsule version of what I found:

  • The Internet Tablet series of devices are 4-inch, 800x480 screen devices with WiFi and Bluetooth included;
  • The screen size/resolution yields a totally crazy 220 DPI screen;
  • The screen size means that you can browse web sites from the tablet with a minimum of scrolling;
  • The screen size also means you can read PDFs in landscape mode without too much trouble, which is a HUGE advantage over the Sony PRS-505, which doesn't render PDFs too well. Since much of what I intended to read on it was technical books, which tend to be distributed as PDFs, this was a big plus;
  • And, the clincher: this thing runs a variant of GNU/Linux known as maemo, and hundreds of programs have been ported to it!
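That DPI figure checks out, by the way: it's just the diagonal in pixels over the diagonal in inches. A quick Python check (the 4.1-inch diagonal is my assumption for the exact panel size):

```python
# Back-of-the-envelope DPI check for an 800x480 panel.
# The 4.1-inch diagonal is an assumption; the spec sheets just say "4-inch class".
import math

w_px, h_px = 800, 480
diagonal_in = 4.1

dpi = math.hypot(w_px, h_px) / diagonal_in   # diagonal pixels / diagonal inches
print(round(dpi))  # a bit over 220 DPI
```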

I originally thought of getting a 770, but in the end, I settled for an N800, which looked better, had a faster CPU, more RAM, and as an added bonus, is somewhat lighter.

So, what am I doing with this thing?

  • Reading e-books, of course, mainly David Weber stuff I got on the CD that came with the War of Honor hardcover I bought some time back;
  • Reading tech books, especially since I hacked evince to be a bit closer to my preferences (I wanted the paging buttons to scroll by screen size rather than go directly to the next page);
  • Browsing the internet. Instead of staying in our home office to browse 'til the wee hours of the morning, I can finish reading my favorite blogs elsewhere, the idea being that I don't need to stay in front of the PC. It's kinda hard putting this into practice, though;
  • Watching video. This was a surprise, since the tablet is not meant for that and apparently, Nokia sort of goofed the video hardware on that count. But when properly re-encoded, the screen is gorgeous enough and the speaker of good enough quality that it's quite a pleasant experience, and it doesn't tie up the TV screen. Great for watching smaller clips. My wife and I have been watching a couple of episodes of La Linea every day when we feel like it;
  • Games. This was somewhat disappointing so far, because the d-pad isn't the best thing in the world, emulators run slow, and button placement is awkward. ScummVM was the best success so far, probably because it's already mouse-based.

But the most unexpected development of all: lately, I've stopped having much fun with my Xubuntu box, because, besides small, occasional sound card glitches I had while following Hardy, I've been having no problems at all... and no incentive to do a re-install. In short, I was getting bored. But the N800 is a new environment. Even though it's GTK+ based, there's lots to do--it's still somewhat rough, and there's a lot of stuff to port to it.

I got the urge to hack on it just for fun.

The first thing I did was recompile evince (a PDF viewer, slightly more flexible than the one that's included by Nokia) to give it key assignments I like better. If anybody wants the patch, I'll be glad to provide it (it's quite trivial). Setting up the cross-compiling environment (known as scratchbox) wasn't the simplest environment setup I've ever done, but after a couple of tries, I got it to work right. I had to try a second time because somehow, I hosed the first version when switching targets.

The second is far more interesting. As I was complaining about the handwriting recognition on the tablet, somebody at the office pointed me to the QuikWriting system by Ken Perlin. That system lets you write without lifting your pen, and the overall mechanism is much less error-prone than handwriting recognition. I looked and looked for an implementation and could only find an old one for Qtopia. I really wanted to try it out on the tablet to see if it was a viable option.

Finding no port, I ported it myself. The result is available in the maemo Garage. It's not quite finished--namely, I'm a bit annoyed at the current letter display, and international support would be nice. I'd also like to refactor the code and render the input areas using vector graphics instead of a stupid bitmap.

Still, it works right now, and I'm using it in production.

Doing this was the most fun I had in a long time.

As for my wife, she asked me to buy her her own N800.

A nice new toy, indeed! I'd recommend it to any GNU/Linux-head out there. It's pretty capable, the build quality feels solid, the battery life is good, and the screen is fantastic. I give this device 9/10, and I'll be sure to look out for the next version.

(Addendum: yes, I know of the N810, but the hardware is very similar, so I'll probably skip it. My dad got one, though, and he's also having fun with it.)


Musings from EclipseCon 2008, part 1

I totally forgot to mention it for the longest time, but having a fruit hat picture on my blog landed me a free entry to EclipseCon in Santa Clara, CA, USA. Alogient was kind enough to pay for my transportation and hotel room.

I'm writing this from there; the connectivity is really good overall. I wish I'd been wired like this the last time I went to a conference...

I'll have more things to say about the overall experience later, but right now, there's an interesting thought I want to write down.

Today's keynote was given by the guy behind Fake Steve Jobs, who, at one point, made an interesting (if not very original) parallel between Apple and a cult. It's mostly interesting because during lunch, I sat at the "Maven" table (and unfortunately, none of the m2eclipse or q4e guys were there--guys, you suck! :-) and one of my lunch colleagues remarked that people get on the Maven train because they've invested so much time learning it that they want it to be worth something.

Note that the guy who said that was being wry, and we all had a good laugh. But it got me thinking that Maven could be seen as a cult as much as Apple, but for totally opposite reasons:

  • Apple is a cult because they do products that are designed and work really really well the first time you pick them up;
  • Maven is a cult because it does not work at all the first time you pick it up, and probably the converted really just want other people to suffer as much as them :-)

So, now you can decide whether that's the reason I'm a Maven fan...


Maven (non)-bashing

Looks like the popular sport of bashing Maven continues. Just thought I'd add my own opinion to the mix.


At work, I instituted Maven use for new projects as well as some active projects (those I could get people to migrate, anyhow) about nine months ago. My reasons were the following:

  1. We had a crufty Ant-based build system (which I built myself, I'm ashamed to say--I think my only excuse is that it replaced another, cruftier Ant-based build system with really wonked dependency management), and some of its persistent bugs (wonked version number handling and unversioned third-party libraries) were really starting to get on my nerves and other developers';
  2. I'd tried out Ivy, and I had problems getting it to just work (this was way before the current version, at a time the documentation was rather sparse);
  3. A co-worker started bugging me about "why not Maven," so I got curious;
  4. I finally found out how to adapt the pom.xml file to a non-standard directory structure, from the Wicket project's pom files. This was the clincher, because I really hated the standard Maven layout (especially separated code and resources directories--ugh) and I wanted to be able to retrofit our old projects that had a more widespread layout (source in src/, tests in test/ and webapp in www/).

Initially, I was the only Maven user in the whole place. I introduced the teammate who worked on the same project to it, and more recently, I ported a whole bunch of legacy stuff to use Maven instead of the old script, declaring that if that stuff could be ported, anything could. Two other co-workers started using the Maven build at that time, and so far, they feel it's been positive.

Maven bashing

OK, fasten your seat belts, folks, because that's going to hurt.

I had two huge problems with Maven:

  • Bugs
  • Very, very generic documentation for many non-generic tasks, like software release (mvn release:prepare and mvn release:perform)

The main bugs I hit were huge bugs with the CVS SCM provider that made the release plugin basically unusable. I tried fixing it myself, but I nearly burned my eyes out navigating the SCM module source code. This is another sub-complaint: the code quality is, frankly, not that amazing. It's a bunch of too-small modules strung together, and sometimes the modularization is pushed a bit too far for my taste: you get a package with interfaces only, a package with the implementation, a package with just the model generated automatically from Modello files, and then a package for each different plug-in. That's two extra packages, in my view, and it means a very high cognitive load.

The main problem with the documentation is that it's highly variable. Sometimes, you get amazing detail, lots of examples. Sometimes, you get a glib description and the goal documentation (note to plug-in developers: the goal documentation would be much better if the configuration parameters were actually explained; sometimes, it's not clear exactly what a term means in a given context, and being told "localRepository: local repository path" does not help much). Sometimes, you only get autogenerated stuff with no description whatsoever.

Also, I wish more attention were paid to error messages. This is a common problem in software, however; Maven is not the only (nor necessarily the worst) culprit. Exception: the archetype plugin is one of the stupidest things I've seen from an error-reporting point of view. When it's missing files, it just throws "unable to copy file" without having the decency to tell you which one... even the stack trace is mum about this.

Maven praises

However, I'm not convinced that this is an excuse to ditch the whole system. The main concepts, if not necessarily the code, are quite solid. I wish there were a way to specify ordering when binding many plug-ins to a single phase, but that's the kind of thing that's not needed that often.

Maven works mostly as documented, and when it works, it works really, really well. In my experience, it's relatively speedy (at least, compared to my hokey Ant script). The release plugin, now that they've fixed many, many bugs with CVS interaction (actually, the CVS plugin may have been the culprit; I was never able to ascertain this because the code was really "indirect" about it), works beautifully.

Writing a plug-in is really easy. Even compared with Ant tasks. Yes, it's that easy. It's a shame some Plexus or Maven core stuff is so poorly documented, but you can usually figure it out by looking at a similar plug-in.

Despite what a lot of people say, it is very much possible to have Maven work outside its standards. You'll have to be more verbose in your pom.xml, or use a parent pom. Wicket, as well as our own projects, is a testament to this flexibility. You have to hunt around for information on how to do this, but it is available on the Maven site.
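For the curious, overriding the layout boils down to a handful of elements in the build section. This is an illustrative sketch in the spirit of what we did (directory names match the src/, test/, www/ layout mentioned earlier), not our exact pom:

```xml
<!-- Illustrative fragment: mapping Maven onto a non-standard layout.
     Sources and resources share src/, the webapp lives in www/. -->
<build>
  <sourceDirectory>src</sourceDirectory>
  <testSourceDirectory>test</testSourceDirectory>
  <resources>
    <resource>
      <directory>src</directory>
      <excludes>
        <exclude>**/*.java</exclude>
      </excludes>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <configuration>
        <warSourceDirectory>www</warSourceDirectory>
      </configuration>
    </plugin>
  </plugins>
</build>
```

Put something like this in a parent pom and every module inherits the layout.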


Despite some shortcomings, I truly believe Maven is a very good build system. It was maybe a bit too ambitious at first, but since 2.0.5, it started being able to deliver on those ambitions. It is also much more flexible than people give it credit for.

For those banging their heads with it, I advise that you look at the Wicket and AppFuse pom.xml files. Likely they've solved your problems already, so you can just steal... er, borrow their solution. :-)


Assertions in Eclipse

A small trick I found. While running unit tests, I realized that the Maven Surefire plug-in enables assertions during the test run, while Eclipse does not (at least, not by default). Looking for a way to enable assertions at unit test time, I found that you can go to Window | Preferences... > Java | JUnit and select the "Enable assertions for new JUnit launch configurations" checkbox. Hope this helps somebody somehow.
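If you're wondering what that checkbox actually changes: it adds the JVM's -ea flag to new JUnit launch configurations, which toggles whether assert statements execute at all. Here's a minimal sketch (class and method names are mine) showing the difference:

```java
public class AssertionDemo {

    static int divide(int a, int b) {
        // With -ea (which Surefire and that Eclipse checkbox arrange for),
        // a zero divisor fails fast with an AssertionError and this message;
        // without -ea, the assert is skipped entirely.
        assert b != 0 : "divisor must be non-zero";
        return a / b;
    }

    public static void main(String[] args) {
        // Classic idiom to detect whether assertions are on:
        // the assignment only executes when asserts are enabled.
        boolean enabled = false;
        assert enabled = true;
        System.out.println("assertions enabled: " + enabled);
        System.out.println(divide(10, 2)); // prints 5 either way
    }
}
```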


3 things I learned about software in and out of school

Following the lead of Carnage4Life and Scott Hanselman, here are my own items:

Three things I learned while in school

  1. Some people can program, others can't. Those who can't will never be good at programming, even if they work really, really hard. I learned this when I saw somebody stare at a compiler error message for 30 minutes (until I helped them with it) for a simple missing parenthesis! Staring at the error message for more than a minute is not going to help you figure it out when all it says is "Parse error..."
  2. It's hard to know when to stop generalizing. I learned that in Software Engineering when we had heated discussions about whether we should bring generalizations to the next level, something I was very much against.
  3. Working in a group is more fun, but can be disturbing to by-standers.

Three things I learned while not in school

  1. Business people tend to ignore technology under the pretense that business concerns are more important, but it's not a good idea. Technology problems don't really go away by themselves, even if CPUs become vastly faster every year.
  2. Software is a highly volatile industry. When I left school, it was the .com boom, 99.99% employment, etc. etc. Since then, I've seen ups and downs, but always in a very protracted time period.
  3. As you grow older, your experience will tend to grow stale. The reason for this is that as you gain experience, you are more and more likely to become the team lead or senior programmer of your team. As such, you won't have anybody to "pull" you forward as you did when your team lead or senior guy was there and you tried to show off that you were a better coder/designer. The only fix is to always remain open to ideas, regardless of how junior the person who expresses the idea may be most of the time.


Lesser known IoC containers

The point of this article is to talk about my experience with IoC containers other than the 1000-pound gorilla in the IoC container space.

Through some interesting circumstances (mostly, my hard-headedness about not including 15 megabytes of stuff just to get an IoC container; note that I do know Spring is modular, but it wasn't at the time), I'm one of the few people who hasn't been using Spring. At work, we've mostly used PicoContainer as our middleware "glue" layer. I've also dabbled with Guice, and Plexus (a bit) when messing around with Maven Mojos. This is a summary of my experience with each.


Let's cover the IoC container I feel I know best first--PicoContainer. PicoContainer is a small (the JAR is 100 KB or so), type 3 (constructor injection) IoC library.

And that's pretty much the only thing it will ever be, if you don't customize the library. It does constructor injection. It supports hierarchical containers to resolve dependency conflicts. And that's it.
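To make that concrete, here's a little toy container I sketched from scratch. This is not PicoContainer's API--the names and code are mine--but it illustrates the core idea: pick the greediest constructor you can satisfy, and resolve each parameter recursively.

```java
import java.lang.reflect.Constructor;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// A toy constructor-injection container in the spirit of PicoContainer.
// NOT PicoContainer's real API; purely an illustration of the mechanism.
public class ToyContainer {
    private final Map<Class<?>, Class<?>> bindings = new HashMap<>();
    private final Set<Class<?>> inProgress = new HashSet<>(); // guards against cycles

    public void register(Class<?> key, Class<?> impl) {
        bindings.put(key, impl);
    }

    @SuppressWarnings("unchecked")
    public <T> T resolve(Class<T> key) {
        Class<?> impl = bindings.getOrDefault(key, key);
        if (!inProgress.add(impl)) {
            throw new IllegalStateException("cyclic dependency on " + impl.getName());
        }
        try {
            List<Constructor<?>> ctors = new ArrayList<>(Arrays.asList(impl.getConstructors()));
            // Try the greediest constructor first, falling back to less greedy
            // ones -- roughly PicoContainer's "greediest satisfiable" policy.
            ctors.sort((a, b) -> b.getParameterCount() - a.getParameterCount());
            for (Constructor<?> ctor : ctors) {
                try {
                    Class<?>[] params = ctor.getParameterTypes();
                    Object[] args = new Object[params.length];
                    for (int i = 0; i < params.length; i++) {
                        args[i] = resolve(params[i]); // resolve each dependency recursively
                    }
                    return (T) ctor.newInstance(args);
                } catch (Exception e) {
                    // this constructor wasn't satisfiable; try the next one
                }
            }
            throw new IllegalStateException("cannot instantiate " + impl.getName());
        } finally {
            inProgress.remove(impl);
        }
    }
}
```

The real thing does much more (lifecycle, caching, adapters), but the resolution loop above is the heart of it.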

However, the real power of PicoContainer is its open architecture. You can implement your own component management strategy by implementing ComponentAdapter and ComponentAdapterFactory. There are a few example adapters. But overall, this lets you futz around with the component instantiation without having to resort to aspect-oriented programming or exotic annotations. Also, since you are in full control of which ComponentAdapter is used for which component, you can completely customize the instantiation on a per-component basis.

As an example of how useful this can be, it was very simple to implement a specialized adapter that injects some dependencies through constructor injection, then looks for more dependencies to inject using setter injection. When my experiences with Guice left me somewhat lukewarm, I was able to quickly whip up an @Inject annotation for PicoContainer.

Finally, the hierarchical container facility is extremely useful for one-off dependency injections, such as injecting dependencies in objects retrieved from the persistent store by Hibernate.

So, in conclusion:

Pros:
  1. Very small
  2. Reasonably fast, to the point that it can be used with throwaway containers
  3. Works even with old JDKs such as 1.4
  4. Component adapters are extremely flexible
  5. Containers are very lightweight, can create and throw away child containers at will

Cons:
  1. The authors believe constructor injection is the only viable option. However, although constructor injection enforces integrity, it's also very awkward, especially when you get more than 5 dependencies. OK, you probably shouldn't have that many, but sometimes, especially in Struts actions or some other equally centralized point of logic, you will get that many, and it's still less awkward than having a façade just to cut down the dependencies.
  2. Although the adapters are very flexible, there are very few useful implementations that come with PicoContainer. One that supports Guice-like semantics would be welcome.
  3. The hierarchical system for resolving dependencies works well in some situations, but is extremely awkward in others. Granted, it's possible to write a component adapter that uses an alternative resolution mechanism, but in the IoC realm, PicoContainer is a bit like assembly language, or Forth; even relatively common cases have to be implemented by hand.
  4. There is relatively little integration with 3rd-party frameworks (such as Struts, Wicket, and so on). On the other hand, such integration is always very simple to write with PicoContainer, which is not necessarily the case with every IoC container.
  5. Error stack traces are OK, but sometimes very confusing; specifically, it's not always clear which dependency was missing when trying to instantiate a component.


If you want a small container and don't need many fancy features, or if you prefer to write such features yourself to gain maximum flexibility, PicoContainer fits the bill. In my mind, it's more of an IoC library than an IoC framework.


Guice is a relative newcomer in the IoC container arena. Its main claim to fame, despite what its adopters will tell you, is that it comes from Google. Also, unlike PicoContainer, it came out at a time when people were starting to get annoyed at massive XML files, such as those that show up in many industrial Spring-based code bases.

It immediately grabbed the interest of many a blogger, showed up on DZone, yadda yadda. The question is, can you believe the hype?

Well, underneath the hype is actually a very nice IoC container that goes the extra mile to provide you with near-perfect error reporting, and that is not afraid to sacrifice purity on the altar of practicality. Guice supports, out of the box, all sorts of dependency injection scenarios, provided that you mark your dependencies with the @Inject annotation. From the theoretical point of view, this is more invasive than a run-of-the-mill IoC container, since classic IoC containers should, in theory, let you use plain objects without any special annotations. This, many argue, makes Guice much less flexible than Spring or PicoContainer.

In practice, I found that adding annotations makes sense in most cases. Externalizing all dependency-related information sounds great in theory, but in practice, you'll end up with 80% of your services having only one sensible implementation. In many cases, the services with several implementations have different-enough implementations that the POJO receiving them works well with only one (I estimate that about 10% of application services are like this). Guice's use of annotations to mark injectable resources, and to let the target object control to some degree which resource gets injected, is very practical.
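To show what annotation marking boils down to--and only the idea; this is a from-scratch sketch with made-up names, not Guice's implementation or API--here is a tiny annotation-driven field injector:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

// A hand-rolled sketch of annotation-driven injection in the spirit of
// Guice's @Inject. NOT Guice itself -- just the marking idea, reduced
// to fields and a type-keyed map of pre-built instances.
public class AnnotationInjectionDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    public @interface Inject {}

    // The target object itself declares which fields should be filled in.
    public static class GreetingService {
        @Inject String greeting;

        String greet(String name) {
            return greeting + ", " + name + "!";
        }
    }

    // A minimal "injector": fill every @Inject-annotated field by type.
    public static void injectFields(Object target, Map<Class<?>, Object> instances)
            throws Exception {
        for (Field f : target.getClass().getDeclaredFields()) {
            if (f.isAnnotationPresent(Inject.class)) {
                f.setAccessible(true);
                f.set(target, instances.get(f.getType()));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Map<Class<?>, Object> instances = new HashMap<>();
        instances.put(String.class, "Hello");
        GreetingService s = new GreetingService();
        injectFields(s, instances);
        System.out.println(s.greet("world")); // Hello, world!
    }
}
```

The real Guice adds modules, bindings, scopes, and that famous error reporting on top of this basic move.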

The IoC container is pretty fast, and does some wicked ASM and cglib-fu to provide precise error reporting and the best possible speed regardless of the situation. Overall, it has an "advanced technology" feel to it. It shows a lot of polish in its package structure and naming.

However, Guice is a very "closed" system. Being advanced technology is nice, but it does make certain kinds of things a bit more difficult. Here is an example of one such difficulty, and incidentally the reason we went with PicoContainer at work. Guice comes in an "uber jar" format; it bundles two versions of ASM (one for cglib, and a more recent version for itself; ASM is noted for major incompatibilities between minor versions, much to the annoyance of projects using it, and cglib appears to be evolving very slowly these days). Hibernate also uses a version of ASM, albeit an older one, as well as the same version of cglib as Guice. So, in an effort to trim the 550 KB or so JAR file to its bare minimum (remember, PicoContainer takes 110 KB...), I tried to get it to work with the older ASM version. The only part that didn't compile was some code used for error reporting, so I figured I could live without it. Well, I was wrong; the previous version of ASM crashes and burns during unit tests. I'm wary of dismissing obscure crashes as situations that will not happen in production code (even though the failing unit test looked like it was doing something exotic), so I left it at that, figuring I was better off sticking with PicoContainer and bloating our applications as little as possible.

Some may dismiss this as a trivial reason to not use Guice, and in some ways it is. But it is something to be aware of. I know Hibernate, for one, has had all kinds of problems with use inside environments that come with a different version of ASM. So I'm a bit wary of such annoying dependencies, especially on ASM and cglib which are known troublemakers. Granted, the problem is a bug in ASM, but it does mean that Guice's use of ASM is exotic enough to trigger that bug.

Another problem with such a closed code base is that it's hard to extend it without releasing your own build of it. I'm not that keen on depending on an annotation that's inside the IoC container's package; I'd prefer depending on, say, the @Resource annotation that's standardized in a JSR (granted, it's not supposed to have exactly the same semantics, but my point is that I dislike putting hard dependencies at the POJO level). I understand the necessity of the annotation, and it's no worse, in my mind, than the transient or volatile keyword. Except in one way: it binds the code specifically to Guice, not just any IoC container. It would be nice if you could ask Guice to use another annotation type (this is noted as issue 70 in their issue database), but right now, you can't.

Pros:
  1. Still reasonably small
  2. Very advanced, gives a peek at what libraries could look like if people started targeting Java 5 (see also Stripes)
  3. High-tech, uses all kinds of tricks to maximize performance and give the best error reporting possible
  4. More popular than PicoContainer at this point, is getting all sorts of frameworks integrated with it
  5. The only IoC container I know of that allows post-instantiation injection of dependencies for non-constructor injected services. This is actually a great idea in some cases (such as deserialization)
  6. Used in Struts 2. So it's going to end up being used by somebody visible, unlike PicoContainer :-)

Cons:
  1. Sometimes too advanced; that kind of trickery can hurt if you get bitten by a bug (admittedly, there don't appear to be many of them)
  2. Binding on a specific in-package annotation is annoying; I'm curious as to why there isn't a way to configure it
  3. Not an open architecture the way PicoContainer is. It's very easy to futz around with PicoContainer internals. Guice is much more selective with respect to what it lets you do or not do.
  4. Java 5 only. Not that I care, but somebody might. Though, they did manage to make it work with Retrotranslator.


If you want a rich IoC container without the huge XML file tradition (yes, Spring folks, I know about JavaConfig, but it's still not what most people use), with sensible defaults and excellent error reporting, Guice fits the bill. However, if you want something with a hood that you can pop open at will, you may prefer another IoC container.


I haven't played with Plexus a lot, so I can't really give too many details. My only experience was inside the Maven 2 codebase.

As far as IoC containers go, it looks OK (although it does suffer from the XML configuration file syndrome, the XML files are lighter than Spring 1.2 configurations [though not Spring 2.0 configurations], and you can use JavaDoc tags to autogenerate the XML file). However, I felt a bit annoyed at it overall. Error reporting, at least within the Maven environment, is not that great, even for simple mistakes like forgetting the appropriate JavaDoc tag or misspelling something. Plus, I can't really figure out where this project fits. It doesn't look like a project that's used by anybody except the Maven guys; why didn't they use PicoContainer to start with instead?

There are a lot of Plexus components, but they don't tend to be that reliable, or their use within the Plexus infrastructure is not consistent. A good example is the password input component. Some Maven components use it; some use a plain text input component instead. It's not quite clear, when developing Plexus-aware components or Mojos, which you're supposed to use for what.

A pet annoyance of mine: the Plexus guys want you to obtain the logging service through dependency injection. Although that sounds great in theory, I think it's nuts. Logging (at least, programmer logging) is purely a developer service. I'd even prefer not having to create a special object to start logging--if it could figure it out from the current stack with reasonable performance, it would be much better. So, if, to get logging, I need to create a field, a setter, and configure something in an XML file... I'm not going to use logging, or at least, I'll use Jakarta Commons Logging or SLF4J and just forget about Plexus logging.
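For contrast, here's the static-lookup idiom I mean (a made-up example class, with java.util.logging standing in; Commons Logging and SLF4J look essentially the same). Any class can grab its own logger in one line, with no container wiring at all:

```java
import java.util.logging.Logger;

// The static-lookup logging idiom: the logger comes from a static factory
// keyed on the class name, so there's no field/setter/XML dance just to log.
// (PriceCalculator is a hypothetical example class, not from any real project.)
public class PriceCalculator {
    private static final Logger LOG = Logger.getLogger(PriceCalculator.class.getName());

    public int applyDiscount(int price, int percent) {
        LOG.fine("discounting " + price + " by " + percent + "%");
        return price - (price * percent / 100);
    }
}
```

One line of boilerplate per class versus a field, a setter, and container configuration -- that's the whole argument.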

So, all in all, it looks like an interesting project, but I don't really know why anybody would use it over Spring, or Guice, or even PicoContainer. I guess every developer likes to write an IoC container, because although it's not too hard, there are enough challenges and futzing around with reflection or byte code engineering to make it an interesting task.

Closing thoughts

So, there you have it, a quick round trip of some lesser-known containers. I also know of a few others:

  • HiveMind/Tapestry IoC, the container behind Tapestry. From a pure technology point of view, it looks very powerful, as HLS does not mind looking for good ideas in other projects; but like Tapestry, it changes wildly between releases, and releases happen relatively frequently.
  • Yan, a relatively unknown container that appears to allow injecting almost everything into almost anything. The author is one of the few who documented how to use his container in a rich domain model. The Spring guys have allowed this only very recently, and it requires the use of the AOP sledgehammer to get it to work.
  • And, of course, Spring, whose basic container is becoming more powerful with every release thanks to ideas from all those lesser-known projects, but which remains a huge project. Granted, you can use only parts of it, but it's not the standard usage.

I hope this (longish) post will be of some use to somebody.