Going Web 2.0

I do not normally take to buzzwords, hype and shiny new toys. These things usually annoy me.

I do, however, think there is something to Web 2.0. I don’t want to go into what is meant by that here; see Tim O’Reilly’s landmark essay for a definition. If you look at the technologies that go into it, most of them date from the late 90s. While Tim provides a broad explanation of what Web 2.0 is, for my purposes I am interested in the following technologies:


Some of the browser standards introduced in the late 90s are:

So why are we proclaiming Web 2.0 in 2005/6? The answer, I think, is twofold:

First, I feared, but expected, that with IE’s dominance Microsoft would go on to define a Web 2.0 of its own; one that worked with Windows. While they have played a major part in Web Services, it was the community itself that created the new platform – just as with Web 1.0. The Web 2.0 stack prefers open standards and standards-based browsers. Firefox and Safari led the charge with built-in RSS readers and a concern for web standards. Microsoft’s dismantling of their IE team led to a period of stability in which it was possible to build frameworks by working around limitations. Microsoft, very strangely in my experience, is now paying some attention to web standards. A review of the IE blogs shows that there will be some movement toward greater CSS compliance, together with RSS support. IE7 is pretty much a catch-up to Firefox/Safari for Microsoft.

Second, there is the network effect. Metcalfe’s law states that the total value of a good or service that possesses a network effect is roughly proportional to the square of the number of customers already owning that good or using that service. A mobile phone example of the power of this law can be found in SMS. All GSM phones have SMS. While the 3G bandwidth speculators bid up spectrum licenses to dizzying heights, Generation Xers and Ys stuck to texting. SMS was created in GSM Doc 28/85 rev2, June 1985. SMS became ubiquitous precisely because the standard was so old. Newer standards such as MMS are not supported on all phones. People stick to what they know will work.
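
To make the square relationship concrete (a back-of-the-envelope illustration of my own, not a figure from any source), the law can be written as:

    V(n) \propto n^2, \qquad \text{possible connections between } n \text{ users} = \frac{n(n-1)}{2} \approx \frac{n^2}{2}

So 10 phones allow 45 possible conversations and 100 phones allow 4,950: a tenfold increase in users gives roughly a hundredfold increase in potential value.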

So, standards are introduced in the late 90s; time ticks by; an installed base grows and becomes ubiquitous; some smart developers build to it. Voila! Web 2.0. Tim’s article partly defines Web 2.0 through exemplars. I think one of the best ways of web2ifying your mind is to configure Netvibes as your home page. Netvibes is a great example of Web 2.0 in itself, and lets you add portlets? widgets? Web 2.0 bits? from plenty of others.

I think what comes next is competitive advantage by being Web 2.0 compliant. A bit like when the web itself was new, and you were either on it or not. Back then there was a competitive advantage in going online. I think there is a window of opportunity right now to gain competitive advantage from Web 2.0.

(Attribution: Many of the links for this post were shamelessly lifted from Wikipedia)

A patent on digital computing

Last year I attended a meeting of the ACM at HP in San Jose. The topic was the early history of computing. During the session the presenter touched on a 1973 court case over a patent on digital computing. I was not exactly up with the latest tech news back then and missed the story. That you have not heard of the case is because the patent was not upheld. Had it been, and had the patent applicant done the usual evergreening, the word computer might still refer to a job role rather than a machine.

I am probably not anti-patent; just anti the present US patent system, which is being foisted on the world through bilateral trade deals. Yes, unfortunately Australia now grants software patents. Hopefully the Europeans will bring some sense to this by continuing to refuse to go the US way. My opinion is that these temporary, government-granted monopolies should be awarded in far fewer cases than they now are. The original idea, when they were first introduced in England hundreds of years ago, was to foster innovation, not to prevent it.

This fascinating story is related at http://www.scl.ameslab.gov/ABC/Trial.html.

As an aside, the court case also found that John Vincent Atanasoff and Clifford Berry had constructed the first electronic digital computer at Iowa State College between 1939 and 1942, not Mauchly and Eckert, who had been taking credit for it for three decades. The Wikipedia entry gives a balanced account. Encarta does too, even mentioning that Atanasoff was not given credit until the court case in 1973.

What’s up with SourceForge?

Something seems to be up with SourceForge.

I have been using it heavily this week. CVS has been down three times for extended periods. The admin web site has been down four times. This is seriously interrupting me!

There are two possibilities:

  1. This is just a run of bad luck
  2. SourceForge is underfunded

To explore the second option, let’s look at the Form 10 filed by VA Software on 12 December 2005.

Sales of SourceForge software declined from $1.9m to $1.4m, and R&D spending on SourceForge dropped from $0.9m to $0.8m.
Only 136 customers have licensed SourceForge.

Sales from advertising have gone up 36% over the same period.

Overall, VA Software had a net loss of $1.2m for the quarter ended 30 September 2005.

What is unclear is how much of the advertising revenue comes from sourceforge.net and how much comes from slashdot.org. Based on my own viewing habits, I would guesstimate that less than 10% comes from sourceforge.net. Running Slashdot is much simpler than running SourceForge: Slashdot is a blog, while SourceForge is an entire software development hosting environment, complete with compile farms.

So I think the second possibility, underfunding, is the likely reason for the outages. It may also be the cause of the sluggish introduction of Subversion, which is only now about to come out of beta and become generally available, and of the long period during which project statistics were not available.

It may be that there is simply not enough money in it.

Having said that, the importance of SourceForge to the open source community cannot be overstated. If VA Software decides to offload it, I hope it finds another benevolent home. Sun? O’Reilly? The UN?

Release early, release often.

Linus Torvalds said right from the beginning “Release early, release often”. This seems to be a critical requirement for open source projects to attract collaborators. There is a whole chapter on this in The Cathedral & The Bazaar, the classic open source text.

Since I released ehcache-1.2beta3 with the distributed stuff in, some old collaborators and new ones have come forward. Making the beta release is a signal that the new version will be released: collaborators know their efforts will not be wasted.
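
For anyone who has not tried it, here is a minimal sketch of basic ehcache usage, assuming the 1.x API (net.sf.ehcache.CacheManager, Cache and Element) and a cache named "sampleCache" defined in ehcache.xml; the cache name, key and value are illustrative only, not taken from the distribution.

    import net.sf.ehcache.Cache;
    import net.sf.ehcache.CacheManager;
    import net.sf.ehcache.Element;

    public class EhcacheSketch {
        public static void main(String[] args) throws Exception {
            // Builds a CacheManager from the ehcache.xml found on the classpath
            CacheManager manager = CacheManager.create();
            try {
                // "sampleCache" is assumed to be configured in ehcache.xml
                Cache cache = manager.getCache("sampleCache");

                // Put an entry and read it back
                cache.put(new Element("greeting", "hello"));
                Element element = cache.get("greeting");
                if (element != null) {
                    System.out.println(element.getValue());
                }
            } finally {
                // Releases resources, including any replication listeners
                manager.shutdown();
            }
        }
    }

The distributed replication is driven by ehcache.xml configuration rather than by code, so the calls above stay the same whether or not peers are configured.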

Apart from releases, here is what has been going on with lines of code:


Fisheye is cool. See the whole ehcache fisheye here. The steeper the line, the more you have been doing; the flat lines represent periods of inactivity.

I am now driving ehcache-1.2 to final release. If anyone else is interested in helping out, contact me at gluck _ A T _ gregluck.com.