The problem is, software is too reliable. Pause for spluttering. Start digression, intending to explain later… I just found my trusty ole 1987 MacPlus in a dusty cupboard. It works. The entire hard disc holds 20MB. My word processing package, WriteNow, made great documents and took up just 128K. The MacPlus did everything I wanted just fine. How did we get to PCs with a GB of RAM and vast application bloatware?
In my mind at least, this connects up with the logic of the Virtual Observatory as I have been pushing in talks for years. The number one problem is not the number of bits in astronomical databases; it's the number of databases, and their heterogeneity. Slogan number one is therefore Standards, Standards, Standards. The number two problem is the I/O and last-mile bandwidth bottlenecks; these grow more slowly than Moore's law. Slogan number two is Shift the Results, not the Data. You can't download the database and hack your own analysis programme. The data centre has to provide professional tools for manipulating the data at source. So we need to move to a service architecture.
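To make "Shift the Results, not the Data" concrete, here is a toy sketch. Everything in it (the tiny in-memory catalogue, the `run_query` function) is invented for illustration; a real data centre would expose something like this behind a standard query interface rather than as a local function call. The point is only that the query travels to the data, and just the result set travels back.

```python
# Toy illustration of "Shift the Results, not the Data".
# Pretend CATALOGUE is a multi-terabyte archive held at the data centre;
# the client never downloads it, it only sends a predicate over.

CATALOGUE = [
    {"id": 1, "ra": 10.2,  "dec": -5.1,  "mag": 14.3},
    {"id": 2, "ra": 180.0, "dec": 22.7,  "mag": 19.8},
    {"id": 3, "ra": 250.4, "dec": -30.0, "mag": 21.2},
]

def run_query(predicate):
    """Runs at the data centre: ships back only the rows that match."""
    return [row for row in CATALOGUE if predicate(row)]

# Only the small result set crosses the last mile, not the archive.
bright = run_query(lambda row: row["mag"] < 20)
print(len(bright), "rows returned")
```

The analysis moves to the data, so the last-mile link only ever carries the answer, which is why the bottleneck argument favours services over downloads.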
When we first started this game, many people said this couldn't be right, as the whole modern trend was away from old-fashioned big computers at special centres and towards more power on people's desktops; and that this was the spirit of the internet too, and peer-to-peer file sharing and all that trendy stuff. But where are we now? The world is dominated by a few gargantuan computer systems – run by Google, Amazon, eBay and so on. In the spirit of the Web, this stuff just feels like it's "inside your PC", but the truth behind the illusion is enormous corporate enterprise servers.
So why do we keep buying bigger and faster PCs? I think we might stop… Expansion in storage has made sense, as we wanted pictures, songs and movies. But this need could stop; people trust Flickr to store their pictures. Expansion in CPU speed has been driven by gaming; but dedicated hardware like the PlayStation is much better and more reliable, so who cares? What about the expansion in application and operating system size? To some extent this has been driven by the desire for computers to be easier, faster, and to have more bells and whistles. But most people feel that things are way past the point of good-enough-thank-you. So those poor old software companies – how can they make any money if we don't keep buying new stuff? They have to work harder and harder to convince us the new stuff is really better and we want it. Who is going to buy Vista? Not me. Can't see the point. This just has to crack…
So what is the sustainable solution? Obvious. The customer needs to rent a service, not buy a product. They have to make us pay for every use. We are getting used to this idea already… You don't own that music, only the right to play it. Most video use is not download; you leave it where it is and play it again. Now we can all run our calendars and spreadsheets on Google, and never need to buy an application. Pretty soon the whole idea of owning an application will seem quaint… Java Web Start seems pointed in this direction too – click here and get the latest version, that's another 50p thank you. As an individual customer you don't pay per use on Google, as their business model relies on the ads. But you can only get to Google by paying your ISP a monthly fee. Obviously a service-based OS must be up next. Who needs any more than a very thin client?
So for the VO, the service architecture is driven by need. For the commercial world, it's driven by economics. But we are playing the same game.
What's really taking off now, of course, is internet TV. This is how Google will finally take over the Internet, as explained by Bob Cringely. I will do my own take on that in a wee while. Meanwhile, astronomy too is moving towards movies – astronomy in the time domain. I shall definitely have a go at that soon… No coincidence, maybe, that Google have joined the LSST partnership…