I attended a round table to discuss the use of open source software in government, with Red Hat, Ingres, Alfresco, and on the government side representatives from the Office for National Statistics and Islington Council.
It is a fascinating topic on several levels - and not just for government. The two big questions: first, what is the rationale for preferring open source; and second, if you are convinced of its desirability, how on earth do you get a huge, diversified entity like the UK government to increase its adoption?
There is clearly some kind of disconnect. The government already has a policy which stops short of mandating open source, but does say:
Where there is no significant overall cost difference between open and non-open source products, open source will be selected on the basis of its additional inherent flexibility.
In other words, given two otherwise equal proposals, open source is favoured - yet in practice we were told that 95% of software in use in the UK government remains proprietary.
So why is that? There are dozens of reasons. It's the available skills, with armies of experts in Microsoft, Oracle, IBM, and so on, compared with only a few willing to specify open source technologies. It's the culture, with countless existing supplier relationships and a well-trodden procurement path with the usual suspects. It's the existing environment: if you start from a point where the servers are Windows, say, and the desktops all have Microsoft Office, it feels more comfortable to continue down that path. It's the lock-in, especially when it comes to things like proprietary SQL extensions and stored procedure languages, which simply do not port easily to new database managers. And it's canny vendors, who go for site licences with the widest possible scope, so that if a small group decides to break the mould and use something different, it looks more like a cost than a saving - because they are already licensed for the proprietary stuff.
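The SQL lock-in point is easy to demonstrate. Here is a minimal sketch, using SQLite as a stand-in for a replacement database engine: a query written with SQL Server's vendor-specific TOP keyword breaks on the new engine, while the equivalent query written with the widely supported LIMIT clause ports without change. (The table and data are invented for illustration.)

```python
import sqlite3

# A throwaway in-memory database, standing in for a "new" engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.0), (3, 5.0)])

# Widely portable form: LIMIT is understood by SQLite, MySQL and PostgreSQL.
top_two = conn.execute(
    "SELECT id FROM orders ORDER BY amount DESC LIMIT 2").fetchall()
print(top_two)  # [(2,), (1,)]

# Vendor-specific form: SQL Server's TOP keyword is a syntax error elsewhere.
try:
    conn.execute("SELECT TOP 2 id FROM orders ORDER BY amount DESC")
    ports_cleanly = True
except sqlite3.OperationalError:
    ports_cleanly = False
print(ports_cleanly)  # False
```

Stored procedure languages such as T-SQL and PL/SQL diverge far more than this; a single keyword is just the smallest visible symptom.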
These are tough obstacles to overcome. Put another way, the rationale for adoption has to be exceptionally strong to overcome the inertia; and it was here that I found the round table unconvincing. The open source companies gave diverse reasons for the benefits. Lower cost was mentioned frequently. By contrast, John Powell, President and CEO of Alfresco, talked intensely about how the UK was somehow handing over its soul to Silicon Valley by not building up local skills in open source software. Someone else said how nice it was that open source software was free, so that everyone in an organization could use it; another gently observed that since most open source companies make their money from per-user support agreements, this is often not the case.
The most persuasive rationale is that open source software tends to support open standards and therefore avoids lock-in. This lowers long-term costs, by ensuring that the customer always has freedom to switch.
The snag with all these arguments is that the proprietary companies can counter them easily. They will reel off all the open standards they support. They will draw graphs showing how much money you save by using their stuff. They will point out that those with skills in their technology can easily market them worldwide. The outcome is that nothing changes.
I don't doubt that leading vendors make excessive profits on software that should be commoditised and cheap, or that vendors follow strategies designed as much to keep us hooked on their stuff as to advance technology for our benefit. It's a cycle that needs to change.
At the same time, the open source vendors have to recognize - and to be fair, I reckon they often do - that most of us will not switch for ideological reasons. We want to get our work done. The best open source projects, like the Apache web server (if anything else is quite like it), succeed on sheer quality and reliability, not merely through being open source.
The tough question: does the open source ecosystem have sufficient resources to deliver that quality, against proprietary vendors with huge profits to pour into research and development? There are plenty of decent open source products; the number that are truly best of breed is more limited. Currently it is the big proprietary vendors that have the advantage.
I left the table with more questions than answers. Should governments introduce more draconian legislation, to counterbalance the industry bias towards the status quo? Will the move towards cloud computing break the pattern? Should legislation be focused on mandating standards support, or even that applications should be proven to work on two different platforms, rather than the matter of open versus closed source? If today's MySQL is tomorrow's Oracle, is there really any difference? Isn't there too much to worry about already, solving problems and delivering successful projects?
If pressed, I would always incline towards the best technical solution, rather than one which ticks boxes, even open source boxes. That said, we are all susceptible to being bamboozled, not even looking at open source alternatives if we already know a proprietary tool or component or platform that will do the job. Even for individual developers, it makes sense to choose the open source solution in cases where other things are equal; and to think twice before building vendor lock-in into the applications we create.
Tech media pundits talk incessantly about migration to the cloud, but it is always interesting to get reports from the trenches. Two recent ones interested me. The first is from Patrick McKenzie, who sells a niche Java application. He has just posted a detailed and entertaining blog entry entitled Why I'm Done making Desktop Applications. His application exists in both desktop and online versions, and he says he was a staunch defender of the advantages of desktop apps - "You can keep your Google Docs, Excel is superior in almost every way." Then he made an online version of his app with, if anything, fewer features, and tracked the statistics. He discovered the following:
The next major release will almost certainly be its last. The webapp, and my future webapps, seem to be much better investments.
Does any of this apply to corporate development? It's true that life is easier in some ways if you do not have to make a sale; and there are still some things that desktop applications do better, like integrating with Microsoft Office through COM automation, or continuing to work offline on the train or plane - though see Rails creator David Heinemeier Hansson's You're not on a plane for a contrary view.
Still, many of the other factors do apply. Lower support requests, zero installation, accurate usage monitoring - all these things have immediate financial benefit.
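The usage-monitoring point deserves emphasis: with a webapp, every feature use passes through the server, so measuring it costs almost nothing, whereas a desktop app needs opt-in telemetry shipped to every user. A minimal sketch of the idea, assuming a WSGI-style Python webapp (the route names and the trivial wrapped app are hypothetical):

```python
from collections import Counter

usage = Counter()

def track_usage(app):
    """WSGI middleware: count every request path before handing off.

    The server sees each feature use anyway, so a Counter is all the
    instrumentation required.
    """
    def wrapped(environ, start_response):
        usage[environ.get("PATH_INFO", "/")] += 1
        return app(environ, start_response)
    return wrapped

# A trivial app to wrap, standing in for the real application.
def hello_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

app = track_usage(hello_app)

# Simulate a few requests (normally the web server drives this).
for path in ["/reports", "/reports", "/export"]:
    app({"PATH_INFO": path}, lambda status, headers: None)

print(usage.most_common())  # [('/reports', 2), ('/export', 1)]
```

In practice you would log to a database or an analytics service rather than an in-memory counter, but the asymmetry with desktop software is the point: the data arrives for free.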
It is no longer web application developers who have to justify their case, but rather those who still advocate desktop development.
If you need any further persuading, take a look at this survey of 1,400 of Microsoft's small business customers by Accredited Supplier. Apparently 62% prefer business applications that work through a browser, and only 18% prefer desktop applications. The more chilling news for Microsoft is that 13% actively intend to switch to Google Apps - with all that implies for sales of Windows Server, Exchange, and Office - while only 36% are sure that they are not switching.
All that cloud talk is translating rapidly into business decisions. I'm not sure whether the future of the business application client lies more with AJAX and HTML 5, Microsoft Silverlight, Adobe Flex, or something else; but for sure it is not a desktop technology.
The Apple Mac's OS X is a gorgeous operating system, beloved by its users, but outside the niche world of designers and musicians it is generally not the one used in business. It's an odd situation that puts pressure on system administrators to accommodate Macs into their systems, either from the aforementioned designers, or from eager home Mac users who want to enjoy it at work as well.
Last week Apple released OS X Snow Leopard, also known as OS X 10.6. I was at Apple's flagship store in Regent Street, London following a press briefing, and watching throngs of people line up to purchase their upgrades left me in no doubt about the Mac's continuing success. Nevertheless, because of its high price and lack of business penetration, the Mac remains a minority choice. The figures are slippery, but it may be around 10% of the US market by units and 20% by value (the difference showing how expensive Macs are), while worldwide the share is much smaller.
Now Apple has introduced native support for Exchange as a key feature of Snow Leopard, apparently giving admins one less reason to deny would-be corporate Mac users. Is it enough?
Unfortunately the Exchange support falls short. I've spent several days working almost exclusively in the new OS, exploring its Exchange integration. Although there is a lot to like, such as deep integration with Mail, iCal and tasks, there are snags. For a start, you need Exchange 2007 SP1, and earlier versions remain common. Even with that in place, Snow Leopard's Exchange support does not match Outlook on Windows. The new feature is based on Exchange Web Services (EWS), which does not expose all the features of Exchange.
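For context, EWS is a SOAP interface that Exchange exposes over HTTPS, conventionally at /EWS/Exchange.asmx; clients like Apple Mail talk to it by POSTing XML requests. A minimal sketch of what such a request looks like, built in Python (the server URL is hypothetical, and the code only constructs the request rather than sending it):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical Exchange server; EWS conventionally lives at /EWS/Exchange.asmx.
EWS_URL = "https://mail.example.com/EWS/Exchange.asmx"

# A GetFolder request asking for the default view of the inbox.
SOAP_BODY = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types"
               xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages">
  <soap:Body>
    <m:GetFolder>
      <m:FolderShape>
        <t:BaseShape>Default</t:BaseShape>
      </m:FolderShape>
      <m:FolderIds>
        <t:DistinguishedFolderId Id="inbox"/>
      </m:FolderIds>
    </m:GetFolder>
  </soap:Body>
</soap:Envelope>"""

# Sanity-check that the envelope is well-formed XML.
root = ET.fromstring(SOAP_BODY)

# Build (but do not send) the HTTP POST a client would make.
request = urllib.request.Request(
    EWS_URL,
    data=SOAP_BODY.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
```

The catch the article describes follows directly from this design: anything Exchange does not surface through these SOAP operations - and anything blocked because the EWS endpoint is not reachable from the public internet - is simply unavailable to an EWS-only client.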
I found myself missing features like the ability to send on behalf of another user, and access to public folders. Another practical problem is that unless Exchange is configured with EWS in mind, the new support might not work at all, especially when connecting over the public internet. Outlook has its problems, but its ability to use RPC over HTTP, avoiding the need for a VPN connection, is a brilliant feature. It is also strong at seamless online/offline working, stronger than Apple Mail. I'm also not convinced that Mail's new Exchange support is quite finished, as evidenced by several crashes I experienced.
The real issues are broader than this. Macs still fit awkwardly into Windows-based networks. System administrators like standardisation: standard application deployments, standard operating system builds that can be zapped and re-imaged; standard configurations that can be enforced using Windows group policy. The presence of Macs adds unwelcome complication.
There's also the matter of application compatibility. If your business depends on some little application written in Visual Basic 6 or Microsoft Access, for example, you have to figure out a way for Mac users to run it, by emulation, or terminal services, or some other route.
In the short term, then, Snow Leopard will not transform enterprise Mac adoption. Nevertheless, the Mac is a huge influence, beyond what its market share implies. Further, if a significant number of users are using Windows grudgingly, wishing they were not, that is not good for productivity.
Here are three suggestions. The first is to think cross-platform. The past is the past; but for new applications it is short-sighted to target only Windows. I suspect this lesson has largely been learned, but it still bears repeating. Have a cross-platform client - Silverlight may help for .NET developers, or there is Adobe Flash/Flex, or Java, or pure web applications, to mention a few options.
Second, learn from Apple. Microsoft is doing so, and Windows 7 is the evidence. An interview I did with Bill Buxton last year gives further background.
Third, is it really so hard to accommodate Macs? Yes, there are still issues, and different factors apply in every organization, but few problems are insurmountable.
Incidentally, I am no Mac zealot. I actually like Windows, and although I'm currently immersed in OS X 10.6, I intend to go back to using a PC most of the time. Some things are better on the Mac, some things are not, and most things are more expensive. It does pay, though, to have users working in the environment they prefer; and where that is possible, it makes sense to allow it.