I did start a blog back in 2008 but then forgot about it. I hope to get back to adding to it soon. For now, this was the last post.

A couple of years ago (2006 to be exact), I was staying at a hotel in Redmond, near Seattle. Not that I was visiting the Great Satan, I hasten to add - I was leading a data comms seminar at one of their neighbours. However, every morning in the hotel restaurant, there was a new group of well-scrubbed young men and women, all looking slightly anxious. I asked the waitress and she told me they were prospective interns going for interviews at Microsoft.

Ah, now I understood. Going back to the late seventies, I well remember many of my contemporaries shaving off their beards and donning blue suits (male) or getting their hair done and slapping on the make-up (female), looking just as anxious while waiting for their interviews as prospective IBM Systems Engineers.

There's one thing you can be sure of: when nervous middle-class college kids are queuing up in search of a good salary and pension, this ain't the most exciting company in town.

I have a track record as a Cassandra. Back in the mid-eighties I recall prophesying the downfall of IBM. Of course, this was in a pub with a pint of London Pride in hand. My then workmates wouldn't believe me. But my rationale was clear. The company had stopped innovating and was relying on inertia and FUD. Its big idea - Systems Application Architecture (SAA) - was more about control of the market than about offering something new and exciting, and the IBM PC, while successful, was opening up a market that IBM was having difficulty controlling.

The tipping point came in the late eighties with the PS/2 and its unlamented operating system, OS/2. With the PS/2, IBM tried to regain control of the market. Its Micro Channel architecture was no longer as open as the old PC's, and the PC would be brought back into its corporate product strategy.

The flaw in the plan was that the independent add-on and clone market that had developed around the old PC was pushed aside just a little too quickly. Instead of falling in behind IBM, the clone makers went off and evolved the original PC with the Extended Industry Standard Architecture (EISA), and took the market away from IBM. Once the old King was shown to be vulnerable, inertia and FUD could no longer work their old magic, and IBM was in trouble. Only by re-inventing itself as a software services company did it have a future.

Let's fast forward to 2008. Is Microsoft now in the same position as IBM in the mid-eighties? It is certainly in a dominant position, and I can't recall an innovative new product that has come out of Microsoft in the last decade - at least not of the iPod or iPhone variety. It has .NET - arguably the SAA of its day - and where would Microsoft be without Office?

And then there is Windows. Could Windows Vista be Microsoft's PS/2 moment?

If any recent product qualifies for the title of "dog" then Vista must be a leading candidate. Successful OSs usually enable their users to do new things. Remember Windows 95 and the great leap it made from 16-bit to 32-bit computing, or the big improvement to corporate computing that NT4/2000 offered. Even Windows XP offered something by bringing the consumer and NT lines together.

But Vista seems to be about disabling the user rather than enabling them. Digital Restrictions Management (DRM) pervades the product, deliberately degrading operations as soon as you have added something it doesn't like to the system. It tries to stop you upgrading by demanding revalidation after only a few changes (just try upgrading the motherboard and see what happens). Just like the PS/2, it seems to be attacking the after-market in PC add-ons and upgrades.

And then there is security. Vista seems to think that security is about nagging the user - or crying wolf, as it is sometimes known. It looks good on paper, but it is not long before you become so used to clicking OK that you stop reading the messages...

And then there is what it is not. Why is 64-bit computing not the norm rather than 32-bit? Why is Internet Explorer not run in a sandbox, stopping viruses and malware in their tracks? And, if there is one thing I object to, it is having to pay extra for anti-virus and anti-spyware software. In any other industry, liability for a product's weaknesses is the responsibility of the supplier, not the user.

So is this example of bloatware for control freaks the turning point?

DRM is probably the big mistake. It is a consumer turn-off. Remember dongles? And the direction of music downloads is now towards DRM-free MP3s. Worse still are the limitations on consumer upgrades that Vista imposes. This can only piss off the power users and gamers that always want the latest and greatest, and as for the companies that make their money in this area: they will surely be wondering whether Vista offers them any future.

Virus-ridden, bloated and a consumer turn-off. That is the charge that Windows Vista has to face down if it is to have a future and, indeed, if Microsoft is not to have to re-invent itself as IBM did before it.

For myself, Vista was the final catalyst that pushed me towards Linux on the desktop - Ubuntu Linux, to be precise. I have used Linux servers for years, but the move to the desktop was new. VMware was also important in this move, as I still have some legacy Windows applications, and revving up Windows XP in a virtual machine has turned out to be an excellent way to run them.

All in all, the move has turned out to be much smoother than I expected. The host Linux environment, with Firefox for browsing, Thunderbird for email and calendar, and OpenOffice for spreadsheets, word processing and presentations, provides the computing backbone. The main use of Windows (in a virtual machine) now seems to be a final MS Word compatibility check before sending off documents to clients still stuck with MS Office. Even proprietary applications such as Skype and RealPlayer work under Linux.

And Linux is native 64-bit, robust and seems to be much less at risk from malware. What more can I say?

For me, Vista was Microsoft's PS/2 moment. What about the rest of you?


I have had the privilege of seeing the Government IT process from many different directions. I have bid for work, supported procurement officers in preparing their requirements specifications and evaluating tenders, and even been part of a technical audit team for a large UK Government IT procurement that went wrong. I used to think I knew how to make Government IT procurement work. From experience, I now know that that was an arrogant view. No one knows, or can know, how to make the Government IT procurement process work. The system itself is flawed.

I suppose that I should have known from my student days why you can't run any business by simply handing out contracts and letting the contractors get on with it. As they say, "be careful what you ask for, because it might come true".

In mathematics, there are two related theorems that tell you this - Gödel's incompleteness theorem and Turing's proof of the undecidability of the halting problem.

Gödel's incompleteness theorem is perhaps one of the most fascinating theorems in logic. It tells you that any consistent system of logic rich enough to express arithmetic is incomplete: there are truths it cannot prove, which can only be proved by extending the system. And since the extended system reveals further such truths, further extension is required, ad infinitum. For example, it tells you that you can never publish a complete work of mathematics without finding that it is soon made out of date.

Gödel's incompleteness theorem is also the first gotcha for Government IT. The Government IT process usually starts with a big (and I mean big) requirements specification - a set of rules that the system must comply with. This will be the basis of the contract, and it is vitally important that it is correct and complete. Unfortunately, as Gödel tells you, it can never be complete. The requirements specification is no more than a set of rules, and there will always be "truths" - that is, self-evident features of the system - that fall outside those rules. No matter how much work you put into the development of this specification, it will never be complete.

A wonderful example of this problem is provided in Peter Cook and Dudley Moore's excellent film Bedazzled. In the film, Peter Cook's Devil grants seven wishes to Dudley Moore's character, but each wish is interpreted in the least advantageous way. In his last wish, Dudley believes he has specified everything: he and the girl are in love, there is no competition, the surroundings are idyllic with no interruptions, etc. However, he has overlooked a self-evident truth - that he should be a man! When the wish is fulfilled, he and his girl are both nuns in a nunnery, and theirs is a forbidden love.

This film should probably be required viewing for every Government IT Procurement Officer, as it illustrates so well that it is not what you put in the Requirements Specification that is important - but what you leave out.

So, what happens next? The Requirements Specification is then sent to the Devil - sorry - to potential contractors, who now put together a bid.

I have often been on bid teams. They are enthusiastic, the adrenalin flows as you compete to win the business, and you are usually the 'A' team, as the company puts its best (wo)men forward to win the business.

You go through the Requirements Specification and will usually spot a few "truths" that the writer has overlooked. To impress the customer, you reveal those missing truths, hoping to trash the opposition by doing so. It usually works, and the customer gets the warm feeling that you know what you are talking about.

Of course, there will still be "truths" that you yourself have overlooked. You know this from experience, and so you include contingency in your estimates and hope you haven't overlooked too much.

So your bid is prepared and it goes to the Sales Director for sign-off. Now, the customer will already have had an internal budget approved, and that is usually quietly made known through whispers in corridors. The Sales Director will be aware of this and, when he sees your estimate, which will almost always be well in excess of the budget, the race is on to push down the cost and win the bid. Your contingency gets cut, your estimates are reduced, and eventually what was a sound estimate becomes an aggressive one, with many risks signed off.

So the bids go in. Most will pass the evaluation process, as they will appear to have been well thought out, and the contract will be awarded on price - typically to the contractor that was most aggressive in paring down its estimates, i.e. the one at the biggest risk of making a loss.

The contract is awarded and the project team assembled. The 'A' team have gone on to the next bid, or to fire-fight an older project that is now in trouble. The assembled team will be optimistic but lacking in experience, and the Project Manager will now be starting to worry about how the project can be brought in within budget - and will usually be busy cursing the Sales Director for landing them with an under-budgeted project.

The project now enters its honeymoon period, as customer and project team get to know each other - but this honeymoon rarely lasts long. The Project Manager is charged with making a profit and has no real interest in simply being nice to the customer. As the project team gets to work, they start to understand what is really wanted, spot more previously unrevealed truths, and begin to realise why there was contingency in the original bid.

For the Project Manager, this is bad news, as the estimates for the work are now starting to creep up and profit is turning into a loss. The only way out of this is to start reading the contract very carefully and to prove to the customer that there are essential requirements that were not in the original contract - and hence will require additional payment.

Project Managers can rarely afford to be nice to their customers. One piece of advice I once received that should be engraved on every Project Manager's forehead is that "if the contract does not say that the customer should not freeze in the dark, you let them freeze in the dark".

What now follows is not just the end of the honeymoon, but an increasingly acrimonious discussion that almost, but not quite, ends in divorce. The project team seeks to educate the customer as to what they missed out of the requirements specification and why it is so important to extend the contract to cover it. The customer seeks to prove that it was in the original contract - or at least that the implication was obvious. The argument often comes down to interpretations of clauses that were never even considered when the contract was drawn up.

The end result is usually compromise. More money is found, while the contractor gives way on some new requirements. However, this is a never-ending process, as both sides come to understand better and better what the system should do - and what was missed out of the original specification. Sometimes the changes requested are very radical, and important functions are dropped altogether because there is no money left to fund them. Project timescales are extended to cope with the degree of change.

Even when the extra money has been found, costs usually escalate, as estimates are found to have been too optimistic and, worse, invention was scheduled into the project plan. This happens far too often and is the result of a bid put together in haste. A subsystem is specified and a work estimate prepared for its development, but no one has really thought about how it would be implemented. The project team only realises this when they start the job and it becomes clear that they not only have to implement a solution, but determine what that solution should be.

However, the Project Manager still has one weapon left in their noble duty to make a profit for the company, and it is the one given to them by Turing's proof of the undecidability of the halting problem. To put it simply, undecidability means there is no general, mechanical way to show that a program meets its specification - i.e. writing the complete Test Plan into the Requirements Specification ain't possible in any meaningful way.
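The argument behind Turing's result can be sketched in a few lines of Python. The names `halts` and `contrarian` are purely illustrative; the whole point is that no genuine, total `halts` can ever be written, which is why the stub below can only raise:

```python
# Sketch of Turing's diagonal argument. Suppose a total decider
# halts(f) existed, returning True iff calling f() would halt.

def halts(f):
    """Hypothetical universal halting decider - Turing proved it cannot exist."""
    raise NotImplementedError("no such total decider can be written")

def contrarian():
    # If the decider says "halts", loop forever; if it says "loops", halt
    # immediately. Either answer halts(contrarian) gives is therefore wrong,
    # so the assumed decider cannot exist.
    if halts(contrarian):
        while True:
            pass
```

The same diagonal trick defeats any hypothetical "does this program meet its spec?" oracle, which is why no requirements specification can pin down testing completely.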

You can always tell the real novices in software procurement: they do not even require that all acceptance tests be performed and passed on the same software build. All they do is put in some vague sentence about writing a Test Plan that tests all requirements.

Testing is rarely considered properly in the original requirements specification, and is left to broad statements of intent, references to various methodologies, and the need for an outline Test Plan to be provided with the bid. "Outline" is usually the operative word, and insufficient attention is given to the depth of testing necessary. This gives the Project Manager every opportunity to cut back on testing and hence save cost.

So we finally get to the end of the project, and the delivered system is signed off against a somewhat minimal set of acceptance tests. The system was developed by an often inexperienced team against a Requirements Specification with many holes in it. Some of those holes were filled, but other functions were knowingly dropped to save costs - and there will always be missing functions that are not there because no one realised they should be. The result is then tested against an inadequate Test Plan.

And they wonder why Government IT Projects fail.

How to make it work? Well, that's another story, but any answer needs to recognise that IT cannot be developed by contract alone. It is an iterative process, with developers and users working alongside each other, as both come to understand better what has to be achieved.

When researching a paper on secure communications recently, I came across an interesting paper from the NSA. The paper is about the benefits of elliptic curve encryption technology, and argues that elliptic curves have many advantages over the more traditional RSA public key encryption, such as shorter key lengths and less computational effort for the same level of security. However, as the paper notes:

Despite the many advantages of elliptic curves and despite the adoption of elliptic curves by many users, many vendors and academics view the intellectual property environment surrounding elliptic curves as a major roadblock to their implementation and use.

One company alone holds 130 patents, and others also claim patents, so it is no wonder the industry has largely given elliptic curves a wide berth and stayed with the now out-of-patent RSA. One more example of how patents slow down technology adoption.
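The underlying mathematics is not the obstacle - the core operation, adding points on a curve, fits in a dozen lines. Here is a toy sketch over a tiny prime field, purely to illustrate the mechanics (the curve, generator and their order-19 group are a standard textbook example, not anything of cryptographic strength; real ECC uses primes of around 256 bits, which NIST rates as comparable in strength to 3072-bit RSA):

```python
# Toy elliptic-curve arithmetic: the curve y^2 = x^3 + 2x + 2 over F_17.
P, A = 17, 2                  # field prime and curve coefficient a
G = (5, 1)                    # a generator point; the group it generates has order 19

def add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None           # vertical line: the sum is the point at infinity
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Scalar multiplication k*pt by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc
```

A Diffie-Hellman-style key exchange on a (real) curve is just this: each side picks a secret `k` and publishes `mul(k, G)`; the shared secret is each side's `k` applied to the other's public point. Simple mathematics - it is the patent thicket around the efficient implementations, not the theory, that scared vendors off.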

Why do we allow this to keep happening? It's not as if software patents reward those with the ideas. Very few new technologies are great leaps forward and patented algorithms are usually built on a substantial body of work that was already in the public domain. Slowing down innovation is not what patents should be about.

The most notorious example of the (legal) mis-use of patents was Unisys' LZW patent. LZW is a refinement of the public domain LZ78 lossless data compression algorithm - essentially, it pre-initialises the compression dictionary with all the single-character strings. A useful contribution, but hardly rocket science. Without the preceding work, the patent would never have been possible, and the improvement itself was only a minor one.

The problem is that when CompuServe came along and defined the format for the GIF image file, they apparently thought that Unisys would not insist on royalties for its small improvement to LZ78, and built LZW into their specification. GIF then became popular and, along with JPEG, became a standard image format for the World Wide Web. Then Unisys, with the full force of the law on their side, turned up to claim their royalties.
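To see how modest the patented improvement was, the whole LZW scheme fits in a few lines. This is an illustrative sketch, not the byte-exact GIF variant (GIF adds variable-width codes and clear/end markers); the dictionary pre-seeded with all single characters is the refinement LZW added to LZ78:

```python
def lzw_compress(data: str) -> list[int]:
    # Dictionary starts with every single character (LZW's contribution).
    table = {chr(i): i for i in range(256)}
    next_code, w, out = 256, "", []
    for c in data:
        wc = w + c
        if wc in table:
            w = wc                      # keep extending the current match
        else:
            out.append(table[w])        # emit code for the longest match
            table[wc] = next_code       # learn the new string
            next_code += 1
            w = c
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes: list[int]) -> str:
    table = {i: chr(i) for i in range(256)}
    next_code = 256
    w = table[codes[0]]
    out = [w]
    for k in codes[1:]:
        if k in table:
            entry = table[k]
        elif k == next_code:
            entry = w + w[0]            # the string-not-yet-in-table edge case
        else:
            raise ValueError("corrupt code stream")
        out.append(entry)
        table[next_code] = w + entry[0] # mirror the compressor's dictionary
        next_code += 1
        w = entry
    return "".join(out)
```

Running `lzw_compress` on a repetitive string like "TOBEORNOTTOBEORTOBEORNOT" produces fewer codes than there are input characters, and `lzw_decompress` recovers the original exactly - the entire substance of a patent that held GIF to ransom for years.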

The classic patent Troll in action. Patent an idea, perhaps a small improvement over other people's ideas. Sit back and wait for someone else to put in all the hard work needed to get the product out there and make it a standard - and then cash in. All perfectly legal.

There is no absolute right to property, whether intellectual or otherwise. The concept of property only exists because it is useful to society. We allow land to be owned so that the farmer that puts in the work to raise the crops has the exclusive right to harvest them. Anything else would be inequitable and would make it pointless to farm. Without land ownership, either individual or collective, we are all hunter-gatherers.

We have copyright because the author that puts in hours, days or weeks to write a book needs to be protected from unscrupulous publishers who would otherwise copy the result of all that work and seek to profit from someone else's activity. Again, without copyright the reward would go to those who profit from others' work rather than their own.

And, we also have patents, in order to recognise that those who put in years of research into understanding a technology and testing out various designs can publish their work, add to the total sum of human knowledge, and still earn a return on their investment. Again, steering profit to those who put in the effort.

But ideas are another matter. Anyone can have ideas, and they rarely occur in isolation. Ideas are often built on other ideas, and the free exchange of ideas is needed to advance our society. Patents on ideas lead to the balkanisation of those ideas, as different players each patent a small part of a bigger idea. When someone makes that big idea a reality, the holders of patents on the small parts all rush to claim their fee, without having contributed anything, or at most only marginally, to its success.

Patents on ideas give the benefits to those who register the idea rather than those who put in the work to make it a reality - the reverse of what intellectual property should be about. By failing to recognise that it is not the idea that is important, but the effort needed to bring the idea into reality, they skew the playing field and stand in the way of investment rather than promoting it. Instead of "best practice" being quickly adopted throughout an industry, a patent can stop it in its tracks.

Nowhere is the failure more obvious than when algorithms are patented. A program is the sum of hundreds, perhaps thousands, of algorithms. If each algorithm were subject to a separate patent, it would never be feasible to produce that program. Simply identifying each effective patent would take longer than writing the program in the first place, and as for the cost of licensing each one simultaneously... And how do you cope with the patent holders that see their patent as a way of protecting their market share?

Why should someone put all the effort necessary into writing a program, making it both useful and intuitive to use, and testing it, when non-contributors can take the profit?

And anyway, why should algorithms be treated as inventions at all? Algorithms are discovered, not invented: a fact that should be obvious to any student of Turing machines.

I suppose the answer to why software patents have not been banned lies in the usual fact that patents on ideas benefit those with money. They can be used to protect market share and preserve monopoly positions. Accountants like them because some notional value can be placed on them to bulk up a balance sheet, and lawyers like them for the obvious reason. Who loses out? The consumer, who is denied both the innovative new products that patents keep out and the downward pressure on costs that comes from breaking up monopolies.

We in the West also lose out, as patent law, when applied to ideas, works against start-up companies, while those in the East are less bothered about the niceties of intellectual property law.

So why don't politicians do something about it and declare war on patents on ideas? I suppose the answer is that they need campaign funds. And who provides those funds? See above.