From Handicraft to the Cloud: Part 1 of 2

As in the industrial revolution, progress in the computer revolution comes at a price.

Despite all the technological innovation, computing is all too often a frustrating and limiting experience. Is this because Google, Apple, Facebook or Microsoft are evil and lock down hardware, platforms, software and content? NO! Is this because we should all avoid proprietary software, even freeware, in favour of mutual, co-operative open-source projects such as Linux? Well, that is barely half the story: free open-source software is not immune to industry trends such as cloud computing, bloat, eye-candy, new-version fetishisation and app stores. The elephant in the room is the set of broad historical trends in the industry which affect free software somewhat less than proprietary software, mirror the industrial revolution, and tend to disempower, limit and alienate (in the Marxist sense) the end user. Software and personal computing suffer from class divisions.

2011 was another year of hype for cloud computing. In June 2011 Google launched the Chromebook and Apple announced iCloud. The Google Chromebook is no ordinary laptop: it relies on storing software and your data on Google’s servers. This is called cloud computing and has been considered the next big thing in IT by market experts for some years. The term ‘cloud’ is appropriate, since its benefits are nebulous and it may also represent dark clouds on the horizon for personal computing.

The history of personal computing is almost as old as the first manned moon landing in 1969, and in technological terms the personal computers of today are certainly far more advanced than the machines of that era. Why on earth, then, is personal computing a frustrating and limiting experience? By 1965, Gordon E. Moore had predicted the rate of advancement in computer hardware (doubling every 18 months), which has proved largely accurate. By 1973, the first mouse-driven graphical user interface had been produced. Niklaus Wirth observed that ‘software is getting slower more rapidly than hardware is getting faster’. This parallels Stanley Jevons’s observation over a century earlier that ‘advances in efficiency tend to increase resource consumption’. To find out why this is the case we have to look at the history of personal computing and its potential downfall.
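
A toy calculation makes the tension between these two observations concrete. The Python sketch below is purely illustrative: the 18-month hardware doubling comes from the popular statement of Moore’s Law above, while the yearly software slowdown factor is an assumed figure standing in for Wirth’s observation, not a measured one.

    # Illustrative arithmetic only: compound an 18-month hardware doubling
    # (Moore's Law, as popularly stated) against a hypothetical yearly
    # slowdown in software (standing in for Wirth's observation).
    DOUBLING_PERIOD_YEARS = 1.5        # popular statement of Moore's Law
    SOFTWARE_SLOWDOWN_PER_YEAR = 1.6   # assumed figure, purely for illustration

    for years in (5, 10, 15, 20):
        hardware_gain = 2 ** (years / DOUBLING_PERIOD_YEARS)
        software_cost = SOFTWARE_SLOWDOWN_PER_YEAR ** years
        print(f"after {years} years: hardware x{hardware_gain:,.0f}, "
              f"software overhead x{software_cost:,.0f}, "
              f"speed felt by the user x{hardware_gain / software_cost:.2f}")

With a slowdown factor only slightly above the hardware doubling rate, the speed the user actually feels creeps downwards even as raw hardware capability grows ten-thousandfold, which is exactly Wirth’s complaint.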

‘A computer in every home’

The first million-selling computer book was The Art of Computer Programming by Donald Knuth in 1968. Although it was an incredibly technical book, Knuth liked to stress the ‘art’ in the title, and that spirit stands in stark contrast to the industry of today. In other respects, the sentiments among computing enthusiasts echoed ones familiar (especially to socialists) throughout history. In particular, the Hacker Ethic (described in Steven Levy’s 1984 book Hackers) included such noble statements as ‘all information should be free’ and ‘access to computers should be unlimited and total’. This was not unusual for the time. Popular computing literature, including magazines and books such as 101 BASIC Computer Games (David H. Ahl, 1973), printed lines of code and encouraged users (especially children) to type them in to produce games. Most personal computers offered a command-line interface (even those with an additional graphical user interface) and were bundled with some form of the BASIC programming language, so named because of its ease of use and suitability for learning. The learning curve for using home computers was steep compared with today, but popular computing literature at the time helped make it somewhat more graduated. Despite its significance, very few writers have lamented the disappearance of BASIC; perhaps the best-known article on the subject is ‘Why Johnny Can’t Code’ (David Brin, 2006).
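
To give a flavour of those listings, here is a minimal sketch of the classic guess-the-number game that books of this kind carried, written in modern Python rather than period BASIC and intended purely as an illustration of how short, and how invitingly modifiable, such programs were.

    # A guess-the-number game of the sort readers were encouraged to type in,
    # run and then tinker with (change the range, add score-keeping, and so on).
    import random

    secret = random.randint(1, 100)   # the computer picks a number from 1 to 100
    tries = 0
    while True:
        guess = int(input("Guess my number (1-100): "))
        tries += 1
        if guess < secret:
            print("Too low!")
        elif guess > secret:
            print("Too high!")
        else:
            print(f"You got it in {tries} tries.")
            break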

As Neal Stephenson put it, in the beginning there was the command line, and Microsoft had the odd idea of selling operating systems. It was the Apple Macintosh, however, that introduced the first commercially successful graphical user interface, with drag-and-drop capability and a WIMP (Windows, Icons, Menus, Pointers) interface, in 1984. Just a year later, the Commodore Amiga 1000 made colour, animation, sound and multi-tasking affordable to home users. Although the desktop metaphor for graphical user interfaces was used by rivals, the Amiga offered an indicator of the ethos of the time: it used the metaphor of a deeply customisable workbench for its operating system. The desktop metaphor prevailed partly because home computers in the West evolved out of the office at a time when industrial capital was on the decline. But the desktop also prevailed over the workbench because empowering users to control the means of production was gradually becoming an alien notion.

No single business seemed able to establish a hardware monopoly, let alone a software monopoly, that went unchallenged by rivals. In January 1986 PC Magazine reviewed fifty-seven different word-processing programs. Even the most popular application software, such as WordStar, AmiPro and WordPerfect, was largely produced by small teams and in some cases by individuals. The spirit of the age was described as the era of the bedroom programmer, although this is possibly a little exaggerated. Sharing software was widespread: computing magazines distributed cover disks with public-domain and shareware software, and users exchanged software through classified advertisements. Software developers might not have liked it, but magazines were an important channel for distribution. The cost of the disk was generally regarded as an acceptable price for software, and this was the attitude in businesses as well as at home. The limitations of the hardware of the time also meant that developers were expected to optimise code to be as fast as possible.

Windows 95

The personal computer industry grew rapidly over subsequent years. By 1992, Amigas had fallen by the wayside. Ataris were cheaper and by 1993 could boast multi-tasking, but by then it was too late. A monopoly position had already been established by IBM PC-compatible hardware, and Microsoft consolidated its monopoly in software with the $300m launch of Windows 95. Although users may have been reluctant to embrace planned obsolescence, this was a time when the vision of ‘a computer in every home’ still involved selling hardware to first-time buyers.

The truth behind the hype was a little different. As RoughlyDrafted.com (5 February 2007) comments:

‘From the mid 80s to the mid 90s, Microsoft amassed fortunes as an application developer for the Mac. Even in 1996, Microsoft reported making more money from Office ($4.56bn) than it did from all of its Windows sales combined ($4.11bn). Tying sales of Windows 95 to Office helped to boost sales of both. Microsoft pushed the new version of Office as a reason to buy Windows 95, and Windows 95 helped kill sales of rival applications, including the then standard WordPerfect and Lotus 1-2-3, neither of which were available or optimized for Windows 95 at its launch. By the release of Windows XP in 2001, Microsoft had swallowed up 98 percent of the OS market.’

Innovation, but not for the masses

Although Windows 95 firmly established the desktop metaphor over rivals, this was the limit of its innovation, and other enhancements were criticised as merely cosmetic. The successful introduction of the Encarta encyclopedia software on CD-ROM was regarded as a cutting-edge use of technology for encyclopedic content. That, by the next major Windows release, encyclopedias might no longer be traditionally editorially controlled and might instead be participatory was not anticipated by Bill Gates in his book The Road Ahead (1995) or in its heavily revised 1996 reprint.

Many innovations that appeared after the software monopoly had been achieved never reached the masses, or did so only many years after they first appeared. IBM’s OS/2 never replaced Windows 95, though some considered it more advanced. By 1997, an operating system called BeOS had been introduced with near-instant boot and a 64-bit journaling file system offering indexing and metadata tags, but this too never reached the masses. The first 32-bit internet web browser with an FTP client, Usenet newsgroup reader and Internet Relay Chat (IRC) client was not from Microsoft but was Cyberjack, in 1995. Yet by embedding Internet Explorer into Windows just as the internet was taking off, Microsoft was able to delay tabbed web browsing becoming standard (until 2006), a feature that already existed in rival browsers descended from the relatively popular Netscape. Internet Explorer became so dominant that for about five years after 2001 Microsoft felt no need to introduce a new version. By then it could no longer ignore the threat of Mozilla Firefox (loosely descended from Netscape), which was rapidly gaining market share.

At least the marketing for new versions of Windows claimed to offer usability improvements and fixes for the many problems identified in previous versions, rather than just eye-candy. What became clear beyond any doubt was that software was becoming more bloated with each passing year, at a rate faster than Moore’s Law could compensate for. Benchmarking tests are one way to measure this, and are sometimes used in the independent computing press.
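
As a minimal sketch of what such a benchmark involves (the workload here is a stand-in of my own; a real test would script the application or operating system under comparison), one might simply time the same task before and after an upgrade:

    # Time one representative task; running the same script on the same machine
    # before and after a software upgrade gives a crude bloat comparison.
    import time

    def representative_task():
        # stand-in workload; replace with the real operation under test
        return sum(i * i for i in range(1_000_000))

    start = time.perf_counter()
    representative_task()
    print(f"elapsed: {time.perf_counter() - start:.3f} s")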

Bill Gates commented: ‘I’m saying we don’t do a new version to fix bugs […] We’d never be able to sell a release on that basis’ (Focus Magazine 23 October 1995).

The vision of ‘a computer in every home’ began to look dated. Instead, the focus shifted to encouraging existing computer users to upgrade their software. It suited hardware manufacturers that software updates should make older computers slower. Whereas the earlier trend was for first-time hardware sales to come packaged with software, now software sales (with their artificial barriers) would drive the need to buy new hardware.

Games also played a big part in driving early hardware sales of the first personal computers in the home. Games revenue eventually overtook that of the movie and music industries, and games were even described as the leading artform of the era. The latest ‘Call of Duty’ game was the biggest entertainment launch ever in revenue terms. Games helped drive industry upgrades, but many users’ reluctance to upgrade persisted, and Windows sales through retail channels continued to decline. Planned obsolescence needed to be introduced more forcefully, and subscriber computing and the internet were about to offer the opportunity to do so.

The emergence of viruses and malware on the burgeoning internet helped the software update industry. The idea of software spying on the user or otherwise compromising privacy was something malware and viruses did, not legitimate software. Users owned their software, and anything else was an alien concept. As one user on MSFN.org put it:

‘I will never understand why users tolerate or accept this. If an individual or company demanded that you prove that you did not steal your home or car, you’d eventually file some kind of complaint or harassment charges against them. If the same standards that are used for applications were applied to operating systems, XP and newer systems would be classified as spyware. Windows has been going in the opposite direction for some time, with each new version giving the user less control over what it does and less access to the data it stores.’

This comfortable position of around 90 percent market share seemed as though it could not be threatened by any rival. Journalists of the computing press might have been tempted to describe the hardware and software monopolies as the end of home personal computing history. But to do so would have been as foolish as Francis Fukuyama’s claim to have reached ‘The End of History’ a decade earlier.

DJW