A Brief History of how Microsoft (and others) Changed the World… Part 1.

In October of last year I pitched an idea to the editor of Backbone Magazine for an article about how Microsoft changed the world.  We were a month shy of the 25th anniversary of Microsoft Windows, and I thought it would be a fitting and timely piece.  He asked me to write a short piece giving my view of it, but after sitting down at the keyboard for a few hours I realized that I had written over 5,000 words… and was nowhere near finished.  Peter and I agreed that the piece would be wrong for Backbone, but I have continued to work on it with the goal of publishing it here.  As it is a combination of history (although I have not checked most of the facts – they are from memory) and opinion, where better to publish it than on The World According to Mitch?

The following is only the first installment… there will be several parts to this article before it is done.  I hope you enjoy this trip down memory lane, and look forward to your comments and feedback! –M

In 1975, chiefly on the strength of a magazine story about a kit computer – the MITS Altair 8800 – a company was founded in Albuquerque, New Mexico by a Harvard dropout and a childhood friend. Within five years nearly every home computer in the world would be running their software.

While that may seem like an incredible feat, we must put it into the perspective of the day: while there were a lot of people trying to develop a market around the home computer, there weren’t really all that many such machines out there, and the ones that were bore little resemblance to what we would recognize today as ‘personal computers.’ They looked much more like video game machines with keyboards – and some of the biggest names were exactly that.

It certainly would have been a far-out prediction to make in 1975 that not only would most households in the developed world have at least one – and often several – computers, but that computers would become as ubiquitous as the toaster and in many cases begin to replace common devices such as the television, the radio, and the telephone.

How did we get from there to here? The music revolution can be attributed to a number of recognizable factors, such as the development of the MP3 file format and personal digital music players such as the iPod; what were once called ‘record stores’ are going bankrupt faster than you can take notice of an industry’s demise. Voice over IP (VoIP) has started replacing switched POTS (Plain Old Telephone Service) in recent years, but it was fifteen years ago that people started connecting microphones to their computers for the express purpose of speaking with people remotely. Before that – as far back as the early nineteen-eighties – computer-savvy people (kids may have been the most prominent, but there were plenty of wired-in adults as well) were connecting their computers over the traditional phone system (using a device called a modem, short for modulator-demodulator) to chat with (and in many cases to make) friends on public and private Bulletin Board Systems (BBSes). The Department of Defense (DARPA, actually) had already collaborated with universities to develop an international network of interconnected mainframes that was originally called the ARPAnet but which today is open to all and is more commonly known as the Internet. Television was never a passing fad, but the development of technologies that allow highly compressed transmission of audio and video over the ‘ether’ has forced traditional television to evolve without any identifiable plan, while tolling the death knell of the video rental store.

All of this happened independently (although admittedly the two industries that seem to be the common thread in the development of the enabling technologies are, strangely enough, the military and the adult entertainment industry – strange bedfellows indeed!), and to attribute all of these technological evolutions to a single company or even a single movement would be folly. However, it would be a lie to say that our world has not been changed by the advent of the personal computer industry.

In the 1970s the personal computer (a term that had not yet been coined) was strictly the domain of hobbyists and enthusiasts – often referred to derogatorily as ‘geeks’. We had come a long way since the days when the only way to interact with computers was in machine language (the world-changing leap to compilers and higher-level languages is credited to Rear Admiral Grace Hopper, USN), but that still did not mean we could have an intelligent, fluid conversation with them. Many of us remember the first time a computer said hello to us… however, in order for that conversation to take place, someone first had to write a program to make it happen… something like the following:

10 PRINT "Please type your name."
20 INPUT Name$
30 PRINT "Hello, "; Name$; "! How are you today?"
40 INPUT How$
50 IF How$ = "Good" THEN PRINT "That's great! I hope it continues!": GOTO 100
60 PRINT "That's too bad. I hope your day gets better!"
100 END

Enthusiasts all over North America were learning the Beginner’s All-purpose Symbolic Instruction Code (BASIC) – a language originally created at Dartmouth College, but brought to the microcomputer by that group of kids in Albuquerque. The first ‘creation’ that the Microsoft Corporation marketed – Altair BASIC – was a resounding success… but still not nearly popular enough to actually change the world.

By the turn of the 1980s there were several home computers on the market; three of the most popular were the Atari 800, the Commodore VIC-20, and the Apple ][. Atari was a pioneer and industry leader in video games, and making the leap from one industry to the other was, at the time, not much of a stretch. Commodore was better known for business machines and devices – they made the PET computer, but also calculators and filing cabinets. It was Apple Computer – essentially a couple of guys named Steve working out of a garage – that was the dark horse; they were also hungrier than their competition, and had something that the established players lacked: they were just like their target market – hobbyist geeks. In all likelihood nobody reading this article is using a computer by Atari or Commodore today… certainly not as their primary computer. However, statistically there is a decent chance that some of them are using an Apple, and if you expand that out to mobile devices that percentage shoots up from the single digits to the very high double digits. They have had good times and bad, but of the major players from the dawn of the personal computer era Apple is one of the few that remain.

Apple was so successful with the Apple ][ line (including the ][+ and the hugely popular //e) that companies started developing ‘clones’ of the platform – essentially copies of the computer that looked the same, felt the same, and ran the same programs. In the early 1980s, while high-end computer stores were selling the real McCoy for around $3,000, smaller computer stores (and electronics stores, and eventually department stores) began selling these clones for about half the price. They were so successful that they began to hurt Apple’s bottom line. When it was time to develop the next platform Apple would not make the same mistake… they would create a proprietary system that could not be copied so easily. Although rumours of Mac clones would occasionally surface (and Apple itself briefly licensed clone makers in the mid-1990s), nobody ever successfully cloned Apple’s GUI-based system the way the Apple ][ had been cloned.

Microsoft’s founders decided early on that they could either make hardware or software, but that to make both would mean to make neither well. They had done very well with the BASIC interpreter that ran on most computers of the day. When the giant of the day, International Business Machines, approached them about providing an operating system for what it was calling the IBM PC, they made what might be one of the most fateful decisions in the history of the computer industry: rather than developing the OS and selling it to IBM outright, they would license it, so that every computer IBM sold would include that OS – and IBM, in turn, would pay Microsoft a licensing fee for every copy. The leaders of IBM, who at the time were not convinced that the PC would ever amount to much, thought they were making a good deal. In retrospect, this decision would be one of the nails in the coffin of Big Blue, while at the same time making a lot of millionaires and then billionaires at Microsoft.

As IBM had traditionally made its fortune selling hardware, it missed the signs that the future leaders of the computer industry would sell not hardware but software: the killer applications for the platform – most famously Mitch Kapor’s Lotus 1-2-3 – would come from outside companies. Like the Apple before it, the PC would be cloned and sold for much less money. Rather than developing their own CPU, IBM turned to Intel for the 8088, and did not secure any exclusivity, which left Intel free to sell the same CPUs to companies like COMPAQ and others. Intel, in turn, had not thought that anyone would really want to clone its CPUs, and did not protect its designs very well.

Just as the Apple //e had been cloned and sold cheaper by the competition, IBM’s PC was cloned and sold for a fraction of the cost. IBM was a giant too big to recognize the threat, and too set in its ways to notice the world passing it by. When COMPAQ was first to market with a PC that was completely compatible with IBM’s but which leveraged the power of Intel’s new 80386 processor, IBM was too far behind to catch up quickly. The onetime industry leader reinforced its image as a fading giant when it finally released the PS/2 line – a number of which still ran the older 8086 and 80286 processors. Today, thirty years after the IBM PC was introduced, the vast majority of us have PCs that can trace their lineage back to that original machine developed by IBM in Boca Raton, Florida. Very few of them are made by IBM, and even the direct descendants of the original are made by a Chinese company called Lenovo, which bought IBM’s PC business several years ago.

In the meantime Intel, whose CPUs were also being cloned, was getting wise. While it was too late for the 80386 and 80486 lines, the next generation – the Pentium line – would be different enough in architecture that it would be difficult to clone properly, while at the same time switching from a generic number to an actual trademarkable name ensured that the competition would have to evolve on their own. One of those competitors, AMD, did exactly that, and would again become real competition for Intel – but with its own designs, not simply clones of the original. (Although to this day we refer to 32-bit software as x86, from the original Intel design, it is worth noting that it was AMD that was first to market with a 64-bit x86 processor; and while we refer to that architecture as x64, the familiar i386 directory in Windows has quietly been retired, and AMD64 is prevalent, if silent, in the Windows source and installs.)

In the late 1970s Xerox (another major player in the business machines industry) dabbled with personal computers, but wanted to make the experience easier for the end user. At its Palo Alto Research Center (PARC) in Silicon Valley it developed a computer – the Alto – that did not need the keyboard for every operation; rather, it had what would eventually be called a graphical user interface (GUI), which allowed us for the first time to interact with a computer spatially rather than with words and numbers. Instead of typing commands, the user would move a pointing device that would cause a cursor to move and control objects on the desktop (another term that would be coined later). While it was a brilliant idea, it was ahead of its time: Xerox determined that the cost of bringing it to market would run well into five figures – far too expensive to make it a successful product. They mothballed it, but would occasionally demonstrate it to visitors to PARC. Among those fateful visitors were Steve Jobs and Bill Gates, both of whom recognized possibilities that Xerox might not have seen.

If it is the widespread proliferation of the GUI that changed the world, then credit has to go to Apple, who were first to market – first with the Lisa and then with the Macintosh. They beat both Microsoft Windows and OS/2 (a joint development effort of IBM and Microsoft) by several years. When Microsoft finally released Windows 3.0 (the first version that was commercially viable as a standalone product) in 1990, most people thought they were simply copying the hugely successful Macintosh, which was still only available to people willing to buy Apple’s computers. In fact Windows was still behind the trend, because by this time the most popular PC software was well and truly entrenched in businesses, and while those applications would run under Windows with a bit of effort, they were still text-mode applications that could only run using PIF files that would ‘wedge’ them into Windows.

During the course of the next few years, however, something phenomenal happened: software companies started developing software to run on Microsoft Windows. At first it was the same desktop publishers that had been leveraging run-time editions of the GUI environments for several years, such as Aldus’ PageMaker and Ventura Publisher. It is easy to forget today that in those crossover days – when the GUI was not actually part of the operating system but rather loaded on top of it – there was competition for the PC’s desktop. Digital Research released its own environment – GEM – at around the same time as Microsoft released Windows, and it was not immediately clear which platform would prevail. However, Microsoft made it easier for third-party developers to create software that ran on Windows, and the battle did not last very long: it was Microsoft that came out on top. Soon the companies whose software led the industry – Lotus, Ashton-Tate (makers of dBASE), and WordPerfect, to name a few large players – began developing GUI versions of their applications. The first iterations may not have been as easy to work with as their text-based versions, but by running them on Windows the end user could begin to do something they had never been able to do before: run several applications simultaneously. They could write a document in WordPerfect, reference their spreadsheet in Lotus 1-2-3, and then create a mail merge with the dBASE database… without exiting any of them… just as some people did on Macintosh computers, which were very user-friendly but had a much more limited selection of available applications.

Microsoft had, of course, been creating software applications for several years, both on the PC and on the Mac. They had a word processor that was never as popular as WordPerfect, and what became Excel never took much market share from Lotus 1-2-3. What they had going for them, however, was a vision of interoperability between the most common applications used in business. Other companies came to market with application bundles before Microsoft Office was even a vision, but the ability not only to work with the applications simultaneously, but also to copy information from one and seamlessly paste it into another – while maintaining the proper format (whether source or destination) – really was a game changer. That, more than anything, likely sealed the fate of fulfilling the prediction of a PC on every desktop.

To be continued… stay tuned!
