Showing posts with label IBM PC. Show all posts

Thursday, September 19, 2019

The PC Reverse Cambrian Explosion

The Cambrian Explosion is a term from paleontology. It describes a massive increase in the diversity of life that occurred half a billion years ago. Life on earth went from a measly few thousand species to hundreds of millions of species in the blink of a geologic eye.

Personal Computers have what I call a "PC Reverse Cambrian Explosion" or PC-RCE. It occurred in the mid-1980s, which some might consider to be half a billion years ago. In the PC-RCE, computers went from hundreds of different designs to one: the IBM PC compatible.

In the late 1970s and very early 1980s, there were lots of designs for small computers. These included the Apple II, the Radio Shack TRS-80, the Commodore PET and CBM machines, and others. There was a great diversity of hardware and software, including processors and operating systems. Some computers had floppy disks, although most did not. Many computers used cassette tape for storage, and some had neither cassette nor floppy disk. Some computers had built-in displays, and others required that you get your own terminal.

By the mid 1980s, that diversity was gone. The IBM PC was the winning design, and the market wanted that design and only that design. (Except for a few stubborn holdouts.)

One might think that the IBM PC caused the PC-RCE, but I think it was something else.

While the IBM PC was popular, other manufacturers could not simply start making compatible machines (or "clones" as they were later called). The hardware for the IBM PC was "open" in that the connectors and bus specification were documented, and this allowed manufacturers to make accessories for IBM PCs. But the software (the operating system and, importantly, the ROM BIOS) was not open. While the interfaces to both were documented, the code itself could not be copied without running afoul of copyright law.

Other computer manufacturers could not make IBM PC clones. Their choices were limited to 1) sell non-compatible PCs in a market that did not want them, or 2) go into another business.

Yet we now have many vendors of PCs. What happened?

The first part of the PC-RCE was the weakening of the non-IBM manufacturers. Most went out of business. (Apple survived by offering compelling alternative designs and focusing on the education market.)

The second part was Microsoft's ability to sell MS-DOS to other manufacturers. It made custom versions for the non-compatible hardware from Tandy, Victor, Zenith, and others. While "compatible with MS-DOS" wasn't the same as "compatible with the IBM PC", it allowed other manufacturers to stay in the market.

A near-empty market allowed upstart Compaq to introduce its Compaq Portable, the first system not made by IBM and yet compatible with the IBM PC. It showed that there was a way to build IBM PC "compatibles" legally and profitably. Compaq succeeded because it offered a product not available from IBM (a portable computer) that was also compatible (it ran popular software) and used premium components and designs to justify a hefty price tag. (Several thousand dollars at the time.)

The final piece was the Phoenix BIOS. This was the technology that allowed other manufacturers to build compatible PCs at low prices. Compaq had built their own BIOS, making it compatible with the API specified in IBM's documents, but it was an expensive investment. The Phoenix BIOS was available to all manufacturers, which let Phoenix amortize the cost over a larger number of PCs, for a lower per-unit cost.

The market maintained demand for the IBM PC design, but it wasn't fussy about the manufacturer. Customers bought "IBM compatible PCs" with delight. (Especially if the price was lower than IBM's.)

Those events (weakened suppliers, an operating system, a legal path forward, and the technology to execute it) made the PC the one and only design, and killed off the remaining designs. (Again, except for Apple. And Apple came close to extinction on several occasions.)

Now, this is all nice history, but what does it have to do with us folks living today?

The PC-RCE gave us a single design for PCs. That design has evolved over the decades, and just about every piece of the original IBM PC has mutated into something else, but the marketed PCs have remained uniform. At first, IBM specified the design, with the IBM PC, the IBM PC XT, and the IBM PC AT. Later, Microsoft specified the design with its "platform specification" for Windows. Microsoft could do this, due to its dominance of the market for operating systems and office software.

Today, the PC design is governed by various committees and standards organizations. They specify the design for things like the BIOS (or its replacement the UEFI), the power supply, and connectors for accessories. Individual companies have sway; Intel designs processors and support circuitry used in all PCs. Together, these organizations provide a single design which allows for modest variation among manufacturers.

That uniformity is starting to fracture.

Apple's computers joined the PC design in the mid-2000s. The "white MacBook" with an Intel processor was a PC design -- so much so that Windows and Linux can run on it. Yet today, Apple is moving their Macs and MacBooks in a direction different from the mainstream market. Apple-designed chips control certain parts of their computers, and these chips are not provided to other manufacturers. (Apple's iPhones and iPads are unique designs, with no connection to the PC design.)

Google is designing its Chromebooks and slowly moving them away from the "standard" PC design.

Microsoft is building Surface tablets and laptops with its proprietary designs, close to PCs yet not quite identical.

We are approaching a time when we won't think of PCs as completely interchangeable. Instead, we will think of them in terms of manufacturers: Apple PCs, Microsoft PCs, Google PCs, etc. There will still be a mainstream design; Dell and Lenovo and HP want to sell PCs.

The "design your own PC" game is for serious players. It requires a significant investment not only in hardware design but also in software. Apple has been playing that game all along. Microsoft and Google are big enough that they can join. Other companies may get involved, using Linux (or a BSD variant, as Apple did) as a base for their operating systems.

The market for PCs is fragmenting. In the future, I see a modest number of designs, not the thousands that we had in 1980. The designs will be similar but not identical, and more importantly, not compatible -- at least for hardware.

A future with multiple hardware platforms will be a very different place. We have enjoyed a single (evolving) platform for the past four decades. A world with multiple, incompatible platforms will be a new experience for many. It will affect not only hardware designers, but everyone involved with PCs, from programmers to network administrators to purchasing agents. Software may follow the fragmentation, and we could see applications that run on one platform and not others.

A fragmented market will hold challenges. Once committed to one platform, it is hard to move to a different platform. (Just as it is difficult today to move from one operating system to another.) Instead of just the operating system, one will have to change the hardware, operating system, and possibly applications.

It may also be a challenge for Linux and open source software. They have used the common platform as a means of expansion. Will we see specific versions of Linux for specific platforms? Will Linux avoid some platforms as "too difficult" to implement? (The Apple MacBooks, with their extra chips for security, may be a challenge for Linux.)

The fragmentation I describe is a possible future -- it's not here today. I wouldn't panic, but I wouldn't ignore it, either. Keep buying PCs, but keep your eyes on them.

Tuesday, February 23, 2016

Why the IBM PC became a standard

The history of the PC is simple: After several companies, including Apple, Radio Shack, and Commodore, built and successfully sold microcomputers, IBM entered the market and its products were adopted as the standard. Yet I think there is more to it than that.

The IBM PC was a popular computer. Some historians claim that its open architecture and its superior design led to its success. The architecture was open, more open than some (but not all) competing products. The design was good, but not superior -- and we are still paying for some of the compromises IBM made in those early days.

It was more than openness and design that made the IBM PC the standard. There were three other factors.

The first was a set of compelling applications. The "killer application" was the spreadsheet. Considered mundane today, the spreadsheet was a large step up from the manual methods of calculation that preceded the PC. It was far better than doing math by hand or even with the assistance of a mechanical or electronic calculator. Spreadsheets provided more automation, allowing for sophisticated formulas.

The other compelling application (although perhaps not a "killer application") was word processing. Personal computers could be purchased for slightly more than electric typewriters, and word processors gave us the ability to compose, store, retrieve, and revise documents.

The second factor was a sense of urgency. Managers may not have understood personal computers or foreseen how they would change businesses, but they knew (or believed) that computers were the way of the future and they did not want to be left behind. Perhaps the fear was a remnant of the 1960s space race (with its jet motors, robots, and computers) or perhaps it was caused by the hype of IBM's advertising. Whatever the reason, there was a fear of becoming obsolete, of missing the computing boat. That fear pushed companies to adopt computers.

These two factors were enough to drive the PC revolution, but they were not enough to establish the IBM PC as a standard. There were many competing systems, several of which ran MS-DOS and Lotus 1-2-3, just like the IBM PC. There was a third factor, one that moved businesses into the IBM fold. That factor was the cost of PCs (IBM or otherwise).

Personal computers in the early 1980s were expensive devices. The price for a typical business system was in the $3000 to $5000 range. (And that was in 1980 dollars!)

The expense of a microcomputer meant that businesses had to think carefully about their investment. A wrong choice would mean a significant loss of capital. Businesses wanted to avoid that loss, so they went with the safe choice: IBM.

Those three factors (compelling applications, a sense of urgency, and risk avoidance) led to the IBM PC as a standard.

* * * * *

Looking at today's market, there is no analogous situation. Personal computers are inexpensive, standardized boxes. But what about new technologies?

Phones and tablets are inexpensive, with no compelling application for businesses. There may be a sense of urgency to build mobile apps for customers, but note that customer apps are outward-facing, not inward-facing like the spreadsheets and word processors of the PC revolution. Equipment for inward-facing applications can be standardized, as the equipment is under your control. Outward-facing apps cannot be easily standardized, as customers choose their hardware. That's a big difference between the PC revolution and the mobile revolution. There is not sufficient force to select one design as the standard -- and so we have competing designs: Apple, Android, and Microsoft.

Cloud systems are inward-facing, but not compelling. There is no sense of urgency to move one's systems to the cloud. Cloud systems are not expensive, either -- at least not in the way that PCs were expensive. If anything, cloud providers focus on the ability to "pay for what you use". It's no surprise that there is no standard cloud system.

The same holds true for Big Data: inward-facing, but there is no compelling application, no sense of urgency. Sure enough, there are multiple "standards" for Big Data.

* * * * *

We don't have a standard for a smart phone, or tablet, or cloud system, or big data. Without the three factors of compelling applications, urgency, and risk avoidance, I think we will see multiple solutions for some time.

Tuesday, August 26, 2014

With no clear IT leader, expect lots of changes

The introduction of the IBM PC was market-wrenching. Overnight, the small, rough-and-tumble market of microcomputers with diverse designs from various small vendors became large and centered around the PC standard.

From 1981 to 1987, IBM was the technology leader. IBM led in sales and also defined the computing platform.

IBM's leadership fell to Compaq in 1987, when IBM introduced the PS/2 line with its new (incompatible) hardware. Compaq delivered old-style PCs with a faster bus (the EISA bus) and notably the Intel 80386 processor. (IBM stayed with the older 80286 and 8086 processors, eventually consenting to provide 80386-based PS/2 units.) Compaq even worked with Microsoft to deliver newer versions of MS-DOS that recognized larger memory capacities and optical disc readers.

But Compaq did not remain the leader. Its leadership declined gradually, passing to the clone makers -- especially Dell, HP, and Gateway.

The mantle of leadership moved from a PC manufacturer to the Microsoft-Intel duopoly. The popularity of Windows, along with marketing skill and software development prowess led to a stable configuration for Microsoft and Intel. Together, they out-competed IBM's OS/2, Motorola's 68000 processor, DEC's Alpha processor, and Apple's Macintosh line.

That configuration held for roughly two decades, from 1990 until Apple introduced the iPhone in 2007. The genius move was not the iPhone hardware, but the App Store and iTunes, which let users easily find, install, and pay for apps on their phones.

Now Microsoft and Apple have the same problem: after years of competing in a well-defined market (the corporate PC market) they struggle to move into the world of mobile computing. Microsoft's attempts at mobile devices (Zune, Kin, Surface RT) have flopped. Intel is desperately attempting to design and build processors that are suitable for low-power devices.

I don't expect either Microsoft or Intel to disappear. (At least not for several years, possibly decades.) The PC market is strong, and Intel can sell a lot of its traditional processors (heat radiators that happen to compute data). Microsoft is a competent player in the cloud arena with its Azure services.

But I will make an observation: for the first time in the PC era, we find that there is no clear leader for technology. The last time we were leaderless was prior to the IBM PC, in the "microcomputer era" of Radio Shack TRS-80 and Apple II computers. Back then, the market was fractured and tribal. Hardware ruled, and your choice of hardware defined your tribe. Apple owners were in the Apple tribe, using Apple-specific software and exchanging data on Apple-specific floppy disks. Radio Shack owners were in the Radio Shack tribe, using software specific to the TRS-80 computers and exchanging data on TRS-80 diskettes. Exchanging data between tribes was one of the advanced arts, and changing tribes was extremely difficult.

There were some efforts to unify computing: CP/M was the most significant. Built by Digital Research (a software company with no interest in hardware), CP/M ran on many different configurations. Yet even that effort could not span the differences in processors, memory layout, and video configurations.

Today we see tribes forming around multiple architectures. For cloud computing, we have Amazon.com's AWS, Microsoft's Azure, Google's App Engine. With virtualization we see VMware, Oracle's VirtualBox, the aforementioned cloud providers, and newcomer Docker as a rough analog of CP/M. Mobile computing sees Apple's iOS, Google's Android, and Microsoft's Windows RT as a (very) distant third.

With no clear leader and no clear standard, I expect each vendor to enhance their offerings and also attempt to lock in customers with proprietary features. In the mobile space, Apple's Swift and Microsoft's C# are both proprietary languages. Google's choice of Java puts them (possibly) at odds with Oracle -- although Oracle seems to be focused on databases, servers, and cloud offerings, so there is no direct conflict. Things are a bit more collegial in the cloud space, with vendors supporting OpenStack and Docker. But I still expect proprietary enhancements, perhaps in the form of add-ons.

All of this means that the technology world is headed for change. Not just change from desktop PC to mobile/cloud, but changes in mobile/cloud. The competition from vendors will lead to enhancements and changes, possibly significant changes, in cloud computing and mobile platforms. The mobile/cloud platform will be a moving target, with revisions as each vendor attempts to out-do the others.

Those changes mean risk. As platforms change, applications and systems may break or fail in unexpected ways. New features may offer better ways of addressing problems and the temptation to use those new features will be great. Yet re-designing a system to take advantage of new infrastructure features may mean that other work -- such as new business features -- waits for resources.

One cannot ignore mobile/cloud computing. (Well, I suppose one can, but that is probably foolish.) But one cannot, with today's market, depend on a stable platform with slow, predictable changes like we had with Microsoft Windows.

With such an environment, what should one do?

My recommendations:

Build systems of small components: This is the Unix mindset, with small tools to perform specific tasks. Avoid large, monolithic systems.

Use standard interfaces: Use web services (either SOAP or REST) to connect components into larger systems. Use JSON and Unicode to exchange data, not proprietary formats.
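As a minimal sketch of that recommendation -- assuming nothing beyond the Python standard library, and using a hypothetical "total" service invented for illustration -- two small components can exchange UTF-8-encoded JSON over a REST-style HTTP interface like this:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A small component exposing one task -- summing a list of numbers --
# behind a REST-style interface. Requests and responses are JSON,
# encoded as UTF-8, so any client on any platform can use it.
class TotalHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        numbers = json.loads(self.rfile.read(length).decode("utf-8"))
        body = json.dumps({"total": sum(numbers)}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve a single request in the background.
server = HTTPServer(("127.0.0.1", 0), TotalHandler)
port = server.server_address[1]
threading.Thread(target=server.handle_request, daemon=True).start()

# The client side: any language with HTTP and JSON support would do.
payload = json.dumps([3, 4, 5]).encode("utf-8")
with urlopen(f"http://127.0.0.1:{port}/total", data=payload) as response:
    result = json.load(response)
server.server_close()
print(result["total"])  # 12
```

The point is not the toy arithmetic but the boundary: because the interface is plain HTTP and JSON, either side can be rewritten, replatformed, or moved to a different vendor without touching the other.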

Hedge your bets: Gain experience in at least two cloud platforms and two mobile platforms. Resist the temptation of "corporate standards". Standards are good with a predictable technology base. The current base is not predictable, and placing your eggs in one vendor's basket is risky.

Change your position: After a period of use, examine your systems, your tools, and your talent. Change vendors -- not for everything, but for small components. (You did build your system from small, connected components, right?) Migrate some components to another vendor; learn the process and the difficulties. You'll want to know them when you are forced to move to a different vendor.

Many folks involved in IT have been living in the "golden age" of a stable PC platform. They may have weathered the change from desktop to web -- which saw a brief period of uncertainty. More than likely, they think that the stable world is the norm. All that is fine -- except we're not in the normal world with mobile/cloud. Be prepared for change.

Wednesday, September 11, 2013

Specialization can be good or bad

Technologies have targets. Some technologies, over time, narrow their targets. Two examples are Windows and .NET.

Windows was, at first, designed to run on multiple hardware platforms. The objective was an "operating environment" that would give Microsoft an opportunity to sell software for multiple hardware platforms. There were versions of Windows for the Zenith Z-100 and the DEC Rainbow; these computers had Intel processors and ran MS-DOS but used architectures different from the IBM PC. Later versions of Windows ran on PowerPC, DEC Alpha, and MIPS processors. Those variants have all ceased; Microsoft supports only Intel PC architecture for "real" Windows and the new Windows RT variant for ARM processors, and both of these run on well-defined hardware.

The .NET platform has also narrowed. Instead of machine architectures, the narrowing has been with programming languages. When Microsoft released .NET, it supplied compilers for four languages: C++, C#, Visual Basic, and Visual J#. Microsoft also made bold proclamations about the .NET platform supporting multiple languages; the implications were that other vendors would build compilers and that Java was a "one-trick pony".

Yet the broad support for languages has narrowed. It was clear from the start that Microsoft was supporting C# as "first among equals". The documentation for C# was more complete and better organized than the documentation for other languages. Other vendors did provide compilers for other languages (and some still do), but the .NET world is pretty much C# with a small set of VB fans. Microsoft's forays into Python and Ruby (the IronPython and IronRuby engines) have been spun off as separate projects; the only "expansion" language from Microsoft is F#, used for functional programming.

Another word for this narrowing of technology is "specialization". Microsoft focused Windows on the PC platform; the code became specialized. The .NET ecosystem is narrowing to C#; our code is becoming specialized.

Specialization has its advantages. Limiting Windows to the PC architecture reduced Microsoft's costs and enabled them to optimize Windows for the platform. (Later, Microsoft would become strong enough to specify the hardware platform, and they made sure that advances in PC hardware meshed with improvements in Windows.)

Yet specialization is not without risk. When one is optimized for an environment (such as PC hardware or a language), it is hard to move to another environment. Thus, Windows is a great operating system for desktop PCs but a poor fit on tablets. Windows 8 shows that significant changes are needed to move to tablets.

Similarly, specializing in C# may lead to significant hurdles when new programming paradigms emerge. The .NET platform is optimized for C# and its object-oriented roots. Moving to another programming paradigm (such as functional programming) may prove difficult. The IronPython and IronRuby projects may provide some leverage, as may the F# language, but these are quite small compared to C# in the .NET ecosystem.

Interestingly, the "one-trick pony" environment for Java has expanded to include Clojure, Groovy, and Scala, as well as Jython and JRuby. So not all technologies narrow, and Sun's (now Oracle's) Java may avoid the trap of over-specialization.

Picking the target for your technology is a delicate balance. A broad set of targets leads to performance issues and markets with little return. A narrow set of targets reduces costs but foregoes market penetration (and revenue) and may leave you ill-prepared for a paradigm shift. You have to chart your way between the two.

I didn't say it would be easy.

Monday, September 9, 2013

Microsoft is not DEC

Some have drawn comparisons between Microsoft and DEC, the long-ago champion of minicomputers.

The commonalities seem to be:

  • DEC and Microsoft were both large
  • DEC and Microsoft had strong cultures
  • DEC missed the PC market; Microsoft is missing the mobile market
  • DEC and Microsoft changed their CEOs

Yet there are differences:

DEC was a major player; Microsoft set the standard. DEC had a successful business in minicomputers but was not a standard-setter (except perhaps for terminals). There were significant competitors in the minicomputer market, including Data General, HP, and even IBM. Microsoft, on the other hand, has set the standard for desktop computing for the past two decades. It has an established customer base that remains loyal to and locked into the Windows ecosystem.

DEC moved slowly; Microsoft is moving quickly. DEC made cautious steps toward microcomputers, introducing the PRO-325 and PRO-350 computers, which were small versions of PDP-11 processors running a variant of RT-11 -- a proprietary and (more importantly) non-PC-DOS operating system. DEC also offered the Rainbow, which ran MS-DOS but did not offer the "100 percent PC compatibility" required for most software. Neither the PRO nor the Rainbow saw much popularity. Microsoft, in contrast, is offering cloud services with Azure and seeing market acceptance. Microsoft's Surface tablets and Windows Phones (considered quite good by those who use them, and quite bad by those who don't) do parallel DEC's offerings in their lack of popularity, and this will be a problem for Microsoft if it chooses to keep offering hardware.

The IBM PC set a new standard; mobile/cloud has no standard. The IBM PC defined a new standard for microcomputers (the new market). Overnight, businesses settled on the PC as the unit of computing, with PC-DOS as the operating system and Lotus 1-2-3 as the spreadsheet. The mobile/cloud environment has no comparable standard hardware or software. Apple and Android are competing for hardware (Apple has higher revenue while Android has higher unit sales) and Amazon.com is dominant in the cloud services space but not a standards-setter. (The industry is not cloning the AWS interface.)

PCs replaced minicomputers; mobile/cloud complements PCs. Minicomputers were expensive, and PCs (except for the very early microcomputers) were able to perform the same functions as minicomputers. PCs could perform word processing, numerical analysis with spreadsheets (a bonus, actually), data storage and reporting, and development in common languages such as BASIC, FORTRAN, Pascal, C, and even COBOL. Tablets do not replace PCs; data entry, numeric analysis, and software development remain on the PC platform. The mobile/cloud technology expands the set of solutions, offering new possibilities.

Comparing Microsoft to DEC is a nice thought experiment, but the situations are different. Was DEC under stress, and is Microsoft under stress? Undoubtedly. Can Microsoft learn from DEC's demise? Possibly. But Microsoft's situation is not identical to DEC's, and the lessons from the former must be read with care.

Sunday, August 18, 2013

The revulsion of old equipment

As a newbie programmer back in the dawn of the PC era, I joined other PC programmers in a general disdain for the IBM mainframes. We were young and arrogant. We were elitist in our view of hardware: we considered the PC sleek and modern, the mainframe antiquated. Most of all, we looked on mainframe hardware as monstrous. Everything about mainframes was a grotesque, oversized ancestor of PC hardware, from the display terminals (the IBM 3270 was a behemoth) to the processor units (refrigerators) to even the cables that connected devices. Yes, they worked, and yes, people used them, but I could not imagine anyone wanting to use such equipment.

We were also elitist in our view of software, but that is not important for this post.

Much of the IBM mainframe design originated with the System/360, IBM's first family of general-purpose computers. It was introduced in 1964, seventeen years prior to the IBM PC in 1981. In that time, advances in technology shrank most devices, from processors to disk drives. The IBM PC was very different from the IBM System/360.

Yet the span from that first PC to today is almost twice the seventeen-year span from System/360 to PC. Advances in technology have again shrunk most devices.

Today's newbie programmers (young and possibly arrogant) must look on the aged PC design with the same revulsion that I felt for mainframe computers.

PC enthusiasts will point out that the PC has not remained static in the past thirty-plus years. Processors are more powerful, memory is significantly larger, disk drives have more capacity while becoming smaller, and the old serial and parallel connectors have been replaced with USB.

All of these are true, but one must still admit that, compared to tablets and smart phones, PCs are large, hulking monstrosities. And while they work and people use them, does anyone really want to?

The PC revolution happened because of four factors: PCs were cheap, PCs were easier to use than mainframes, the "establishment" of mainframe programmers and operators set up a bureaucracy to throttle requests from users, and PCs got the job done (for lots of little tasks).

Since that revolution, the "establishment" bureaucracy has absorbed PCs into the fold.

The tablet revolution sees tablets and smart phones that are: cheap, easier to use than PCs, outside of the establishment bureaucracy, and capable of getting the job done (for lots of little tasks).

Tablets are here to stay. The younger generation will see to that. Businesses will adopt them, just as they adopted PCs. In time, the establishment bureaucracy may absorb them.

PCs will stick around, too, just as mainframes did. They won't be the center of attention, though.

Thursday, July 11, 2013

Computers are bricks that compute

Lenovo is marketing their "ThinkCentre M72e Tiny Desktop", a desktop PC that runs Windows. Yet the Lenovo M72e is quite different from the IBM model 5150.

The Lenovo is a more capable machine than its predecessor of 30-plus years ago. It has a faster and much more powerful processor, more memory, and larger storage capacity. But those are not the differences that catch my attention.

Unlike the typical desktop PC, the Lenovo M72e is a small unit, just large enough to hold its contents. This small form factor is not new; it was used for the Apple Mac Mini. Small and sleek, it almost disappears. Indeed, one can mount it behind a flat-screen monitor, where it is out of sight.

The original IBM PC was expandable. It had a motherboard with five expansion slots and room for internal devices like disk drives.

The IBM PC (and PC clones) needed expansion options. The basic unit was very basic: operable but not necessarily useful. The typical purchase included a video card, a display monitor, the floppy controller card, floppy disks, and PC-DOS. (Hard disks would be available with the IBM PC XT, released two years later.)

Other expansion options included additional memory, a real-time clock, terminal emulator cards, and network cards. The PC was a base, a platform, on which to build a tailored system. You added the devices for your specific needs.

IBM designed the PC to be expandable. The idea fit the sensibilities of the time. IBM had sold mainframes and minicomputers with expansion capabilities; its business model was to sell low-end equipment and then "grow" customers into more expensive configurations. Selling a basic PC with expansion options appealed to IBM and was recognized as "the way to buy computing equipment" by businesses. (The model was also used by other computer manufacturers.)

Today, we have different sensibilities for computing equipment. Instead of base units with expansion adapters, we purchase a complete system. Smartphones, tablets, and laptops are self-contained. Video is included in the device. Memory (in sufficient quantity) is included. Storage (also in sufficient quantity) is included. Additional devices, when needed, are handled with USB ports rather than adapter cards.

Desktop PCs, for the most part, still contain expansion slots. Expansion slots that most people, I suspect, never use.

Today's view of a PC is not a platform for expansion but a box that performs computations. The smaller form-factors for PCs fit with this sensibility.

That is the difference with the Mac Mini and the Lenovo M72e. They are not platforms to build systems. They are complete, self-contained systems. When we need additional capacity, we replace them; we do not expand them.

Computers today are bricks that compute.

Tuesday, March 26, 2013

Linux is a parasite

Linux is, to put it harshly, a parasite.

I like Linux. I use Linux. I encourage people to use it.

But I have to admit that Linux is successful because of the success of the IBM PC, and successful because the IBM PC was an open system.

When IBM released the PC, it documented the device and made it accessible to others. In contrast to its larger systems, which were documented less thoroughly, the PC documentation described almost every aspect of the hardware.

(This documentation may have been a result of the consent decree that settled the anti-trust litigation brought by the US Justice Department. Regardless of its origin, the documentation was there.)

The openness of the PC made it possible for others to build accessory cards and eventually even clone PCs. (Those clone PCs, made first by Compaq and later by many others, did require some litigation of their own.)

The operating system, PC-DOS, was not made open. But enough of the hardware was defined to allow different people to create operating systems for the PC. In addition to Microsoft's PC-DOS, Digital Research produced a version of CP/M-86 for the PC, SofTech produced the UCSD p-System, and I'm pretty sure that there was a version of Forth for the PC.

Which brings us to Linux.

Linux, like the operating systems before it, runs on the PC architecture.

I realize that the PC of today is quite different from the original PC. (So different that none of the original PC hardware will work with today's PCs!) Yet the architecture is close enough, and the designs are open enough, that a "foreign" operating system (one designed by someone other than the PC designer) can run on the PC.

Linux has succeeded in the world, running on many devices. Lots of these devices are PCs. ("PC" in the general sense, not the specific model made by IBM.) It is easy to install Linux on an older PC, as an experiment or as a way to eke additional use out of a device that cannot run the latest version of Windows.

A few manufacturers have attempted to bundle Linux with hardware, with little success. Just about every copy of Linux on a PC has been installed "after-market": after the PC was sold and used for some period of time. (Most of my Linux PCs had Windows installed by the manufacturer, and I had to install Linux.)

Linux is taking advantage of the openness of PC hardware. In that sense, it is a parasite. The phrase may be a bit harsh, but I think it is accurate. You may not agree.

Regardless of your feelings about the term "parasite", the changes in hardware threaten Linux. Newer hardware is less open than the IBM PC. The Surface tablets from Microsoft do not use the standard BIOS and will not boot Linux. (Well, not without some hacking.) New PCs use the UEFI loader, which guards against malware by checking signatures on boot images. The Apple iPhone and iPad devices use similar technology to boot only iOS.
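For the curious, modern Linux systems expose the Secure Boot state through efivarfs, so you can see the gatekeeping directly. Here is a minimal sketch of mine (the variable path and byte layout follow the UEFI specification and the kernel's efivarfs convention; the function names are my own, not part of any standard tool):

```python
from pathlib import Path
from typing import Optional

# efivarfs files begin with a 4-byte attribute field, then the variable data.
SECUREBOOT_VAR = Path(
    "/sys/firmware/efi/efivars/"
    "SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c"
)

def secure_boot_enabled(raw: bytes) -> bool:
    """Interpret the raw bytes of the SecureBoot efivar file."""
    if len(raw) < 5:
        raise ValueError("unexpected efivar length")
    return raw[4] == 1  # the single data byte: 1 = enforcing, 0 = off

def check() -> Optional[bool]:
    """True/False on UEFI machines; None on pre-UEFI (BIOS) hardware."""
    try:
        return secure_boot_enabled(SECUREBOOT_VAR.read_bytes())
    except FileNotFoundError:
        return None
```

On a BIOS-era machine the efivars directory simply does not exist, which is itself a reminder of how much the firmware has changed.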

Hardware is becoming closed. These new devices make it difficult, not easy, to load a "foreign" operating system.

I think Linux will continue to exist. I think that some number of open, old-style PCs will continue to be made (for several years). Linux will continue to exist in the form of Android.

But the new world of closed hardware is definitely a challenge for Linux. It may have to become something more than an "after-market" option for PCs.

Tuesday, February 5, 2013

PC as status item

Today, the common PC is just that -- common. A typical office will have many PCs, all alike. (Some offices will have variation. From my observations, the consistency of the PC set is proportional to the size of the organization. The largest organizations have the most homogeneous sets of PCs.)

With the introduction of BYOD, we may see a resurgence in "the PC as status symbol".

What's that, you say? You've never seen a PC used as a status symbol?

Well, it has happened. Just after the release of the IBM PC, and when companies were experimenting with word processors and spreadsheets, PCs were *quite* the status symbol. A PC was a better typewriter, an expensive investment, and a thing to be admired. Some PCs had monochrome monitors and some had color monitors.

When the IBM PC AT was released, some people had the newer, faster PC and some had the older, slower PCs.

Even as late as 1995, PCs varied within organizations and were used as status symbols. After 1995, that changed. The price of PCs dropped to a level that made frequent replacements feasible. Large organizations (the ones with tech support groups) found it easier and cheaper to "refresh" PCs every few years. It meant they had the latest operating system, fewer hardware failures, and a consistent set of PCs with easier support.

When the PCs became consistent, the status conferred by different PCs disappeared.

BYOD may introduce the status again. When individuals select and purchase their own equipment, the consistency of equipment will vanish. Some will purchase iPads, some Android devices, and some Microsoft Surface tablets. Some people will "refresh" their devices frequently, and some will hold on to them for years. (We may see a "reverse status" with reverence for an old-school device.)

The return of status may cause some friction and competition among team members. A good manager will look for the signs and focus the team on the objectives.

Monday, January 21, 2013

What is a PC?

It's a simple question -- "what is a PC?" -- yet the answer is complicated.

If we use Mr. Peabody's Wayback machine to travel to September 1981, the answer is simple. A "PC" (that is, a personal computer) is an IBM model 5150 with its gray cover, detached keyboard (with 83 keys), and either an IBM Color Display (5153) or an IBM Monochrome Display (5151). It has an Intel 8088 processor, probably one or two floppy disk units, and a video adapter card.

At that time, that was a PC. Any other equipment was not. The PC name was strongly associated with IBM.

Over time, the concept of "PC" expanded. IBM introduced the IBM PC XT (model 5160), which meant that there were *two* models of IBM PC.

IBM introduced adapters for memory and ports. Other vendors did also. Compaq introduced their portable PC, fighting (and eventually winning) the battle for a compatible BIOS. Hercules made a video adapter that displayed graphics on monochrome displays (the IBM monochrome display adapter displayed only text).

In 1984, IBM introduced the IBM PC AT, which used the Intel 80286 processor. Now there were three types of PCs from IBM, some with different processors, and bunches from other vendors. Some had more memory, some had different adapters. IBM introduced the Enhanced Graphics Adapter (EGA) with the IBM PC AT.

Through all of these changes, the two constants for PCs were this: they ran PC-DOS (or MS-DOS), and they ran Lotus 1-2-3. The operating system and that one application defined "PC". If the device ran PC-DOS and Lotus 1-2-3, it was a PC. If it did not, it was not. (And even this definition was not quite true, since several computers ran MS-DOS and special versions of Lotus 1-2-3, but were never considered to be "PC"s. The Zenith Z-100, for example.)

Moving forward to the early 1990s, our definition of PCs changed. It was no longer sufficient to run PC-DOS and Lotus 1-2-3. Instead, the criteria changed to Windows and Microsoft Office. Those were the defining characteristics of a PC. (Even in the late 1990s, when Compaq and Microsoft built the "Pocket PC", the device was considered a PC.)

Today, when we use the term "PC", we think of a set of devices. These include desktop computers, laptop computers, virtual computers running on servers, and now, with the Microsoft Surface, tablets. The operating system has expanded to include Linux (but not Mac OSX), and there is no definitive application. We use the phrases "Windows PC" and "Linux PC". Windows PCs must run Microsoft Windows and Microsoft Office, but a Linux PC needs only a version of Linux.

We have the puzzle of an Apple MacBook running Linux -- do we call this a PC? I am tending to think not. Apple's advertising and branding has been strong.

The one common characteristic is that all of these devices require the user to be an administrator. The user must install new software, ensure updates are installed, and diagnose problems. This requirement separates a PC from a tablet. Tablets do not require the user to "install" software -- beyond selecting the software from a menu. Tablets do not require the user to be an administrator. Updates are applied automatically, or perhaps after a prompt. Network adapters do not need to be configured.

Let's take the dividing line between PCs and tablets as administration. Some might call it "ease of use".

Yet even this definition is less than clear. Apple's OSX is better at installing applications: just drag the install package to the "Applications" folder. Linux has made improvements too, with Ubuntu's "Software Center" that lets one pick an application and install it. Microsoft's Windows RT is quite close to Apple's iOS for iPhones and iPads, which are clearly not PCs.

Despite the lack of a bright line in devices and implementations, I believe that we will look back and consider PCs to require administration, and non-PCs (tablets, smartphones, etc.) to allow use without the administrator role.

So that's my answer: If you need an administrator, it's a PC. If you don't, then it isn't.

Maybe the answer isn't so complicated.


Wednesday, January 9, 2013

Tablets will not replace PCs -- at least not as you think they might

Tablets have seen quite a bit of popularity, and some people now ask: When will tablets replace PCs?

To which I answer: They won't, at least not in the way most folks think of replacing PCs.

The issue of replacement is subtle.

To examine the subtleties, let's review a previous revolution that saw one form of technology replaced by another.

Did PCs replace mainframes? In one sense, they have, yet in another they have not. PCs are the most common form of computing, used in businesses, schools, and homes. Yet mainframes remain, processing millions of transactions each day.

When microcomputers were first introduced, one could argue that PCs had replaced mainframes as the dominant form of computing. Even before the IBM PC, in the days of the Apple II and the Radio Shack TRS-80, there were more microcomputers than mainframes. (I don't have numbers, but I am confident that several tens of thousands of microcomputers were sold. I'm also confident that there were fewer than several tens of thousands of mainframes at the time.)

But sheer numbers are not an accurate measure. For one thing, a mainframe serves multiple users and a PC serves one. So maybe PCs replaced mainframes when the number of PC users exceeded the number of mainframe users.

The measure of users is not particularly clear, either. Many users of microcomputers were hobbyists, not serious day-in-day-out users. And certainly mainframes processed more business transactions than the pre-internet PCs.

Maybe we should measure not the number of devices or the number of users, but instead the conversations about the devices. When did people talk about PCs more than mainframes? We cannot count tweets or blog posts (neither had been invented) but we can look at magazines and publications. Prior to the IBM PC, most technology magazines discussed mainframes (or minicomputers) and ignored the microcomputer systems. There were some microcomputer-specific magazines, the most popular one being BYTE.

The introduction of the IBM PC generated a number of magazines. At this point the magazine content shifted towards PCs.

There are other measures: number of want ads, number of conferences and expos, number of attendees at conferences and expos. We could argue for quite some time about the appropriate metric.

But I think that the notion of one generation of computing technology replacing another is a slippery one, and perhaps a false one. I'm not sure that one technology really replaces an earlier one.

Consider: We have advanced from the mainframe era to the PC era, and then to the networked PC era, and then to the internet era, and now we are venturing into the mobile/cloud era. Each successive "wave" of technology has not replaced the previous "wave", but instead expanded the IT sphere. PCs did not replace mainframes; they replaced typewriters and dedicated word-processing systems and expanded the computational realm with spreadsheets. The web did not replace PCs; it expanded the computational realm with on-line transactions -- many of which are handled with back-end systems running on... mainframes!

A new technology expands our horizons.

Tablets will expand our computing realm. They will enable us to do new things. Some tasks of today that are handled by PCs will be handled by tablets, but not all of them.

So I don't see tablets replacing PCs in the coming year. Or the next year. Or the year after.

But if you insist on a metric, some measurement to clearly define the shift from PC to tablet, I suggest that we look at not numbers or users or apps but conversations: the number of people who talk about tablets, compared to the number of people who talk about PCs. I would include blogs, tweets, web articles, advertisements, podcasts, and books as part of the conversation. You may want to include a few more.

Whatever you pick, look at them, take some measurements, and mark when the lines cross.

And then blog or tweet about it.

Wednesday, January 2, 2013

Windows is the de facto standard, and that's a bad thing

A simple phrase can conjure up interesting memories, and such is the case with the phrase "de facto standard".

From its 3.1 release, Windows has been the standard. It was a de facto standard -- it was not adopted by a standards body or codified in law -- but no one called it that; they simply said "Windows".

The last product in tech that had the attribute "de facto standard" was the predecessor to PC-DOS: a small operating system known as CP/M. In the late 1970s and early 1980s, prior to the introduction of the IBM PC and PC-DOS, CP/M was the most popular operating system. It did not have the near-universal acceptance of Windows; operating systems like Radio Shack's TRS-DOS and Apple's DOS were major contenders, and there were a bunch of minor competitors. CP/M had a majority of the market but not an overwhelming majority, and people called CP/M "the de facto standard". *

The "de facto" label was, for CP/M, no guarantee of success. With the introduction of the IBM PC, CP/M was quickly replaced by PC-DOS. PC-DOS became The Way Everyone Uses Computers. No one used the phrase "de facto standard". They simply called it "DOS", and there was no discussion or consideration of alternatives (except for a few eccentric Macintosh users).

Later, PC-DOS was replaced by Windows. With release 3.1, Windows became The Way Everyone Uses Computers. No one used the term "de facto standard" for Windows, either. (The same group of eccentric Macintosh users were present, and folks mostly ignored them.)

This past year has seen alternate operating systems rise to challenge the dominance of Windows. Mac OSX has made inroads for desktops and laptops. Linux has taken some of the server market. For phones, iOS and Android far surpass Windows.


Now, people are calling Windows the de facto standard. I think this is a bad thing for Windows. It is an admission of competition, an acknowledgement of fallibility. The presence of the term means that people consider alternate operating systems a viable threat to Windows. The pervasive group-think has shifted from "Windows as only operating system" to "Windows is our operating system and we want to keep it that way". Windows is no longer The Way Everyone Uses Computers; now it is The Way We Use Computers.

Perhaps I am reading too much into the phrase "de facto standard". Perhaps the memory of IBM and Microsoft taking away our cherished microcomputer independence still stings. Perhaps nothing has changed in the mindset of programmers, consumers, and businesses.

Or perhaps people are ready to move to a new computing platform.


* The popularity of different operating systems in the pre-PC age is difficult to measure, and arguments can be made that specific operating systems were the most popular. Operating systems were sometimes sold separately and sometimes bundled with hardware, and the fans of Commodore C64 computers have a good case for their Microsoft BASIC as the most popular operating system. I have seen the term "de facto" applied only to CP/M and not to any competitors.

Tuesday, December 18, 2012

Windows RT drops some tech, and it hurts

Say what you will about Microsoft's innovation, licenses, or product quality, but one must admit that Microsoft has been quite good at supporting products and providing graceful upgrades. Just about every application that ran on Windows 3.1 will run on later versions of Windows, and most DOS programs will run in the "Command Prompt" application. Microsoft has provided continuity for applications.

The introduction of Windows RT breaks that pattern. With this new version of Windows, Microsoft has deliberately sorted technologies into "keep" and "discard" piles -- and it has done so without even a "deprecated" phase to give people time to adjust.

The technologies in the "discard" pile are not insignificant. The biggest technology may be Silverlight, Microsoft's answer to Adobe's Flash. It is allowed in Windows 8 but not in Windows RT.

Such a loss is not unprecedented in the Microsoft community, but it is infrequent. Previous losses have included things like Microsoft Bob and Visual J#, but these were minor products and never gained much popularity.

The most significant losses may have been FoxPro and the pre-.NET version of Visual Basic. These were popular products and the replacements (Microsoft Access and VB.NET) were significantly different.

The loss of technologies hurts. We become attached to our favorite tech, whether it be Silverlight, Visual Basic, or earlier technologies such as Microsoft's BASIC interpreter (the one with line numbers), the 6502 processor, or DEC's PDP-11 systems.

Microsoft fans (with the exception of the FoxPro and Visual Basic enthusiasts) have not experienced a loss. Until Windows RT. Microsoft's strong support for backwards-compatibility in its operating systems, languages, and applications has sheltered its users.

Those of us from certain graduating classes, those of us who were around before the introduction of the IBM PC, have experienced loss. Just about everyone from those classes lost their favorite tech as the "new kid" of the IBM PC became popular, set standards, and drove out the other designs. The Apple II, the TRS-80, the Commodore systems, (and my favorite, the Heathkit H-89) were all lost to us. We had formed our loyalties and had to cope with the market-driven choices of new technology.

Folks who joined the tech world after the IBM PC have experienced no such loss. One may have started with PC-DOS and followed a chain of improved versions of DOS to Windows 3.1 to Windows NT to Windows XP, and a chain of application upgrades: successive versions of Word, Multiplan to Excel, and Access to SQL Server.

Windows RT marks the beginning of a new era, one in which Microsoft drops the emphasis on backwards-compatibility. The new emphasis will be on profitability, on selling Surface units and (more importantly) apps and content for Windows RT tablets.

To the Windows developers and users: I'm sorry for your loss, but I have gone through such losses and I can tell you that you will survive. It may seem like a betrayal -- and it is. But these betrayals happen in the tech world; companies make decisions on profit, not your happiness.

Thursday, October 18, 2012

No more PCs for me

This week I decided to never buy a PC again. I currently have four, plus a laptop. They will be enough for me to transition to the new world of tablets and virtual servers.

This is almost a bittersweet moment. I was there prior to the PC, when the Apple II and the Radio Shack TRS-80 were dominant (and my favorite, the Heathkit H-89 was ignored). I was there at the beginning of the PC age, when IBM introduced the PC (the model 5150) and everyone absolutely had to have them. PCs were the new thing, a rebel force against the established (IBM) mainframes and batch processing and centralized control. I resented IBM's power in the market.

I saw the rise of the PC clones, first the Compaq and later, everyone. I saw IBM's attempt to re-define the standard with the PS/2 and the market's reaction (buy into Compaq and other PC clones).

I saw the rise of Windows, and the change from command-line programs to GUI programs.

Now, I have seen the (Microsoft Windows) PC become the established computer, complete with centralized control. The new rebel force uses tablets and virtual servers.

I am uncomfortable with Apple's power over app suppliers for iOS devices, and Microsoft's power over app suppliers for Windows RT devices. I am leery of Google's power, although the Android ecosystem is a bit more open than iOS and Windows RT.

Yet I am swept along with the changes. I use an Android tablet in the morning, to check Facebook, Twitter, and news sites. (A Viewsonic gTablet, which received poor reviews for its non-standard interface, yet I am coming to like it.) I use an Android smartphone during the day. I use a different Android tablet in the evening. (Although I am typing this on my Lenovo laptop with a Matias USB keyboard.) While I have not moved everything to the tablets, I have moved a lot and I expect to switch completely within a few months.

My existing PCs have been converted to the server version of Ubuntu Linux, with the one exception of a PC running Windows 7. I suspect that I will never convert that PC to Windows 8, but instead let it die with dignity.

I was there at the beginning, and I am here at the end. (Of the PC age.) Oh, I recognize that desktop and laptop PCs will be with us for a while, just as mainframes stayed with us. But the Cool New Stuff will be on tablets, not on PCs.

Sunday, July 8, 2012

Punctuated evolution

Biology has the theory of "punctuated equilibrium": long periods of stable configurations of species interspersed with short periods of rapid change. I think we can see similar patterns in technology.

Let's start with the IBM PC. It arrived in 1981 and introduced a new age of computing. While preceded by a number of microcomputer systems (the Apple II, the Radio Shack TRS-80, the Commodore PET, and others), the IBM PC gained wide acceptance and set a new standard for computing hardware. The combination of IBM PC and PC-DOS was the norm from its introduction until 1990, when Microsoft Windows replaced PC-DOS and, more importantly, networks arrived.

The PC/Windows/network combination maintained dominance from 1990 until just recently. The PC mutated from a desktop device to a laptop device, and Windows changed from its early incarnation to the Windows-95 and later the Windows Vista "skin". Networks were the thing that really defined this era of computing.

We now have a new transition, from PC/Windows/network to tablet/cloud/wireless. Each transition requires new ideas for processing, storage, and user interfaces, and this transition is no exception. In the PC/DOS era, the user interface was text, the storage was local, and the processing was local. In the PC/Windows/network era, the user interface was graphical, the storage was networked (reliably), and the processing was local.

In the tablet/cloud/wireless era, the user interface is graphical and oriented to touch, the storage is networked (over an unreliable wireless network), and the processing is remote.

The tech for tablet/cloud/wireless is different from the previous age, and requires a different approach to programming and systems design. Processing in the cloud gives us more capacity; communicating over an unreliable network means that our systems must be opportunistic (process when you can) and patient (wait while you cannot).
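That "opportunistic and patient" pattern can be sketched in a few lines. A minimal Python sketch (the function, the queue, and the ConnectionError-raising send callable are my own stand-ins, not any particular framework's API):

```python
import time

def sync_with_backoff(send, queue, initial_delay=1.0, max_delay=60.0):
    """Drain a local work queue, backing off while the network is down.

    'send' is any callable that raises ConnectionError on failure --
    a stand-in for whatever transport a real system would use.
    """
    delay = initial_delay
    while queue:
        item = queue[0]
        try:
            send(item)              # opportunistic: process while we can
            queue.pop(0)
            delay = initial_delay   # success; reset the backoff
        except ConnectionError:
            time.sleep(delay)       # patient: wait while we cannot
            delay = min(delay * 2, max_delay)
```

The exponential backoff keeps the device from hammering a dead radio link, while the local queue means no work is lost when the wireless connection drops mid-session.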

The PC/DOS era stood for almost ten years, and the PC/Windows/network era for twenty. If that trend of long, stable eras continues, look for tablet/cloud/wireless to run from 2010 to about 2030.

Sunday, April 15, 2012

Forced transitions can work

Technology changes over time. Manufacturers introduce new versions of products, and sometimes introduce radically new products. When a manufacturer introduces a radically new product and discontinues the old product, its customers must make a decision: do they move to the new product or do they stay with the old? This is a forced transition, as it is often impractical to stay with the old product. (New copies or licenses are not available, replacement parts are not available, and support is not available.)

Forced transitions can sometimes succeed:

  • IBM transitioned customers from their early 704 and 1401 systems to the System/360 processors, and later the System/370 processors.
  • DEC transitioned customers from the PDP-11 line to the VAX processor line.
  • Microsoft transitioned customers from DOS to Windows, then to Windows NT, and then to .NET.
  • Apple transitioned customers from the Macintosh computers with Motorola processors to PowerPC processors, then to Intel processors.
  • Apple transitioned the Mac operating system from the original version to OSX.


Forced transitions do not always succeed:

  • IBM failed to convince customers to move from the IBM PC to the IBM PS/2.
  • DEC failed to convince customers to move from the VAX to the Alpha processor.

Now, Microsoft is looking to transition its desktop to the new model used by tablets and smartphones. (I call it "tap and swipe", since many of the actions are initiated by taps or swipes of the touchscreen.) Microsoft's vision is present in the Windows 8 "Metro" interface. The computing experience is quite different from classic Windows.

Will they succeed?


Microsoft has a lot going for it. They are big and have a commanding presence in the software market. Switching from Windows-based products to alternatives on other platforms is expensive, involving the acquisition of the software, conversion of data, and training of users. Specialized software may be unavailable on platforms other than Windows.

Microsoft also has a lot against its success at the transition. Users are familiar with the current Windows interface and the current tools. The Metro UI brings a very different experience to the desktop and to computing (well, it moves Windows into the realm of iPhones and Android tablets). There will be a lot of resistance to change.

I think Microsoft will succeed, because users have nowhere else to go. When IBM introduced the PS/2, users had the option of buying IBM PC clones -- and they exercised that option. When DEC introduced the Alpha processor, users had the option of moving to workstations from other vendors -- and they did.

The transition to Windows 8 and Metro forces people to adopt the new interface; staying with Windows 7 is only a temporary reprieve. Changing to Mac OSX will lead to a similar GUI change (I expect future versions of OSX to look more and more like iOS). Changing to Linux creates significant challenges for education and software replacements.

I *do* expect that some shops will move away from Windows. If they have no software that is specific to Windows, if their software is readily available on other platforms, they could move to those other platforms. Some will move to Linux and the LibreOffice suite of tools. Others will move to web-based and cloud-based services like Google Docs and Zoho documents. But I expect these to be a small number of customers. The majority of customers will shift, perhaps unwillingly, to Windows 8.