Showing posts with label PC revolution. Show all posts

Thursday, November 5, 2020

Big is the new big

For PCs, big is the new big. It wasn't always this way.

When the PC revolution started, small was big. The advantage of PCs was smallness, in size, in cost, and in effort. PCs were small enough to fit on a desktop, cheap enough to be purchased as "word processors" (to avoid the data processing bureaucracy), and easy enough that one person could operate them.

The programmers of the PC revolution took pride in the smallness of hardware and of programming languages. BASIC, the original language of the PC revolution, could fit on a computer with 8K of RAM. (That's 8192 bytes.) Other programming languages were also small. Even the names emphasized smallness: Tiny-C, Tiny-Pascal.

We rebels of IT considered "big" an aspect of mainframe systems, and associated "big" with slow, lumbering processes that hindered productivity. The Agile movement was defined as the opposite of "BDUF" (Big Design Up Front). Big was what we did not want our systems to be.

Yet something has changed. We no longer want to be small. Now, we want to be big.

Notice the (now decade-old) desire for "big data". That was a full embrace of bigness. There was also "big analytics", which analyzes large datasets.

Other things have become big. Databases have become big. Spreadsheets have become big. Word processors have become big. Our PCs are not stand-alone systems, but nodes in a larger system and reliant on that system for authentication, updates, and basic processing.

Programming languages have become big, with C++, Java, Python, C#, and Visual Basic in the top slots of the TIOBE index for October 2020. The top position is held by C. While some consider C a small programming language, I have my doubts. C uses a small set of keywords, but has a large standard library. It may not be large, but it isn't small.

We have lost our desire for small systems. Our smart phones are physically small, yet their operating systems are complex and app development is for professionals. Even small hardware systems such as the Raspberry Pi run a full Linux distro, which requires a fair amount of administration.

The only small systems are the Arduino and similar systems, which run a slim monitor operating system and are designed for hardware control, not general-purpose computing.

Perhaps we were deluding ourselves. Perhaps we wanted large systems all along. We craved the power of fast processors, ample memory, and disk space that was not limiting -- that much is certain. Perhaps the cost of faster processors, more memory, large disk space, and interconnected systems is bigness. Perhaps it is not possible (or not cost-effective) to operate small systems loosely coupled.

Perhaps.

But perhaps the desire for small, simple systems remains, and perhaps, one day, it will resurface.


Thursday, January 30, 2020

The cloud revolution is different

The history of computing can be described as a series of revolutions. If we start the age of modern computing with the earliest electronic calculating machines, we have the following upheavals:

  • Standardized computers for sale (or lease)
  • General-purpose mainframes
  • Minicomputers
  • Personal computers
  • Web applications
  • Cloud applications

Each of these events was revolutionary -- each introduced a new form of computing. And all of these events (except one) saw an expansion of computing, an increase in the applications that could be performed by computers.

The first revolution (standardized computers) was in the days of the IBM 1401. Computers were large, expensive, and designed for specific purposes, but they were also consistent. One IBM 1401 was quite similar to another IBM 1401, ignoring differences in memory and tape drives. The similarity in computers made possible the idea of commonly used applications, and common programming languages such as FORTRAN and COBOL.

The second revolution (a general-purpose computer) was introduced by the IBM System/360. The System/360 was designed to run applications for different domains: scientific, commercial, and government. It built on the ideas of common applications and common programming languages.

The minicomputer revolution (minicomputers, or timesharing) expanded computing with interactive applications. Instead of batch jobs that could be run only when scheduled by operators, timesharing allowed for processing when users wanted it. In fact, timesharing expanded computing from operators to users. (Not everyone was a user, but the set of users was much larger than the set of operators.) Minicomputers were used to create the C language and write the Unix operating system.

The PC revolution brought computing to "the rest of us", or at least those who were willing to spend the thousands of dollars for a small computer. Its applications were more interactive than those of timesharing, and more graphical. The "killer" app was the spreadsheet, but word processors, small databases, and project planning software were also popular, and made possible with PCs.

The web revolution introduced communication, and made applications available across a network.

Each of these changes -- revolutions, in my mind -- expanded the universe of computing. The expansions were sometimes competitive, with the "rebels" introducing new applications and the "old guard" attempting to copy the same applications onto the old platform. The expansions were sometimes divisive, with people in the "old" and "new" camps disagreeing on applications, programming languages, and techniques, and even what value the different forms of computing offered. But despite competition and disagreement, each camp had its own ground, and was relatively secure in that area.

There was no fear that minicomputers would replace mainframes. The forms of computing were too different. The efficiencies of the two forms were different. Mainframes excelled at transaction processing. Minicomputers excelled at interaction. Neither crossed into the other's territory.

When PCs arrived, there was no fear that PCs would replace mainframes. PCs would, after some time, replace stand-alone word processing systems and typewriters. But mainframes retained core business applications on big iron. (Minicomputers did die off, being caught between efficient mainframes and interactive PCs.)

When the web arrived, there was no fear that web servers would replace PCs. There was no fear that web applications would replace desktop applications. The web was a new place, with new capabilities. Instead of replacing PCs, the web expanded the capabilities of mainframe systems, providing a user interface into banking and corporate systems. PC applications such as word processing and spreadsheets remained on PCs.

Which brings us to cloud computing.

The cloud revolution is different. The approach with cloud computing, the philosophy, has been to absorb and replace existing applications. We have any number of companies ready to help "move applications to the cloud". There are any number of books, magazines, and online resources that describe tips and tricks for migrating to the cloud. The message is clear: the cloud is the place to be, convert your old applications to the cloud.

This mindset is different from the mindset of previous revolutions. The cloud revolution wants to take over all computing. The cloud revolution is predatory. It is not content with an expansion of computing; it wants to own it all.

I do not know why this revolution is different from previous changes. Why should this change, which is simply another form of computing, push people to behave differently?

At the root, it is people who are behaving differently. Cloud computing is not a sentient being; it has no feelings, no desires, and no motivations. Cloud computing does not want to take over the computing world; it is us, the people in IT, the developers and designers and managers who want cloud computing to take over the world.

I think that this desire may be driven by two factors: economics and control. The economics of cloud computing is better (cheaper) than the economics of PCs, discrete web servers, and even mainframes. But only if the application is designed for the cloud. A classic web application, "lifted and shifted" into the cloud, has the same economics as before.

The other factor is control. I think that people think that they have more control over cloud-based applications than desktop applications or classic web applications. The first is undoubtedly true. Desktop applications, installed on users' PCs, are difficult to manage. Each PC has its own operating system, its own hardware, its own set of other applications, any of which can interfere with the application. PCs can fail, they can run out of disk space, and -- worst of all -- let an old version of the application continue to run. The cloud does away with all of that: control moves from the user to the cloud administrator and support becomes much simpler.

So I can understand the desire for people to move applications to the cloud. But I think that people are missing opportunities. By focusing on moving existing applications into the cloud, we do not see the possible new applications, possible only in the cloud. Those opportunities include things such as big data and machine learning, and can include more.

Imagine the PC revolution, with small computers that fit on desktops, and applications limited to copies of existing mainframe applications. The new PCs would be running order entry systems and inventory systems and general ledger. Or at least we would be trying to get them to run those applications, and we would be ignoring the possibilities of word processing and spreadsheets.

Cloud computing is a form of computing, just as mainframes, PCs, and the web are all forms of computing. Each has its strengths (and weaknesses). Don't throw them away for efficiency, or for simpler support.

Wednesday, January 28, 2015

The mobile/cloud revolution has no center

In some ways, the mobile/cloud market is a re-run of the PC revolution. But not completely.

The PC revolution of the 1980s (which saw the rise of the IBM PC, PC-DOS, and related technologies) introduced new hardware that was cheaper and easier to use than the previous technologies of mainframes and minicomputers. Today's mobile/cloud revolution shares that aspect, with cloud-based services and mobile devices cheaper than their PC-based counterparts. It's much easier to use a phone or tablet than it is to use a PC -- ask the person who installs software.

The early PC systems, while cheaper and easier to use, were much less capable than the mainframe and minicomputer systems. People ran large corporations on mainframes and small businesses on minicomputers; PCs were barely able to print and handle a few spreadsheets. It was only after PC-compatible networks and network-aware software (Windows 3.1, Microsoft Exchange) that one could consider running a business on PCs. Mobile/cloud shares this attribute, too. Phones and tablets are network-aware, of course, but the whole "mobile and cloud" world is too new, too different, too strange to be used for business. (Except for some hard-core folks who insist on doing it.)

Yet the two revolutions are different. The PC revolution had a definite center: the IBM PC at first, and Windows later. The 1980s saw IBM as the industry leader: IBM PCs were the standard unit for business computing. Plain IBM PCs at first, and then IBM PC XT units, and later IBM PC-compatibles. There were lots of companies offering personal computers that were not IBM-compatible; these offerings (and their companies) were mostly ignored. Everyone wanted "in" on the IBM PC bandwagon: software makers, accessory providers, and eventually clone manufacturers. It was IBM or nothing.

The mobile/cloud revolution has no center, no one vendor or technology. Apple devices are popular but no vendors are attempting to sell clones in the style of PC clones. To some extent, this is due to Apple's nature and their proprietary and closed designs for their devices. (IBM allowed anyone to see the specs for the IBM PC and invited other vendors to build accessories.)

Apple is not the only game in town. Google's Android devices compete handily with the Apple iPhone and iPad. Google also offers cloud services, something Apple does not. (Apple's iCloud product is convenient storage but it is not cloud services. You cannot host an application in it.)

Microsoft is competing in the cloud services area with Azure, and doing well. It has less success with its Surface tablets and Windows phones.

Other vendors offer cloud services (Amazon.com, IBM, Oracle, SalesForce) and mobile devices (BlackBerry). Today's market sees lots of technologies. It is a far cry from the 1980s "IBM or nothing" mindset, which may show that consumers of IT products and services have matured.

When there is one clear leader, the "safe" purchasing decision is easy: go with the leader. If your project succeeds, no one cares; if your project fails you can claim that even "big company X" couldn't handle the task.

The lack of a clear market leader makes life complicated for those consumers. With multiple vendors offering capable but different products and services, one must have a good understanding of the projects before selecting a vendor. Success is still success, but failure allows others to question your ability.

Multiple competing technologies also mean competition at a higher level. In the PC revolution, IBM and Compaq competed on technology, but the basic platform (the PC) was a known quantity. In mobile/cloud, we see new technologies such as containers from start-up companies, and new technologies such as cloud management and the Swift programming language from the established vendors.

The world of mobile and cloud has no center, and as such it can move faster than the old PC world. Keep that in mind when building systems and selecting vendors. Be prepared for bumps and turns.

Monday, May 26, 2014

Tablets have it harder than PCs

As a new technology, tablets have a much harder job than PCs had.

When individuals and companies started using personal computers, there was either no established IT infrastructure, or the established infrastructure was separate from the realm in which PCs operated. For an individual, that realm was the home, and the PC was the first computing device. (It may have been something other than an IBM PC; perhaps a Commodore C-64 or a Radio Shack TRS-80. But it was the only computer in the house.)

Companies may have had mainframe computers, or timesharing services, or even minicomputers. Some may have had no computers. For companies with no computers, the PC was the first computer. For companies with larger, "real" computers, the PC occupied a different computing area.

Mainframes and minicomputers were used for financial applications. PCs were used, initially, as replacements for typewriters and word processing systems. Over time we expanded the role of the PC into a computing workstation, but PCs were still isolated from each other, processing data that was not on the mainframe.

PCs and their applications could grow without interference from the high priests of the mainframe. The mainframe programmers and system analysts were busy with "real" business applications and not concerned with fancy electric typewriters. (And as long as PCs were fancy electric typewriters, the mainframe programmers and analysts were right to ignore them.)

PC applications grew, in size and number. Eventually they started doing "real" work. And shortly after we used PCs to do real business work, we wanted to share data with other PCs and other systems -- such as those that ran on mainframes.

That desire led to a large change in technology. We moved away from the mainframe model of central processing with simple terminals. We looked for ways to bridge the "islands of automation" that PCs made. We built networking for PCs, scavenging technologies and creating a few. We connected PCs to other PCs. We connected PCs to minicomputers. We connected PCs to mainframes.

Our connections were not limited to hardware and low-level file transfers. We wanted to connect a PC application to a mainframe application. We wanted to exchange information, despite PCs using ASCII and mainframes using EBCDIC.

After decades of research, experiment, and work, we arrived at our current model of connected computing. Today we use mostly PCs and servers, with mainframes performing key functions. We have the hardware and networks. We have character sets (Unicode, usually) and protocols to exchange data reliably.

It is at this point that tablets arrive on the scene.

Where PCs had a wide open field with little oversight, tablets come to the table with a well-defined infrastructure, technically and bureaucratically. The tablet does not replace a stand-alone device like a typewriter; it replaces (or complements) a connected PC. The tablet does not have new applications of its own; it performs the same functions as PCs.

In some ways, the existing infrastructure makes it easy for tablets to fit in. Our networking is reliable, flexible, and fast. Tablets can "plug in" to the network quickly and easily.

But tablets have a harder job than PCs. The "bureaucracy" ignored PCs when they arrived; it is not ignoring tablets. The established IT support groups define rules for tablets to follow, standards for them to meet. Even the purchasing groups are aware of tablets; one cannot sneak a tablet into an organization below the radar.

Another challenge is the connectedness of applications. Our systems talk to each other, sending and receiving data as they need it. Sometimes this is through plain files, sometimes through e-mail, and sometimes directly. To be useful, tablets must send and receive data to those systems. They cannot be a stand-alone device. (To be fair, a stand-alone PC with no network connection would be a poor fit in today's organizations too.)

But the biggest challenge is probably our mindset. We think of tablets as small, thin, mouseless PCs, and that is a mistake. Tablets are small, they are thin, and they are mouseless. But they are not PCs.

PCs are much better for the composition of data, especially text. Tablets are better for the collection of certain types of data (photographs, location) and the presentation of data. These are two different spheres of automation.

We need new ideas for tablets, new approaches to computation and new expectations of systems. We need to experiment with tablets, to let these new ideas emerge and prove themselves. I fully expect that most new ideas will fail. A few will succeed.

Forcing tablets into the system designed for PCs will slow the experiments. Tablets must "be themselves". The challenge is to change our bureaucracy and let that happen.

Monday, March 17, 2014

Mobile changes how we think about computing

The rise of mobile computers, while not a revolution, does introduce a significant change in our thinking about computing. I believe that this change generates angst for many.

The history of computing has seen three major waves of technology. Each of these waves has had a specific mindset, a specific way that we view computing.

Mainframes The first wave of computing was the mainframe era. Computers were large, expensive, magical boxes that were contained in sealed rooms (temples?) and attended by technicians (priests?). The main task of computers was to calculate numbers for the company (or government), and most jobs were either accounting or specific mathematical calculations (think "ballistics tables").

Minicomputers The second wave of computing was the minicomputer era. Computers were the size of refrigerators or washing machines and could be purchased by departments within a company (or a university). They did not need a sealed room with special air conditioning, although they were usually stored in locked rooms to prevent someone from wheeling them away. The main tasks were still corporate accounting, inventory management, order processing, and specific mathematical calculations.

Personal computers The third wave of computing saw a major shift in our mindset of computing. Personal computers could be purchased (and run) by individuals. They could be used at home or in the office (if you carried it in yourself). The mindset for personal computing was very different from the corporate-centered computing of the previous eras. Personal computing could be used for ... anything. The composition and printing of documents was handled by word processors. Spreadsheets let us calculate our own budgets. Small databases (and later larger databases) let us store our own transaction data. If off-the-shelf software was not suitable to the task, we could write our own programs.

The mindset of personal computing has been with us for over thirty years. The size and shape of personal computers has been roughly the same: the same CPU box, the same keyboard, the same monitor. We know what a PC looks like. The software has seen one major change, from DOS to Windows, but Windows has been with us for the past twenty years. We know what programs look like.

The introduction of tablets has caused us to re-think our ideas of computing. And we're not that good at re-thinking. We see tablets and phones and they seem strange to us. The size and shape are different (and therefore "wrong"); the user interface is different (and therefore "wrong"); the way we purchase applications is different (and therefore "wrong"); even the way we call applications ("apps") is different (and therefore... you get the idea).

I observe that mobile devices caused little discomfort while they remained in the consumer market. Phones that could play music and games were not a problem. Tablets that let one scroll through Facebook or read books were not a problem. These were extensions to our existing technology.

Now phones and tablets are moving into the commercial sphere, and their application is not obvious. It is clear that they are not personal computers -- their size and shape prove that. But there are more differences that cause uncertainty.

Touch interface The user interface for phones and tablets is not about keyboards and mice but about taps and swipes.

Small screen Tablets have small-ish screens, and phones have tiny screens. How can anyone work on those?

Different operating systems Personal computers run Windows (except for a few in the marketing groups that use Mac OS). Tablets run something called "Android" or something else called "iOS".

Something other than Microsoft Microsoft's entries in the phone and tablet market are not the market leaders and their operating systems have not been accepted widely.

Even Microsoft isn't Microsoft-ish Microsoft's operating system for phones and tablets isn't really Windows, it is this thing called "Windows 8". The user interface looks completely different. Windows RT doesn't run "classic" Windows programs at all (except for Microsoft Office).

The changes coming from mobile are only one front; changes to the PC are also coming.

The typical PC is shrinking Display screens have become flat. The CPU box is shrinking, losing the space for expansion cards and empty disk bays. Apple's Mac Mini, Intel's Next Unit of Computing, and other devices are changing how we look at computers.

Windows is changing Windows 8 is very different from "good old Windows". (My view is that Windows 8's tiles are simply a bigger, better "Start" menu, but many disagree.)

These changes mean that one cannot stay put with Windows. You either advance into the mobile world or you advance into the new Windows world.

The brave new worlds of mobile and Windows look and feel very different from the old world of computing. Many of our familiar techniques are replaced with something new (and strange).

We thought we knew what computers were and what computing was. Mobile changes those ideas. After thirty years of a (roughly) constant notion of personal computing, many people are not ready for a change.

I suspect that the people who are hardest hit by the changes of mobile are those aged 25 to 45; old enough to know PCs quite well but not old enough to remember the pre-PC days. This group never had to go through a significant change in technology. Their world is changing and few are prepared for the shock.

The under-25 crowd will be fine with tablets and computers. It's what they know and want.

Interestingly, the over-45 folks will probably weather the change. They have already experienced a change in computing, either from mainframes or minicomputers to personal computers, or from nothing to personal computers.

Monday, March 10, 2014

IBM makes... mainframes

IBM, that venerable member of the technology world, built its reputation on mainframe computers. And they are still at it.

In the 1940s and 1950s, computing devices were specific to the task. We didn't have general purpose computers; we had tabulators and sorters and various types of machines. The very early electronic calculators were little more than adding machines -- addition was their only operation. The later machines were computers, albeit specialized, usually for military or commercial needs. (Which made some sense, as only the government and large corporations could afford the machines.)

IBM's System/360 changed the game. It was a general purpose machine, suitable for use by government, military, or commercial organizations. IBM's System/370 was a step up with virtual memory, dual processors, and built-in floating point arithmetic.

But these were still large, expensive machines, and these large, expensive machines defined the term "mainframe". IBM was the "big company that makes big computers".

Reluctantly, IBM entered the minicomputer market to compete with companies like DEC and Data General.

Also reluctantly, IBM entered the PC market to compete with Apple, Radio Shack, and other companies that were making inroads into the corporate world.

But I think, in its heart, IBM remained a mainframe company.

Why do I think that? Because over the years IBM has adjusted its product line. Look at what they have stopped producing:

  • Typewriters
  • Photocopiers
  • Disk drives
  • Tape drives
  • Minicomputers
  • Microcomputers (PCs)
  • Laptop computers
  • Printers for PCs

And look at what they have kept in their product line:

  • Mainframe computers
  • Servers
  • Cloud-based services
  • Watson

The last item, Watson, is particularly telling. Watson is IBM's super-sized information storage and retrieval system. It is quite sophisticated and has appeared (successfully) on the "Jeopardy!" TV game show.

Watson is a product that IBM is marketing to large companies (and probably the government). They do not offer a "junior" version for smaller companies or university departments. They do not offer a "personal" version for individuals. IBM's Watson is today's equivalent of the System/360 computer: large, expensive, and made for wealthy clients.

So IBM has come full circle, from the System/360 to minicomputers to personal computers and back to Watson. Will they ever offer smaller versions of Watson? Perhaps, if other companies enter the market and force IBM to respond.

We PC revolutionaries wanted to change the world. We wanted to bring computing to the masses. And we wanted to destroy IBM (or at least take it down a peg or two). Well, we did change the world. We did bring computing to the masses. We did not destroy IBM, or its mainframes. IBM is still the "big company that makes big computers".

Thursday, February 27, 2014

The Mobile Evolution is different from the PC Revolution

When PCs entered the scene, it was a revolution. The entry of mobile devices (smartphones and tablets) is less revolution and more evolution.

What is the difference? Exclusive applications.

When PCs entered the market, they shared some applications with minicomputers and mainframes (assemblers, compilers, and text editors, mostly) but a large percentage of applications were unique to PCs. Word processors, spreadsheets, and games were the popular applications, and none of them were from mainframes.

Mobile applications, for the most part, are extensions of existing PC and web applications. Twitter, Facebook, even Google Maps all live on the web and in the mobile world.

There are some apps that are (or were) mobile-only. "Angry Birds" was first released for the iPhone and later moved to other platforms. Foursquare is a mobile-only app, given its "check in" function.

But the number of mobile-only apps is small, compared to the total. This is quite different from the PC revolution, which saw thousands of PC-specific applications and a small number of mainframe "crossover" applications.

The PC world was filled with new applications, and new types of applications. That's why it was a revolution. The mobile world -- so far -- is filled with extensions to web applications. That's why the mobile world is an evolution.

Sunday, February 9, 2014

Mobile/cloud chips away at batch processing

Consider a particular type of PC application, one that I call "the spreadsheet app". It processes information. It accepts large quantities of data as input (a spreadsheet, or multiple spreadsheets), processes the data, and then provides the results as large quantities of data (another spreadsheet with multiple pages).

In some ways, this is the design of mainframe batch processing: collect all of the input data up front, process it in one large calculation, and provide the results in one large batch.

The PC revolution was supposed to change all of that. The PC revolution was supposed to slay the batch processing beast and make data processing interactive. Yet here we are, thirty years later, still processing data in large batches.

I think that tablets (or more specifically, mobile/cloud) will succeed where the PC revolution failed.

Moving a spreadsheet app to mobile/cloud is not easy. A direct port makes little sense: tablets have small screens and data entry into spreadsheets requires fine control for the selection of cells.

A "native" tablet app would take advantage of the strengths of mobile/cloud (interactive displays and fast processing on the cloud servers) and avoid the weaknesses (constrained data entry). Instead of data entry into a large grid of numbers, data must be entered in smaller units. This is possible with lots of spreadsheet apps; their data is often structured into collections (and sometimes collections of collections). Tablets can handle data entry in smaller chunks.

The shift from the spreadsheet grid to a collection of units is the same as the shift from batch processing to interactive processing. The app must accept small units of data, incorporating each smaller unit into the larger whole.

It's possible to build the entire large batch of data in this manner, and then process the batch at once, but it's also possible to process each unit of data (a small batch) as it is entered.

Even if each unit of input requires a complete re-calculation of the data, that may be okay. The calculation would be performed on a server (or a set of cloud-based servers) and computing is getting faster and faster. Pushing data to cloud servers for calculations makes sense.
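As a rough sketch of this idea (the class and field names here are hypothetical, and the "server-side" recalculation is simulated locally), an app could accept one small unit of data at a time and run a complete recalculation after each entry, rather than waiting for a full batch:

```python
# Hypothetical sketch: instead of loading one big spreadsheet and
# recalculating everything in one batch, accept records one at a
# time and recompute the summary after each unit arrives.

class IncrementalReport:
    def __init__(self):
        self.records = []          # the "collection of units" gathered so far

    def add_record(self, amount):
        """Accept one small unit of data and recalculate immediately."""
        self.records.append(amount)
        return self.recalculate()  # in mobile/cloud, this call would go to a server

    def recalculate(self):
        """A complete recalculation over all data; cheap enough to run on every update."""
        total = sum(self.records)
        count = len(self.records)
        return {"total": total, "count": count,
                "average": total / count if count else 0}

report = IncrementalReport()
report.add_record(100)             # first unit entered on the tablet
result = report.add_record(50)     # each new entry triggers a fresh result
```

The point of the sketch is the shape of the interaction, not the arithmetic: every small unit of input produces an up-to-date result, which is the shift from batch to interactive processing described above.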

Tablets still have a limited size, and displaying the results may be problematic. One cannot display a raft of numbers on a tablet. (Well, one can display a raft of numbers, by using a very small typeface size. But the results would be undesirable.)

Instead of a text display, apps for tablets will (most likely) use graphical representations for data, with the ability to focus on small sections and see the underlying numbers. We could use this technique today with PC applications, but spreadsheets are limited in their abilities to present information graphically. (I've used Microsoft Excel for years, and the charts and graphic capabilities have remained static for quite a while.)

So that's what I see. Mobile/cloud apps will accept small units of data, process them on fast servers, and present the results in graphical form. The shift from batch processing to interactive processing. The PC revolution's promise -- liberation from the tyranny of batch processing -- will eventually be fulfilled. Ironically, it will happen not on PCs but on tablets and cloud servers.

Wednesday, January 29, 2014

The PC revolution was about infrastructure

Those of us who lived through the PC revolution like to think that PCs were significant advances in technology. They were advances, but in retrospect they were simply infrastructure.

Let's review the advances in PC technology:

Stand-alone PCs: The original PCs were brought in as replacements for typewriters and calculators. This was a tactical use of PCs, one that improved the efficiency of the company but did not change the internal organization or the products and services offered by the company. The PC, with only word processors and spreadsheets, is not strong enough to make a strategic difference for a company.

Databases: After some time, people figured out that PCs could be more than typewriters and calculators. There were custom PC applications, but more importantly there were the early databases (dBase II, dBase III, R:Base) and database languages (Clipper, Paradox) that let people store and retrieve data. These databases were single-user and stand-alone.

Networks: The original PC networks (Novell, Banyan, Corvus) were introduced to share resources such as disks and printers. Printers (especially letter-quality printers) were expensive. Disks (large disks, say 40 MB) were also expensive. Sharing a common resource made economic sense. But the early networks were LANs (Local Area Networks) and confined to a single building.

Servers: Initially part of "client/server systems", servers were database engines that handled requests from multiple clients. Client/server systems gave networks a significant purpose for existing: the ability to update a single database from multiple locations made it possible to migrate mainframe applications onto the cheaper PC platform.

The Internet: Connecting networks made it possible for businesses to exchange information. The first big use of internet connections was e-mail; calendars followed quickly. Strictly speaking, the Internet is not a PC technology -- it was built mostly with minicomputers and Unix. The sockets libraries (WinSock) for PCs made the Internet accessible.

Web servers: Built on these previous layers, the web is (now) a combination of PCs, minicomputers, mainframes, and rack-mounted servers. It is this layer that enables strategic as well as tactical advantages. Companies can provide self-service web pages (more of a tactical change, I think) and new services (strategic). New companies can form (Facebook, Twitter).

Virtualization: The true advantage of virtualization is not consolidation of servers, but the ability to create or destroy machines quickly.

Cloud computing: Once virtual machines were available and cheap, we created the cloud paradigm. Using an array of virtual computers, we can design applications that are distributed across multiple servers and are resistant to failure of any one of those servers. The distribution of work allows for scaling (up or down) as needed, adding or removing servers to handle the current load.

All of these technologies are now infrastructure. They are well-understood and easily available.

New technologies are plugging in to this infrastructure. Smartphones, tablets, and big data are all sitting on top of this (impressive) stack of technology. Smartphone and tablet apps use low-wattage user interfaces and connect to cloud computing systems for processing. Big data systems use a similar design, with cloud computing engines providing the data for visualization software on PCs (or in web browsers).

When we built the first microcomputers, when we installed DOS on PCs, when we used modems to connect to bulletin-board systems, we thought we were creating the crest of technology. We thought we were building the top dog. But it didn't turn out that way. The PC and its later technologies let us build a significant computing stack.

Now that we have that stack, I think we can discard the traditional PC. The next decade should see the replacement of PCs. Not all at once, and at different rates in different environments. I expect PCs to exist in businesses for quite some time.

But disappear they shall.

Monday, September 9, 2013

Microsoft is not DEC

Some have compared Microsoft to DEC, the long-ago champion of minicomputers.

The commonalities seem to be:

  • DEC and Microsoft were both large
  • DEC and Microsoft had strong cultures
  • DEC missed the PC market; Microsoft is missing the mobile market
  • DEC and Microsoft changed their CEOs

Yet there are differences:

DEC was a major player; Microsoft set the standard: DEC had a successful business in minicomputers but was not a standard-setter (except perhaps for terminals). There were significant competitors in the minicomputer market, including Data General, HP, and even IBM. Microsoft, on the other hand, has set the standard for desktop computing for the past two decades. It has an established customer base that remains loyal to and locked into the Windows ecosystem.

DEC moved slowly; Microsoft is moving quickly: DEC made cautious steps toward microcomputers, introducing the PRO-325 and PRO-350, small versions of PDP-11 processors running a variant of RT-11, a proprietary and (more importantly) non-PC-DOS operating system. DEC also offered the Rainbow, which ran MS-DOS but did not offer the "100 percent PC compatibility" required for most software. Neither the PRO nor the Rainbow saw much popularity. Microsoft, in contrast, is offering cloud services with Azure and seeing market acceptance. Microsoft's Surface tablets and Windows Phones (considered quite good by those who use them, and quite bad by those who don't) do parallel DEC's offerings in their popularity, and this will be a problem for Microsoft if it chooses to keep offering hardware.

The IBM PC set a new standard; mobile/cloud has no standard: The IBM PC defined a new standard for microcomputers (the new market). Overnight, businesses settled on the PC as the unit of computing, with PC-DOS as the operating system and Lotus 1-2-3 as the spreadsheet. The mobile/cloud environment has no comparable standard hardware or software. Apple and Android are competing for hardware (Apple has higher revenue while Android has higher unit sales) and Amazon.com is dominant in the cloud services space but not a standards-setter. (The industry is not cloning the AWS interface.)

PCs replaced minicomputers; mobile/cloud complements PCs: Minicomputers were expensive, and PCs (except for the very early microcomputers) were able to perform the same functions as minicomputers. PCs could perform word processing, numerical analysis with spreadsheets (a bonus, actually), data storage and reporting, and development in common languages such as BASIC, FORTRAN, Pascal, C, and even COBOL. Tablets do not replace PCs; data entry, numeric analysis, and software development remain on the PC platform. The mobile/cloud technology expands the set of solutions, offering new possibilities.

Comparing Microsoft to DEC is a nice thought experiment, but the situations are different. Was DEC under stress, and is Microsoft under stress? Undoubtedly. Can Microsoft learn from DEC's demise? Possibly. But Microsoft's situation is not identical to DEC's, and the lessons from the former must be read with care.

Tuesday, July 9, 2013

The Last Picture Show

The PC revolution brought many changes, but the biggest was distribution of computing power. Personal computers smashed the centralization of mainframe processing and allowed individuals to define when (and where, to the modest extent that PCs were portable) computing would be done.

Yet one application (a tool of the PC era, ironically) follows a centralized model. It requires people to come and meet in a single location, and to wait for a single individual to provide data. It is a mainframe model, with central control over the location and time of processing.

That application is PowerPoint.

Or, more generally, it is presentations.

Presentations are the one application that requires people to come together into a single space, at a specific time, and observe a speaker or group of speakers. The attendees have no control over the flow of information, no control over the timing, and no control over the content. (Although they usually have an idea of the content in advance.)

The presentation is a hold-over from an earlier era, when instructors had information, students desired information, and the technology made mass attendance the most efficient form of distribution. Whether it be a college lecture, a sermon at a religious service, or a review of corporate strategy, the model of "one person speaks and everyone listens" has remained unchanged.

Technology is in a position to change that. We're seeing it with online classes and new tools for presentations. Business meetings can be streamed to PCs and tablets, eliminating the need for employees to travel and meet in a large (probably rented) space. Lectures can be recorded and viewed at the leisure of the student. Sermons can be recorded and provided to those who are unable to attend, perhaps due to infirmities or other commitments.

We don't need presentations on large screens (and on large screens only). We need the information, on a large or small screen.

We don't need presentations in real time (and in real time only). We need information at the right time, with the ability to replay sections to clarify questions.

Look for successors to PowerPoint and its colleagues that combine presentation (video and audio) with time-shifting, multiple devices (PCs, web browsers, and tablets), annotations (written and oral), and indexing for private searches.

I think of the new presentation technologies as an enhanced notebook, one with multimedia capabilities.

Sunday, May 19, 2013

The real reason we are angry with Windows 8

Windows 8 has made a splash, and different people have different reactions. Some are happy, some are confused, and some are angry.

The PC revolution was about control, and about independence. PCs, in the early days, were about "sticking it to the man" -- being independent from the big guys. Owning a PC (or a microcomputer) meant that we were the masters of our fate. We controlled the machine. We decided what to do with it. We decided when to use it.

But that absolute control has been eroded over time.

  • With CP/M (and later with MS-DOS and even later with Windows), we agreed to use a common operating system in exchange for powerful applications.
  • With Wordstar (and later with Lotus 1-2-3 and even later with Word and Excel) we agreed to use common applications in exchange for the ability to share documents and spreadsheets.
  • With Windows 3.1, we agreed to use the Microsoft stack in exchange for network drivers and access to servers.
  • With Windows 2000 SP3, we had to accept updates from Microsoft. (The license specified that.)

We have gradually, slowly, given up our control in exchange for conveniences.

Now, we have come to the realization that we are not in control of our computers. Our iPads and Android tablets update themselves, and we lack total control (unless we jailbreak them).

I think what really makes people mad is that realization. The vendor calls the shots.

We thought that we were in control. We thought that we called the shots.

Windows 8, with its new user interface and its new approach to apps, makes it clear that we are not.

And we're angry when we realize it.

Sunday, April 7, 2013

Mobile/cloud apps will be different than PC apps

As a participant in the PC revolution, I was comfortable with the bright future of personal computers. I *knew* -- that is, I strongly believed -- that PCs were superior to mainframes.

It turned out that PCs were *different* from mainframes, but not necessarily superior.

Mainframe programs were, primarily, accounting systems. Oh, there were programs to compute ballistics tables, and programs for engineering and astronomy, and system utilities, but the big use of mainframe computers was accounting (general ledger, inventory, billing, payment processing, payables, receivables, and market forecasts). These uses were shaped by the entities that could afford mainframe computers (large corporations and governments) and the data that was most important to those organizations.

But the data was also shaped by technology. Computers read input on punch cards and stored data on magnetic tape. The batch processing systems were useful for certain types of processing and made efficient use of transactions and master files. Even when terminals were invented, the processing remained in batch mode.

Personal computers were more interactive than mainframes. From the start, they offered keyboards, screens, and interactive applications, and they were used for tasks very different from the tasks of mainframe computers. The biggest applications for PCs were word processors and spreadsheets. (They still are today.)

Some "traditional" computer applications were ported to personal computers. There were (and still are) systems for accounting and database management. There were utility programs and programming languages: BASIC, FORTRAN, COBOL, and later C and Pascal. But the biggest applications were the interactive ones, the ones that broke from the batch processing mold of mainframe computing.

(I am simplifying greatly here. There were interactive programs for mainframes. The BASIC language was designed as an interactive environment for programming, on mainframe computers.)

I cannot help but think that the typical mainframe programmer, looking at the new personal computers that appeared in the late 1970s, could only puzzle at what possible advantage they could offer. Personal computers were smaller, slower, and less capable than mainframes in every respect. Processors were slower and less capable. Memory was smaller. Storage was laughably primitive. PC software was also primitive, with nothing approaching the sophistication of mainframe operating systems, database management systems, or utilities.

The only ways in which personal computers were superior to mainframes were the BASIC language (Microsoft BASIC was more powerful than mainframe BASIC), word processors, and spreadsheets. Notice that these are all interactive programs. The cost and size of a personal computer made it possible for a person to own one, but the interactive nature of applications made it sensible for a person to own one.

That single attribute of interactive applications made the PC revolution possible. The success of modern-day PCs and the Microsoft empire was built on interactive applications.

I suspect that the success of cell phones and tablets will be built on a single attribute. But what that attribute is, I do not know. It may be portability. It may be location-aware capabilities. It may be a different level of interactivity.

I *know* -- that is, I feel very strongly -- that mobile/cloud is going to have a brilliant future.

I also feel that the key applications for mobile/cloud will be different from traditional PC applications, just as PC applications are different from mainframe applications. Any attempt to port PC applications to mobile/cloud will be doomed to failure, just as mainframe applications failed to port to PCs.

Mainframe applications live on, in their batch mode glory, to this day. Large companies and governments need accounting systems, and will continue to need them. PC applications will live through the mobile/cloud revolution, although some may fade; PowerPoint-style presentations may be better served on synchronized mobile devices than with a single PC and a projector.

Expect mobile/cloud apps to surprise us. They will not be word processors and spreadsheets. (Nor will they be accounting systems.) They will be more like Twitter and Facebook, with status updates and connections to our network of people.