- Standardized computers for sale (or lease)
- General-purpose mainframes
- Minicomputers
- Personal computers
- Web applications
- Cloud applications
Each of these events was revolutionary -- each introduced a new form of computing. And all of these events (except one) saw an expansion of computing, an increase in the kinds of applications that computers could perform.
The first revolution (standardized computers) was in the days of the IBM 1401. Computers were large, expensive, and designed for specific purposes, but they were also consistent. One IBM 1401 was quite similar to another IBM 1401, ignoring differences in memory and tape drives. That similarity made possible the idea of commonly used applications and of common programming languages such as FORTRAN and COBOL.
The second revolution (general-purpose mainframes) was introduced by the IBM System/360. The System/360 was designed to run applications for different domains: scientific, commercial, and government. It built on the ideas of common applications and common programming languages.
The minicomputer revolution (timesharing) expanded computing with interactive applications. Instead of batch jobs that could be run only when scheduled by operators, timesharing allowed for processing when users wanted it. In fact, timesharing expanded computing from operators to users. (Not everyone was a user, but the set of users was much larger than the set of operators.) Minicomputers were used to create the C language and to write the Unix operating system.
The PC revolution brought computing to "the rest of us", or at least to those who were willing to spend thousands of dollars for a small computer. Its applications were more interactive than those of timesharing, and more graphical. The "killer" app was the spreadsheet, but word processors, small databases, and project planning software were also popular, and were made possible by PCs.
The web revolution introduced communication, and made applications available across a network.
Each of these changes -- revolutions, in my mind -- expanded the universe of computing. The expansions were sometimes competitive, with the "rebels" introducing new applications and the "old guard" attempting to copy the same applications onto the old platform. The expansions were sometimes divisive, with people in the "old" and "new" camps disagreeing on applications, programming languages, and techniques, and even what value the different forms of computing offered. But despite competition and disagreement, each camp had its own ground, and was relatively secure in that area.
There was no fear that minicomputers would replace mainframes. The forms of computing were too different. The efficiencies of the two forms were different. Mainframes excelled at transaction processing. Minicomputers excelled at interaction. Neither crossed into the other's territory.
When PCs arrived, there was no fear that PCs would replace mainframes. PCs would, after some time, replace stand-alone word processing systems and typewriters. But mainframes retained core business applications on big iron. (Minicomputers did die off, being caught between efficient mainframes and interactive PCs.)
When the web arrived, there was no fear that web servers would replace PCs. There was no fear that web applications would replace desktop applications. The web was a new place, with new capabilities. Instead of replacing PCs, the web expanded the capabilities of mainframe systems, providing a user interface into banking and corporate systems. PC applications such as word processing and spreadsheets remained on PCs.
Which brings us to cloud computing.
The cloud revolution is different. The approach with cloud computing, the philosophy, has been to absorb and replace existing applications. We have any number of companies ready to help "move applications to the cloud". There are any number of books, magazines, and online resources that describe tips and tricks for migrating to the cloud. The message is clear: the cloud is the place to be; convert your old applications to the cloud.
This mindset is different from the mindset of previous revolutions. The cloud revolution wants to take over all computing. The cloud revolution is predatory. It is not content with an expansion of computing; it wants to own it all.
I do not know why this revolution is different from previous changes. Why should this change, which is simply another form of computing, push people to behave differently?
At the root, it is people who are behaving differently. Cloud computing is not a sentient being; it has no feelings, no desires, and no motivations. Cloud computing does not want to take over the computing world; it is us, the people in IT, the developers and designers and managers who want cloud computing to take over the world.
I think that this desire may be driven by two factors: economics and control. The economics of cloud computing is better (cheaper) than the economics of PCs, discrete web servers, and even mainframes. But only if the application is designed for the cloud. A classic web application, "lifted and shifted" into the cloud, has the same economics as before.
The other factor is control. I think that people believe that they have more control over cloud-based applications than over desktop applications or classic web applications. The first comparison is undoubtedly true. Desktop applications, installed on users' PCs, are difficult to manage. Each PC has its own operating system, its own hardware, and its own set of other applications, any of which can interfere with the application. PCs can fail, they can run out of disk space, and -- worst of all -- they can let an old version of the application continue to run. The cloud does away with all of that: control moves from the user to the cloud administrator, and support becomes much simpler.
So I can understand people's desire to move applications to the cloud. But I think that people are missing opportunities. By focusing on moving existing applications into the cloud, we fail to see the new applications that are possible only in the cloud. Those opportunities include things such as big data and machine learning, and can include more.
Imagine the PC revolution, with small computers that fit on desktops, and applications limited to copies of existing mainframe applications. The new PCs would be running order entry systems and inventory systems and general ledger. Or at least we would be trying to get them to run those applications, and we would be ignoring the possibilities of word processing and spreadsheets.
Cloud computing is a form of computing, just as mainframes, PCs, and the web are all forms of computing. Each has its strengths (and weaknesses). Don't throw them away for efficiency, or for simpler support.