Thursday, January 30, 2020

The cloud revolution is different

The history of computing can be described as a series of revolutions. If we start the age of modern computing with the earliest electronic calculating machines, we have the following upheavals:

  • Standardized computers for sale (or lease)
  • General-purpose mainframes
  • Minicomputers
  • Personal computers
  • Web applications
  • Cloud applications

Each of these events was revolutionary -- each introduced a new form of computing. And all of these events (except one) saw an expansion of computing, an increase in the kinds of applications that computers could run.

The first revolution (standardized computers) was in the days of the IBM 1401. Computers were large, expensive, and designed for specific purposes, but they were also consistent. One IBM 1401 was quite similar to another IBM 1401, ignoring differences in memory and tape drives. The similarity in computers made possible the idea of commonly used applications, and common programming languages such as FORTRAN and COBOL.

The second revolution (a general-purpose computer) was introduced by the IBM System/360. The System/360 was designed to run applications for different domains: scientific, commercial, and government. It built on the ideas of common applications and common programming languages.

The minicomputer revolution (minicomputers, or timesharing) expanded computing with interactive applications. Instead of batch jobs that could be run only when scheduled by operators, timesharing allowed for processing when users wanted it. In fact, timesharing expanded computing from operators to users. (Not everyone was a user, but the set of users was much larger than the set of operators.) Minicomputers were used to create the C language and write the Unix operating system.

The PC revolution brought computing to "the rest of us", or at least to those who were willing to spend thousands of dollars on a small computer. Its applications were more interactive than those of timesharing, and more graphical. The "killer" app was the spreadsheet, but word processors, small databases, and project planning software were also popular, and were made possible by PCs.

The web revolution introduced communication, and made applications available across a network.

Each of these changes -- revolutions, in my mind -- expanded the universe of computing. The expansions were sometimes competitive, with the "rebels" introducing new applications and the "old guard" attempting to copy the same applications onto the old platform. The expansions were sometimes divisive, with people in the "old" and "new" camps disagreeing on applications, programming languages, and techniques, and even what value the different forms of computing offered. But despite competition and disagreement, each camp had its own ground, and was relatively secure in that area.

There was no fear that minicomputers would replace mainframes. The forms of computing were too different. The efficiencies of the two forms were different. Mainframes excelled at transaction processing. Minicomputers excelled at interaction. Neither crossed into the other's territory.

When PCs arrived, there was no fear that PCs would replace mainframes. PCs would, after some time, replace stand-alone word processing systems and typewriters. But mainframes retained core business applications on big iron. (Minicomputers did die off, being caught between efficient mainframes and interactive PCs.)

When the web arrived, there was no fear that web servers would replace PCs. There was no fear that web applications would replace desktop applications. The web was a new place, with new capabilities. Instead of replacing PCs, the web expanded the capabilities of mainframe systems, providing a user interface into banking and corporate systems. PC applications such as word processing and spreadsheets remained on PCs.

Which brings us to cloud computing.

The cloud revolution is different. The approach with cloud computing, the philosophy, has been to absorb and replace existing applications. We have any number of companies ready to help "move applications to the cloud". There are any number of books, magazines, and online resources that describe tips and tricks for migrating to the cloud. The message is clear: the cloud is the place to be, so convert your old applications to run there.

This mindset is different from the mindset of previous revolutions. The cloud revolution wants to take over all computing. The cloud revolution is predatory. It is not content with an expansion of computing; it wants to own it all.

I do not know why this revolution is different from previous changes. Why should this change, which is simply another form of computing, push people to behave differently?

At the root, it is people who are behaving differently. Cloud computing is not a sentient being; it has no feelings, no desires, and no motivations. Cloud computing does not want to take over the computing world; it is us, the people in IT, the developers and designers and managers who want cloud computing to take over the world.

I think that this desire may be driven by two factors: economics and control. The economics of cloud computing is better (cheaper) than the economics of PCs, discrete web servers, and even mainframes. But only if the application is designed for the cloud. A classic web application, "lifted and shifted" into the cloud, has the same economics as before.

The other factor is control. I think that people believe they have more control over cloud-based applications than over desktop applications or classic web applications. For desktop applications, that is undoubtedly true. Desktop applications, installed on users' PCs, are difficult to manage. Each PC has its own operating system, its own hardware, and its own set of other applications, any of which can interfere with the application. PCs can fail, they can run out of disk space, and -- worst of all -- they can let an old version of the application continue to run. The cloud does away with all of that: control moves from the user to the cloud administrator, and support becomes much simpler.

So I can understand why people want to move applications to the cloud. But I think that they are missing opportunities. By focusing on moving existing applications into the cloud, we fail to see the new applications that are possible only in the cloud. Those opportunities include things such as big data and machine learning, and can include more.

Imagine the PC revolution, with small computers that fit on desktops, and applications limited to copies of existing mainframe applications. The new PCs would be running order entry systems and inventory systems and general ledger. Or at least we would be trying to get them to run those applications, and we would be ignoring the possibilities of word processing and spreadsheets.

Cloud computing is a form of computing, just as mainframes, PCs, and the web are all forms of computing. Each has its strengths (and weaknesses). Don't throw the other forms away for efficiency, or for simpler support.

Tuesday, January 14, 2020

Another view of mainframes, minicomputers, and PCs

When thinking of the history of computing, one often thinks of hardware by size: mainframes, minicomputers, personal computers, smartphones, and servers. (I suppose one can think of a sixth, "elder days" era of mechanical and electromechanical devices. That's okay; it doesn't affect this post.)

In this classical view, computers are classified by size. Mainframes were large machines that filled rooms, minicomputers were smaller (the size of refrigerators), personal computers could fit on (or under) a desk, and smartphones can fit in your pocket. Servers pose a bit of a problem for this classification system, as servers are about the size of personal computers. They occupy the same "space" as personal computers. What distinguishes a server is that it can be mounted in a rack. 

I have been pondering a different classification system. This new view looks beyond the hardware and the peripheral devices, and considers the major concerns of the people operating (or purchasing) computers.

Mainframes were big, but simple. The main concerns were processor speed, memory size, and storage. Storage was usually magnetic tape, but could have been magnetic disk (or drum) or possibly punch cards. Computer systems were rated based on the number of jobs they could run (either at one time or over a single night).

Minicomputers were smaller and still had processors and memory and storage, but the rating system had changed. Instead of the "number of jobs", people were now concerned with the "number of users". Minicomputers had terminals, something that mainframe computers initially lacked. (Mainframes did acquire terminals as purchasers wanted on-line systems, but the first and primary purpose of a mainframe was to run jobs efficiently.) The people who purchased minicomputers wanted to know how many users they could support.

Personal computers saw a different configuration for hardware. Computers kept processors, memory, and storage, but dropped the notion of "terminal". Personal computers were one-per-user, and instead of a terminal (a remote display and keyboard connected usually via a serial line) they used a keyboard attached to the main board and a display attached to a video board. While purchasers were interested in memory, storage, and processor speed, the main concern became video -- screen size, screen resolution, number of colors, and hardware dedicated to video.

Smartphones (and if we want, tablets) are different from personal computers physically, being smaller and portable. But it is not size that is the major issue with smartphones. Nor is the main concern the number of jobs, or the number of terminals, or the size and resolution of the screen. No, the big question in the purchase of a smartphone is battery life. How long does the battery last? How long to recharge? Screen size and weight are also factors, and some people are loyal to particular brands. For smartphones, people think about quite different qualities than they do for laptop computers.

Servers are different from all of these categories because people who buy servers want to run web sites, or (now more often) virtual machines. Purchasers want to complete requests (either on the physical server or on virtual machines on the physical server). They don't care about video -- that's for the requesting client to worry about. Thus, while servers are about the same size as personal computers, people think about them in very different ways. (In my classification system, servers are closer to mainframes than personal computers.)

What do these different concerns tell us? First, it is interesting to see how concerns have changed over time. Our interest in computers is not constant; we focus on different aspects of different classes of hardware.

Secondly, we should note that our interest in computers is not in the hardware or the software, but instead what the computer can do for us. We care more about how we use computers than how computers look or run.

Also, notice that where we keep computers has changed over time, from large mainframes that are guarded in secure rooms to devices that are casually slipped into a pocket. (Much of this is a result of advances in technology.)

Now let's switch from hardware to software. Like hardware, software has changed over time, and our views of software have changed over time. Hardware definitely preceded software; indeed, the first computers were hard-wired to perform calculations and there was no software as we think of it.

Software for mainframes was designed for business purposes (accounting, inventory management, billing, etc.) and military purposes (ballistics tables were among the first applications). Software for minicomputers was designed for data analysis, which could leverage the interactivity of minicomputers with their terminals. The "killer app" for personal computers was the spreadsheet (Visicalc at first, then Lotus 1-2-3, and later Excel). For servers, it was databases and web servers; for smartphones, maps and GPS.

Our use of computers (hardware and software) has changed over time. Our expectations have changed. Initial uses of computers were "obvious": calculations better handled by machine. Later uses were not necessarily obvious but perhaps even more compelling -- including games such as "Space Invaders" and "Angry Birds".

This trend, I think, may tell us something about computing in the future. We may be more interested in computers (and applications) that serve us, that do things for us, than in computers that don't. Or, we may be interested in computers that meet specific needs better than the computers we have today.

But that may not be a surprise.

Wednesday, January 8, 2020

Predictions for 2020

I have some predictions for 2020. They may (or may not) be correct.

Hardware: Virtual desktops and the cloud

I expect 2020 to be the "year of the cloud", in a sense. While cloud computing is popular, I see the phrase "the cloud" becoming more popular in the upcoming year, even more popular than cloud computing itself. How can this be? How can the term be used more than the actual thing?

I expect that lots of people will use the term "the cloud" to mean online (or web-based) computing, even in situations that do not use actual, honest-to-goodness cloud computing.

We will see an interest in virtual desktops, specifically for Windows. Today's PCs are real PCs with operating systems and applications. In 2020, look for a push (by Microsoft) for virtual desktops -- instances of Windows hosted on servers and accessed by a browser or by Microsoft's Remote Desktop program.

Most folks will call this "Windows in the cloud" or "cloud computing". The former is a more accurate term, but the latter is not too wrong. Virtual desktops will be hosted on servers, with some applications built for cloud computing and others not. Microsoft's Office products (Word, Excel, and Outlook) will all reside in the cloud as true cloud applications. Applications from other vendors will run on virtual desktops but won't be true cloud applications.

Virtual Windows desktops offer several advantages to Microsoft. First, they are paid for as subscriptions, so Microsoft gets a steady cash flow. Second, Microsoft can move customers to the latest version of Windows 10. Third, and perhaps most importantly, Apple is not prepared to offer a matching service. (Apple remains in the world of "fat desktops", which run the operating system and the applications locally. They cannot move that experience to virtual desktops hosted in the cloud.)

Cloud-based Windows is not for everyone. Some will be unwilling to move to virtual desktops, and some will be unable to move. Anyone who insists on running an older version of Windows will remain in "fat desktop" land. And anyone who cannot install their software (perhaps because they have mislaid the install CD-ROM) will stay with their current desktop computers.

Those who do move to virtual desktops will be able to use simpler, lightweight computers (perhaps Chromebooks, perhaps computers with "Windows in S mode") that run little more than the software necessary to access the virtual desktop. The physical desktop computer will be a "terminal" to the virtual desktop. The benefits for those users will be cheaper hardware, more reliable desktops, invisible backups, versioning of data files, and the ability to shift work from one lightweight desktop "terminal" to another.

Programming languages: More Python, less Perl

Perl was perhaps the first programming language to be designated a "scripting language". (It won the designation, although other languages such as Awk and Csh predate it.)

But Perl has fallen on hard times. Developers have left Perl for Python, and the "Perl 6" effort, delayed and poorly advertised, has now re-branded itself as "Raku" to allow Perl 5 to continue without the cloud of eventual shutdown. The change comes late, and I fear that many have abandoned Perl in favor of Python.

Ruby is in a similar situation, with Python appealing to many developers and interest in Ruby waning.

Python is winning the "scripting race". Many schools teach it, and many experienced developers recommend it as a first language to learn. (I'm in that group.) It has good support with libraries, tools, and documentation. Microsoft supports it in "Visual Studio Code" and in "Visual Studio".

I expect Python to gain in popularity, and Perl and Ruby to decline.

Programming languages: Smaller languages

I expect to see a shift from object-oriented languages (Java, C#, and C++) to smaller scripting languages (Python and perhaps Ruby).

Object-oriented languages are effective for large, complex systems. They were just what we needed for the large, complex applications that ran on PCs and servers -- before cloud computing. With cloud computing, we build systems (large or small) out of services, and we strive to make services small. (A large application can be built of many small services.)

Java, C#, and C++ can be used to build small services, but Python and other small languages are often a better fit. The "big" object-oriented languages carry a lot of baggage to make object-oriented programming work; the smaller languages carry less.
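To give a sense of scale, here is a minimal sketch of the kind of small service I have in mind, written in Python with only the standard library. The endpoint name, port, and payload are made up for the example; a real service would add logging, error handling, and tests.

```python
# A minimal "small service" sketch in Python, using only the standard
# library. The /status endpoint and port number are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer a single endpoint; anything else gets a 404.
        if self.path == "/status":
            body = json.dumps({"service": "status", "ok": True}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)


if __name__ == "__main__":
    # Serve until interrupted; a real deployment would sit behind a
    # production-grade WSGI/ASGI server or an API gateway.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```

The whole service fits in one small file, which is the point: a focused unit that can be deployed and scaled on its own, without the ceremony a large object-oriented codebase requires.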

Two possible contenders for building services may be Swift and Visual Basic. Swift is relatively new, and still undergoing changes. Visual Basic has evolved through several generations and Microsoft may create a smaller, lighter version for services. (These are guesses; I have no indication from Apple or Microsoft that such projects are underway or even under consideration.)

Programming languages: Safer languages

The languages Rust and Go are getting a lot of attention. Both are compiled languages, letting one build fast and efficient applications.

Rust and Go may challenge C and C++ as the premier languages for high-performance systems. C and C++ have had a good run, from the 1970s to today. But more and more, safety in programs is becoming important. The design of languages such as Rust and Go gives them an advantage over C and C++.

C and C++ will stay with us, of course. Many large-scale and popular projects are written in them. I don't expect those projects to convert to Rust, Go, or any other language.

I do expect new projects to consider Rust and Go as candidate languages. I also expect teams converting existing systems from their current technology (which could be COBOL, Fortran, or even C and C++) to think about Rust and Go.

Containers: Nice for deployment, little else

Containers were quite popular in the past year, and I expect that they will remain popular. They are convenient ways of deploying (or sharing with others) a complete application, all ready to go.

Containers offer no benefits beyond that, however, and they are helpful only to the people who must deploy applications. Therefore, containers, like virtual machines, will quietly move to the realm of sysadmins.

Development: Better tech than ever

For the year 2020, the picture for development looks rather nice. We have capable languages (and lots of them); stable networks, servers, and operating systems; powerful tools for testing, deployment, and monitoring; and excellent tools for communication and collaboration. Problems in development will not be technical in nature -- which means that the challenges we face will be with people and interactions.

For developers (and project managers), I expect to see interest in collaboration tools. That includes traditional tools such as version control and item tracking, and messaging tools such as Slack. We might also see interest in new tools that have not yet been introduced.

Summary

I see a bright future for development. We have good tools and good practices. The challenges will be people-oriented, not technology-oriented. Let's enjoy this year!