Thursday, November 13, 2014

Cloud and agile change the rules

The history of computer programming is full of attempts to ensure success, or more specifically, to avoid failure. The waterfall method of separating analysis, design, and coding (with reviews after each step) is one such technique. Change reviews are another. System testing is another. Configuration management (especially for production systems) is another.

It strikes me that cloud computing and agile development techniques are yet more methods in our quest to avoid failure. But they change the rules from previous efforts.

Cloud computing tolerates failures of equipment. Agile development guards against failures in programming.

Cloud computing uses multiple instances of servers. It also uses stateless transactions, so any server can handle any request. (Well, any web server can handle any web request, and any database server can handle any database request.) If a server fails, another server (of the same type) can pick up the work.
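
As a minimal sketch of what "stateless" means in practice (Python for illustration; handle_request and shared_store are hypothetical names, with shared_store standing in for a database), the handler below keeps nothing in the server's own memory, so any instance can serve any request:

    # All durable state lives in the shared store (a database, say),
    # never in this server's memory.
    def handle_request(request, shared_store):
        user_id = request["user_id"]
        profile = shared_store.get(user_id)  # fetched fresh on every request
        return {"status": "ok", "profile": profile}

Because the handler holds no per-server state, a load balancer can send each request to any available instance, which is what lets a replacement server pick up the work.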

Cloud computing cannot, however, handle a failure in code. If I write a request handler and get the logic wrong, then each instance of the handler will fail.

Agile development handles the code failures by keeping the code correct at every step. With automated tests and small changes, the code can grow while programmers (and managers) know that the added features work.
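
As an illustration of the kind of automated test that makes those small, confident changes possible (a hypothetical function and tests, using Python's standard unittest module):

    import unittest

    def add_tax(amount, rate):
        """Return the amount with tax applied, rounded to cents."""
        return round(amount * (1 + rate), 2)

    class AddTaxTest(unittest.TestCase):
        def test_typical_rate(self):
            self.assertEqual(add_tax(100.00, 0.06), 106.00)

        def test_zero_rate(self):
            self.assertEqual(add_tax(100.00, 0.0), 100.00)

    if __name__ == "__main__":
        unittest.main()

Each small change is followed by a run of the tests; a passing run tells the programmer (and the manager) that the change broke nothing.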

These two techniques (cloud and agile) let us examine some of the strategies we have used to ensure success.

For hardware, we had long product life cycles. We selected products that were known to be reliable. For mainframes and early PCs, this was IBM. (For later PCs it was Compaq.) The premium brands commanded premium prices, because we valued the reliability of the equipment and the vendor. And since the equipment was expensive, we planned to use it for a long time.

For software, we practiced "defensive coding" and had each function check its inputs for invalid values or combinations of values. We held code reviews. We made the smallest changes possible, to reduce risk. We avoided large changes that would improve the readability of the code because we could not be sure that the revised code would work as expected in all cases.
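
A small sketch of that defensive style (a hypothetical transfer function, in Python only for illustration):

    def transfer(amount, balance):
        # Defensive coding: validate every input before acting on it.
        if not isinstance(amount, (int, float)):
            raise TypeError("amount must be a number")
        if amount <= 0:
            raise ValueError("amount must be positive")
        if amount > balance:
            raise ValueError("amount exceeds available balance")
        return balance - amount

Every function carried this kind of scaffolding; safer, but more code to write and maintain.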

In light of cloud computing's cheap hardware and agile development's pair programming and automated testing, these strategies may no longer be the best practice. Our servers are virtual, and while we want the underlying "big iron" to be reliable and long-lived, the servers themselves may have short lives. If that is the case, the "standard" server configuration may change over time, more frequently than we changed our classic, non-virtual servers.

The automated testing of agile development changes our approach to program development. Before comprehensive automated testing, minimal changes were prudent, as we could not be sure that a change would be free of unintended effects. A full set of automated tests provides complete coverage of a program's functionality, so we can be bolder in our changes to a program. Refactoring a small section of code (or a large section) is possible; our tests will verify that we have introduced no defects.
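
A small sketch of what that boldness looks like (hypothetical code): the refactored version can replace the tangled original because the same test pins the behavior of both.

    def total_before(items):
        # Original version: correct but tangled.
        t = 0
        for i in range(len(items)):
            t = t + items[i]["price"] * items[i]["qty"]
        return t

    def total_after(items):
        # Refactored version: same behavior, clearer intent.
        return sum(item["price"] * item["qty"] for item in items)

    # The test pins the behavior; passing for both versions means the
    # refactoring introduced no defect.
    order = [{"price": 2.0, "qty": 3}, {"price": 5.0, "qty": 1}]
    assert total_before(order) == total_after(order) == 11.0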

Cloud computing and agile development change the rules. Be aware of the changes and change your procedures to keep up.

Thursday, November 6, 2014

A New Microsoft And A New Tech World

Things are just not what they used to be.

In the good old days, Microsoft defined technology for business and set the pace for change. They had built an empire on Windows and products that worked with Windows.

Not only did Microsoft build products, they built their own versions of existing technologies to work in their world. Microsoft adopted the attitude of "not invented here": they eschewed popular products and built their own alternatives.

They built their own operating system (DOS at first, then Windows). They built their own word processor, their own spreadsheet (two actually: Multiplan was their first attempt), their own database manager, their own presentation software. They built their own browser. They even constructed their own version of a "ZIP" file: OLE Structured Storage.

All of these technologies had one thing in common: they worked within the Microsoft world. Microsoft Office ran on Windows - and nothing else. Internet Explorer worked on Windows - and nothing else. Visual Studio ran on Windows - and... you get the idea. Microsoft technology worked with Microsoft technology and nothing else.

For two decades this strategy worked. And then the world changed.

Microsoft has shifted away from the "all things Microsoft" approach. Consider:

  • Microsoft Word uses an open (well, open-ish) format of ZIP and XML (see the sketch after this list)
  • So does Microsoft Excel
  • Visual Studio supports projects that use JavaScript, HTML, and CSS
  • Microsoft Azure supports Linux, PHP, Python, and node.js
  • Office 365 apps are available for Android and iOS
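
That ZIP-and-XML format is easy to verify for yourself. A minimal sketch (assuming a local file named report.docx) using Python's standard zipfile module:

    import zipfile

    # A .docx file is an ordinary ZIP archive holding XML parts.
    with zipfile.ZipFile("report.docx") as doc:
        print(doc.namelist())              # includes word/document.xml
        xml = doc.read("word/document.xml")
        print(xml[:200])                   # the opening bytes of the body XML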

These are significant changes. Microsoft is no longer the self-centered (one might say solipsistic) entity that it once was.

We must give up our old prejudices. The idea that Microsoft technology is always good ("No one was fired for buying Microsoft") is not true, and never was. The weak reception of the Surface tablet, Windows phones, and Windows RT shows that.

We must also give up the notion that all Microsoft technology is large, expensive, bug-ridden, and difficult to maintain. It may be fun to hate on Microsoft, but it is not practical. Microsoft Azure is a capable set of tools. Their 'Express' products may be limited in functionality but they do work, and without much effort or expense.

The bigger change is the shift away from monoculture technology. We're entering an age of diverse technology. Instead of servers running Microsoft Windows and Microsoft databases and Microsoft applications with clients running Microsoft Windows and Microsoft browsers using Microsoft authentication, we have Microsoft applications running in Amazon.com's cloud with users holding Android tablets and Apple iPads.

Microsoft is setting a new standard for IT: multiple vendors, multiple technologies, and interoperability. What remains to be seen is how other vendors will follow.

Tuesday, October 21, 2014

Cloud systems are the new mainframe

The history of computers can be divided (somewhat arbitrarily) into six periods. These are:

  • Mainframe
  • Timeshare (on mainframes)
  • Minicomputers
  • Desktop computers (includes pre-PC microcomputers, workstations, and laptops)
  • Servers and networked desktops
  • Mobile devices (phones and tablets)

I was going to add 'cloud systems' to the list as a seventh period, but I got to thinking.

My six arbitrary periods of computing show definite trends. The first trend is size: computers became physically smaller in each successive period. Mainframe computers were (and are) large systems that occupy rooms. Minicomputers were the sizes of refrigerators. Desktop computers fit on (or under) a desk. Mobile devices are small enough to carry in a shirt pocket.

The next trend is cost. Each successive period has a lower cost than the previous one. Mainframes cost in the hundreds of thousands of dollars. Minicomputers in the tens of thousands. Desktop computers were typically under $3000 (although some did edge up near $10,000) and today are usually under $1000. Mobile device costs range from $50 to $500.

The third trend is administrative effort or "load". Mainframes needed a team of well-trained attendants. Minicomputers needed one knowledgeable person to act as "system operator" or "sysop". Desktop computers could be administered by a geeky person in the home or, for large offices, a team of support persons (but fewer than one support person per PC). Mobile devices need... no one. (Well, technically they are administered by the tribal chieftains: Apple, Google, or Microsoft.)

Cloud systems defy these trends.

By "cloud systems", I mean the cloud services that are offered by Amazon.com, Microsoft, Google, and others. I am including all of the services: infrastructure as a service, platform as a service, software as a service, machine images, queue systems, compute engines, storage engines, web servers... the whole kaboodle.

Cloud systems are large and expensive, and perhaps for that reason limited in number. They also have sizable teams of attendants: cloud systems are complex, and a large team is needed to keep everything running.

Cloud systems are much like mainframe computers.

The cloud services that are offered by vendors are much like the timesharing services offered by mainframe owners. With timesharing, customers could buy just as much computing time as they needed. Sound familiar? It's the model used by cloud computing.

We have, with cloud computing, returned to the mainframe era. This period has many similarities with the mainframe period. Mainframes were large, expensive to own, complex, and expensive to operate. Cloud systems are the same. The early mainframe period saw a number of competitors: IBM, NCR, CDC, Burroughs, Honeywell, and Univac, to name a few. Today we see competition between Amazon.com, Microsoft, Google, and others (including IBM).

Perhaps my "periods of computing history" are not so much a linear list as a cycle. Perhaps we are about to go "around" again, starting with the mainframe (or cloud) stage of expensive systems and evolving forward. What can we expect?

The mainframe period can be divided into two subperiods: before the System/360 and after. Before the IBM System/360, there was competition between companies and different designs. After the IBM System/360, companies standardized on that architecture. The System/360 design is still visible in mainframes of today.

An equivalent action in cloud systems would be the standardization of a cloud architecture. Perhaps the OpenStack software, perhaps Microsoft's Azure. I do not know which it will be. The key is for companies to standardize on one architecture. If it is a proprietary architecture, then that architecture's vendor is elevated to the role of industry leader, as IBM was with the System/360 (and later System/370) mainframes.

While companies are busy modifying their systems to conform to the industry standard platform, innovators develop technologies that allow for smaller versions. In the 1960s and 1970s, vendors introduced minicomputers. These were smaller than mainframes, less expensive, and easier to operate. For cloud systems, the equivalent would be... smaller than mainframe clouds, less expensive, and easier to operate. They would be less sophisticated than mainframe clouds, but "mini clouds" would still be useful.

In the late 1970s, technology advances led to the microcomputer, which could be purchased and used by a single person. As with mainframe computers, there were a variety of competing standards. After IBM introduced the Personal Computer, businesses (and individuals) elevated it to the industry standard. Equivalent events in cloud would mean the development of individual-sized cloud systems, small enough to be purchased by a single person.

The 1980s saw the rise of desktop computers. The 1990s saw the rise of networked computers, desktop and server. An equivalent for cloud would be connecting cloud systems to one another. Somehow I think this "inter-cloud connection" will occur earlier, perhaps in the "mini cloud" period. We already have the network hardware and protocols in place. Connecting cloud systems will probably require some high-level protocols, and maybe faster connections, but the work should be minimal.

I'm still thinking of adding "cloud systems" to my list of computing periods. But I'm pretty sure that it won't be the last entry.

Monday, October 6, 2014

Innovation in mobile and cloud; not in PCs

The history of IT is the history of innovation. But innovation is not evenly distributed, and it does not stay with one technology.

For a long time, innovation focused on the PC. The "center of gravity" was the IBM PC and PC-DOS; later it became the PC (not necessarily from IBM) and Windows. Windows NT, Windows 2000, and Windows XP all saw significant expansions of features.

With the rise of the Web, the center of gravity shifted to web servers and web browsers. I think it is no coincidence that Microsoft let Windows XP stand for years with no significant changes. People accepted Windows XP as "good enough" and looked for innovation in other areas -- web browsers, web servers, and databases.

This change broke Microsoft's business model. That business model (selling new versions of Windows and Office to individuals and corporations every so often) was broken when users decided that Windows XP was good enough, that Microsoft Office was good enough. They moved to newer versions reluctantly, not expectantly.

Microsoft is changing its business model. It is shifting to a subscription model for Windows and Office. It has Azure for cloud services. It developed the Surface tablet for mobile computing. Microsoft's Windows RT was an attempt at an operating system for mobile devices, an operating system that had reduced administrative tasks for the user. These are the areas of innovation.

We have stopped wanting new desktop software. I know of no new projects that target the desktop. I know of no new projects that are "Windows only" or "PC only". New projects are designed for mobile/cloud, or possibly web browsers and servers. With no demand for new applications on the desktop, there is no pressure to improve the desktop PC - or its operating system.

With no pressure to improve the desktop, there is no need to change the hardware or operating system. We see changes in three areas: larger memory and disks (mostly from inertia), smaller form factors, and prettier user interfaces (Windows Vista and Windows 8 "Metro"). With each of these changes, users can (rightfully) ask: what is the benefit to me?

It is a question that newer PCs and operating systems have not answered. But tablets and smartphones answer it quite well.

I think that Windows 10 is the "last hurrah" for Windows -- at least the desktop version. Innovations to Windows will be modifications for mobile/cloud technologies: better interactions with virtualization hypervisors and container managers. Aside from those, look for little change in desktop operating systems.

Thursday, September 25, 2014

Tablets are controlled by corporations

I admit I was wrong. In my previous post, I claimed that mobile devices would be free of corporate bureaucracy (and control). That's not true.

It's true in the sense that when the Acme corporation buys PCs it can control them with ActiveDirectory and group policies, and that similar infrastructure is not in place for tablets and smartphones. (I'm ignoring the third-party Mobile Device Management software.)

But it's false in the sense that corporations do control the mobile devices. The corporations are not Acme or whoever buys the devices. The controlling corporations are the owners of the walled gardens: Apple, Google, Amazon.com, and Microsoft. These corporations control the software available and the updates that occur automatically. (Yes, you can turn some updates off, but only while those corporations let you.)

The control that these companies exert is indisputable. Apple just recently placed a copy of a U2 album on every iPod and iPhone. Some time ago, Amazon.com deleted books from various Kindle e-readers. These companies are the "tribal chieftains", with immense power over the devices.

Android and iOS are popular in part because they are easy to use. That ease of use comes from the absence of administration tasks. The administration has not disappeared, it has moved from the "owner" of the device to the controlling company. Apple builds the updates for iOS and distributes those updates (along with updates to apps) to iPhones and iPads. Google does the same for Android devices. Microsoft does the same for "Metro" apps.

It may be this control that makes corporations reluctant to use tablets. They may know, deep down, that they are not in control of the devices. They may realize that at any moment the tribal chieftains may change the software, or worse, read or modify (or possibly delete) data on the devices. The chieftains might even grant other parties access to the devices.

All of this does not mean that corporations (the Acme variety, who are using the devices) should avoid mobile devices. It *does* mean that corporations should use them intelligently. They should not manage tablets and smartphones in the same way that they manage PCs, and they should not use tablets and smartphones in the same way as they use PCs. The model for mobile devices is very different from PCs.

Business can use tablets and smartphones, but differently than PCs. Data should be handled by specific apps, not generic applications like Microsoft Word and Excel. Mobile apps should authenticate users, retrieve a limited set of data from servers, present that data, manipulate that data, and then store the data on the server. Apps should not store data on the local device. (This is also good for the scenario of a lost device -- if it has no data, there can be no data "leakage" to unauthorized parties.)
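
A minimal sketch of that pattern (Python for illustration; the endpoint and token are hypothetical): authenticate, fetch a limited data set, work on it in memory, and send it back, never writing it to the device.

    import json
    import urllib.request

    API = "https://example.com/api/orders"  # hypothetical server endpoint

    def fetch_orders(token):
        req = urllib.request.Request(
            API, headers={"Authorization": "Bearer " + token})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)          # held in memory only

    def save_orders(token, orders):
        body = json.dumps(orders).encode()
        req = urllib.request.Request(
            API, data=body, method="PUT",
            headers={"Authorization": "Bearer " + token,
                     "Content-Type": "application/json"})
        urllib.request.urlopen(req).close() # nothing persisted locally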

Mobile devices are controlled by the tribal chieftains. Yet they can still be used by corporations -- and individuals.

Wednesday, September 24, 2014

Mobile devices may always be independent from corporate bureaucracy

The mobile revolution is different from the PC revolution.

The PC revolution saw the IBM PC adopted as the standard for personal computing. It was adopted by businesses and consumers, but most spending was from businesses.

The mobile revolution, in contrast, is driven by consumers. Individuals are buying smart phones and tablets. Businesses may be purchasing some mobile devices, but the bulk of the spending is on the consumer side.

Why is this distinction important?

To answer that, let's look at PCs and their history. Personal computers in corporations are anything but personal. They are purchased by the corporation and controlled by the corporation. The people using PCs rarely have administrator privileges for those PCs. Instead, the ability to install software and make significant changes is governed by the local copy of Windows and configurations in a central ActiveDirectory server.

The infrastructure of ActiveDirectory and Windows group policies was not built overnight, and was not part of the original PC. The first PCs ran PC-DOS and had no administrative controls at all -- any user could do anything, see anything, and change anything. Microsoft worked on PC-DOS for IBM, then MS-DOS for non-IBM computers, then Windows, and finally server software and ActiveDirectory. It took about twenty years to create, from the introduction of the IBM PC in 1981 to the introduction of ActiveDirectory in 1999.

That work was done by Microsoft because corporations wanted it. They wanted mechanisms to control the PCs and the access to data on PCs and servers. (And even with all of that interest, it took two decades to "enterprise-ify" PCs and make them part of the bureaucracy.)

Corporations were interested in PCs from the introduction of the IBM PC. (Some corporations were interested in earlier microcomputers, but they were a minority.) Corporations were interested in PCs because PCs ran Lotus 1-2-3, the popular spreadsheet at the time.

Now let's look at mobile devices. Corporations have a mild interest in mobile devices. It is only a fraction of the interest in PCs. There is no killer app for tablets, no must-have app for smart phones. (At least, not for corporations.) It is quite possible that phones and tablets are too personal for corporations.

It is telling that the Microsoft Surface tablet, with its ready-to-use connections to ActiveDirectory, has seen little interest. For consumers, the Surface (and other Windows tablets) is more expensive and less useful than the iPad and Android tablets. But even corporations have little interest in the Microsoft offerings.

Without corporate interest (and corporate spending), neither Apple nor Google has an incentive to make their tablets "safe for the enterprise" -- that is, controlled through a central administration point. (Yes, there are "mobile device management" packages, but they attract little interest.)

Apple and Google will invest their efforts in other areas, such as better hardware and improved reliability of apps in their stores (and maybe higher profits).

Corporations will use tablets for small, isolated projects, if at all. I suspect most corporations view their proven and familiar desktops and laptops as sufficient, with little benefit from tablets.

But all is not lost for tablets and smart phones. Some folks will use them for critical business purposes. These folks will not be the large corporations with established IT infrastructure. They will be the start-ups, the small companies who will build completely new apps to solve completely new business problems.

Sunday, September 21, 2014

Keeping our keyboards

Tablets are quite different from desktop PCs and laptop PCs. (Obviously.)

PCs have large displays, keyboards, mice, and wired network connections. They often have CD or DVD drives. Tablets, in contrast, have small displays, virtual keyboards, a touch screen (so no mice), no wired network connection, and media storage (if any) is limited to memory cards.

So we can view the transition from PC to tablet as a shift in peripherals. The "old school" PC used physical keyboards, mice, and disks; the "new school" tablets use touch screens, virtual keyboards, and no mice or disks.

Except for one small detail.

Tablet users are using keyboards.

Not mice.

Not printers.

Keyboards.

I do understand that some people are using tablets with mice and printers. But they are a small minority, nowhere near the sizable number of people using physical keyboards.

The appeal of keyboards is such that people continue to use them as an input device. They carry a keyboard with their tablet. They buy tablet covers that have built-in keyboards.

I think this tells us something about keyboards.