Friday, January 31, 2014

Faster update cycles mean PC apps become expensive

Ah, for the good old days of slow hardware upgrades. It used to be that one could buy a computer system and use it for years, possibly even a decade. The software would be upgraded, but the hardware would last. One could run a business knowing the future of its IT (hardware and software) was predictable.

Today we have faster cycles for hardware upgrades. Cell phones, tablets, and some PCs (Apple) are updated in a matter of months, not decades. The causes are multiple: competition (especially among phone vendors), changes in technology, and a form of planned obsolescence (Apple) that sees existing customers buying new versions.

I expect that these faster cycles will move to the PC realm.

The change in the life span of PC hardware will affect consumers and businesses, with the greater impact on businesses. I expect individual consumers to move away from PCs and switch to phones, tablets, game consoles, and internet TV appliances.

Businesses have a challenge ahead. Corporate users typically don't want PCs; they want computing power. Specifically, they want computing power with a user interface that is consistent over time. (When a new version of Windows is introduced to a corporate environment, one of the first actions is to configure the user interface to look like the old version. The inability of Windows 8 to emulate Windows 7 exactly is probably the cause for corporate discomfort with it.)

But the challenge to business goes beyond the user interface. Corporations want stable computing platforms to hold their applications. They want to build a system (or buy one) and use it for a long time. Switching from one vendor's system to another's is an expensive proposition, and corporations amortize the conversion cost over a long life. A new system, or even a new version of a system, can impose changes to the user interface, interfaces to other systems, and interactions with the operating system and drivers. All of these changes are part of the cost of implementation.

In the corporation's mind, the fewer conversions, the better.

That philosophy is colliding with the faster pace of hardware. Apple is not alone in its rapid release of hardware and operating systems; Microsoft is releasing new versions of Windows at a rate much faster than the ten-year gap between Windows XP and Windows 7. (I'm ignoring Windows Vista.)

To adapt to the faster change, I expect corporations to shift from the PC platform to technologies that allow them to retain longer lifespans: virtual PCs and cloud computing. Virtual PCs are the easier change, allowing applications to be shifted directly onto the new platform. With remote access, a (fast-changing) real PC can access the (slow-changing) "get the work done" virtual PC. In this case, virtualization and remote access act as a shock absorber for the change in technology.

Cloud computing offers a more efficient platform, but only after re-designing the application. The large monolithic PC applications must split into multiple services coordinated by (relatively) simple applications running on tablets and phones. In this case, the use of small, simple components on multiple platforms (server and tablet/phone) acts as the buffer to changes in technology.
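As a rough sketch of that split, a single feature of the former monolith becomes a small service, and the tablet/phone application shrinks to a thin client that merely asks for results. This is a minimal illustration in Python; the "invoice total" feature, the endpoint, and all names are invented for the example, not taken from any real system.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# The "service": one small, focused piece of the former monolith.
# Other features (estimates, reports) would live in their own
# services behind similar endpoints.
class InvoiceService(BaseHTTPRequestHandler):
    def do_GET(self):
        items = [19.99, 5.00, 42.50]          # would come from a database
        body = json.dumps({"total": round(sum(items), 2)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):             # silence request logging
        pass

def start_service():
    # Port 0 asks the OS for any free port; handy for a demo.
    server = HTTPServer(("127.0.0.1", 0), InvoiceService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server.server_port

# The "simple application" (what would run on a tablet or phone):
# no business logic, just a request and a display.
def fetch_total(port):
    with urlopen(f"http://127.0.0.1:{port}/total") as resp:
        return json.load(resp)["total"]

if __name__ == "__main__":
    port = start_service()
    print(fetch_total(port))   # 67.49
```

The point of the split is that the thin client survives platform churn: a new tablet or phone only needs the small client rewritten, while the service (and the business logic) stays put.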

The PC platform will see faster update cycles and shorter life spans. Applications on this platform will be subject to more changes. A company's customer base will use more platforms, driving up the cost of development, testing, and support.

Moving to virtual PCs or to the cloud is a way of avoiding that increase in costs.

Wednesday, January 29, 2014

The PC revolution was about infrastructure

Those of us who lived through the PC revolution like to think that PCs were significant advances in technology. They were advances, but in retrospect they were simply infrastructure.

Let's review the advances in PC technology:

Stand-alone PCs The original PCs were brought in as replacements for typewriters and calculators. This was a tactical use of PCs, one that improved the efficiency of the company but did not change the internal organization or the products and services offered by the company. The PC, with only word processors and spreadsheets, is not strong enough to make a strategic difference for a company.

Databases After some time, people figured out that PCs could be more than typewriters and calculators. There were custom PC applications, but more importantly there were the early databases (dBase II, dBase III, R:Base) and database languages (Clipper, Paradox) that let people store and retrieve data. These databases were single-user and stand-alone.

Networks The original PC networks (Novell, Banyan, Corvus) were introduced to share resources such as disks and printers. Printers (especially letter-quality printers) were expensive. Disks (large disks, say 40 MB) were also expensive. Sharing a common resource made economic sense. But the early networks were LANs (Local Area Networks) and confined to a single building.

Servers Initially part of "client/server systems", servers were database engines that handled requests from multiple clients. Client/server systems gave networks a significant purpose for existing: the ability to update a single database from multiple locations made it possible to migrate mainframe applications onto the cheaper PC platform.

The Internet Connecting networks made it possible for businesses to exchange information. The first big use of internet connections was e-mail; calendars followed quickly. Strictly speaking, the Internet is not a PC technology -- it was built mostly with minicomputers and Unix. The sockets libraries (WinSock) for PCs made the Internet accessible.

Web servers Built on these previous layers, the web is (now) a combination of PCs, minicomputers, mainframes, and rack-mounted servers. It is this layer that enables strategic as well as tactical advantages. Companies can provide self-service web pages (more of a tactical change, I think) and new services (strategic). New companies can form (Facebook, Twitter).

Virtualization The true advantage of virtualization is not consolidation of servers, but the ability to create or destroy machines quickly.

Cloud computing Once virtual machines were available and cheap, we created the cloud paradigm. Using an array of virtual computers, we can design applications that are distributed across multiple servers and are resistant to failure of any one of those servers. The distribution of work allows for scaling (up or down) as needed, adding or removing servers to handle the current load.
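The cloud design described above can be sketched in a few lines: a pool of interchangeable servers, work retried elsewhere when one server fails, and the pool grown or shrunk to match load. This is a toy model in Python with simulated servers and invented names, meant only to illustrate the pattern.

```python
# Toy model of the cloud pattern: interchangeable servers, any of
# which may fail, with work dispatched to whichever one responds.
class Server:
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def handle(self, request):
        if not self.healthy:
            raise ConnectionError(f"{self.name} is down")
        return f"{self.name} processed {request}"

def dispatch(servers, request):
    """Try servers until one succeeds; the pool absorbs single failures."""
    for server in servers:
        try:
            return server.handle(request)
        except ConnectionError:
            continue            # failure of one server is not fatal
    raise RuntimeError("all servers failed")

def scale(servers, load, per_server_capacity=100):
    """Grow or shrink the pool to match the current load."""
    needed = max(1, -(-load // per_server_capacity))   # ceiling division
    while len(servers) < needed:
        servers.append(Server(f"server-{len(servers)}"))
    del servers[needed:]        # drop surplus servers when load falls
    return servers

if __name__ == "__main__":
    pool = [Server("server-0", healthy=False), Server("server-1")]
    print(dispatch(pool, "job-42"))    # server-1 processed job-42
    print(len(scale(pool, load=350))) # 4
```

Real cloud platforms implement the same two ideas (failover and elastic scaling) with load balancers and auto-scaling groups rather than a Python loop, but the shape of the design is the same.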

All of these technologies are now infrastructure. They are well-understood and easily available.

New technologies are plugging in to this infrastructure. Smartphones, tablets, and big data are all sitting on top of this (impressive) stack of technology. Smartphone and tablet apps use low-wattage user interfaces and connect to cloud computing systems for processing. Big data systems use a similar design, with cloud computing engines providing the data for visualization software on PCs (or in web browsers).

When we built the first microcomputers, when we installed DOS on PCs, when we used modems to connect to bulletin-board systems, we thought we were creating the crest of technology. We thought we were building the top dog. But it didn't turn out that way. The PC and its later technologies let us build a significant computing stack.

Now that we have that stack, I think we can discard the traditional PC. The next decade should see the replacement of PCs. Not all at once, and at different rates in different environments. I expect PCs to exist in businesses for quite some time.

But disappear they shall.

Sunday, January 26, 2014

Where has all the COBOL gone?

COBOL.

The language of the ancients. The workhorse of accounting systems. The technology that runs business. The most popular programming language in the 1960s (and possibly 1970s and 1980s, depending on how one measures popularity).

Where has it gone? More specifically, where have the systems written in COBOL gone?

Demand for COBOL programmers has been declining. Not just this year, but over the better part of the past ten years -- and possibly longer than that. Consider the statistics reported by the site indeed.com:



[COBOL Job Trends graph from indeed.com]



The graph is generated in real time, so your view may differ from mine. But my view shows a steady decrease in requests for COBOL programmers from 2006 to 2014.

COBOL was the language used for business systems. It powered accounting systems, payroll systems, invoicing systems, inventory, and reporting for businesses ranging from insurance to telephone to banking to manufacturing. Those businesses are still here (although perhaps spread around the world) and I expect that those accounting systems, payroll systems, and invoicing systems are still with us. (At least, I still get paychecks and bills still arrive.)

I have two theories about this decline:

1) The demand for COBOL remains steady in absolute terms; it is declining as a percentage of the overall technology market.

2) The demand for COBOL is truly declining; the use of COBOL is declining.

With the first theory, systems written in COBOL remain in use. They are maintained as usual, and the demand for COBOL programmers remains steady. The decline of COBOL is a decline in market share, as other languages grow. Certainly the use of C++, C#, and Java has increased over the past decades. One can clearly see the market for Visual Basic in the 1990s. These other languages expand the "pie", and COBOL grows more slowly than the overall market.

The second theory is, I believe, the correct one: The demand for COBOL is declining. This can be caused by only one thing: a reduction in the number of COBOL systems.

Systems written in COBOL need a certain amount of maintenance. This maintenance is caused by forces exterior to the system owners: changes in market, changes in technology, and changes in legislation. Thus with a decline of COBOL demand, I can reasonably conclude that there is a decline in the number of COBOL-based systems.

But the functions performed by those old COBOL systems remain. We're still getting bills, in case you hadn't noticed. The bills may be electronic notifications and not paper invoices, but some system is generating them.

Given that the functions are being performed, there must be a system to perform them. (I highly doubt that companies have abandoned computers in favor of people.) Those systems are written in something other than COBOL. The question then becomes... which language?

A better question may be: is there a single language for the business community? Have companies gravitated to a new leader in technology? Or is the business community becoming fragmented?

I've seen nothing of a "new business standard", so I assume that the business community is splitting. The new languages for business are probably C# and Java. Small portions of the business world may be using other languages such as C, Objective-C, JavaScript and node.js, Python, or even F# or Haskell.

Getting back to COBOL. If I'm right about these new languages, then we may have seen "peak COBOL", with the glory of COBOL behind it. That gives us in the industry an opportunity.

The IT age is more than half a century old. We've seen technologies come and go. Not just hardware, but also software. The rise and decline of the big languages (COBOL, BASIC, Visual Basic) may tell us about the rise and decline of other popular technologies, some of which are still in the "rise" phase (perhaps tablets).

Perhaps COBOL can still teach us a thing or two.

Tuesday, January 21, 2014

From general to specific

We are entering a new age of computing.

The change is the shift from general to specific. PC hardware has been, since the initial PC up to now, standard and generic. One PC was very much like another PC, in terms of architecture and capacity. This standardization made the PC market possible, with PC manufacturers, accessory vendors, and software providers all working to a common standard.

To be sure, there was always some variation among PCs. Some had faster processors; some had more memory. Enthusiasts added super-large hard drives and super-fast video cards. But they all revolved around the PC standard. (A standard that evolved over time, moving from the original PC to the IBM PC AT to the Compaq Deskpro 386 to today's Intel-based desktops.)

Now we see the standard-issue technology splitting into distinct markets with distinct hardware. Many businesses have traded their desktop PCs for laptops and shifted "back end" work to cloud servers. Game consoles are not quite PCs: they contain specialized hardware and one cannot replace the operating system (at least not easily). The home PC is being replaced by tablets and smartphones. Hobbyists are experimenting with small-board systems like the Raspberry Pi and the BeagleBone.

For each of these uses, we are replacing the desktop PC with a smaller, specialized device.

The change is not limited to hardware. While businesses still run Microsoft Windows, other devices are moving to different operating systems. Game consoles run their own operating systems; even Microsoft's Xbox runs an operating system that is based on Windows but not quite the same as Windows on the desktop PC. Tablets and phones run iOS or Android. The hobbyists are using Linux.

The good old days of standard PCs saw PCs (and Windows) everywhere. The new age of specialization sees a fragmentation of that world, with separate hardware and software for the different types of users. This differentiation will allow the different markets to develop distinct pricing for hardware and software; already competition is driving down the prices of tablets to ranges unreachable by classic PCs.

I expect the job market to fracture along similar lines. Office applications will stay within the classic PC realm and move slowly to cloud-based solutions. The development of games is already distinct. The consumer market for apps has almost fully emerged. As the hardware and software of these markets diverge, I expect the development tools and techniques, the advertising, and the pay scales to diverge.

Eventually, we will not have "an IT jobs market" or "an IT career path". Instead, we will have career paths in business, in games and interactive entertainment, and in consumer products. Each will include IT as we think of it today (hardware, software, development, testing, etc.) as part of a larger whole. The hobbyists will perhaps be different in that they will have not a market for the exchange of dollars but a community for the exchange of information. They too will use IT for larger goals -- perhaps education or research.

We will lose the PC standard. In its place will be a standard for business, another standard for games, yet other standards for consumers, and (most likely) a collection of diverse hardware for hobbyists. I will not mourn the passing of the PC standard. It served its purpose, letting us develop a strong set of technology for diverse challenges. Now we can move to the next level and use technology that is better suited to specific tasks.

Monday, January 20, 2014

Windows 7 is "good enough" - and that's a problem

A large portion of the Windows user community has complained -- loudly -- about Windows 8 and its new user interface. People, as individuals or as members of a corporation, have made their displeasure known. They have written articles in trade magazines. They have posted blog entries. They have given presentations. (I suspect that there are anti-Windows-8 videos on YouTube.)

Most folks care more about getting their work done and less about the operating system. They don't want Windows 8, or even Microsoft Word or Microsoft Excel. They want their invoices, they want their estimates, they want their analyses.

The technology stack of PC hardware, Microsoft Windows, Microsoft Office, and specialty software is a set of tools. It is a means to the end, not the end.

Changes to that technology stack, when invisible, are unimportant. An update that fixes a security hole is good, especially when it has no effect on the workflow. After the update, Windows boots as usual, applications run as usual, and the work gets done.

Visible changes, such as removing the "Start" button, affect the workflow. The introduction of "ribbon" menus in Microsoft Office was also met with complaints.

The problem facing Microsoft is that their software (Windows, Office, SQL Server, etc.) has become good enough for use in the workplace. It has been good enough for years, which is why people use old versions.

In the good old days, new versions of software were clearly better. Windows 3.1 was much better than DOS. Windows 95 was better than Windows 3.1. Windows XP was better than Windows 95. People could see the benefit and were willing to move to the later version.

But once software becomes good enough, the benefits of a later version are less clear. Windows Vista was not clearly better than Windows XP. Windows 7 was better than Windows Vista, but perhaps not that much better than Windows XP.

Windows 8 is clearly different from Windows 7 (and Windows XP). But is it better? People perceive Windows XP and Windows 7 as good enough.

Which is ironic, as Microsoft built their empire on software that was good enough. They shipped products when those products were good enough to compete in the market. They improved products to become good enough to deliver revenue.

Microsoft Windows 8 must compete against Windows 7, and Windows 7 is good enough.

Two observations:

The current users of Windows believe their current systems to be good enough, and they are unwilling to change without clear benefits. The features of Windows 8 are insufficient to warrant a change.

A vendor cannot force products upon the market. (This lesson was made earlier with Microsoft "Bob" and IBM "TopView".) Users must see benefits, not merely features, in a product.

Monday, January 13, 2014

In the Post-PC era, different people use different hardware

The post-PC era is upon us, but perhaps it is not quite what we were expecting. Instead of a replacement for the PC, the post-PC age sees the PC co-existing with other computing devices.

In the post-PC era, the Personal Computer loses its position as the standard unit of computing. Now we have (in addition to PCs) phones, tablets, game consoles, virtual servers, and wearables. But not everyone uses everything. Some people use tablets, others use game consoles, and some use traditional PCs.

I see the following types of users:

Consumers Ordinary folks like you and me when we're at home. We don't need much in terms of computing power; we simply want to:

  • read and write e-mail
  • update Facebook
  • chat with friends
  • keep appointments
  • play lightweight games
  • watch videos
  • do some online banking
  • read e-books

For these tasks we use phones, tablets, and wearables (Google Glass or the Apple iWatch).

Gamers The serious game players want the high-end games, and maybe some videos. For them, game consoles (Xbox, PS4) are the way to go. (Gamers may also use tablets when they are being ordinary folks, too.)

Office managers and workers In the office, managers and workers will be performing the same tasks that they have been performing for the past decade:

  • read and write e-mail
  • calendar
  • review and compose documents
  • review and compose spreadsheets
  • browse the web
  • project management and scheduling

For them, traditional PCs will be necessary. These tasks need the rapid input of real keyboards (although mouse operations may be replaced by touch operations).

Office executives Also in the office but focussed more on meetings and personal interactions, executives will:

  • read and write e-mail
  • update calendars
  • review documents
  • review spreadsheets
  • review presentations

Notice that they do little in the way of composition. Executives will find phones and tablets (most likely Windows) more useful than traditional PCs.

Looking at this list, it seems that PCs are limited to the office. Not quite true; I think a number of specialists will want PCs. For example, developers will want IDEs and version control systems on PCs. Graphic designers will want to use Photoshop or Gimp on PCs.

The post-PC era does not mean the end of the PC. It does mean the end of the PC as the default choice for a computing device. We are entering an age of varied computing devices, each with strengths (and weaknesses).

Sunday, January 12, 2014

Android PCs will clobber Linux, on the desktop

The big PC operating systems (Windows, MacOS, and Linux) leave a lot of administration work to the user. Over the years, all have made improvements. Windows has focussed on domain management, giving corporate support teams administrative control over PCs. MacOS has hidden a lot of configuration from the user and made the installation of applications easy with drag-and-drop operations. Linux has leveraged open source, package managers, and repositories of software to reduce the cost of installing software.

Android already has a firm beachhead in the phone and tablet market. Now it can expand onto the home PC.

A number of manufacturers have introduced "Android PCs", full-sized computers running Android. I think that they have a bright future, and may supplant Linux on the home desktop PC.

But isn't Android really Linux, configured for phones and tablets?

Linux is at the heart of Android, true. But Android is more than Linux with a smaller user interface. Android handles more device management and software management than Linux.

Android's model is closer to Apple's iOS/iTunes method of managing software. Android and iOS let the user purchase and download software quickly and easily from centrally-managed repositories. They are not exactly the same: Apple locks you into its repository, while Android lets you add software from other repositories. Yet both have unified mechanisms that manage updates for all of your apps, and both let you move to a new device and keep your apps (and data). Buy a new phone, and (once you register) you can quickly re-install your apps without incurring additional charges, because Android and iOS remember that you have purchased them.

The traditional PC operating systems, in contrast, force you to upgrade your software and (possibly) acquire new licenses. In Windows, the Microsoft update service handles Microsoft products, but products from other vendors need their own systems. Buy a new Windows PC, and you have to install all of your applications, including the Microsoft ones. This arrangement works for the corporate environment, with a support team that assigns, licenses, and installs corporate-approved applications. It offers little for the individual PC owner.

Linux distros come close to Android's functionality, with an update service that handles all software. Yet Android's update system is easier to use and more convenient, especially for the individual user.

Windows will maintain its dominance in the corporate world, and Android will gain in the home market. It is the enthusiast, the computer geek, who wants to tinker with Linux and settings. The average person wants a computing appliance, one that needs as much attention as a toaster. Android delivers a better toaster than Linux (or Windows).

That's for user PCs. Servers will remain unaffected by the rise of Android. The people running servers (the sysadmins) are geeks, and they want (need) the ability to tune those servers.

Android (or some operating system that offers the same toaster-like, appliance-level experience) for servers may happen in the future, as individuals (non-sysadmins) come to want servers of their own. But that's some time away.