
Monday, January 16, 2023

The end of more

From the very beginning, PC users wanted more. More pixels and more colors on the screen. More memory. Faster processors. More floppy disks. More data on floppy disks. (Later, it would be more data on hard disks.)

When IBM announced the PC/XT, we all longed for the space (and convenience) of its built-in hard drive. When IBM announced the PC/AT we envied those with the more powerful 80286 processor (faster! more memory! protected mode!). When IBM announced the EGA (Enhanced Graphics Adapter) we all longed for the higher-resolution graphics. With the PS/2, we wanted the reliability of 3.5" floppy disks and the millions of colors on a VGA display.

The desire for more didn't stop in the 1980s. We wanted the 80386 processor, and networks, and more memory, and faster printers, and multitasking. More programs! More data!

But maybe -- just maybe -- we have reached a point that we don't need (or want) more.

To quote a recent article in Macworld:

"Ever since Apple announced its Apple silicon chip transition, the Mac Pro is the one Mac that everyone has anxiously been awaiting. Not because we’re all going to buy one–most of the people reading this (not to mention me, my editor, and other co-workers) won’t even consider the Mac Pro. It’s a pricey machine and the work that we do is handled just as well by any Mac in the current lineup".

Here's the part I find interesting:

"the work that we do is handled just as well by any Mac in the current lineup"

Let that sink in a minute.

The work done in the offices of Macworld (which I assume is typical office work) can be handled by any of Apple's Mac computers. That means that the lowliest Mac can handle the work. Therefore Macworld, being a commercial enterprise and wanting to reduce expenses, should be equipping its staff with low-end MacBook Air or Mac mini computers. To do otherwise would be wasteful.

It is not just the Apple computers that have outpaced computing needs. Low end Windows PCs also handle most office work. (I myself am typing this on a Dell desktop that was made in 2007.)

The move from 32-bit processing to 64-bit processing had a negligible effect on many computing tasks. Microsoft Word, for example, ran just as well in 32-bit Windows as it did in 64-bit Windows. The move to 64-bit processing did not improve word processing.

There are some who do still want more. People who play games want the best performance from not only video cards but also central processors and memory. Folks who edit video want performance and high-resolution displays.

But the folks who need, really need, high performance are a small part of the PC landscape. Many of the demanding tasks in computation can be handled better by cloud-based systems. It is only a few tasks that require local, high-performance processing.

The majority of PC users can get by with a low-end PC. The majority of PC users are content. One may look at a new PC with more memory or more pixels, but the envy has dissipated. We have enough colors, enough pixels, and enough storage.

If we have reached "peak more" in PCs, what does that mean for the future of PCs?

An obvious change is that people will buy PCs less frequently. With no urge to upgrade, people will keep their existing equipment longer. Corporations that buy PCs for employees may continue on a "replace every three years" schedule, but that was driven by depreciation rules and tax laws. Small mom-and-pop businesses will probably keep computers until a replacement is necessary (I suspect that they have been doing that all along). Some larger corporations may choose to defer PC replacements, noting that cash outlays for new equipment are still cash outlays, and should be minimized.

PC manufacturers will probably focus on other aspects of their wares. PC makers will strive for better battery life, durability, or ergonomic design. They may even offer Linux as an alternative to Windows.

It may be that our ideas about computing are changing. It may be that instead of local PCs that do everything, we are now looking at cloud computing (and perhaps older web applications) and seeing a larger expanse of computing. Maybe, instead of wanting faster PCs, we will shift our desires to faster cloud-based systems.

If that is true, then the emphasis will be on features of cloud platforms. They won't compete on pixels or colors, but they may compete on virtual processors, administration services, availability, and supported languages and databases. Maybe we won't be envious of new video cards and local memory, but envious instead of uptime and automated replication. 

Thursday, October 20, 2022

The Next Big Thing

What will we see as the next big thing?

Let's look at the history of computer technology -- or rather, a carefully curated version of the history of computer technology.

The history of computing can be divided into eras: The mainframe era, the minicomputer era, the micro/PC era, and so forth. And, with careful editing, we can see that these eras have similar durations: about 15 years each.

Let's start with mainframe computers. We can say that the mainframe era ran from 1950 to 1965. Mainframe computers were (and still are) large, expensive computers capable of significant processing. They are housed in rooms with climate control and dedicated power. Significantly, mainframe computers are used by people only indirectly. In the mainframe age, programmers submitted punch cards which contained source code; the cards were fed into the computer by an operator (one of the few allowed in the computer room); the computer compiled the code and ran the program; output was usually on paper and delivered to the programmer some time later. Mainframe computers also ran batch jobs to read and process data (usually financial transactions). Data was often read from magnetic tape, and output could be to magnetic tape (updated data) or paper (reports).

Minicomputers were popular from 1965 to 1980. Minicomputers took advantage of newer technology; they were smaller, less expensive, and most importantly, allowed for multiple users on terminals (either paper-based or CRT-based). The user experience for minicomputers was very different from the experience on mainframes. Hardware, operating systems, and programming languages let users interact with the computer in "real time"; one could type a command and get a response.

Microcomputers and Personal Computers (with text displays, and without networking) dominated from 1980 to 1995. It was the age of the Apple II and the IBM PC, computers that were small enough (and inexpensive enough) for an individual to own. They inherited the interactive experience of minicomputers, but the user was the owner and could change the computer at will. (The user could add memory, add disk, upgrade the operating system.)

Personal Computers (with graphics and networking) made their mark from 1995 to 2010. They made the internet available to ordinary people. Graphics made computers easier to use.

Mobile/cloud computers became dominant in 2010. Mobile devices without networks were not enough (the Palm Pilot and the Windows pocket computers never gained much traction). Even networked devices such as the original iPhone and the Nokia N800 saw limited acceptance. It was the combination of networked mobile device and cloud services that became the dominant computing model.

That's my curated version of computing history. It omits a lot, and it fudges some of the dates. But it shows a trend, one that I think is useful to observe.

That trend is: computing models rise and fall, with their typical life being fifteen years.

How is this useful? Looking at the history, we can see that the mobile/cloud computing model has been dominant for slightly less than fifteen years. In other words, its time is just about up.

More interesting is that, according to this trend (and my curated history is too pretty to ignore), something new should come along and replace mobile/cloud as the dominant form of computing.

Let's say that I'm right -- that there is a change coming. What could it be?

It could be any of a number of things. Deep-fake tech allows for the construction of images, convincing images, of any subject. It could be virtual reality, or augmented reality. (The difference is nontrivial: virtual reality makes full images, augmented reality lays images over the scene around us.) It could be watch-based computing. 

My guess is that it will be augmented reality. But that's a guess.

Whatever the new thing is, it will be a different experience from the current mobile/cloud model. Each of the eras of computing had its own experience. Mainframes had an experience of separation and working through operators. Minicomputers had interactive experience, although someone else controlled the computer. Personal computers had interaction and the user owned the computer. Mobile/cloud let people hold computers in their hand and use them on the move.

Also, the next big thing does not eliminate the current big thing. Mobile/cloud did not eliminate web-based systems. Web-based systems did not eliminate desktop applications. Even text-mode interactive applications continue to this day. The next big thing expands the world of computing.

Monday, September 25, 2017

Web services are the new files

Files have been the common element of computing since at least the 1960s. Files existed before disk drives and file systems, as one could put multiple files on a magnetic tape.

MS-DOS used files. Windows used files. OS/2 used files. (Even the p-System used files.)

Files were the unit of data storage. Applications read data from files and wrote data to files. Applications shared data through files. Word processor? Files. Spreadsheet? Files. Editor? Files. Compiler? Files.

The development of databases saw another channel for sharing data. Databases were (and still are) used in specialized applications. Relational databases are good for consistently structured data, and provide transactions to update multiple tables at once. Microsoft hosts its Team Foundation Server on top of SQL Server. (Git, in contrast, uses files exclusively.)

Despite the advantages of databases, the main method for storing and sharing data remains files.

Until now. Or in a little while.

Cloud computing and web services are changing the picture. Web services are replacing files. Web services can store data and retrieve data, just as files do. But web services are cloud residents; files belong to local computing. Using URLs, one can think of a web service as a file with a rather funny name.

Web services are also dynamic. A file is a static collection of bytes: what you read is exactly what was written. A web service can provide a set of bytes that is constructed "on the fly".
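
The contrast can be sketched in a few lines of Python (a hypothetical illustration; the file name and the service function are inventions for this example, not any real API):

```python
import datetime
from pathlib import Path

# A file is static: what you read back is exactly what was written.
Path("greeting.txt").write_text("hello")
static = Path("greeting.txt").read_text()   # always "hello"

# A web service can build its response "on the fly". This stand-in
# function plays the role of a service handler: the bytes it returns
# are constructed per call, not replayed from storage.
def greeting_service() -> str:
    return f"hello, today is {datetime.date.today().isoformat()}"

dynamic = greeting_service()  # different output on different days
```

The same caller-side operation ("give me the bytes") hides two very different mechanisms behind it.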

Applications that use local computing -- desktop applications -- will continue to use files. Cloud applications will use web services.

Those web services will be, at some point, reading and writing files, or database entries, which will eventually be stored in files. Files will continue to exist, as the basement of data storage -- around, but visited by only a few people who have business there.

At the application layer, cloud applications and mobile applications will use web services. The web service will be the dominant method of storing, retrieving, and sharing data. It will become the dominant method because the cloud will become the dominant location for storing data. Local computing, long the leading form, will fall to the cloud.

The default location for data will be the cloud; new applications will store data in the cloud; everyone will think of the cloud. Local storage and local computing will be the oddball configuration. Legacy systems will use local storage; modern systems will use the cloud.

Monday, June 5, 2017

The Demise of Apple

Future historians will look back at Apple, point to a specific moment, and say "Here, at this point, is when Apple started its decline. This event started Apple's fall." That point will be the construction of their new spaceship-inspired headquarters.

Why do I blame their new building? I don't, actually. I think others -- those future historians -- will. They will get the time correct, but point to the wrong event.

First things first. What do I have against Apple's shiny new headquarters?

It's round.

Apple's new building is large, elegant, expensive, and ... the wrong shape. It is a giant circle, or wheel, or doughnut, and it works poorly with human psychology and perception. The human mind works better with a grid than a circle.

Not that humans can't handle circular objects. We can, when they are small or distant. We have no problem with the moon being round, for example. We're okay with clocks and watches, and old-style speedometers in cars.

We're good when we are looking at the entire circle. Watches and clocks are smaller than us, so we can view the entire circle and process it. (Clocks in towers, such as the "Big Ben" clock in London or one in the center of town, are also okay, since we view them from a distance and they appear small.)

The problems occur when we are inside the circle, when we are navigating along the circumference. We're not good at keeping track of gradual changes in direction. (Possibly why so many people get lost in the desert. They travel in a circle without realizing it.)

Apple's building looks nice, from above. I suspect the experience of working inside the building will be one of modest confusion and discomfort. Possibly at such a minor level that people do not realize that something is wrong. But this discomfort will be significant, and eventually people will rebel.

It's ironic that Apple, the company that designs and builds products with the emphasis on "easy to use", got the design of their building wrong.

So it may be that historians, looking at Apple's (future) history, blame the design of the new headquarters for Apple's (future) failures. They will (rightly) associate the low-level confusion and additional brain processing required for navigation of such a building as draining Apple's creativity and effectiveness.

I think that they (the historians) will be wrong.

The building is a problem, no doubt. But it won't cause Apple's demise. The true cause will be overlooked.

That true cause? It is Apple's fixation on computing devices.

Apple builds (and sells) computers. They are the sole company that has survived from the 1970s microcomputer age. (Radio Shack, Commodore, Cromemco, Sol, Northstar, and the others left the market decades ago.) In that age, microcomputers were stand-alone devices -- there was no internet, no ethernet, no communication aside from floppy disks and a few on-line bulletin board systems (BBS) that required acoustic coupler modems. Microcomputers were "centers of computing" and they had to do everything.

Today, computing is changing. The combination of fast and reliable networks, cheap servers, and easy virtual machines allows the construction of cloud computing, where processing is split across multiple processors. Google is taking advantage of this with its Chromebooks, which are low-end laptops that run a browser and little else. The "real" processing is performed not on the Chromebook but on web servers, often hosted in the cloud. (I'm typing this essay on a Chromebook.)

All of the major companies are moving to cloud technology. Google, obviously, with Chromebooks and App Engine and Android devices. Microsoft has its Azure services and versions of Word and Excel that run entirely in the cloud, and they are working on a low-end laptop that runs a browser and little else. It's called the "Cloudbook" -- at least for now.

Amazon.com has its cloud services and its Kindle and Fire tablets. IBM, Oracle, Dell, HP, and others are moving tasks to the cloud.

Except Apple. Apple has no equivalent of the Chromebook, and I don't think it can provide one. Apple's business model is to sell hardware at a premium, providing a superior user experience to justify that premium. The superior user experience is possible with local processing and excellent integration of hardware and software. Apps run on the Macs, MacBooks, and iPhones. They don't run on servers.

A browser-only Apple laptop (a "Safaribook"?) would offer little value. The Apple experience does not translate to web sites.

When Apple does use cloud technology, they use it as an accessory to the PC. The processing for Siri is done in a big datacenter, but it's all for Siri and the user experience. Apple's iCloud lets users store data and synchronize it across devices, but it is simply a big, shared disk. Siri and iCloud make the PC a better PC, but don't transform the PC.

This is the problem that Apple faces. It is stuck in the 1970s, when individual computers did everything. Apple has made the experience pleasant, but it has not changed the paradigm.

Computing is changing. Apple is not. That is what will cause Apple's downfall.

Monday, May 1, 2017

That old clunky system -- the smart phone

Mainframes probably have first claim on the title of "that old large hard-to-use system".

Minicomputers were smaller, easier to use, less expensive, and less fussy. Instead of an entire room, they could fit in the corner of an office. Instead of special power lines, they could use standard AC power.

Of course, it was the minicomputer users who thought that mainframes were old, big, and clunky. Why would anyone want that old, large, clunky thing when they could have a new, small, cool minicomputer?

We saw the same effect with microcomputers. PCs were smaller, easier to use, less expensive, and less fussy than minicomputers.

And of course, it was the PC users who thought that minicomputers (and mainframes) were old, big, and clunky. Why would anyone want that old, large, clunky thing when they could have a new, small, cool PC?

Here's the pattern: A technology gets established and adopted by a large number of people. The people who run the hardware devote time and energy to learning how to operate it. They read the manuals (or web pages), they try things, they talk with other administrators. They become experts, or at least comfortable with it.

The second phase of the pattern is this: A new technology comes along, one that does similar (although often not identical) work as the previous technology. Many times, the new technology does a few old things and lots of new things. Minicomputers could handle data-oriented applications like accounting, but were better at data input and reporting. PCs could handle input and reporting, but were really good at word processing and spreadsheets.

The people who adopt the later technology look back, often in disdain, at the older technology that doesn't do all of the cool new things. (And too often, the new-tech folks look down on the old-tech folks.)

Let's move forward in time. From mainframes to minicomputers, from minicomputers to desktop PCs, from desktop PCs to laptop PCs, from classic laptop PCs to MacBook Air-like laptops. Each transition has the opportunity to look back and ask "why would anyone want that?", with "that" being the previous cool new thing.

Of course, such effects are not limited to computers. There were similar feelings with the automobile, typewriters (and then electric typewriters), slide rules and pocket calculators, and lots more.

We can imagine that one day our current tech will be considered "that old thing". Not just ultralight laptops, but smartphones and tablets too. But what will the cool new thing be?

I'm not sure.

I suspect that it won't be a watch. We've had smartwatches for a while now, and they remain a novelty.

Ditto for smart glasses and virtual reality displays.

Augmented reality displays, such as Microsoft's HoloLens, show promise, but also remain a diversion.

What the next big thing needs is a killer app. For desktop PCs, the killer app was the spreadsheet. For smartphones, the killer app was GPS and maps (and possibly Facebook and games). It wasn't the PC or the phone that people wanted, it was the spreadsheet and the ability to drive without a paper map.

Maybe we've been going about this search for the next big thing in the wrong way. Instead of searching for the device, we should search for the killer app. Find the popular use first, and then you will find the device.

Thursday, April 14, 2016

Technology winners and losers

Technology has been remarkably stable for the past three decades. The PC dominated the hardware market and Microsoft dominated the software market.

People didn't have to use PCs and Microsoft Windows. They could choose to use alternative solutions, such as Apple Macintosh computers with Mac OS. They could use regular PCs with Linux. But the people using out-of-mainstream technology *chose* to use it. They knew what they were getting into. They knew that they would be a minority, that when they entered a computer shop that most of the offerings would be for the other, regular Windows PCs and not their configuration.

The market was not always this way. Before the IBM PC became the standard, different manufacturers provided different systems: the Apple II, the TRS-80, DEC's Pro-325 and Pro-350, the Amiga, ... there were many. All of those systems were swept aside by the IBM PC, and all of the enthusiasts for those systems knew the pain of loss. They had lost their chosen system to the one designated by the market as the standard.

In a recent conversation with a Windows enthusiast, I realized that he was feeling a similar pain in his situation. He was dejected at the dearth of support for Windows phones -- he owned such a phone, and felt left out of the mobile revolution. Windows phones are out-of-mainstream, and many apps do not run on them.

I imagine that many folks in the IT world are feeling the pain of loss. Some because they have Windows phones. Others because they have been loyal Microsoft users for decades, perhaps their entire career, and now Windows is no longer the center of the software world.

This is their first exposure to loss.

Those of us grizzled veterans who remember CP/M or Amiga DOS have had our loss; we know how to cope. The folks who used WordPerfect or Lotus 1-2-3 had to switch to Microsoft products; they know loss too. But no technology has been forced from the market for quite some time. Perhaps the last was IBM's OS/2, back in the 1990s. (Or perhaps Visual Basic, when it was modified to VB.NET.)

But IT consists of more than grizzled veterans.

For someone entering the IT world after the IBM PC (and especially after the rise of Windows), it would be possible -- and even easy -- to enjoy a career in dominant technologies: stay within the Microsoft set of technology and everything was mainstream. Microsoft technology was supported and accepted. Learning Microsoft technologies such as SQL Server and SharePoint meant that you were on the "winning team".

A lot of folks in technology have never known this kind of technology loss. When your entire career has been with successful, mainstream technology, the change is unsettling.

Microsoft Windows Phone is a technology on the edge. It exists, but it is not mainstream. It is a small, oddball system (in the view of the world). It is not the "winning team"; iOS and Android are the popular, mainstream technologies for phones.

As Microsoft expands beyond Windows with Azure and apps for iOS and Android, it competes with more companies and more technologies. Azure competes with Amazon.com's AWS and Google's Compute Engine. Office Online and Office 365 compete with Google Docs. OneDrive competes with DropBox and BOX. Microsoft's technologies are not the de facto standard, not always the most popular, and sometimes the oddball.

For the folks whose worldview must change from "Microsoft technology is always the most popular and most accepted" to "different technologies compete, and sometimes Microsoft loses", an example to follow would be ... Microsoft.

Microsoft, after years of dominance with the Windows platform and applications, has widened its view. It is not "the Windows company" but a technology company that supplies Windows. More than that, it is a technology company that supplies Windows, Azure, Linux, and virtual machines. It is a company that supplies Office applications on Windows, iOS, and Android. It is a technology company that supplies SQL Server on Windows and soon Linux.

Microsoft adapts. It changes to meet the needs of the market.

That's a pretty good example.

Thursday, October 22, 2015

Windows 10 means a different future for PCs

Since the beginning, PCs have always been growing.  The very first IBM PCs used 16K RAM chips (for a maximum of 64K on the CPU board); these were quickly replaced by PCs with 64K RAM chips (which allowed 256K on the CPU board).

We in the PC world are accustomed to new releases of bigger and better hardware.

It may have started with that simple memory upgrade, but it continued with hard drives (the IBM PC XT), enhanced graphics, higher-capacity floppy disks, and a more capable processor (the IBM PC AT), and an enhanced bus, even better graphics, and even better processors (the IBM PS/2 series).

Improvements were not limited to IBM. Compaq and other manufacturers revised their systems and offered larger hard drives, better processors, and more memory. Every year saw improvements.

When Microsoft became the market leader, it played an active role in the specification of hardware. Microsoft also designed new operating systems for specific minimum platforms: you needed certain hardware to run Windows NT, certain (more capable) hardware for Windows XP, and even more capable hardware for Windows Vista.

Windows 10 may change all of that.

Microsoft's approach to Windows 10 is different from previous versions of Windows. The changes are twofold. First, Windows 10 will see a constant stream of updates instead of the intermittent service packs of previous versions. Second, Windows 10 is "it" for Windows -- there will be no later release, no "Windows 11".

With no Windows 11, people running Windows 10 on their current hardware should be able to keep running it. Windows Vista forced a lot of people to purchase new hardware (which was one of the objections to Windows Vista); Windows 11 won't force that because it won't exist.

Also consider: Microsoft made it possible for just about every computer running Windows 8 or Windows 7 (or possibly Windows Vista) to upgrade to Windows 10. Thus, Windows 10 requires no more hardware than those earlier versions.

What may be happening is that Microsoft has determined that Windows is as big as it is going to be.

This makes sense for desktop PCs and for servers running Windows.

Most servers running Windows will be in the cloud. (They may not be now, but they will be soon.) Cloud-based servers don't need to be big. With the ability to "spin up" new instances of a server, an overworked server can be given another instance to handle the load. A system can provide more capacity with more servers. It is not necessary to make the server bigger.
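
That scale-out arithmetic can be sketched in a few lines (hypothetical numbers; `instances_needed` is an invented helper for illustration, not a real cloud API):

```python
import math

def instances_needed(load_rps: float, per_server_rps: float) -> int:
    """Scale out, not up: cover the load by adding server instances
    rather than making any one server bigger."""
    return max(1, math.ceil(load_rps / per_server_rps))

# A server type handling 200 requests/second, facing a load of
# 950 requests/second, calls for five instances in total.
print(instances_needed(950, 200))   # -> 5
```

When the load drops, instances can be shut down just as easily, which is the economic appeal of the model.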

Desktop PCs, either in the office or at home, run a lot of applications, and these applications (in Microsoft's plan) are moving to the cloud. You won't need a faster machine to run the new version of Microsoft Word -- it runs in the cloud and all you need is a browser.

It may be that Microsoft thinks that PCs have gotten as powerful as they need to get. This is perhaps not an unreasonable assumption. PCs are powerful and can handle every task we ask of them.

As we shift our computing from PCs and discrete servers to the cloud, we eliminate the need for improvements to PCs and discrete servers. The long line of PC growth stops. Instead, growth will occur in the cloud.

Which doesn't mean that PCs will be "frozen in time", forever unchanging. It means that PC *growth* will stop, or at least slow to a glacial pace. This has already happened with CPU clock frequencies and bus widths. Today's CPUs are about as fast (in terms of clock speed) as CPUs from 2009. Today's CPUs use a 64-bit data path, which hasn't changed since 2009. PCs will change, slowly: desktop PCs will become physically smaller, laptops will become thinner and lighter, and battery life will increase.

PCs, as we know them today, will stay as we know them today.

Thursday, July 30, 2015

Tablets for consumption, cloudbooks for creation

Tablets and cloudbooks are mobile devices of the mobile/cloud computing world.

Tablets are small, flat, keyboardless devices with a touchscreen, processor, storage, and an internet connection. The Apple iPad is possibly the most well-known tablet. The Microsoft Surface is possibly the second most well-known. Other manufacturers offer tablets with Google's Android.

Cloudbooks are light, thin laptops. They contain a screen (possibly a touchscreen, but touch isn't a requirement), processor, storage, and internet connection, and the one thing that separates them from tablets: a keyboard. They look and feel like laptop computers, yet they are not laptops in the usual sense. They have a low-end processor and a custom operating system designed to do one thing: run a browser. The most well-known cloudbook computers are Google's Chromebooks.

I'm using the term "cloudbook" here to refer to the generic lightweight, low-powered, single-purpose laptop computer. A simple search shows that the phrase "cloudbook" (or a variation on capitalization) has been used for specific products, including an x86 laptop, a brand of e-books, a cloud services broker, and even an accounting system! Acer uses the name "cloudbook" for its, um, cloudbook devices.

Tablets and cloudbooks serve two different purposes. Tablets are designed for the consumption of data and cloudbooks are designed for the creation of data.

Tablets allow for the installation of apps, and there are apps for all sorts of things. Apps to play games. Apps to play music. Apps to chat with friends. Apps for e-mail (generally effective for reading e-mail and writing brief responses). Apps for Twitter. Apps for navigation.

Cloudbooks allow for the installation of apps too, although it is the browser that allows for apps and not the underlying operating system. On a Chromebook, it is Chrome that manages the apps. Google confuses the issue by listing web-based applications such as its Docs word processor and Sheets spreadsheet as "apps". The separation of web-based apps and browser-based apps is made more complex by Google's creation of duplicate apps for each environment to support off-line work. For off-line work, you must have a local (browser-based) app.

The apps for cloudbooks are oriented toward the composition of data: word processor, spreadsheet, editing photographs, and more.

I must point out that these differences are in orientation, not absolute capability. One can consume data on a cloudbook. One can, with the appropriate tools and effort, create on a tablet. The two types of devices are not exclusive. In my view, it is easier to consume on a tablet and easier to create on a cloudbook.

Tablets are already popular. I expect that cloudbooks will be popular with people who need to create and manage data. Two groups I expect to use cloudbooks are developers and system administrators. Cloudbooks are a convenient size for portability and capable enough to connect to cloud-based development services such as Cloud9, Codeanywhere, Cloud IDE, or Sourcekit.

Wednesday, January 28, 2015

The mobile/cloud revolution has no center

In some ways, the mobile/cloud market is a re-run of the PC revolution. But not completely.

The PC revolution of the 1980s (which saw the rise of the IBM PC, PC-DOS, and related technologies) introduced new hardware that was cheaper and easier to use than the previous technologies of mainframes and minicomputers. Today's mobile/cloud revolution shares that aspect, with cloud-based services and mobile devices cheaper than their PC-based counterparts. It's much easier to use a phone or tablet than it is to use a PC -- ask the person who installs software.

The early PC systems, while cheaper and easier to use, were much less capable than the mainframe and minicomputer systems. People ran large corporations on mainframes and small businesses on minicomputers; PCs were barely able to print and handle a few spreadsheets. It was only after PC-compatible networks and network-aware software (Windows 3.1, Microsoft Exchange) that one could consider running a business on PCs. Mobile/cloud shares this attribute, too. Phones and tablets are network-aware, of course, but the whole "mobile and cloud" world is too new, too different, too strange to be used for business. (Except for some hard-core folks who insist on doing it.)

Yet the two revolutions are different. The PC revolution had a definite center: the IBM PC at first, and Windows later. The 1980s saw IBM as the industry leader: IBM PCs were the standard unit for business computing. Plain IBM PCs at first, and then IBM PC XT units, and later IBM PC-compatibles. There were lots of companies offering personal computers that were not IBM-compatible; these offerings (and their companies) were mostly ignored. Everyone wanted "in" on the IBM PC bandwagon: software makers, accessory providers, and eventually clone manufacturers. It was IBM or nothing.

The mobile/cloud revolution has no center, no one vendor or technology. Apple devices are popular but no vendors are attempting to sell clones in the style of PC clones. To some extent, this is due to Apple's nature and their proprietary and closed designs for their devices. (IBM allowed anyone to see the specs for the IBM PC and invited other vendors to build accessories.)

Apple is not the only game in town. Google's Android devices compete handily with the Apple iPhone and iPad. Google also offers cloud services, something Apple does not. (Apple's iCloud product is convenient storage but it is not cloud services. You cannot host an application in it.)

Microsoft is competing in the cloud services area with Azure, and doing well. It has had less success with its Surface tablets and Windows phones.

Other vendors offer cloud services (Amazon.com, IBM, Oracle, SalesForce) and mobile devices (BlackBerry). Today's market sees lots of technologies. It is a far cry from the 1980s "IBM or nothing" mindset, which may show that consumers of IT products and services have matured.

When there is one clear leader, the "safe" purchasing decision is easy: go with the leader. If your project succeeds, no one cares; if your project fails you can claim that even "big company X" couldn't handle the task.

The lack of a clear market leader makes life complicated for those consumers. With multiple vendors offering capable but different products and services, one must have a good understanding of the projects before selecting a vendor. Success is still success, but failure allows others to question your ability.

Multiple competing technologies also mean competition at a higher level. In the PC revolution, IBM and Compaq competed on technology, but the basic platform (the PC) was a known quantity. In mobile/cloud, we see new technologies from start-ups (such as containers) and from established vendors (such as cloud management tools and the Swift programming language).

The world of mobile and cloud has no center, and as such it can move faster than the old PC world. Keep that in mind when building systems and selecting vendors. Be prepared for bumps and turns.

Tuesday, June 10, 2014

Slow and steady wins the race -- or does it?

Apple and Google run at a faster pace than their predecessors. Apple introduces new products often: new iPhones, new iPad tablets, new versions of iOS; Google does the same with Nexus phones and Android.

Apple and Google's quicker pace is not limited to the introduction of new products. They also drop items from their product line.

The "old school" was IBM and Microsoft. These companies moved slowly, introduced new products and services after careful planning, and supported their customers for years. New versions of software were backwards compatible. New hardware platforms supported the software from previous platforms. When a product was discontinued, customers were offered a path forward. (For example, IBM discontinued the System/36 minicomputers and offered the AS/400 line.)

IBM and Microsoft were the safe choices for IT, in part because they supported their customers.

Apple and Google, in contrast, have dropped products and services with no alternatives. Apple dropped .Mac. Google dropped their RSS reader. (I started this rant when I learned that Google dropped their conversion services from App Engine.)

I was about to chide Google and Apple for their inappropriate behavior when I thought of something.

Maybe I am wrong.

Maybe this new model of business (fast change, short product life) is the future?

What are the consequences of this business model?

For starters, businesses that rely on these products and services will have to change. These businesses can no longer rely on long product lifetimes. They can no longer rely on a guarantee of "a path forward" -- at least not with Apple and Google.

Yet IBM and Microsoft are not the safe havens of the past. IBM is out of the PC business, and getting out of the server business. Microsoft is increasing the frequency of operating system releases. (Windows 9 is expected to arrive in 2015. The two years of Windows 8's life are much shorter than the decade of Windows XP.) The "old school" suppliers of PC technology are gone.

Companies no longer have the comfort of selecting technology and using it for decades. Technology will "rev" faster, and the new versions will not always be backwards compatible.

Organizations with large IT infrastructures will find that their technologies are less homogeneous. Companies can no longer select a "standard PC" and purchase it over a period of years. Instead, every few months will see new hardware.

Organizations will see software change too. New versions of operating systems. New versions of applications. New versions of online services (software as a service, platform as a service, infrastructure as a service, web services) will occur -- and not always on a convenient schedule.

More frequent changes to the base upon which companies build their infrastructure will mean that companies spend more time responding to those changes. More frequent changes to the hardware will mean that companies have more variations of hardware (or they spend more time and money keeping everyone equipped with the latest).

IT support groups will be stressed as they must learn new hardware and software more frequently. Roll-outs of internal systems will become more complex, as the target base will be more diverse.

Development groups must deliver new versions of their products on a faster schedule, and to a broader set of hardware (and software). It's no longer acceptable to deliver an application for "Windows only". One must include MacOS, the web, tablets, and phones. (And maybe Kindle tablets, too.)

Large organizations (corporations, governments, and others) have developed procedures to control the technology within (and to minimize costs). Those procedures often include standards, centralized procurement, and change review boards (in other words, bureaucracy). The outside world of suppliers and competitors cares not one whit about a company's internal bureaucracy, and it keeps changing regardless.

The slow, sedate pace of application development is a thing of the past. We live in faster times.

"Slow and steady" used to win. The tortoise would, in the long run, win over the hare. Today, I think the hare has the advantage.

Wednesday, May 14, 2014

Cloud computing is good for some apps, but not all

The rush to cloud-based systems has clouded (if you forgive the pun) the judgement of some. Cloud computing is the current shiny new technology, and there is a temptation to move everything to it. But should we?

We can get a better idea of migration strategies by looking at previous generations of new technology. Cloud computing is the latest of a series of technology advances. In each case, the major applications stayed on their technology platform, and the new technology offered new applications.

When PCs arrived, the big mainframe applications (financial applications like general ledger, payroll, inventory, and billing) stayed on mainframes. The applications on PCs were word processing and spreadsheets. Games, too. Later desktop publishing, e-mail, and presentations emerged. All of these applications were specific to PCs. While traditional mainframe applications were written for PCs, they saw little popularity.

When the web arrived, the popular PC applications (word processing, etc.) stayed on PCs. The applications on the web were static web pages and e-commerce. Later, blogging and music sharing apps (Napster) joined the scene.

When smartphones and tablets arrived, Facebook and Twitter jumped from the web to them (sometimes apps do move from one platform to another) but most applications stayed on the web. The popular apps for phones (after Facebook and Twitter) include photography, maps and location (GPS), texting, music, and games.

The pattern is clear: a new technology allows for new types of applications and old applications tend to stay on their original platforms. Sometimes an application will move to a new platform; I suspect that most applications, developed on a platform, are particularly well-suited to that platform. (A form of evolution and specialization, perhaps.)

The analogy to evolution is perhaps not inappropriate. New technologies do kill off older technologies. PCs and word processing software killed off typewriters and dedicated word processing systems. PCs and networks killed off minicomputers. Cell phone networks are slowly killing wired telephony.

What does all of this tell us about cloud computing?

There is a lot of interest in cloud computing, and there should be. It is a new model of computing, one that offers reliability, modular system design, and the ability to scale. Forward-looking individuals and organizations are experimenting with it and learning about it. They are running pilot projects, some of which succeed and some of which fail.

Some among us will try to move everything to the cloud. Others will resist cloud computing. Both extremes are futile. Some applications should remain on the web (or even on PCs). Some applications will do well in the cloud.

Let's move forward and learn!

Wednesday, January 29, 2014

The PC revolution was about infrastructure

Those of us who lived through the PC revolution like to think that PCs were significant advances in technology. They were advances, but in retrospect they were simply infrastructure.

Let's review the advances in PC technology:

Stand-alone PCs: The original PCs were brought in as replacements for typewriters and calculators. This was a tactical use of PCs, one that improved the efficiency of the company but did not change its internal organization or the products and services it offered. A PC with only word processors and spreadsheets was not powerful enough to make a strategic difference for a company.

Databases: After some time, people figured out that PCs could be more than typewriters and calculators. There were custom PC applications, but more importantly there were the early databases (dBase II, dBase III, R:Base) and database languages (Clipper, Paradox) that let people store and retrieve data. These databases were single-user and stand-alone.

Networks: The original PC networks (Novell, Banyan, Corvus) were introduced to share resources such as disks and printers. Printers (especially letter-quality printers) were expensive. Disks (large disks, say 40 MB) were also expensive. Sharing a common resource made economic sense. But the early networks were LANs (Local Area Networks) and confined to a single building.

Servers: Initially part of "client/server systems", servers were database engines that handled requests from multiple clients. Client/server systems gave networks a significant purpose for existing: the ability to update a single database from multiple locations made it possible to migrate mainframe applications onto the cheaper PC platform.

The Internet: Connecting networks made it possible for businesses to exchange information. The first big use of internet connections was e-mail; calendars followed quickly. Strictly speaking, the Internet is not a PC technology -- it was built mostly with minicomputers and Unix. The sockets libraries (WinSock) for PCs made the Internet accessible.

Web servers: Built on these previous layers, the web is (now) a combination of PCs, minicomputers, mainframes, and rack-mounted servers. It is this layer that enables strategic as well as tactical advantages. Companies can provide self-service web pages (more of a tactical change, I think) and new services (strategic). New companies can form (Facebook, Twitter).

Virtualization: The true advantage of virtualization is not consolidation of servers, but the ability to create or destroy machines quickly.

Cloud computing: Once virtual machines were available and cheap, we created the cloud paradigm. Using an array of virtual computers, we can design applications that are distributed across multiple servers and are resistant to failure of any one of those servers. The distribution of work allows for scaling (up or down) as needed, adding or removing servers to handle the current load.

All of these technologies are now infrastructure. They are well-understood and easily available.

New technologies are plugging in to this infrastructure. Smartphones, tablets, and big data all sit on top of this (impressive) stack of technology. Smartphone and tablet apps use low-wattage user interfaces and connect to cloud computing systems for processing. Big data systems use a similar design, with cloud computing engines providing the data for visualization software on PCs (or in web browsers).

When we built the first microcomputers, when we installed DOS on PCs, when we used modems to connect to bulletin-board systems, we thought we were creating the crest of technology. We thought we were building the top dog. But it didn't turn out that way. The PC and its later technologies let us build a significant computing stack.

Now that we have that stack, I think we can discard the traditional PC. The next decade should see the replacement of PCs. Not all at once, and at different rates in different environments. I expect PCs to exist in businesses for quite some time.

But disappear they shall.

Tuesday, January 21, 2014

From general to specific

We are entering a new age of computing.

The change is the shift from general to specific. PC hardware has been, since the initial PC up to now, standard and generic. One PC was very much like another PC, in terms of architecture and capacity. This standardization made the PC market possible, with PC manufacturers, accessory vendors, and software providers all working to a common standard.

To be sure, there was always some variation among PCs. Some had faster processors; some had more memory. Enthusiasts added super-large hard drives and super-fast video cards. But they all revolved around the PC standard. (A standard that evolved over time, moving from the original PC to the IBM PC AT to the Compaq Deskpro 386 to today's Intel-based desktops.)

Now we see the standard-issue technology splitting into distinct markets with distinct hardware. Many businesses have traded their desktop PCs for laptops and shifted "back end" work to cloud servers. Game consoles are not quite PCs: they contain specialized hardware and one cannot replace the operating system (at least not easily). The home PC is being replaced by tablets and smartphones. Hobbyists are experimenting with small-board systems like the Raspberry Pi and the BeagleBone.

For each of these uses, we are replacing the desktop PC with a smaller, specialized device.

The change is not limited to hardware. While businesses still run Microsoft Windows, other devices are moving to different operating systems. Game consoles run their own operating systems; even Microsoft's Xbox runs an operating system that is based on Windows but not quite the same as Windows on the desktop PC. Tablets and phones run iOS or Android. The hobbyists are using Linux.

The good old days of standard PCs saw PCs (and Windows) everywhere. The new age of specialization sees a fragmentation of that world, with separate hardware and software for the different types of users. This differentiation will allow the different markets to develop distinct pricing for hardware and software; already competition is driving down the prices of tablets to ranges unreachable by classic PCs.

I expect the job market to fracture along similar lines. Office applications will stay within the classic PC realm and move slowly to cloud-based solutions. The development of games is already distinct. The consumer market for apps has almost fully emerged. As the hardware and software of these markets diverge, I expect the development tools and techniques, the advertising, and the pay scales to diverge.

Eventually, we will not have "an IT jobs market" or "an IT career path". Instead, we will have career paths in business, in games and interactive entertainment, and in consumer products. Each will include IT as we think of it today (hardware, software, development, testing, etc.) as part of a larger whole. The hobbyists will perhaps be different in that they will have not a market for the exchange of dollars but a community for the exchange of information. They too will use IT for larger goals -- perhaps education or research.

We will lose the PC standard. In its place will be a standard for business, another standard for games, yet other standards for consumers, and (most likely) a collection of diverse hardware for hobbyists. I will not mourn the passing of the PC standard. It served its purpose, letting us develop a strong set of technology for diverse challenges. Now we can move to the next level and use technology that is better suited to specific tasks.

Thursday, November 21, 2013

The excitement of new tech

GreenArrays has introduced the GA144 chip, which contains 144 F18 processors. They also have a prototyping circuit board for the GA144. These two offerings intrigue me.

The F18 is a processor that uses Forth as its instruction set. That in itself is interesting. Forth is a small, stack-oriented language, initially developed in the 1960s (Wikipedia asserts the origin at 1958) and created to run on diverse architectures. Like C, it is close to hardware and has a small set of native operations. The Forth language lets the user define new "words" and build their own language.
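Forth's two core ideas, a data stack and a user-extensible dictionary of "words", can be sketched in a few lines. The toy interpreter below is my own illustration in Python; it shows the flavor of the language, and has nothing to do with the F18's actual instruction set:

```python
# A toy Forth-style interpreter: a data stack plus a dictionary of "words".
# Nested definitions and error handling are omitted; this is a sketch only.

stack = []
words = {
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "dup": lambda: stack.append(stack[-1]),
}

def run(program):
    tokens = program.split()
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == ":":                      # define a new word, e.g. ": square dup * ;"
            name = tokens[i + 1]
            end = tokens.index(";", i)
            body = " ".join(tokens[i + 2:end])
            words[name] = lambda b=body: run(b)
            i = end + 1
        elif tok in words:
            words[tok]()                    # execute a known word
            i += 1
        else:
            stack.append(int(tok))          # anything else is a number literal
            i += 1

run(": square dup * ;  5 square")
print(stack)   # [25]
```

The defining feature is that "square" becomes indistinguishable from the built-in words: the user extends the language itself rather than calling a library.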

The GA144 has 144 of these processors.

The F18 and the GA144 remind me of the early days of microcomputers, when systems like the Mark-8 and the Altair were available. These "homebrew" systems existed prior to the "commercial" offerings of the Apple II and the Radio Shack TRS-80. They were new things in the world, unlike anything we had seen before.

We were excited by these new microcomputers. We were also ignorant of their capabilities. We knew that they could do things; we didn't know how powerful they would become. Eventually, the commercial systems adopted the IBM PC architecture and the MS-DOS operating system (later, Windows) and became ubiquitous.

I'm excited by the GA144. It's new, it's different, and it's potent. It is a new approach to computing. I don't know where it will take us (or that it will succeed at taking us anywhere) -- but I like that it offers us new options.

Sunday, August 18, 2013

The revulsion of old equipment

As a newbie programmer back in the dawn of the PC era, I joined other PC programmers in a general disdain for IBM mainframes. We were young and arrogant. We were elitist in our view of hardware: we considered the PC sleek and modern, the mainframe antiquated. Most of all, we looked on mainframe hardware as monstrous. Everything about mainframes was a grotesque, oversized ancestor of PC hardware, from the display terminals (the IBM 3270 was a behemoth) to the processor units (refrigerators) to even the cables that connected devices. Yes, they worked, and yes, people used them, but I could not imagine anyone wanting to use such equipment.

We were also elitist in our view of software, but that is not important for this post.

Much of the IBM mainframe's design dates to the System/360, the first general-purpose computer family from IBM. It was introduced in 1964, seventeen years before the IBM PC in 1981. In that time, advances in technology shrank most devices, from processors to disk drives. The IBM PC was very different from the IBM System/360.

Yet the span from that first PC to today is almost twice the seventeen-year span from System/360 to PC. Advances in technology have again shrunk most devices.

Today's newbie programmers (young and possibly arrogant) must look on the aged PC design with the same revulsion that I felt for mainframe computers.

PC enthusiasts will point out that the PC has not remained static in the past thirty-plus years. Processors are more powerful, memory is significantly larger, disk drives have more capacity while becoming smaller, and the old serial and parallel connectors have been replaced with USB.

All of these are true, but one must still admit that, compared to tablets and smart phones, PCs are large, hulking monstrosities. And while they work and people use them, does anyone really want to?

The PC revolution happened because of four factors: PCs were cheap, PCs were easier to use than mainframes, the "establishment" of mainframe programmers and operators set up a bureaucracy to throttle requests from users, and PCs got the job done (for lots of little tasks).

Since that revolution, the "establishment" bureaucracy has absorbed PCs into the fold.

The tablet revolution sees tablets and smart phones that are: cheap, easier to use than PCs, outside of the establishment bureaucracy, and capable of getting the job done (for lots of little tasks).

Tablets are here to stay. The younger generation will see to that. Businesses will adopt them, just as they adopted PCs. In time, the establishment bureaucracy may absorb them.

PCs will stick around, too, just as mainframes did. They won't be the center of attention, though.

Wednesday, June 26, 2013

The Red Queen's Race requires awareness

Does your software development project use mainstream technology? (Let's assume that you care about your technology.) Some project managers want to stay in the mainstream, others want to stay ahead of the crowd, and some want leading edge tech.

Starting a project at a specific position in technology is easy. Keeping that position, on the other hand, is not so easy.

Over time, languages and compilers and libraries change. There are new versions with enhanced features and bug fixes.

Source code, once written, is a "stake in the ground". It is fixed in place, tied to a language, possibly a compiler, and probably several libraries. Keeping up with those changes requires effort. Just how much effort will vary from language to language. The C++ language has been fairly stable over its thirty-year life; Visual Basic changed dramatically in the 1990s.

Thus we have a variant of the Red Queen's Race, in which one must run just to stay in place. (In the proper race, described in "Through the Looking-Glass", one must run as fast as one can. I've reduced the mandate for this discussion.) A software development project must, over time, devote some effort towards "running in place", that is, keeping up with the toolset.

This effort may be small (installing a new version of the compiler) or large (a new compiler and changes to a majority of the source modules). Sometimes the effort is very large: converting a project from Perl to Python is a major effort.

Failing to move to the current tools means that you slowly drift back, and possibly fall out of your desired position. A project that starts in the leading edge drifts to the mainstream, and a project in the mainstream becomes a laggard.

The Red Queen's Race for software requires not just changes to technology (updates to compilers and such) but also an awareness of technology. In the day-to-day activities of a software project, it is easy to focus inwards, looking at new requirements and defect reports. Maintaining one's position within tech requires looking outward, at updates and new technologies and techniques. You must be aware of updates to your toolset. You must be aware of new tools for testing and collaboration. You must be aware of other groups and their technologies.

When running in a herd, it's good to look at the herd, at least once in a while.

Saturday, June 22, 2013

Functional programming has no loud-mouth advocate

I'm reading "The Best of Booch", a collection of essays written by Grady Booch in the 1990s.

In the 1990s, the dominant programming method was structured programming. Object-oriented programming was new and available in a few shops, but the primary programming style was structured. (Structured programming was the "better" form of programming introduced in the 1970s.)

We had learned how to use structured programming, how to write programs in it, and how to debug programs in it. We (as an industry) had programming languages: C, Visual Basic, and even structured methods for COBOL. Our compilers and IDEs supported those languages.

We had accepted structured programming, adopted it, and integrated it into our processes.

We had also learned that structured programming was not perfect. Our programs were hard to maintain. Our programs contained bugs. Our programs were difficult to analyze.

Object-oriented programming was a way forward, a way to organize our programs and reduce defects. Booch was an advocate, out in front of the crowd.

Now, these essays were also a way for Booch to hawk his business, which was consulting and training in UML and system design. Booch was one of the early proponents of object-oriented programming, and one of the participants in the "notation wars" prior to UML. The agreement on UML was a recognition that there was more money to be made in supplying a uniform notation for system design than in fighting for one's own notation.

But despite the advertising motivation, the articles contain strong content. One can feel Booch's passion for object-oriented programming. These are legitimate articles on a new technology first, and convenient advertising for his business a distant second.

Fast-forward to the year 2013. We have accepted object-oriented programming as mainstream technology. People (and projects) have been using it for years -- no, decades. We have learned how to use object-oriented programming, how to write programs in it, and how to debug programs in it. We (as an industry) have programming languages: C++, Java, Python, and Ruby. Our compilers and IDEs support these languages.

We have accepted object-oriented programming, adopted it, and integrated it into our processes.

We have also learned that object-oriented programming is not perfect. Our programs are hard to maintain. Our programs contain bugs. Our programs are difficult to analyze.

In short, we have the same problems with object-oriented programs that we had with structured programs.

Now, functional programming is a way forward. It is a way to organize our programs and reduce defects. (I know; I have used it.)
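To illustrate with a small example of my own (not one of Booch's): the functional style replaces shared mutable state with pure functions that return new values, which eliminates a whole class of aliasing bugs.

```python
# Imperative/object style: state mutated in place, a side effect on the caller's data.
def apply_discount_in_place(orders, rate):
    for order in orders:
        order["total"] = order["total"] * (1 - rate)

# Functional style: inputs untouched, the result is a new value.
def apply_discount(orders, rate):
    return [{**order, "total": order["total"] * (1 - rate)} for order in orders]

orders = [{"id": 1, "total": 100.0}, {"id": 2, "total": 50.0}]
discounted = apply_discount(orders, 0.10)

print(orders[0]["total"])      # 100.0 -- the original is unchanged
print(discounted[0]["total"])  # 90.0
```

Because `apply_discount` cannot alter its inputs, any defect in the discount logic is confined to one function, and the function can be tested in isolation.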

But where are the proponents? Where is the Grady Booch of functional programming?

I have seen a few articles on functional programming. I have read a few books on the topic. Most articles and books have been very specific to a new language, usually Scala or Haskell. None have had the broader vision of Booch. None have described the basic benefits of functional programming, the changes to our teams and processes, or the implications for system architecture and design.

At least, none that I have read. Perhaps I have missed them, in my journeys in technical writings.

Perhaps there is no equivalent to Grady Booch for functional programming. Or perhaps one does not exist yet. Perhaps the time is too early, and the technology must develop a bit more. (I tend to think not, as functional programming languages are ready for use.)

Perhaps it is a matter of passion and business need. Booch wrote his articles because he felt strongly about the technology, and because he had a business case for them.

Perhaps it is a matter of distribution. The software business in 2013 is very different from the software business in 1997; the big change is the success of open source.

Open source projects emerge into the technology mainstream through channels other than books and magazine articles. They are transmitted from person to person via e-mail, USB drives, and Github. Eventually, magazine articles are written, and then books, but only after the technology is established and "running". The Linux, Apache, and Perl projects followed this path.

So maybe we don't need an obvious advocate for a new technology. Perhaps the next programming revolution will be quieter, with technology seeping slowly into organizations and not being forced. That might be a good thing, with smaller shocks to existing projects and longer times to learn and adopt a new programming language.

Saturday, May 25, 2013

Best practices are not best forever

Technology changes quickly. And with changes in technology, our views of technology change, and these views affect our decisions on system design. Best practices in one decade may be inefficient in another.

A recent trip to the local car dealer made this apparent. I had brought my car in for routine service, and the mechanic and I reviewed the car's maintenance history. The dealer has a nice, automated system to record all maintenance on vehicles. It has an on-line display and prints nicely-formatted maintenance summaries. A "modern" computer system, probably designed in the 1980s and updated over the years. (I put the word "modern" in quotes because it clearly runs on a networked PC with a back end database, but it does not have tablet or phone apps.)

One aspect of this system is the management of data. After some amount of time (it looks like a few years), maintenance records are removed from the system.

Proper system design once included the task of storage management. A "properly" designed system (one that followed "best practices") would manage data for the users. Data would be retained for a period of time but not forever. One had to erase information, because the total available space was fixed (or additional space was prohibitively expensive) and programming the system to manage space was more effective than asking people to erase the right data at the right time. (People tend to wait until all free storage is used and then binge-erase more data than necessary.)

That was the best practice -- at the time.

Over time, the cost of storage dropped. And over time, our perception of the cost of storage dropped.

Google has a big role in our new perception. With the introduction of GMail, Google gave each account holder a full gigabyte of storage. A full gigabyte! The announcement shocked the industry. Today, it is a poor e-mail service that cannot promise a gigabyte of storage.

Now, Flickr is giving each account holder a full terabyte of storage. A full terabyte! Even I am surprised at the decision. (I also think that it is a good marketing move.)

Let's return to the maintenance tracking system used by the car dealer.

Such quantities of storage vastly surpass the meager storage used by a few maintenance records. Maintenance records each take a few kilobytes of data (it's all text, and only a few pages). A full megabyte of data would hold all maintenance records for several hundred repairs and check-ups. If the auto dealer assigned a full gigabyte to each customer, it could easily hold all maintenance records for that customer, even if the customer brought the car in for repairs every month over an extended car life of twenty years!
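The arithmetic is easy to check. Assuming roughly 4 KB per record (my guess, not the dealer's actual figure):

```python
# Back-of-envelope check of the storage claim; the record size is an assumption.
record_size = 4 * 1024            # ~4 KB per maintenance record
megabyte = 1024 * 1024
gigabyte = 1024 * megabyte

records_per_mb = megabyte // record_size
print(records_per_mb)             # 256 -- "several hundred" records per megabyte

# Monthly visits over a twenty-year car life:
lifetime_records = 12 * 20        # 240 records
lifetime_bytes = lifetime_records * record_size
print(lifetime_bytes / gigabyte)  # ~0.0009 -- a tiny fraction of one gigabyte
```

Even with the record size off by a factor of ten, a lifetime of records rounds to nothing against modern storage.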

Technology has changed. Storage has become inexpensive. Today, it would be a poor practice to design a system that auto-purges records. You spend more on the code and the tests than you save on the reduction in storage costs. You lose older customer data, preventing you from analyzing trends over time.

The new best practices of big data, data science, and analytics require data. Old data has value, and that value is greater than the cost of storing it.

Best practices change over time. Be prepared for changes.




Sunday, March 24, 2013

Your New New Thing will one day become The New Old Thing

In IT, we've had New Things for years. Decades. New Things are the cool products and technologies that the alpha geeks are using. They are the products that appear in the trade magazines, the products that get reviews.

But you cannot have New Things without Old Things. If New Things are the things used by the alpha geeks, Old Things are the things used by the rest of us. They are the products that support the legacy systems. They are the products that are getting the job done (however clumsily).

When I started in IT, microcomputers were the New Thing. Apple II microcomputers ("Apple ][" for the purists), CP/M, IBM PCs running PC-DOS, word processors, spreadsheets, databases (in the form of dBase II), and the programming languages BASIC, Pascal, and C. The Old Things were IBM mainframes, tape drives, batch jobs, and COBOL.

I wanted to work on the New Things. I most emphatically wanted to avoid the Old Things.

Of course, the Old Things from that time were, at some earlier time, New Things. The IBM 360/370 processors were New Things, compared to the earlier 704, 709, and 1401 processors. COBOL was a New Thing compared to assembly language.

The IT industry is, in part, devoted to building New Things and constantly demoting technology to Old Thing status.

Just as COBOL slid from New Thing to Old Thing, so did those early PC technologies. PC-DOS and its sibling MS-DOS became Old Things to the New Thing of Microsoft Windows. The Intel 8088 processor became an Old Thing compared to the Intel 80286, which in turn became Old to the 80386, which in its turn became Old to the New Thing of the Intel Pentium processor.

The slide from New Thing to Old Thing is not limited to hardware. It happens to programming languages, too. Sun made C++ an Old Thing by introducing Java. Microsoft tried to make Java an Old Thing by introducing C# and .NET. While Microsoft may have failed in that attempt, Python and Ruby have succeeded at making Java an Old Thing.

The problem with building systems on any technology is that the technology will one day become an Old Thing. That in itself is not a problem, since the system will continue to work. If all components of the system remain in place, it will work with the same performance and reliability as its first day of operation.

But systems rarely keep all components in place. Hardware is replaced. Operating systems are upgraded. Peripheral devices are exchanged for newer models. Compilers are updated to new standards. And the key "component" in a system is often changed: the people who write, maintain, and operate the system come and go.

These changes stress the system and can disrupt it. A faster processor can change the timing of certain sections of code, and these changes can break the interaction with devices and other systems. A new version of the operating system can provide additional checks and detect invalid operations; some programs rely on quirky behaviors of the processor or operating system and break when the quirks are fixed.

People are a big challenge. Programmers have free will, and can choose to work on a system or they can choose to work somewhere else. To get programmers to work on your system, you have to bribe them with wages and benefits. Programmers have various tastes for technology: some prefer to work on New Things and others prefer to work on Old Things. I don't know that either is better, but I tend to believe that the programmers pursuing the New Things are the ones with more initiative. (Some may argue that the programmers pursuing New Things are unreliable and ready to leave for an even Newer Thing, and that programmers who enjoy the Old Thing are more stable. It is not an unconvincing argument.)

Early in its life, your system is a New Thing, and therefore attractive to certain programmers. But it does not remain a New Thing forever. It eventually matures into an Old Thing, and when it does the set of programmers that it attracts also changes. The interested programmers are those who like the Old Thing; the programmers pursuing New Things are off, um, pursuing New Things.

In the long term, the system graduates into a Very Old Thing and is of interest to a small set of programmers who enjoy working on Esoteric Curiosities. But these programmers come with Very Expensive Expectations for pay.

We often start projects with the idea of using New Thing technologies. This is an acceptable practice. But we too often believe that our systems and their technologies remain New Things. This is a delusion. Our systems will always change from New Things to Old Things. We should plan for that change and manage our system (and our teams) with that expectation.

Friday, February 22, 2013

Software subscriptions

One of our current debates is the change from traditional, PC-installed software to web-based software.

Even Microsoft is switching. In addition to its classic PC software "Office 2013", which is installed on your local PC, Microsoft now offers the subscription package "Office 365".

It's a big change, and many folks are concerned. System admins worry about the new procedures for signing up with new software services. Managers fret that an update could change file formats, and locally stored documents in old formats may become unreadable. Ordinary users find the concept of renting, not owning, their software a bit disconcerting. A few folks are aghast to learn that their software could disappear after failing to pay for the subscription, and have railed against the increased cost and the greed of software vendors.

Yes, I know that software is usually licensed and not sold. But most folks think of current PC software as sold, regardless of the licensing agreement. It is this general understanding that I consider to be important.

Let's run a thought experiment. Suppose technology was going in the other direction. What if, instead of starting with purchased software and moving to subscriptions, we were starting with subscriptions and moving to purchased software?

In that change, people would be complaining, of course. Users would be hesitant to move from the comfort and convenience of subscription software to the strange new world of installed software. System admin types would grumble about the additional work of installing software and applying updates. Managers would fret about compatibility, fearing that some users would have old versions of software and might be unable to share files. Ordinary users might find the concept of "owning" software a bit disconcerting. I suspect that a lot of people would be aghast to learn that they would have to pay for each device they used, and rail against the increased cost and the greed of software vendors.

Such a thought experiment shows that the change from ownership to rental is big, but perhaps not a bad thing. The decision between PC-installed software and web-based software subscriptions (or mobile/cloud subscriptions) is similar to the decision to own a house or rent a condominium. Both have advantages, and drawbacks.

My advice is to experiment with this new model. Start using web-based e-mail, word processing, and spreadsheets. Try the file-sharing services of SkyDrive, Google Drive, and Dropbox. Learn how they work. Learn how your organization can use them. Then you can decide which is the better method for your team.