
Monday, September 25, 2017

Web services are the new files

Files have been the common element of computing since at least the 1960s. Files existed before disk drives and file systems, as one could put multiple files on a magnetic tape.

MS-DOS used files. Windows used files. OS/2 used files. (Even the p-System used files.)

Files were the unit of data storage. Applications read data from files and wrote data to files. Applications shared data through files. Word processor? Files. Spreadsheet? Files. Editor? Files. Compiler? Files.

The development of databases opened another channel for sharing data. Databases were (and still are) used in specialized applications. Relational databases are good for consistently structured data, and provide transactions to update multiple tables at once. Microsoft hosts its Team Foundation Server on top of its SQL Server. (Git, in contrast, uses files exclusively.)

Despite the advantages of databases, the main method for storing and sharing data remains files.

Until now. Or in a little while.

Cloud computing and web services are changing the picture. Web services are replacing files. Web services can store data and retrieve data, just as files do. But web services are cloud residents; files are for local computing. Using URLs, one can think of a web service as a file with a rather funny name.

Web services are also dynamic. A file is a static collection of bytes: what you read is exactly what was written. A web service can provide a set of bytes that is constructed "on the fly".
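The "file with a funny name" idea can be made concrete. Here is a minimal sketch in Python (my own illustration, not any particular product's API): a single read function that accepts either a local path or a URL, so the caller sees one interface and the "funny name" simply selects the cloud-resident version.

```python
from urllib.parse import urlparse
from urllib.request import urlopen

def read_bytes(source: str) -> bytes:
    """Read data from either a local file or a web service.

    A URL selects the cloud-resident version; anything else is
    treated as a path on the local file system.
    """
    if urlparse(source).scheme in ("http", "https"):
        # Web service: the bytes may be constructed "on the fly".
        with urlopen(source) as response:
            return response.read()
    # Local file: a static collection of bytes, read back as written.
    with open(source, "rb") as f:
        return f.read()
```

From the application's point of view, `read_bytes("report.txt")` and `read_bytes("https://example.com/report")` are the same operation; only the name differs.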

Applications that use local computing -- desktop applications -- will continue to use files. Cloud applications will use web services.

Those web services will be, at some point, reading and writing files, or database entries, which will eventually be stored in files. Files will continue to exist, as the basement of data storage -- around, but visited by only a few people who have business there.

At the application layer, cloud applications and mobile applications will use web services. The web service will be the dominant method of storing, retrieving, and sharing data. It will become the dominant method because the cloud will become the dominant location for storing data. Local computing, long the leading form, will fall to the cloud.

The default location for data will be the cloud; new applications will store data in the cloud; everyone will think of the cloud. Local storage and local computing will be the oddball configuration. Legacy systems will use local storage; modern systems will use the cloud.

Monday, May 1, 2017

That old clunky system -- the smart phone

Mainframes probably have first claim on the title of "that old large hard-to-use system".

Minicomputers were smaller, easier to use, less expensive, and less fussy. Instead of an entire room, they could fit in the corner of an office. Instead of special power lines, they could use standard AC power.

Of course, it was the minicomputer users who thought that mainframes were old, big, and clunky. Why would anyone want that old, large, clunky thing when they could have a new, small, cool minicomputer?

We saw the same effect with microcomputers. PCs were smaller, easier to use, less expensive, and less fussy than minicomputers.

And of course, it was the PC users who thought that minicomputers (and mainframes) were old, big, and clunky. Why would anyone want that old, large, clunky thing when they could have a new, small, cool PC?

Here's the pattern: A technology gets established and adopted by a large number of people. The people who run the hardware devote time and energy to learning how to operate it. They read the manuals (or web pages), they try things, they talk with other administrators. They become experts, or at least comfortable with it.

The second phase of the pattern is this: A new technology comes along, one that does similar (although often not identical) work as the previous technology. Many times, the new technology does a few old things and lots of new things. Minicomputers could handle data-oriented applications like accounting, but were better at data input and reporting. PCs could handle input and reporting, but were really good at word processing and spreadsheets.

The people who adopt the later technology look back, often in disdain, at the older technology that doesn't do all of the cool new things. (And too often, the new-tech folks look down on the old-tech folks.)

Let's move forward in time. From mainframes to minicomputers, from minicomputers to desktop PCs, from desktop PCs to laptop PCs, from classic laptop PCs to MacBook Air-like laptops. Each transition has the opportunity to look back and ask "why would anyone want that?", with "that" being the previous cool new thing.

Of course, such effects are not limited to computers. There were similar feelings with the automobile, typewriters (and then electric typewriters), slide rules and pocket calculators, and lots more.

We can imagine that one day our current tech will be considered "that old thing". Not just ultralight laptops, but smartphones and tablets too. But what will the cool new thing be?

I'm not sure.

I suspect that it won't be a watch. We've had smartwatches for a while now, and they remain a novelty.

Ditto for smart glasses and virtual reality displays.

Augmented reality displays, such as Microsoft's HoloLens, show promise, but also remain a diversion.

What the next big thing needs is a killer app. For desktop PCs, the killer app was the spreadsheet. For smartphones, the killer app was GPS and maps (and possibly Facebook and games). It wasn't the PC or the phone that people wanted, it was the spreadsheet and the ability to drive without a paper map.

Maybe we've been going about this search for the next big thing in the wrong way. Instead of searching for the device, we should search for the killer app. Find the popular use first, and then you will find the device.

Thursday, April 14, 2016

Technology winners and losers

Technology has been remarkably stable for the past three decades. The PC dominated the hardware market and Microsoft dominated the software market.

People didn't have to use PCs and Microsoft Windows. They could choose alternative solutions, such as Apple Macintosh computers with Mac OS, or regular PCs with Linux. But the people using out-of-mainstream technology *chose* to use it. They knew what they were getting into. They knew that they would be a minority, and that when they entered a computer shop, most of the offerings would be for the regular Windows PCs and not for their configuration.

The market was not always this way. In the years before the IBM PC, different manufacturers provided different systems: the Apple II, the TRS-80, DEC's Pro-325 and Pro-350, the Amiga, ... there were many. All of those systems were swept aside by the IBM PC, and all of the enthusiasts for those systems knew the pain of loss. They had lost their chosen system to the one designated by the market as the standard.

In a recent conversation with a Windows enthusiast, I realized that he was feeling a similar pain in his situation. He was dejected at the dearth of support for Windows phones -- he owned such a phone, and felt left out of the mobile revolution. Windows phones are out-of-mainstream, and many apps do not run on them.

I imagine that many folks in the IT world are feeling the pain of loss. Some because they have Windows phones. Others because they have been loyal Microsoft users for decades, perhaps their entire career, and now Windows is no longer the center of the software world.

This is their first exposure to loss.

We grizzled veterans who remember CP/M or Amiga DOS have had our losses; we know how to cope. The folks who used WordPerfect or Lotus 1-2-3 had to switch to Microsoft products; they know loss too. But no technology has been forced from the market for quite some time. Perhaps the last was IBM's OS/2, back in the 1990s. (Or perhaps Visual Basic, when it was modified to VB.NET.)

But IT consists of more than grizzled veterans.

For someone entering the IT world after the IBM PC (and especially after the rise of Windows), it would be possible -- and even easy -- to enjoy a career in dominant technologies: stay within the Microsoft set of technologies, and everything was mainstream. Microsoft technology was supported and accepted. Learning Microsoft technologies such as SQL Server and SharePoint meant that you were on the "winning team".

A lot of folks in technology have never known this kind of technology loss. When your entire career has been with successful, mainstream technology, the change is unsettling.

Microsoft Windows Phone is a technology on the edge. It exists, but it is not mainstream. It is a small, oddball system (in the view of the world). It is not the "winning team"; iOS and Android are the popular, mainstream technologies for phones.

As Microsoft expands beyond Windows with Azure and apps for iOS and Android, it competes with more companies and more technologies. Azure competes with Amazon.com's AWS and Google's Compute Engine. Office Online and Office 365 compete with Google Docs. OneDrive competes with DropBox and BOX. Microsoft's technologies are not the de facto standard, not always the most popular, and sometimes the oddball.

For the folks confronting a change in worldview -- from one in which Microsoft technology is always the most popular and most accepted, to one in which different technologies compete and sometimes Microsoft loses -- an example to follow would be ... Microsoft.

Microsoft, after years of dominance with the Windows platform and applications, has widened its view. It is not "the Windows company" but a technology company that supplies Windows. More than that, it is a technology company that supplies Windows, Azure, Linux, and virtual machines. It is a company that supplies Office applications on Windows, iOS, and Android. It is a technology company that supplies SQL Server on Windows and soon Linux.

Microsoft adapts. It changes to meet the needs of the market.

That's a pretty good example.

Sunday, January 3, 2016

Predictions for 2016

It's the beginning of a new year, which means... predictions! Whee!

Let's start with some obvious predictions:

Mobile will be big in 2016.

Cloud will be big in 2016.

NoSQL and distributed databases will be big in 2016.

Predictions like these are easy.

Now for something a little less obvious: legacy applications.

With the continued interest in mobile, cloud, NoSQL, and distributed databases, these areas will see strong demand for architects, developers, designers, and testers. That demand will pull people away from legacy applications -- those applications built for classic, non-cloud web architectures as well as the remaining desktop applications and mainframe batch systems.

Which is unfortunate for the managers of those legacy applications, because I believe that 2016 is going to be the year that companies decide that they want to migrate those legacy applications to the cloud/mobile platform.

When the web appeared, lots of managers held back, waiting to see if the platform would prove itself. It did, and companies migrated most of their applications from desktop to web (either external or internal). Even Microsoft, stalwart of desktop applications, created a web-based version of Outlook.

Likewise, when mobile and cloud appeared, many managers held back and waited for the new technologies to prove themselves. With almost ten years of mobile and cloud behind us, and many companies already using those technologies, it's time for the holdouts to take action.

Look for renewed interest in converting existing desktop and classic web applications. The conversions have challenges. In one sense, the job is easier than the early conversions, because we now have experience with mobile/cloud systems and we understand the architecture. In another sense, it may be harder: the easy conversions (the "low-hanging fruit") have already been done, and only the difficult ones remain.

The architecture of mobile/cloud systems (with or without distributed databases) is different from classic web applications. (And very different from desktop applications.)

I think that 2016 will be the year of rude awakening, as companies look at the effort to convert their legacy systems to newer technologies.

But the rude awakening is delivered in two phases. The first is the cost and time to convert legacy applications. The second is the cost of maintaining legacy applications in their current form.

Why the cost of maintaining legacy applications, without changing them to newer technologies? Because demand for mobile/cloud is high. New entrants to the field will know the new technologies, and will select jobs that let them use that knowledge. That means that the folks with knowledge of the older technologies will be, um, older.

The folks with knowledge about older languages (C++, Visual Basic) and older APIs (Flash) will be the senior developers. And senior developers are more expensive than junior developers.

So the owners of legacy applications have a rather unpleasant choice: migrate to mobile/cloud, which is expensive, or stay on the legacy platform, which will also be expensive.

Tuesday, June 10, 2014

Slow and steady wins the race -- or does it?

Apple and Google run at a faster pace than their predecessors. Apple introduces new products often: new iPhones, new iPad tablets, new versions of iOS; Google does the same with Nexus phones and Android.

Apple and Google's quicker pace is not limited to the introduction of new products. They also drop items from their product line.

The "old school" was IBM and Microsoft. These companies moved slowly, introduced new products and services after careful planning, and supported their customers for years. New versions of software were backwards compatible. New hardware platforms supported the software from previous platforms. When a product was discontinued, customers were offered a path forward. (For example, IBM discontinued the System/36 minicomputers and offered the AS/400 line.)

IBM and Microsoft were the safe choices for IT, in part because they supported their customers.

Apple and Google, in contrast, have dropped products and services with no alternatives. Apple dropped .Mac. Google dropped their RSS reader. (I started this rant when I learned that Google dropped their conversion services from App Engine.)

I was about to chide Google and Apple for their inappropriate behavior when I thought of something.

Maybe I am wrong.

Maybe this new model of business (fast change, short product life) is the future?

What are the consequences of this business model?

For starters, businesses that rely on these products and services will have to change. These businesses can no longer rely on long product lifetimes. They can no longer rely on a guarantee of "a path forward" -- at least not with Apple and Google.

Yet IBM and Microsoft are not the safe havens of the past. IBM is out of the PC business, and getting out of the server business. Microsoft is increasing the frequency of operating system releases. (Windows 9 is expected to arrive in 2015. The two years of Windows 8's life are much shorter than the decade of Windows XP.) The "old school" suppliers of PC technology are gone.

Companies no longer have the comfort of selecting technology and using it for decades. Technology will "rev" faster, and the new versions will not always be backwards compatible.

Organizations with large IT infrastructures will find that their technologies are less homogeneous. Companies can no longer select a "standard PC" and purchase it over a period of years. Instead, every few months will see new hardware.

Organizations will see software change too. New versions of operating systems. New versions of applications. New versions of online services (software as a service, platform as a service, infrastructure as a service, web services) will occur -- and not always on a convenient schedule.

More frequent changes to the base upon which companies build their infrastructure will mean that companies spend more time responding to those changes. More frequent changes to the hardware will mean that companies have more variations of hardware (or they spend more time and money keeping everyone equipped with the latest).

IT support groups will be stressed as they must learn the new hardware and software, and more frequently. Roll-outs of internal systems will become more complex, as the target base will be more diverse.

Development groups must deliver new versions of their products on a faster schedule, and to a broader set of hardware (and software). It's no longer acceptable to deliver an application for "Windows only". One must include MacOS, the web, tablets, and phones. (And maybe Kindle tablets, too.)

Large organizations (corporations, governments, and others) have developed procedures to control the technology within (and to minimize costs). Those procedures often include standards, centralized procurement, and change review boards (in other words, bureaucracy). The outside world (suppliers and competitors) cares not one whit about a company's internal bureaucracy and is changing.

The slow, sedate pace of application development is a thing of the past. We live in faster times.

"Slow and steady" used to win. The tortoise would, in the long run, win over the hare. Today, I think the hare has the advantage.

Wednesday, January 29, 2014

The PC revolution was about infrastructure

Those of us who lived through the PC revolution like to think that PCs were significant advances in technology. They were advances, but in retrospect they were simply infrastructure.

Let's review the advances in PC technology:

Stand-alone PCs: The original PCs were brought in as replacements for typewriters and calculators. This was a tactical use of PCs, one that improved the efficiency of the company but did not change the internal organization or the products and services offered by the company. The PC, with only word processors and spreadsheets, is not strong enough to make a strategic difference for a company.

Databases: After some time, people figured out that PCs could be more than typewriters and calculators. There were custom PC applications, but more importantly there were the early databases (dBase II, dBase III, R:Base) and database languages (Clipper, Paradox) that let people store and retrieve data. These databases were single-user and stand-alone.

Networks: The original PC networks (Novell, Banyan, Corvus) were introduced to share resources such as disks and printers. Printers (especially letter-quality printers) were expensive. Disks (large disks, say 40 MB) were also expensive. Sharing a common resource made economic sense. But the early networks were LANs (Local Area Networks) and confined to a single building.

Servers: Initially part of "client/server systems", servers were database engines that handled requests from multiple clients. Client/server systems gave networks a significant purpose for existing: the ability to update a single database from multiple locations made it possible to migrate mainframe applications onto the cheaper PC platform.

The Internet: Connecting networks made it possible for businesses to exchange information. The first big use of internet connections was e-mail; calendars followed quickly. Strictly speaking, the Internet is not a PC technology -- it was built mostly with minicomputers and Unix. The sockets libraries (WinSock) for PCs made the Internet accessible.

Web servers: Built on these previous layers, the web is (now) a combination of PCs, minicomputers, mainframes, and rack-mounted servers. It is this layer that enables strategic as well as tactical advantages. Companies can provide self-service web pages (more of a tactical change, I think) and new services (strategic). New companies can form (Facebook, Twitter).

Virtualization: The true advantage of virtualization is not consolidation of servers, but the ability to create or destroy machines quickly.

Cloud computing: Once virtual machines were available and cheap, we created the cloud paradigm. Using an array of virtual computers, we can design applications that are distributed across multiple servers and are resistant to failure of any one of those servers. The distribution of work allows for scaling (up or down) as needed, adding or removing servers to handle the current load.

All of these technologies are now infrastructure. They are well-understood and easily available.

New technologies are plugging in to this infrastructure. Smartphones, tablets, and big data all sit on top of this (impressive) stack of technology. Smartphone and tablet apps use low-wattage user interfaces and connect to cloud computing systems for processing. Big data systems use a similar design, with cloud computing engines providing the data for visualization software on PCs (or in web browsers).

When we built the first microcomputers, when we installed DOS on PCs, when we used modems to connect to bulletin-board systems, we thought we were creating the crest of technology. We thought we were building the top dog. But it didn't turn out that way. The PC and its later technologies let us build a significant computing stack.

Now that we have that stack, I think we can discard the traditional PC. The next decade should see the replacement of PCs. Not all at once, and at different rates in different environments. I expect PCs to exist in businesses for quite some time.

But disappear they shall.

Wednesday, April 24, 2013

Perhaps history should be taught backwards

A recent trip to the local computer museum (a decent place with mechanical computation equipment, various microcomputers and PCs, a DEC PDP-8/m and PDP-12, and a Univac 460) gave me time to think about our techniques for teaching history.

The curator gave us a tour, and he started with the oldest computing devices in his collection. Those devices included abaci, Napier's Bones, a slide rule, and electro-mechanical calculators. We then progressed forward in time, looking at the Univac and punch cards, the PDP-8 and PDP-12 and Teletype terminals, then the Apple II and Radio Shack TRS-80 microcomputers, and ended with modern-day PCs and tablets.

The tour was a nice, orderly progression through time.

And perhaps the wrong sequence.

A number of our group were members of the younger set. They had worked with smart phones and iPads and PCs, but nothing earlier than that. I suspect that for them, the early parts of the tour -- the early computation technologies -- were difficult.

Computing technologies have changed over time. Even the concept of computing has changed. Early devices (abaci, slide rules, and even hand-held calculators) were used to perform mathematical operations; the person knew the theory and overall purpose of the computations.

Today, we use computing devices for many purposes, and the underlying computations are distant (and hidden) from the user. Our purposes are not merely word processing and spreadsheets, but web pages, Twitter feeds, and games. (And blog posts.)

The technologies we use to calculate are different: today's integrated circuits do the work of the 1960s' large discrete electronics, which did the work of lots of wheels and cogs in a mechanical calculator. Even storage has changed: today's flash RAM holds data; in the 1970s it was core memory; in the 1950s it was mercury delay lines.

The changes in technology were mostly gradual with a few large jumps. Yet the technologies of today are sufficiently different from the early technologies that one is not recognizable from the other. Moving from today to the beginning requires a big jump in understanding.

Which is why I question the sequence of history. For computing technology, starting at the beginning requires a good understanding of the existing technology and techniques, and even then it is hard to see how an abacus or slide rule relates to today's smart phone.

Perhaps we should move in the reverse direction. Perhaps we should start with today's technology, on the assumption that people know about it, and work backwards. We can move to slightly older systems and compare them to today's technology. Then repeat the process, moving back into the past.

For example, storage. Showing someone a punch card means very little (unless they know the history). But leading that same person backwards through storage technology (SD RAM chips, USB memory sticks, CD-ROMs, floppy disks, early floppy disks, magnetic drums, magnetic tape, paper tape, and eventually punch cards) might give them an easier time. Studying something close to current technology (CD-ROMs) and then learning about the previous tech (floppy disks) avoids the big leap into the past.

Sunday, March 24, 2013

Your New New Thing will one day become The New Old Thing

In IT, we've had New Things for years. Decades. New Things are the cool products and technologies that the alpha geeks are using. They are the products that appear in the trade magazines, the products that get reviews.

But you cannot have New Things without Old Things. If New Things are the things used by the alpha geeks, Old Things are the things used by the rest of us. They are the products that support the legacy systems. They are the products that are getting the job done (however clumsily).

When I started in IT, microcomputers were the New Thing. Apple II microcomputers ("Apple ][" for the purists), CP/M, IBM PCs running PC-DOS, word processors, spreadsheets, databases (in the form of dBase II), and the programming languages BASIC, Pascal, and C. The Old Things were IBM mainframes, tape drives, batch jobs, and COBOL.

I wanted to work on the New Things. I most emphatically wanted to avoid the Old Things.

Of course, the Old Things from that time were, at some earlier time, New Things. The IBM 360/370 processors were New Things, compared to the earlier 704, 709, and 1401 processors. COBOL was a New Thing compared to assembly language.

The IT industry is, in part, devoted to building New Things and constantly demoting technology to Old Thing status.

Just as COBOL slid from New Thing to Old Thing, so did those early PC technologies. PC-DOS and its sibling MS-DOS became Old Things to the New Thing of Microsoft Windows. The Intel 8088 processor became an Old Thing compared to the Intel 80286, which in turn became Old to the 80386, which in its turn became Old to the New Thing of the Intel Pentium processor.

The slide from New Thing to Old Thing is not limited to hardware. It happens to programming languages, too. Sun made C++ an Old Thing by introducing Java. Microsoft tried to make Java an Old Thing by introducing C# and .NET. While Microsoft may have failed in that attempt, Python and Ruby have succeeded at making Java an Old Thing.

The problem with building systems on any technology is that the technology will one day become an Old Thing. That in itself is not a problem, since the system will continue to work. If all components of the system remain in place, it will work with the same performance and reliability as on its first day of operation.

But systems rarely keep all components in place. Hardware is replaced. Operating systems are upgraded. Peripheral devices are exchanged for newer models. Compilers are updated to new standards. And the key "component" in a system is often changed: the people who write, maintain, and operate the system come and go.

These changes stress the system and can disrupt it. A faster processor can change the timing of certain sections of code, and these changes can break the interaction with devices and other systems. A new version of the operating system can provide additional checks and detect invalid operations; some programs rely on quirky behaviors of the processor or operating system and break when the quirks are fixed.

People are a big challenge. Programmers have free will, and can choose to work on a system or they can choose to work somewhere else. To get programmers to work on your system, you have to bribe them with wages and benefits. Programmers have various tastes for technology, and some prefer to work on New Things while others prefer to work on Old Things. I don't know that either is better, but I tend to believe that the programmers pursuing the New Things are the ones with more initiative. (Some may argue that the programmers pursuing New Things are unreliable and ready to leave for an even Newer Thing, and that programmers who enjoy the Old Thing are more stable. It is not an unconvincing argument.)

Early in its life, your system is a New Thing, and therefore attractive to certain programmers. But it does not remain a New Thing forever. It eventually matures into an Old Thing, and when it does the set of programmers that it attracts also changes. The interested programmers are those who like the Old Thing; the programmers pursuing New Things are off, um, pursuing New Things.

In the long term, the system graduates into a Very Old Thing and is of interest to a small set of programmers who enjoy working on Esoteric Curiosities. But these programmers come with Very Expensive Expectations for pay.

We often start projects with the idea of using New Thing technologies. This is an acceptable practice. But we too often believe that our systems and their technologies remain New Things. This is a delusion. Our systems will always change from New Things to Old Things. We should plan for that change and manage our system (and our teams) with that expectation.

Saturday, March 2, 2013

Any keyboard you like

For a programmer, the most important aspect of a computer may be the keyboard. It is through the keyboard that we write code, control the text editor (or IDE), and issue most commands.

Being of a certain age, my first experience with keyboards was not with a computer but with a typewriter. It was my parents' portable, manual typewriter; I have forgotten the brand. It was hard to use and it smelled of ink and machine oil. Yet it was a fun introduction to the keyboard.

Typewriters were fun, and computers were more fun. The keyboards were more modern, and had more keys (some of which made little sense to me).

I have used several keyboards, and the most enjoyable were the DEC keyboards. DEC keyboards were sleek and sophisticated compared to the others (Teletype ASR-33, Lear-Siegler ADM-3A, and IBM 3270). I also enjoyed the Zenith Z-100 keyboard (sculpted like an IBM Selectric typewriter) and the IBM Model M keyboard.

Typing on a good keyboard is a joy. Typing on a mediocre keyboard is not.

Sadly, today's PCs cannot use these venerable keyboards. Desktop PCs want to talk to a keyboard through USB, and tablets want Bluetooth.

Yet all is not lost. Virtual keyboards may help.

Not the on-screen virtual keyboards of smart phones and tablets, but a different form of virtual keyboard. A keyboard that is drawn (usually with lasers) on a surface, with an accompanying scanner to detect "keypresses".

It strikes me that these keyboards can be used on a variety of surfaces. I'm hoping that some will be programmable (or at least configurable) so that I can create my own layout. (For example, I want the "Control" key on the ASDF home row.) I also have preferences for the arrow, HOME, and END keys. A virtual keyboard should allow for re-positioning of the keys.
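No device I know of exposes such an interface, so this is purely hypothetical: a programmable layout could be as simple as a table mapping scanner-reported positions to key codes, with my preferred Control key swapped onto the home row. The positions and key names below are invented for illustration.

```python
# Hypothetical layout table for a programmable virtual keyboard.
# Keys are (row, column) cells reported by the scanner; values are
# the key codes to emit. Not a real device's API.

QWERTY_HOME_ROW = {
    (2, 0): "CAPS_LOCK",
    (2, 1): "A", (2, 2): "S", (2, 3): "D", (2, 4): "F",
}

# My preferred layout: Control where Caps Lock normally sits.
MY_LAYOUT = dict(QWERTY_HOME_ROW)
MY_LAYOUT[(2, 0)] = "CONTROL"

def key_for(position, layout):
    """Translate a scanner-reported position into a key code."""
    return layout.get(position, "UNMAPPED")
```

The point is that re-positioning a key is a one-line change to a table, not a hardware modification; the scanner stays the same and only the mapping differs.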

Re-positioning the keys is nice, but it doesn't let me use the old keyboard. The surface is still a flat, unyielding surface with no feedback.

But do scanners really care about a flat surface? Can they be used on a lumpy surface? (I'm sure that some are quite fussy, and require a flat surface. But perhaps some are less fussy.)

If a virtual keyboard is programmable, then perhaps I can configure it to emulate an old-style layout (say, the DEC VT-52's). And if the scanner tolerates a lumpy surface, then perhaps I can use the virtual keyboard on a lumpy surface... say, a DEC VT-52 keyboard.

Such an arrangement would let me use any keyboard I wanted with my computer. The virtual keyboard would do the work, and wouldn't care that I happened to rest my fingers on an old, outdated keyboard.

I would like that arrangement. It would give me the layout and feel of the keyboard of my choice. I wouldn't have to compromise with the current set of keyboards. All I need is a programmable virtual keyboard and a real keyboard that I enjoy.

Now, where is that old Zenith Z-100?

Wednesday, January 9, 2013

Tablets will not replace PCs -- at least not as you think they might

Tablets have seen quite a bit of popularity, and some people now ask: When will tablets replace PCs?

To which I answer: They won't, at least not in the way most folks think of replacing PCs.

The issue of replacement is subtle.

To examine the subtleties, let's review a previous revolution that saw one form of technology replaced by another.

Did PCs replace mainframes? In one sense, they have, yet in another they have not. PCs are the most common form of computing, used in businesses, schools, and homes. Yet mainframes remain, processing millions of transactions each day.

When microcomputers were first introduced, one could argue that PCs had replaced mainframes as the dominant form of computing. Even before the IBM PC, in the days of the Apple II and the Radio Shack TRS-80, there were more microcomputers than mainframes. (I don't have numbers, but I am confident that several tens of thousands of microcomputers were sold. I'm also confident that there were fewer than several tens of thousands of mainframes at the time.)

But sheer numbers are not an accurate measure. For one thing, a mainframe serves multiple users and a PC serves one. So maybe PCs replaced mainframes when the number of PC users exceeded the number of mainframe users.

The measure of users is not particularly clear, either. Many users of microcomputers were hobbyists, not serious day-in-day-out users. And certainly mainframes processed more business transactions than the pre-internet PCs.

Maybe we should measure not the number of devices or the number of users, but instead the conversations about the devices. When did people talk about PCs more than mainframes? We cannot count tweets or blog posts (neither had been invented) but we can look at magazines and publications. Prior to the IBM PC, most technology magazines discussed mainframes (or minicomputers) and ignored the microcomputer systems. There were some microcomputer-specific magazines, the most popular one being BYTE.

The introduction of the IBM PC spawned a number of new magazines. At that point, magazine content shifted towards PCs.

There are other measures: number of want ads, number of conferences and expos, number of attendees at conferences and expos. We could argue for quite some time about the appropriate metric.

But I think that the notion of one generation of computing technology replacing another is a slippery one, and perhaps a false one. I'm not sure that one technology really replaces an earlier one.

Consider: We have advanced from the mainframe era to the PC era, and then to the networked PC era, and then to the internet era, and now we are venturing into the mobile/cloud era. Each successive "wave" of technology has not replaced the previous "wave", but instead expanded the IT sphere. PCs did not replace mainframes; they replaced typewriters and dedicated word-processing systems and expanded the computational realm with spreadsheets. The web did not replace PCs; it expanded the computational realm with on-line transactions -- many of which are handled with back-end systems running on... mainframes!

A new technology expands our horizons.

Tablets will expand our computing realm. They will enable us to do new things. Some tasks of today that are handled by PCs will be handled by tablets, but not all of them.

So I don't see tablets replacing PCs in the coming year. Or the next year. Or the year after.

But if you insist on a metric, some measurement to clearly define the shift from PC to tablet, I suggest that we look at not numbers or users or apps but conversations: the number of people who talk about tablets, compared to the number of people who talk about PCs. I would include blogs, tweets, web articles, advertisements, podcasts, and books as part of the conversation. You may want to include a few more.

Whatever you pick, look at them, take some measurements, and mark when the lines cross.
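Finding where the lines cross is simple enough to sketch. The counts below are invented purely to illustrate the idea; real data would come from whatever sources you chose to measure:

```python
# Hypothetical yearly counts of "conversations" (articles, posts,
# podcasts) mentioning each platform. Numbers are invented for
# illustration only.
years           = [2010, 2011, 2012, 2013, 2014, 2015]
pc_mentions     = [900,  880,  860,  820,  780,  730]
tablet_mentions = [100,  250,  430,  640,  800,  950]

def crossover_year(years, series_a, series_b):
    """Return the first year in which series_b exceeds series_a, or None."""
    for year, a, b in zip(years, series_a, series_b):
        if b > a:
            return year
    return None

print(crossover_year(years, pc_mentions, tablet_mentions))  # prints 2014
```

Whatever metric you pick, the shape of the exercise is the same: two time series, and the first point where one overtakes the other.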

And then blog or tweet about it.

Wednesday, July 6, 2011

The increasing cost of C++ and other old tech

If you are running a project that uses C++, you may want to think about its costs. As I see it, the cost of C++ is increasing.

The cost of compilers and libraries remains constant. (Whether you use Microsoft's Visual Studio, the open source gcc, or a different toolset.)

The increasing cost is not in the tools, but in the people.

First and most obvious: the time to build applications in C++ is longer than the time to build applications in modern languages. (Notice how I have not-so-subtly omitted C++ from the set of "modern" languages. But modern or not, programming in C++ takes more time.) This drives up your costs.

Recent college graduates don't learn C++; they learn Java or Python. You can hire recent graduates and ask them to learn C++, but I suspect few will want to work on C++ to any large degree. The programmers who know C++ are the more senior folks, programmers with more experience and higher salary expectations. This drives up your costs.

Not all senior folks admit to knowing C++. Some of my colleagues have removed C++ from their resumes, because they want to work on projects with different languages. This removes them from the pool of talent, reducing the number of available C++ programmers. Finding programmers is harder and takes longer. This drives up your costs.

This effect is not limited to C++. Other "old tech" suffers the same fate. Think of COBOL, Visual Basic, Windows API programming, the Sybase database, PowerBuilder, and any of a number of older technologies. Each was popular in its heyday; each is still around, but with a much-diminished pool of talent.

When technologies become "old", they become expensive. Eventually, they become so expensive that the product must either change to a different technology set or be discontinued.

As a product manager, how do you project your development and maintenance costs? Do you assume a flat cost model (perhaps with modest increases to match inflation)? Or do you project increasing costs as labor becomes scarce? Do you assume that a single technology will last the life of the product, or do you plan for a migration to a new technology?
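The gap between those two assumptions compounds quickly. Here is a toy projection; the annual cost and growth rates are invented for illustration only:

```python
# A toy cost projection comparing two assumptions about labor costs.
# All figures are invented for illustration only.
def project_costs(annual_cost, years, growth_rate):
    """Total cost over `years`, with the annual cost compounding
    at `growth_rate` per year."""
    total = 0.0
    for year in range(years):
        total += annual_cost * (1 + growth_rate) ** year
    return total

flat       = project_costs(500_000, 10, 0.02)  # roughly tracks inflation
escalating = project_costs(500_000, 10, 0.08)  # scarce-labor premium

print(round(flat), round(escalating))
```

Even with modest growth rates, the "scarce labor" curve ends up well above the "flat plus inflation" curve over a ten-year product life.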

Technologies become less popular over time. The assumption that a set of unchanging technology will carry a product over its entire life is naive -- unless your product's life is shorter than the technology cycle. Effective product managers will plan for change.