Showing posts with label new thing. Show all posts

Monday, January 16, 2023

The end of more

From the very beginning, PC users wanted more. More pixels and more colors on the screen. More memory. Faster processors. More floppy disks. More data on floppy disks. (Later, it would be more data on hard disks.)

When IBM announced the PC/XT, we all longed for the space (and convenience) of its built-in hard drive. When IBM announced the PC/AT we envied those with the more powerful 80286 processor (faster! more memory! protected mode!). When IBM announced the EGA (Enhanced Graphics Adapter) we all longed for the higher-resolution graphics. With the PS/2, we wanted the reliability of 3.5" floppy disks and the millions of colors on a VGA display.

The desire for more didn't stop in the 1980s. We wanted the 80386 processor, and networks, and more memory, and faster printers, and multitasking. More programs! More data!

But maybe -- just maybe -- we have reached a point that we don't need (or want) more.

To quote a recent article in Macworld:

"Ever since Apple announced its Apple silicon chip transition, the Mac Pro is the one Mac that everyone has anxiously been awaiting. Not because we’re all going to buy one–most of the people reading this (not to mention me, my editor, and other co-workers) won’t even consider the Mac Pro. It’s a pricey machine and the work that we do is handled just as well by any Mac in the current lineup".

Here's the part I find interesting:

"the work that we do is handled just as well by any Mac in the current lineup"

Let that sink in a minute.

The work done in the offices of Macworld (which I assume is typical office work) can be handled by any of Apple's Mac computers. That means that the lowliest Apple computer can handle the work. Therefore, Macworld, being a commercial enterprise and wanting to reduce expenses, should be equipping its staff with the low-end MacBook Air or Mac mini. To do otherwise would be wasteful.

It is not just Apple computers that have outpaced computing needs. Low-end Windows PCs also handle most office work. (I myself am typing this on a Dell desktop that was made in 2007.)

The move from 32-bit processing to 64-bit processing had a negligible effect on many computing tasks. Microsoft Word, for example, ran just as well in 32-bit Windows as it did in 64-bit Windows. The move to 64-bit processing did not improve word processing.

There are some who do still want more. People who play games want the best performance from not only video cards but also central processors and memory. Folks who edit video want performance and high-resolution displays.

But the folks who need, really need, high performance are a small part of the PC landscape. Many of the demanding tasks in computation can be handled better by cloud-based systems. It is only a few tasks that require local, high-performance processing.

The majority of PC users can get by with a low-end PC. The majority of PC users are content. One may look at a new PC with more memory or more pixels, but the envy has dissipated. We have enough colors, enough pixels, and enough storage.

If we have reached "peak more" in PCs, what does that mean for the future of PCs?

An obvious change is that people will buy PCs less frequently. With no urge to upgrade, people will keep their existing equipment longer. Corporations that buy PCs for employees may continue on a "replace every three years" schedule, but that was driven by depreciation rules and tax laws. Small mom-and-pop businesses will probably keep computers until a replacement is necessary (I suspect that they have been doing that all along). Some larger corporations may choose to defer PC replacements, noting that cash outlays for new equipment are still cash outlays, and should be minimized.

PC manufacturers will probably focus on other aspects of their wares. PC makers will strive for better battery life, durability, or ergonomic design. They may even offer Linux as an alternative to Windows.

It may be that our ideas about computing are changing. It may be that instead of local PCs that do everything, we are now looking at cloud computing (and perhaps older web applications) and seeing a larger expanse of computing. Maybe, instead of wanting faster PCs, we will shift our desires to faster cloud-based systems.

If that is true, then the emphasis will be on features of cloud platforms. They won't compete on pixels or colors, but they may compete on virtual processors, administration services, availability, and supported languages and databases. Maybe we won't be envious of new video cards and local memory, but envious instead of uptime and automated replication. 

Thursday, October 20, 2022

The Next Big Thing

What will we see as the next big thing?

Let's look at the history of computer technology -- or rather, a carefully curated version of the history of computer technology.

The history of computing can be divided into eras: The mainframe era, the minicomputer era, the micro/PC era, and so forth. And, with careful editing, we can see that these eras have similar durations: about 15 years each.

Let's start with mainframe computers. We can say that the mainframe era ran from 1950 to 1965. Mainframe computers were (and still are) large, expensive computers capable of significant processing. They are housed in rooms with climate control and dedicated power. Significantly, mainframe computers are used by people only indirectly. In the mainframe age, programmers submitted punch cards which contained source code; the cards were fed into the computer by an operator (one who was allowed in the computer room); the computer compiled the code and ran the program; output was usually on paper and delivered to the programmer some time later. Mainframe computers also ran batch jobs to read and process data (usually financial transactions). Data was often read from magnetic tape and output could be to magnetic tape (updated data) or paper (reports).

Minicomputers were popular from 1965 to 1980. Minicomputers took advantage of newer technology; they were smaller, less expensive, and most importantly, allowed for multiple users on terminals (either paper-based or CRT-based). The user experience for minicomputers was very different from the experience on mainframes. Hardware, operating systems, and programming languages let users interact with the computer in "real time"; one could type a command and get a response.

Microcomputers and Personal Computers (with text displays, and without networking) dominated from 1980 to 1995. It was the age of the Apple II and the IBM PC, computers that were small enough (and inexpensive enough) for an individual to own. They inherited the interactive experience of minicomputers, but the user was the owner and could change the computer at will. (The user could add memory, add disk, upgrade the operating system.)

Personal Computers (with graphics and networking) made their mark from 1995 to 2010. They made the internet available to ordinary people. Graphics made computers easier to use.

Mobile/cloud computers became dominant in 2010. Mobile devices without networks were not enough (the Palm Pilot and the Windows pocket computers never gained much traction). Even networked devices such as the original iPhone and the Nokia N800 saw limited acceptance. It was the combination of networked mobile device and cloud services that became the dominant computing model.

That's my curated version of computing history. It omits a lot, and it fudges some of the dates. But it shows a trend, one that I think is useful to observe.

That trend is: computing models rise and fall, with their typical life being fifteen years.
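The arithmetic behind that trend can be sketched in a few lines. This is a toy restatement of the curated timeline above, using only the dates given in the text; treating the post's date (2023) as "the present" for the still-running mobile/cloud era is my assumption.

```python
# Era boundaries as given in the curated history; None marks an era still running.
ERAS = [
    ("Mainframe", 1950, 1965),
    ("Minicomputer", 1965, 1980),
    ("Micro/PC (text, no networking)", 1980, 1995),
    ("PC (graphics and networking)", 1995, 2010),
    ("Mobile/cloud", 2010, None),
]

PRESENT = 2023  # assumed: the year this post was written

def era_spans(eras, present=PRESENT):
    """Return (name, span_in_years) for each era, using `present` for open eras."""
    return [(name, (end or present) - start) for name, start, end in eras]

for name, span in era_spans(ERAS):
    print(f"{name}: {span} years")
```

Each closed era comes out to exactly 15 years (by construction of the curated dates), and mobile/cloud comes out to 13 — which is the "slightly less than fifteen years" observation that motivates expecting a change soon.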

How is this useful? Looking at the history, we can see that the mobile/cloud computing model has been dominant for slightly less than fifteen years. In other words, its time is just about up.

More interesting is that, according to this trend (and my curated history is too pretty to ignore), something new should come along and replace mobile/cloud as the dominant form of computing.

Let's say that I'm right -- that there is a change coming. What could it be?

It could be any of a number of things. Deep-fake tech allows for the construction of images, convincing images, of any subject. It could be virtual reality, or augmented reality. (The difference is nontrivial: virtual reality makes full images, augmented reality lays images over the scene around us.) It could be watch-based computing. 

My guess is that it will be augmented reality. But that's a guess.

Whatever the new thing is, it will be a different experience from the current mobile/cloud model. Each of the eras of computing had its own experience. Mainframes had an experience of separation, working through operators. Minicomputers had an interactive experience, although someone else controlled the computer. Personal computers had interaction, and the user owned the computer. Mobile/cloud let people hold computers in their hand and use them on the move.

Also, the next big thing does not eliminate the current big thing. Mobile/cloud did not eliminate web-based systems. Web-based systems did not eliminate desktop applications. Even text-mode interactive applications continue to this day. The next big thing expands the world of computing.

Sunday, March 24, 2013

Your New New Thing will one day become The New Old Thing

In IT, we've had New Things for years. Decades. New Things are the cool products and technologies that the alpha geeks are using. They are the products that appear in the trade magazines, the products that get reviews.

But you cannot have New Things without Old Things. If New Things are the things used by the alpha geeks, Old Things are the things used by the rest of us. They are the products that support the legacy systems. They are the products that are getting the job done (however clumsily).

When I started in IT, microcomputers were the New Thing. Apple II microcomputers ("Apple ][" for the purists), CP/M, IBM PCs running PC-DOS, word processors, spreadsheets, databases (in the form of dBase II), and the programming languages BASIC, Pascal, and C. The Old Things were IBM mainframes, tape drives, batch jobs, and COBOL.

I wanted to work on the New Things. I most emphatically wanted to avoid the Old Things.

Of course, the Old Things from that time were, at some earlier time, New Things. The IBM 360/370 processors were New Things, compared to the earlier 704, 709, and 1401 processors. COBOL was a New Thing compared to assembly language.

The IT industry is, in part, devoted to building New Things and constantly demoting technology to Old Thing status.

Just as COBOL slid from New Thing to Old Thing, so did those early PC technologies. PC-DOS and its sibling MS-DOS became Old Things to the New Thing of Microsoft Windows. The Intel 8088 processor became an Old Thing compared to the Intel 80286, which in turn became Old to the 80386, which in its turn became Old to the New Thing of the Intel Pentium processor.

The slide from New Thing to Old Thing is not limited to hardware. It happens to programming languages, too. Sun made C++ an Old Thing by introducing Java. Microsoft tried to make Java an Old Thing by introducing C# and .NET. While Microsoft may have failed in that attempt, Python and Ruby have succeeded at making Java an Old Thing.

The problem with building systems on any technology is that the technology will one day become an Old Thing. That in itself is not a problem, since the system will continue to work. If all components of the system remain in place, it will work with the same performance and reliability as its first day of operation.

But systems rarely keep all components in place. Hardware is replaced. Operating systems are upgraded. Peripheral devices are exchanged for newer models. Compilers are updated to new standards. And the key "component" in a system is often changed: the people who write, maintain, and operate the system come and go.

These changes stress the system and can disrupt it. A faster processor can change the timing of certain sections of code, and these changes can break the interaction with devices and other systems. A new version of the operating system can provide additional checks and detect invalid operations; some programs rely on quirky behaviors of the processor or operating system and break when the quirks are fixed.

People are a big challenge. Programmers have free will, and can choose to work on a system or they can choose to work somewhere else. To get programmers to work on your system, you have to bribe them with wages and benefits. Programmers have various tastes for technology, and some prefer to work on New Things while others prefer to work on Old Things. I don't know that either is better, but I tend to believe that the programmers pursuing the New Things are the ones with more initiative. (Some may argue that the programmers pursuing New Things are unreliable and ready to leave for an even Newer Thing, and programmers who enjoy the Old Thing are more stable. It is not an unconvincing argument.)

Early in its life, your system is a New Thing, and therefore attractive to certain programmers. But it does not remain a New Thing forever. It eventually matures into an Old Thing, and when it does the set of programmers that it attracts also changes. The interested programmers are those who like the Old Thing; the programmers pursuing New Things are off, um, pursuing New Things.

In the long term, the system graduates into a Very Old Thing and is of interest to a small set of programmers who enjoy working on Esoteric Curiosities. But these programmers come with Very Expensive Expectations for pay.

We often start projects with the idea of using New Thing technologies. This is an acceptable practice. But we too often believe that our systems and their technologies remain New Things. This is a delusion. Our systems will always change from New Things to Old Things. We should plan for that change and manage our system (and our teams) with that expectation.