Showing posts with label personal computers.

Wednesday, February 19, 2020

A server that is not a PC

Today, servers are PCs. They have the same architecture as PCs. They run PC operating systems. But do they have to be PCs? Is there another approach? There might be.

First, let's consider PCs. PCs have lots of parts, from processor to memory to storage, but the one thing that makes a PC a PC is the video. PCs use memory-mapped video. They dedicate a portion of memory to video display. (Today, the dedicated memory is a "window" into the much larger memory on the video card.)
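To make "memory-mapped video" concrete: the display is nothing more than a region of memory, and writing bytes at the right offset changes a pixel. Here is a toy model in Python; the dimensions and pixel format are illustrative, not any real card's layout:

```python
# Toy model of memory-mapped video. A real PC maps a hardware
# framebuffer into the address space; here a bytearray stands in.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 640, 480, 4  # illustrative sizes

framebuffer = bytearray(WIDTH * HEIGHT * BYTES_PER_PIXEL)

def put_pixel(fb, x, y, rgba):
    """Write one pixel by computing its offset into display memory."""
    offset = (y * WIDTH + x) * BYTES_PER_PIXEL
    fb[offset:offset + BYTES_PER_PIXEL] = rgba

# Writing to memory *is* drawing on the screen in this model.
put_pixel(framebuffer, 10, 20, b"\xff\x00\x00\xff")
```

The point of the model is that the video hardware does no work here: changing memory is the whole mechanism, which is why a server that never displays anything is carrying dead weight.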

Which is a waste, as servers do not display video. (Virtual machines on servers do display video, but it is all a game of re-assigned memory. If you attach a display to a server, it does not show the virtual desktop.)

Suppose we made a server that did not dedicate this memory to video. Suppose we created a new architecture for servers, an architecture that is exactly like the servers today, but with no memory reserved for video and no video card.

Such a change creates two challenges: installing an operating system, and meeting the expectations that operating systems have of the hardware.

First, we need a way to install an operating system (or a hypervisor that will run guest operating systems). Today, the process is simple: attach a keyboard and display to the server, plug in a bootable USB memory stick, and install the operating system. The boot ROM and the installer program both use the keyboard and display to communicate with the user.

In our new design, they cannot use a keyboard and display. (The keyboard would be possible, but the server has no video circuitry.)

My first thought was to use a terminal and attach it to a USB port. A terminal contains the circuitry for a keyboard and display; it has the video board. But no such devices exist nowadays (outside of museums and basements), and asking someone to manufacture them would be a big ask. I suppose one could use a tiny computer such as a Raspberry Pi running a terminal emulator program. But that solution is merely a throwback to the pre-PC days.

A second idea is to change the server boot ROM. Instead of presenting messages on a video display, and accepting input from a keyboard, the server could run a small web server and accept requests from the network port. (A server is guaranteed to have a network port.)

The boot program could run a web server, just as network routers allow configuration through built-in web servers. When installing a new server, one could simply attach it to the network and connect to it with a web browser.
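A rough sketch of what such a router-style configuration interface might look like. This is Python for readability only; a real boot ROM would be firmware, and the setting names here are invented placeholders:

```python
# Hypothetical sketch: a headless server's boot firmware exposes its
# settings over HTTP, the way consumer routers do. Names are invented.
from http.server import BaseHTTPRequestHandler, HTTPServer

BOOT_SETTINGS = {"hostname": "server01", "boot_device": "usb0"}

def render_settings(settings):
    """Render the current boot settings as a plain-text page."""
    return "\n".join(f"{k} = {v}" for k, v in sorted(settings.items()))

class ConfigHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = render_settings(BOOT_SETTINGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on the network port -- the one interface a server is
    # guaranteed to have.
    HTTPServer(("", 8080), ConfigHandler).serve_forever()
```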

Which brings us to the next challenge: an operating system. (Or a hypervisor.)

Today, servers run PC operating systems. Hypervisors (such as Microsoft's Hyper-V) are nothing more than standard PC operating systems that have been tweaked to support guest operating systems. As such, they expect to find a video card.

Since our server does not have a video card, these hypervisors will not work properly (if at all). They will have to be tweaked again to run without a video card. (Which should be somewhat easy, as hypervisors do not use the video card for their normal operation.)

Guest operating systems may be standard, unmodified PC operating systems. They want to see a video card, but the hypervisor provides virtualized video cards, one for each instance of a guest operating system. The guest operating systems never see the real video card, and don't need to know that one is present -- or not.

What's the benefit? A simpler architecture. Servers don't need video cards, and in my opinion, shouldn't have them.

Which would, according to my "a PC must have memory-mapped video" rule, make servers different from PCs. Which I think we want.

The video-less server is an idea, and I suspect that it will remain an idea. Implementing it requires special hardware (a PC minus the video circuitry it normally has), special software (an operating system that doesn't demand a video card), and a special boot ROM (one that serves its setup interface over the network). As long as PCs are dominant, our servers will simply be PCs with no display attached.

But if the market changes, and PCs lose their dominance, then perhaps one day we will see servers without video cards.

Sunday, June 18, 2017

Three models of computing

Computing comes in different flavors. We're probably most familiar with personal computers and web applications. Let's look at the models used by different vendors.

Apple has the simplest model: devices that compute. Apple has built its empire on high-quality personal computing devices. It does not offer general cloud computing services. (It does offer the "iCloud" backup service, which is an accessory to the central computing of the iMac or MacBook.) I have argued that this model is the same as personal computing in the 1970s.

Google has a different model: web-based computing. This is obvious in their Chromebook, which is a lightweight computer that can run a browser -- and nothing else. All of the "real" computing occurs on the servers in Google's data center. The same approach is visible in most of the Google Android apps -- lightweight apps that communicate with servers. In some ways, this model is an update of the 1970s minicomputer model, with terminals connected to a central processor.

Microsoft has a third model, a hybrid of the two. In Microsoft's model, some computing occurs on the personal computer and some occurs in the data center. It is the most interesting of the three, requiring communication and coordination between two components.

Microsoft did not always have their current approach. Their original model was the same as Apple's: personal computers as complete and independent computing entities. Microsoft started with implementations of the BASIC language, and then sold PC-DOS to IBM. Even early versions of Windows were for stand-alone, independent PCs.

Change to that model started with Windows for Workgroups, and became serious with Windows NT, domains, and Active Directory. Those three components allowed for networked computing and distributed processing. (There were network solutions from other vendors, but this set marked a change in Microsoft's strategy.)

Today, Microsoft offers an array of services under its "Azure" mark. Azure provides servers, message queues, databases, and other services, all hosted in its cloud environment. They allow individuals and companies to create applications that can combine PC and cloud technologies. These applications perform some computing on the local PC and some computing in the Azure cloud. You can, of course, build an application that runs completely on the PC, or completely in the cloud. That you can build those applications shows the flexibility of the Microsoft platform.

I think this hybrid model, combining local computing and server-based computing, has the best potential. It is more complex, but it can handle a wider variety of applications than either the PC-only solution (Apple's) or the server-only solution (Google's). Look for Microsoft to support this model with development tools, operating systems, and communication protocols and libraries.

Looking forward, I can see Microsoft working on a "fluid" model of computing, where some processing can move from the server to the local PC (for systems with powerful local PCs) and from the PC to the server (for systems with limited PCs).
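One way to picture this "fluid" model is a dispatcher that routes a task to the local machine or to a server depending on local capability. This is a hypothetical sketch; the threshold and the remote call are invented placeholders, not any real Microsoft API:

```python
# Hypothetical sketch of "fluid" computing: work runs locally on a
# powerful PC and is deferred to a server on a limited one.
def submit_to_server(task):
    """Stand-in for a network call to a cloud-hosted worker."""
    return task["work"]()

def run_task(task, local_cores, heavy_threshold=4):
    """Route a task by local capability; the threshold is illustrative."""
    if local_cores >= heavy_threshold:
        return ("local", task["work"]())
    return ("server", submit_to_server(task))
```

The interesting design work in such a system is not the routing itself but deciding what to measure (cores, memory, battery, network latency) and keeping results identical regardless of where the work ran.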

Many things in the IT realm started in a "fixed" configuration, and over time have become more flexible. I think processing is about to join them.

Tuesday, December 8, 2015

The PC Market Stabilizes

Once again we have a report of declining sales of personal computers, and once again some folks are worrying that this might signal the end of the personal computer. While the former is true, the latter certainly is false.

The decline in sales signals not an abandonment of the personal computer, but a change in the technology growth of PCs.

To put it bluntly, PCs have stopped growing.

By "PC" or "personal computer" I refer to desktop and laptop computers that run Windows. Apple products are excluded from this group. I also exclude Chromebooks, smartphones, and tablets.

Sales of personal computers are declining because demand is declining. Why is demand declining? Let's consider the factors that drive demand:

Growth of the organization: When a business grows and increases staff, it needs more PCs.

Replacement of failing equipment: Personal computers are cheap enough to replace rather than repair. When a PC fails, the economically sensible thing to do is to replace it.

New features deemed necessary: Some changes in technology are considered important enough to warrant the replacement of working equipment. In the past, this has included CD drives, the Windows (or perhaps OS/2) operating system, new versions of Windows, the 80386 and Pentium processors, and VGA monitors (to replace older monitors).

The recent economic recession saw many companies reduce their ranks and only now are they considering new hires. Thus, the growth of organizations has been absent as a driver of PC sales.

The basic PC has remained unchanged for the past several years, with the possible exception of laptops, which have gotten not more powerful but simply thinner and lighter. The PC one buys today is very similar to the PC of a few years ago. More importantly, the PC of today has no new compelling feature - no larger hard drive (well, perhaps larger, but the old one was large enough), no faster processor, no improved video. (Remember, I am excluding Apple products in this analysis. Apple has made significant changes to its hardware.)

The loss of these two drivers leaves replacement of failing equipment as the only remaining force behind PC sales. Personal computers do fail, but they are, overall, fairly reliable. Thus, replacement of equipment is a weak driver for sales.

In this light, reduced sales are not a surprise.

The more interesting aspect of this analysis is that the technology leaders who introduced changes (Microsoft and Intel) have apparently decided that PCs are now "good enough" and we don't need to step up the hardware. Microsoft is content to sell (or give away) Windows 10 and designed it to run on existing personal computers. Intel designs new processors, but has introduced no new "must have" features. (Or if they have, the marketing of such features has been remarkably quiet.)

Perhaps Microsoft and Intel are responding to their customers. Perhaps the corporate users of computers have decided that PCs are now "good enough", and that PCs have found their proper place in the corporate computing world. That would be consistent with a decline in sales.

I expect the sales of PCs to continue to decline. I also expect corporations to continue to use PCs, albeit in a reduced role. Some applications will move to smart phones and tablets. Other applications will move to virtualized PCs (or virtual desktops). And some applications will remain on old-fashioned desktop (or laptop) personal computers.

* * * * *

Some have suggested that the decline of PC sales may also be explained, in part, by the rise of Linux. When faced with the prospect of replacing an aged PC (because it is old enough that Windows does not support it), people are installing Linux. Thus, some sales of new PCs are thwarted by open source software.

I'm certain that this is happening, yet I'm also convinced that it happens in very small numbers. I use Linux on old PCs myself. Yet the market share of Linux is in the one- and two-percent range. (For desktop PCs, not servers.) This is too small to account for much of the annual decline in sales.

Thursday, October 22, 2015

Windows 10 means a different future for PCs

Since the beginning, PCs have always been growing.  The very first IBM PCs used 16K RAM chips (for a maximum of 64K on the CPU board); these were quickly replaced by PCs with 64K RAM chips (which allowed 256K on the CPU board).

We in the PC world are accustomed to new releases of bigger and better hardware.

It may have started with that simple memory upgrade, but it continued with hard drives (the IBM PC XT); enhanced graphics, higher-capacity floppy disks, and a more capable processor (the IBM PC AT); and an enhanced bus, even better graphics, and even better processors (the IBM PS/2 series).

Improvements were not limited to IBM. Compaq and other manufacturers revised their systems and offered larger hard drives, better processors, and more memory. Every year saw improvements.

When Microsoft became the market leader, it played an active role in the specification of hardware. Microsoft also designed new operating systems for specific minimum platforms: you needed certain hardware to run Windows NT, certain (more capable) hardware for Windows XP, and even more capable hardware for Windows Vista.

Windows 10 may change all of that.

Microsoft's approach to Windows 10 is different from previous versions of Windows. The changes are twofold. First, Windows 10 will see a constant stream of updates instead of the intermittent service packs of previous versions. Second, Windows 10 is "it" for Windows -- there will be no later release, no "Windows 11".

With no Windows 11, people running Windows 10 on their current hardware should be able to keep running it. Windows Vista forced a lot of people to purchase new hardware (which was one of the objections to Windows Vista); Windows 11 won't force that because it won't exist.

Also consider: Microsoft made it possible for just about every computer running Windows 8 or Windows 7 (or possibly Windows Vista) to upgrade to Windows 10. Thus, Windows 10 requires no more hardware than those earlier versions.

What may be happening is that Microsoft has determined that Windows is as big as it is going to be.

This makes sense for desktop PCs and for servers running Windows.

Most servers running Windows will be in the cloud. (They may not be now, but they will be soon.) Cloud-based servers don't need to be big. With the ability to "spin up" new instances of a server, an overworked server can be given another instance to handle the load. A system can provide more capacity with more servers. It is not necessary to make the server bigger.
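The "spin up another instance" arithmetic is simple enough to sketch: given the current load and the capacity of one instance, compute how many instances to run. The function name and numbers are illustrative:

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance, minimum=1):
    """How many identical server instances cover the current load."""
    return max(minimum, math.ceil(requests_per_sec / capacity_per_instance))

# An overworked service gets more instances, not a bigger server:
# 950 req/s against 200 req/s per instance calls for 5 instances.
print(instances_needed(950, 200))
```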

Desktop PCs, either in the office or at home, run a lot of applications, and these applications (in Microsoft's plan) are moving to the cloud. You won't need a faster machine to run the new version of Microsoft Word -- it runs in the cloud and all you need is a browser.

It may be that Microsoft thinks that PCs have gotten as powerful as they need to get. This is perhaps not an unreasonable assumption. PCs are powerful and can handle every task we ask of them.

As we shift our computing from PCs and discrete servers to the cloud, we eliminate the need for improvements to PCs and discrete servers. The long line of PC growth stops. Instead, growth will occur in the cloud.

Which doesn't mean that PCs will be "frozen in time", forever unchanging. It means that PC *growth* will stop, or at least slow to a glacial pace. This has already happened with CPU clock frequencies and bus widths. Today's CPUs are about as fast (in terms of clock speed) as CPUs from 2009. Today's CPUs use a 64-bit data path, which hasn't changed since 2009. PCs will grow, slowly. Desktop PCs will become physically smaller. Laptops will become thinner and lighter, and battery life will increase.

PCs, as we know them today, will stay as we know them today.

Thursday, July 31, 2014

Not so special

The history of computers has been the history of things becoming not special.

First were the mainframes. Large, expensive computers ordered, constructed, delivered, and used as a single entity. Only governments and wealthy corporations could own (or lease) a computer. Once acquired, the device was a singleton: it was "the computer". It was special.

Minicomputers reduced the specialness of computers. Instead of a single computer, a company (or a university) could purchase several minicomputers. Computers were no longer single entities in the organization. Instead of "the computer" we had "the computer for accounting" or "the computer for the physics department".

The opposite of "special" is "commodity", and personal computers brought us into a world of commodity computers. A company could have hundreds (or thousands) of computers, all identical.

Yet some computers retained their specialness. E-mail servers were singletons -- and therefore special. Web servers were special. Database servers were special.

Cloud computing reduces specialness again. With cloud systems, we can create virtual systems on demand, from pre-stocked images. We can store an image of a web server and when needed, instantiate a copy and start using it. We have not a single web server but as many as we need. The same holds for database servers. (Of course, cloud systems are designed to use multiple web servers and multiple database servers.)

In the end, specialness goes away. Computers, all computers, become commodities. They are not special.

Sunday, October 2, 2011

The end of the PC age?

Are we approaching the end of the PC age? It seems odd to see the end of the age, as I was there at the beginning. The idea that a technology should have a shorter lifespan than a human leads one to various contemplations.

But perhaps the idea is not so strange. Other technologies have come and gone: videotape recorders, hand-held calculators, FireWire, and the space shuttle come to mind. (And by "gone", I mean "used in limited quantities, if at all". The space shuttles are gone; VCRs and calculators are still in use but considered curiosities.)

Personal computers are still around, of course. People use them in the office and at home. They are entrenched in the office, and I think that they will remain present for at least a decade. Home use, in contrast, will decline quickly, with personal computers replaced by game consoles, cell phones, and tablets. Computing will remain in the office and in the home.

But here's the thing: People do not think of cell phones and tablets as personal computers.

Cell phones and tablets are cool computing devices, but they are not "personal computers". Even MacBooks and iMacs are not "personal computers". The term "PC" was strongly associated with IBM (with "clone" for other brands) and Microsoft DOS (and later, Windows).

People have come to associate the term "personal computer" with a desktop or laptop computer of a certain size and weight, of any brand, running Microsoft Windows. Computing devices in other forms, or running other operating systems, are not "personal computers"; they are something else: a MacBook, a cell phone, an iPad... something. But not a PC.

Microsoft's Windows 8 offers a very different experience from the "classic Windows". I believe that this difference is enough to break the idea of a "personal computer". That is, a tablet running Windows 8 will be considered a "tablet" and not a "PC". New desktop computers with touchscreens will be considered computers, but probably not "PCs". Only the older computers with keyboards and mice (and no touchscreen) will be considered "personal computers".

Microsoft has the opportunity to brand these new touchscreen computers. I suggest that they take advantage of this opportunity. I recognize that their track record with product names has been poor ("Zune", "Kin", and the ever-awful "Bob") but they must do something.

The term "personal computer" is becoming a reference to a legacy device, to our father's computing equipment. Personal computers were once the Cool New Thing, but no more.

Saturday, April 2, 2011

The Opposite of Batch

We of the PC revolution like to think that the opposite of big, hulking mainframes are the nimble and friendly personal computers of our trade. Mainframes are expensive, hard to program, hard to use, and come with bureaucracies and rules. Personal computers, in contrast, are affordable, easy to program, easy to use, and one is free to do what one will with the PC -- the only rules are the ones we impose upon ourselves.

Yet mainframes and personal computers have one element in common: Both have fixed resources. The processor, the memory, the disk for storage... all are of fixed capacity. (Changeable only by taking the entire system off-line and adding or removing components.)

Mainframes are used by many people, and the resources must be allocated to the different users. The batch systems used by large mainframes are a means of allocating resources efficiently (at least from the perspective of the hardware). Users must wait their turn, submit their request, and wait for the results. The notion of batch is necessary because there are more requests than computing resources.

Personal computers provide interactive programs by providing more computing resources than a single person requires. The user can start jobs (programs) whenever he wants, because there is always spare capacity. The processor is fast enough, the memory is large enough, and the disk is also large enough.

But personal computers still offer fixed capacity. We rarely notice it, since we rarely bump up against the limits. (Although when we do, we often become irritated. Also, our personal systems perform poorly when they require more resources than are available. Try to boot a Windows PC with a completely full hard disk.)

The true opposite of batch is not interactive, but flexible resources -- resources that can change as we need them. Such a design is provided by cloud computing. With cloud computing, we can increase or decrease the number of processors, the number of web server instances, the memory, our data store -- all of our resources -- without taking the system off-line. Our computing platform becomes elastic, expanding or contracting to meet our needs, rather than our needs adjusting to fit the fixed-size platform. Perhaps a better name for cloud computing would have been "balloon computing", since our resources can grow or shrink like a balloon.
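A toy sketch of this elasticity: grow the resource pool when utilization is high, shrink it when utilization is low, and otherwise leave it alone. The thresholds are invented, not from any particular cloud provider:

```python
# Illustrative "balloon computing" sketch: the pool of instances
# expands and contracts with demand, without taking the system off-line.
def adjust_pool(current_instances, utilization, high=0.80, low=0.30):
    """Return the new instance count for observed utilization (0..1)."""
    if utilization > high:
        return current_instances + 1      # expand the balloon
    if utilization < low and current_instances > 1:
        return current_instances - 1      # contract, but never to zero
    return current_instances              # comfortable; leave it alone
```

A production autoscaler would add hysteresis and cooldown periods to avoid oscillating, but the inversion is the same: the platform's size follows the workload, not the other way around.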

This inversion of shape -- the system conforms to our needs, not our needs to the system -- is the revolutionary change offered by cloud computing. It will allow us to think of computing in different ways, to design new types of systems. We will have less thought of hardware constraints and more thought for problem design and business constraints. Cloud computing will free us from the drudgery of system design for hardware -- and let us pick up the drudgery of system design for business logic.

With cloud computing, IT becomes a better partner for the business. IT can enable faster business processes, more efficient supply chains, and better market predictions.