Showing posts with label timesharing.

Wednesday, February 3, 2021

The return of timesharing

Timesharing (the style of computing from the 1970s) is making a comeback. Or rather, some of the concepts of timesharing are making a comeback. But first: what was timesharing, and how was it different from the computing that came before it?

Mainframe computers in the 1960s were not the sophisticated devices of today, but simpler machines with roughly the power of an original IBM PC (a 5 MHz processor, 128KB of RAM). They were, of course, much larger and much more expensive. Early mainframes ran one program at a time (much like PC-DOS). When one program finished, the next could be started. They had a single "user": the system operator.

Timesharing was a big change in computing. Computers had become powerful enough to support multiple interactive users at the same time. It worked because interactive users spent most of their time thinking rather than typing, so a user's "job" was mostly waiting for input. A timesharing system held multiple jobs in memory and cycled among them. Timesharing allowed remote access ("remote" in this case meaning "outside of the computer room") by terminals, which meant that individuals in different parts of an organization could "use the computer".

Timesharing raised the importance of time. Specifically, timesharing raised the importance of the time a program needed to run (the "CPU time") and the time a user was connected. The increase in computing power allowed the operating system to record these values for each session. Tracking those values was important, because it let the organization charge users and departments for the use of the computer. The computer was no longer a monolithic entity with a single (large) price tag, but a resource that could be expensed to different parts of the organization.

Now let's consider cloud computing.

It turns out that the cloud is not infinite. Nor is it free. Cloud computing platforms record charges for users (either individuals or organizations). Platforms charge for computing time, for data storage, and for many other services. Not every platform charges for the same things, and some offer a few services for free.

The bottom line is the same: with cloud computing, an organization has the ability to "charge back" expenses to individual departments, something that was not so easy in the PC era or the web services era.
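The charge-back idea can be sketched in a few lines. This is a minimal illustration, not any platform's actual billing API; the department names, usage figures, and the single "usage" metric are all made up for the example.

```python
# Sketch of a cloud "charge back": splitting one monthly cloud bill
# across departments in proportion to their metered usage.
# Department names and figures are illustrative only.
def charge_back(total_bill, usage_by_dept):
    """Return each department's share of the bill, rounded to cents."""
    total_usage = sum(usage_by_dept.values())
    return {dept: round(total_bill * used / total_usage, 2)
            for dept, used in usage_by_dept.items()}

# A $9,000 monthly bill split across three departments.
print(charge_back(9000.00, {"sales": 100, "engineering": 500, "hr": 300}))
# → {'sales': 1000.0, 'engineering': 5000.0, 'hr': 3000.0}
```

Real platforms offer finer-grained mechanisms (per-resource tags, per-account billing), but the principle is the same: usage is metered per consumer, so costs can be allocated rather than absorbed as one monolithic expense.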

Or, to put it another way, we are undergoing a change in billing (and in information about expenses) that is not new, but has not been seen in half a century. How did the introduction of timesharing (and its expense information) affect organizations? Will we see the same effects again?

I think we will.

Timesharing made interactive computing possible, and it made the expense of that computing visible to users. It let users decide how much computing they wanted to use, and gave them discretion to use more or fewer computing resources.

Cloud computing provides similar information to users. (Or at least to the organizations paying for the cloud services; I expect those organizations will "charge back" those expenses to users.) Users will be able to see those charges and decide how much computing they want to use.

As organizations move their systems from web to cloud (and from desktop to cloud), expect to see expense information allocated to the users of those systems -- internal users, and possibly external users (partners and customers) as well.

Timesharing made expense information available at a granular level. Cloud computing does the same.


Thursday, October 8, 2015

From multiprogramming to timesharing

Multiprogramming boosted the efficiency of computers, yet it was timesharing that improved the user experience.

Multiprogramming allowed multiple programs to run at the same time. Prior to multiprogramming, a computer could run one and only one program at a time. (Very similar to PC-DOS.) But multiprogramming was focussed on CPU utilization and not on user experience.

To be fair, there was little in the way of "user experience". Users typed their programs on punch cards, placed the deck in a drawer, and waited for the system operator to transfer the deck to the card reader for execution. The results would be delivered in the form of a printout, and users often had to wait hours for the report.

Timesharing was a big boost for the user experience. It built on multiprogramming, running multiple programs at the same time. Yet it also changed the paradigm. Multiprogramming let a program run until an input-output operation, and then switched control to another program while the first waited for its I/O operation to complete. It was an elegant way of keeping the CPU busy, and therefore improving utilization rates.

With timesharing, users interacted with the computer in real time. Instead of punch cards and printouts, they typed on terminals and got their responses on those same terminals. That change required a more sophisticated approach to the sharing of resources. It wouldn't do to allow a single program to monopolize the CPU for minutes (or even a single minute) which could occur with multiprogramming. Instead, the operating system had to frequently yank control from one program and give it to another, allowing each program to run a little bit in each "time slice".
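The time-slice scheduling described above can be sketched as a simple round-robin simulation. This is a toy model, not a real operating system scheduler: jobs are reduced to a total amount of CPU time needed, and the job names and quantum are invented for the example.

```python
from collections import deque

# A minimal sketch of round-robin time-slicing. Each job runs for at
# most one quantum, then goes to the back of the queue if unfinished.
def run_timesharing(jobs, quantum):
    """jobs: dict of name -> CPU time needed; quantum: slice length.
    Returns the order in which jobs receive the CPU."""
    queue = deque(jobs.items())
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        schedule.append(name)          # this job runs for one slice
        remaining -= quantum
        if remaining > 0:              # not finished: back of the line
            queue.append((name, remaining))
    return schedule

# Three interactive jobs share the CPU; no job can monopolize it.
print(run_timesharing({"alice": 3, "bob": 1, "carol": 2}, quantum=1))
# → ['alice', 'bob', 'carol', 'alice', 'carol', 'alice']
```

Contrast this with multiprogramming, where "alice" would have kept the CPU until she blocked on I/O; here the clock, not the program, decides when control changes hands.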

Multiprogramming focussed inwards, on the efficiency of the system. Timesharing focussed outwards, on the user experience.

In the PC world, Microsoft focussed on the user experience with early versions of Windows. Windows 1.0, Windows 2, Windows 3.0, and Windows 95 all made great strides in the user experience. But other versions of Windows focussed not on the user experience but on the internals: security, user accounts, group policies, and centralized control. Windows NT, Windows 2000, Windows XP all contained enhancements for the enterprise but not for the individual user.

Apple has maintained focus on the user, improving (or at least changing) the user experience with each release of Mac OS X. This is what makes Apple successful in the consumer market.

Microsoft focussed on the enterprise -- and has had success with enterprises. But enterprises don't want cutting-edge user interfaces, or GUI changes (improvements or otherwise) every 18 months. They want stability. Which is why Microsoft has maintained its dominance in the enterprise market.

Yet nothing is constant. Apple is looking to make inroads into the enterprise market. Microsoft wants to get into the consumer market. Google is looking to expand into both markets. All are making changes to the user interface and to the internals.

What we lose in this tussle for dominance is stability. Be prepared for changes to the user interface, to update mechanisms, and to the basic technology.

Monday, November 29, 2010

The return of frugality

We may soon see a return to frugality with computing resources.

Ages ago, computers were expensive, and affordable by only the very wealthy (governments and large companies). The owners would dole out computing power in small amounts and charge for each use. They used the notion of "CPU time", which was the amount of time actually spent by the CPU processing your task.

The computing model of the day was timesharing, the allocation of a fraction of a computer to each user, and the accounting of usage by each user. The key aspects measured were CPU time, connect time, and disk usage.
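The three metered quantities above translate directly into a bill. Here is a hedged sketch of how such accounting might look; the rates, field names, and usage figures are invented for illustration and do not come from any real timesharing system.

```python
# A hypothetical timesharing bill, priced per metered resource.
# Rates are illustrative, not historical.
RATES = {
    "cpu_seconds":   0.05,   # dollars per second of CPU time
    "connect_hours": 2.00,   # dollars per hour at a terminal
    "disk_blocks":   0.01,   # dollars per block of disk storage
}

def monthly_bill(usage):
    """usage: dict with the same keys as RATES. Returns dollars."""
    return round(sum(usage[k] * RATES[k] for k in RATES), 2)

# One user's month: 600 CPU seconds, 40 connect hours, 500 disk blocks.
print(monthly_bill({"cpu_seconds": 600, "connect_hours": 40,
                    "disk_blocks": 500}))
# → 115.0
```

Note that CPU time and connect time are metered separately: a user could sit at a terminal for hours (expensive connect time) while consuming almost no CPU.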

The PC broke the timesharing model. Instead of one computer shared by a number of people, the PC let each person have their own computer. The computers were small and low-powered (laughingly so by today's standards) but enough for individual needs. With the PC, the timesharing mindset was discarded, and along with it went the attention to efficiency.

A PC is a very different creature from a timesharing system. The purchase is much simpler, the installation is much simpler, and the administration is (well, was) non-existent. Instead of purchasing CPU power by the minute, you purchased the PC in one lump sum.

This change was significant. The PC model had no billing for CPU time; the monthly bill disappeared. That made PC CPU time "free". And since CPU time was free, the need for tight, efficient code became non-existent. (Another factor in this calculus was the availability of faster processors. Instead of writing better code, you could buy a new, faster PC for less than the cost of the programming time.)

The cloud computing model is different from the PC model, and returns to the model of timesharing. Cloud computing is timesharing, although with virtual PCs on large servers.

With the shift to cloud computing, I think we will see a return to some of the timesharing concepts. Specifically, I think we will see the concept of billable CPU time. With the return of the monthly bill, I expect to see a renaissance of efficiency. Managers will want to reduce the monthly bill, and they will ask for efficient programs. Development teams will have to deliver.

With pressure to deliver efficient programs, development teams will look for solutions and the market will deliver them. I expect that the tool-makers will offer solutions that provide better optimization and cloud-friendly code. Class libraries will advertise efficiency on various platforms. Offshore development shops will cite certification in cloud development methods and efficiency standards. Eventually, the big consultant houses will get into the act, with efficiency-certified processes and teams.

I suspect that few folks will refer to the works of the earlier computing ages. Our predecessors had to deal with computing constraints much more severe than the cloud environments of the early twenty-first century, yet we will (probably) ignore their work and re-invent their techniques.