Timesharing (the style of computing from the 1970s) is making a comeback. Or rather, some of the concepts of timesharing are making a comeback. But first, what was timesharing, and how was it different from the computing that came before it?
Mainframe computers in the 1960s were not the sophisticated devices of today, but simpler machines with power roughly equivalent to that of an original IBM PC (a 5 MHz processor and 128KB of RAM). They were, of course, much larger and much more expensive. Early mainframes ran one program at a time (much like PC-DOS). When one program finished, the next could be started. They had one "user": the system operator.
Timesharing was a big change in computing. Computers had become powerful enough to support multiple interactive users at the same time. It worked because interactive users spent most of their time thinking rather than typing, so a user's "job" was mostly waiting for input. A timesharing system held multiple jobs in memory and cycled among them. Timesharing allowed remote access ("remote" in this case meaning "outside of the computer room") via terminals, which meant that individuals in different parts of an organization could "use the computer".
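As a toy illustration (entirely my own invention, not how any real timesharing monitor was written), the core idea looks something like this: cycle among the jobs in memory, giving a time slice only to those that are ready, while jobs waiting on a thinking user cost the CPU nothing:

```python
# Toy sketch of the timesharing idea. Jobs "waiting" for input use no
# CPU; the system cycles among the jobs that are "ready" to run.
# All names and probabilities here are hypothetical.

import random

jobs = {"alice": "waiting", "bob": "ready", "carol": "waiting"}

for tick in range(5):  # each tick is one scheduling pass
    for name, state in jobs.items():
        if state == "ready":
            print(f"tick {tick}: giving a time slice to {name}'s job")
        elif random.random() < 0.3:
            # the user finished typing; the job becomes ready to run
            jobs[name] = "ready"
```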
Timesharing raised the importance of time. Specifically, timesharing raised the importance of the time a program needed to run (the "CPU time") and the time a user was connected. The increase in computing power allowed the operating system to record these values for each session. Tracking those values was important, because it let the organization charge users and departments for the use of the computer. The computer was no longer a monolithic entity with a single (large) price tag, but a resource that could be expensed to different parts of the organization.
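To make that concrete, here is a minimal sketch of per-session billing. The rates and record layout are hypothetical, invented purely for illustration, but the arithmetic (CPU time plus connect time, attributed to a user and a department) is the essence of timesharing accounting:

```python
# Hypothetical example: computing a session's charge from the two
# values the operating system recorded, CPU time and connect time.

from dataclasses import dataclass

CPU_RATE_PER_SECOND = 0.10      # dollars per CPU-second (invented rate)
CONNECT_RATE_PER_MINUTE = 0.02  # dollars per connect-minute (invented rate)

@dataclass
class Session:
    user: str
    department: str
    cpu_seconds: float      # time the program actually ran on the CPU
    connect_minutes: float  # time the terminal was connected

def session_charge(s: Session) -> float:
    """Charge for one session: CPU time plus connect time."""
    return (s.cpu_seconds * CPU_RATE_PER_SECOND
            + s.connect_minutes * CONNECT_RATE_PER_MINUTE)

sessions = [
    Session("alice", "accounting", cpu_seconds=12.5, connect_minutes=45),
    Session("bob", "engineering", cpu_seconds=90.0, connect_minutes=30),
]

for s in sessions:
    print(f"{s.user} ({s.department}): ${session_charge(s):.2f}")
```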
Now let's consider cloud computing.
It turns out that the cloud is not infinite. Nor is it free. Cloud computing platforms record charges for users (either individuals or organizations). Platforms charge for computing time, for data storage, and for many other services. Not every platform charges for the same things; some offer a few services for free.
The bottom line is the same: with cloud computing, an organization has the ability to "charge back" expenses to individual departments, something that was not so easy in the PC era or the web services era.
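A chargeback report can be built from that billing data in much the same way. The sketch below assumes a simplified usage record carrying a department tag; real platforms export far more detailed billing data, but the rollup is the same idea:

```python
# Hypothetical example: rolling up cloud charges by department tag.
# The usage records and rates are invented for illustration.

from collections import defaultdict

# (service, department_tag, units_used, rate_per_unit)
usage = [
    ("compute-hours", "accounting", 120, 0.05),
    ("storage-gb",    "accounting", 500, 0.02),
    ("compute-hours", "engineering", 800, 0.05),
]

charges = defaultdict(float)
for service, department, units, rate in usage:
    charges[department] += units * rate

for department, total in sorted(charges.items()):
    print(f"{department}: ${total:.2f}")
```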
Or, to put it another way, we are undergoing a change in billing (and in information about expenses) that is not new, but has not been seen in half a century. How did the introduction of timesharing (and its expense information) affect organizations? Will we see the same effects again?
I think we will.
Timesharing made interactive computing possible, and it made the expense of that computing visible to users. It let users decide how much computing they wanted to use; they had the discretion to consume more or fewer resources.
Cloud computing provides similar information to users. (Or at least to the organizations paying for the cloud services; I expect those organizations will "charge back" those expenses to users.) Users will be able to see those charges and decide how much computing they want to use.
As organizations move their systems from web to cloud (and from desktop to cloud), expect to see expense information allocated to the users of those systems. Internal users, and also (possibly) external users (partners and customers).
Timesharing made expense information available at a granular level. Cloud computing does the same.