BASIC was the last language with variants. Not "variant" in the sense of the flexible-value "Variant" data type, but in the sense of different implementations. Different dialects.
Many languages have versions. C# has had different releases, as has Java. Perl is transitioning from version 5 (which had multiple sub-versions) to version 6 (which will most likely have multiple sub-versions). But that's not what I'm talking about.
Some years ago, languages had different dialects: multiple implementations with different features. COBOL and FORTRAN both had machine-specific versions, but BASIC had the most variants. For example:
- Most BASICs used the "OPEN" statement to open files, but HP BASIC and GE BASIC used the "FILES" statement which listed the names of all files used in the program. (An OPEN statement lists only one file, and a program may use multiple OPEN statements.)
- Most BASICs used parentheses to enclose variable subscripts, but some used square brackets.
- Some BASICs had "ON n GOTO" statements, but some used "GOTO n OF" statements.
- Some BASICs allowed the apostrophe as a comment indicator; others did not.
- Some BASICs allowed statement modifiers, such as "FOR" or "WHILE" at the end of a statement; others did not.
These are just some of the differences in the dialects of BASIC. There were others.
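To make the differences concrete, here is the same small fragment sketched in two dialect styles. (The exact forms are from memory and should be treated as approximate, not as a faithful reproduction of any one vendor's BASIC.)

```basic
100 REM A TYPICAL (DARTMOUTH-STYLE) DIALECT
110 DIM A(10)
120 LET A(1) = 5
130 ON K GOTO 500,600,700

200 REM AN HP-STYLE DIALECT: BRACKET SUBSCRIPTS, COMPUTED GOTO
210 DIM A[10]
220 LET A[1] = 5
230 GOTO K OF 500,600,700
```

A program written for one dialect had to be translated, line by line, before it would run under another.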
What interests me is not that BASIC had so many variants, but that languages since then have not. The last attempt at a dialect of a language was Microsoft's Visual J++, a variant of Java. Microsoft was challenged in court by Sun, and no one has attempted a special version of a language since. Because of this, I place the demise of variants in the year 2000.
There are two factors that come to mind. One is standards, the other is open source.
BASIC was introduced to the industry in the 1960s. There was no standard for BASIC, except perhaps for the Dartmouth implementation, which was the first implementation. The expectation of standards has risen since then, with standards for C, C++, Java, C#, JavaScript, and many others. With clear standards, different implementations of languages would be fairly close.
The argument that open source prevented the creation of variants of languages makes some sense. After all, one does not need to create a new, special version of a language when the "real" language is available for free. Why invest effort into a custom implementation? And the timing of open source is coincidental with the demise of variants, with open source rising just as language variants disappeared.
But the explanation is different, I think. It was not standards (or standards committees) and it was not open source that killed variants of languages. It was the PC and Windows.
The IBM PC and PC-DOS saw the standardization and commoditization of hardware, and the separation of software from hardware.
In the 1960s and 1970s, mainframe vendors and minicomputer vendors competed for customer business. They sold hardware, operating systems, and software. They needed ways to distinguish their offerings, and BASIC was one way that they could do that.
Why BASIC? There were several reasons. It was a popular language. It was easily implemented. It had no official standard, so implementors could add whatever features they wanted. A hardware manufacturer could offer their own, special version of BASIC as a productivity tool. IBM continued this "tradition" with BASIC in the ROM of the IBM PC and an advanced BASIC with PC-DOS.
But PC compatibles did not offer BASIC, and didn't need to. When manufacturers figured out how to build compatible computers, the factors for selecting a PC compatible were compatibility and price, not a special version of BASIC. Software would be acquired separately from the hardware.
Mainframes and minicomputers were expensive systems, sold with operating systems and software. PCs were different creatures, sold with an operating system but not software.
It's an idea that holds today.
With software being sold (or distributed, as open source) separately from the hardware, there is no need to build variants. Commercial languages (C#, Java, Swift) are managed by the company, which has an incentive for standardization of the language. Open source languages (Perl, Python, Ruby) can be had "for free", so why build a special version -- especially when that special version will need constant effort to match the changes in the "original"? Standard-based languages (C, C++) offer certainty to customers, and variants on them offer little advantage.
The only language that has variants today seems to be SQL. That makes sense, as the SQL interpreter is bundled with the database. Creating a variant is a way of distinguishing a product from the competition.
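The variation is visible even in simple queries. Limiting a result set to the first ten rows, for example, takes a different form in each major product (the FETCH FIRST form is the standard one; the others are vendor extensions):

```sql
-- ANSI/ISO SQL (and PostgreSQL, DB2, recent Oracle):
SELECT name FROM customers ORDER BY name FETCH FIRST 10 ROWS ONLY;

-- MySQL, PostgreSQL, SQLite:
SELECT name FROM customers ORDER BY name LIMIT 10;

-- Microsoft SQL Server:
SELECT TOP 10 name FROM customers ORDER BY name;
```

The "customers" table and "name" column are, of course, hypothetical; the point is the dialects, not the data.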
I expect that the commercial languages will continue to evolve along consistent lines. Microsoft will enhance C#, but there will be only the Microsoft implementation (or at least, the only implementation of significance). Oracle will maintain Java. Apple will maintain Swift.
The open source languages will evolve too. But Perl, Python, and Ruby will continue to see single implementations.
SQL will continue to be the outlier. It will continue to see variants, as different database vendors supply them. It will be interesting to see what happens with the various NoSQL databases.
Tuesday, October 21, 2014
Cloud systems are the new mainframe
The history of computers can be divided (somewhat arbitrarily) into six periods. These are:
Mainframe
Timeshare (on mainframes)
Minicomputers
Desktop computers (includes pre-PC microcomputers, workstations, and laptops)
Servers and networked desktops
Mobile devices (phones and tablets)
I was going to add 'cloud systems' to the list as a seventh period, but I got to thinking.
My six arbitrary periods of computing show definite trends. The first trend is size: computers became physically smaller in each successive period. Mainframe computers were (and are) large systems that occupy rooms. Minicomputers were the size of refrigerators. Desktop computers fit on (or under) a desk. Mobile devices are small enough to carry in a shirt pocket.
The next trend is cost. Each successive period has a lower cost than the previous one. Mainframes cost in the hundreds of thousands of dollars. Minicomputers in the tens of thousands. Desktop computers were typically under $3000 (although some did edge up near $10,000) and today are usually under $1000. Mobile device costs range from $50 to $500.
The third trend is administrative effort or "load". Mainframes needed a team of well-trained attendants. Minicomputers needed one knowledgeable person to act as "system operator" or "sysop". Desktop computers could be administered by a geeky person in the home, or for large offices a team of support persons (but less than one support person per PC). Mobile devices need... no one. (Well, technically they are administered by the tribal chieftains: Apple, Google, or Microsoft.)
Cloud systems defy these trends.
By "cloud systems", I mean the cloud services that are offered by Amazon.com, Microsoft, Google, and others. I am including all of the services: infrastructure as a service, platform as a service, software as a service, machine images, queue systems, compute engines, storage engines, web servers... the whole kaboodle.
Cloud systems are large and expensive, and perhaps for that reason they are few in number. They also have a sizable team of attendants: cloud systems are complex, and a large team is needed to keep everything running.
Cloud systems are much like mainframe computers.
The cloud services that are offered by vendors are much like the timesharing services offered by mainframe owners. With timesharing, customers could buy just as much computing time as they needed. Sound familiar? It's the model used by cloud computing.
We have, with cloud computing, returned to the mainframe era. This period has many similarities with the mainframe period. Mainframes were large, expensive to own, complex, and expensive to operate. Cloud systems are the same. The early mainframe period saw a number of competitors: IBM, NCR, CDC, Burroughs, Honeywell, and Univac, to name a few. Today we see competition between Amazon.com, Microsoft, Google, and others (including IBM).
Perhaps my "periods of computing history" is not so much a linear list as a cycle. Perhaps we are about to go "around" again, starting with the mainframe (or cloud) stage of expensive systems and evolve forward. What can we expect?
The mainframe period can be divided into two subperiods: before the System/360 and after. Before the IBM System/360, there was competition between companies and different designs. After the IBM System/360, companies standardized on that architecture. The System/360 design is still visible in mainframes of today.
An equivalent action in cloud systems would be the standardization of a cloud architecture. Perhaps the OpenStack software, perhaps Microsoft's Azure. I do not know which it will be. The key is for companies to standardize on one architecture. If it is a proprietary architecture, then that architecture's vendor is elevated to the role of industry leader, as IBM was with the System/360 (and later System/370) mainframes.
While companies are busy modifying their systems to conform to the industry standard platform, innovators develop technologies that allow for smaller versions. In the 1960s and 1970s, vendors introduced minicomputers. These were smaller than mainframes, less expensive, and easier to operate. For cloud systems, the equivalent would be... smaller than mainframe clouds, less expensive, and easier to operate. They would be less sophisticated than mainframe clouds, but "mini clouds" would still be useful.
In the late 1970s, technology advances led to the microcomputer, which could be purchased and used by a single person. As with mainframe computers, there were a variety of competing standards. After IBM introduced the Personal Computer, businesses (and individuals) elevated it to industry standard. Equivalent events in cloud would mean the development of individual-sized cloud systems, small enough to be purchased by a single person.
The 1980s saw the rise of desktop computers. The 1990s saw the rise of networked computers, desktop and server. An equivalent for cloud would be connecting cloud systems to one another. Somehow I think this "inter-cloud connection" will occur earlier, perhaps in the "mini cloud" period. We already have the network hardware and protocols in place. Connecting cloud systems will probably require some high-level protocols, and maybe faster connections, but the work should be minimal.
I'm still thinking of adding "cloud systems" to my list of computing periods. But I'm pretty sure that it won't be the last entry.
Thursday, July 31, 2014
Not so special
The history of computers has been the history of things becoming not special.
First were the mainframes. Large, expensive computers ordered, constructed, delivered, and used as a single entity. Only governments and wealthy corporations could own (or lease) a computer. Once acquired, the device was a singleton: it was "the computer". It was special.
Minicomputers reduced the specialness of computers. Instead of a single computer, a company (or a university) could purchase several minicomputers. Computers were no longer single entities in the organization. Instead of "the computer" we had "the computer for accounting" or "the computer for the physics department".
The opposite of "special" is "commodity", and personal computers brought us into a world of commodity computers. A company could have hundreds (or thousands) of computers, all identical.
Yet some computers retained their specialness. E-mail servers were singletons -- and therefore special. Web servers were special. Database servers were special.
Cloud computing reduces specialness again. With cloud systems, we can create virtual systems on demand, from pre-stocked images. We can store an image of a web server and when needed, instantiate a copy and start using it. We have not a single web server but as many as we need. The same holds for database servers. (Of course, cloud systems are designed to use multiple web servers and multiple database servers.)
In the end, specialness goes away. Computers, all computers, become commodities. They are not special.
Thursday, July 3, 2014
Bring back "minicomputer"
The term "minicomputer" is making a comeback.
Late last year, I attended a technical presentation in which the speaker referred to his smart phone as a "minicomputer".
This month, I read a magazine website that used the term minicomputer, referring to an ARM device for testing Android version L.
Neither of these devices is a minicomputer.
The term "minicomputer" was coined in the mainframe era, when all computers (well, all electronic computers) were large, required special rooms with dedicated air conditioning, and were attended by a team of operators and field engineers. Minicomputers were smaller, being about the size of a refrigerator and needing only one or two people to care for them. Revolutionary at the time, minicomputers allowed corporate and college departments to set up their own computing environments.
I suspect that the term "mainframe" came into existence only after minicomputers obtained a noticeable presence.
In the late 1970s, the term "microcomputer" was used to describe the early personal computers (the Altair 8800, the IMSAI 8080, the Radio Shack TRS-80). But back to minicomputers.
For me and many others, the term "minicomputer" will always represent the department-sized computers made by Digital Equipment Corporation or Data General. But am I being selfish? Do I have the right to lock the term "minicomputer" to that definition?
Upon consideration, re-introducing the term "minicomputer" may be reasonable. We don't use the term today. Computers are mainframes (that term is still in use), servers, desktops, laptops, tablets, phones, phablets, and ... whatever the open-board Arduino and Raspberry Pi devices are called. So the term "minicomputer" has been, in a sense, abandoned. As an abandoned term, it can be re-purposed.
But what devices should be tagged as minicomputers? The root "mini" implies small, as it does in "minimum" or "minimize". A "minicomputer" should therefore be "smaller than a (typical) computer".
What is a typical computer? In the 1960s, they were the large mainframes. And while mainframes exist today, one can hardly argue that they are typical: laptops, tablets, and phones are all outselling them. Embedded systems, existing in cars, microwave ovens, and cameras, are probably the most common form of computing device, but I consider them out of the running. First, they are already small and a smaller computer would be small indeed. Second, most people use those devices without thinking about the computer inside. They use a car, not a "car equipped with onboard computers".
So a minicomputer is something smaller than a desktop PC, a laptop PC, a tablet, or a smartphone.
I'm leaning towards the bare-board computers: the Arduino, the BeagleBone, the Raspberry Pi, and their brethren. These are all small computers in the physical sense, smaller than desktops and laptops. They are also small in power; they typically have low-end processors and limited memory and storage, so they are "smaller" (that is, less capable) than a smartphone.
The open-board computers (excuse me, minicomputers) are also a very small portion of the market, just as their refrigerator-sized namesakes were.
Let's go have some fun with minicomputers!
Monday, March 17, 2014
Mobile changes how we think about computing
The rise of mobile computers, while not a revolution, does introduce a significant change in our thinking of computing. I believe that this change generates angst for many.
The history of computing has seen three major waves of technology. Each of these waves has had a specific mindset, a specific way that we view computing.
Mainframes The first wave of computing was the mainframe era. Computers were large, expensive, magical boxes that were contained in sealed rooms (temples?) and attended by technicians (priests?). The main task of computers was to calculate numbers for the company (or government), and most jobs were either accounting or specific mathematical calculations (think "ballistics tables").
Minicomputers The second wave of computing was the minicomputer era. Computers were the size of refrigerators or washing machines and could be purchased by departments within a company (or a university). They did not need a sealed room with special air conditioning, although they were usually stored in locked rooms to prevent someone from wheeling them away. The main tasks were still corporate accounting, inventory management, order processing, and specific mathematical calculations.
Personal computers The third wave of computing saw a major shift in our mindset of computing. Personal computers could be purchased (and run) by individuals. They could be used at home or in the office (if you carried it in yourself). The mindset for personal computing was very different from the corporate-centered computing of the previous eras. Personal computing could be used for ... anything. The composition and printing of documents was handled by word processors. Spreadsheets let us calculate our own budgets. Small databases (and later larger databases) let us store our own transaction data. If off-the-shelf software was not suitable to the task, we could write our own programs.
The mindset of personal computing has been with us for over thirty years. The size and shape of personal computers has been roughly the same: the same CPU box, the same keyboard, the same monitor. We know what a PC looks like. The software has seen one major change, from DOS to Windows, but Windows has been with us for the past twenty years. We know what programs look like.
The introduction of tablets has caused us to re-think our ideas of computing. And we're not that good at re-thinking. We see tablets and phones and they seem strange to us. The size and shape are different (and therefore "wrong"); the user interface is different (and therefore "wrong"); the way we purchase applications is different (and therefore "wrong"); even the way we call applications ("apps") is different (and therefore... you get the idea).
I observe that mobile devices caused little discomfort while they remained in the consumer market. Phones that could play music and games were not a problem. Tablets that let one scroll through Facebook or read books were not a problem. These were extensions to our existing technology.
Now phones and tablets are moving into the commercial sphere, and their application is not obvious. It is clear that they are not personal computers -- their size and shape prove that. But there are more differences that cause uncertainty.
Touch interface The user interface for phones and tablets is not about keyboards and mice but about taps and swipes.
Small screen Tablets have small-ish screens, and phones have tiny screens. How can anyone work on those?
Different operating systems Personal computers run Windows (except for a few in the marketing groups that use Mac OS). Tablets run something called "Android" or something else called "iOS".
Something other than Microsoft Microsoft's entries in the phone and tablet market are not the market leaders and their operating systems have not been accepted widely.
Even Microsoft isn't Microsoft-ish Microsoft's operating system for phones and tablets isn't really Windows, it is this thing called "Windows 8". The user interface looks completely different. Windows RT doesn't run "classic" Windows programs at all (except for Microsoft Office).
The changes coming from mobile are only one front; changes to the PC are also coming.
The typical PC is shrinking Display screens have become flat. The CPU box is shrinking, losing the space for expansion cards and empty disk bays. Apple's Mac Mini, Intel's Next Unit of Computing (NUC), and other devices are changing how we look at computers.
Windows is changing Windows 8 is very different from "good old Windows". (My view is that Windows 8's tiles are simply a bigger, better "Start" menu, but many disagree.)
These changes mean that one cannot stay put with Windows. You either advance into the mobile world or you advance into the new Windows world.
The brave new worlds of mobile and Windows look and feel very different from the old world of computing. Many of our familiar techniques are replaced with something new (and strange).
We thought we knew what computers were and what computing was. Mobile changes those ideas. After thirty years of a (roughly) constant notion of personal computing, many people are not ready for a change.
I suspect that the people who are hardest hit by the changes of mobile are those aged 25 to 45; old enough to know PCs quite well but not old enough to remember the pre-PC days. This group never had to go through a significant change in technology. Their world is changing and few are prepared for the shock.
The under-25 crowd will be fine with tablets and computers. It's what they know and want.
Interestingly, the over-45 folks will probably weather the change. They have already experienced a change in computing, either from mainframes or minicomputers to personal computers, or from nothing to personal computers.
The history of computing has seen three major waves of technology. Each of these waves has had a specific mindset, a specific way that we view computing.
Mainframes The first wave of computing was the mainframe era. Computers were large, expensive, magical boxes that were contained in sealed rooms (temples?) and attended by technicians (priests?). The main tasks of computers was to calculate numbers for the company (or government) and most jobs were either accounting or specific mathematical calculations (think "ballistics tables").
Minicomputers The second wave of computing was the minicomputer era. Computers were the size of refrigerators or washing machines and could be purchased by departments within a company (or a university). They did not need a sealed room with special air conditioning, although they were usually stored in locked rooms to prevent someone from wheeling them away. The main tasks were still corporate accounting, inventory management, order processing, and specific mathematical calculations.
Personal computers The third wave of computing saw a major shift in our mindset of computing. Personal computers could be purchased (and run) by individuals. They could be used at home or in the office (if you carried it in yourself). The mindset for personal computing was very different from the corporate-centered computing of the previous eras. Personal computing could be used for ... anything. The composition and printing of documents was handled by word processors. Spreadsheets let us calculate our own budgets. Small databases (and later larger databases) let us store our own transaction data. If off-the-shelf software was not suitable to the task, we could write our own programs.
The mindset of personal computing has been with us for over thirty years. The size and shape of personal computers has been roughly the same: the same CPU box, the same keyboard, the same monitor. We know what a PC looks like, The software has seen one major change, from DOS to Windows, but Windows has been with us for the past twenty years. We know what programs look like.
The introduction of tablets has caused us to re-think our ideas of computing. And we're not that good at re-thinking. We see tablets and phones and they seem strange to us. The size and shape are different (and therefore "wrong"); the user interface is different (and therefore "wrong"); the way we purchase applications is different (and therefore "wrong"); even the way we call applications ("apps") is different (and therefore... you get the idea).
I observe that mobile devices caused little discomfort while they remained in the consumer market. Phones that could play music and games were not a problem. Tablets that let one scroll through Facebook or read books were not a problem. These were extensions to our existing technology.
Now phones and tablets are moving into the commercial sphere, and their application is not obvious. It is clear that they are not personal computers -- their size and shape prove that. But there are more differences that cause uncertainty.
Touch interface
The user interface for phones and tablets is not about keyboards and mice but about taps and swipes.
Small screen
Tablets have small-ish screens, and phones have tiny screens. How can anyone work on those?
Different operating systems
Personal computers run Windows (except for a few in the marketing groups that use Mac OS). Tablets run something called "Android" or something else called "iOS".
Something other than Microsoft
Microsoft's entries in the phone and tablet market are not the market leaders, and their operating systems have not been widely accepted.
Even Microsoft isn't Microsoft-ish
Microsoft's operating system for phones and tablets isn't really Windows; it is this thing called "Windows 8". The user interface looks completely different. Windows RT doesn't run "classic" Windows programs at all (except for Microsoft Office).
The changes coming from mobile are only one front; changes to the PC are also coming.
The typical PC is shrinking
Display screens have become flat. The CPU box is shrinking, losing the space for expansion cards and empty disk bays. Apple's Mac Mini, Intel's Next Unit of Computing, and other devices are changing how we look at computers.
Windows is changing
Windows 8 is very different from "good old Windows". (My view is that Windows 8's tiles are simply a bigger, better "Start" menu, but many disagree.)
These changes mean that one cannot stay put with Windows. You either advance into the mobile world or you advance into the new Windows world.
The brave new worlds of mobile and Windows look and feel very different from the old world of computing. Many of our familiar techniques are replaced with something new (and strange).
We thought we knew what computers were and what computing was. Mobile changes those ideas. After thirty years of a (roughly) constant notion of personal computing, many people are not ready for a change.
I suspect that the people hardest hit by the changes of mobile are those aged 25 to 45: old enough to know PCs quite well, but not old enough to remember the pre-PC days. This group has never had to go through a significant change in technology. Their world is changing and few are prepared for the shock.
The under-25 crowd will be fine with tablets and computers. It's what they know and want.
Interestingly, the over-45 folks will probably weather the change. They have already experienced a change in computing, either from mainframes or minicomputers to personal computers, or from nothing to personal computers.
Monday, March 10, 2014
IBM makes... mainframes
IBM, that venerable member of the technology world, built its reputation on mainframe computers. And they are still at it.
In the 1940s and 1950s, computing devices were specific to the task. We didn't have general purpose computers; we had tabulators and sorters and various types of machines. The very early electronic calculators were little more than adding machines -- addition was their only operation. The later machines were computers, albeit specialized, usually for military or commercial needs. (Which made some sense, as only the government and large corporations could afford the machines.)
IBM's System/360 changed the game. It was a general purpose machine, suitable for use by government, military, or commercial organizations. IBM's System/370 was a step up with virtual memory, dual processors, and built-in floating point arithmetic.
But these were still large, expensive machines, and these large, expensive machines defined the term "mainframe". IBM was the "big company that makes big computers".
Reluctantly, IBM entered the minicomputer market to compete with companies like DEC and Data General.
Also reluctantly, IBM entered the PC market to compete with Apple, Radio Shack, and other companies that were making inroads into the corporate world.
But I think, in its heart, IBM remained a mainframe company.
Why do I think that? Because over the years IBM has adjusted its product line. Look at what they have stopped producing:
- Typewriters
- Photocopiers
- Disk drives
- Tape drives
- Minicomputers
- Microcomputers (PCs)
- Laptop computers
- Printers for PCs
And look at what they have kept in their product line:
- Mainframe computers
- Servers
- Cloud-based services
- Watson
The last item, Watson, is particularly telling. Watson is IBM's super-sized information storage and retrieval system. It is quite sophisticated and has appeared (successfully) on the "Jeopardy!" TV game show.
Watson is a product that IBM is marketing to large companies (and probably the government). They do not offer a "junior" version for smaller companies or university departments. They do not offer a "personal" version for individuals. IBM's Watson is today's equivalent of the System/360 computer: large, expensive, and made for wealthy clients.
So IBM has come full circle, from the System/360 to minicomputers to personal computers and back to Watson. Will they ever offer smaller versions of Watson? Perhaps, if other companies enter the market and force IBM to respond.
We PC revolutionaries wanted to change the world. We wanted to bring computing to the masses. And we wanted to destroy IBM (or at least take it down a peg or two). Well, we did change the world. We did bring computing to the masses. We did not destroy IBM, or its mainframes. IBM is still the "big company that makes big computers".
Tuesday, September 17, 2013
When programming, think like a computer
When programming, it is best to think like a computer. It is tempting to think like a human. But humans think very differently than computers (if we allow that computers think), and thinking like a human leads to complex programs.
This was brought home to me while reading William Conley's "Computer Optimization Techniques", which discusses solutions to integer programming problems and related problems. Many of these problems can be solved with brute-force calculations, evaluating every possible solution and identifying the most profitable (or least expensive).
The programs for these brute-force methods are short and simple. Even in FORTRAN, they run less than fifty lines. Their brevity is due to their simplicity. There is no clever coding, no attempt to optimize the algorithm. The programs take advantage of the computer's strength of fast computation.
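To make the shape of such a brute-force program concrete, here is a sketch in Python rather than FORTRAN, on a toy problem of my own invention (not one of Conley's): maximize 4x + 5y subject to 2x + 3y ≤ 12 and x + 2y ≤ 8, with x and y non-negative integers, by simply evaluating every candidate pair.

```python
from itertools import product

# Toy integer-programming problem (an invented example, not from Conley):
# maximize 4*x + 5*y
# subject to 2*x + 3*y <= 12 and x + 2*y <= 8, with x, y non-negative integers.

def brute_force():
    best_profit, best_solution = None, None
    # Enumerate every candidate pair. The first constraint alone caps
    # x at 6 and y at 4, so range(7) x range(5) covers all possibilities.
    for x, y in product(range(7), range(5)):
        if 2*x + 3*y <= 12 and x + 2*y <= 8:
            profit = 4*x + 5*y
            if best_profit is None or profit > best_profit:
                best_profit, best_solution = profit, (x, y)
    return best_profit, best_solution

print(brute_force())  # → (24, (6, 0))
```

There is no cleverness here: the program leans entirely on the computer's willingness to check every case, which is exactly the simplicity described above.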
Humans think very differently. They tire quickly of routine calculations. They can identify patterns and have insights into shortcuts for algorithms. They can take creative leaps to solutions. These are all survival skills, useful for dealing with an uncertain environment and capable predators. But they are quite difficult to encode into a computer program. So hard that it is often more efficient to use brute-force calculations without insights and creative leaps. The time spent making the program "smart" is larger than the time saved by the improved program.
Brute-force is not always the best method for calculations. Sometimes you need a smart program, because the number of computations is staggering. In those cases, it is better to invest the time in improvements. (To his credit, Conley shows techniques to reduce the computations, sometimes by increasing the complexity of the code.)
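For contrast, here is what one such computation-reducing improvement might look like on a small toy problem (again my own example, not Conley's code): maximize 4x + 5y subject to 2x + 3y ≤ 12 and x + 2y ≤ 8 over non-negative integers, but noting that since profit rises with y, only the largest feasible y for each x needs to be checked.

```python
# Toy integer-programming problem (invented for illustration):
# maximize 4*x + 5*y subject to 2*x + 3*y <= 12 and x + 2*y <= 8,
# with x and y non-negative integers.

def pruned_search():
    best_profit, best_solution = None, None
    for x in range(7):  # 2*x <= 12 implies x <= 6
        # Largest y that satisfies both constraints for this x.
        y = min((12 - 2*x) // 3, (8 - x) // 2)
        if y < 0:
            continue
        # Profit increases with y (its coefficient is positive),
        # so smaller y values for this x can never do better.
        profit = 4*x + 5*y
        if best_profit is None or profit > best_profit:
            best_profit, best_solution = profit, (x, y)
    return best_profit, best_solution

print(pruned_search())  # → (24, (6, 0))
```

The search shrinks from dozens of candidate pairs to seven, at the cost of reasoning about the constraints in the code itself: precisely the complexity-for-speed trade described above.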
Computing efficiency (that is, "smart" programs) has been a concern since the first computing machines were made. Efficiency was necessary at first, but the need for it has dropped over time. Mainframe computers became faster, which allowed for "sloppy" programs ("sloppy" meaning "anything less than maximum efficiency").
Minicomputers were slower than mainframes, significantly less expensive, and another step away from the need for optimized, "smart" programs. PCs were another step. Today, smart phones have more computing power than PCs of a few years ago, at a fraction of the price. Cloud computing, a separate branch in the evolution of computing, offers cheap, readily-available computing power.
I won't claim that computing power is (or will ever be) "too cheap to meter". But it is cheap, and it is plentiful. And with cheap and plentiful computing power, we can build programs that use simple methods.
When writing a computer program, think like a computer. Start with a simple algorithm, one that is not clever. Chances are, it will be good enough.
Monday, September 9, 2013
Microsoft is not DEC
Some have drawn comparisons between Microsoft and DEC, the long-ago champion of minicomputers.
The commonalities seem to be:
- DEC and Microsoft were both large
- DEC and Microsoft had strong cultures
- DEC missed the PC market; Microsoft is missing the mobile market
- DEC and Microsoft changed their CEOs
Yet there are differences:
DEC was a major player; Microsoft set the standard
DEC had a successful business in minicomputers but was not a standard-setter (except perhaps for terminals). There were significant competitors in the minicomputer market, including Data General, HP, and even IBM. Microsoft, on the other hand, has set the standard for desktop computing for the past two decades. It has an established customer base that remains loyal to and locked into the Windows ecosystem.
DEC moved slowly; Microsoft is moving quickly
DEC made cautious steps towards microcomputers, introducing the PRO-325 and PRO-350 computers, which were small versions of PDP-11 processors running a variant of RT-11, a proprietary and (more importantly) non-PC-DOS operating system. DEC also offered the Rainbow, which ran MS-DOS but did not offer the "100 percent PC compatibility" required by most software. Neither the PRO nor the Rainbow computers saw much popularity. Microsoft, in contrast, is offering cloud services with Azure and seeing market acceptance. Microsoft's Surface tablets and Windows Phones (considered quite good by those who use them, and quite bad by those who don't) do parallel DEC's offerings in their popularity, and this will be a problem for Microsoft if it chooses to keep offering hardware.
The IBM PC set a new standard; mobile/cloud has no standard
The IBM PC defined a new standard for microcomputers (the new market). Overnight, businesses settled on the PC as the unit of computing, with PC-DOS as the operating system and Lotus 1-2-3 as the spreadsheet. The mobile/cloud environment has no comparable standard hardware or software. Apple and Android are competing for hardware (Apple has higher revenue while Android has higher unit sales), and Amazon.com is dominant in the cloud services space but is not a standards-setter. (The industry is not cloning the AWS interface.)
PCs replaced minicomputers; mobile/cloud complements PCs
Minicomputers were expensive, and PCs (except for the very early microcomputers) were able to perform the same functions as minicomputers. PCs could perform word processing, numerical analysis with spreadsheets (a bonus, actually), data storage and reporting, and development in common languages such as BASIC, FORTRAN, Pascal, C, and even COBOL. Tablets do not replace PCs; data entry, numeric analysis, and software development remain on the PC platform. The mobile/cloud technology expands the set of solutions, offering new possibilities.
Comparing Microsoft to DEC is a nice thought experiment, but the situations are different. Was DEC under stress, and is Microsoft under stress? Undoubtedly. Can Microsoft learn from DEC's demise? Possibly. But Microsoft's situation is not identical to DEC's, and the lessons from the former must be read with care.