The history of computing can be described as a series of developments, alternating between computing platforms and programming languages. The predominant pattern is that hardware advances first, and programming languages follow. Occasionally, hardware and programming languages advance together, but that is less common. (Hardware and system software -- not programming languages -- do advance together.)
The early mainframe computers were single-purpose devices. In the early 21st century, we think of computers as general-purpose devices that handle financial transactions, personal communication, navigation, and games. But in the early days of electronic computing, devices were not so flexible. Mainframe computers were designed for a single purpose: either commercial (financial) processing or scientific computation. The distinction was visible through all aspects of the computer system, from the processor and the representations for numeric values to the input-output devices and the available character sets.
Once we had those computers for commercial and for scientific computation, we built languages for them: COBOL for commercial processing; FORTRAN for scientific processing.
And thus began the cycle of alternating developments: computing platforms and programming languages. The programming languages follow the platforms.
The next advance in hardware was the general-purpose mainframe. The IBM System/360 was designed for both types of computing, and it used COBOL and FORTRAN. But we also continued the cycle of "platform and then language" with the invention of a general-purpose programming language: PL/I.
PL/I was the intended successor to COBOL and to FORTRAN. It improved on the syntax of both languages and was supposed to replace them. It did not. But it was the language we invented after general-purpose hardware, and it fits the general pattern of advances in platforms alternating with advances in languages.
The next advance was timesharing. This advance in hardware and in system software let people use computers interactively. It was a big change from the older style of scheduled jobs that ran on batches of data.
The language we invented for this platform? It was BASIC. BASIC was designed for interactive use, and also designed so that users did not have to ask system operators to load disks or tapes. A BASIC program could contain its code and its data, all in one. Such a thing was not possible in earlier languages.
The next advance was minicomputers. The minicomputer revolution (DEC's PDP-8 and PDP-11, and systems from other vendors) used BASIC (adopted from timesharing) and FORTRAN. Once again, a new platform initially used the languages from the previous platform.
We also invented languages for minicomputers. DEC invented FOCAL (a lightweight FORTRAN) and DIBOL (a lightweight COBOL). Neither replaced its corresponding "heavyweight" language, but invent them we did.
The PC revolution followed minicomputers. PCs were small computers that could be purchased and used by individuals. Initially, PCs used BASIC. It was a good choice: small enough to fit into the small computers, and simple enough that individuals could quickly understand it.
The PC revolution invented its own languages: CBASIC (a compiled form of BASIC), dBase (whose family of dialects was later known as "xBase"), and most importantly, spreadsheets. While not a programming language in the usual sense, a spreadsheet is a form of programming: it organizes data and specifies calculations. I count it as a programming platform.
The next computing platform was GUI programming, made possible by both the Apple Macintosh and Microsoft Windows. These "operating environments" (as they were called) changed programming from text-oriented to graphical, and they required more powerful hardware -- and software. But the Macintosh first used Pascal, and Windows used C, two languages that were already available.
Later, Microsoft invented Visual Basic and provided Visual C++ (a concoction of C++ and macros to handle the needs of GUI programming), which became the dominant languages of Windows. Apple switched from Pascal to Objective-C, which it enhanced for programming the Mac.
The web was another computing advance, bringing two distinct platforms: the server and the browser. At first, servers used Perl and C (or possibly C++); browsers were without a language and had to use plug-ins such as Flash. We quickly invented Java and (somewhat less quickly) adopted it for servers. We also invented JavaScript, and today all browsers provide JavaScript for web pages.
Mobile computing (phones and tablets) started with Objective-C (Apple) and Java (Android), two languages that were convenient for those devices. Apple later invented Swift, to fix problems with the syntax of Objective-C and to provide a better experience for its users. Google invented Go and made it available for Android development, but it has seen limited adoption.
Looking back, we can see a clear pattern. A new computing platform emerges. At first, it uses existing languages. Shortly after the arrival of the platform, we invent new languages for that platform. Sometimes these languages are adopted, sometimes not. Sometimes a language gains popularity much later than expected, as in the case of BASIC, invented for timesharing but used for minicomputers and PCs.
It is a consistent pattern.
Consistent that is, until we get to cloud computing.
Cloud computing is a new platform, much like the web was a new platform, and PCs were a new platform, and general-purpose mainframes were a new platform. And while each of those platforms saw the development of new languages to take advantage of new features, the cloud computing platform has seen... nothing.
Well, "nothing" is a bit harsh and not quite true.
True to the pattern, cloud computing uses existing languages. Cloud applications can be built in Java, JavaScript, Python, C#, C++, and probably Fortran and COBOL. (And there are probably cloud applications that use these languages.)
And we have invented Node.js, a JavaScript runtime that is useful for cloud computing.
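As a point of comparison, here is a minimal sketch of a cloud (serverless) function written in ordinary Python. The handler name, the (event, context) signature, and the statusCode/body response shape follow AWS Lambda's conventions for HTTP-triggered functions; the request fields are assumptions for illustration, not anything the language requires. Nothing here is cloud-specific Python -- the "cloudness" lives entirely in the platform that invokes the function.

    # A minimal serverless function, in ordinary Python.
    # The (event, context) signature and the statusCode/body response
    # follow AWS Lambda's conventions; the request shape is assumed.
    import json

    def lambda_handler(event, context):
        # Parse the (assumed) JSON request body, if one was sent.
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")

        # Return the response shape an HTTP-fronted function expects:
        # a status code and a string body.
        return {
            "statusCode": 200,
            "body": json.dumps({"message": "Hello, " + name}),
        }

The same function could be written in Java, C#, or JavaScript with equally little ceremony; the language itself carries no notion of the cloud.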
But there is no native language for cloud computing. No language that has been designed specifically for cloud computing. (No language of which I am aware, that is. Perhaps one is lurking in a dark corner of the internet that I have yet to visit.)
Why no language for the cloud platform? I can think of a few reasons:
First, it may be that our current languages are suitable for the development of cloud applications. Languages such as Java and C# may have the overhead of object-oriented design, but that overhead is minimal with careful design. Languages such as Python and JavaScript are interpreted, but that may not be a problem with the scale of cloud processing. Maybe the pressure to design a new language is low.
Second, it may be that developers, managers, and everyone else connected with projects for cloud applications are too busy learning the platform. Cloud platforms (AWS, Azure, GCP, etc.) are complex beasts, and there is a lot to learn. It is possible that we are still learning about cloud platforms and not ready to develop a cloud-specific language.
Third, it may be too complex to develop a cloud-specific programming language. The complexity may reside in separating cloud operations from programming, and we need to understand the cloud before we can understand its limits and the boundaries for a programming language.
I suspect that we will eventually see one or more programming languages for cloud platforms. The new languages may come from the big cloud providers (Amazon, Microsoft, Google) or smaller providers (Dell, Oracle, IBM) or possibly even someone else. Programming languages from the big providers will be applicable for their respective platforms (of course). A programming language from an independent party may work across all cloud platforms -- or may work on only one or a few.
We will have to wait this one out. But keep your eyes open. Programming languages designed for cloud applications will offer exciting advances.