Saturday, October 29, 2016

For compatibility, look across and not down

The PC industry has always had an obsession with compatibility. Indeed, the first question many people asked about a computer was "Is it PC compatible?" A fair question at the time, as most software was written for the IBM PC and would not run on other systems.

Over time, our notion of "PC compatible" has changed. Most people today think of a Windows PC as "IBM PC compatible" when in fact the hardware has changed so much that no current PC is truly "PC compatible". (You cannot attach any device from an original IBM PC, including the keyboard, display, or adapter cards.)

Compatibility is important -- not for everything but for the right things.

The original IBM PCs were, of course, all "PC compatible" (by definition) and the popular software packages (Lotus 1-2-3, Wordstar, WordPerfect, dBase III) were all "PC compatible" too. Yet one could not move data from one program to another. Text in Wordstar was in Wordstar format, numbers and formulas in Lotus 1-2-3 were in Lotus format, and data in dBase was in dBase format.

Application programs were compatible "downwards" but not "across". That is, they were compatible with the underlying layers (DOS, BIOS, and the PC hardware) but not with each other. To move data from one program to another, it was necessary to "print" to a file and read that file into the destination program. (This assumes that both programs had the ability to export and import text data.)
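
To make the idea concrete, here is a rough sketch in Python of that lowest-common-denominator exchange (not how Wordstar or Lotus actually did it; the file name and data are invented for illustration):

    # Two programs with incompatible native formats exchange data
    # through plain text, the only format both sides understand.

    # Program A exports its data as tab-delimited text:
    rows = [("widgets", 120), ("gadgets", 45)]
    with open("export.txt", "w") as f:
        for name, qty in rows:
            f.write(f"{name}\t{qty}\n")

    # Program B imports the text, knowing nothing of A's native format:
    with open("export.txt") as f:
        imported = [line.rstrip("\n").split("\t") for line in f]
    print(imported)  # [['widgets', '120'], ['gadgets', '45']]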

Windows addressed that problem, with its notion of the clipboard and the ability to copy and paste text. The clipboard was not a complete solution, and Microsoft worked on other technologies to make programs more compatible (DDE, COM, DCOM, and OLE). This was the beginning of compatibility between programs.

Networked applications and the web gave us more insight into compatibility. The first networked applications for PCs were client/server applications built with tools such as Powerbuilder. One PC hosted the database and other PCs sent requests to store, retrieve, and update data. At the time, all of the PCs were running Windows.

The web allowed for variation between client and server. With web servers and capable network software, it was no longer necessary for all computers to use the same hardware and operating systems. A Windows PC could request a web page from a server running Unix. A Macintosh could request a web page from a Linux server.

Web services use the same mechanisms as web pages, and allow for the same variation between client and server.

We no longer need "downwards" compatibility -- but we do need compatibility "across". A server must understand the incoming request. The client must understand the response. In today's world we ensure compatibility through the character set (Unicode) and the data format (commonly HTML, JSON, or XML).
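
A small sketch in Python shows how little the two sides must share. The only contract is the encoding (UTF-8) and the format (JSON); everything else about client and server can differ:

    import json

    # What some server produces, regardless of its hardware or OS:
    response_bytes = json.dumps({"status": "ok", "items": [1, 2, 3]}).encode("utf-8")

    # What the client does with those bytes, regardless of its own
    # hardware or OS; only the encoding and the format are shared:
    data = json.loads(response_bytes.decode("utf-8"))
    assert data["status"] == "ok"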

This means that our computing infrastructure can vary. It is no longer necessary to ensure that all of our computers are "PC compatible". I expect variation in computer hardware, as different architectures are used for different applications. Large-scale databases may use processor and memory designs suited to large quantities of data. Small processors will be used for "internet of things" appliances. Nothing requires them all to use a single processor design.

Thursday, October 27, 2016

Actually, new platforms compel new languages

My previous post claimed that new platforms spur the adoption of new languages. The more I think about it, the more I believe I was wrong.

New platforms don't simply spur the adoption of new languages. They compel the adoption of new languages.

A platform offers a set of capabilities and a set of concepts. Languages are designed around those capabilities and concepts. Change the platform, and you change the capabilities and the concepts, and you need a different language.

For batch processing, COBOL and FORTRAN were acceptable. They didn't work for timeshare systems and they didn't work for microcomputers. Timeshare systems and microcomputers were interactive, and they needed a language like BASIC.

Windows and OS/2's Presentation Manager required a language that could handle event-driven processing, and object-oriented languages (first C++, later Visual Basic and Java) met that need.
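
Reduced to its bones, event-driven processing looks something like this sketch in Python (the event names and fields are invented for illustration):

    # Handlers are registered against event names; a loop owned by the
    # platform, not the program, decides when each handler runs.
    handlers = {}

    def on(event_name):
        def register(fn):
            handlers[event_name] = fn
            return fn
        return register

    @on("button_click")
    def handle_click(event):
        print("clicked at", event["x"], event["y"])

    # The platform's event loop, dispatching as events arrive:
    for event in [{"type": "button_click", "x": 10, "y": 20}]:
        handlers[event["type"]](event)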

Web applications needed a run-time system that was constantly present. We started building web applications with Perl and C++ and quickly learned that the start-up time of a new process for each request was costing us performance. Java and C# load their run-time systems independently of the application program and can keep the run-time in memory, which gives better performance.
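
The difference is easy to demonstrate. Here is a rough sketch in Python, with timings that are illustrative only: launching a fresh process per request, CGI-style, versus calling a handler in a process that stays resident.

    import subprocess
    import sys
    import time

    # CGI-style: a new process (and runtime start-up) for every request.
    start = time.perf_counter()
    for _ in range(10):
        subprocess.run([sys.executable, "-c", "print('handled')"],
                       capture_output=True)
    per_process = time.perf_counter() - start

    # Resident runtime: the process stays loaded between requests.
    def handle():
        return "handled"

    start = time.perf_counter()
    for _ in range(10):
        handle()
    resident = time.perf_counter() - start

    print(f"new process per request: {per_process:.3f} seconds")
    print(f"resident runtime:        {resident:.6f} seconds")

On most machines the per-process figure will dwarf the resident one; that gap is exactly what Java and C# eliminated.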

Changing languages (and adopting the mindset of the new platform) is a significant effort. One does not do it lightly, which is why large organizations tend to use older technology.

But where does this leave functional languages?

I see no platform that requires the use of functional languages. And without a compelling reason to use functional languages, I expect that we won't. Oh, functional languages won't go away; lots of developers use them. (I myself am a fan, although I tend to use Ruby or Python for my own projects.)

But functional languages won't become the popular languages of the day without a reason. Inertia will keep us with other languages.

At least, until a platform arrives that compels the capabilities of functional languages. That platform might be the "internet of things" although I expect the first versions will use the currently popular languages.

Functional languages offer increased reliability. It may be possible to prove certain programs correct, which will be of interest to government agencies, banks, and anyone in the security field. (Turing proved that we cannot prove most programs correct, but I believe that we can prove correct those programs that are subject to a set of constraints. Functional languages may offer those constraints.)
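
As a small illustration (an illustration, not a proof) of the kind of constraint I mean, here is a pure, total function over finite lists in Python, where correctness can be argued by simple induction because there is no state or I/O to account for:

    from functools import reduce

    def total(xs):
        # Pure and total over finite lists: no state, no I/O, just a fold.
        return reduce(lambda acc, x: acc + x, xs, 0)

    # The inductive argument: total([]) == 0 is the base case, and
    # total(xs + [x]) == total(xs) + x is the inductive step.
    assert total([]) == 0
    assert total([1, 2, 3]) == total([1, 2]) + 3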

I'm not abandoning functional languages. I like what they offer. Yet I recognize that they require an additional level of discipline (much like structured programming and object-oriented programming required additional discipline), and we will switch only when the benefits outweigh the cost.

Sunday, October 23, 2016

Platforms spur languages

We developers like to think that we make languages popular. We like to think that we decide the fate of languages. I'm not so sure. It may be that we have less control than we believe.

I posit that it is platforms, that is, hardware and operating systems, that drive the popularity of programming languages.

The first programming languages, COBOL and FORTRAN, became popular only after computers became accepted in business and, more importantly, as platforms for running applications.

When computers were special-purpose devices, programs were written for the specific computer, usually in machine code or assembly language, with the idea that the program was part of the computer. The thought of moving an application from one computer to another was alien, like toasting bread with a refrigerator.

It was only after our mindset changed that we thought of moving programs. The idea of the computer as a platform, and the subsequent idea of moving a program from one computer to another, led to the idea of "portability" and languages common across computers. Thus, COBOL and FORTRAN were born.

Timesharing, a platform different from batch processing, gave us BASIC. Microcomputers in the late 1970s took BASIC to a higher level, making it one of the most popular languages at the time.

In the age of the IBM PC, computer programs became commercial products, and BASIC was not suitable. BASIC offered no way to obscure source code, and its performance was limited. Most commercial programs were written in assembly language, a choice made possible by the uniformity of PC hardware.

Microsoft Windows drove C++ and Visual Basic. While it was possible to write applications for Windows in assembly language, it was tedious and maintenance was expensive. C++ and its object-oriented capabilities made programming in Windows practical. Later, Visual Basic had just enough object-oriented capabilities to make programming in Windows not only possible but also easy.

Java's rise started with the internet and the web, but was also popular because it was "not from Microsoft". In this, Java is an outlier: not driven by a technical platform, but by emotion.

MacOS and iOS raised the popularity of Objective-C, at least until Apple announced Swift as the successor language for development. After that announcement, Objective-C dropped in popularity (and Swift started its rise).

Cloud computing and large data sets ("big data") have given us new languages for data management.

Looking at these data points, it seems that different platforms are favorable to different languages, and the popularity of the platform drives the popularity of the language.

Which is not to say that a language cannot achieve popularity without a platform. Some languages have, but they are few.

Some languages have failed to gain popularity despite other, non-platform inducements. The two largest "failures" that come to mind are PL/I and Ada. PL/I was invented by IBM and enjoyed some popularity, but has faded into all-but-obscurity. Ada was mandated by the US Department of Defense as a standard for all applications, and it, too, has faded.

If IBM at the height of its market control, and the US DoD through mandate, could not make languages popular, what else (besides platforms) can?

If my theory is correct, then developers and managers should consider platforms when selecting programming languages. Often, developers pick what they know (not an unreasonable choice) and managers pick what is "safe" (also not unreasonable). If the platform is conducive to the language, the decision is sound. If the platform is changing, a different language may be the better option.

Wednesday, October 19, 2016

We prefer horizontal layers, not vertical stacks

Looking back at the 60-plus years of computer systems, we can see a pattern of design preferences. That pattern is an initial preference for vertical design (that is, a complete system from top to bottom) followed by a change to a horizontal divide between a platform and applications on that platform.

A few examples include mainframe computers, word processors, and smart phones.

Mainframe computers, in the early part of the mainframe age, were special-purpose machines. IBM changed the game with its System/360, which was a general-purpose computer. The S/360 could be used by commercial, scientific, or government organizations. It provided a common platform upon which application programs ran. The design was revolutionary, and it has stayed with us. Minicomputers followed the "platform and applications" pattern, as did microcomputers and later IBM's own Personal Computer.

When we think of the phrase "word processor", we think of software, most often Microsoft's "Word" application (which runs on the Windows platform). But word processors were not always purely software. The original word processors were smart typewriters, machines with enhanced capabilities. In the mid-1970s, a word processor was a small computer with a keyboard, display, processing unit, floppy disks for storage, a printer, and software to make it all go.

But word processors as hardware did not last long. We moved away from the all-in-one design. In its place we used the "application on platform" approach, using PCs as the hardware and a word processing application program.

More recently, smart phones have become the platform of choice for photography, music, and navigation. We have moved away from cameras (a complete set of hardware and software for taking pictures), moved away from MP3 players (a complete set of hardware and software for playing music), and moved away from navigation units (a complete set of hardware and software for providing directions). In their place we use smart phones.

(Yes, I know that some people still prefer discrete cameras, and some people still use discrete navigation systems. I myself still use an MP3 player. But the number of people who use discrete devices for these tasks is small.)

I tried thinking of single-use devices that are still popular, and none came to mind. (I also tried thinking of applications that ran on platforms that moved to single-use devices, and also failed.)

It seems we have a definite preference for the "application on platform" design.

What does this mean for the future? For smart phones, possibly not much -- other than that they will remain popular until a new platform arrives. For the "internet of things", it means that we will see a number of task-specific devices such as thermostats and door locks until an "internet of things" platform comes along, and then all of those task-specific devices will become obsolete (like the task-specific mainframes and word processor hardware before them).

For cloud systems, perhaps the cloud is the platform and the virtual servers are the applications. Rather than discrete web servers and database servers, the cloud is the platform for web server and database server "applications", containerized versions of the software. If the "application on platform" pattern holds, cloud and containers will endure for some time, and they are a good architectural choice.