Saturday, April 3, 2010

The See-saw of Complexity

We've been using computers for more than fifty years. With over half a century of experience, what can we see?

Well, certainly different languages have risen and fallen in popularity -- popularity in terms of praise and fandom, as well as popularity in terms of sheer (if grudging) use. Modern languages such as C# and Ruby have large fan bases, and therefore a social popularity. COBOL has few fans, yet many organizations use it for many applications; COBOL retains a degree of utilitarian popularity.

In addition to rising and falling in popularity, languages have changed over time. Most have become more complex. This makes sense: as language designers add features, they strive to keep backwards compatibility. A clear example is C and C++. The initial C (K&R C) was a small, simple language. The ANSI C revision (which mandated prototypes and introduced the 'void' type) was somewhat more complex. If we consider C++ to be a revision of C (and Stroustrup initially called it "C with Classes"), then we see an additional step up in complexity. The trend continues with revisions to C++, which added templates and the STL, and now lambdas and other features in the forthcoming C++0x revision.
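To make the K&R-to-ANSI step concrete, here is a minimal sketch contrasting the two declaration styles; the function names are mine, chosen only for illustration:

```c
/* K&R-style definition: parameter types are declared between the
   parameter list and the body. Without a prototype, the compiler
   cannot check the types of a caller's arguments. */
int add_kr(a, b)
int a;
int b;
{
    return a + b;
}

/* ANSI C definition: types appear in the parameter list itself,
   and the same form serves as a checkable prototype. */
int add_ansi(int a, int b)
{
    return a + b;
}

/* ANSI C also introduced 'void' for functions that take or
   return nothing. */
void do_nothing(void)
{
}
```

Both definitions compile to the same code; the ANSI form simply lets the compiler catch mismatched calls that K&R C would silently accept.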

If we allow the C/C++ family to continue with Java and C#, we see a decrease in complexity. At the expense of compatibility, simpler languages succeeded the complex C++/STL language.

Let's look at another "track" of languages: the track that includes the Unix shell and scripting languages. I lump all of the Unix and Linux shells together as "shells", for simplicity. I want to include other languages on this track: awk, Perl, Python, and Ruby. These are the scripting languages. (I'm omitting VBscript and JavaScript as they are used -- mostly -- in specific domains.)

On the scripting language track, we see an increase in complexity, rising from the shells, peaking at Perl, and then declining with Python and Ruby. (Indeed, one of Ruby's selling points is that it is a simple, elegant language.) Some compatibility was kept, but not as strictly as with C and C++.

A third track, consisting of FORTRAN, COBOL, Algol, BASIC, and Pascal, is possible. This track is less cohesive than the previous two, as each language went through its own growth of complexity. Yet the trend is there: FORTRAN and COBOL at the beginning with a degree of complexity, Algol later with more complexity, BASIC as a return to simplicity (although growing into a more complex Microsoft Basic), and another return to simplicity in Pascal (which in turn grew in complexity in the form of Turbo Pascal and eventually Delphi). My description is not quite accurate, as some of these events overlapped. But let's allow some elasticity in time to give us the effect of moving between complex and simple.

Each track reveals a pattern. It seems that we, as humans, move from simple to complex and then back to simple. We don't start with simple and then grow to complexity and keep going. We oscillate around a certain level of complexity. (Seven plus or minus two, anyone?)

Perhaps this is not surprising. As we add features to a language (hacking them in, to maintain compatibility) we eventually see larger patterns. Someone codifies the patterns into a new language, but since it is a new language, it doesn't do everything that the old language does. But not for long, as the new language begins its trip up the complexity curve, adding features. There must be a point when a new simpler language (one that uses the new patterns) is more effective than the old language. At that point, developers start using the new language.

Interestingly, operating systems do not follow this trend. CP/M was a simple operating system, replaced by MS-DOS with more complexity. Successive versions of MS-DOS added complexity. OS/2 and Windows upped the complexity again. Later versions of Windows added complexity. For PC operating systems, and I think operating systems for all levels of hardware, we have never seen a move to simplicity.

What we have seen is a shift from one platform to another. From mainframes to minicomputers, from minicomputers to PCs, and from PCs to servers. Each shift saw a reduction in complexity, later followed by an increase.

One current shift is from PCs to smartphones. The initial smartphones (and pocket PCs) had very simple operating systems (consider the Palm Pilot). Even today, the iPhone OSX and the Microsoft Windows Phone versions are smaller versions of their desktop counterparts. Yet each new version ups the complexity.

A second current shift is to the cloud. Here again, we see that "cloud operating systems" are considered simple and perhaps not capable of business operations. Yet each new version ups the complexity.

None of this is to say that we must abandon our current software or hardware and jump onto the smartphone platform for all applications. Or the cloud. Minicomputers did not kill off mainframes, but expanded the space of possible computing solutions. PCs did the same.

Smartphones and cloud computing will expand the space of computing solutions. Existing applications will continue, and new applications will emerge, just as the previous new platforms made e-mail, desktop publishing, and spreadsheets possible.

Languages will continue to emerge and grow. Some will decline, and a few will fall into oblivion. Yet the popular languages of today will remain for many years, just as FORTRAN and COBOL remain.
