Showing posts with label platforms.

Wednesday, October 7, 2020

Platforms are not neutral

We like to think that our platforms (processor, operating system, and operating environment) are neutral, at least when it comes to programming languages. After all, why should a processor care if the code that it executes (machine code) was generated by a compiler for C#, or Java, or Fortran, or Cobol? Ditto for the operating system. Does Windows care if the code came from a C++ program? Does Linux care if the code came from Go?

And if processors and operating systems don't care about code, why should a platform such as .NET or the JVM care about code?

One could argue that the JVM was designed for Java, and that it has the data types and operations that are needed for Java programs and not other languages. That argument is correct: the JVM was built for Java. Yet people have built compilers that convert other languages to the JVM bytecode. That list includes Clojure, Lisp, Kotlin, Groovy, JRuby, and Jython. All of those languages run on the JVM. All of those languages use the same data types as Java.

The argument for .NET is somewhat different. The .NET platform was designed for multiple languages. When announced, Microsoft supplied not only the C# compiler but also compilers for C++, VB, and J#. Other companies have added compilers for many other languages.

But those experiences do not mean that the platforms are unbiased.

The .NET platform, and specifically the Common Language Runtime (CLR), was about interoperability. The goal was to allow programs written in different languages to work together: for example, to call a function in Visual Basic from a function in C++.

To achieve this interoperability, the CLR requires languages to use a common set of data types. These common types include 32-bit integers, 64-bit floating-point numbers, and strings. Prior to .NET, the different language compilers from Microsoft all had different ideas about numeric types and string types. C and C++ use NUL-terminated strings, but Visual Basic used length-prefixed (counted) strings. (Thus, a string in Visual Basic could contain NUL characters, but a string in C or C++ could not.) There were differences in floating-point values as well.

Notice that these types are aligned with the C data types. The CLR, and its agreement on data types, works for languages whose data types match C's. The .NET version of Visual Basic (VB.NET) had to change its data types in order to comply with the rules of the CLR. As a result, VB.NET was quite different from the previous Visual Basic.

The CLR works for languages that use C-style data types. It does support custom data types, which is nice, and necessary for languages that do not use C-style types, but custom types sacrifice interoperability, and interoperability was the major benefit of .NET.

The .NET platform favors C-style data types. (Namely binary integers, binary floating-point numbers, and simple strings of characters.)

The JVM also favors C-style data types.

Many languages use C-style data types.

What languages don't use C-style data types?

Cobol, for one. Cobol was developed prior to C, and it has its own ideas about data. It describes numeric values with PICTURE clauses, which can define both size limits and formatting. Some examples:

   05  AMOUNT1          PIC 999.99.
   05  AMOUNT2          PIC 99999.99.
   05  AMOUNT3          PIC 999999.99.
   05  AMOUNT4          PIC 99.99.

(The '05' at the front of each line is not part of the variable's name; it is a level number, indicating where the variable sits within a larger structure.)

These four variables are all numeric, but they do not align well with any of the CLR types. Thus, they cannot be exchanged directly with other programs in .NET.

There are compilers for Cobol that emit .NET modules. I don't know how they work, but I suspect that they either use custom types (which are not easily exchanged with modules from other languages) or they convert the Cobol-style data to a C-style value (which would incur a performance penalty).

Pascal has a similar problem with data types. Strings in Pascal are length-counted strings, not NUL-terminated strings. Pascal has "sets," which hold collections of values. The notion of a set translates poorly to other languages. C, C++, Java, and C# can use enums to do some of the work, but sets in Pascal are not quite enums.

Pascal also has definite ideas about memory management and pointers, and those ideas do not quite align with memory management in C (or .NET). With care, one can make it work, but Pascal is not a native .NET language any more than Cobol.

Fortran is another language that predates the .NET platform and doesn't work well on it. Fortran is a simpler language than Cobol or Pascal, and concentrates on numeric values. Its numeric types convert to the CLR numeric types, so compiling and exchanging data is possible.

Fortran's strength was speed. It was (and still is) one of the fastest languages for numeric processing. Its speed is due to its static memory layout, something I have not seen preserved in compilers that target .NET. Thus, Fortran on .NET loses its advantage. Fortran on .NET is not fast, and I fear it never will be.

Processors, too, are biased. Intel processors handle binary integers and binary floating-point values, but offer little support for BCD values. IBM S/360 processors (and their descendants) handle BCD data directly. (BCD data is useful for financial transactions because it avoids many of the issues with floating-point representations.)

Our platforms are biased. We often don't see that bias, most likely because we use only a single platform. (A single processor type, a single operating system, a single programming language.) The JVM and .NET platforms are biased toward C-style data types.

There are different approaches to data and to computation, and we're limiting ourselves by limiting our expertise. I suspect that in the future, developers will rediscover the utility of data types that are not C-style types, especially the PICTURE-specified numeric types of Cobol. As C and its descendants are ill-equipped to handle such data, we will see new languages with the new (old) data types.


Thursday, October 27, 2016

Actually, new platforms compel new languages

My previous post claimed that new platforms spur the adoption of new languages. The more I think about it, the more I believe I was wrong.

New platforms don't simply spur the adoption of new languages. They compel the adoption of new languages.

A platform offers a set of capabilities and a set of concepts. Languages are designed around those capabilities and concepts. Change the platform, and you change the capabilities and the concepts, and you need a different language.

For batch processing, COBOL and FORTRAN were acceptable. They didn't work for timesharing systems and they didn't work for microcomputers. Timesharing and microcomputers were interactive, and they needed a language like BASIC.

Windows and OS/2's Presentation Manager required a language that could handle event-driven processing, and object-oriented languages (first C++, later Visual Basic and Java) met that need.

Web applications needed a run-time system that was constantly present. We started web applications with Perl and C++ and quickly learned that the startup time for the programs was costing us performance. Java and C# load their run-time systems independently of the application program, and can keep the run-time in memory, which gives better performance.

Changing languages (and the mindset of the underlying platform) is a significant effort. One does not do it lightly, which is why large organizations tend to use older technology.

But where does this leave functional languages?

From my view, I see no platform that requires the use of functional languages. And without a compelling reason to use functional languages, I expect that we won't. Oh, functional languages won't go away; lots of developers use them. (I myself am a fan, although I tend to use Ruby or Python for my own projects.)

But functional languages won't become the popular languages of the day without a reason. Inertia will keep us with other languages.

At least, until a platform arrives that demands the capabilities of functional languages. That platform might be the "internet of things," although I expect the first versions will use the currently popular languages.

Functional languages offer increased reliability. It may be possible to prove certain programs correct, which will be of interest to government agencies, banks, and anyone in the security field. (Turing showed that we cannot prove correctness for arbitrary programs, but I believe that we can prove correct programs that are subject to a set of constraints. Functional languages may offer those constraints.)

I'm not abandoning functional languages. I like what they offer. Yet I recognize that they require an additional level of discipline (much like structured programming and object-oriented programming required additional discipline) and we will switch only when the benefits outweigh the cost.

Sunday, October 23, 2016

Platforms spur languages

We developers like to think that we make languages popular. We like to think that it is we who decide the fate of languages. I'm not so sure. It may be that we have less control than we believe.

I posit that it is platforms, that is, hardware and operating systems, that drive the popularity of programming languages.

The first programming languages, COBOL and FORTRAN, became popular only after computers became accepted in business and, more importantly, as platforms for running applications.

When computers were special-purpose devices, programs were written for the specific computer, usually in machine code or assembly language, with the idea that the program was part of the computer. The thought of moving an application from one computer to another was alien, like toasting bread with a refrigerator.

It was only after our mindset changed that we thought of moving programs. The idea of the computer as a platform, and the successive idea of moving a program from one computer to another, led to the idea of "portability" and languages common across computers. Thus, COBOL and FORTRAN were born.

Timesharing, a platform different from batch processing, gave us BASIC. Microcomputers in the late 1970s took BASIC to a higher level, making it one of the most popular languages at the time.

In the age of the IBM PC, computer programs became commercial, and BASIC was not suitable. BASIC offered no way to obscure source code, and its performance was limited. Most commercial programs were written in assembly language, a choice made possible by the uniformity of the PC.

Microsoft Windows drove C++ and Visual Basic. While it was possible to write applications for Windows in assembly language, it was tedious and maintenance was expensive. C++ and its object-oriented capabilities made programming in Windows practical. Later, Visual Basic had just enough object-oriented capabilities to make programming in Windows not only possible but also easy.

Java's rise started with the internet and the web, but was also popular because it was "not from Microsoft". In this, Java is an outlier: not driven by a technical platform, but by emotion.

MacOS and iOS raised the popularity of Objective-C, at least until Apple announced Swift as the successor language for development. After that announcement, Objective-C dropped in popularity (and Swift started its rise).

Cloud computing and large data sets ("big data") have given us new languages for data management.

Looking at these data points, it seems that different platforms are favorable to different languages, and the popularity of the platform drives the popularity of the language.

Which is not to say that a language cannot achieve popularity without a platform. Some languages have, but they are few.

Some languages have failed to gain popularity despite other, non-platform inducements. The two largest "failures" that come to mind are PL/I and Ada. PL/I was invented by IBM and enjoyed some popularity, but has faded into all-but-obscurity. Ada was mandated by the US Department of Defense as the standard for its applications, and it, too, has faded.

If IBM at the height of its market control, and the US DoD through mandate, could not make languages popular, what else (besides platforms) can?

If my theory is correct, then developers and managers should consider platforms when selecting programming languages. Often, developers pick what they know (not an unreasonable choice) and managers pick what is "safe" (also not unreasonable). If the platform is conducive to the language, the decision is sound. If switching platforms, a different language may be the better option.

Wednesday, October 19, 2016

We prefer horizontal layers, not vertical stacks

Looking back at the 60-plus years of computer systems, we can see a pattern of design preferences. That pattern is an initial preference for vertical design (that is, a complete system from top to bottom) followed by a change to a horizontal divide between a platform and applications on that platform.

A few examples include mainframe computers, word processors, and smart phones.

Mainframe computers, in the early part of the mainframe age, were special-purpose machines. IBM changed the game with its System/360, which was a general-purpose computer. The S/360 could be used by commercial, scientific, or government organizations. It provided a common platform upon which application programs ran. The design was revolutionary, and it has stayed with us. Minicomputers followed the "platform and applications" pattern, as did microcomputers and later IBM's own Personal Computer.

When we think of the phrase "word processor", we think of software, most often Microsoft's "Word" application (which runs on the Windows platform). But word processors were not always purely software. The original word processors were smart typewriters, machines with enhanced capabilities. In the mid-1970s, a word processor was a small computer with a keyboard, display, processing unit, floppy disks for storage, a printer, and software to make it all go.

But word processors as hardware did not last long. We moved away from the all-in-one design. In its place we used the "application on platform" approach, using PCs as the hardware and a word processing application program.

More recently, smart phones have become the platform of choice for photography, music, and navigation. We have moved away from cameras (a complete set of hardware and software for taking pictures), moved away from MP3 players (a complete set of hardware and software for playing music), and moved away from navigation units (a complete set of hardware and software for providing directions). In their place we use smart phones.

(Yes, I know that some people still prefer discrete cameras, and some people still use discrete navigation systems. I myself still use an MP3 player. But the number of people who use discrete devices for these tasks is small.)

I tried thinking of single-use devices that are still popular, and none came to mind. (I also tried thinking of applications that ran on platforms that moved to single-use devices, and also failed.)

It seems we have a definite preference for the "application on platform" design.

What does this mean for the future? For smart phones, possibly not so much -- other than they will remain popular until a new platform arrives. For the "internet of things", it means that we will see a number of task-specific devices such as thermostats and door locks until an "internet of things" platform comes along, and then all of those task-specific devices will become obsolete (like the task-specific mainframes or word processor hardware).

For cloud systems, perhaps the cloud is the platform and the virtual servers are the applications. Rather than discrete web servers and database servers, the cloud is the platform for web server and database server "applications": containerized versions of the software. The "application on platform" pattern suggests that cloud and containers will endure for some time, and that they are a good architectural choice.

Sunday, April 24, 2011

Single platform is so 90s

Ah, for those days of yesteryear when things were simple. The software market was easy: build your application to run under Windows and ignore everyone else.

Life is complicated these days. There are three major PC platforms: Windows, OSX, and Linux. Targeting an application to a single platform locks you out of the others. And while Windows enjoys a large share of desktops, one should not ignore a platform simply because its market share is small. All it takes to ruin your single-platform day is for a large potential customer to ask "Does it run on X?" where X is not your platform. Telling a customer "no" is not a winning strategy.

But life is even more complicated than three desktop platforms. Apple's success with iPhones and iPads, RIM's success with BlackBerry devices, and the plethora of Android phones (and soon tablets) have created more platforms for individuals. These new devices have not worked their way into the office (except for the BlackBerry) but they will soon be there. Androids and iPhones will be there shortly after a CEO wants to read e-mail and declines the IT-proffered BlackBerry.

The single-platform strategy is no longer viable. To be viewed as a serious contender, providers will have to support multiple platforms. They will have to support the desktop (Windows, OSX, and Linux), the web (Internet Explorer, Firefox, Safari, and Chrome), smartphones (iPhone, Android, and BlackBerry), and tablets (iPad and Android). Their services will have to make sense on these different platforms; stuffing your app into a browser window and claiming that it works everywhere won't do. Your customers won't buy it, and your competitors that do support the platforms will win the business.