Sunday, April 10, 2016

Complexity of programming languages

Are programming languages becoming more or less complex?

It is a simple question, and like many simple questions, the answer is not so simple.

Let's look at a simple program in some languages. The simple program will print the numbers from 1 to 10. Here are programs in FORTRAN, BASIC, Pascal, C, C++, C#, and Python:

FORTRAN 66 (1966)

      DO 100 I = 1, 10
100   WRITE (6, 200) I
200   FORMAT (I5)
      STOP
      END

BASIC (1965)

10 FOR I = 1 TO 10
20 PRINT I
30 NEXT I
99 END

Pascal (1970)

program hello;
var
  i: integer;
begin
  for i := 1 to 10 do
    WriteLn(i);
end.

C (1978)

#include <stdio.h>

int main(void)
{
    int i;

    for (i = 1; i <= 10; i++)
        printf("%d\n", i);
}

C++ (1983)

#include <iostream>

int main()
{
    for (unsigned int i = 1; i <= 10; i++)
        std::cout << i << std::endl;

    return 0;
}

C# (2000)

using System;

class Program
{
    public static void Main(string[] args)
    {
        for (int i = 1; i <= 10; i++)
            Console.WriteLine(i);
    }
}

Python (2000)

for i in range(1, 11):
    print(i)

From this small sampling, a few things are apparent.

First, the programs vary in length. Python is the shortest with only two lines. It's also a recent language, so are languages becoming more terse over time? Not really, as FORTRAN and BASIC are the next shortest languages (with 5 and 4 lines, respectively) and C#, a contemporary of Python, requires 10 lines.

Second, the formatting of program statements has changed. FORTRAN and BASIC, the earliest languages in this set, have strong notions about columns and lines. FORTRAN limits statements to 72 characters per line, reserves the first five columns for statement labels, and uses column six for a continuation character (which allows a statement to span multiple lines). BASIC relaxed the column restrictions but added the requirement of a line number on each line. Pascal, C, C++, and C# care nothing about columns and lines, relying on tokens in the code to separate statements. Python relies on indentation to define blocks.
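The Python rule is easy to see in a short example. In the brace languages, indentation is decoration; in Python, it determines which block a statement belongs to. A minimal sketch:

```python
# The print sits inside the loop body, so it runs on every iteration.
total = 0
for i in range(1, 4):
    total += i
    print(total)   # prints 1, then 3, then 6

# Dedent the print and it moves outside the loop: it runs once.
total = 0
for i in range(1, 4):
    total += i
print(total)       # prints 6
```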

Some languages (BASIC, C#) are capable of printing things by simply mentioning them. Other languages need specifications: FORTRAN has FORMAT statements to specify the exact format of the output, and C has a printf() function that needs similar formatting information. I find the mechanisms of BASIC and C# easier to use than those of FORTRAN and C.
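Python, for what it's worth, offers both styles in one language: a bare print() call formats the value on its own, while a format specification gives FORTRAN- or C-style control over the output. A small sketch:

```python
i = 42

# Bare mention: print() formats the value itself.
print(i)              # 42

# Explicit specification: "5d" right-aligns the number in a
# five-character field, much like FORTRAN's I5 edit descriptor
# or C's printf("%5d", i).
print(f"{i:5d}")      # prints "   42"
print("%5d" % i)      # printf-style equivalent, same output
```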

Let's consider a somewhat more complex program, one that lists a set of prime numbers. We'll look at BASIC and Lua, which span the computing age.

Lua

local N = 100
local M = 10
function PRIME()  -- PROCEDURE DECLARATION;
  local X, SQUARE, I, K, LIM, PRIM -- DECLARATION OF VARIABLES;
  local P, V = {}, {}
  P[1] = 2 -- ASSIGNMENT TO FIRST ELEMENT OF p;
  print(2) -- OUTPUT A LINE CONTAINING THE NUMBER 2;
  X = 1
  LIM = 1
  SQUARE = 4
  for I = 2, N do -- LOOP. I TAKES ON 2, 3, ... N;
    repeat -- STOPS WHEN "UNTIL" CONDITION IS TRUE;
      X = X + 2
      if SQUARE <= X then
        V[LIM] = SQUARE
        LIM = LIM + 1
        SQUARE = P[LIM] * P[LIM]
      end
      local K = 2
      local PRIM = true
      while PRIM and K < LIM do
        if V[K] < X then
          V[K] = V[K] + P[K]
        end
        PRIM = X ~= V[K]
        K = K + 1
      end
    until PRIM -- THIS LINE CLOSES THE REPEAT
    P[I] = X
    print(X)
  end
end
PRIME()


BASIC

100 LET N = 100
110 LET M = 10
200 DIM P(100), V(100)
300 LET P(1) = 2
310 PRINT P(1)
320 LET X = 1
330 LET L = 1
340 LET S = 4
350 FOR I = 2 TO N
360  REM    repeat -- STOPS WHEN "UNTIL" CONDITION IS TRUE;
370   LET X = X + 2
380   IF S > X THEN 420
390    LET V(L) = S
400    LET L = L + 1
410    LET S = P(L)^2
420   REM
430   LET K = 2
440   LET P = 1
450   REM while PRIM and K < LIM do
455   IF P <> 1 THEN 520
460   IF K >= L THEN 520
470    IF V(K) >= X THEN 490
480     LET V(K) = V(K) + P(K)
490    REM
500    LET P = 0
501    IF X = V(K) THEN 510
502    LET P = 1
510    LET K = K + 1
515   GOTO 450
520  IF P <> 1 THEN 360
530  LET P(I) = X
540  PRINT X
550 NEXT I
999 END

Both programs work, and they produce identical output. The Lua version may be a bit easier to read, given that its variable names can be longer than a single letter.

They are also about the same size and complexity: two versions of one program, one from today and one from the early years of computing, with similar complexity.

Programming languages encode operations into pseudo-English instructions. If the measure of a programming language's capability is its capacity to represent operations in a minimal number of steps, then this example shows that programming languages have not changed over the past five decades.
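As a further data point, the same algorithm translates into Python at roughly the same length. This is my own translation, not part of the original comparison; the dictionaries stand in for Lua's 1-based tables so the indices match line for line:

```python
def primes(n):
    """Return the first n primes, using the same multiples-table
    algorithm as the Lua and BASIC versions above."""
    p = {1: 2}        # primes found so far (1-based, like Lua's P)
    v = {}            # current multiples of earlier primes (Lua's V)
    result = [2]
    x, lim, square = 1, 1, 4
    for i in range(2, n + 1):
        prim = False
        while not prim:               # the repeat ... until PRIM loop
            x += 2
            if square <= x:
                v[lim] = square
                lim += 1
                square = p[lim] * p[lim]
            k, prim = 2, True
            while prim and k < lim:
                if v[k] < x:
                    v[k] += p[k]
                prim = x != v[k]
                k += 1
        p[i] = x
        result.append(x)
    return result

print(primes(10))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```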

Caution is advised. These examples (printing "hello" and calculating prime numbers) may be poor representatives of typical programs. Perhaps we should withhold judgement until we consider more (and larger) programs. After all, very few people in 2016 use BASIC; there must be a reason they have selected modern languages.

Perhaps it is better to keep asking the question and examining our criteria for programming languages.

Saturday, January 7, 2012

Predictions for 2012


Happy new year!

The turning of the year provides a time to pause, look back, and look ahead. Looking ahead can be fun, since we can make predictions.

Here are my predictions for computing in the coming year:

With the rise of mobile apps, we will see changes in project requirements and in the desires of candidates.

The best talent will work on mobile apps. The best talent will -- as always -- work on the "cool new stuff". The "cool new stuff" will be mobile apps. The C#/.NET and Java applications will be considered "that old stuff". Look for the bright, creative programmers and designers to flock to companies building mobile apps. Companies maintaining legacy applications will have to hire the less enthusiastic workers.

Less funding for desktop applications. Desktop applications will be demoted to "legacy" status. Expect a reduced emphasis on their staffing. These projects will be viewed as less important to the organization, and will see less training, less tolerance for "Fast Company"-style project teams, and lower compensation. Desktop projects will be the standard, routine, bureaucratic (and boring) projects of classic legacy shops. The C# programmers will be sitting next to, eating lunch with, and reminiscing with, the COBOL programmers.

More interest in system architects. Mobile applications are a combination of front end apps (the iPhone and iPad apps) and back-end systems that store and supply data. Applications like Facebook and Twitter work only because the front end app can call upon the back end systems to obtain data (updates submitted by other users). Successful applications will need people who can visualize, describe, and lead the team in building mobile applications.

More interest in generalists. Companies will look to bring on people skilled in multiple areas (coding, testing, and user interfaces). They will be less interested in specialists who know a single area -- with a few exceptions of the "hot new technologies".

Continued fracturing of the tech world. Amazon.com, Apple, and Google will continue to build their walled gardens of devices, apps, and media. Music and books available from Amazon.com will not be usable in the Apple world (although available on the iPod and iPad in the Amazon.com Kindle app). Music and books from Apple will not be available on Amazon.com Kindles and Google devices. Consumers will continue to accept this model. (Although, like 33 RPM LPs and 45 RPM singles, consumers will eventually want their music and books on multiple devices. But that is a year or two away.)

Cloud computing will be big, popular, and confused. Different cloud suppliers offer different types of cloud services. Amazon.com's EC2 offering is a set of virtual machines that allow one to "build up" from there, installing operating systems and applications. Microsoft's Azure is a set of virtual machines with Windows/.NET, on which one may build applications starting at a higher level than Amazon's offering. Salesforce.com offers a cloud platform that lets one build applications at an even higher level. Lots of folks will want cloud computing, and vendors will supply it -- in the form that the vendor offers. When people from different "clouds" meet, they will be confused to find that the "other guy's cloud" is different from theirs.

Virtualization will fade into the background. It will be useful in large shops, and it will not disappear. It is necessary for cloud computing. But it will not be the big star. Instead, it will be a quiet, necessary technology, joining the ranks of power management, DASD management, telecommunications, and network administration. Companies will need smart, capable people to make it work, but they will be reluctant to pay for them.

Telework will exist, quietly. I expect that the phrase "telework" will be reserved for traditional "everyone works in the office" companies that allow some employees to work in remote locations. For them, the default will be "work in the office" and the exception will be "telework". In contrast, small companies (especially start-ups) will leverage faster networks, chat and videoconferencing, mobile devices, and social networks. Their standard mode of operation will be "work from wherever" but they won't think of themselves as offering "telework". From their point of view, it will simply be "how we do business", and they won't need a word to distinguish it. (They may, however, create a word to describe folks who insist on working in company-supplied space every day. Look for new companies to call these people "in-house employees" or "residents".)

Understand the sea change of the iPad. The single-app interface works for people consuming information. The old-fashioned multi-windowed desktop interface works for people composing and creating information. This change leads to a very different approach to the design of applications. This year people will understand the value of the "swipe" interface and the strengths of the "keyboard" interface.

Voice recognition will be the hot new tech. With the success of "Siri" (and Android's voice recognizer "Majel"), expect interest in voice recognition technology and apps designed for voice.

Content delivery becomes important. Content distributors (Amazon.com, Google, and Apple) become more important, as they provide exclusive content within their walled gardens. The old model of a large market in which anyone can write and sell software will change to a market controlled by the delivery channels. The model becomes one similar to the movie industry (a few studios producing and releasing almost all movies) and the record industry (a few record labels producing and releasing almost all music) and the book industry (a few publishing houses... you get the idea).

Content creation becomes more professional. With content delivery controlled by the few major players, the business model becomes less "anyone can put on a show" and more of "who do you know". Consumers and companies will have higher expectations of content and the abilities of those who prepare it.

Amateur producers will still exist, but with less perceived value. Content that is deemed "professional" (that is, for sale on the market) will be developed by professional teams. Other content (such as day-to-day internal memos and letters) will still be composed by amateur creators; the typical office worker equipped with a word processor, a spreadsheet, and e-mail will be viewed as less important, since that work produces no revenue.

Microsoft must serve content providers and enable professional creators. Microsoft's business has been to supply tools to amateur content creators. Their roots of BASIC, DOS, Windows, Office, and Visual Basic let anyone (with or without specific degrees or certifications) create content for the market. With the rise of the "professional content creator", expect Microsoft to supply tools labeled (and priced) for professionals.

Interest in new programming languages. Expect a transition from the object-oriented languages (C++, Java, C#) to a new breed of languages that introduce ideas from functional programming. Languages such as Scala, Lua, Python, and Ruby will gain in popularity. C# will have a long life -- but not the C# we know today. Microsoft has added functional programming capabilities to the .NET platform, and modified the C# language to use them. C# will continue to change as Microsoft adapts to the market.

The new year brings lots of changes and a bit of uncertainty, and that's how it should be.