Thursday, May 28, 2020

After the quarantine, think

The 2020 quarantine, with its spate of "stay at home" orders and closure of offices, has enabled (forced?) many companies to implement "work from home" procedures that allow employees to, well, work from home. For some companies, this was a small change, as they already had procedures and infrastructure in place to allow employees to work from remote locations. For other companies, it was a big change.

As various parts of the country rescind the "stay at home" orders, companies are free to resume work as normal. It is my position that, instead of simply requiring employees to report to the office as before, companies (and their managers) should think about what is best for the company.

Companies now have experience with remote work. In the past, one reason to stay with "work at the office" (the traditional arrangement of everyone working in a single office) was that managers could not be sure that "work from home" (or, more generally, "work from anywhere") would work for the company. The lack of experience made such a change risky. That excuse is no longer valid. Companies now have several weeks of experience with remote work.

But I am not suggesting that companies blindly adopt "work from home" for all employees. Nor am I suggesting that companies abandon remote work and require employees to work in the office.

Instead, I recommend that managers review the performance of the past few weeks, identify the strengths and weaknesses of remote work, and agree on a plan for the future. Some companies may be happy with remote work and decide to continue with it. Other companies may revert to "work at the office". A third group will choose a middle ground, with some employees remote and others in the office, or remote work for a portion of the week.

I am sure that managers are aware of the costs of maintaining an office building, and will view remote work as a way to reduce those costs. Remote work also allows for expansion of the workforce without a corresponding expansion (and cost) of office space.

"Work in the office" on the other hand allows for all work to be done in a single location, which may make it easier to interact with people. Face-to-face communication is more effective than e-mail, voice phone, and video calls. A single office building also keeps the IT infrastructure in one place, with no need (or cost) for remote access and the accompanying security.

The 2020 pandemic and quarantine gave us information about remote work. It would be foolish for managers to ignore that information when deciding how to run their company.

Thursday, May 21, 2020

The lessons we programmers learn

We in the programming industry learn from our mistakes, and we create languages to correct our mistakes. Each "generation" of programming language takes the good aspects of previous languages, drops the bad aspects, and adds new, improved aspects. (Although we should recognize that "good", "bad", and "improved" are subjective.) Over time, our "programming best practices" are baked into our programming languages.

We learned that assembly language was specific to hardware and forced us to think too much about memory layouts, so we invented high-level languages. COBOL and FORTRAN allowed us to write programs that were portable across computers from different vendors and let us declare variables easily. (I won't say "memory management" here, as early high-level languages did not allow for dynamic allocation of memory the way C and C++ do.)

COBOL, FORTRAN, and BASIC (another high-level language) used GOTO statements and rudimentary IF statements for flow control. We learned that those statements lead to tangled code (some say "spaghetti code"), so we invented structured programming with its "if-then-else" and "do-while" statements. Pascal was one of the first languages to implement structured programming. (It retained the GOTO statement, but it was rarely needed.)
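To make the difference concrete, here is a small sketch. (I'll write it in Python for readability; imagine the first version as line-numbered BASIC.) The first function transliterates GOTO-style flow control, with a "line" variable standing in for the GOTO target; the second does the same job with one loop and one if-then-else.

    # GOTO-style flow, transliterated from line-numbered BASIC; the
    # "line" variable plays the role of the GOTO statement.
    def sum_positive_goto(values):
        total, i, line = 0, 0, 10
        while True:
            if line == 10:               # 10 IF I >= N THEN 50
                line = 50 if i >= len(values) else 20
            elif line == 20:             # 20 IF V(I) <= 0 THEN 40
                line = 40 if values[i] <= 0 else 30
            elif line == 30:             # 30 LET T = T + V(I)
                total += values[i]
                line = 40
            elif line == 40:             # 40 LET I = I + 1: GOTO 10
                i += 1
                line = 10
            elif line == 50:             # 50 RETURN
                return total

    # The structured version: one loop, one if-then-else, no jumps.
    def sum_positive_structured(values):
        total = 0
        for v in values:
            if v > 0:
                total += v
        return total

    print(sum_positive_goto([3, -1, 4]))        # 7
    print(sum_positive_structured([3, -1, 4]))  # 7

Even at this size, the structured version is easier to follow. At ten thousand lines, the jumps become spaghetti.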

Structured programming was better than non-structured programming, but it was not sufficient for large systems. We learned that large systems need more than if-then-else and do-while to organize the code, so we invented object-oriented programming. The programming languages C++, Java, and C# became popular.

Designing new languages seems like a built-in feature of the human brain. And designing new languages that use the best parts of the old languages while replacing the mistakes of the old languages seems like a good thing.

But this arrangement bothers me.

Programmers who learn the whole trail of languages, from assembly to BASIC to C++ to Java, understand the weaknesses of the early languages and the strengths of later languages. But these programmers are few. Most programmers do not learn the whole sequence; they learn only the current languages, which have pruned away all of the mistakes.

We programmers often look forward. We want the latest language, the newest database, the most recent operating system. In looking forward, we don't look back. We don't look at the older systems, and the capabilities that they had.

Those old systems (and languages) had interesting features. Their designers had to be creative to solve certain problems. Many of those solutions were discarded as hardware became more powerful and languages became more structured.

Is it possible that we have, in our enthusiasm to improve programming languages, discarded some ideas that are worthy? Have we thrown out a baby (or two) with the bathwater of poor language features?

If we don't look back, if we leave those abandoned features in the dust heap, how would we know?

Thursday, May 7, 2020

COBOL all the way down

Programming languages have changed over time. That's not a surprise. But what surprised me was one particular way in which languages have changed: the importance of libraries.

The first programming languages were designed to be complete. That is, a program or application built in one of those languages would use only that language, and nothing else.

Programs built in COBOL (usually financial applications) use COBOL and nothing else. COBOL was built to handle everything. A COBOL program is COBOL, all the way down. (COBOL programs can use SQL, which was fitted into COBOL in the 1970s, but SQL is an exception.)

We saw a change in later languages. FORTRAN, BASIC, Pascal, and C provided functions in their run-time libraries. Most of the application was written in the top-level language, with calls to functions to perform low-level tasks such as trigonometric calculations or string operations.
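A small illustration of that division of labor (sketched in Python rather than FORTRAN or C, but the shape is the same): the algorithm lives in user-written code, and the run-time library supplies only a low-level primitive.

    import math

    # The algorithm (the loop and the accumulation) is user-written code;
    # the run-time library contributes only the low-level sqrt() primitive.
    def root_mean_square(values):
        total = 0.0
        for v in values:
            total += v * v
        return math.sqrt(total / len(values))

    print(root_mean_square([3.0, 4.0]))  # 3.5355...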

The introduction of IBM's OS/2 and Microsoft's Windows also changed programming. The graphical operating systems provided a plethora of functions. There were functions for graphical output (to displays or printers), input devices (keyboards and mice), memory management, process management, and network functions. It was no longer sufficient to learn the language and its keywords; one had to learn the extra functions too.

Programming languages such as Java and C# provided more libraries and packages, and some of the libraries and packages handled nontrivial tasks. Libraries allowed for a collection of classes and functions, and packages allowed for a collection of libraries in a form that was easily deployed and updated. These additional packages required the programmer to know even more functions.

The trend has been not only an increase in the number of functions, but also in the capabilities and sophistication of library functions and classes. Programming is, more and more, about selecting libraries, instantiating classes, and invoking functions, and less and less about writing the functions that perform the work.

We can see this trend continue in recent languages. Many applications in Python and R use libraries for a majority of the work. In Python and R, the libraries do the work, and the code acts more like plumbing, connecting classes and functions.
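Here is a sketch of what that plumbing looks like, using the pandas library and a hypothetical "sales.csv" file (the file and its column names are invented for the example):

    import pandas as pd

    # "sales.csv", "region", and "amount" are invented for this example.
    df = pd.read_csv("sales.csv")                  # parsing: the library
    totals = df.groupby("region")["amount"].sum()  # grouping, math: the library
    totals.to_csv("totals_by_region.csv")          # output: the library

Three lines of plumbing; the real work (tokenizing the file, hashing the group keys, summing the columns) never appears in the user-written code.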

To put this succinctly:

Early programming languages assume that the processing of the application will occur in those languages. Libraries provide low-level operations, such as input and output. A COBOL application is a COBOL program with assistance from some libraries.

Recent programming languages assume that the processing will occur in libraries and not user-written code. The expectation is that libraries will handle the heavy lifting. A Python application is one or more libraries with some Python code to coordinate activities.

This change has profound implications for the future of programming, from system architecture to hiring decisions. It won't be enough to ask a candidate to write code for a linked list or a bubble sort; instead, one will ask about library capabilities. System design will depend more on libraries and less on programming languages.
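The interview question is a good example of the shift. Here is a sketch in Python: the first version is what "write a bubble sort" asks for; the second is what production code actually looks like.

    # What the classic interview question asks for: the algorithm by hand.
    def bubble_sort(items):
        items = list(items)
        for i in range(len(items)):
            for j in range(len(items) - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
        return items

    # What production code looks like: the library does the work.
    records = [("carol", 52), ("alice", 94), ("bob", 71)]
    by_score = sorted(records, key=lambda r: r[1], reverse=True)

    print(bubble_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
    print(by_score)  # [('alice', 94), ('bob', 71), ('carol', 52)]

The useful knowledge is no longer the loop; it is knowing that sorted() exists, accepts a key function, and runs in O(n log n) time.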