Friday, April 4, 2014

What was true is no longer true

The IT world has truths, basic precepts that govern other decisions, but these precepts can change over time.

Early computers were expensive. Really expensive. So expensive that we took steps to ensure that computers were used efficiently, scheduling jobs to maximize the use of the hardware. A computer that was not being used was considered a waste of resources -- you had purchased (or leased) a computer larger than you needed.

Today computers are inexpensive and we care little about the CPU utilization rate. We leave computers idle with abandon. (Letting computers run and "do nothing" would horrify those early-days computer operators.) We leave computers running while we go to lunch (or dinner); we leave them running overnight with no active jobs.

The early days of computing saw expensive hardware and cheap programmers. Hardware and "computing time" were so precious that every moment counted. We could not afford to let the computer check our programs for errors; we asked programmers to "desk check" their programs for errors prior to compiling and running.

Today programmers are expensive and we gleefully let our compilers check for errors. Modern IDEs check for errors as we type. It is better to let the computer check our syntax several times each second than to force programmers to think about syntax.

The introduction of compilers (FORTRAN and COBOL, and other early languages) sparked a debate about efficiency. Compilers convert high-level statements into machine code, but a programmer skilled in assembly language could create programs that were more efficient than the code from early compilers.

Today, the reverse is true. The code generators in compilers have been improved and routinely emit efficient code. CPU architecture has become more complex -- so complex that only the most skilled programmer can outperform a compiler (and only by spending more time). Processors with multiple cores and multiple pipelines demand extreme attention to data alignment and memory access, and compilers are better than people at generating machine instructions.

Things that we know are true may be true today and false tomorrow. Our challenge is to recognize when the cost equation changes. Too often we learn a "truth" and keep it with us, even when reality changes. Then our hard-won, comforting knowledge leads us astray.

What fundamental truths (aside from those above) have become invalid? Quite a few. Here is a sample:

"C++ is slower and bulkier than pure C code" - possibly still true, but C++ compilers have improved to the point that it is hard to see the difference.

"Java is slower than C++" - Java compilers and run-time libraries have improved. C++ code may still run faster, but not by much. Java is faster to write, and the overall benefit may lie with Java.

"Personal computers are toys" - initially yes, they were little more than toys, incapable of serving business needs. Yet today almost every business uses personal computers in some capacity.

"Microsoft is the standard" - many open source projects have proven themselves competent (or even superior) alternatives to Microsoft tools. Many companies have adopted technologies other than Microsoft's.

"The Windows PC is the standard" - phones and tablets have challenged the dominance of the Windows PC.

"Databases are always relational" - Not true in the early days of computers; it became true in the 1980s with the rise of SQL. Now alternatives offer competent data storage.

Looking forward, we can see other maxims proven false:

"Source code must be text"

"Computers will always become cheaper"

"Computers will always become faster"

"Virtualized computers are cheaper"

"Linux is immune to malware"

"Windows is always vulnerable to malware"

What we believe to be true may indeed be true today -- or it may be false. Even if it is true, it may become false tomorrow. Best to be on our guard. The danger is not in what we know, but in what we believe.
