Monday, June 26, 2017

The humble PRINT statement

One of the first statements we learn in any language is the "print" statement. It is the core of the "Hello, world!" program. We normally learn it and then discard it, focusing our efforts on database and web service calls.

But the lowly "print" statement has its uses, as I was recently reminded.

I was working on a project with a medium-sized C++ application. We needed information to resolve several problems, information that would normally be available from the debugger and from the profiler. But the IDE's debugger was not usable (executing under the debugger would require a run time of about six hours) and the IDE did not have a profiler.

What to do?

For both cases, the PRINT statement (the "fprintf()" function, actually, as we were using C++) was the thing we needed. A few carefully placed statements allowed us to capture the necessary information, make decisions, and resolve the problems.

The process wasn't that simple, of course. We needed several iterations, adding and removing PRINT statements in various locations. We also captured counts and timings of various functions.
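
To make that concrete, here is a minimal sketch in C++ -- not the project's actual code, and the function and variable names are invented -- showing how a single fprintf() line can capture values, counts, and timings:

#include <cstdio>
#include <chrono>

// Hypothetical instrumentation: log one line per call with the inputs
// and the elapsed time, then analyze the output offline.
void process_batch(int batch_id, int record_count)
{
    auto start = std::chrono::steady_clock::now();

    // ... the real work would happen here ...

    auto elapsed_ms = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - start).count();

    fprintf(stderr, "process_batch: id=%d records=%d elapsed_ms=%lld\n",
            batch_id, record_count, static_cast<long long>(elapsed_ms));
}

int main()
{
    process_batch(1, 1000);
    return 0;
}

Redirect stderr to a file, and a few thousand of these lines become a crude but effective log that is easy to sort and summarize.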

The effort was worth it.

PRINT statements (or "printf()", or "print()", or "puts()", whatever you use) are useful tools. Here's how they can help:
  • They can capture values of internal variables and state when the debugger is not available.
  • They can capture lots of values of variables and state, for analysis at a level higher than the debugger's interactive view. (Consider trying to spot a trend across several thousand values in a debugger.)
  • They can capture performance when a profiler is not available.
  • They can extract information from the "release" version of software, because sometimes the problem doesn't occur in "debug" mode.
They may be simple, but they are useful. Keep PRINT statements in your toolbox.

* * * * *

I was uncertain about the title for this column. I considered the C/C++ form of the generic statement ('printf()'). I also considered the general form used by other languages ('print()', 'puts()', 'WriteLine()'). I settled on BASIC's form of PRINT -- all capitals, no parentheses. All popular languages have such a statement; in the end, I suspect it matters little. Use what is best for you.

Sunday, June 18, 2017

Three models of computing

Computing comes in different flavors. We're probably most familiar with personal computers and web applications. Let's look at the models used by different vendors.

Apple has the simplest model: devices that compute. Apple has built its empire on high-quality personal computing devices. They do not offer cloud computing services. (They do offer their "iCloud" backup service, which is an accessory to the central computing of the iMac or MacBook.) I have argued that this model is the same as personal computing in the 1970s.

Google has a different model: web-based computing. This is obvious in their Chromebook, which is a lightweight computer that can run a browser -- and nothing else. All of the "real" computing occurs on the servers in Google's data center. The same approach is visible in most of the Google Android apps -- lightweight apps that communicate with servers. In some ways, this model is an update of the 1970s minicomputer model, with terminals connected to a central processor.

Microsoft has a third model, a hybrid of the two. In Microsoft's model, some computing occurs on the personal computer and some occurs in the data center. It is the most interesting of the three, requiring communication and coordination between the two components.

Microsoft did not always have their current approach. Their original model was the same as Apple's: personal computers as complete and independent computing entities. Microsoft started with implementations of the BASIC language, and then sold PC-DOS to IBM. Even early versions of Windows were for stand-alone, independent PCs.

Change to that model started with Windows for Workgroups, and became serious with Windows NT, domains, and Active Directory. Those three components allowed for networked computing and distributed processing. (There were network solutions from other vendors, but this set of products marked a change in Microsoft's strategy.)

Today, Microsoft offers an array of services under its "Azure" brand. Azure provides servers, message queues, databases, and other services, all hosted in its cloud environment. These services let individuals and companies create applications that combine PC and cloud technologies, performing some computing on the local PC and some in the Azure cloud. You can, of course, build an application that runs completely on the PC, or completely in the cloud. That you can build any of these shows the flexibility of the Microsoft platform.

I think this hybrid model, combining local computing and server-based computing, has the best potential. It is more complex, but it can handle a wider variety of applications than either the PC-only solution (Apple's) or the server-only solution (Google's). Look for Microsoft to support this model with development tools, operating systems, and communication protocols and libraries.

Looking forward, I can see Microsoft working on a "fluid" model of computing, where some processing can move from the server to the local PC (for systems with powerful local PCs) and from the PC to the server (for systems with limited PCs).
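
To make the idea concrete, here is a small sketch in C++ -- entirely my own invention, with hypothetical names, not anything Microsoft has announced -- of a task that decides at run time where it should execute:

#include <cstdio>

// sum_squares() is the work itself; run_on_server() stands in for an RPC
// or HTTP call to a cloud service; local_machine_is_powerful() stands in
// for a capability check (CPU count, battery state, and so on).
long sum_squares(long n)
{
    long total = 0;
    for (long i = 1; i <= n; ++i)
        total += i * i;
    return total;
}

long run_on_server(long n)
{
    printf("(pretending to call a server)\n");
    return sum_squares(n);
}

bool local_machine_is_powerful()
{
    return true;
}

int main()
{
    long n = 10000;
    long result = local_machine_is_powerful() ? sum_squares(n)
                                              : run_on_server(n);
    printf("sum of squares up to %ld = %ld\n", n, result);
    return 0;
}

The interesting engineering is in that one decision: the same task, packaged so that it can run on either side of the network.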

Many things in the IT realm started in a "fixed" configuration, and over time have become more flexible. I think processing is about to join them.

Wednesday, June 14, 2017

Evangelism from Microsoft

Microsoft has designated certain employees as "evangelists": people knowledgeable in the details of specific products and competent at presentations.

It strikes me that the folks in the evangelist role were mostly, well, preaching to the choir. They would appear at events where one would expect Microsoft users, fans, and enthusiasts to gather.

I'm not sure that Microsoft needs them, and it seems that Microsoft is coming to the same conclusion. A recent blog post on MSDN seems to indicate that the Developer Evangelist group is being disbanded. (The post is vague.)

Can Microsoft compete (and survive) without the evangelist team? I believe that they can.

First, I believe that Satya Nadella is confident in his position as CEO of Microsoft, and that confidence flows down to the entire company.

Second, I believe that Microsoft has confidence in the direction of its products and services. It has ceased being the "Windows company" in which everything revolved around Windows. Today, Microsoft has embraced outside technologies (notably open source) and built out its cloud services (Azure), and it competes successfully in those markets.

In short, Microsoft feels good about its current position and its future.

With such confidence in its products and services, Microsoft doesn't need the reassurance of evangelists. Perhaps they were there to tell Microsoft -- not customers -- that its products were good. Now Microsoft believes it without their help.

Sunday, June 11, 2017

Apple's Files App is an admission of imperfection

When Apple introduced the iPhone, they introduced not just a smart phone but a new approach to computing. The iPhone offered a new, simpler experience for the user. The iPhone (and iOS) did away with much of the administrative work of PCs. It eliminated the notion of user accounts and administrator accounts. Updates were automatic and painless. Apps knew how to get their data. The phone "just worked".

The need for a Files app is an admission that the iPhone (and iPad) experience does not meet those expectations. It raises the hood and allows the user to meddle with some of the innards of the device. One explanation for its existence is that apps cannot always find the files they need, and the Files app lets you (the user) find those files.

Does anyone see the irony in making the user do the work that the computer should do? Especially a computer from Apple?

To be fair, Android has had file manager apps for years, so the Android experience does not meet those expectations either. Microsoft's Surface tablets, starting with the first one, have had Windows Explorer built in, so they too fail to provide the new, simpler experience.

A curmudgeon might declare that the introduction of the Files App shows that even Apple cannot provide the desired user experience, and if Apple can't do it then no one can.

I'm not willing to go that far.

I will say that the original vision of a simple, easy-to-use, reliable computing device still holds. It may be that the major players have not delivered on that vision, but that doesn't mean the vision is unobtainable.

It may be that the iPhone (and Android) are steps in a larger process, one starting with the build-it-yourself microcomputers of the mid-1970s, passing through IBM PCs with DOS and later PC compatibles with Windows, and arriving (for now) at smartphones and tablets. Perhaps we will see a new concept in personal computing, one that improves upon the iPhone experience. It may be as different from iOS and Android as those operating systems are from Windows and MacOS. It may be part of the "internet of things" and expand personal computing to household appliances.

I'm looking forward to it.

Monday, June 5, 2017

Better programming languages let us do more -- and less

We tend to think that better programming languages let us programmers do more. Which is true, but it is not the complete picture.

Better languages also let us do less. They remove capabilities. In doing so, they remove opportunities for error.

PL/I was better than COBOL and FORTRAN because it let us write free-form source code. In COBOL and FORTRAN, the column in which code appeared was significant. The restrictions came from the technology of the time (punch cards), but once in the language they were difficult to remove.

BASIC was better than FORTRAN because it eliminated FORMAT specifications. FORMAT specifications were necessary to parse input data and format output data. They were precise, opaque, and easy to get wrong. BASIC, having no such specifications, removed that entire class of errors. BASIC also fixed the DO loops of FORTRAN and removed restrictions on subscript form. (In FORTRAN, a subscript could not be an arbitrary expression but had to have the form A*B+C. Any component could be omitted, so A+C was allowed, as was A*B. But you could not use A+B+C or A/2.)
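
Modern languages still carry a cousin of FORMAT in C's printf() format strings, so a short C++ fragment (an analogy only, not FORTRAN) shows how easily such specifications go wrong:

#include <cstdio>

int main()
{
    double price = 3.14;

    // Wrong specifier: %d expects an int, but a double is passed. The
    // compiler may only warn, and the output is garbage -- the same kind
    // of silent mistake that FORMAT specifications invited.
    printf("price: %d\n", price);

    // Correct specifier.
    printf("price: %f\n", price);
    return 0;
}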

Pascal was better than BASIC because it limited the use of GOTO statements. In BASIC, you could use a GOTO to transfer control to any other part of the program, including into and out of loops or subroutines. That made for "spaghetti code", which was difficult to understand and debug. Pascal put an end to that with a constrained form of GOTO.
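
For a rough flavor in C++ (whose goto, like Pascal's, is already confined to a single function), even a little label-hopping is harder to follow than a plain loop:

#include <cstdio>

int main()
{
    int i = 0;
top:
    if (i >= 5) goto done;   // exit condition, expressed as a jump
    printf("i = %d\n", i);
    ++i;
    goto top;                // back to the 'loop'
done:
    printf("finished\n");
    return 0;
}

Now imagine the BASIC version, where the jumps could land anywhere in the program, not just within one routine.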

Java eliminated the need for the explicit 'delete' or 'free' operations on allocated memory. You cannot forget the 'delete' operation -- you can't write one at all! The internal garbage collector recycles memory. In Java, it is much harder to create memory leaks than in C++ and C.
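
Here is a minimal C++ sketch (with invented names) of the kind of leak that is one forgotten line away -- a line that simply does not exist in Java:

#include <cstdio>
#include <string>

void process_record()
{
    std::string* buffer = new std::string("record data");
    printf("processing: %s\n", buffer->c_str());
    // Omit the next line and the allocation leaks. In Java there is no
    // 'delete' to forget; the garbage collector reclaims the object.
    delete buffer;
}

int main()
{
    process_record();
    return 0;
}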

Python forces us to consider indentation as part of the code. In C, C++, Java, and C#, you can write:

initialize();

if (some_condition)
    do_something();
    do_another_thing();

complete_the_work();

But the code acts in a way you may not expect: despite the indentation, do_another_thing() is not governed by the 'if' -- only the first statement is, so do_another_thing() executes every time. Python's use of indentation to specify code organization makes the code clearer. The Python code:

initialize()

if some_condition:
    do_something()
    do_another_thing()

complete_the_work()

does what you expect: both do_something() and do_another_thing() run only when some_condition is true.

New programming languages do provide new capabilities. (Often, they are refinements to constructs and concepts that were implemented roughly in earlier programming languages.) A new programming language is a combination of new things we can do and old things we no longer need to do.

When considering a new language (or reviewing the current language for a project), keep in mind not only the things that a new language lets you do, but also the things that it won't let you do.

The Demise of Apple

Future historians will look back at Apple, point to a specific moment, and say "Here, at this point, is when Apple started its decline. This event started Apple's fall." That point will be the construction of their new spaceship-inspired headquarters.

Why do I blame their new building? I don't, actually. I think others -- those future historians -- will. They will get the time correct, but point to the wrong event.

First things first. What do I have against Apple's shiny new headquarters?

It's round.

Apple's new building is large, elegant, expensive, and ... the wrong shape. It is a giant circle, or wheel, or doughnut, and it works poorly with human psychology and perception. The human mind works better with a grid than with a circle.

Not that humans can't handle circular objects. We can, when they are small or distant. We have no problem with the moon being round, for example. We're okay with clocks and watches, and old-style speedometers in cars.

We're good when we can look at the entire circle. Watches and clocks are smaller than us, so we can view the entire circle and process it. (Clocks in towers, such as "Big Ben" in London or the clock in the center of town, are also okay, since we view them from a distance and they appear small.)

The problems occur when we are inside the circle, navigating along the circumference. We're not good at keeping track of gradual changes in direction. (That is possibly why so many people get lost in the desert: they travel in a circle without realizing it.)

Apple's building looks nice from above. I suspect the experience of working inside the building will be one of modest confusion and discomfort, possibly at such a minor level that people do not realize anything is wrong. But the discomfort will add up, and eventually people will rebel.

It's ironic that Apple, the company that designs and builds products with the emphasis on "easy to use", got the design of their building wrong.

So it may be that historians, looking at Apple's (future) history, blame the design of the new headquarters for Apple's (future) failures. They will (rightly) point to the low-level confusion and the extra mental processing required to navigate such a building as a drain on Apple's creativity and effectiveness.

I think that they (the historians) will be wrong.

The building is a problem, no doubt. But it won't cause Apple's demise. The true cause will be overlooked.

That true cause? It is Apple's fixation on computing devices.

Apple builds (and sells) computers. They are the sole company that has survived from the 1970s microcomputer age. (Radio Shack, Commodore, Cromemco, Sol, Northstar, and the others left the market decades ago.) In that age, microcomputers were stand-alone devices -- there was no internet, no ethernet, no communication aside from floppy disks and a few on-line bulletin board systems (BBS) that required acoustic coupler modems. Microcomputers were "centers of computing" and they had to do everything.

Today, computing is changing. The combination of fast and reliable networks, cheap servers, and easy virtual machines allows the construction of cloud computing, where processing is split across multiple processors. Google is taking advantage of this with its Chromebooks, which are low-end laptops that run a browser and little else. The "real" processing is performed not on the Chromebook but on web servers, often hosted in the cloud. (I'm typing this essay on a Chromebook.)

All of the major companies are moving to cloud technology. Google, obviously, with Chromebooks and App Engine and Android devices. Microsoft has its Azure services and versions of Word and Excel that run entirely in the cloud, and they are working on a low-end laptop that runs a browser and little else. It's called the "Cloudbook" -- at least for now.

Amazon.com has its cloud services and its Kindle and Fire tablets. IBM, Oracle, Dell, HP, and others are moving tasks to the cloud.

Except Apple. Apple has no equivalent of the Chromebook, and I don't think it can provide one. Apple's business model is to sell hardware at a premium, providing a superior user experience to justify that premium. The superior user experience is possible with local processing and excellent integration of hardware and software. Apps run on the Macs, MacBooks, and iPhones. They don't run on servers.

A browser-only Apple laptop (a "Safaribook"?) would offer little value. The Apple experience does not translate to web sites.

When Apple does use cloud technology, they use it as an accessory to the PC. The processing for Siri is done in a big datacenter, but it's all for Siri and the user experience. Apple's iCloud lets users store data and synchronize it across devices, but it is simply a big, shared disk. Siri and iCloud make the PC a better PC, but they don't transform the PC.

This is the problem that Apple faces. It is stuck in the 1970s, when individual computers did everything. Apple has made the experience pleasant, but it has not changed the paradigm.

Computing is changing. Apple is not. That is what will cause Apple's downfall.