Wednesday, December 28, 2016

Moving to the cloud requires a lot. Don't be surprised.

Moving applications to the cloud is not easy. Existing applications cannot simply be dropped onto cloud servers and be expected to leverage the benefits of cloud computing. And this should not surprise people.

The cloud is a different environment than a web server. (Or a Windows desktop.) Moving to the cloud is a change in platform.

The history of IT has several examples of such changes. Each transition from one platform to another required changes to the code, and often changes to how we *think* about programs.

The operating system

The first changes occurred in the mainframe age. The very first was probably the shift from a raw hardware platform to hardware with an operating system. With raw hardware, the programmer has access to the entire computing system, including memory and devices. With an operating system, the program must request such access through the operating system. It was no longer possible to write directly to the printer; one had to request the use of each device. This change also saw the separation of tasks between programmers and system operators, the latter handling the scheduling and execution of programs. One could not use the older programs; they had to be rewritten to call the operating system rather than communicate directly with devices.

Timesharing and interactive systems

Timesharing was another change in the mainframe era. In contrast to batch processing (running one program at a time, each program reading and writing data as needed but with no direct interaction with the programmer), timeshare systems interacted with users. Timeshare systems saw the use of on-line terminals, something not available for batch systems. The BASIC language was developed to take advantage of these terminals. Programs had to wait for user input and verify that the input was correct and meaningful. While batch systems could merely write erroneous input to a 'reject' file, timeshare systems could prompt the user for a correction. (If they were written to detect errors.) One could not use a batch program in an interactive environment; programs had to be rewritten.

Minicomputers

The transition from mainframes to minicomputers was, interestingly, one of the simpler conversions in IT history. In many respects, minicomputers were smaller versions of mainframes. IBM minicomputers used the same batch processing model as IBM's mainframes. Minicomputers from manufacturers like DEC and Data General used interactive systems, following the lead of timeshare systems. In this case, it *was* possible to move programs from mainframes to minicomputers.

Microcomputers

If minicomputers allowed for an easy transition, microcomputers were the opposite. They were small and followed the path of interactive systems. Most ran BASIC from ROM, with no other languages available. The operating systems of the day (CP/M, MS-DOS, and a host of others) were limited and weak compared to today's, providing no hardware protection and no multitasking. Every program for microcomputers had to be written from scratch.

Graphical operating systems

Windows (and OS/2 and other systems, for those who remember them) introduced a number of changes to programming. The obvious difference between Windows programs and the older DOS programs was, of course, the graphical user interface. From the programmer's perspective, Windows required event-driven programming, something not available in DOS. A Windows program had to respond to mouse clicks and keyboard entries anywhere on the program's window, which was very different from the DOS text-based input methods. Old DOS programs could not be simply dropped into Windows and run; they had to be rewritten. (Yes, technically one could run the older programs in the "DOS box", but that was not really "moving to Windows".)
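
To see the shape of event-driven programming in a few lines, here is a rough sketch in Python, using tkinter as a stand-in for a windowing system (the handler names are mine, not the Windows API): the program registers handlers and hands control to an event loop, rather than reading input in sequence.

```python
# A minimal sketch of event-driven input handling, using Python's tkinter
# purely as a stand-in for a windowing system. Illustrative only; this is
# not the historical Windows API.
import tkinter as tk

def on_click(event):
    # Called whenever the mouse is clicked anywhere in the window.
    print(f"mouse click at ({event.x}, {event.y})")

def on_key(event):
    # Called for each key press, wherever it lands in the window.
    print(f"key pressed: {event.keysym}")

root = tk.Tk()
root.title("Event-driven sketch")
root.bind("<Button-1>", on_click)   # register the mouse handler
root.bind("<Key>", on_key)          # register the keyboard handler
root.mainloop()                     # the event loop drives everything
```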

Web applications

Web applications, with browsers and servers, HTML and "submit" requests, with CGI scripts and JavaScript and CSS and AJAX, were completely different from Windows "desktop" applications. The intense interaction of a window with fine-grained controls and events was replaced by coarse, large-scale requests, later supplemented by smaller AJAX and AJAX-like web service calls. The separation of the user interface (HTML, CSS, JavaScript, and the browser) from the "back end" (the server) required a complete rewrite of applications.
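
A minimal sketch of that request-and-response shape, using only Python's standard library (the page and field names are invented for illustration; CGI-era and modern stacks differ in the details): the browser fetches a page, and each "submit" arrives at the server as a separate, coarse-grained request.

```python
# A minimal sketch of the web request/response model using only the
# Python standard library. Illustrative only; real web applications of
# the era used CGI scripts, and modern ones use frameworks.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

PAGE = b"""<html><body>
<form method="POST" action="/">
  <input name="city"><input type="submit" value="Submit">
</form>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser asks for the page; the server returns HTML.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def do_POST(self):
        # The "submit" arrives as a separate, coarse-grained request.
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode())
        city = fields.get("city", [""])[0]
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(f"<html><body>You sent: {city}</body></html>".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), Handler).serve_forever()
```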

Mobile apps

Small screen. Touch-based. Storage on servers, not so much on the device. Device processor for handling input; main processing on servers.

One could not drop a web application (or an old Windows desktop application) onto a mobile device. (Yes, you can run Windows applications on Microsoft's Surface tablets. But the Surface tablets are really PCs in the shape of tablets, and they do not use the model used by iOS or Android.)

You had to write new apps for mobile devices. You had to build a collection of web services to be run on the back end. (Not too different from the web application back end, but not exactly the same.)

Which brings us to cloud applications

Cloud applications use multiple instances of servers (web servers, database servers, and others), each hosting services (called "microservices" because each service is less than a full application) that communicate through message queues.

One cannot simply move a web application into the cloud. You have to rewrite it to split computation from coordination, with the coordination handled by queues. Computation must be split into small, discrete services. You must write controller services that make requests to multiple microservices. You must design your front-end apps (which run on mobile devices and web browsers) and establish an efficient API to bridge the front-end apps with the back-end services.
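
Here is a toy sketch of that arrangement in Python, with in-process queues and threads standing in for real message queues and separately deployed services (all service names and payloads are hypothetical): a controller fans a request out to two small services and gathers their replies.

```python
# A toy sketch of a controller coordinating two "microservices" through
# message queues. Real cloud deployments use separate processes or
# containers and a hosted message broker; this in-process version only
# illustrates the shape of the communication.
import queue
import threading

pricing_requests = queue.Queue()
inventory_requests = queue.Queue()
replies = queue.Queue()

def pricing_service():
    # Small, discrete service: compute a price for one item.
    while True:
        item = pricing_requests.get()
        replies.put(("price", item, 9.99))      # hypothetical result

def inventory_service():
    # Another small service: report stock on hand for one item.
    while True:
        item = inventory_requests.get()
        replies.put(("stock", item, 42))        # hypothetical result

def controller(item):
    # The controller fans a request out to several services via queues,
    # then gathers the replies to build one response for the front end.
    pricing_requests.put(item)
    inventory_requests.put(item)
    return [replies.get() for _ in range(2)]

for svc in (pricing_service, inventory_service):
    threading.Thread(target=svc, daemon=True).start()

print(controller("widget-17"))
```

In a real deployment the queues would be a managed message broker and each service its own deployed instance, but the division of labor - coordination through queues, computation in small discrete services - is the shape the rewrite has to produce.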

In other words, you have to rewrite your applications. (Again.)

A different platform requires a different design. This should not be a surprise.


Wednesday, December 14, 2016

Steps to AI

The phrase "Artificial Intelligence" (AI) has been used for decades to describe computer programs that can perform sophisticated, autonomous operations. (One wag puts it as "artificial intelligence is twenty years away... always".)

Along with AI we have the term "Machine Learning" (ML). Are they different? Yes, but the popular usages make no distinction. And for this post, I will consider them the same.

Use of the term waxes and wanes. The AI term was popular in the 1980s and it is popular now. One difference between the 1980s and now: we may now have enough computing power to actually pull it off.

Should you simply jump into AI? My guess is no. AI has preconditions - things you should be doing before you make a serious commitment to AI.

First, you need a significant amount of computing power. Second, you need a significant amount of human intelligence. With AI and ML, you are teaching the computer to make decisions. Anyone who has programmed a computer can tell you that this is not trivial.

It strikes me that the necessary elements for AI are very similar to the necessary elements for analytics. Analytics is almost the same as AI - analyzing large quantities of data - except that it uses humans to interpret the data, not computers. Analytics is the predecessor to AI. If you're successful at analytics, then you are ready to move on to AI. If you haven't succeeded at analytics (or even attempted it), you're not ready for AI.

Of course, one cannot simply jump into analytics and expect to be successful. Analytics has its own prerequisites. Analytics needs data, the tools to analyze the data and render it for humans, and smart humans to interpret the data. If you don't have the data, the tools, and the clever humans, you're not ready for analytics.

But we're not done with levels of prerequisites! The data for analytics (and eventually AI) has its own set of preconditions. You have to collect the data, store the data, and be able to retrieve the data. You have to understand the data, know its origin (including the origin date and time), and know its expiration date (if it has one). You have to understand the quality of your data.
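
As a sketch of what "understanding your data" might look like in practice (the field names here are illustrative, not any standard), each dataset could carry a small metadata record like this:

```python
# An illustrative sketch of the metadata you would track for each dataset
# before attempting analytics or AI. Field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DatasetRecord:
    name: str                        # what the data is
    origin: str                      # where it came from
    collected_at: datetime           # origin date and time
    expires_at: Optional[datetime]   # expiration date, if it has one
    quality_notes: str               # known gaps, errors, biases

record = DatasetRecord(
    name="store_sales_2016",
    origin="point-of-sale exports",
    collected_at=datetime(2016, 12, 1),
    expires_at=None,
    quality_notes="returns not reconciled before November",
)
print(record)
```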

The steps to artificial intelligence are through data collection, metadata, and analytics. Each step has to be completed before you can advance to the next level. (Much like the Capability Maturity Model.) Don't make the mistake of starting a project without the proper experience in place.