Sunday, April 17, 2016
After the spreadsheet
For the individual, the spreadsheet is a useful tool. But for the enterprise, the spreadsheet creates perhaps more problems than it solves. Since a spreadsheet file contains the data, formulas, and presentation together, spreadsheets are often replicated to share with co-workers (usually via e-mail) and duplicated to process different sets of data (the spring sales figures and then the summer sales figures, for example).
The replication of spreadsheets via e-mail can be mitigated by the use of shared file locations ("network drives") and by online versions of spreadsheets which allow for multiple users. But the bigger problem is the duplication of spreadsheets with minor changes.
The duplication of spreadsheets means the duplication of not only the data (which often changes) but also of the formulas and the presentation (which often do not change). Since a spreadsheet contains all three components, a new version of the data requires a new copy of all of them. There is no way to share only one component: no way to run existing formulas against new data, or new presentations against existing data and formulas. This means that, over time, an enterprise of any size accumulates multiple spreadsheets with different data and duplicate formulas and macros -- at least, you hope that they are duplicate copies.
The design of spreadsheets -- containing the data, formulas, and presentation in one package -- is a holdover from the days of Visicalc and Lotus 1-2-3. Those programs were developed for the Apple II and the IBM PC. With their ability to run only one program at a time, putting everything into one program made sense -- using one program for data, another for calculation, and a third for presentation was awkward and time-consuming. But that constraint applied to the old single-tasking operating systems. Windows, Mac OS, and Linux allow multiple programs to run at the same time, and windowing systems allow multiple programs to present information to the user at the same time.
If spreadsheets were being invented now, in the age of web services, cloud systems, and multi-window displays, their design would probably be quite different. Instead of a single program that performs all functions and a single file that contains data, formulas, and presentation, we might create a system of web services: some providing data, others performing calculations. The results could be displayed by yet other components in other windows, possibly published for co-workers to view.
Such a multi-component system would follow the tenets of Unix, which recommends small, independent programs that read data, perform some processing, and emit data. The data and computations could be available via web services. A central service could "fan out" requests to collect data from one or more services, send that data through one or more computing services, and then provide the results to a presentation mechanism such as a graph in a window or even a printed report.
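As a minimal sketch of that fan-out idea -- with made-up service URLs standing in for real data and computation services -- the central service might look something like this, in Python:

import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

# Hypothetical endpoints; a real system would have its own services.
DATA_SERVICES = [
    "https://data.example.com/sales/spring",
    "https://data.example.com/sales/summer",
]
COMPUTE_SERVICE = "https://compute.example.com/totals"

def fetch(url):
    # Collect one data set from a data service.
    with urlopen(url) as response:
        return json.load(response)

def main():
    # Fan out: gather the data sets concurrently.
    with ThreadPoolExecutor() as pool:
        data_sets = list(pool.map(fetch, DATA_SERVICES))
    # Send the collected data through a computation service.
    request = Request(
        COMPUTE_SERVICE,
        data=json.dumps(data_sets).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(request) as response:
        results = json.load(response)
    # Hand the results to a presentation component (here, simply print).
    print(results)

if __name__ == "__main__":
    main()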
By separating the formulas and macros from the data, we can avoid needless duplication of both. (While most cases see the duplication of formulas to handle different data sets, sometimes different formulas can be applied to the same data.)
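The benefit shows up even without web services. In any conventional language (Python here, with made-up figures and an illustrative function name), the formula exists in exactly one place and each new data set is just another input:

# One copy of the formula; the data sets vary.
def total_sales(figures):
    # The "formula" -- in a spreadsheet, this would be copied into every file.
    return sum(figures)

spring_figures = [110.0, 95.5, 120.25]   # made-up numbers
summer_figures = [130.0, 102.5, 99.75]   # made-up numbers

print(total_sales(spring_figures))
print(total_sales(summer_figures))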
Providing data via web services is easy -- web services do that today. There are even web services to convert data into graphs. What about calculations? What language can be used to perform computations on data sets?
The traditional languages of C# and Java are not the best fit here; we're replacing spreadsheets with something equally usable by non-programmers (or at least similarly usable). The best candidate may be R, the statistics-oriented language. R is established, cross-platform, and capable. It's also a high-level language, close to the formulas of spreadsheets (and more powerful than Microsoft's VBA, which is used for macros in Excel).
Replacing spreadsheets with a trio of data management, computation, and presentation tools will not be easy. The advantages of the spreadsheet include convenience and familiarity. The advantages of separate components are better integration in cloud systems, leveraging of web services, and easier audits of formulas. It may not happen soon, but I think it will happen eventually.
Thursday, April 14, 2016
Technology winners and losers
People didn't have to use PCs and Microsoft Windows. They could choose alternative solutions, such as Apple Macintosh computers with Mac OS, or regular PCs with Linux. But the people using out-of-mainstream technology *chose* to use it. They knew what they were getting into. They knew that they would be a minority, and that when they entered a computer shop, most of the offerings would be for the regular Windows PCs and not for their configuration.
The market was not always this way. In the years before the IBM PC came to dominate, different manufacturers provided different systems: the Apple II, the TRS-80, DEC's Pro-325 and Pro-350, the Amiga ... there were many. All of those systems were swept aside by the IBM PC, and the enthusiasts for those systems knew the pain of loss. They had lost their chosen system to the one designated by the market as the standard.
In a recent conversation with a Windows enthusiast, I realized that he was feeling a similar pain in his situation. He was dejected at the dearth of support for Windows phones -- he owned such a phone, and felt left out of the mobile revolution. Windows phones are out-of-mainstream, and many apps do not run on them.
I imagine that many folks in the IT world are feeling the pain of loss. Some because they have Windows phones. Others because they have been loyal Microsoft users for decades, perhaps their entire career, and now Windows is no longer the center of the software world.
This is their first exposure to loss.
We grizzled veterans who remember CP/M or Amiga DOS have had our losses; we know how to cope. The folks who used WordPerfect or Lotus 1-2-3 had to switch to Microsoft products; they know loss too. But no technology has been forced from the market for quite some time. Perhaps the last was IBM's OS/2, back in the 1990s. (Or perhaps Visual Basic, when it was changed into VB.NET.)
But IT consists of more than grizzled veterans.
For someone entering the IT world after the IBM PC (and especially after the rise of Windows), it was possible -- even easy -- to enjoy a career spent entirely in dominant technologies: stay within the Microsoft set of technologies, and everything you used was mainstream, supported, and accepted. Learning Microsoft technologies such as SQL Server and SharePoint meant that you were on the "winning team".
A lot of folks in IT have never known this kind of loss. When your entire career has been with successful, mainstream technology, the change is unsettling.
Microsoft Windows Phone is a technology on the edge. It exists, but it is not mainstream. It is a small, oddball system (in the view of the world). It is not the "winning team"; iOS and Android are the popular, mainstream technologies for phones.
As Microsoft expands beyond Windows with Azure and apps for iOS and Android, it competes with more companies and more technologies. Azure competes with Amazon.com's AWS and Google's Compute Engine. Office Online and Office 365 compete with Google Docs. OneDrive competes with Dropbox and Box. Microsoft's technologies are not the de facto standard, not always the most popular, and sometimes the oddball.
For the folks confronting a change in worldview -- from one in which Microsoft technology is always the most popular and most accepted, to one in which different technologies compete and Microsoft sometimes loses -- an example to follow would be ... Microsoft.
Microsoft, after years of dominance with the Windows platform and applications, has widened its view. It is not "the Windows company" but a technology company that supplies Windows. More than that, it is a technology company that supplies Windows, Azure, Linux, and virtual machines. It is a company that supplies Office applications on Windows, iOS, and Android. It is a technology company that supplies SQL Server on Windows and soon Linux.
Microsoft adapts. It changes to meet the needs of the market.
That's a pretty good example.
Sunday, April 10, 2016
Complexity of programming languages
Have programming languages become more complex over time? It is a simple question, and like many simple questions, the answer is not so simple.
Let's look at a simple program in several languages -- one that prints the numbers from 1 to 10. Here are versions in FORTRAN, BASIC, Pascal, C, C++, C#, and Python:
FORTRAN 66 (1966)
DO 100 I = 1, 10
100 WRITE (6, 200) I
200 FORMAT (I5)
STOP
END
BASIC (1965)
10 FOR I = 1 TO 10
20 PRINT I
30 NEXT I
99 END
Pascal (1970)
program numbers;
var
    i: integer;
begin
    for i := 1 to 10 do
        WriteLn(i);
end.
C (1978)
#include <stdio.h>
int main(void)
{
    int i;
    for (i = 1; i <= 10; i++)
        printf("%d\n", i);
}
C++ (1983)
#include <iostream>
int main()
{
    for (unsigned int i = 1; i <= 10; i++)
        std::cout << i << std::endl;
    return 0;
}
C# (2000)
using System;
class Program
{
    public static void Main(string[] args)
    {
        for (int i = 1; i <= 10; i++)
            Console.WriteLine(i);
    }
}
Python (1991)
for i in range(1, 11):
    print(str(i))
From this small sampling, a few things are apparent.
First, the programs vary in length. Python is the shortest with only two lines. It's also a recent language -- so are languages becoming more terse over time? Not really: BASIC and FORTRAN are the next shortest (with 4 and 5 lines, respectively), and C#, a contemporary of Python, requires nine lines.
Second, the formatting of program statements has changed. FORTRAN and BASIC, the earliest languages in this set, have strong notions about columns and lines. FORTRAN limits line length to 72 characters, reserves the first five columns for statement labels, and uses the sixth for continuation characters (which allow statements to exceed the 72-character limit). BASIC relaxed the column restrictions but added the requirement for line numbers on each line. Pascal, C, C++, and C# care nothing about columns and lines, looking at tokens in the code to separate statements. Python relies on indentation to define blocks.
Some languages (BASIC, C#) are capable of printing things by simply mentioning them. Other languages need specifications. FORTRAN has FORMAT statements to specify the exact format of the output. C has a printf() function that needs similar formatting information. I find the mechanisms of BASIC and C# easier to use than the mechanisms of C and Python.
Let's consider a somewhat more complex program, one that lists a set of prime numbers. We'll look at BASIC and Lua, which span the computing age.
Lua
local N = 100
local M = 10
function PRIME() -- PROCEDURE DECLARATION;
    local X, SQUARE, I, K, LIM, PRIM -- DECLARATION OF VARIABLES;
    local P, V = {}, {}
    P[1] = 2 -- ASSIGNMENT TO FIRST ELEMENT OF p;
    print(2) -- OUTPUT A LINE CONTAINING THE NUMBER 2;
    X = 1
    LIM = 1
    SQUARE = 4
    for I = 2, N do -- LOOP. I TAKES ON 2, 3, ... N;
        repeat -- STOPS WHEN "UNTIL" CONDITION IS TRUE;
            X = X + 2
            if SQUARE <= X then
                V[LIM] = SQUARE
                LIM = LIM + 1
                SQUARE = P[LIM] * P[LIM]
            end
            local K = 2
            local PRIM = true
            while PRIM and K < LIM do
                if V[K] < X then
                    V[K] = V[K] + P[K]
                end
                PRIM = X ~= V[K]
                K = K + 1
            end
        until PRIM -- THIS LINE CLOSES THE REPEAT
        P[I] = X
        print(X)
    end
end
PRIME()
BASIC
100 LET N = 100
110 LET M = 10
200 DIM P(100), V(100)
300 LET P(1) = 2
310 PRINT P(1)
320 LET X = 1
330 LET L = 1
340 LET S = 4
350 FOR I = 2 TO N
360 REM repeat -- STOPS WHEN "UNTIL" CONDITION IS TRUE;
370 LET X = X + 2
380 IF S > X THEN 420
390 LET V(L) = S
400 LET L = L + 1
410 LET S = P(L)^2
420 REM
430 LET K = 2
440 LET P = 1
450 REM while PRIM and K < LIM do
455 IF P <> 1 THEN 520
460 IF K >= L THEN 520
470 IF V(K) >= X THEN 490
480 LET V(K) = V(K) + P(K)
490 REM
500 LET P = 0
501 IF X = V(K) THEN 510
502 LET P = 1
510 LET K = K + 1
515 GOTO 450
520 IF P <> 1 THEN 360
530 LET P(I) = X
540 PRINT X
550 NEXT I
999 END
They are about the same size, and about the same complexity: two versions of one program, one from today and one from the early years of computing.
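For comparison, here is my own rough translation of the same algorithm into Python -- a sketch, not part of the original pair. It comes out somewhat shorter, but it is recognizably the same program:

def primes(n=100):
    p = [None, 2]   # p[i] is the i-th prime; index 0 unused, to match the 1-based originals
    v = [None]      # working table of multiples, also 1-based
    print(2)
    x = 1
    lim = 1
    square = 4
    for i in range(2, n + 1):
        prim = False
        while not prim:            # plays the role of repeat ... until PRIM
            x = x + 2
            if square <= x:
                v.append(square)   # v[lim] = square
                lim = lim + 1
                square = p[lim] * p[lim]
            k = 2
            prim = True
            while prim and k < lim:
                if v[k] < x:
                    v[k] = v[k] + p[k]
                prim = (x != v[k])
                k = k + 1
        p.append(x)                # p[i] = x
        print(x)

primes()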
Programming languages encode operations into pseudo-English instructions. If the measure of a programming language's capability is its capacity to represent operations in a minimal number of steps, then this example shows that programming languages have not changed over the past five decades.
Caution is advised. These examples (printing a sequence of numbers and calculating prime numbers) may be poor representatives of typical programs. Perhaps we should withhold judgement until we consider more (and larger) programs. After all, very few people in 2016 use BASIC; there must be a reason people have selected modern languages.
Perhaps it is better to keep asking the question and examining our criteria for programming languages.
Thursday, April 7, 2016
An end to Microsoft Windows
Not in the sense of crashing and needing to be re-started -- although that happens too -- but in the sense of becoming dead technology, abandoned by its vendor and its users.
I've seen various operating systems die. PC-DOS (or MS-DOS) is a dead operating system, used only in museums and by the rare enthusiast.
CP/M is perhaps the most well-known of the pre-PC operating systems, but there were several: TRS-DOS for Radio Shack, HDOS for Heathkit computers, the UCSD p-system, and Apple's DOS for its Apple II line, to name a few. They were all tied to specific hardware designs (except for CP/M) and they died as people moved away from those devices. CP/M transitioned to the IBM PC but failed in the market.
Will Windows die? Eventually, yes. All operating systems, just like all technologies, are abandoned for something new. But the demise of Windows may be sooner than we expect.
Windows, for a long time, was a strategic component in Microsoft's empire. Microsoft built Windows and applications on top of Windows, and sold them to customers. Microsoft's products all interlocked: Active Directory ran on Windows, SQL Server used Active Directory for authentication, Word and Excel used OLE and COM, and so on. Once you bought into a single Microsoft product, you had strong incentive to use others.
Microsoft is shifting its strategy. It no longer centers its technology on Windows; Azure and web services are the new center. Microsoft doesn't care if Azure customers use Windows on their computers, or even if the virtual servers in Azure are running Windows; Microsoft wants Azure customers. Thus, Windows is less important.
Look at Microsoft's recent actions:
- Windows 10 released to existing users for free (so it is not a source of revenue)
- Linux instances in Azure
- Visual Studio Code, a cross-platform editor (no Windows needed!)
- .NET and C# ported to Linux
- Microsoft Office apps for Android and iOS
- SQL Server to run on Linux
- A Linux layer to Windows
If Microsoft can move their Azure instances and services to Linux, they will have little need for Windows. When the major revenue source is Azure and web services, why invest in a product with high expense and little return? It is quite possible that Microsoft is preparing to abandon Windows.
Here are some other actions that Microsoft may take:
- A new Surface laptop to compete with Chromebooks, one that runs a stripped-down Windows (or possibly Linux)
- A filesystem for Linux that can read and write to NTFS
- Other Office products migrated to Android and iOS
Microsoft has already moved away from Windows as a strategic technology. Each of these actions moves Microsoft away from Windows a little more.
Windows won't disappear overnight. Microsoft has committed to supporting Windows 10 for ten years, as it supported the earlier versions. This is important for the large corporations and governments who have contracts with Microsoft. But that doesn't mean Microsoft will keep Windows in the market forever. At some point, Microsoft could stop selling Windows -- first in the retail market, and later in the large-contract market.
At that point, Windows will be a dead operating system. I'm sure it will be used by individuals, small companies, and large organizations for many more years, but over time those instances will be shut down in favor of cloud-based systems.
Operating systems die. Windows is no exception.
Sunday, April 3, 2016
No more empires
Apple wants to be a rebel. To do so, they need an empire to rebel against. For the past two decades, Microsoft was their empire. Prior to Microsoft's rise, the empire was IBM.
IBM had a long and storied empire. It was the first to have an empire in IT, and perhaps the only company to do so. (More on that later.)
IBM had a comprehensive empire, starting with mainframes. They sold everything you needed for computers. They sold the processors, the card readers and card punches, tape drives, disk drives, and even the cables to connect them. They sold operating systems, utilities, compilers, and job scheduling programs.
Empires must be all-encompassing. They must sell everything one needs. If they don't they are not truly empires.
When DEC introduced its line of minicomputers, IBM competed by selling its own minicomputers. (And operating systems, terminals, and printers for those minicomputers.)
When microcomputers became popular, IBM introduced the PC. To offer a solution quickly, IBM used other manufacturers for several components: Epson for printers and Microsoft for the operating system.
IBM maintained its empire until Microsoft took control with Windows. The breakup was ugly and has been documented by others, so I won't go into those details. But build an empire, Microsoft did.
Microsoft's empire was different from IBM's. IBM's empire was all-encompassing, from hardware to software to supplies. Microsoft's empire was limited to software. It sold no processors, disk drives, or other peripherals. (I'm ignoring the Xbox and the Surface tablet and the Microsoft keyboard and Microsoft mouse, which are not insignificant but not really to the point.)
Microsoft did keep the "we supply everything" mindset for its software empire. It provided the operating system, office programs (Word, Excel, PowerPoint, Outlook, etc.), developer tools (Visual Studio with compilers for various languages, SourceSafe and TFS), databases, accounting software ... you name it, Microsoft offered it. They even created a file packager like PKZIP but with proprietary technology (a thing called "OLE Structured Storage", which let one store multiple files inside a single file, but without compression).
Today Microsoft does not dominate in every aspect of IT. It dominates in some areas (desktop operating systems, office software), competes in others (cloud services, tablets), and fails in others (phones). One could build a modern cloud/mobile app with only Microsoft technology, but without iPhone and Android support, it would have very limited acceptance. But one cannot build a mobile/cloud app with only Apple technologies (they don't offer cloud services). One could use Google's technology; they offer phones and tablets, cloud services, and development tools, but you would still lose the iPhone market. No one vendor has all of the solutions.
Will we see another empire in the IT world? Microsoft took the empire role from IBM, will someone take the role from Microsoft?
A new empire would be difficult to arrange. It would have to become the dominant supplier of IT hardware and software. Even if it followed Microsoft's lead and provided only software, it would have a large task. Software ranges from operating systems to office programs to development tools to business software to databases to analytics to games to video editing to ... you get the idea. And don't forget that a lot of software is available via open source.
I think we will see no new empire rise. I think we will see no one company offer everything one needs, and be dominant in all of those areas. The breadth of technology is too wide.
With no single, dominant provider, we will see instead a market with multiple providers. And that makes things less convenient for some.
An empire offers simplicity and comfort. In the era of the IBM empire, one could select IBM as the vendor, knowing that it was a safe choice. The saying was "no one was fired for buying IBM equipment"; IBM made the best, and if IBM equipment didn't solve the problem, no one else's would either. (At least that was the belief.) When Microsoft built its empire on Windows, they became the safe choice.
With the rise of mobile and cloud technologies, there is no one provider for all technologies. One has to select from multiple vendors and get their technologies to work together. One can never be sure that one has the best tool for the task. Is Microsoft Azure the best cloud solution for you? (More to the point, is it acceptable?) How do you develop apps for iOS and Android, and should you include Microsoft Mobile? Do you develop a desktop version of your app? What tools do you use to build it?
The good news is that there are several "right" answers to these questions. Microsoft Azure, Amazon AWS, and Google Cloud are capable platforms. There are multiple tools to develop for iOS, Android, and Windows. You don't have to find the one and only one tool that will work for you.
The bad news is that you have to think more about your objectives, the tools you want to use, and the techniques you will use. It is the thinking part that will frighten people who are used to picking the safe choice.
Friday, April 1, 2016
Apple wants to be a rebel
It should be no surprise that Apple is the rebel in the computer industry. They are the small, scrappy upstart competing against the big, established company. Consider their "1984" ad to introduce the Macintosh computer, or the later "Think Different" campaign.
Apple won its counterculture role by accident. They were one of the first companies to sell microcomputers in the late 1970s. At the time, there was no PC standard; IBM would introduce its PC in 1981. Prior to that, the market was fragmented in terms of hardware and software. Apple, Commodore, Radio Shack, and several others offered non-compatible systems. The CP/M operating system was beginning to emerge as a standard, but it was by no means universal.
IBM became the computing standard-bearer, and thereby spared Apple that fate. The IBM PC and PC-DOS were an instant success, and other manufacturers were pushed aside. Only by adopting the role of rebel could Apple survive.
The strategy worked for several decades. Apple built a reputation as the "other computer company", with expensive but well-designed products and software that "just worked" without the need for support teams.
But now Apple has a problem.
Their products, and Apple by extension, have become the market leaders. The iPod was the premium music device. The iPhone and iPad are the envied mobile devices. The MacBook is the standard for laptop computers. Other manufacturers design their products to emulate the Apple line. (Even the lowly Mac Mini is copied.) It's hard to be different when everyone is trying to be like you.
A bigger problem is the demise of the empire. A rebel needs someone (or something) to rebel against. In "Star Wars", the Rebel Alliance exists only because the Empire exists. (So much so that the latest Star Wars movie had to invent the First Order to keep the Rebel Alliance alive.)
Apple's first foe was IBM and the PC. IBM served well in the role of evil empire; it was despised by all of the hobbyists and tinkerers who had adopted the earlier computers, it was large and bureaucratic, and it was successful. The IBM PC empire was defeated by a combination of PC clone manufacturers, Microsoft, and Windows, and when it fell, Microsoft neatly stepped into place and became the empire against which Apple could fight.
But now the PC market is split between Windows and Mac OS (and a smidge of Linux); the phone market is split between iOS and Android (and a smidge of Windows); and the cloud is split among Amazon.com, Microsoft, and Google (and others). There is no big, evil empire to fight.
Apple cannot be the rebel that they were, and I think that they are uncomfortable with that. I think the folks at Apple yearn for "the good old days" when they were not number one, and when they made computers that were different.
The question is: Where does Apple go from here?