The term "shadow IT" has come to mean IT products and services used within an organization without the blessing (or even knowledge) of the IT support team. Often such services and products are not on the list of approved software.
In the good old days before the open source movement, when software had to be purchased, it was easy to control the acquisition of software: the Purchasing Department would verify all purchase requests with the IT Department; any unauthorized requests would be refused.
Even with open source and "free" software, the IT Department could set policies on individual PCs and prevent people from installing software. IT remained the gatekeepers of software.
With cloud computing, those controls can be bypassed. Software can now be used in the browser. Point your browser at a web site, register, supply a credit card, and you're ready to go. No Purchasing Department, no administrator rights. This is the situation that most people associate with the term "shadow IT".
Yet there is another technology that is used without the knowledge of the IT Department. A technology that has been used by many people, to perform many tasks within the organization. Programs have been written, databases have been designed, and systems have been implemented without the involvement of the IT Department. Worse, these systems have been used in production without extensive testing (perhaps no testing), have not been audited, and have no backups or disaster recovery plans.
I'm talking about spreadsheets.
Specifically, Microsoft Excel spreadsheets.
Microsoft Excel is a standard in corporate computing. Just about every "business" PC (as opposed to "developer" PC or "sysadmin" PC) runs Windows and has Excel. The technology is available, often mandated by the IT Department as a standard configuration.
Millions of people have access to Excel, and they use it. And why not? Excel is powerful, flexible, and useful. There are tutorials for it. There are web pages with hints and tips. Microsoft has made it easy to use. There is little work needed to use Excel to perform calculations and store data. One can even connect Excel to external data sources (to avoid re-typing data) and program Excel with macros in VBA.
Excel, in other words, is a system platform complete with programming language. It is used by millions of people in thousands (hundreds of thousands?) of organizations. Some small businesses may run completely on Excel. Larger businesses may run on a combination of "properly" designed and supported systems and Excel.
This is the other shadow IT. The spreadsheets used by people to perform mundane (or perhaps not-so-mundane) tasks. The queries to corporate databases. The programs in VBA that advertise themselves as "macros". All operating without IT's knowledge or support.
Comparing two programming languages is difficult. Different languages have different capabilities and different amounts of programming "power". One line of COBOL can do the work of many lines of assembly. Ten lines of Python can do more work than ten lines of Java.
I suspect that if we could compare Excel to the corporate-approved languages of C# and Java, we would find that there is more Excel code than corporate-approved code. That is a lot of code! It means that Excel is the "dark matter" of the IT universe: existing but not observed. (I realize that this amount is speculation. We have no measurements for Excel code.)
Excel is the shadow technology to watch. Don't ignore file-sharing and browser-based apps; they are risks too. But keep an eye on the technology we already have and use.
Wednesday, May 13, 2015
Tuesday, May 12, 2015
Cloud programs are mainframe programs, sort of
I was fortunate to start my programming career in the dawn of the age of BASIC. The BASIC language was designed with the user in mind and had several features that made it easy to use.
To truly appreciate BASIC, one must understand the languages that came before it. Comparing BASIC to JavaScript, or Swift, or Ruby makes little sense; each of those came after BASIC (long after) and built on the experience of BASIC. The advantages of BASIC are clear when compared to the languages of the time: COBOL and Fortran.
BASIC was interpreted, which meant that a program could be typed and run in one fast session. COBOL and Fortran were compiled, which meant that a program had to be typed, saved to disk, compiled, linked, and then run. With BASIC, one could change a program and re-run it; with other languages you had to go through the entire edit-save-compile-link cycle.
Where BASIC really had an advantage over COBOL and Fortran was with input. BASIC had a flexible INPUT statement that let a program read values from the user. COBOL was designed to read data from punch cards; Fortran was designed to read data from magnetic tape. Both were later modified to handle input from "the console" -- the terminal a programmer used for an interactive session -- but even with those changes, interactive programs were painful to write. Yet in BASIC it was easy to write a program that asked the user "would you like to run again?".
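A rough stand-in for that classic BASIC program, sketched here in Python since there is no BASIC handy: the "run again?" loop that INPUT made trivial. (The scripted inputs simulate a user session so the sketch runs without a terminal.)

```python
# The kind of interactive program that was easy in BASIC but painful in
# the batch-oriented COBOL and Fortran of the day: read values, compute,
# then ask "would you like to run again?".
def run_session(read_line):
    results = []
    while True:
        # Like BASIC's INPUT: accept numbers from the user.
        numbers = [float(n) for n in read_line().split()]
        results.append(sum(numbers) / len(numbers))
        # The "would you like to run again?" prompt.
        if read_line().strip().upper() not in ("Y", "YES"):
            break
    return results

# Simulated user: one round of numbers, then "N" to the run-again prompt.
scripted = iter(["2 4 6", "N"])
print(run_session(lambda: next(scripted)))  # -> [4.0]
```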
The interactive properties of BASIC made it a hit with microcomputer users. (Its availability, due to Microsoft's aggressive marketing, also helped.) Fortran and COBOL achieved minimal success with microcomputers, setting up the divide between "mainframe programming" (COBOL and Fortran) and "microcomputer programming" (BASIC, and later Pascal). Some rash young members of the computing field called the two divisions "ancient programming" and "modern programming".
But the division wasn't so much between mainframe and microcomputer (or old and new) as we thought. Instead, the division was between interactive and non-interactive. Microcomputers and their applications were interactive and mainframes and their applications were non-interactive. (Mainframe applications were also batch-oriented, which is another aspect.)
What does all of this history have to do with computing in the current day? Well, cloud computing is pretty modern stuff, and it is quite different from the interactive programming on microcomputers. I don't see anyone building cloud applications with BASIC or Pascal; people use Python or Ruby or Java or C#. But cloud computing is close to mainframe computing (yes, that "ancient" form of computing) in that it is non-interactive. A cloud application gets a request, processes it, and returns a response -- and that's it. There is no "would you like to run again?" option from cloud applications.
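A cloud application in miniature, sketched in Python: it receives a request, computes a response, and is done. (The handler name and request shape here are illustrative, not any particular framework's API.)

```python
import json

# One request in, one response out -- each request is an independent
# transaction, much like a mainframe batch job. There is no "run again?";
# any interactive loop lives in the mobile or browser client, not here.
def handle_request(body: str) -> str:
    request = json.loads(body)
    result = {"sum": sum(request["values"])}
    return json.dumps(result)

response = handle_request('{"values": [1, 2, 3]}')
print(response)  # -> {"sum": 6}
```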
Which is not to say that today's systems are not interactive -- they are. But it is not the cloud portion of the system that is interactive. The interactivity with the user has been separated from the cloud; it lives in the mobile app on the user's phone, or perhaps in a JavaScript app in a browser.
With all of the user interaction in the mobile app (or browser app), cloud apps can go about their business and focus on processing. It's a pretty good arrangement.
But it does mean that cloud apps are quite similar to mainframe apps.
Labels: BASIC, cloud computing, COBOL, Fortran, interactive processing, mainframe, mobile/cloud
Sunday, May 10, 2015
The artistic appeal of the white MacBook
Laptops have established their place in computing history. Some have been innovative, others bland.
The white MacBooks were appreciated by many. They were (and still are) useful. And they were unique.
The uniqueness of the white MacBooks is in their popularity for decorations. People have added stickers and artistic work to many laptops, but the white MacBooks had more. It was as if they reached out and asked people to decorate them.
Perhaps it was the allure of a flat, white space. (There was a black version of the MacBook, but it was rare. I saw none "in the wild" so I cannot comment on their decorations.) Laptops from other vendors came in black, grey, beige, and sometimes with bright colors, but I recall none in the "Imperial Stormtrooper White" used by Apple.
Perhaps it was the prestige of owning an Apple product, and not a typical PC running Windows. In that age, running an operating system other than Windows was partly an act of rebellion.
Perhaps it was because the people who owned MacBooks were typically artists, musicians, writers, designers, or other creative folks.
That was in the age of the white MacBook. Today, things are different.
I look at the laptops in today's world, and I see little in the way of decorations. Even (or perhaps I should say "especially") among the Apple laptops. Today's laptops are, mostly, plain and un-enhanced. (Yes, there are a few who add stickers to their laptops. But the number is a lot smaller than it used to be.)
Some factor inhibits decorations. It might be the new color scheme (silver, gold, or grey). It might be the texture of the dimpled surface. It might be the curve of the device covers, which no longer offer a flat surface.
I will content myself with speculation, and leave the analysis to the psychologists and anthropologists.
Wednesday, May 6, 2015
Cloud apps may not need OOP
The rise of new programming styles has followed the growth in program size.
The earliest programs were small -- tiny by today's standards. Most would fit on a single printed page; the largest took a few pages.
Programs larger than a few pages quickly became a tangled mess. Structured Programming (SP) was a reaction to the need for larger programs. It was a technique to organize code and allow programmers to quickly learn an existing system. With SP we saw languages that used structured techniques: Pascal and C became popular, and Fortran and BASIC changed to use the structured constructs IF-THEN-ELSE and DO-WHILE.
Structured Programming was able to organize code up to a point, but it could not manage the large systems of the 1990s. Object-oriented programming (OOP) was a reaction to the need for programs larger than several hundred printed pages. With OOP we saw languages that used object-oriented techniques: Java and C# became popular, C mutated into C++, and Pascal mutated into ObjectPascal. These new languages (and new versions of old languages) used the object-oriented constructs of encapsulation, inheritance, and polymorphism.
Cloud computing brings changes to programming, but in a new way. Instead of larger programs, cloud computing allows for (and encourages) smaller programs. The need for large, well-organized programs has been replaced by a need for well-organized systems of small programs. In addition, the needs placed on the small programs are different from the needs of the old, pre-cloud programs: cloud programs must be fast, replicable, and substitutable. The core idea of cloud computing is that a number of servers are ready to respond to requests and that any server (of a given class) can handle your request -- you don't need a specific server.
In this environment, object-oriented programming is less useful. It requires some overhead -- for the design of programs and at run time. Its strength is to organize code for large programs, but it offers little for small programs. I expect that people will move away from OOP languages for cloud systems, and towards languages that emphasize readability and reliability.
I don't expect a renaissance of Structured Programming. I don't expect anyone to move back to the older SP-inspired languages of Pascal and Fortran-77. Cloud computing may be the technology that pushes us to move to the "Functional Programming" style. Look for cloud-based applications to use functional languages such as Haskell and Erlang. (Maybe F#, for Microsoft shops.)
Thursday, April 30, 2015
Files are static, requests are dynamic
The transition from desktop or web applications to mobile/cloud systems is more than the re-organization of programs. It is a change from data sources: desktop and web applications often store data in files, and mobile/cloud systems store data via web services.
Files are static things. They perform no actions by themselves. A program can read the contents of a file and take action on those contents, but it must consume the contents as they exist. The file may contain just the data that a program needs, or it may contain more, or less. For example, a file containing a Microsoft Word document actually contains the text of the document, revisions to the text, information about fonts and formatting, and meta-information about the author.
A program reading the contents of that file must read all of that information; it has no choice. If the task is to extract the text -- and only the text -- the program must read the entire file, revisions and fonts and meta-information included. If we want only the meta-information, the program must read the entire file, text and revisions... you get the idea.
(The more recent DOCX format does isolate the different sets of information into separate parts, which makes reading a subset of the file easier. The older DOC format required reading and interpreting the entire file to obtain any part of it.)
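The isolation in DOCX comes from the fact that it is a ZIP archive of separate XML parts. A small sketch in Python, using a simplified stand-in archive (the part names follow the real OOXML layout, but the content is invented for illustration):

```python
import io
import zipfile

# Build a tiny stand-in for a DOCX file: a ZIP archive with separate
# parts for document text and metadata. (Real parts hold OOXML markup;
# these strings are simplified placeholders.)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml", "<w:document>Hello, world</w:document>")
    z.writestr("docProps/core.xml", "<cp:coreProperties>J. Doe</cp:coreProperties>")

# Because each part is a separate archive member, a reader can extract
# just the part it needs -- no need to parse the whole file.
with zipfile.ZipFile(buf) as z:
    text_part = z.read("word/document.xml").decode()

print(text_part)
```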
Web services, used by mobile/cloud systems, are not static but dynamic. (At least they have the possibility of being dynamic. You can build web services that mimic files, but you probably want the dynamic versions.)
A web service can be dynamic because there is another program processing the request and creating the response. A web service to read a document can do more than simply return the bytes in the document file. It can perform some processing on your behalf. It can accept instructions, such as "give me the text" or "only the meta-information, please". It can do these things on our behalf. We can delegate them to the web service, and our job becomes easier.
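A sketch of such a dynamic document service in Python: the caller asks for just the part it wants, and the service does the extraction on the caller's behalf. (The document structure and field names here are invented for illustration.)

```python
# An in-memory stand-in for document storage; a real service would read
# from files or a database behind this same interface.
DOCUMENTS = {
    "report-42": {
        "text": "Quarterly results were strong.",
        "revisions": ["draft 1", "draft 2"],
        "meta": {"author": "J. Doe", "created": "2015-04-01"},
    },
}

def get_document(doc_id, part="text"):
    """Return only the requested part of the stored document."""
    document = DOCUMENTS[doc_id]
    return document[part]

print(get_document("report-42"))               # "give me the text"
print(get_document("report-42", part="meta"))  # "only the meta-information, please"
```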
Astute readers will observe that my arrangement of dynamic web services does not reduce the work involved; it merely shifts work to different parts of the system. (The web service must still read the entire document, pick out the bits of interest, and send those to us.) That is true. Yet it is also true that once in place, the web service provides an interface for reading (and writing) documents, and we may then choose to change the implementation of that storage.
With document web services in place, our applications are completely ignorant of the storage format, and the web services may change that format to suit their needs. A new version of the web services may store documents not in files but in databases, in JSON documents, or in any other appropriate format. I'm pretty sure that Google Docs uses this approach, and I suspect Microsoft's Office 365, if not using it now, will use it soon.
Moving from desktop and web to mobile/cloud lets us do many things. It lets us do many things that we do today but differently. Look at the possibilities, and look at the savings in effort and cost.
Monday, April 27, 2015
The smallest possible cloud language
How small can a language be? Specifically, how small can we make a language for cloud computing?
Programs running in the cloud need not do a number of things that traditional programs must do. A program running in the cloud does not interact with a user, for example. (There may be a user directing a mobile app which in turn directs a cloud app, but that is an indirect interaction.)
Cloud programs do not read or write disk files, either. Nor do they access databases (directly).
Here's my list of minimal functions for a cloud program:
- Accept an incoming web request (possibly with data in JSON)
- Process the request (that is, operate on the JSON data)
- Generate web requests to other servers
- Receive responses from other servers
- Send a response to the original web request (possibly with data in JSON)
That is the list for the smallest, simplest language for cloud computing. There is a little complexity hidden in the "operate on the JSON data"; the language must handle the values and data structures of JSON. Therefore it must handle numeric values, text values, boolean values, "null", lists, and dictionaries.
But that is it. That is the complete set of operations. The language (and its supporting libraries) does not have to handle dialogs, screen resolutions, responsive design, disk files, databases (those are handled by specialized database servers), or printing. We can remove functions that support all of those operations.
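The whole minimal operation set can be sketched in a few lines of Python: accept a request with JSON data, operate on the JSON values, consult another server, and respond with JSON. (The request to another server is represented here by a plain function call; the field names and rates are invented for illustration.)

```python
import json

def lookup_tax_rate(region):
    # Stands in for a web request to another server.
    rates = {"MD": 0.06, "VA": 0.053}
    return rates.get(region, 0.0)

def handle(request_body):
    request = json.loads(request_body)          # accept incoming JSON
    rate = lookup_tax_rate(request["region"])   # "call" another server
    total = request["amount"] * (1 + rate)      # operate on the values
    return json.dumps({"total": round(total, 2)})  # respond with JSON

print(handle('{"region": "MD", "amount": 100.0}'))  # -> {"total": 106.0}
```

Note that the program touches no screens, no disk files, and no databases directly; everything it needs is in the request, the response, and the services it calls.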
If we're clever, we can avoid the use of "generic" loops ("for i = 0; i < x; i++") and use the looping constructs of recent languages such as Python and Ruby ("x.times()" or "x.each()").
So the smallest -- not necessarily ideal, just the smallest -- language for cloud computing may be a reduced version of Python. Or Ruby.
Or perhaps a variant of a functional programming language like Haskell or Erlang. Possibly F#, but with reductions to eliminate unused functions.
Or -- and this is a stretch -- perhaps an extended version of Forth. Forth has little overhead to start, so there is little to remove. It operates on 16-bit numeric values; we would need support for larger numeric values and for text values. Yet it could be done.
Our list of candidates for cloud computing is:
- A reduced version of Python
- A reduced version of Ruby
- Haskell
- Erlang
- A reduced F#
- An enhanced Forth
Look for them in future cloud platforms.
Thursday, April 23, 2015
Small programs need small languages
The history of programming languages has been one of expansion. Programming languages start small (think BASIC, Pascal, and C) and expand to provide more capabilities to the programmer (think Visual Basic, ObjectPascal, and C++). Programming languages expand because the programs we write expand.
Computer programs have expanded over the years. Books from the early years of programming (the 1970s) classify programs by size, with small programs consisting of hundreds of lines of code, large programs consisting of tens of thousands of lines, and "humongous" programs consisting of hundreds of thousands of lines of code. Today, we may still classify programs by lines of code, but most commercial applications range in size from hundreds of thousands of lines to tens of millions.
The expansionist effect on programs is tied to their single-computer nature. When a single computer must perform the calculations, then the program it runs must do everything.
Cloud computing breaks that paradigm. With cloud computing, the system may be large, but it consists of many computers providing granular services. That design allows for, and encourages, small programs. (Side note: If you're building a cloud system with large programs, you're doing it wrong.)
Cloud computing uses collections of small programs to assemble systems. Since the programs are small, the programming languages can be -- and should be -- small. That means that our strategy of language development, in which we have (mostly) striven to broaden the capabilities of programming languages, is no longer valid. Our new strategy must be to simplify, and probably specialize, our programming languages.