In the late Twentieth Century, COBOL was the standard language for business applications. There were a few other contenders (IBM's RPG, assembly language, and DEC's DIBOL) but COBOL was the undisputed king of the business world. If you were running a business, you used COBOL.
If you worked in the data processing shop of a business, you knew COBOL and programmed with it.
If you were in school, you had a pretty good chance of being taught COBOL. Not everywhere, and not during the entire second half of the century. I attended an engineering school; we learned FORTRAN, Pascal, and assembly language. (We also used the packages SPSS and CSMP.)
Schools have, for the most part, stopped teaching COBOL. A few do, but most moved on to C++, or Java, or C#. A number are now teaching Python.
Businesses have lots of COBOL code. Lots and lots of it. And they have no reason to convert that code to C++, or Java, or C#, or the "flavor of the month" in programming languages. Business code is often complex and working business code is precious. One modifies the code only when necessary, and one converts a system to a new language only at the utmost need.
But that code, while precious, does have to be maintained. Businesses change and those changes require fixes and enhancements to the code.
Those changes and enhancements are made by COBOL programmers.
Of which very few are being minted these days. Or for the past two decades.
Which means that COBOL programmers are, as a resource, dwindling.
Now, I recognize that the production of COBOL programmers has not ceased. There are three sources that I can name with little thought.
First are the schools (real-life and on-line) that offer courses in COBOL. Several colleges still teach it, and several on-line colleges offer it.
Second are offshore programming companies. Talent is available through outsourcing.
Third is existing programmers who learn COBOL. A programmer who knows Visual Basic and C++, for example, may choose to learn COBOL (perhaps through an on-line college).
Yet I believe that, in any given year, the number of new COBOL programmers is less than the number of retiring COBOL programmers. Which means that the talent pool is now at risk, and therefore business applications may be at risk.
For many years businesses relied on the ubiquitous nature of COBOL to build their systems. I'm sure that the managers considered COBOL to be a "safe" language: stable and reliable for many years. And to be fair, it was. COBOL has been a useful language for almost half a century, a record that only FORTRAN can challenge.
The dominance of COBOL drove a demand for COBOL programmers, which in turn drove a demand for COBOL training. Now, competing languages are pulling talent out of the "COBOL pool", starving the training. Can businesses be far behind?
If you are running a business, and you rely on COBOL, you may want to think about the future of your programming talent.
* * * * *
Such an effect is not limited to COBOL. It can happen to any popular language. Consider Visual Basic, a dominant language in Windows shops in the 1990s. It has fallen out of favor, replaced by C#. Or consider C++, which like COBOL has a large base of installed (and working) code. It, too, is falling out of favor, albeit much more slowly than Visual Basic or COBOL.
Sunday, May 19, 2013
The real reason we are angry with Windows 8
Windows 8 has made a splash, and different people have different reactions. Some are happy, some are confused, and some are angry.
The PC revolution was about control, and about independence. PCs, in the early days, were about "sticking it to the man" -- being independent from the big guys. Owning a PC (or a microcomputer) meant that we were the masters of our fate. We controlled the machine. We decided what to do with it. We decided when to use it.
But that absolute control has been eroded over time.
- With CP/M (and later with MS-DOS and even later with Windows), we agreed to use a common operating system in exchange for powerful applications.
- With Wordstar (and later with Lotus 1-2-3 and even later with Word and Excel) we agreed to use common applications in exchange for the ability to share documents and spreadsheets.
- With Windows 3.1, we agreed to use the Microsoft stack in exchange for network drivers and access to servers.
- With Windows 2000 SP3, we had to accept updates from Microsoft. (The license specified that.)
We have gradually, slowly, given up our control in exchange for conveniences.
Now, we have come to the realization that we are not in control of our computers. Our iPads and Android tablets update themselves, and we lack total control (unless we jail-break them).
I think what really makes people mad is the realization that they are not in control. We thought that we were in control, that we called the shots. But we're not, and we don't; the vendor calls the shots.
Windows 8, with its new user interface and its new approach to apps, makes it clear that we are not.
And we're angry when we realize it.
Thursday, May 16, 2013
Life without files
Mobile devices (tablets and phones) are different from PCs in that they do not use files. Yes, they have operating systems that use files, and Android has file managers which let you access some or all files on the device, but the general experience is one without files.
The PC experience was centered around files. Files were the essential element of a PC, from the boot files to word processor documents to temporary data. Everything was stored in a file. The experience of using a PC was running an application, loading data from a file, and manipulating that data. When finished, one saved the information to a file.
Unix (and Linux) also have this approach. Microsoft Windows presented data in a series of graphical windows, but it stored data in files. The "File" menu with its options of "New", "Open", "Save", "Save As", and "Exit" (oddly) was standard in the UI.
The mobile experience relies on apps, not files. The user runs an app, and the app somehow knows how to get its data. Much of the time, that data is stored in the cloud, on servers maintained by the app provider. There is no "File" menu -- or any menu.
Shifting the experience from files to apps is a subtle but significant change. One tends not to notice it, since keeping track of files was tedious and required discipline for structuring directories. When apps load their data automatically, one doesn't really mind not worrying about file names and locations.
With the "loss" of files, or more specifically the access to app data outside of the app, one is more conscious of sharing information between apps. In the PC age, one didn't share information between applications -- one stored data in a public place and let other applications have their way with it.
This "data available to all" makes some sense, especially for developers. Developers use multiple tools on their data -- source code -- and need it shared liberally. Editors, compilers, debuggers, lint utilities, and metrics tools all need unfettered access to the source code.
But the "free for all" model does not work so well for other data. In most shops, data is used in a single application (most often Microsoft Word or Microsoft Excel) and occasionally sent via e-mail to others. But this latter act is simply sharing the data; a better way to share data could eliminate a lot of e-mails.
Sharing data and allowing others to update it (or not) requires an infrastructure, a mechanism to make data available and control access. It requires data to be structured in a way that it can be shared -- which may be different from the internal format used by the app. Yet defining an external format may be good for us in general: we can eliminate the task of reverse-engineering proprietary file formats.
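As a hypothetical sketch in Python (the record layout and field names are invented for illustration), the published external format can be a deliberate translation of the app's internal structure, so the internal layout remains free to change:

```python
import json

# Internal representation: convenient for the app, not meant for other programs.
internal_record = {"cust": {"nm": "Acme Corp", "bal_cents": 125000}}

def export_shared(record):
    """Translate the internal record into a documented external format.

    The external schema (field names, units, version) is the published
    contract; the internal layout can change without breaking other apps.
    """
    cust = record["cust"]
    return json.dumps({
        "schema_version": 1,
        "customer_name": cust["nm"],
        "balance_dollars": cust["bal_cents"] / 100,  # external format uses dollars
    })

print(export_shared(internal_record))
```

The point is the separation: other apps read the documented schema, never the internal structure, so no one has to reverse-engineer a proprietary format.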
Life without files. Because we want to get work done, not read and save files.
Tuesday, May 14, 2013
IT must be strategic as well as tactical
Anyone running an IT shop (or running a company) must divide their efforts for IT into two categories: strategic and tactical.
Strategic efforts help the company in, well, strategic ways. They introduce new products to counter (or surpass) the competition. They create new markets. They do big, bold things -- sometimes risky -- for big returns.
Tactical efforts also help the company, but in smaller and less flashy ways. They improve internal processes. They reduce waste. They increase operating efficiencies. They do not do big, bold things, but instead do small, meek things for small returns.
But to do big, bold things you need a team that has its act together. You need the infrastructure in place, and you need a team that can build, operate, and maintain that infrastructure -- while implementing the big strategic stuff.
If your networks are reliable, if your storage is planned, allocated, monitored, and available, if your people have access to the information they need (and only that information), if your systems are updated on time, if your sysadmins have the right tools and your developers have the right tools and your analysts have the right tools and your sales people have the right tools (and they all know how to use them), then you probably have the tactical side working. (I say 'probably' because there are other things that can cause problems. A complete list would be larger than you would want to read in a blog post.)
If you're having problems with the strategic items, look to see how well the tacticals are doing. If *they* are having problems, start fixing those. And by fixing, I mean get the right people on the team, give them the authority to do their jobs, and fund them to get tools and technologies in place. Make sure that your people can get their "real work" done before you start doing grand things.
If the tacticals are working and the strategic items are not, then you don't have a technology problem. (Well, you might, if your strategy is to build something that is not feasible. But even then you would know. When your tacticals are working you have competent people who will tell you your strategy is not possible.)
Bottom line: get the little things working and you will have a reliable platform for larger efforts.
Labels: project governance, project management, strategy, tactics
Wednesday, May 8, 2013
Computers are temples, but tablets are servants
There is an interesting psychological difference between "real" computers (mainframes, servers, and PCs) and the smartphones and tablets we use for mobile computing.
In short, "real" computers are temples of worship, and mobile computers are servants.
Mainframe computers have long attracted the metaphor of religion, with their attendants described as "high priests". Personal computers have not seen such comparisons, but I think the "temple" metaphor holds. (Or perhaps we should say "shrine".)
"Real" computers are non-mobile. They are fixed in place. When we use a computer, we go to the computer. The one exception is laptops, which we can consider portable shrines.
Mobile computers, in contrast, come with us. We do not go to them; they are nearby and ready for our requests.
Tablets and smartphones are intimate. They come with us to the grocery store, the exercise club, and the library. Mainframes of course do not come with us anywhere, and personal computers stay at home. Laptops occasionally come with us, but only with significant effort. (Carry bag, laptop, power adapter and cable, extra VGA cable, VGA-DVI adapter, and goodness-knows-what.)
It's nice to visit a temple, but it's nicer to have a ready and capable servant.
Monday, May 6, 2013
A Risk of Big Data: Armchair Statisticians
In the mid-1980s, laser printers became affordable, word processor software became more capable, and many people found that they were able to publish their own documents. They proceeded to do so. Some showed restraint in the use of fonts; others created documents that were garish.
In the mid-1990s, web pages became affordable, web page design software became more capable, and many people found that they were able to create their own web sites. They proceeded to do so. Some showed restraint in the use of fonts, colors, and the blink tag; others created web sites that were hideous.
In the mid-2010s, storage became cheap, data became collectable, analysis tools became capable, and I suspect many people will find that they are able to collect and analyze large quantities of data. I further predict that many will do so. Some will show restraint in their analyses; others will collect some (almost) random data and create results that are less than correct.
The biggest risk of Big Data may be the amateur. Professional statisticians understand the data, understand the methods used to analyze the data, and understand the limits of those analyses. Armchair statisticians know enough to analyze the data but not enough to criticize the analysis. This is a problem because it is easy to misinterpret the results.
Typical errors are:
- Omitting relevant data (or including irrelevant data) due to incorrect "select" operations.
- Identifying correlation as causation. (In an economic downturn, the unemployment rate increases, as do payments for unemployment insurance. But the payments do not cause the unemployment rate; both are driven by the economy.)
- Identifying the reverse of a causal relationship (Umbrellas do not cause rain.)
- Improper summary operations (Such as calculating an average of a quantized value like processor speed. You most likely want either the median or the mode.)
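The last error in the list is easy to demonstrate. As a small sketch in Python (the sample speeds are invented for illustration), the average of a quantized value such as processor speed yields a number that no processor in the sample actually has, while the median and mode stay within the real set of values:

```python
from statistics import mean, median, mode

# Hypothetical sample: processor speeds (in GHz) come only in discrete steps.
speeds = [1.6, 1.6, 2.0, 2.0, 2.0, 2.4, 3.2]

print(mean(speeds))    # about 2.11 GHz -- a speed no processor in the sample has
print(median(speeds))  # 2.0 GHz -- an actual value from the sample
print(mode(speeds))    # 2.0 GHz -- the most common value
```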
It is easy to make these errors, which is why the professionals take such pains to evaluate their work. Note that none of these are obvious in the results.
When the cost of performing these analyses was high, only the professionals could play. The cost of such analyses is dropping, which means that amateurs can play. And their results will look (at first glance) just as pretty as the professionals'.
In desktop publishing and web page design, it was easy to separate the professionals and the amateurs. The visual aspects of the finished product were obvious.
With big data, it is hard to separate the two. The visual aspects of the final product do not show the workmanship of the analysis. (They show the workmanship of the presentation tool.)
Be prepared for the coming flood of presentations. And be prepared to ask some hard questions about the data and the analyses. It is the only way you will be able to tell the wheat from the chaff.
Labels: big data, data analysis, desktop publishing, web page design
Thursday, May 2, 2013
Our fickleness on the important aspects of programs
Over time, we have changed the attributes we desire in our programs. If we divide the IT age into four eras, we can see this change. Let's consider the four eras to be mainframe, PC, web, and mobile/cloud. These four eras used different technology and different languages, and praised different accomplishments.
In the mainframe era, we focussed on raw efficiency. We measured CPU usage, memory usage, and disk usage. We strove to have enough CPU, memory, and disk, with some to spare but not too much. Hardware was expensive, and too much spare capacity meant that you were paying for more than you needed.
In the PC era we focussed not on efficiency but on user-friendliness. We built applications with help screens and menus. We didn't care too much about efficiency -- many people left PCs powered on overnight, with no "jobs" running.
With web applications, we focussed on globalization, with efficiency as a sub-goal. The big effort was in the delivery of an application to a large quantity of users. This meant translation into multiple languages, the "internationalization" of an application, support for multiple browsers, and support for multiple time zones. But we didn't want to overload our servers, either, so early Perl CGI applications were quickly converted to C or other languages for performance.
With applications for mobile/cloud, we desire two aspects: For mobile apps (that is, the 'UI' portion), we want something easier than "user-friendly". The operation of an app must not merely be simple, it must be obvious. For cloud apps (that is, the server portion), we want scalability. An app must not be monolithic, but assembled from collaborative components.
The objectives for systems vary from era to era. Performance was a highly measured aspect in the mainframe era, and almost ignored in the PC era.
The shift from one era to another may be difficult for practitioners. Programmers in one era may be trained to "optimize" their code for the dominant aspect. (In the mainframe era, they would optimize for performance.) A succeeding era would demand other aspects in their systems, and programmers may not be aware of the change. Thus, a highly-praised mainframe programmer with excellent skills at algorithm design, when transferred to a PC project, may find that his skills are not desired or recognized. His code may receive a poor review, since the expectation for PC systems is "user friendly" and his skills from mainframe programming do not provide that aspect.
Similarly, a skilled PC programmer may have difficulties when moving to web or mobile/cloud systems. The expectations for user interface, architecture, and efficiency are quite different.
Practitioners who start with a later era (for example, the 'young turks' starting with mobile/cloud) may find it difficult to comprehend the reasoning of programmers from an earlier era. Why do mainframe programmers care about the order of mathematical operations? Why do PC programmers care so much about in-memory data structures, to the point of writing their own?
The answers are that, at the time, these were important aspects of programs. They were pounded into the programmers of earlier eras, to a degree that those programmers design their code without thinking about these optimizations.
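The concern over the order of mathematical operations, for example, is easy to reproduce today. A minimal sketch in Python: floating-point addition is not associative, so regrouping the same three numbers changes the result -- exactly the kind of detail an earlier era's programmer was trained to watch.

```python
# Floating-point addition is not associative: grouping affects rounding.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)

print(left)           # 0.6000000000000001
print(right)          # 0.6
print(left == right)  # False
```

On a mainframe billing run over millions of accounts, such rounding differences accumulated into real discrepancies, which is why operation order mattered.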
Experienced programmers must look at the new system designs and the context of those designs. Mobile/cloud needs scalability, and therefore needs collaborative components. The monolithic designs that optimized memory usage are unsuitable to the new environment. Experienced programmers must recognize their learned biases and discard those that are not useful in the new era. (Perhaps we can consider this a problem of cache invalidation.)
Younger programmers would benefit from a deeper understanding of the earlier eras. Art students study the conditions (and politics) of the old masters. Architects study the buildings of the Greeks, Romans, and medieval kingdoms. Programmers familiar with the latest era, and only the latest era, will have a difficult time communicating with programmers of earlier eras.
Each era has objectives and constraints. Learn about those objectives and constraints, and you will find a deeper appreciation of programs and a greater ability to communicate with other programmers.