Microsoft, after years of dominance in the market, now faces competition. That competition, in the form of Apple's Mac OS X and Linux, forces Microsoft to make some changes.
One area for change is the update process for Windows. Microsoft needs to raise their game in this area.
I have several PCs; three of them run Windows. A relatively modern desktop runs Windows 8.1, a slightly older laptop runs Windows 7, and an ancient tower unit runs Windows XP. Each started with the version it runs now, except for the desktop, which started with Windows 8 and was later upgraded to Windows 8.1.
In addition to the PCs running Windows, I have several PCs running Ubuntu Linux: two laptops running the "desktop" version and three tower PCs running the "server" version.
Canonical releases a new version of Ubuntu Linux every six months, each April and October, and they have gotten quite good at it. Each time, my Ubuntu systems tell me that a new version is available. The server versions, which use a command-line interface, display a simple message at sign-on, along with the command to download and install the new version. The desktop versions, which use a graphical interface, display a dialog with a button that says, roughly, "upgrade now".
Ubuntu makes it easy to upgrade. The current system informs me of the upgrade and provides the instructions to install it. The process is simple: download the new package, install it, and restart the computer. (It is the only time I have to restart Linux.)
Windows, in contrast, offers no such support. While the Windows 8 system did download and install the Windows 8.1 update, the Windows 7 machine has said nothing about an upgrade to Windows 8. And the Windows XP machine hums along quietly, mentioning nothing about upgrades. (To be fair, the hardware in that ancient PC is not sufficient for Windows 8, so maybe it knows what it is doing.)
I'm not asking for free upgrades to Windows 8. I recognize that Canonical and Microsoft have different business models. Canonical does not charge for upgrades (or even the first install) of Ubuntu Linux; Microsoft charges for a new install and for each major upgrade. Paying for an upgrade should be a simple affair: one is really paying for an activation code, and the software just happens to come along.
Ubuntu Linux also provides a path for old, out-of-support versions. I installed version 11.10, which ran and promptly told me that it was out of support, and prompted me to upgrade. Imagine installing Windows XP today: would it prompt you to upgrade to a later version? (Ubuntu upgrades through versions; the Windows equivalent would be to upgrade from Windows XP to Windows Vista and then to Windows 7.)
Canonical has raised the bar for operating system updates. They work, they are simple, and they encourage people to move to supported versions. Microsoft must match this level of support in their products. The benefit for Microsoft is that people move to the latest version of Windows, which improves their uptake rate. The benefit for users is that they ... move to the latest version of Windows, which provides the latest security patches.
Corporations and large shops may choose to wait for upgrades. They may wish to test them and then roll them out to their users. That's possible too, through Windows' group policies. Individual users, though, have little to lose.
Tuesday, May 26, 2015
When technology is not the limit
The early days of computing were all about limits. Regardless of the era you pick (mainframe, minicomputer, PC, client-server, etc.) the systems were constrained and imposed hard limits on computations. CPUs were limited in speed. Memory was limited to small sizes. Disks for storage were expensive, so people used the smallest disk they could and stored as much as possible on cheaper tape.
These limitations showed through to applications.
Text editors could hold only a small amount of text at one time. Some were limited to files of that size or smaller. Other editors would "page out" a block of text and "page in" the next block, letting you work on one section of the text at a time, but the page operations worked only in the forward direction -- there was no going back to a previous block.
Compilers would allow programs of only limited sizes (the limits depending on the memory and storage available). Early FORTRAN compilers used only the first six characters of identifiers (variable names and function names) and ignored the remainder, so the variables DVALUES1 and DVALUES2 were treated as the same variable.
In those days, programming required knowledge not only of the language but also of the system limitations. The constraints were a constant pressure, a ceiling that could not be exceeded. Such limitations drove much innovation; we were constantly yearning for more powerful instruction sets, larger memories, and more capacious and faster storage. Over time, we achieved those goals.
The history of the PC shows such growth. The original IBM PC was equipped with an 8088 CPU, a puny (by today's standards) processor that could not even handle floating-point arithmetic in hardware. While the processor could address 1 MB of memory, the computer came equipped with only 64 KB of RAM and 64 KB of ROM. The display was a simple arrangement: either high-resolution monochrome text (with no graphics) or low-resolution color graphics.
Over the years, PCs acquired more powerful processors, larger address spaces, more memory, larger disk drives (well, larger capacities but smaller physical forms), and better displays.
We are at the point where a number of applications have been "solved", that is, they are not constrained by technology. Text editors can hold the entire document (up to several gigabytes) in memory and allow sophisticated editing commands. The limits on editors have been expanded such that we do not notice them.
Word processing, too, has been solved. Today's word processing systems can handle just about any function: wrapping text to column widths, accounting for typeface variations and kerning, indexing and auto-numbering, ... you name it.
Audio processing, e-mail, web browsing, ... all of these have enough technology to get the job done. We no longer look for a larger processor or more memory to solve our problems.
Which leads to an interesting conclusion: When our technology can handle our needs, an advance in technology will not help us.
A faster processor will not help our word processors. More memory will not help us with e-mail. (When one drives in suburbia on 30 MPH roads, a Honda Civic is sufficient, and a Porsche provides no benefits.)
I recognize that there are some applications that would benefit from faster processors and "more" technology. Big data (possibly, although cloud systems seem to be handling that). Factorization of large numbers, for code-breaking. Artificial Intelligence (although that may be more a problem of algorithms than of raw hardware).
For the average user, today's PCs, Chromebooks, and tablets are good enough. They get the job done.
I think that this explains the longevity of Windows XP. It was a "good enough" operating system running on "good enough" hardware, supporting "good enough" applications.
Looking forward, people will have little incentive to switch from 64-bit processors to larger models (128-bit? super-scaled? variable-bit?) because they will offer little in the way of an improved experience.
The market pressure for larger systems will evaporate. What takes its place? What will drive innovation?
I see two things to spur innovation in the market: cost and security. People will look for systems with lower cost. Businesses especially are price-conscious and look to reduce expenses.
The other area is security. With more "security events" (data exposures, security breaches, and viruses) people are becoming more aware of the need for secure systems. Increased security (if there is a way to measure security) will be a selling point.
So instead of faster processors and more memory, look for cheaper systems and more secure (possibly not cheaper) offerings.
Wednesday, May 13, 2015
The other shadow IT
The term "shadow IT" has come to mean IT products and services used within an organization without the blessing (or even knowledge) of the IT support team. Often such services and products are not on the list of approved software.
In the good old days, before the open source movement, when software had to be purchased, it was easy to control purchases of software: the Purchasing Department would verify all purchase requests with the IT Department, and any unauthorized requests would be refused.
Even with open source and "free" software, the IT Department could set policies on individual PCs and prevent people from installing software. IT remained the gatekeepers of software.
With cloud computing, those controls can be bypassed. Software can now be used in the browser. Point your browser at a web site, register, supply a credit card, and you're ready to go. No Purchasing Department, no administrator rights. This is the situation that most people associate with the term "shadow IT".
Yet there is another technology that is used without the knowledge of the IT Department. A technology that has been used by many people, to perform many tasks within the organization. Programs have been written, databases have been designed, and systems have been implemented without the involvement of the IT Department. Worse, these systems have been used in production without extensive testing (perhaps no testing at all), have not been audited, and have no backups or disaster recovery plans.
I'm talking about spreadsheets.
Specifically, Microsoft Excel spreadsheets.
Microsoft Excel is a standard in corporate computing. Just about every "business" PC (as opposed to a "developer" PC or "sysadmin" PC) runs Windows and has Excel. The technology is available, and often mandated by the IT Department as part of the standard configuration.
Millions of people have access to Excel, and they use it. And why not? Excel is powerful, flexible, and useful. There are tutorials for it. There are web pages with hints and tips. Microsoft has made it easy to use. There is little work needed to use Excel to perform calculations and store data. One can even connect Excel to external data sources (to avoid re-typing data) and program Excel with macros in VBA.
Excel, in other words, is a system platform complete with a programming language. It is used by millions of people in thousands (hundreds of thousands?) of organizations. Some small businesses may run completely on Excel. Larger businesses may run on a combination of "properly" designed and supported systems and Excel.
This is the other shadow IT. The spreadsheets used by people to perform mundane (or perhaps not-so-mundane) tasks. The queries to corporate databases. The programs in VBA that advertise themselves as "macros". All operating without IT's knowledge or support.
Comparing two programming languages is difficult. Different languages have different capabilities and different amounts of programming "power". One line of COBOL can do the work of many lines of assembly. Ten lines of Python can do more work than ten lines of Java.
I suspect that if we could compare Excel to the corporate-approved languages of C# and Java, we would find that there is more Excel code than corporate-approved code. That is a lot of code! It means that Excel is the "dark matter" of the IT universe: existing but not observed. (I realize that this is speculation; we have no measurements of Excel code.)
Excel is the shadow technology to watch. Don't ignore file-sharing and browser-based apps; they are risks too. But keep an eye on the technology we already have and use.
Labels: Microsoft Excel, risk management, shadow IT, spreadsheets
Tuesday, May 12, 2015
Cloud programs are mainframe programs, sort of
I was fortunate to start my programming career at the dawn of the age of BASIC. The BASIC language was designed with the user in mind and had several features that made it easy to use.
To truly appreciate BASIC, one must understand the languages that came before it. Comparing BASIC to JavaScript, or Swift, or Ruby makes little sense; each of those came after BASIC (long after) and built on the experience of BASIC. The advantages of BASIC are clear when compared to the languages of the time: COBOL and Fortran.
BASIC was interpreted, which meant that a program could be typed and run in one fast session. COBOL and Fortran were compiled, which meant that a program had to be typed, saved to disk, compiled, linked, and then run. With BASIC, one could change a program and re-run it; with other languages you had to go through the entire edit-save-compile-link cycle.
Where BASIC really had an advantage over COBOL and Fortran was with input. BASIC had a flexible INPUT statement that let a program read values from the user. COBOL was designed to read data from punch cards; Fortran was designed to read data from magnetic tape. Both were later modified to handle input from "the console" -- the terminal a programmer used for an interactive session -- but even with those changes, interactive programs were painful to write. Yet in BASIC it was easy to write a program that asked the user "would you like to run again?".
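For readers who never wrote BASIC, here is a rough sketch of the kind of interactive loop its INPUT statement made trivial -- shown in Python, purely for illustration, with a made-up calculation:

    # A stand-in calculation; the real program could do anything.
    def compute(amount):
        return amount * 1.07  # hypothetical: add a 7 percent surcharge

    while True:
        raw = input("Enter an amount: ")
        print("Result:", compute(float(raw)))
        answer = input("Would you like to run again? (y/n) ")
        if answer.strip().lower() != "y":
            break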
The interactive properties of BASIC made it a hit with microcomputer users. (Its availability, due to Microsoft's aggressive marketing, also helped.) Fortran and COBOL achieved minimal success on microcomputers, setting up the divide between "mainframe programming" (COBOL and Fortran) and "microcomputer programming" (BASIC, and later Pascal). Some rash young members of the computing field called the two divisions "ancient programming" and "modern programming".
But the division wasn't so much between mainframe and microcomputer (or old and new) as we thought. Instead, the division was between interactive and non-interactive. Microcomputers and their applications were interactive and mainframes and their applications were non-interactive. (Mainframe applications were also batch-oriented, which is another aspect.)
What does all of this history have to do with computing in the current day? Well, cloud computing is pretty modern stuff, and it is quite different from the interactive programming on microcomputers. I don't see anyone building cloud applications with BASIC or Pascal; people use Python or Ruby or Java or C#. But cloud computing is close to mainframe computing (yes, that "ancient" form of computing) in that it is non-interactive. A cloud application gets a request, processes it, and returns a response -- and that's it. There is no "would you like to run again?" option from cloud applications.
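To make the contrast concrete, here is a minimal sketch of a cloud-style handler in Python, using Flask; the route and the payload shape are hypothetical. It accepts a request, computes a result, returns a response, and holds no further conversation:

    # Request in, response out, and nothing more -- no "run again?" prompt.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/total", methods=["POST"])
    def total():
        items = request.get_json().get("items", [])  # hypothetical payload: {"items": [1, 2, 3]}
        return jsonify({"total": sum(items)})        # respond and forget

    if __name__ == "__main__":
        app.run()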
Which is not to say that today's systems are not interactive -- they are. But it is not the cloud portion of the system that is interactive. The interactivity with the user has been separated from the cloud; it lives in the mobile app on the user's phone, or perhaps in a JavaScript app in a browser.
With all of the user interaction in the mobile app (or browser app), cloud apps can go about their business and focus on processing. It's a pretty good arrangement.
But it does mean that cloud apps are quite similar to mainframe apps.
Labels: BASIC, cloud computing, COBOL, Fortran, interactive processing, mainframe, mobile/cloud
Sunday, May 10, 2015
The artistic appeal of the white MacBook
Laptops have established their place in computing history. Some have been innovative, others bland.
The white MacBooks were appreciated by many. They were (and still are) useful. And they were unique.
What made the white MacBooks unique was their popularity as a surface for decoration. People have added stickers and artwork to laptops of every make, but the white MacBooks carried more than their share. It was as if they reached out and asked people to decorate them.
Perhaps it was the allure of a flat, white space. (There was a black version of the MacBook, but it was rare. I saw none "in the wild" so I cannot comment on their decorations.) Laptops from other vendors came in black, grey, beige, and sometimes with bright colors, but I recall none in the "Imperial Stormtrooper White" used by Apple.
Perhaps it was the prestige of owning an Apple product, and not a typical PC running Windows. In that age, running an operating system other than Windows was partly an act of rebellion.
Perhaps it was because the people who owned MacBooks were typically artists, musicians, writers, designers, or other creative folks.
That was in the age of the white MacBook. Today, things are different.
I look at the laptops in today's world, and I see little in the way of decorations. Even (or perhaps I should say "especially") among the Apple laptops. Today's laptops are, mostly, plain and un-enhanced. (Yes, there are a few who add stickers to their laptops. But the number is a lot smaller than it used to be.)
Some factor inhibits decorations. It might be the new color scheme (silver, gold, or grey). It might be the texture of the dimpled surface. It might be the curve of the device covers, no longer offering the flat surface.
I will content myself with speculation, and leave the analysis to the psychologists and anthropologists.
Wednesday, May 6, 2015
Cloud apps may not need OOP
The rise of different programming styles follows the growth in program sizes.
The earliest programs were small -- tiny by today's standards. Most would fit on a single printed page; the largest took a few pages.
Programs larger than a few pages quickly became a tangled mess. Structured Programming (SP) was a reaction to the need for larger programs. It was a technique to organize code and allow programmers to quickly learn an existing system. With SP we saw languages that used structured techniques: Pascal and C became popular, and Fortran and BASIC changed to use the structured constructs IF-THEN-ELSE and DO-WHILE.
Structured Programming was able to organize code up to a point, but it could not manage the large systems of the 1990s. Object-oriented programming (OOP) was a reaction to the need for programs larger than several hundred printed pages. With OOP we saw languages that used object-oriented techniques: Java and C# became popular, C mutated into C++, and Pascal mutated into Object Pascal. These new languages (and new versions of old languages) used the object-oriented constructs of encapsulation, inheritance, and polymorphism.
Cloud computing brings changes to programming, but in a new way. Instead of larger programs, cloud computing allows for (and encourages) smaller programs. The need for large, well-organized programs has been replaced by a need for well-organized systems of small programs. In addition, the needs placed on the small programs are different from the needs of the old, pre-cloud programs: cloud programs must be fast, replicable, and substitutable. The core idea of cloud computing is that a number of servers stand ready to respond to requests and that any server (of a given class) can handle your request -- you don't need a specific server.
In this environment, object-oriented programming is less useful. It requires some overhead -- for the design of programs and at run time. Its strength is organizing code for large programs, but it offers little for small programs. I expect that people will move away from OOP languages for cloud systems, and towards languages that emphasize readability and reliability.
I don't expect a renaissance of Structured Programming. I don't expect anyone to move back to the older SP-inspired languages of Pascal and Fortran-77. Cloud computing may be the technology that pushes us to move to the "Functional Programming" style. Look for cloud-based applications to use functional languages such as Haskell and Erlang. (Maybe F#, for Microsoft shops.)
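As a sketch of what that style looks like -- written here in Python rather than Haskell or Erlang, with hypothetical names and rates -- the emphasis is on small, pure functions and no shared state, which is exactly what makes a cloud program easy to replicate and substitute:

    # Small pure functions, no classes, no shared mutable state.
    # (The discount and tax rates are hypothetical.)
    from functools import reduce

    def apply_discount(price, rate=0.10):
        return price * (1.0 - rate)

    def add_tax(price, tax=0.06):
        return price * (1.0 + tax)

    def order_total(prices):
        # Compose the pure functions over the input; same input, same output, every time.
        adjusted = (add_tax(apply_discount(p)) for p in prices)
        return reduce(lambda total, p: total + p, adjusted, 0.0)

    print(order_total([10.00, 25.00]))  # deterministic -- no hidden state to carry between servers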