Showing posts with label mobile/cloud. Show all posts

Sunday, January 3, 2016

Predictions for 2016

It's the beginning of a new year, which means... predictions! Whee!

Let's start with some obvious predictions:

Mobile will be big in 2016.

Cloud will be big in 2016.

NoSQL and distributed databases will be big in 2016.

Predictions like these are easy.

Now for something a little less obvious: legacy applications.

With the continued interest in mobile, cloud, NoSQL, and distributed databases, these areas will see strong demand for architects, developers, designers, and testers. That demand will pull people away from legacy applications -- those applications built for classic, non-cloud web architectures as well as the remaining desktop applications and mainframe batch systems.

Which is unfortunate for the managers of those legacy applications, because I believe that 2016 is going to be the year that companies decide that they want to migrate those legacy applications to the cloud/mobile platform.

When the web appeared, lots of managers held back, waiting to see if the platform would prove itself. It did, and companies migrated most of their applications from desktop to web (either external or internal). Even Microsoft, stalwart of desktop applications, created a web-based version of Outlook.

Likewise, when mobile and cloud appeared, many managers held back and waited for the new technologies to prove themselves. With almost ten years of mobile and cloud, and many companies already using those technologies, it's time for the holdouts to take action.

Look for renewed interest in converting existing desktop and classic web applications. The conversions will have challenges. In one sense, the job is easier than the early conversions, because we now have experience with mobile/cloud systems and we understand the architecture. In other ways, it may be harder: the easy conversions (the "low-hanging fruit") have already been done, leaving the more difficult ones.

The architecture of mobile/cloud systems (with or without distributed databases) is different from classic web applications. (And very different from desktop applications.)

I think that 2016 will be the year of rude awakening, as companies look at the effort to convert their legacy systems to newer technologies.

But the rude awakening is delivered in two phases. The first is the cost and time to convert legacy applications. The second is the cost of maintaining legacy applications in their current form.

Why the cost of maintaining legacy applications, without changing them to newer technologies? Because the demand for mobile/cloud is high. New entrants to the field will know the new technologies, and select jobs that let them use that knowledge. That means that the folks with knowledge of the older technologies will be, um, older.

The folks with knowledge about older languages (C++, Visual Basic) and older APIs (Flash) will be the senior developers. And senior developers are more expensive than junior developers.

So the owners of legacy applications have a rather unpleasant choice: migrate to mobile/cloud, which is expensive, or stay on the legacy platform, which will also be expensive.

Thursday, July 30, 2015

Tablets for consumption, cloudbooks for creation

Tablets and cloudbooks are mobile devices of the mobile/cloud computing world.

Tablets are small, flat, keyboardless devices with a touchscreen, processor, storage, and an internet connection. The Apple iPad is possibly the most well-known tablet. The Microsoft Surface is possibly the second most well-known. Other manufacturers offer tablets with Google's Android.

Cloudbooks are light, thin laptops. They contain a screen (possibly a touchscreen, but touch isn't a requirement), processor, storage, and internet connection, and the one thing that separates them from tablets: a keyboard. They look and feel like laptop computers, yet they are not laptops in the usual sense. They have a low-end processor and a custom operating system designed to do one thing: run a browser. The most well-known cloudbook computers are Google's Chromebooks.

I'm using the term "cloudbook" here to refer to the generic lightweight, low-powered, single-purpose laptop computer. A simple search shows that the name "cloudbook" (with variations in capitalization) has been used for specific products, including an x86 laptop, a brand of e-books, a cloud services broker, and even an accounting system! Acer uses the name "cloudbook" for its, um, cloudbook devices.

Tablets and cloudbooks serve two different purposes. Tablets are designed for the consumption of data and cloudbooks are designed for the creation of data.

Tablets allow for the installation of apps, and there are apps for all sorts of things. Apps to play games. Apps to play music. Apps to chat with friends. Apps for e-mail (generally effective for reading e-mail and writing brief responses). Apps for Twitter. Apps for navigation.

Cloudbooks allow for the installation of apps too, although it is the browser that allows for apps and not the underlying operating system. On a Chromebook, it is Chrome that manages the apps. Google confuses the issue by listing web-based applications such as its Docs word processor and Sheets spreadsheet as "apps". The separation of web-based apps and browser-based apps is made more complex by Google's creation of duplicate apps for each environment to support off-line work. For off-line work, you must have a local (browser-based) app.

The apps for cloudbooks are oriented toward the creation of data: word processing, spreadsheets, photograph editing, and more.

I must point out that these differences are of orientation, not absolute capability. One can consume data on a cloudbook. One can, with the appropriate tools and effort, create on a tablet. The two types of devices are not exclusive. In my view it is easier to consume on a tablet and easier to create on a cloudbook.

Tablets are already popular. I expect that cloudbooks will be popular with people who need to create and manage data. Two groups I expect to use cloudbooks are developers and system administrators. Cloudbooks are a convenient size for portability and capable enough to connect to cloud-based development services such as Cloud9, Codeanywhere, Cloud IDE, or Sourcekit.

Tuesday, July 7, 2015

I can write any language in FORTRAN

Experienced programmers, when learning a new programming language, often use the patterns and idioms of their old language in the new one. Thus, a programmer experienced in Java and learning Python will write code that, while legal Python, looks and smells like Java. The code is not "Pythonic".

When writing code in a new programming language, I often write as if it were C. As I learn the language, I change my patterns to match it. The common saying is that a good programmer can write any language in FORTRAN. It's an old saying, probably from the age when most programmers learned COBOL and FORTRAN.

When the IT world shifted from structured programming languages (C, BASIC, FORTRAN) to object-oriented programming languages (C++, Java, C#) much of the code written in the new languages was in the style of the old languages. Eventually, we programmers learned to write object-oriented code.

Today, most programmers learn C# or Java as their first language.  Perhaps we should revise our pithy saying to: A good programmer can write any language in Java. (Or C#, if you prefer.)

Why is this important? Why think about the transition from structured programming ("procedural programming") to object-oriented programming?

Because we're going through another transition. Two, actually.

The first is the transition from object-oriented programming to functional programming. This is a slow change, one that will take several years and perhaps decades. Be prepared to see more about functional programming: articles, product releases, and services in programming platforms. And be prepared to see lots of functional programs written in a style that matches object-oriented code.

The second is the transition from web applications to mobile/cloud applications. This change is faster and is already well underway. Yet be prepared to see lots of mobile/cloud applications architected in the style of web applications.

Eventually, we will learn to write good functional programs. Eventually, we will learn to design good cloud systems. Some individuals (and organizations) will make the transition faster than others.

What does this mean for the average programmer? For starters, be aware of your own skills. Second, have a plan to learn the new programming paradigms. Third, be aware of the skills of a hiring organization. When a company offers you a job, understand how that company's level matches your own.

What does this mean for companies? First, we have yet another transition in the IT world. (It may seem that the IT world has a lot of these inconvenient transitions.) Second, develop a plan to change your processes to use the new technology. (The changes are happening whether you like them or not.) Third, develop a plan to help your people learn the new technologies. Companies that value skilled employees will plan for training, pilot programs, and migration efforts. Companies that view employees as expensive resources that are little more than cost centers will choose to simply engage contractors with the skills and lay off those currently on their payrolls.

And in the short term, be prepared to see a lot of FORTRAN.

Tuesday, May 12, 2015

Cloud programs are mainframe programs, sort of

I was fortunate to start my programming career in the dawn of the age of BASIC. The BASIC language was designed with the user in mind and had several features that made it easy to use.

To truly appreciate BASIC, one must understand the languages that came before it. Comparing BASIC to JavaScript, or Swift, or Ruby makes little sense; each of those came after BASIC (long after) and built on the experience of BASIC. The advantages of BASIC are clear when compared to the languages of the time: COBOL and Fortran.

BASIC was interpreted, which meant that a program could be typed and run in one fast session. COBOL and Fortran were compiled, which meant that a program had to be typed, saved to disk, compiled, linked, and then run. With BASIC, one could change a program and re-run it; with other languages you had to go through the entire edit-save-compile-link cycle.

Where BASIC really had an advantage over COBOL and Fortran was with input. BASIC had a flexible INPUT statement that let a program read values from the user. COBOL was designed to read data from punch cards; Fortran was designed to read data from magnetic tape. Both were later modified to handle input from "the console" -- the terminal a programmer used for an interactive session -- but even with those changes, interactive programs were painful to write. Yet in BASIC it was easy to write a program that asked the user "would you like to run again?".

The interactive properties of BASIC made it a hit with microcomputer users. (Its availability, due to Microsoft's aggressive marketing, also helped.) Fortran and COBOL achieved minimal success with microcomputers, setting up the divide between "mainframe programming" (COBOL and Fortran) and "microcomputer programming" (BASIC, and later Pascal). Some rash young members of the computing field called the two divisions "ancient programming" and "modern programming".

But the division wasn't so much between mainframe and microcomputer (or old and new) as we thought. Instead, the division was between interactive and non-interactive. Microcomputers and their applications were interactive and mainframes and their applications were non-interactive. (Mainframe applications were also batch-oriented, which is another aspect.)

What does all of this history have to do with computing in the current day? Well, cloud computing is pretty modern stuff, and it is quite different from the interactive programming on microcomputers. I don't see anyone building cloud applications with BASIC or Pascal; people use Python or Ruby or Java or C#. But cloud computing is close to mainframe computing (yes, that "ancient" form of computing) in that it is non-interactive. A cloud application gets a request, processes it, and returns a response -- and that's it. There is no "would you like to run again?" option from cloud applications.
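To make the request-process-respond shape concrete, here is a minimal sketch in Python. The function name and the request/response fields are illustrative assumptions, not any particular framework's API:

```python
# A minimal sketch of the stateless request/response model described above.
# There is no session and no "would you like to run again?" prompt; any
# state a request needs must arrive with the request itself.

def handle_request(request):
    """Process one request and return one response, then forget everything."""
    action = request.get("action")
    if action == "total":
        amounts = request.get("amounts", [])
        return {"status": 200, "body": {"total": sum(amounts)}}
    return {"status": 400, "body": {"error": "unknown action"}}

# Each call is independent; the service keeps nothing between calls.
response = handle_request({"action": "total", "amounts": [3, 4, 5]})
```

That single-shot shape is exactly the batch-like quality the post describes: like a mainframe job, the cloud app runs once per input and stops.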

Which is not to say that today's systems are not interactive -- they are. But it is not the cloud portion of the system that is interactive. The interactivity with the user has been separated from the cloud; it lives in the mobile app on the user's phone, or perhaps in a JavaScript app in a browser.

With all of the user interaction in the mobile app (or browser app), cloud apps can go about their business and focus on processing. It's a pretty good arrangement.

But it does mean that cloud apps are quite similar to mainframe apps.

Thursday, April 30, 2015

Files are static, requests are dynamic

The transition from desktop or web applications to mobile/cloud systems is more than the re-organization of programs. It is a change in data sources: desktop and web applications often store data in files, while mobile/cloud systems store data via web services.

Files are static things. They perform no actions by themselves. A program can read the contents of a file and take action on those contents, but it must consume the contents as they exist. The file may contain just the data that a program needs, or it may contain more, or less. For example, a file containing a Microsoft Word document actually contains the text of the document, revisions to the text, information about fonts and formatting, and meta-information about the author.

A program reading the contents of that file must read all of that information; it has no choice. If the task is to extract the text -- and only the text -- the program must read the entire file, revisions and fonts and meta-information included. If we want only the meta-information, the program must read the entire file, text and revisions... you get the idea.

(The more recent DOCX format does isolate the different sets of information, and makes reading a subset of the file easier. The older DOC format required reading and interpreting the entire file to obtain any part of the file.)

Web services, used by mobile/cloud systems, are not static but dynamic. (At least they have the possibility of being dynamic. You can build web services that mimic files, but you probably want the dynamic versions.)

A web service can be dynamic because there is another program processing the request and creating the response. A web service to read a document can do more than simply return the bytes in the document file. It can accept instructions, such as "give me the text" or "only the meta-information, please", and perform that processing on our behalf. We can delegate work to the web service, and our job becomes easier.
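A minimal sketch of such an instruction-taking document service, in Python. The store, field names, and `get_document` function are illustrative assumptions, not any real API:

```python
# Hypothetical document store; the contents and field names are made up
# for illustration.
DOCUMENTS = {
    "report.doc": {
        "text": "Quarterly results were strong.",
        "revisions": ["draft 1", "draft 2"],
        "meta": {"author": "pat", "created": "2015-04-01"},
    },
}

def get_document(doc_id, part="all"):
    """Return only the requested part of a document ('text', 'meta', ...),
    so the caller need not read and parse the whole file itself."""
    doc = DOCUMENTS[doc_id]
    if part == "all":
        return doc
    return {part: doc[part]}

# The caller asks for just the text; the service does the filtering.
just_text = get_document("report.doc", part="text")
```

The caller never sees revisions or formatting unless it asks for them; the reading and filtering happen behind the interface.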

Astute readers will observe that my arrangement of dynamic web services does not reduce the work involved, it merely shifts work to different parts of the system. (The web service must still read the entire document, pick out the bits of interest, and send those to us.) That is true. Yet it is also true that once in place, the web service provides an interface to reading (and writing) documents, and we may then choose to change the implementation of that storage.

With document web services in place, our applications are completely ignorant of the storage format and the web services may change that format to suit their needs. A new version of the web services may store documents not as monolithic files but in databases, as JSON documents, or in any form appropriate. I'm pretty sure that Google Docs uses this approach, and I suspect Microsoft's Office 365, if not using it now, will use it soon.

Moving from desktop and web to mobile/cloud lets us do many things, including things we do today, but differently. Look at the possibilities, and look at the savings in effort and cost.

Wednesday, January 28, 2015

The mobile/cloud revolution has no center

In some ways, the mobile/cloud market is a re-run of the PC revolution. But not completely.

The PC revolution of the 1980s (which saw the rise of the IBM PC, PC-DOS, and related technologies) introduced new hardware that was cheaper and easier to use than the previous technologies of mainframes and minicomputers. Today's mobile/cloud revolution shares that aspect, with cloud-based services and mobile devices cheaper than their PC-based counterparts. It's much easier to use a phone or tablet than it is to use a PC -- ask the person who installs software.

The early PC systems, while cheaper and easier to use, were much less capable than the mainframe and minicomputer systems. People ran large corporations on mainframes and small businesses on minicomputers; PCs were barely able to print and handle a few spreadsheets. It was only after PC-compatible networks and network-aware software (Windows 3.1, Microsoft Exchange) that one could consider running a business on PCs. Mobile/cloud shares this attribute, too. Phones and tablets are network-aware, of course, but the whole "mobile and cloud" world is too new, too different, too strange to be used for business. (Except for some hard-core folks who insist on doing it.)

Yet the two revolutions are different. The PC revolution had a definite center: the IBM PC at first, and Windows later. The 1980s saw IBM as the industry leader: IBM PCs were the standard unit for business computing. Plain IBM PCs at first, and then IBM PC XT units, and later IBM PC-compatibles. There were lots of companies offering personal computers that were not IBM-compatible; these offerings (and their companies) were mostly ignored. Everyone wanted "in" on the IBM PC bandwagon: software makers, accessory providers, and eventually clone manufacturers. It was IBM or nothing.

The mobile/cloud revolution has no center, no one vendor or technology. Apple devices are popular but no vendors are attempting to sell clones in the style of PC clones. To some extent, this is due to Apple's nature and their proprietary and closed designs for their devices. (IBM allowed anyone to see the specs for the IBM PC and invited other vendors to build accessories.)

Apple is not the only game in town. Google's Android devices compete handily with the Apple iPhone and iPad. Google also offers cloud services, something Apple does not. (Apple's iCloud product is convenient storage but it is not cloud services. You cannot host an application in it.)

Microsoft is competing in the cloud services area with Azure, and doing well. It has less success with its Surface tablets and Windows phones.

Other vendors offer cloud services (Amazon.com, IBM, Oracle, SalesForce) and mobile devices (BlackBerry). Today's market sees lots of technologies. It is a far cry from the 1980s "IBM or nothing" mindset, which may show that consumers of IT products and services have matured.

When there is one clear leader, the "safe" purchasing decision is easy: go with the leader. If your project succeeds, no one cares; if your project fails you can claim that even "big company X" couldn't handle the task.

The lack of a clear market leader makes life complicated for those consumers. With multiple vendors offering capable but different products and services, one must have a good understanding of the projects before selecting a vendor. Success is still success, but failure allows others to question your ability.

Multiple competing technologies also means competition at a higher level. In the PC revolution, IBM and Compaq competed on technology, but the basic platform (the PC) was a known quantity. In mobile/cloud, we see new technologies from start-up companies such as containers and new technologies from the established vendors such as cloud management and the Swift programming language.

The world of mobile and cloud has no center, and as such it can move faster than the old PC world. Keep that in mind when building systems and selecting vendors. Be prepared for bumps and turns.

Monday, October 6, 2014

Innovation in mobile and cloud; not in PCs

The history of IT is the history of innovation. But innovation is not evenly distributed, and it does not stay with one technology.

For a long time, innovation focussed on the PC. The "center of gravity" for innovation was, for a long time, the IBM PC and PC-DOS. Later it became the PC (not necessarily from IBM) and Windows. Windows NT, Windows 2000, and Windows XP all saw significant expansions of features.

With the rise of the Web, the center of gravity shifted to web servers and web browsers. I think that it is no coincidence that Microsoft offered Windows XP with no significant changes. People accepted Windows XP as "good enough" and looked for innovation in other areas -- web browsers, web servers, and databases.

This change broke Microsoft's business model. That business model (selling new versions of Windows and Office to individuals and corporations every so often) was broken when users decided that Windows XP was good enough, that Microsoft Office was good enough. They moved to newer versions reluctantly, not expectantly.

Microsoft is changing its business model. It is shifting to a subscription model for Windows and Office. It has Azure for cloud services. It developed the Surface tablet for mobile computing. Microsoft's Windows RT was an attempt at an operating system for mobile devices, an operating system that had reduced administrative tasks for the user. These are the areas of innovation.

We have stopped wanting new desktop software. I know of no new projects that target the desktop. I know of no new projects that are "Windows only" or "PC only". New projects are designed for mobile/cloud, or possibly web browsers and servers. With no demand for new applications on the desktop, there is no pressure to improve the desktop PC - or its operating system.

With no pressure to improve the desktop, there is no need to change the hardware or operating system. We see changes in three areas: larger memory and disks (mostly from inertia), smaller form factors, and prettier user interfaces (Windows Vista and Windows 8 "Metro"). With each of these changes, users can (rightfully) ask: what is the benefit to me?

It is a question that newer PCs and operating systems have not answered. But tablets and smartphones answer it quite well.

I think that Windows 10 is the "last hurrah" for Windows -- at least the desktop version. Innovations to Windows will be modifications for mobile/cloud technologies: better interactions with virtualization hypervisors and container managers. Aside from those, look for little changes in desktop operating systems.

Tuesday, August 26, 2014

With no clear IT leader, expect lots of changes

The introduction of the IBM PC was market-wrenching. Overnight, the small, rough-and-tumble market of microcomputers with diverse designs from various small vendors became large and centered around the PC standard.

From 1981 to 1987, IBM was the technology leader. IBM led in sales and also defined the computing platform.

IBM's leadership fell to Compaq in 1987, when IBM introduced the PS/2 line with its new (incompatible) hardware. Compaq delivered old-style PCs with a faster bus (the EISA bus) and notably the Intel 80386 processor. (IBM stayed with the older 80286 and 8086 processors, eventually consenting to provide 80386-based PS/2 units.) Compaq even worked with Microsoft to deliver newer versions of MS-DOS that recognized larger memory capacity and optical disc readers.

But Compaq did not remain the leader. Its leadership declined gradually, to the clone makers, especially Dell, HP, and Gateway.

The mantle of leadership moved from a PC manufacturer to the Microsoft-Intel duopoly. The popularity of Windows, along with marketing skill and software development prowess led to a stable configuration for Microsoft and Intel. Together, they out-competed IBM's OS/2, Motorola's 68000 processor, DEC's Alpha processor, and Apple's Macintosh line.

That configuration held for nearly two decades, roughly from 1990 to 2007, when Apple introduced the iPhone. The genius move was not the iPhone hardware, but the App Store and iTunes, which let users easily find, install, and pay for apps on their phones.

Now Microsoft and Apple have the same problem: after years of competing in a well-defined market (the corporate PC market) they struggle to move into the world of mobile computing. Microsoft's attempts at mobile devices (Zune, Kin, Surface RT) have flopped. Intel is desperately attempting to design and build processors that are suitable for low-power devices.

I don't expect either Microsoft or Intel to disappear. (At least not for several years, possibly decades.) The PC market is strong, and Intel can sell a lot of its traditional (heat radiators that happen to compute data) processors. Microsoft is a competent player in the cloud arena with its Azure services.

But I will make an observation: for the first time in the PC era, we find that there is no clear leader for technology. The last time we were leaderless was prior to the IBM PC, in the "microcomputer era" of Radio Shack TRS-80 and Apple II computers. Back then, the market was fractured and tribal. Hardware ruled, and your choice of hardware defined your tribe. Apple owners were in the Apple tribe, using Apple-specific software and exchanging data on Apple-specific floppy disks. Radio Shack owners were in the Radio Shack tribe, using software specific to the TRS-80 computers and exchanging data on TRS-80 diskettes. Exchanging data between tribes was one of the advanced arts, and changing tribes was extremely difficult.

There were some efforts to unify computing: CP/M was the most significant. Built by Digital Research (a software company with no interest in hardware), CP/M ran on many different configurations. Yet even that effort could not span the differences in processors, memory layout, and video configurations.

Today we see tribes forming around multiple architectures. For cloud computing, we have Amazon.com's AWS, Microsoft's Azure, Google's App Engine. With virtualization we see VMware, Oracle's VirtualBox, the aforementioned cloud providers, and newcomer Docker as a rough analog of CP/M. Mobile computing sees Apple's iOS, Google's Android, and Microsoft's Windows RT as a (very) distant third.

With no clear leader and no clear standard, I expect each vendor to enhance their offerings and also attempt to lock in customers with proprietary features. In the mobile space, Apple's Swift and Microsoft's C# are both proprietary languages. Google's choice of Java puts them (possibly) at odds with Oracle -- although Oracle seems to be focussed on databases, servers, and cloud offerings, so there is no direct conflict. Things are a bit more collegial in the cloud space, with vendors supporting OpenStack and Docker. But I still expect proprietary enhancements, perhaps in the form of add-ons.

All of this means that the technology world is headed for change. Not just change from desktop PC to mobile/cloud, but changes in mobile/cloud. The competition from vendors will lead to enhancements and changes, possibly significant changes, in cloud computing and mobile platforms. The mobile/cloud platform will be a moving target, with revisions as each vendor attempts to out-do the others.

Those changes mean risk. As platforms change, applications and systems may break or fail in unexpected ways. New features may offer better ways of addressing problems and the temptation to use those new features will be great. Yet re-designing a system to take advantage of new infrastructure features may mean that other work -- such as new business features -- waits for resources.

One cannot ignore mobile/cloud computing. (Well, I suppose one can, but that is probably foolish.) But one cannot, with today's market, depend on a stable platform with slow, predictable changes like we had with Microsoft Windows.

With such an environment, what should one do?

My recommendations:

Build systems of small components  This is the Unix mindset, with small tools to perform specific tasks. Avoid large, monolithic systems.

Use standard interfaces  Use web services (either SOAP or REST) to connect components into larger systems. Use JSON and Unicode to exchange data, not proprietary formats.

Hedge your bets  Gain experience in at least two cloud platforms and two mobile platforms. Resist the temptation of "corporate standards". Standards are good with a predictable technology base. The current base is not predictable, and placing your eggs in one vendor's basket is risky.

Change your position  After a period of use, examine your systems, your tools, and your talent. Change vendors -- not for everything, but for small components. (You did build your system from small, connected components, right?) Migrate some components to another vendor; learn the process and the difficulties. You'll want to know them when you are forced to move to a different vendor.
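The "standard interfaces" recommendation above can be sketched in a few lines of Python. Plain function calls stand in for HTTP here, and the component name and message fields are made up for illustration:

```python
import json

# Two small components exchanging data as JSON over a standard interface.
# Because the contract is plain JSON (not a proprietary format), either
# side can be re-implemented on another vendor's platform without
# changing the other.

def pricing_component(request_json):
    """One small, single-purpose component: price a list of items."""
    request = json.loads(request_json)
    total = sum(item["qty"] * item["unit_price"] for item in request["items"])
    return json.dumps({"total": total})

# A caller builds a JSON request and reads back a JSON reply.
reply = pricing_component(json.dumps({"items": [{"qty": 2, "unit_price": 3.5}]}))
```

The point is the boundary, not the arithmetic: a component that speaks only JSON over a standard interface is the kind you can migrate to another vendor one piece at a time.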

Many folks involved in IT have been living in the "golden age" of a stable PC platform. They may have weathered the change from desktop to web -- which saw a brief period of uncertainty. More than likely, they think that the stable world is the norm. All that is fine -- except we're not in the normal world with mobile/cloud. Be prepared for change.

Wednesday, February 19, 2014

The great puzzle of Microsoft Office

What will Microsoft do with Office? Or, what should Microsoft do with Office?

Microsoft built an empire with Office. Office was the most powerful word processor and spreadsheet package. It used proprietary formats. It read files from other word processors and spreadsheets but did not write to those formats, making the trip for data one-way: into Microsoft Office. Through marketing, fierce competition, and the network effect, Microsoft convinced most businesses and most home users to use (and buy) Microsoft Office.

Those were the days.

The world is changing.

Large businesses still use Windows for their desktop environment. Small businesses, especially technology start-ups, are using Mac OS or Linux.

Large businesses still use Microsoft Office. Small businesses are looking at LibreOffice (an open source desktop package with word processing and spreadsheets) or Google Apps (an on-line office package with word processing, spreadsheets, e-mail, calendaring, and other things).

The tablet world is dominated by iOS (on iPads) and Android (on just about everything else). Windows holds a tiny share. The same goes for smart phones.

These are the pieces of the great puzzle that Microsoft must solve. What is a software giant to do?

First, some observations.

Microsoft is the latecomer Microsoft is late to the market, but they have been in this position before and succeeded. They were late with C#/.NET after Java. They were late with Internet Explorer after Netscape Navigator. They were late with spreadsheets after Lotus 1-2-3. They were late with word processors after Wordstar and WordPerfect. They were late with databases after dBase and R:Base. Being a latecomer has not doomed Microsoft yet.

New hardware platforms Microsoft must live (and compete) in a world beyond the PC. Phones and tablets must be part of the solution. Tablets and phones are a very different arena for software design, due to the size of the screen, the touch interface, and intermittent connectivity. Any product on the tablet or phone is a different creature than its PC counterpart.

Multiple software platforms Microsoft must live (and compete) in a fractured world of software, with multiple operating systems (some not of Microsoft's making or control). Offerings from Microsoft must work with iOS and Android as well as Windows.

The desktop software model doesn't work on mobile devices Microsoft's past technique of selling premium software and obtaining market share through marketing won't work on the mobile platform.

Given these conditions, Microsoft needs a new approach. Here are some ideas:

Sell services, not software Microsoft will not focus on selling copies of Office for the mobile world. Instead, it will focus on subscribers to its services. The mobile versions of Word and Excel and Outlook will be offered at low prices -- perhaps at no cost -- but they will be useless without the service.

Cloud storage, not local files Microsoft Office will store data in the cloud (Microsoft's cloud).

Not documents and workbooks, but pieces assembled Instead of entire documents and complete spreadsheets, Microsoft services will stitch together fragments of documents and spreadsheets. Think of it as an advanced form of OLE. (Remember OLE and our excitement at embedding a spreadsheet in a document?)
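
A minimal sketch of what fragment assembly might look like, in Python. This is speculation, not any actual Microsoft API; all names here (`Fragment`, `assemble`, the fragment ids) are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fragment:
    """A piece of a document stored in the cloud, addressable on its own."""
    fragment_id: str
    content: str

def assemble(fragment_ids, store):
    """Stitch cloud-hosted fragments into a single document view."""
    return "\n".join(store[fid].content for fid in fragment_ids)

# A report that embeds a shared paragraph and a shared table, OLE-style:
store = {
    "para-1": Fragment("para-1", "Quarterly results were strong."),
    "tbl-7": Fragment("tbl-7", "Region | Sales\nEast   | 120"),
}
print(assemble(["para-1", "tbl-7"], store))
```

The point of the design: because each fragment lives in the cloud independently, two documents can embed the same fragment and both see its latest version.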

Versioning and tracked changes Microsoft's cloud will keep track of the versions of each document (or document fragment), allowing us to see changes over time and the notes for each change.

Access control (for enterprise users) With all of these fragments floating in the cloud, enterprise users (businesses and their support teams) will want to control access by users.

Promotion and publication (also for enterprise) Users will be able to publish data to other users. Users will also be able to work on new versions of data, reviewing it with other members of their team, revising it, and eventually marking it as "available to everyone". Or maybe "available to selected users".

The idea of Office as a service seems a natural fit for mobile devices. Notice that this vision does not demand Windows tablets -- one can use it with iPads and Android devices. I expect Microsoft to move in this direction.

Sunday, February 9, 2014

Mobile/cloud chips away at batch processing

Consider a particular type of PC application, one that I call "the spreadsheet app". It processes information. It accepts large quantities of data as input (a spreadsheet, or multiple spreadsheets), processes the data, and then provides results as large quantities of data (another spreadsheet with multiple pages).

In some ways, it is a design of mainframe batch processing: collect all of the input data up front, process it in one large calculation, and provide the results in one large batch.

The PC revolution was supposed to change all of that. The PC revolution was supposed to slay the batch processing beast and make data processing interactive. Yet here we are, thirty years later, still processing data in large batches.

I think that tablets (or more specifically, mobile/cloud) will succeed where the PC revolution failed.

Moving a spreadsheet app to mobile/cloud is not easy. A direct port makes little sense: tablets have small screens and data entry into spreadsheets requires fine control for the selection of cells.

A "native" tablet app would take advantage of the strengths of mobile/cloud (interactive displays and fast processing on the cloud servers) and avoid the weaknesses (constrained data entry). Instead of data entry into a large grid of numbers, data must be entered in smaller units. This is possible with lots of spreadsheet apps; their data is often structured into collections (and sometimes collections of collections). Tablets can handle data entry in smaller chunks.

The shift from the spreadsheet grid to a collection of units is the same as the shift from batch processing to interactive processing. The app must accept small units of data, incorporating each smaller unit into the larger whole.

It's possible to build the entire large batch of data in this manner, and then process the batch at once, but it's also possible to process each unit of data (a small batch) as it is entered.

Even if each unit of input requires a complete re-calculation of the data, that may be okay. The calculation would be performed on a server (or a set of cloud-based servers) and computing is getting faster and faster. Pushing data to cloud servers for calculations makes sense.
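
The contrast between the two styles can be sketched in a few lines of Python (the names and the trivial "calculation" are mine, chosen only to illustrate the shape of the two designs):

```python
def batch_total(units):
    """Mainframe style: collect everything up front, process in one pass."""
    return sum(units)

class IncrementalTotal:
    """Mobile/cloud style: fold each small unit in as it arrives,
    keeping the result current after every entry."""
    def __init__(self):
        self.total = 0

    def accept(self, unit):
        self.total += unit   # the re-calculation happens per unit
        return self.total    # a current result after each entry

calc = IncrementalTotal()
for unit in [10, 25, 7]:
    current = calc.accept(unit)   # the user sees a result as they type

assert current == batch_total([10, 25, 7])   # same answer, different timing
```

Both styles arrive at the same answer; the difference is when the answer is available -- at the end of a batch run, or after every unit of input.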

Tablets still have a limited size, and displaying the results may be problematic. One cannot display a raft of numbers on a tablet. (Well, one can display a raft of numbers, by using a very small typeface size. But the results would be undesirable.)

Instead of a text display, apps for tablets will (most likely) use graphical representations for data, with the ability to focus on small sections and see the underlying numbers. We could use this technique today with PC applications, but spreadsheets are limited in their abilities to present information graphically. (I've used Microsoft Excel for years, and the charts and graphic capabilities have remained static for quite a while.)

So that's what I see. Mobile/cloud apps will accept small units of data, process them on fast servers, and present the results in graphical form. The shift from batch processing to interactive processing. The PC revolution will eventually occur and liberate us from the tyranny of batch processing. Ironically, it will occur not on PCs but on tablets and cloud servers.

Sunday, November 17, 2013

The end of complex PC apps

Businesses are facing a problem with technology: PCs (and tablets, and smart phones) are changing. Specifically, they are changing faster than businesses would like.

Corporations have many programs that they use internally. Some corporations build their own software, others buy software "off the shelf". Many companies use a combination of both.

All of the companies with whom I have worked wanted stable platforms on which to build their systems and processes. Whether it was a complex program built in C++, a comprehensive model built in a spreadsheet, or an office suite (word processor, spreadsheet, and e-mail), companies wanted to invest their effort in their custom solutions. They did not want to spend money or time on upgrades and changes to the operating system or commercially available applications.

While they dislike change, corporations are willing to upgrade systems. Corporations want long upgrade cycles. They want gentle upgrade paths, with easy transitions from one version to the next. They were happy with the old Microsoft world: Windows NT, Windows 2000, and Windows XP were excellent examples of the long, gentle upgrades desired by corporations.

That is no longer the world of PCs. The new world sees fast update cycles for operating systems, with major updates that require changes to applications. Companies with custom-made applications have to invest time and effort in updating their applications to match the new operating systems. (Consider Windows Vista and Windows 8.) Companies with off-the-shelf applications have to purchase new versions that run on the new operating systems.

What is a corporation to do?

My guess is that corporations will recognize the cost of frequent change in the PC and mobile platforms, seek out other platforms with lower cost, and move their apps to those platforms.

If they do, then PCs will lose their title to the development world. The PC platform will not be the primary target for applications.

What are the new platforms? I suspect the two "winning" platforms will be web apps (browsers and servers), and mobile/cloud (tablets and phones with virtualized servers). While the front ends for these systems undergo frequent changes, the back ends are relatively stable. The browsers for web apps are mostly stable and they buffer the app from changes to the operating system. Tablets and smart phones undergo frequent updates; this cost can be minimized with simple apps that can be updated easily.

The big trend is away from complex PC applications. These are too expensive to maintain in the new world of frequent updates to operating systems.

Monday, September 9, 2013

Microsoft is not DEC

Some have compared Microsoft to DEC, the long-ago champion of minicomputers.

The commonalities seem to be:

  • DEC and Microsoft were both large
  • DEC and Microsoft had strong cultures
  • DEC missed the PC market; Microsoft is missing the mobile market
  • DEC and Microsoft changed their CEOs

Yet there are differences:

DEC was a major player; Microsoft set the standard DEC had a successful business in minicomputers but was not a standard-setter (except perhaps for terminals). There were significant competitors in the minicomputer market, including Data General, HP, and even IBM. Microsoft, on the other hand, has set the standard for desktop computing for the past two decades. It has an established customer base that remains loyal to and locked into the Windows ecosystem.

DEC moved slowly; Microsoft is moving quickly DEC made cautious steps towards microcomputers, introducing the PRO-325 and PRO-350 computers, which were small versions of PDP-11 processors running a variant of RT-11, a proprietary and (more importantly) non-PC-DOS operating system. DEC also offered the Rainbow, which ran MS-DOS but did not offer the "100 percent PC compatibility" required for most software. Neither the PRO nor the Rainbow computers saw much popularity. Microsoft, in contrast, is offering cloud services with Azure and seeing market acceptance. Microsoft's Surface tablets and Windows Phones (considered quite good by those who use them, and quite bad by those who don't) do parallel DEC's offerings in their popularity, and this will be a problem for Microsoft if they choose to keep offering hardware.

The IBM PC set a new standard; mobile/cloud has no standard The IBM PC defined a new standard for microcomputers (the new market). Overnight, businesses settled on the PC as the unit of computing, with PC-DOS as the operating system and Lotus 1-2-3 as the spreadsheet. The mobile/cloud environment has no comparable standard hardware or software. Apple and Android are competing for hardware (Apple has higher revenue while Android has higher unit sales) and Amazon.com is dominant in the cloud services space but not a standards-setter. (The industry is not cloning the AWS interface.)

PCs replaced minicomputers; mobile/cloud complements PCs Minicomputers were expensive and PCs (except for the very early microcomputers) were able to perform the same functions as minicomputers. PCs could perform word processing, numerical analysis with spreadsheets (a bonus, actually), data storage and reporting, and development in common languages such as BASIC, FORTRAN, Pascal, C, and even COBOL. Tablets do not replace PCs; data entry, numeric analysis, and software development remain on the PC platform. The mobile/cloud technology expands the set of solutions, offering new possibilities.

Comparing Microsoft to DEC is a nice thought experiment, but the situations are different. Was DEC under stress, and is Microsoft under stress? Undoubtedly. Can Microsoft learn from DEC's demise? Possibly. But Microsoft's situation is not identical to DEC's, and the lessons from the former must be read with care.

Thursday, August 22, 2013

Migrating from desktop to cloud is no small deal

The popularity of smart phones and tablets has made mobile (and its server-side colleague cloud) the new darling technology. They are the new buzzwords that garner attention -- and probably funding. They get attention for new apps, and some folks have been migrating existing desktop and web apps.

If you are embarking on such a project, I encourage you to look into three technologies: mobile devices, cloud computing, and functional programming.

Functional programming is a specific way of designing programs, much like structured programming and object-oriented programming. But in contrast to those techniques, functional programming guides one to smaller units of code. Structured programming and object-oriented programming allow one to build small units, but do not encourage it. As a result, most (non-homework) code built with structured and object-oriented techniques contains collections of large modules and objects.

Cloud computing is a specific way of designing systems out of smaller components, many of which are part of the cloud infrastructure. Cloud-based applications use data stores, message queues, and a measured amount of code. (Previous architectures such as desktop and web often saw applications made of whole cloth with little use of pre-made components.)

Mobile devices force a split of code between the server (typically a cloud-based server) and the user interface on the device. The device usually performs some processing with most occurring on the server. (A few apps perform all processing on the device, such as in the game "Angry Birds". But they are exceptions.) Splitting the code between the device and the server forces you to build not a single application but two programs that communicate.
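
The split can be pictured with a toy sketch in Python. The function names and the JSON contract are hypothetical, and the network hop is simulated with a direct call; the point is only that the system is two programs joined by a message format:

```python
import json

# Server side (cloud): performs most of the processing.
def handle_request(raw_body: str) -> str:
    data = json.loads(raw_body)
    result = {"total": sum(data["values"])}   # the "heavy" work lives here
    return json.dumps(result)

# Device side (mobile UI): gathers input, displays the server's answer.
def client_submit(values):
    request = json.dumps({"values": values})
    response = handle_request(request)        # in reality, an HTTPS call
    return json.loads(response)["total"]

print(client_submit([3, 4, 5]))   # the UI shows 12; the math ran "server-side"
```

Once the code is split this way, the two halves can be versioned, deployed, and scaled independently -- which is precisely what makes the port from a monolithic desktop application non-trivial.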

The new technology set encourages us to build systems of smaller programs. Mobile/cloud is for collaborating programs, not monolithic ones. Functional programming techniques reward small programs and penalize large ones.

This "impedance mismatch" between mobile/cloud and desktop applications (and to a somewhat lesser extent, web applications) means that porting to mobile/cloud is difficult. The legacy systems on the earlier platforms were built to take advantage of the strengths of the platform, and small, collaborating programs were not efficient.

The mismatch between object-oriented systems and functional programming is even greater. Object-oriented programming is often about objects holding state -- mutable state, so that objects change over time -- and functional programming is about immutable objects.
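
A tiny Python illustration of the contrast (the account example is mine, not from any particular system):

```python
from typing import NamedTuple

# Object-oriented habit: an object whose state mutates over time.
class MutableAccount:
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount        # state changes in place

# Functional habit: values are immutable; a "change" yields a new value.
class Account(NamedTuple):
    balance: int

def deposit(account: Account, amount: int) -> Account:
    return Account(account.balance + amount)   # the original is untouched

m = MutableAccount(100)
m.deposit(50)
assert m.balance == 150        # the old state is gone

before = Account(100)
after = deposit(before, 50)
assert before.balance == 100   # the old value still exists, unchanged
assert after.balance == 150
```

The immutable style looks like extra ceremony on a desktop PC, but it is exactly what distributed mobile/cloud systems want: values that can be copied between servers and cached on devices without anyone worrying that they will change underneath.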

A few well-designed (from the mobile/cloud perspective) applications for the desktop and web can be migrated to mobile/cloud. Most, I think, will be difficult. Some may be close to impossible.

My advice: Build some new apps for mobile/cloud, to gain experience with the new design paradigm. Try out the functional programming languages (or at least, try your favorite object-oriented language with immutable objects). Once you have that experience, evaluate your legacy apps and select a small number (perhaps two) of "easy" apps for migration. I'm sure that the exercise will give you greater insight into the effort for the other (more difficult) legacy apps.

Only then, armed with experience of the new technologies and a good understanding of your code base, should you migrate from desktop to mobile/cloud.

Friday, May 31, 2013

The rise of the simple UI

User interfaces are about to become simpler.

This change is driven by the rise of mobile devices. The UI for mobile apps must be simpler. A cell phone has a small screen and (when needed) a virtual keyboard. The user interacts through the touchscreen, not a keyboard and mouse. Tablets, while larger and often accompanied by a real (small-form) keyboard, also interact through the touchscreen.

For years, PC applications have accumulated features and complexity. Consider the Microsoft Word and Microsoft Excel applications. Each version has introduced new features. The 2007 versions introduced the "ribbon menu", an adjustment to the UI to accommodate the growing number of features.

Mobile devices force us to simplify the user interface. Indirectly, they force us to simplify applications. In the desktop world, the application with the most features was (generally) considered the best. In the mobile world, that calculation changes. Instead of selecting an application on the raw number of features, we are selecting applications on simplicity and ease of use.

It is a trend that is ironic, as the early versions of Microsoft Windows were advertised as easy to use (a common adjective was "intuitive"). Yet while "intuitive" and "easy", Windows was never designed to be simple; configuration and administration were always complex. That complexity remained even with networks and Active Directory -- the complexity was centralized but not eliminated.

Apps on mobile don't have to be simple, but simple apps are the better sellers. Simple apps fit better on the small screens. Simple apps fit better into the mobile/cloud processing model. Even games demonstrate this trend (compare "Angry Birds" against the PC games like "Doom" or even "Minesweeper").

The move to simple apps on mobile devices will flow back to web applications and PC applications. The trend of adding features will reverse. This will affect the development of applications and the use of technology in offices. Job requisitions will list user interface (UI) and user experience (UX) skills. Office workflows will become more granular. Large, enterprise systems (like ERP) will mutate into collections of apps and collections of services. This will allow mobile apps, web apps, and PC apps to access the corporate data and perform work.

Sellers of PC applications will have to simplify their current offerings. It is a change that will affect the user interface and the internal organization of their application. Such a change is non-trivial and requires some hard decisions. Some features may be dropped, others may be deferred to a future version. Every feature must be considered and placed in either the mobile client or the cloud back-end, and such decisions must account for many aspects of mobile/cloud design (network accessibility, storage, availability of data on multiple devices, among others).

Thursday, May 2, 2013

Our fickleness on the important aspects of programs

Over time, we have changed the attributes we desire in programs. If we divide the IT age into four eras, we can see this change. Let's consider the four eras to be mainframe, PC, web, and mobile/cloud. These four eras used different technology and different languages, and praised different accomplishments.

In the mainframe era, we focussed on raw efficiency. We measured CPU usage, memory usage, and disk usage. We strove to have enough CPU, memory, and disk, with some to spare but not too much. Hardware was expensive, and too much spare capacity meant that you were paying for more than you needed.

In the PC era we focussed not on efficiency but on user-friendliness. We built applications with help screens and menus. We didn't care too much about efficiency -- many people left PCs powered on overnight, with no "jobs" running.

With web applications, we focussed on globalization, with efficiency as a sub-goal. The big effort was in the delivery of an application to a large quantity of users. This meant translation into multiple languages, the "internationalization" of an application, support for multiple browsers, and support for multiple time zones. But we didn't want to overload our servers, either, so early Perl CGI applications were quickly converted to C or other languages for performance.

With applications for mobile/cloud, we desire two aspects: For mobile apps (that is, the 'UI' portion), we want something easier than "user-friendly". The operation of an app must not merely be simple, it must be obvious. For cloud apps (that is, the server portion), we want scalability. An app must not be monolithic, but assembled from collaborative components.

The objectives for systems vary from era to era. Performance was a highly measured aspect in the mainframe era, and almost ignored in the PC era.

The shift from one era to another may be difficult for practitioners. Programmers in one era may be trained to "optimize" their code for the dominant aspect. (In the mainframe era, they would optimize for performance.) A succeeding era demands other aspects in its systems, and programmers may not be aware of the change. Thus, a highly-praised mainframe programmer with excellent skills at algorithm design, when transferred to a PC project, may find that his skills are not desired or recognized. His code may receive a poor review, since the expectation for PC systems is "user friendly", and his skills from mainframe programming do not provide that aspect.

Similarly, a skilled PC programmer may have difficulties when moving to web or mobile/cloud systems. The expectations for user interface, architecture, and efficiency are quite different.

Practitioners who start with a later era (for example, the 'young turks' starting with mobile/cloud) may find it difficult to comprehend the reasoning of programmers from an earlier era. Why do mainframe programmers care about the order of mathematical operations? Why do PC programmers care so much about in-memory data structures, to the point of writing their own?

The answers are that, at the time, these were important aspects of programs. They were pounded into the programmers of earlier eras, to the degree that those programmers apply these optimizations without thinking about them.
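
The concern over the order of mathematical operations was not superstition. On fixed-precision hardware, then and now, the grouping of arithmetic changes where rounding error lands. A small illustration (in modern Python, but the underlying floating-point behavior is the same kind of thing earlier-era programmers worked around):

```python
# Floating-point addition is not associative: grouping changes the result.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c     # 0.6000000000000001
right = a + (b + c)    # 0.6

assert left != right   # same inputs, different order, different answer
```

A programmer trained in an era where this mattered orders operations by reflex; a programmer who never saw it wonders why anyone would care.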

Experienced programmers must look at the new system designs and the context of those designs. Mobile/cloud needs scalability, and therefore needs collaborative components. The monolithic designs that optimized memory usage are unsuitable to the new environment. Experienced programmers must recognize their learned biases and discard those that are not useful in the new era. (Perhaps we can consider this a problem of cache invalidation.)

Younger programmers would benefit from a deeper understanding of the earlier eras. Art students study the conditions (and politics) of the old masters. Architects study the buildings of the Greeks, Romans, and medieval kingdoms. Programmers familiar with the latest era, and only the latest era, will have a difficult time communicating with programmers of earlier eras.

Each era has objectives and constraints. Learn about those objectives and constraints, and you will find a deeper appreciation of programs and a greater ability to communicate with other programmers.

Sunday, April 21, 2013

The post-PC era is about coolness or lack thereof

Some have pointed to the popularity of tablets as the indicator for the imminent demise of the PC. I look at the "post PC" era not as the death of the PC, but as something worse: PCs have become boring.

Looking back, we can see that PCs started with lots of excitement and enthusiasm, yet that excitement has diminished over time.

First, consider hardware:

  • Microcomputers were cool (even with just a front panel and a tape drive for storage)
  • ASCII terminals were clearly better than front panels
  • Storing data on floppy disks was clearly better than storing data on tape
  • Hard drives were better than floppy disks
  • Color monitors were better than monochrome displays
  • High resolution color monitors were better than low resolution color monitors
  • Flat panel monitors were better than CRT monitors

These were exciting improvements. These changes were *cool*. But the coolness factor has evaporated. Consider these new technologies:

  • LED monitors are better than LCD monitors, if you're tracking power consumption
  • Solid state drives are better than hard drives, if you look hard enough
  • Processors after the original Pentium are nice, but not excitingly nice

Consider operating systems:

  • CP/M was exciting (as anyone who ran it could tell you)
  • MS-DOS was clearly better than CP/M
  • Windows 3.1 on DOS was clearly better than plain MS-DOS
  • Windows 95 was clearly better than Windows 3.1
  • Windows NT (or 2000) was clearly better than Windows 95 (or 98, or ME)

But the coolness factor declined with Windows XP and its successors:

  • Windows XP was *nice* but not *cool*
  • Windows Vista was not clearly better than Windows XP -- and many have argued that it was worse
  • Windows 7 was better than Windows Vista, in that it fixed problems
  • Windows 8 is (for most people) not cool

The loss of coolness is not limited to Microsoft. A similar effect happened with Apple's operating systems.

  • DOS (Apple's DOS for Apple ][ computers) was cool
  • MacOS was clearly better than DOS
  • MacOS 9 was clearly better than MacOS 8
  • Mac OSX was clearly better than MacOS 9

But the Mac OSX versions have not been clearly better than their predecessors. They have some nice features, but the improvements are small, and a significant number of people might say that the latest OSX is not better than the prior version.

The problem for PCs (including Apple Macintosh PCs) is the loss of coolness. Tablets are cool; PCs are boring. The "arc of coolness" for PCs saw its greatest rise in the 1980s and 1990s, a moderate rise in the 2000s, and now sees decline.

This is the meaning of the "post PC era". It's not that we give up PCs. It's that PCs become dull and routine. PC applications become dull and routine.

It also means that there will be few new things developed for PCs. In a sense, this happened long ago, with the development of the web. Then, the Cool New Things were developed to run on servers and in browsers. Now, the Cool New Things will be developed for the mobile/cloud platform.

So don't expect PCs and existing PC applications to vanish. They will remain; it is too expensive to re-build them on the mobile/cloud platform.

But don't expect new PC applications.

Welcome to the post-PC era.

Sunday, April 7, 2013

Mobile/cloud apps will be different than PC apps

As a participant in the PC revolution, I was comfortable with the bright future of personal computers. I *knew* -- that is, I strongly believed -- that PCs were superior to mainframes.

It turned out that PCs were *different* from mainframes, but not necessarily superior.

Mainframe programs were, primarily, accounting systems. Oh, there were programs to compute ballistics tables, and programs for engineering and astronomy, and system utilities, but the big use of mainframe computers was accounting (general ledger, inventory, billing, payment processing, payables, receivables, and market forecasts). These uses were shaped by the entities that could afford mainframe computers (large corporations and governments) and the data that was most important to those organizations.

But the data was also shaped by technology. Computers read input on punch cards and stored data on magnetic tape. The batch processing systems were useful for certain types of processing and made efficient use of transactions and master files. Even when terminals were invented, the processing remained in batch mode.

Personal computers were more interactive than mainframes. They started with terminals and interactive applications. From the beginning, personal computers were used for tasks very different than the tasks of mainframe computers. The biggest applications for PCs were word processors and spreadsheets. (They still are today.)

Some "traditional" computer applications were ported to personal computers. There were (and still are) systems for accounting and database management. There were utility programs and programming languages: BASIC, FORTRAN, COBOL, and later C and Pascal. But the biggest applications were the interactive ones, the ones that broke from the batch processing mold of mainframe computing.

(I am simplifying greatly here. There were interactive programs for mainframes. The BASIC language was designed as an interactive environment for programming, on mainframe computers.)

I cannot help but think that the typical mainframe programmer, looking at the new personal computers that appeared in the late 1970s, could only puzzle at what possible advantage they could offer. Personal computers were smaller, slower, and less capable than mainframes in every degree. Processors were slower and less capable. Memory was smaller. Storage was laughably primitive. PC software was also primitive, with nothing approaching the sophistication of mainframe operating systems, database management systems, or utilities.

The only ways in which personal computers were superior to mainframes were the BASIC language (Microsoft BASIC was more powerful than mainframe BASIC), word processors, and spreadsheets. Notice that these are all interactive programs. The cost and size of a personal computer made it possible for a person to own one, but the interactive nature of applications made it sensible for a person to own one.

That single attribute of interactive applications made the PC revolution possible. The success of modern-day PCs and the Microsoft empire was built on interactive applications.

I suspect that the success of cell phones and tablets will be built on a single attribute. But what that attribute is, I do not know. It may be portability. It may be location-aware capabilities. It may be a different level of interactivity.

I *know* -- that is, I feel very strongly -- that mobile/cloud is going to have a brilliant future.

I also feel that the key applications for mobile/cloud will be different from traditional PC applications, just as PC applications are different from mainframe applications. Any attempt to port PC applications to mobile/cloud will be doomed to failure, just as mainframe applications failed to port to PCs.

Mainframe applications live on, in their batch mode glory, to this day. Large companies and governments need accounting systems, and will continue to need them. PC applications will live through the mobile/cloud revolution, although some may fade; PowerPoint-style presentations may be better served on synchronized mobile devices than with a single PC and a projector.

Expect mobile/cloud apps to surprise us. They will not be word processors and spreadsheets. (Nor will they be accounting systems.) They will be more like Twitter and Facebook, with status updates and connections to our network of people.

Wednesday, March 20, 2013

Moving to mobile/cloud requires project management

A few years ago (perhaps less than ten), the term "SOA" appeared in the tech scene. "Service Oriented Architecture" was discussed, examined, and eventually dropped from the world of tech. (Or at least the Serious People's world of tech.) People derided it with clever sayings like "SOA is DOA".

SOA defined software as a set of services, and a system as a collection of connected services. It was a reasonable approach to system design, but very different from the designs for desktop and web applications. For desktop PC applications it was overkill and for web applications it was nice but not necessary.

The change to SOA design entailed a large effort, one that was not easily justified. The rejection of SOA is quite understandable: there is no payback, no return on investment.

Until we get to mobile/cloud systems.

Mobile/cloud apps, in contrast to desktop and web apps, require SOA. The small UI app on a cell phone or tablet presents the data it receives from services running on servers. There is no other way to build mobile apps. From Twitter to Facebook, from Yahoo Mail to Yelp, mobile apps are designed with SOA.
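
The SOA shape can be shown in miniature. The service below has one narrow responsibility and a defined JSON contract, and the "app" merely presents what the service returns; all names (and the canned data) are invented for illustration, and the in-process call stands in for a network request:

```python
import json

# A "service" in the SOA sense: one responsibility, a defined contract.
def timeline_service(request_json: str) -> str:
    user = json.loads(request_json)["user"]
    posts = {"alice": ["hello", "lunch pics"]}.get(user, [])
    return json.dumps({"user": user, "posts": posts})

# The mobile app is only a thin presenter of what the service provides.
def render_timeline(user: str) -> str:
    reply = json.loads(timeline_service(json.dumps({"user": user})))
    return "\n".join(f"* {p}" for p in reply["posts"])

print(render_timeline("alice"))
```

The contract (the JSON in and out) is the unit of design -- and also the unit of organizational negotiation, which is why SOA conversions raise the management questions discussed below.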

Building new apps for mobile/cloud with SOA designs was easy to justify. A new app has no legacy code, no existing design, no political turf that people or groups try to protect.

Converting existing applications is much harder. The change from non-SOA to SOA design is significant, and requires a lot of thought. Moreover, it can often require a reorganization of responsibilities, changing the political landscape. Which groups are providing which services? Who is negotiating and measuring the service level agreements (SLAs)? Who is coordinating the consolidation of redundant services?

These reorganizations, if not anticipated, can increase the management effort for the simplest of conversion projects. (Even anticipated reorganizations can increase the management effort.)

Changing the organization is not a technology problem. Moving from non-SOA (desktop PC apps) to SOA (mobile/cloud apps) is in part a technology problem and in part a management problem. You need to address both sides for a successful project.

Wednesday, February 6, 2013

New dimensions in system design


The technology environment is changing. The change is significant, affecting the basic dimensions with which we measure systems.

The center of the old technology world was Windows. Manufacturers built PCs to run Windows. Suppliers built software to run on Windows. Systems were local -- that is, they ran in a single location. Before the internet era, everything lived in your data center. Even with the internet, systems ran in a single data center. (Or possibly two, for redundancy.)

With everything in one data center, systems were easy to reason about: one location, one platform, one set of tools.

The age of "Windows as the center of the universe" has passed. Today, we have multiple environments. The simple universe of one technology has been replaced by a universe of multiple galaxies.

Today, systems are spread across multiple hardware installations. A system consists of one (or several) front end "user clients" -- possibly for the iPhone, Android phones, and web browsers -- and a back end of not one but many cooperating services, each handling a small portion of the work. These can include web servers, database servers, queue managers, and content distribution networks.

But the change is larger than that. The set of basic elements is changing. In the old world, systems were built from a database, a data-access layer, and programs written in a single language. Those "dimensions" of the application defined its "space" in the data center. Vendors were judged by how well they met the needs of the organization along those dimensions.

In the new world, systems are more complex. Instead of a single database, a system may use several. Instead of a single web server, an application may use several, depending on the transaction being processed. We use virtualized processors to host servers, and cloud computing to manage the virtualized servers. We no longer think of a system as "a Java application"; we think of it as "a Java, Javascript, SQL, NoSQL, message queues, and some HTML and CSS" system.
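A toy sketch of that polyglot reality, using only stand-ins from Python's standard library (sqlite3 for the SQL store, queue for the message queue; the table and message names are invented): one system, several distinct building blocks.

```python
import sqlite3
import queue

# Building block 1: a SQL store (stand-in for the system's database).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, item TEXT)")
db.execute("INSERT INTO orders VALUES (1, 'widget')")

# Building block 2: a message queue, decoupling the producer of work
# from the consumer of work.
work = queue.Queue()
for (order_id, item) in db.execute("SELECT id, item FROM orders"):
    work.put((order_id, item))

# Building block 3: a worker "service" that drains the queue.
while not work.empty():
    order_id, item = work.get()
    print("processing order %d: %s" % (order_id, item))
```

Each block here could be swapped for a heavier-weight component (a database server, a hosted queue, a fleet of workers) without changing the shape of the system, which is the sense in which these blocks are the new dimensions.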

In such a complex world, we design our systems with a new collection of "building blocks", and we look for vendors to provide those building blocks. Those elements of system design are: hardware, software, content, and services. These are the dimensions for system architecture and vendor offerings.
Let's look at the big providers:

Microsoft clearly supplies software: Windows, Office, Visual Studio, and many other packages. They are also in the hardware game: Microsoft has offered hardware for years, in the form of the Microsoft Mouse and the Microsoft Keyboard, and today with the Xbox, the Kinect, and the Surface. They also offer cloud services and content (music, books, and movies).

Google offers some hardware (the Nexus phones and tablets) but also works with manufacturers. They offer the Chromebook, which runs their Chrome OS. They offer cloud services. They have also entered the music and video markets, with Google Play and YouTube.

Amazon.com offers hardware, but only in the form of the Kindle. They do not offer PCs or phones. They have a rich offering of services with their cloud systems.

Apple offers hardware, software, content, and little in the way of cloud services. Their MacBooks and iPads (and the operating systems) are respected by all. They provide content in the form of music, movies, books, and newspaper subscriptions. Yet their focus is on consumers, not the enterprise. This is clear in their iCloud offerings, which are designed for individuals.

In this light of hardware, software, services, and content, the other "big names" of the tech world are lacking:

Barnes and Noble offers hardware (Nook tablets and e-readers) and the software to run them, but nothing for development or the office. They offer content but not services.

IBM offers hardware and software but not content. They don't sell books or music. They offer cloud services, in addition to their old-school consulting services.

Facebook offers services and content, but not hardware and software. (This is why the rumor of a Facebook phone keeps returning.) Facebook uses cloud computing for their servers but doesn't offer it to customers. They offer other services, such as a platform for games and an authentication service for web site log-ins.

Yahoo offers services and some (limited) content but not hardware or software.

I'm not claiming that a vendor needs all four of these components to be profitable. IBM and Yahoo are doing rather well. (Barnes and Noble not so much, but that is a problem specific to their market.) But the presence (or absence) of these four components is important, and determines how a vendor can assist a client.

The new dimensions of hardware, software, services, and content affect hiring organizations too. Systems will need all of these components (to one degree or another) to satisfy a customer's needs. When looking for solutions from vendors, and when looking to hire, companies will have to weigh the strengths of the vendor (or the candidate) in all of these areas.
The smart, big vendors are offering all of these. The smart, small vendors will offer a carefully selected subset and ensure that they can provide quality. The same goes for candidates. The heavyweights will have expertise in all areas, the middleweights will have light expertise in all and strength in a few, and the lightweights will be capable in one or two.

The task for vendors (and candidates) is to build their skills along these dimensions, and present them to clients.

Wednesday, January 9, 2013

Tablets will not replace PCs -- at least not as you think they might

Tablets have seen quite a bit of popularity, and some people now ask: When will tablets replace PCs?

To which I answer: They won't, at least not in the way most folks think of replacing PCs.

The issue of replacement is subtle.

To examine the subtleties, let's review a previous revolution that saw one form of technology replaced by another.

Did PCs replace mainframes? In one sense, they have, yet in another they have not. PCs are the most common form of computing, used in businesses, schools, and homes. Yet mainframes remain, processing millions of transactions each day.

When microcomputers were first introduced, one could argue that PCs had replaced mainframes as the dominant form of computing. Even before the IBM PC, in the days of the Apple II and the Radio Shack TRS-80, there were more microcomputers than mainframes. (I don't have numbers, but I am confident that several tens of thousands of microcomputers were sold. I'm also confident that there were fewer than several tens of thousands of mainframes at the time.)

But sheer numbers are not an accurate measure. For one thing, a mainframe serves multiple users and a PC serves one. So maybe PCs replaced mainframes when the number of PC users exceeded the number of mainframe users.

The measure of users is not particularly clear, either. Many users of microcomputers were hobbyists, not serious day-in-day-out users. And certainly mainframes processed more business transactions than the pre-internet PCs.

Maybe we should measure not the number of devices or the number of users, but instead the conversations about the devices. When did people talk about PCs more than mainframes? We cannot count tweets or blog posts (neither had been invented) but we can look at magazines and publications. Prior to the IBM PC, most technology magazines discussed mainframes (or minicomputers) and ignored the microcomputer systems. There were some microcomputer-specific magazines, the most popular one being BYTE.

The introduction of the IBM PC generated a number of new magazines. At that point, magazine content shifted towards PCs.

There are other measures: number of want ads, number of conferences and expos, number of attendees at conferences and expos. We could argue for quite some time about the appropriate metric.

But I think that the notion of one generation of computing technology replacing another is a slippery one, and perhaps a false one. I'm not sure that one technology really replaces an earlier one.

Consider: We have advanced from the mainframe era to the PC era, and then to the networked PC era, and then to the internet era, and now we are venturing into the mobile/cloud era. Each successive "wave" of technology has not replaced the previous "wave", but instead expanded the IT sphere. PCs did not replace mainframes; they replaced typewriters and dedicated word-processing systems and expanded the computational realm with spreadsheets. The web did not replace PCs; it expanded the computational realm with on-line transactions -- many of which are handled with back-end systems running on... mainframes!

A new technology expands our horizons.

Tablets will expand our computing realm. They will enable us to do new things. Some tasks of today that are handled by PCs will be handled by tablets, but not all of them.

So I don't see tablets replacing PCs in the coming year. Or the next year. Or the year after.

But if you insist on a metric, some measurement to clearly define the shift from PC to tablet, I suggest that we look at not numbers or users or apps but conversations: the number of people who talk about tablets, compared to the number of people who talk about PCs. I would include blogs, tweets, web articles, advertisements, podcasts, and books as part of the conversation. You may want to include a few more.

Whatever you pick, look at them, take some measurements, and mark when the lines cross.

And then blog or tweet about it.