Wednesday, August 26, 2015

The coming split in computing

Personal computers have rarely been personal. The IBM PC (the original Personal Computer) may have been designed for individuals, but it was used by corporations. Earlier computers (the Apple II, the Radio Shack TRS-80) were used by a few hardy hobbyists -- and a few hardy businesses.

The two markets for PCs -- the home market and the corporate market -- have used the same hardware and software since the IBM PC.  Adoption rates increased dramatically with the introduction of Microsoft Windows, and there were some differences between Windows 95/98 and Windows NT, but both home and business used the same "stuff". Microsoft's dominance in the market helped create a set of uniform products. After 2000, everyone used Windows 2000 (later Windows XP) and Microsoft Office (Word, Excel, and Outlook).

The two markets are driven by different forces, however, and those differences will drive a change in the markets. The uniform environment of one operating system and one set of software will split.

Consumers are driven primarily by cost. Not just in PC markets, but in all markets. (Consider air travel. Consumers have consistently selected smaller seats and fewer comforts because of the lower price.)

Large organizations (corporations and governments) are sensitive to cost, but they also avoid risk. Risk avoidance is often handled by minimizing change. Large organizations often have governance processes that standardize hardware and software and keep changes to a minimum. Small updates (such as security patches) are deployed in well-specified "off hours" time periods. Large updates (like a new operating system) are delayed until necessary, and implemented in a well-coordinated upgrade project.

The long life of Windows XP can be explained by both of these behaviors, but the reasoning in the two markets is different. For consumers, upgrading from Windows XP to Windows Vista (or Windows 7, or Windows 8) was a cost, and a cost with no apparent benefit. Yes, some individuals upgraded, but the majority kept the operating system that came with their PC.

Large organizations also stayed with Windows XP, but their reason was risk avoidance. Seeing no immediate benefit from a new version of the operating system (and recognizing the risk of programs or device drivers failing), those organizations chose to stay put.

This difference in behavior is recognized by vendors. Microsoft has introduced two "tracks" for updates: the consumer track and the business track. Consumers get updates immediately; large organizations can defer updates. In the Linux world, Red Hat and Ubuntu also offer two tracks: Red Hat with the (free) Fedora for consumers and the (paid) RHEL for organizations, and Ubuntu with "regular" releases and "long term support" (LTS) releases.

Notably, Apple does not have two tracks. They revise their hardware and software annually, if not more frequently. They are focused on the consumer, not the enterprise.

Also notably, Android has only one track -- for consumers.

With Apple and Android constantly revising hardware and software, the only player with a long-term mobile strategy may be Microsoft. We have insufficient experience with Microsoft hardware to decide.

Viewing the market as split between price-sensitive consumers and risk-averse large organizations, I expect that hardware and software will also split. Consumer software will remain free or low-priced, and move to the mobile/cloud environment -- and away from PCs. Enterprise software will stay on PCs (and servers) with little movement onto mobile devices.

Which is not to say that mobile devices won't be seen in large organizations. They will be, but they will be truly personal devices. Individuals will use them to check e-mail, confirm appointments, and surf the web during boring meetings. Enterprise work, however, will remain on PCs and on PC-like devices, including lightweight laptops such as Chromebooks and Cloudbooks.

Monday, August 24, 2015

The file format wars are over -- and text won

When I first started with computers, files were simple things. Most of them were source code, and a few were executables. The source code (BASIC, FORTRAN, and assembly) was stored in plain text files. The executables were binary, since they contained machine instructions.

That simple world changed with the PC revolution and the plethora of applications that it brought. WordStar used a format that was almost text: ASCII characters, with the end of each word marked by a regular character with its 8th bit set. Lotus 1-2-3 used a special file format for its worksheets. dBase II (and dBase III, and dBase IV) used a special format for its data.

There was a "Cambrian explosion" of binary formats. Each and every application had its own format. Binary-formatted data was smaller to store, easier to parse, and somewhat proprietary. The last was important for the commercial market; once customers had lots of data locked in a proprietary format, they were unwilling to change to a competitor's product.

The conversion from DOS to Windows changed little. Applications kept their proprietary, binary formats.

Yet recently (that is, with the rise of web services and mobile computing) binary formats have declined. The new favorites are text-based formats: XML, JSON, and YAML.

I have seen no new proprietary, binary formats lately. New formats have been text-based. Even Microsoft has changed its Office applications (Word, Excel, PowerPoint, and others) to use an XML-based set of files.

This is a big change. Why did it happen?

I can think of several reasons:

First is the existence of the formats. In the "age of binary formats", a binary format was how one stored data. Everyone did it.

Second is the abundance of storage. With limited storage space, a binary format is smaller and a better fit. With today's available storage that pressure does not exist.

Third is the availability of libraries to parse and construct the text formats. We can easily read and write XML (or JSON, or YAML) with commonly-available, tested, working libraries. A proprietary format requires a new (untested) library.
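To make that concrete: here is a minimal sketch in Python, using its standard json library (the record itself is invented for illustration). Reading and writing the format takes one call each.

```python
import json

# A record that, in the binary-format era, would have had a proprietary layout.
record = {"name": "Boston", "population": 667137, "tags": ["city", "port"]}

# Serializing and parsing each take one call -- a commonly-available,
# tested, working library, not a new (untested) one.
text = json.dumps(record)
parsed = json.loads(text)

assert parsed == record
```

XML has a similar standard-library module in most languages; YAML support is usually a common add-on library.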

Fourth is the pressure of legislation. Some countries (and some large companies) have mandated the use of open formats, to prevent the lock-in of proprietary data formats.

All of these are good reasons, yet I think there is another factor.

In the past, a file format served the application program. In the data processing world, our mindsets considered applications to "own" the data, with files being nothing more than a convenient holding space to be used when the application was not running (or when it was processing data from a different file). Programs did not share data -- or on the rare occasions when they did, it was through databases or plain text files.

Today, our mobile device apps share data with cloud-based systems. The cloud-based systems are collections of independent applications performing coordinated work. The nature of mobile/cloud is to share data from one application to another. This sharing between programs (sometimes written in different languages) is easier with standard formats and difficult with proprietary formats.
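A small sketch of that sharing, in Python with its standard XML library (the payload is invented for illustration). The consumer needs to know only the format -- the producer could have been written in any language.

```python
import xml.etree.ElementTree as ET

# A payload as one application might send it to another.
payload = "<order id='1001'><item sku='A-7' qty='3'/></order>"

# The consumer parses it with a standard library, knowing nothing about
# the program that produced it.
root = ET.fromstring(payload)
order_id = root.get("id")
items = [(item.get("sku"), int(item.get("qty"))) for item in root]
```

With a proprietary binary format, the consumer would instead need a parser written (and debugged) for that one format.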

New systems will be developed with open (text) formats for storage and exchange. That means that our existing systems, the dinosaurs of the processing world with their proprietary formats, will fall out of favor.

I don't expect them to vanish completely. They work, which is an important virtue. Replacing them with a new system (or simply modifying them to use text formats) would be expensive with little apparent return on investment. Yet continuing to use them implies that some amount of data (a significant amount) will be locked within proprietary non-text formats.

Expect calls for people with skills in these file formats.

* * * * *

The recent Supreme Court decision about Java's API (in which the court declined to hear an appeal) means that, for now, APIs and file formats can be considered intellectual property. It may be difficult to reverse-engineer the formats of old systems without the express permission of the vendor. (And if the vendor is out of business, or has been sold to a larger company, it may be very difficult to obtain such permission.)

Companies may want to evaluate the risk of their data formats.

Wednesday, August 12, 2015

Ten percent better is not enough

The mobile operating system market is ruled by Apple's iOS and Google's Android. Other contenders are striving for market share, and they face quite the challenge.

The contenders are Microsoft's Windows Mobile (or whatever they are calling it now), Mozilla's Firefox OS, Cyanogen's operating system, and Samsung and Intel's Tizen. (One could add Blackberry to the list.)

The challenge that all of these contenders face is one of value. They must deliver a product that is superior to the existing iOS and Android products. Delivering a product of lower value is meaningless, and matching value is a losing proposition due to the cost of switching.

It's not enough to be a little bit better than the established leaders. To win the hearts of users (a significant number of users), the product must be clearly better. Its benefits must be obvious, and large enough to offset the cost of switching from an existing product.

Apple did not have this problem. When they introduced the iPhone, there was no competition. The existing phones provided less functionality.

Google did have this problem, as it competed with the (then) existing iPhone. Google's advantage was an open platform -- or so it advertised. (Also, Google was not Apple -- or Microsoft -- which worked to its advantage.)

Today's contenders must provide something better than Apple's iOS and Google's Android. The advantage must be clear, and it must be greater than a small increase.

One would think that Microsoft has an advantage, in that they can leverage the existing corporate infrastructure of Windows hardware. Yet this is not the case, as phones and tablets remain personal devices -- corporations have not figured out how to use them. (Some organizations have started using tablets and phones, but their use tends to be limited to specific applications.)

Any mobile OS marketer, to gain ground against the current leaders, will have to provide something much better than the current products, as judged by individuals.

Friday, July 31, 2015

Locking mistakes on web pages

As a professional in the IT industry, I occasionally visit web sites. I do so to get information about new technologies and practices, new devices and software, and current events. There are a number of web sites that provide a magazine format of news stories. I visit more than IT web sites; I also read about social, political, and economic events.

I find many stories informative, some pertinent, and a few silly. And I find a number of them to contain errors. Not factual errors, but simple typographic errors. Simple errors, yet errors that should be obvious to anyone reading the story. For example, a misspelling of the name 'Boston' as 'Botson'. Or the word 'compromise' appearing as 'compromse'.

Spelling errors are bad enough. What makes it worse is that they remain. The error may be on the web site at 10:00 in the morning. It is still there at 4:00 in the afternoon.

A web page is run by a computer (a web server, to be precise). The computer waits for a request, and when it gets one, it builds an HTML page and sends back the response. The HTML page can be static (a simple file read from disk) or dynamic (a collection of files and content merged into a single HTML file). But static or dynamic, the source is the same: files on a computer. And files can be changed.
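The "dynamic" case can be sketched in a few lines of Python (the template and content here are invented for illustration): fragments are merged into one HTML page, and a correction is nothing more than a change to the content.

```python
# Static page: HTML read from a file on disk.
# Dynamic page: HTML assembled from content fragments at request time.
# Either way, the source is files on a computer -- and files can be changed.

def build_page(title, story):
    """Merge content fragments into a single HTML page, as a CMS would."""
    template = "<html><body><h1>{}</h1><p>{}</p></body></html>"
    return template.format(title, story)

# Fixing 'Botson' means editing the stored story and serving the page again.
page = build_page("Local News", "The mayor of Boston spoke today.")
```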

The whole point of the web was to allow for content to be shared, content that could be updated.

Yet here are these web sites with obviously incorrect content. And they (the people running the web sites) do nothing about it.

I have a few theories to explain this:

  • The people running the web site don't care
  • The errors are intentional
  • The people running the site don't have time
  • The people running the site don't know how

It's possible that the people running the web site do not care about these errors. They may have a cavalier attitude towards their readers. Perhaps they focus only on their advertisers. It is a short-sighted strategy and I tend to doubt that it would be in effect at so many web sites.

It's also possible that the errors are intentional. They may be made specifically to "tag" content, so that if another web site copies the content it can be identified as coming from the first site. Perhaps there is an automated system that makes these mistakes. I suspect that there are better ways to identify copied content.

More likely is that the people running the web site either don't have time to make corrections or don't know how to make corrections. (They are almost the same thing.)

I blame our Content Management Systems. These systems (CMSs) manage the raw content and assemble it into HTML form. (Remember that dynamic web pages must combine information from multiple sources. A CMS does that work, combining content into a structured document.)

I suspect (and it is only a suspicion, as I have not used a CMS myself) that the procedures to administer a CMS are complicated. I suspect that CMSs, like other automated systems, have grown in complexity over the years, and now require deep technical knowledge.

I also suspect that these web sites with frequent typographical errors are run with a minimal crew of moderately skilled people. The staff has enough knowledge (and time) to perform the "normal" tasks of publishing stories and updating advertisements. It does not have the knowledge (and the time) to do "extraordinary" tasks like update a story.

I suspect the "simple" process for a CMS would be to re-issue a fixed version of the story, but the "simple" process would add the fixed version as a new story and not replace the original. A web site might display the two versions of the story, possibly confusing readers. The more complex process of updating the original story and fixing it in the CMS is probably so labor-intensive and risk-prone that people judge it as "not worth the effort".

That's a pretty damning statement about the CMS: The system is too complicated to use to correct content.

It's also a bit of speculation on my part. I haven't worked with CMSs. Yet I have worked with automated systems, and observed them over time. The path of simple to complex is all too easy to follow.

Thursday, July 30, 2015

Tablets for consumption, cloudbooks for creation

Tablets and cloudbooks are mobile devices of the mobile/cloud computing world.

Tablets are small, flat, keyboardless devices with a touchscreen, processor, storage, and an internet connection. The Apple iPad is possibly the most well-known tablet. The Microsoft Surface is possibly the second most well-known. Other manufacturers offer tablets with Google's Android.

Cloudbooks are light, thin laptops. They contain a screen (possibly a touchscreen, but touch isn't a requirement), processor, storage, and internet connection, and the one thing that separates them from tablets: a keyboard. They look and feel like laptop computers, yet they are not laptops in the usual sense. They have a low-end processor and a custom operating system designed to do one thing: run a browser. The most well-known cloudbook computers are Google's Chromebooks.

I'm using the term "cloudbook" here to refer to the generic lightweight, low-powered, single-purpose laptop computer. A simple search shows that the phrase "cloudbook" (or a variation on capitalization) has been used for specific products, including an x86 laptop, a brand of e-books, a cloud services broker, and even an accounting system! Acer uses the name "cloudbook" for its, um, cloudbook devices.

Tablets and cloudbooks serve two different purposes. Tablets are designed for the consumption of data and cloudbooks are designed for the creation of data.

Tablets allow for the installation of apps, and there are apps for all sorts of things. Apps to play games. Apps to play music. Apps to chat with friends. Apps for e-mail (generally effective for reading e-mail and writing brief responses). Apps for Twitter. Apps for navigation.

Cloudbooks allow for the installation of apps too, although it is the browser that allows for apps and not the underlying operating system. On a Chromebook, it is Chrome that manages the apps. Google confuses the issue by listing web-based applications such as its Docs word processor and Sheets spreadsheet as "apps". The separation of web-based apps and browser-based apps is made more complex by Google's creation of duplicate apps for each environment to support off-line work. For off-line work, you must have a local (browser-based) app.

The apps for cloudbooks are oriented toward the composition of data: word processing, spreadsheets, photograph editing, and more.

I must point out that these differences are in orientation, not absolute capability. One can consume data on a cloudbook. One can, with the appropriate tools and effort, create on a tablet. The two types of devices are not exclusive. In my view, it is simply easier to consume on a tablet and easier to create on a cloudbook.

Tablets are already popular. I expect that cloudbooks will be popular with people who need to create and manage data. Two groups I expect to use cloudbooks are developers and system administrators. Cloudbooks are a convenient size for portability and capable enough to connect to cloud-based development services such as Cloud9, Codeanywhere, Cloud IDE, or Sourcekit.

Tuesday, July 28, 2015

Apple risks becoming the iPhone company

Apple has a diverse product line. For now.

Apple's products can be summarized as:

  • phones (iPhone and Apple Watch)
  • tablets (iPad and iPod)
  • laptops (MacBook and MacBook Pro)
  • desktops (iMac)
  • accessories
  • streaming services

I'm not quite sure where the Apple Watch fits in this list. I group it with the iPhone, since it is usable only with an iPhone.

Close examination of this grouping, and of Apple's financial results, shows that Apple makes the bulk of its profits from iPhones (and Apple Watch products). Sales of iPads have plateaued and may be declining. (The re-introduction of the iPod may be a result of that decline.) Laptops are selling well.

The other groups (desktops, accessories, and streaming services) count for little. (Accessories count for little because without the base product line, accessories are not necessary.)

Apple's desktop line is of little consequence. The products are vanity products, present because Apple can make them. The waste-basket Mac Pro looks nice and has lots of processing power, and while I know many people who would like one, I know precious few people who have actually bought one.

So the two big product groups are phones and laptops. And as I see it, laptops are at risk.

Apple's MacBooks and MacBook Pros are popular. Developers use them. Individuals who are not developers use them. Startup businesses use them. (Established businesses not so much.) Yet their use, at least among developers, makes little sense. More and more, I see them used (by developers and startup businesses) as access points to servers. MacBooks are not used as development tools but as smart terminals (*very* smart terminals) to the servers with development tools.

The problem for Apple is that competing tools can be had for less. Apple has always charged a premium for its products, but the competition now costs significantly less. A Windows-based laptop can be had for half the price of a MacBook, and a Chromebook for less than one quarter of the price. Windows and ChromeOS run browsers and ssh just as well as Mac OS does, and developers know it.

Developers are not the only ones shifting to the "PC as a terminal" model. Businesses are using virtual desktops and terminal sessions for many of their systems. They, too, can count.

When MacBooks lose their appeal, Apple will lose a chunk of business, but more importantly, its business will become less diverse. Should that happen, Apple may focus more on its profitable line (iPhones) and reduce development of other lines. Just as Apple let its desktop line wither into a vanity high-end product, it may do the same for laptops.

Wednesday, July 22, 2015

Locked out of the walled gardens

A dark side of the walled gardens of technology is becoming visible.

The old paradigm was one of home ownership. You purchased your equipment (a PC) and then you installed whatever software you wanted. You decided. There were constraints, of course. PCs typically ran Windows (or before that, DOS) and the software had to run under the operating system. PCs had various configurations and the software had to fit (a floppy-only PC could not run large software that required a hard drive, for example).

Once installed, you had to see to the care and feeding of the software. You had to ensure that updates were applied. You had to ensure that data you exchanged with other users was compatible with their systems.

The benefit of this paradigm is the open market and the freedoms that come with it. Vendors are free to enter the market and offer their wares. You are free to choose among those products. You could pick from a number of word processors, spreadsheets, databases, compilers and IDEs, project managers, and other product categories.

The walled gardens of iOS and Android (and soon Windows and MacOS X) provide a different paradigm. If the old paradigm was one of home ownership, the new paradigm is one of renting an apartment. You still have a place for your stuff, yet a lot of the tedious chores of ownership have been removed.

With Apple's walled garden of iOS (and the gatekeeper iTunes), updates are automatic, and software is guaranteed to be compatible. The same holds for Google's Android garden and its gatekeeper 'Play Store'. They guard against 'unfriendly' software.

But the price of living inside the walled garden is that one loses the open market. Only selected vendors may enter the market, and those that do may offer a limited selection of products. Apple and Google enforce requirements for products in their walled gardens, through their registration and gatekeepers. Apple forbids a number of products in iOS and limits others. Web browsers, for example, must use Apple's WebKit engine and not install their own; Apple also forbids programming languages or scripting languages.

We're now seeing the Flash technology being pushed out of the walled gardens. Apple has prohibited it from the beginning. Google has deprecated it on YouTube. How long before Microsoft kicks it out of its garden?

The expulsion of Flash may foreshadow other exclusions of technology. Apple could, at any time, remove Microsoft's apps from iTunes. Google could remove Adobe apps from the Play Store. Microsoft could kick out Oracle apps from the (soon to be revived, I think) Microsoft App Store.

The ability to remove apps from the garden is something that the enterprise folks will want to think about. Building a business on a walled garden has the risk of existing at the whim of the gardener.