Friday, July 31, 2015

Locking mistakes on web pages

As a professional in the IT industry, I occasionally visit web sites. I do so to get information about new technologies and practices, new devices and software, and current events. There are a number of web sites that provide a magazine format of news stories. I visit more than IT web sites; I also read about social, political, and economic events.

I find many stories informative, some pertinent, and a few silly. And I find a number of them contain errors. Not factual errors, but simple typographic errors, errors that should be obvious to anyone reading the story. For example, a misspelling of the name 'Boston' as 'Botson'. Or the word 'compromise' appearing as 'compromse'.

Spelling errors are bad enough. What makes it worse is that they remain. The error may be on the web site at 10:00 in the morning. It is still there at 4:00 in the afternoon.

A web page is served by a computer (a web server, to be precise). The computer waits for a request, and when it gets one, it builds an HTML page and sends back the response. The HTML page can be static (a simple file read from disk) or dynamic (a collection of files and content merged into a single HTML document). But static or dynamic, the source is the same: files on a computer. And files can be changed.
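
To make the static/dynamic distinction concrete, here is a minimal sketch in Python. It is my own toy example, not a description of any particular site's software; the template and file names are hypothetical.

    # Toy sketch: static pages are read from disk as-is; dynamic pages are
    # assembled from content fragments at request time.
    from string import Template

    def serve_static(path):
        # Static: return the HTML file exactly as stored on disk.
        with open(path, encoding="utf-8") as f:
            return f.read()

    def serve_dynamic(headline, body):
        # Dynamic: merge content into a single HTML document per request.
        page = Template("<html><body><h1>$headline</h1><p>$body</p></body></html>")
        return page.substitute(headline=headline, body=body)

    if __name__ == "__main__":
        print(serve_dynamic("News from Boston", "The story text goes here."))

Either way, the page comes from files (or stored fragments) on a server, and those can be edited.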

The whole point of the web was to allow for content to be shared, content that could be updated.

Yet here are these web sites with obviously incorrect content. And they (the people running the web sites) do nothing about it.

I have a few theories to explain this:

  • The people running the web site don't care
  • The errors are intentional
  • The people running the site don't have time
  • The people running the site don't know how

It's possible that the people running the web site do not care about these errors. They may have a cavalier attitude towards their readers. Perhaps they focus only on their advertisers. It is a short-sighted strategy and I tend to doubt that it would be in effect at so many web sites.

It's also possible that the errors are intentional. They may be made specifically to "tag" content, so that if another web site copies the content then it can be identified as coming from the first web site. Perhaps there is an automated system that introduces these mistakes. I suspect that there are better ways to identify copied content.

More likely is that the people running the web site either don't have time to make corrections or don't know how to make corrections. (They are almost the same thing.)

I blame our Content Management Systems. These systems (CMSs) manage the raw content and assemble it into HTML form. (Remember that dynamic web pages must combine information from multiple sources. A CMS does that work, combining content into a structured document.)
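
As a rough illustration of what a CMS does with content, here is a toy sketch in Python. The field names, template, and "store" are my own hypothetical stand-ins, not the design of any actual CMS; the point is that a typo fix is, at bottom, a small update to a stored record followed by a re-render.

    from string import Template

    # Hypothetical stand-in for a CMS content store: stories as structured records.
    ARTICLE = Template("<article><h1>$title</h1><p>$byline</p><p>$body</p></article>")

    store = {
        101: {"title": "News from Botson", "byline": "Staff", "body": "A compromse was reached."},
    }

    def render(story_id):
        # Assemble the structured content into an HTML fragment.
        return ARTICLE.substitute(store[story_id])

    def correct(story_id, field, old, new):
        # A correction: update one field in the record, then re-render.
        store[story_id][field] = store[story_id][field].replace(old, new)
        return render(story_id)

    print(correct(101, "title", "Botson", "Boston"))
    print(correct(101, "body", "compromse", "compromise"))

Real CMSs are far more elaborate than this, which is rather the point: the correction itself is tiny, and any friction comes from the system and the process around it.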

I suspect (and it is only a suspicion, as I have not used any of these CMSs) that the procedures to administer a CMS are complicated. I suspect that CMSs, like other automated systems, have grown in complexity over the years, and now require deep technical knowledge.

I also suspect that these web sites with frequent typographical errors are run with a minimal crew of moderately skilled people. The staff has enough knowledge (and time) to perform the "normal" tasks of publishing stories and updating advertisements. It does not have the knowledge (and the time) to do "extraordinary" tasks like update a story.

I suspect the "simple" process for a CMS would be to re-issue a fixed version of the story, but the "simple" process would add the fixed version as a new story and not replace the original. A web site might display the two versions of the story, possibly confusing readers. The more complex process of updating the original story and fixing it in the CMS is probably so labor-intensive and risk-prone that people judge it as "not worth the effort".

That's a pretty damning statement about the CMS: The system is too complicated to use for correcting content.

It's also a bit of speculation on my part. I haven't worked with CMSs. Yet I have worked with automated systems, and observed them over time. The path from simple to complex is all too easy to follow.

Thursday, July 30, 2015

Tablets for consumption, cloudbooks for creation

Tablets and cloudbooks are mobile devices of the mobile/cloud computing world.

Tablets are small, flat, keyboardless devices with a touchscreen, processor, storage, and an internet connection. The Apple iPad is possibly the most well-known tablet. The Microsoft Surface is possibly the second most well-known. Other manufacturers offer tablets with Google's Android.

Cloudbooks are light, thin laptops. They contain a screen (possibly a touchscreen, but touch isn't a requirement), processor, storage, and internet connection, and the one thing that separates them from tablets: a keyboard. They look and feel like laptop computers, yet they are not laptops in the usual sense. They have a low-end processor and a custom operating system designed to do one thing: run a browser. The most well-known cloudbook computers are Google's Chromebooks.

I'm using the term "cloudbook" here to refer to the generic lightweight, low-powered, single-purpose laptop computer. A simple search shows that the phrase "cloudbook" (or a variation on capitalization) has been used for specific products, including an x86 laptop, a brand of e-books, a cloud services broker, and even an accounting system! Acer uses the name "cloudbook" for its, um, cloudbook devices.

Tablets and cloudbooks serve two different purposes. Tablets are designed for the consumption of data and cloudbooks are designed for the creation of data.

Tablets allow for the installation of apps, and there are apps for all sorts of things. Apps to play games. Apps to play music. Apps to chat with friends. Apps for e-mail (generally effective for reading e-mail and writing brief responses). Apps for Twitter. Apps for navigation.

Cloudbooks allow for the installation of apps too, although it is the browser, not the underlying operating system, that manages them. On a Chromebook, Chrome manages the apps. Google confuses the issue by listing web-based applications, such as its Docs word processor and Sheets spreadsheet, as "apps". The picture is further complicated by Google's duplicate apps for each environment: to work off-line, you must have a local (browser-based) app rather than a purely web-based one.

The apps for cloudbooks are oriented toward the composition of data: word processing, spreadsheets, photo editing, and more.

I must point out that these differences are matters of orientation, not absolute capability. One can consume data on a cloudbook. One can, with the appropriate tools and effort, create on a tablet. The two types of devices are not mutually exclusive. In my view, it is simply easier to consume on a tablet and easier to create on a cloudbook.

Tablets are already popular. I expect that cloudbooks will be popular with people who need to create and manage data. Two groups I expect to use cloudbooks are developers and system administrators. Cloudbooks are a convenient size for portability and capable enough to connect to cloud-based development services such as Cloud9, Codeanywhere, Cloud IDE, or Sourcekit.

Tuesday, July 28, 2015

Apple risks becoming the iPhone company

Apple has a diverse product line. For now.

Apple's products can be summarized as:

  • phones (iPhone and Apple Watch)
  • tablets (iPad and iPod)
  • laptops (MacBook and MacBook Pro)
  • desktops (iMac)
  • accessories
  • streaming services

I'm not quite sure where the Apple Watch fits in this list. I group it with the iPhone, since it is usable only with an iPhone.

Close examination of this grouping, and of Apple's financial results, shows that Apple makes the bulk of its profits from iPhones (and Apple Watch products). Sales of iPads have plateaued and may be declining. (The re-introduction of the iPod may be a response to that decline.) Laptops are selling well.

The other groups (desktops, accessories, and streaming services) count for little. (Accessories count for little because without the base product line, accessories are not necessary.)

Apple's desktop line is of little consequence. The products are vanity products, present because Apple can make them. The waste-basket Mac Pro looks nice and has lots of processing power, and while I know many people who would like one, I know precious few people who have actually bought one.

So the two big product groups are phones and laptops. And as I see it, laptops are at risk.

Apple's MacBooks and MacBook Pros are popular. Developers use them. Individuals who are not developers use them. Startup businesses use them. (Established businesses not so much.) Yet their use, at least among developers, makes little sense. More and more, I see them used (by developers and startup businesses) as access points to servers. MacBooks are not used as development tools but as smart terminals (*very* smart terminals) to the servers with development tools.

The problem for Apple is that competing tools can be had for less. Apple has always charged a premium for its products, but the competition now costs significantly less. A Windows-based laptop can be had for half the price of a MacBook, and a Chromebook for less than one quarter of the price. Windows and ChromeOS run browsers and ssh just as well as Mac OS, and developers know it.

Developers are not the only ones shifting to the "PC as a terminal" model. Businesses are using virtual desktops and terminal sessions for many of their systems. They, too, can count.

When MacBooks lose their appeal, Apple will lose a chunk of business, but more importantly, its business will become less diverse. Should that happen, Apple may focus more on its profitable line (iPhones) and reduce development of other lines. Just as Apple let its desktop line wither into a vanity high-end product, it may do the same for laptops.

Wednesday, July 22, 2015

Locked out of the walled gardens

A dark side of the walled gardens of technology is becoming visible.

The old paradigm was one of home ownership. You purchased your equipment (a PC) and then you installed whatever software you wanted. You decided. There were constraints, of course. PCs typically ran Windows (or before that, DOS) and the software had to run under the operating system. PCs had various configurations and the software had to fit (a floppy-only PC could not run large software that required a hard drive, for example).

Once installed, you had to see to the care and feeding of the software. You had to ensure that updates were applied. You had to ensure that data you exchanged with other users was compatible with their systems.

The benefit of this paradigm is the open market and the freedoms that come with it. Vendors are free to enter the market and offer their wares. You are free to choose among those products. You could pick from a number of word processors, spreadsheets, databases, compilers and IDEs, project managers, and other product categories.

The walled gardens of iOS and Android (and soon Windows and MacOS X) provide a different paradigm. If the old paradigm was one of home ownership, the new paradigm is one of renting an apartment. You still have a place for your stuff, yet a lot of the tedious chores of ownership have been removed.

With Apple's walled garden of iOS (and the gatekeeper iTunes), updates are automatic, and software is guaranteed to be compatible. The same holds for Google's Android garden and its gatekeeper 'Play Store'. They guard against 'unfriendly' software.

But the price of living inside the walled garden is that one loses the open market. Only selected vendors may enter the market, and those that do may offer only a limited selection of products. Apple and Google enforce requirements for products in their walled gardens through registration and their gatekeepers. Apple forbids a number of product types in iOS and limits others. Web browsers, for example, must use Apple's WebKit engine rather than supply their own; Apple also forbids apps that embed programming or scripting languages.

We're now seeing the Flash technology being pushed out of the walled gardens. Apple has prohibited it from the beginning. Google has deprecated it on YouTube. How long before Microsoft kicks it out of its garden?

The expulsion of Flash may foreshadow other exclusions of technology. Apple could, at any time, remove Microsoft's apps from iTunes. Google could remove Adobe apps from the Play Store. Microsoft could kick out Oracle apps from the (soon to be revived, I think) Microsoft App Store.

The ability to remove apps from the garden is something that the enterprise folks will want to think about. Building a business on a walled garden has the risk of existing at the whim of the gardener.

Tuesday, July 14, 2015

Public, private, and on-premise clouds

The cloud platform is flexible. Its primary degree of flexibility is scalability -- the ability to add (or remove) processing nodes as needed. Yet it has more possibilities. Clouds can be public, private, or on-premise.

Public cloud: The cloud services offered by the well-known vendors (Amazon.com, Microsoft, Rackspace). The public cloud consists of virtual machines running on shared hardware. My virtual server may be on the same physical server as your virtual server. (At least today; tomorrow our virtual servers might be hosted on other shared hardware. The cloud is permitted to shift virtual servers to suit its needs.)

Private cloud: Servers and services offered by big vendors (Amazon.com, Microsoft, IBM, Oracle, and more) with dedicated hardware. (Sometimes. Different vendors have different ideas of "private cloud".) The cost is higher, but the private cloud offers more consistent performance and (theoretically) higher security, as only your servers run on the hardware.

On-premise cloud: Virtual servers running on hardware that is located in your data center. The selling point is that you have control over physical access to the hardware. (You also pay for the hardware.)

Which configuration is best? The answer, as with many questions about systems, is: "it depends".

Some might think that on-premise clouds are better (even with the higher cost) because you have the most control. That's a debatable point, in today's connected world.

An aspect of the on-premise cloud configuration you may want to consider is scalability. The whole point of the cloud is to get more processors on-line quickly (within minutes) and avoid the long procurement, installation, and configuration processes associated with traditional data centers. On-premise clouds let you do that, provided that you have enough hardware to support the peak level of demand. With the public cloud you share the hardware; increasing hardware capacity is the cloud vendor's responsibility. With an on-premise cloud, you must plan for the capacity. If you need more hardware, you're back in the procurement, installation, and configuration bureaucracies.
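
To show what "more processors within minutes" looks like in a public cloud, here is a sketch using Amazon's EC2 service through the boto3 library. It assumes boto3 is installed and AWS credentials are configured; the image id and instance type are placeholders. The point is that added capacity is an API call, not a procurement cycle.

    import boto3

    ec2 = boto3.client("ec2")

    # Ask the public cloud for up to four more nodes; the vendor supplies the hardware.
    response = ec2.run_instances(
        ImageId="ami-00000000",   # placeholder machine image
        InstanceType="t2.micro",  # placeholder size
        MinCount=1,
        MaxCount=4,
    )

    for instance in response["Instances"]:
        print("launched", instance["InstanceId"])

An on-premise cloud can offer the same kind of call, but only up to the capacity of the hardware you have already bought.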

Startups that want to prepare for rapid growth benefit from the public cloud. They can defer paying for servers until they need them. (With an on-premise cloud, you have to buy the hardware to support your servers. Once bought, the hardware is yours.)

Established companies with consistent workloads benefit little from cloud processing. (Unless they are looking to distribute their processing among multiple data centers, and use cloud design for resiliency.)

Even companies with spiky workloads may want to stay with traditional data centers -- if they can accurately predict their needs. A consistent pattern over the year can be used to plan hardware for servers.

The one group that can benefit from on-premise clouds is large companies with dynamic workloads. By "dynamic", I mean a workload that shifts internally over time. If the on-line sales website needs the bulk of the processing during the day and the accounting systems need the bulk of the processing at night, and the workloads are about the same size, then an on-premise cloud makes some sense. The ability to "slosh" computing power from one department to another (or one subsidiary to another) while keeping the total computing capacity (relatively) constant fits well with the on-premise cloud.

I expect that most companies will look for hybrid configurations, blending private and public clouds. The small, focused virtual servers of cloud design allow for rapid re-deployment to different platforms. A company could run everything on its private cloud when business is slow, and when business (and processing) is heavy, shift non-critical tasks to public clouds, keeping the critical items in-house (or "in-cloud").

Such a design requires an evaluation of the workload and the classification of tasks. You have to know which servers can be sent to the public cloud. I have yet to see anyone discussing this aspect of cloud systems -- but I won't be surprised when they do.

Tuesday, July 7, 2015

I can write any language in FORTRAN

Experienced programmers, when learning a new programming language, often use the patterns and idioms of their old language in the new one. Thus, a programmer experienced in Java and learning Python will write code that, while legal Python, looks and smells like Java. The code is not "Pythonic".
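
A contrived example of my own makes the point. Both functions below are legal Python and do the same work; the first carries the habits of Java (index loops, manual accumulation), the second uses the idiom Python offers.

    # Java-ish habit: explicit index loop and manual accumulation.
    def upper_names_java_style(names):
        result = []
        for i in range(len(names)):
            name = names[i]
            result.append(name.upper())
        return result

    # Pythonic habit: a list comprehension says the same thing directly.
    def upper_names_pythonic(names):
        return [name.upper() for name in names]

    print(upper_names_java_style(["ada", "grace"]))   # ['ADA', 'GRACE']
    print(upper_names_pythonic(["ada", "grace"]))     # ['ADA', 'GRACE']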

I, when writing code in a new programming language, often write as if it were C. As I learn about the language, I change my pattern to match the language. The common saying is that a good programmer can write any language in FORTRAN. It's an old saying, probably from the age when most programmers learned COBOL and FORTRAN.

When the IT world shifted from structured programming languages (C, BASIC, FORTRAN) to object-oriented programming languages (C++, Java, C#) much of the code written in the new languages was in the style of the old languages. Eventually, we programmers learned to write object-oriented code.

Today, most programmers learn C# or Java as their first language.  Perhaps we should revise our pithy saying to: A good programmer can write any language in Java. (Or C#, if you prefer.)

Why is this important? Why think about the transition from structured programming ("procedural programming") to object-oriented programming?

Because we're going through another transition. Two, actually.

The first is the transition from object-oriented programming to functional programming. This is a slow change, one that will take several years and perhaps decades. Be prepared to see more about functional programming: articles, product releases, and services in programming platforms. And be prepared to see lots of functional programs written in a style that matches object-oriented code.
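
Here is a small, contrived sketch (again in Python, and again my own) of the shift in habits: the first version reaches for an object with mutable state, the second composes pure functions over immutable data. Expect to see a lot of the first style wearing the second style's clothes for a while.

    from functools import reduce

    # Object-oriented habit: an accumulator object with mutable state.
    class TotalPrice:
        def __init__(self):
            self.total = 0.0

        def add(self, price, tax_rate):
            self.total += price * (1 + tax_rate)

    # Functional habit: pure functions, no mutation.
    def with_tax(price, tax_rate):
        return price * (1 + tax_rate)

    def total_price(prices, tax_rate):
        return reduce(lambda acc, p: acc + with_tax(p, tax_rate), prices, 0.0)

    oo = TotalPrice()
    for p in (10.0, 20.0):
        oo.add(p, 0.05)
    print(oo.total)                         # 31.5
    print(total_price((10.0, 20.0), 0.05))  # 31.5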

The second is the transition from web applications to mobile/cloud applications. This change is faster and is already well underway. Yet be prepared to see lots of mobile/cloud applications architected in the style of web applications.

Eventually, we will learn to write good functional programs. Eventually, we will learn to design good cloud systems. Some individuals (and organizations) will make the transition faster than others.

What does this mean for the average programmer? For starters, be aware of your own skills. Second, have a plan to learn the new programming paradigms. Third, be aware of the skills of a hiring organization. When a company offers you a job, understand how that company's level matches your own.

What does this mean for companies? First, we have yet another transition in the IT world. (It may seem that the IT world has a lot of these inconvenient transitions.) Second, develop a plan to change your processes to use the new technology. (The changes are happening whether you like them or not.) Third, develop a plan to help your people learn the new technologies. Companies that value skilled employees will plan for training, pilot programs, and migration efforts. Companies that view employees as expensive resources that are little more than cost centers will choose to simply engage contractors with the skills and lay off those currently on their payrolls.

And in the short term, be prepared to see a lot of FORTRAN.

Sunday, July 5, 2015

Apple wins the race for laptops -- now what?

Apple is successful, in part, due to its hardware design. Its products are lovely to look at and comfortable to hold. Its portable devices are thinner and lighter than the competition. Apple has defined the concept of laptop for the last several years -- perhaps the last decade. Apple has set the standard and other manufacturers have followed.

The latest MacBook design is light, sleek, and capable. It is the ultimate in laptop design. And by "ultimate", I mean "ultimate". Apple has won the laptop competition. And now, they have a challenge: what to do next?

Apple's advantage with the MacBook will last for only a short time. Already other manufacturers are creating laptops that are just as thin and sleek as the MacBook. But Apple cannot develop a thinner, sleeker MacBook. The problem is that there are limits to physical size. There are certain things that make a laptop a laptop. You need a display screen and a keyboard, along with a processor, memory, and I/O ports. While the processor, memory, and I/O ports can be reduced in size (or in number), the screen and keyboard must remain a certain size, due to human physiology.

So how will Apple maintain its lead in the laptop market?

They can use faster processors, more memory, and higher resolution displays.

They can add features to Mac OS X.

They can attempt thinner and sleeker designs.

They may add hardware features, such as wireless charging and 3G/4G connections.

But there is only so much room left for improvement.

I think the laptop market is about to change. I think that laptops have gotten good enough -- and that there are lots of improvements in other markets. I expect manufacturers will look for improvements in tablets, phones, and cloud technologies.

Such a change is not unprecedented. It happened in the desktop PC market. After the initial IBM PC was released, and after Compaq made the first clone, desktop PCs underwent an evolution that led to the PC units we have today -- and have had for the past decade. Desktop PCs became "good enough" and manufacturers moved on to other markets.

Now that laptops have become good enough, look for the design of laptops to stabilize, the market to fragment among several manufacturers with no leader, prices to fall, and innovation to occur in other markets.

Which doesn't mean that laptops will become unimportant. Just as PCs had a long life after they became "good enough" and their design stabilized, laptops will have a long life. They will be an important part of the IT world.

Wednesday, July 1, 2015

Oracle grabs Java and goes home

The US Supreme Court has declined to hear Google's appeal of the decision granting Oracle property rights to the Java API. This has caused some consternation in the tech industry. Rightfully so, I believe. But the biggest damage may be to Oracle.

I'm sure Oracle is protecting their property -- or so they think. By obtaining control of the Java API they have maintained ownership of the Java language. They have prevented anyone else from re-implementing Java, as Google did with Android.

This all makes sense from a corporate point of view. A corporation must do what is best for its shareholders, and it must keep control over its properties. If Google (or anyone else) re-implemented Java without Oracle's permission (read that as 'license' and 'royalty payments') then Oracle could be seen as delinquent.

Yet I cannot help but think that Oracle's actions in this case have devalued the Java property. Consider:

Google, the immediate target, will pay Oracle for the Java API in Android. But these payments will be for a short term, perhaps the next five years. Can one doubt that Google will redesign Android to use a different language (perhaps Go) for its apps? Oracle will get payments from Google in the short term but not in the long term.

Other tech suppliers will move away from Java. Apple and Microsoft, for example, have little to gain by supporting Java. Microsoft has been burned by Java in the past (see "Visual J++") and doesn't need to be burned again.

Users of Java, namely large corporations, may re-think their commitment to Java. Prior to Oracle's grab, Java was seen as a neutral technology, one not tied to a large tech supplier. C# and Visual Basic are tied to Microsoft; Objective-C and Swift are tied to Apple. C and C++ are not tied to specific vendors, but they are considered older technologies and expensive to develop with. Other languages not tied to vendors (Python, Ruby) have appeal. When other languages gain, Java loses.

The open source world will look at Oracle's actions dimly. Open source is about sharing, and Oracle's move is all about *not* sharing. The open source movement was already suspicious of Oracle; this move will push developers away.

Java has been losing market share. The Tiobe index has seen a steady decline in Java's popularity over the past fifteen years. Oracle, by shouting "mine!", has perhaps accelerated that decline.

In the long term, Oracle may have damaged the Java property. Which is not good for their shareholders.

Java is a valuable property only while people use it. If everyone (except Oracle) abandons Java, Oracle will have a property with little value.