Tuesday, July 28, 2015

Apple risks becoming the iPhone company

Apple has a diverse product line. For now.

Apple's products can be summarized as:

  • phones (iPhone and Apple Watch)
  • tablets (iPad and iPod)
  • laptops (MacBook and MacBook Pro)
  • desktops (iMac)
  • accessories
  • streaming services

I'm not quite sure where the Apple Watch fits in this list. I group it with the iPhone, since it is usable only with an iPhone.

Close examination of this grouping, and Apple's financial results, shows that it makes the bulk of its profits from iPhones (and Apple Watch products). Sales of iPads have plateaued and may be declining. (The re-introduction of the iPod may be a result of that decline.) Laptops are selling well.

The other groups (desktops, accessories, and streaming services) count for little. (Accessories count for little because without the base product line, accessories are not necessary.)

Apple's desktop line is of little consequence. The products are vanity products, present because Apple can make them. The waste-basket Mac Pro looks nice and has lots of processing power, and while I know many people who would like one, I know precious few people who have actually bought one.

So the two big product groups are phones and laptops. And as I see it, laptops are at risk.

Apple's MacBooks and MacBook Pros are popular. Developers use them. Individuals who are not developers use them. Startup businesses use them. (Established businesses, not so much.) Yet their use, at least among developers, makes little sense. More and more, I see MacBooks used not as development tools but as access points to servers -- smart terminals (*very* smart terminals) for the servers that hold the development tools.

The problem for Apple is that competing tools can be had for less. Apple has always charged a premium for its products, but the competition now costs significantly less. A Windows-based laptop can be had for half the price of a MacBook, and a Chromebook for less than one quarter of the price. Windows and ChromeOS run browsers and ssh just as well as Mac OS X, and developers know it.

Developers are not the only ones shifting to the "PC as a terminal" model. Businesses are using virtual desktops and terminal sessions for many of their systems. They, too, can count.

When MacBooks lose their appeal, Apple will lose a chunk of business, but more importantly, its business will become less diverse. Should that happen, Apple may focus more on its profitable line (iPhones) and reduce development of other lines. Just as Apple let its desktop line wither into a vanity high-end product, it may do the same for laptops.

Wednesday, July 22, 2015

Locked out of the walled gardens

A dark side of the walled gardens of technology is becoming visible.

The old paradigm was one of home ownership. You purchased your equipment (a PC) and then you installed whatever software you wanted. You decided. There were constraints, of course. PCs typically ran Windows (or before that, DOS) and the software had to run under the operating system. PCs had various configurations and the software had to fit (a floppy-only PC could not run large software that required a hard drive, for example).

Once installed, you had to see to the care and feeding of the software. You had to ensure that updates were applied. You had to ensure that data you exchanged with other users was compatible with their systems.

The benefit of this paradigm is the open market and the freedoms that come with it. Vendors are free to enter the market and offer their wares. You are free to choose among those products. You can pick from a number of word processors, spreadsheets, databases, compilers and IDEs, project managers, and other product categories.

The walled gardens of iOS and Android (and soon Windows and Mac OS X) provide a different paradigm. If the old paradigm was one of home ownership, the new paradigm is one of renting an apartment. You still have a place for your stuff, yet a lot of the tedious chores of ownership have been removed.

With Apple's walled garden of iOS (and the gatekeeper iTunes), updates are automatic, and software is guaranteed to be compatible. The same holds for Google's Android garden and its gatekeeper 'Play Store'. They guard against 'unfriendly' software.

But the price of living inside the walled garden is that one loses the open market. Only selected vendors may enter the market, and those that do may offer a limited selection of products. Apple and Google enforce requirements for products in their walled gardens, through their registration and gatekeepers. Apple forbids a number of products in iOS and limits others. Web browsers, for example, must use Apple's WebKit engine and not install their own; Apple also forbids apps that act as programming or scripting environments.

We're now seeing the Flash technology being pushed out of the walled gardens. Apple has prohibited it from the beginning. Google has deprecated it on YouTube. How long before Microsoft kicks it out of its garden?

The expulsion of Flash may foreshadow other exclusions of technology. Apple could, at any time, remove Microsoft's apps from iTunes. Google could remove Adobe apps from the Play Store. Microsoft could kick out Oracle apps from the (soon to be revived, I think) Microsoft App Store.

The ability to remove apps from the garden is something that the enterprise folks will want to think about. Building a business on a walled garden has the risk of existing at the whim of the gardener.

Tuesday, July 14, 2015

Public, private, and on-premise clouds

The cloud platform is flexible. Its primary degree of flexibility is scalability -- the ability to add (or remove) processing nodes as needed. Yet it has more possibilities. Clouds can be public, private, or on-premise.

Public cloud: The cloud services offered by the well-known vendors (Amazon.com, Microsoft, Rackspace). The public cloud consists of virtual machines running on shared hardware. My virtual server may be on the same physical server as your virtual server. (At least today; tomorrow our virtual servers might be hosted on other shared hardware. The cloud is permitted to shift virtual servers to suit its needs.)

Private cloud: Servers and services offered by big vendors (Amazon.com, Microsoft, IBM, Oracle, and more) with dedicated hardware. (Sometimes. Different vendors have different ideas of "private cloud".) The cost is higher, but the private cloud offers more consistent performance and (theoretically) higher security, as only your servers are running on the hardware.

On-premise cloud: Virtual servers running on hardware that is located in your data center. The selling point is that you have control over physical access to the hardware. (You also pay for the hardware.)

Which configuration is best? The answer, as with many questions about systems, is: "it depends".

Some might think that on-premise clouds are better (even with the higher cost) because you have the most control. That's a debatable point, in today's connected world.

An aspect of the on-premise cloud configuration you may want to consider is scalability. The whole point of the cloud is to get more processors on-line quickly (within minutes) and avoid the long procurement, installation, and configuration processes associated with traditional data centers. On-premise clouds let you do that, provided that you have enough hardware to support the top level of demand. With the public cloud you share the hardware; increasing hardware capacity is the cloud vendor's responsibility. With an on-premise cloud, you must plan for the capacity. If you need more hardware, you're back in the procurement, installation, and configuration bureaucracies.

Startups that want to prepare for rapid growth benefit from the public cloud. They can defer paying for servers until they need them. (With an on-premise cloud, you have to buy the hardware to support your servers. Once bought, the hardware is yours.)

Established companies with consistent workloads benefit little from cloud processing. (Unless they are looking to distribute their processing among multiple data centers, and use cloud design for resiliency.)

Even companies with spiky workloads may want to stay with traditional data centers -- if they can accurately predict their needs. A consistent pattern over the year can be used to plan hardware for servers.

The one group that can benefit from on-premise clouds is large companies with dynamic workloads. By "dynamic", I mean a workload that shifts internally over time. If the on-line sales website needs the bulk of the processing during the day and the accounting systems need the bulk of the processing at night, and the workloads are about the same, then an on-premise cloud makes some sense. The ability to "slosh" computing power from one department to another (or one subsidiary to another) while keeping the total computing capacity (relatively) constant fits well with the on-premise cloud.

I expect that most companies will look for hybrid configurations, blending private and public clouds. The small, focused virtual servers of cloud designs allow for rapid re-deployment to different platforms. A company could run everything on its private cloud when business is slow, and when business (and processing) is heavy, shift non-critical tasks to public clouds, keeping the critical items in-house (or "in-cloud").

Such a design requires an evaluation of the workload and the classification of tasks. You have to know which servers can be sent to the public cloud. I have yet to see anyone discussing this aspect of cloud systems -- but I won't be surprised when they do.

Tuesday, July 7, 2015

I can write any language in FORTRAN

Experienced programmers, when learning a new programming language, often use the patterns and idioms of their old language in the new one. Thus, a programmer experienced in Java and learning Python will write code that, while legal Python, looks and smells like Java. The code is not "Pythonic".

I, when writing code in a new programming language, often write as if it were C. As I learn about the language, I change my pattern to match the language. The common saying is that a good programmer can write any language in FORTRAN. It's an old saying, probably from the age when most programmers learned COBOL and FORTRAN.
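The difference between writing in a language and writing idiomatically in it is easy to see with a small example. Here is a hypothetical sketch (the function names and data are my own illustration): the first version is legal Python written with Java-flavored habits, the second is the same computation written the Pythonic way.

```python
# A Java-flavored approach: explicit index variable, manual accumulation.
def total_lengths_java_style(words):
    total = 0
    for i in range(0, len(words)):
        total = total + len(words[i])
    return total

# The Pythonic version: iterate directly and lean on built-ins.
def total_lengths_pythonic(words):
    return sum(len(word) for word in words)
```

Both produce the same result; only the second looks like Python rather than Java wearing Python syntax.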

When the IT world shifted from structured programming languages (C, BASIC, FORTRAN) to object-oriented programming languages (C++, Java, C#) much of the code written in the new languages was in the style of the old languages. Eventually, we programmers learned to write object-oriented code.

Today, most programmers learn C# or Java as their first language.  Perhaps we should revise our pithy saying to: A good programmer can write any language in Java. (Or C#, if you prefer.)

Why is this important? Why think about the transition from structured programming ("procedural programming") to object-oriented programming?

Because we're going through another transition. Two, actually.

The first is the transition from object-oriented programming to functional programming. This is a slow change, one that will take several years and perhaps decades. Be prepared to see more about functional programming: articles, product releases, and services in programming platforms. And be prepared to see lots of functional programs written in a style that matches object-oriented code.
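The same pattern shows up in this transition. A sketch, again with hypothetical names: the first function carries the object-oriented habit of mutating an accumulator step by step; the second expresses the same computation in a functional style, composing pure functions with no mutation.

```python
from functools import reduce

# Object-oriented/imperative habit: mutate state in a loop.
def sum_of_squares_imperative(numbers):
    result = 0
    for n in numbers:
        result += n * n
    return result

# Functional style: fold over the data with a pure combining function.
def sum_of_squares_functional(numbers):
    return reduce(lambda acc, n: acc + n * n, numbers, 0)
```

Early functional programs from object-oriented programmers tend to look like the first version, translated token by token, rather than the second.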

The second is the transition from web applications to mobile/cloud applications. This change is faster and is already well underway. Yet be prepared to see lots of mobile/cloud applications architected in the style of web applications.

Eventually, we will learn to write good functional programs. Eventually, we will learn to design good cloud systems. Some individuals (and organizations) will make the transition faster than others.

What does this mean for the average programmer? For starters, be aware of your own skills. Second, have a plan to learn the new programming paradigms. Third, be aware of the skills of a hiring organization. When a company offers you a job, understand how that company's level matches your own.

What does this mean for companies? First, we have yet another transition in the IT world. (It may seem that the IT world has a lot of these inconvenient transitions.) Second, develop a plan to change your processes to use the new technology. (The changes are happening whether you like them or not.) Third, develop a plan to help your people learn the new technologies. Companies that value skilled employees will plan for training, pilot programs, and migration efforts. Companies that view employees as expensive resources that are little more than cost centers will choose to simply engage contractors with the skills and lay off those currently on their payrolls.

And in the short term, be prepared to see a lot of FORTRAN.

Sunday, July 5, 2015

Apple wins the race for laptops -- now what?

Apple is successful, in part, due to its hardware design. Its products are lovely to look at and comfortable to hold. Its portable devices are thinner and lighter than the competition. Apple has defined the concept of laptop for the last several years -- perhaps the last decade. Apple has set the standard and other manufacturers have followed.

The latest MacBook design is light, sleek, and capable. It is the ultimate in laptop design. And by "ultimate", I mean "ultimate". Apple has won the laptop competition. And now, they have a challenge: what to do next?

Apple's advantage with the MacBook will last for only a short time. Already other manufacturers are creating laptops that are just as thin and sleek as the MacBook. But Apple cannot develop a thinner, sleeker MacBook. The problem is that there are limits to physical size. There are certain things that make a laptop a laptop. You need a display screen and a keyboard, along with processor, memory, and I/O ports. While the processor, memory, and I/O ports can be reduced in size (or in number), the screen and keyboard must be a certain size, due to human physiology.

So how will Apple maintain its lead in the laptop market?

They can use faster processors, more memory, and higher resolution displays.

They can add features to Mac OS X.

They can attempt thinner and sleeker designs.

They may add hardware features, such as wireless charging and 3G/4G connections.

But there is only so much room left for improvement.

I think the laptop market is about to change. I think that laptops have gotten good enough -- and that there are lots of improvements in other markets. I expect manufacturers will look for improvements in tablets, phones, and cloud technologies.

Such a change is not unprecedented. It happened in the desktop PC market. After the initial IBM PC was released, and after Compaq made the first clone, desktop PCs underwent an evolution that led to the PC units we have today -- and have had for the past decade. Desktop PCs became "good enough" and manufacturers moved on to other markets.

Now that laptops have become good enough, look for the design of laptops to stabilize, the market to fragment among several manufacturers with no leader, prices to fall, and innovation to occur in other markets.

Which doesn't mean that laptops will become unimportant. Just as PCs had a long life after they became "good enough" and their design stabilized, laptops will have a long life. They will be an important part of the IT world.

Wednesday, July 1, 2015

Oracle grabs Java and goes home

The US Supreme Court has declined to hear Google's appeal of the decision granting Oracle property rights over the Java API. This has caused some consternation in the tech industry. Rightfully so, I believe. But the biggest damage may be to Oracle.

I'm sure Oracle is protecting their property -- or so they think. By obtaining control of the Java API they have maintained ownership of the Java language. They have prevented anyone else from re-implementing Java, as Google did with Android.

This all makes sense from a corporate point of view. A corporation must do what is best for its shareholders, and it must keep control over its properties. If Google (or anyone else) re-implemented Java without Oracle's permission (read that as 'license' and 'royalty payments') then Oracle could be seen as delinquent.

Yet I cannot help but think that Oracle's actions in this case have devalued the Java property. Consider:

Google, the immediate target, will pay Oracle for the Java API in Android. But these payments will be for a short term, perhaps the next five years. Can one doubt that Google will redesign Android to use a different language (perhaps Go) for its apps? Oracle will get payments from Google in the short term but not in the long term.

Other tech suppliers will move away from Java. Apple and Microsoft, for example, have little to gain by supporting Java. Microsoft has been burned by Java in the past (see "Visual J++") and doesn't need to be burned again.

Users of Java, namely large corporations, may re-think their commitment to Java. Prior to Oracle's grab, Java was seen as a neutral technology, one not tied to a large tech supplier. C# and Visual Basic are tied to Microsoft; Objective-C and Swift are tied to Apple. C and C++ are not tied to specific vendors, but they are considered older technologies, and development in them is expensive. Other languages not tied to vendors (Python, Ruby) have appeal. When other languages gain, Java loses.

The open source world will look at Oracle's actions dimly. Open source is about sharing, and Oracle's move is all about *not* sharing. The open source movement was already suspicious of Oracle; this move will push developers away.

Java has been losing market share. The Tiobe index has seen a steady decline in Java's popularity over the past fifteen years. Oracle, by shouting "mine!", has perhaps accelerated that decline.

In the long term, Oracle may have damaged the Java property. Which is not good for their shareholders.

Java is a valuable property only while people use it. If everyone (except Oracle) abandons Java, Oracle will have a property with little value.

Tuesday, June 30, 2015

Apple's browser

Does Apple need a browser?

Apple has Safari, but maintains it poorly. Is Apple serious about its browser?

Come to think of it, why does Apple need a browser?

I can think of several reasons:

Ego: Apple has to have a browser to satisfy its ego. It needs to have a browser to meet some internal need.

Freudian? Yes. Reasonable? No.

Apple is a smart company. It doesn't invest in products to suit its ego. It invests to improve revenue.

Keeping up with Microsoft: Microsoft has a browser. Apple wants, at some level, to compete with Microsoft. Therefore, Apple needs a browser.

Doubtful. Apple doesn't have to match Microsoft one-for-one on features. They never have.

Superlative web experience: Apple knows Mac OS X and iOS better than anyone. They, and only they, can build the browser that provides the proper user experience.

Possible. But only if Apple thinks that a good web experience is necessary.

Avoid dependence on others: Mac OS X (and iOS), despite Apple's most fervent wishes, still needs a browser. Without an Apple browser, Apple would have to rely on another browser. Perhaps Google Chrome. Perhaps Mozilla Firefox. But relying on Google is risky -- Google and Apple are not the best of friends. Relying on Mozilla is also risky, but in another sense: Mozilla may not be around much longer, thanks to the popularity of other browsers (of which Safari is one).

All of these strategies have one thing in common: the assumption that Apple considers the web important.

I'm not sure Apple thinks that. It may be that Apple thinks, in the long run, that the web is unimportant. Apple focuses on native apps, not HTML 5 apps. The dominant app design processes data on the local device and uses the cloud for storage, but nothing more. Apple doesn't provide cloud services for computing. Apple has no service that matches Windows Azure or Google Compute Engine. In Apple's world, devices compute and the cloud stores.

* * * * *

The more I think about it, the more I convince myself that Apple doesn't need a browser. Apple has already delayed several improvements to Safari. Maybe Apple thinks that Safari is good enough for the rest of us.

In the Apple ecosystem, they may be right.