Showing posts with label smartphones. Show all posts

Wednesday, October 19, 2016

We prefer horizontal layers, not vertical stacks

Looking back at the 60-plus years of computer systems, we can see a pattern of design preferences. That pattern is an initial preference for vertical design (that is, a complete system from top to bottom) followed by a change to a horizontal divide between a platform and applications on that platform.

A few examples include mainframe computers, word processors, and smart phones.

Mainframe computers, in the early part of the mainframe age, were special-purpose machines. IBM changed the game with its System/360, which was a general-purpose computer. The S/360 could be used by commercial, scientific, or government organizations. It provided a common platform upon which application programs ran. The design was revolutionary, and it has stayed with us. Minicomputers followed the "platform and applications" pattern, as did microcomputers and later IBM's own Personal Computer.

When we think of the phrase "word processor", we think of software, most often Microsoft's "Word" application (which runs on the Windows platform). But word processors were not always purely software. The original word processors were smart typewriters, machines with enhanced capabilities. In the mid-1970s, a word processor was a small computer with a keyboard, display, processing unit, floppy disks for storage, a printer, and software to make it all go.

But word processors as hardware did not last long. We moved away from the all-in-one design. In its place we used the "application on platform" approach, using PCs as the hardware and a word processing application program.

More recently, smart phones have become the platform of choice for photography, music, and navigation. We have moved away from cameras (a complete set of hardware and software for taking pictures), moved away from MP3 players (a complete set of hardware and software for playing music), and moved away from navigation units (a complete set of hardware and software for providing directions). In their place we use smart phones.

(Yes, I know that some people still prefer discrete cameras, and some people still use discrete navigation systems. I myself still use an MP3 player. But the number of people who use discrete devices for these tasks is small.)

I tried thinking of single-use devices that are still popular, and none came to mind. (I also tried thinking of applications that ran on platforms that moved to single-use devices, and also failed.)

It seems we have a definite preference for the "application on platform" design.

What does this mean for the future? For smart phones, possibly not much -- other than that they will remain popular until a new platform arrives. For the "internet of things", it means that we will see a number of task-specific devices such as thermostats and door locks until an "internet of things" platform comes along, and then all of those task-specific devices will become obsolete (like the task-specific mainframes or word processor hardware).

For cloud systems, perhaps the cloud is the platform and the virtual servers are the applications. Rather than discrete web servers and database servers, the cloud becomes the platform for web server and database server "applications" -- containerized versions of the software. The "application on platform" pattern suggests that cloud and containers will endure for some time, and that they are a good architectural choice.
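The "application on platform" idea can be sketched in code. Here is a minimal illustration -- hypothetical names, not any particular cloud API -- in which the platform supplies shared services and applications plug in without knowing how those services are implemented:

```python
# Illustrative sketch (hypothetical names): the "application on platform"
# pattern. The platform offers shared services; applications plug in
# without knowing how those services are implemented.

class Platform:
    """A minimal platform: provides services, hosts applications."""
    def __init__(self):
        self._services = {}
        self._apps = []

    def provide(self, name, service):
        self._services[name] = service

    def install(self, app):
        self._apps.append(app)

    def run_all(self):
        # Run each installed application against the platform's services.
        return [app(self._services) for app in self._apps]

# Two "applications" that run on any platform offering a 'store' service.
def web_server(services):
    return "serving pages via " + services["store"]

def database_server(services):
    return "querying data via " + services["store"]

cloud = Platform()
cloud.provide("store", "block storage")   # the platform's shared resource
cloud.install(web_server)
cloud.install(database_server)
print(cloud.run_all())
```

The key property is that web_server and database_server know nothing about the platform's internals; swap out the platform and the applications still run -- the same property that let word processing software outlive word processing hardware.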

Thursday, January 8, 2015

Hardwiring the operating system

I tend to think of computers as consisting of four conceptual parts: hardware, operating system, application programs, and my data.

I know that computers are complex objects, and each of these four components has lots of subcomponents. For example, the hardware is a collection of processor, memory, video card, hard drive, ports to external devices, and "glue" circuitry to connect everything. (And even that is omitting some details.)

These top-level divisions, while perhaps not detailed, are useful. They allow me to separate the concerns of a computer. I can think about my data without worrying about the operating system. I can consider application programs without bothering with hardware.

It wasn't always this way. Oh, it was for personal computers, even those from the pre-IBM PC days. Hardware like the Altair was sold as a computing box with no operating system or software. Gary Kildall at Digital Research created CP/M to run on the various hardware available and designed it with a dedicated unit for interfacing with hardware. (That dedicated unit was the Basic Input-Output System, or 'BIOS'.)

The very early days of computers saw a close relationship between hardware, software, and data. The earliest computers had no operating systems (operating systems were themselves designed to separate the application program from the hardware). Computers were specialized devices, tailored to the task.

IBM's System/360 is recognized as the first general-purpose computer: a single computer that could be programmed for different applications and used within an organization for multiple purposes. That computer began our march toward separating hardware and software.

The divisions are not simply for my benefit. Many folks who work to design computers, build applications, and provide technology services find these divisions useful.

The division of computers into these four components allows for any one of the components to be swapped out, or moved to another computer. I can carry my documents and spreadsheets (data) from my PC to another one in the office. (I may 'carry' them by sending them across a network, but you get the idea.)

I can replace a spreadsheet application with a different spreadsheet application. Perhaps I replace Excel 2010 with Excel 2013. Or maybe change from Excel to another PC-based spreadsheet. The new spreadsheet software may or may not read my old data, so the interchangeability is not perfect. But again, you get the idea.

More than half a century later, we are still separating computers into hardware, operating system, application programs, and data.

And that may be changing.

I have several computing devices. I have a few PCs, including one laptop I use for my development work and e-mail. I have a smart phone, the third I have owned. I have a bunch of tablets.

For my PCs, I have installed different operating systems and changed them over time. The one Windows PC started with Windows 7. I upgraded it to Windows 8 and it now runs Windows 8.1. My Linux PCs have all had different releases of Ubuntu, and I expect to update them with the next version of Ubuntu. Not only do I get major versions, but I receive minor updates frequently.

But the phones and tablets are different. The phones (an HTC and two Samsung phones) have run a single operating system since I took them out of the box. (I think one of them got an update.) One of my tablets is an old Viewsonic gTablet running Android 2.2. There is no update to a later version of Android -- unless I want to 'root' the tablet and install another variant of Android like Cyanogen.

PCs get new versions of operating systems (and updates to existing versions). Tablets and phones get updates for applications, but not for operating systems -- or at least nowhere near as frequently as PCs do.

And I have never considered (seriously) changing the operating system on a phone or tablet.

Part of this change is due, I believe, to the change in administration. We who own PCs administer the PC and decide when to update software. But we who think we own phones and tablets do *not* administer the tablet. We do not decide when to update applications or operating systems. (Yes, there are options to disable or defer updates, in Android and iOS.)

It is the equipment supplier, or the cell service provider, who decides to update operating systems on these devices. And they have less incentive to update the operating system than we do. (I suspect updates to operating systems generate a lot of calls from customers, either because they are confused or the update broke some functionality.)

So I see the move to smart phones and tablets, and its corresponding shift of administration from user to provider, as a step in synchronizing hardware and operating system. And once hardware and operating system are synchronized, they are not two items but one. We may, in the future, see operating systems baked into devices with no (or limited) ways to update them. Operating systems may become part of the device, burned into ROM.

Thursday, December 20, 2012

The Cheapening of IT

The prices for computing equipment, over the years, have moved in one direction: down. I believe that the decrease in prices for hardware has an effect on our willingness to pay for software.

In the early 1960s, a memory expansion for the IBM 1401 provided 8K of what we today call RAM, at a price of $258,000. That was only the expansion pack of memory; the entire system cost several times that amount. With an investment of over a million dollars for hardware, an additional investment of several tens of thousands of dollars for software was quite the bargain.

In 1977, a Heathkit 8-bit microcomputer with an 8080 processor, 4K of RAM, and a cassette tape recorder/player (used for long-term storage prior to floppy disks), cost almost $1500. Software for such a computer ran from $20 (for a simple text editor) to $400 (for the Microsoft COBOL compiler).

Today, smart phones and tablets cost between $200 and $1000. (Significantly less than the Heathkit 8-bit system, once you account for inflation.) Tablet apps typically cost $10 or less; some cost more, and some are free.
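A rough back-of-the-envelope check of that inflation claim. The CPI figures here are my own approximations, not data from the original comparison:

```python
# Approximate U.S. CPI values (assumed): about 60.6 in 1977, about 229.6 in 2012.
cpi_1977 = 60.6
cpi_2012 = 229.6

heathkit_1977 = 1500  # approximate 1977 price of the Heathkit 8-bit system

# Convert the 1977 price into 2012 dollars.
heathkit_in_2012_dollars = heathkit_1977 * cpi_2012 / cpi_1977
print(round(heathkit_in_2012_dollars))  # roughly $5,700 in 2012 dollars
```

So even a high-end $1000 tablet costs well under a fifth of what the Heathkit cost in today's money.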

What effect does this decrease in the hardware cost have on the cost of software?

Here's my theory: as the cost of hardware decreases, the amount that we are willing to pay for software also decreases. I can justify spending $400 for software when the hardware costs several times that amount. But I have a harder time spending $400 on software when the hardware costs less than that. My bias is for hardware, and I am assigning higher intrinsic value to the hardware than the software. (The reasons behind this are varied, from the physical nature of hardware to the relationship with the vendor. I'm pretty sure that one could find a Master's thesis in this line of study.)

But if a cheapening of the hardware leads to a cheapening of the software, how does that change the industry? Assuming that the theory is true, we should see downward pressure on the cost of applications. And I think that we have seen this. The typical phone and tablet app holds a retail price that is significantly less than the price for a typical desktop PC application. "Angry Birds" costs only a fraction of the price of Microsoft Office.

I expect that this cost bias will extend to PC apps that move to tablets. Microsoft Word on the Surface will be priced under $40 (perhaps as an annual subscription), possibly less. The initial release of the Surface includes a copy of Word, although it is restricted to non-commercial use.

I also expect that the price of desktop PC apps will fall, keeping close to the prices of tablet apps. Why spend $400 for Word on the PC when one can get it for $40 on the tablet? The reduced price of apps on one platform drives down the price of apps on all platforms.

The cheapening effect may go beyond off-the-shelf PC applications. As the prices of desktop applications fall, we may see pressure to reduce the price of server-based systems, or server components of multiplatform systems. Again, this will be driven not by technology but by psychology: I cannot justify a multi-thousand-dollar cost for a server component when the corresponding desktop applications have low costs. The reduced prices of desktop applications drive down the prices of equivalent server applications. Not all server applications, mind you; only the server applications that have desktop equivalents, and only then when those desktop equivalents are reduced in price to match tablet apps.

The general reduction of prices for desktop and server applications may create difficulties for the big consulting shops. These shops charge high prices for the development of custom applications for businesses. Psychology may cause headaches for their sales teams: why should I spend hundreds of thousands of dollars on a custom app (which includes clients for desktop PCs, tablets, and smartphones, of course) when I can see that powerful, competent apps are marketed for less than $10 per user? While there is value in a custom application, and while a large company may need many "downloads" for their many users, the argument for such high prices becomes difficult. Is a custom app really adding that much value?

Look for the large consulting houses to move into new technologies such as cloud and "big data" as ways of keeping their rates high. By selling these new technologies, the consulting houses can offer something that is not readily apparent in the off-the-shelf apps. (At least until their customers figure out that the off-the-shelf apps are also using cloud and "big data" tech.)

All of this leads to downward pressure on the prices of apps, whether they are simple games or complex systems. That pressure, in turn, will put downward pressure on development costs and upward pressure for productivity. Where a project was run with a project manager, three tech leads, ten developers, three testers, two analysts, and a technical writer, future projects may be run with a significantly smaller team. Perhaps the team will consist of one project manager, one tech lead, three developers, and one analyst. I'm afraid the "do more with less" exhortation will be with us for a while.

Tuesday, October 30, 2012

BYOD can be easy with tablets

The "bring your own device" movement has caused quite a bit of heartburn among the corporate IT and security folks. More than is necessary, I think.

For those unfamiliar with the term "bring your own device" (BYOD), it means this: employees select their own devices, bring them to the office, and use them for work. Such a notion causes panic for IT. It upsets the well-balanced apple cart of company-supplied PCs and laptops. Corporations have invested in large efforts to minimize the costs (purchase costs and support costs) of PCs and laptops. If employees were allowed to bring their own hardware, the following would happen (in the thinking of the corporate cost-minimizers):

  • Lots of employees would have problems connecting to the company network, therefore they would call the help desk and drive up support costs
  • Employee-selected hardware would vary from the corporate standard, increase the number of hardware and software combinations, and drive up support costs

And in the minds of IT security:

  • Employee-selected hardware would be vulnerable to viruses and other malware, allowing such things behind the corporate firewall

But these ideas are caused by misconceptions. The first is that employees want to bring their own PCs (or laptops). But employees don't. (Or at least not the folks with whom I have spoken.) Employees want to bring cell phones and tablets, not laptops and certainly not desktop PCs.

The second misconception is that smartphones and tablets are the same as PCs, except smaller. This is also false. Yes, smartphones and tablets have processors and memory and operating systems, just like PCs (and mainframes, if you want to get technical). But we use tablets and smartphones differently than we use PCs and laptops.

We use laptops and PCs as members of a network with shared resources. These laptops and PCs are granted access to various network resources (printers, NAS units, databases, etc.) based on the membership of the PC (or laptop) within a domain and the membership of the logged-in user of domain-controlled groups. The membership of the PC within a domain gives it certain privileges, and these privileges can create vectors for malware.

Smartphones and tablets are different. We don't make them members of a domain. They are much closer to a browser on a home PC, used for shopping or online banking. My bank allows me to sign on, view balances, pay bills, and request information, all without being part of their domain or their security network.

How is this possible? I'm sure that banks (and other companies) have security policies that specify that only corporate-owned equipment can be connected to the corporate-owned network. I'm also sure that they have lots of customers, some of whom have infected PCs. Yet I can connect to their computers with my non-approved, non-certified, non-domained laptop and perform work.

The arrangement works because my PC is never directly connected to their network, and my work is limited to the capabilities of the banking web pages. Once I sign in, I have a limited set of possibilities, not the entire member-of-a-network smorgasbord.

We should think of smartphones and tablets as devices that can run apps, not as small PCs that are members of a domain. Let the devices run apps that connect to back-end servers; let those servers offer a limited set of functions. In other words, convert all business applications to smartphone apps.
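A minimal sketch of that server-side posture, with hypothetical tokens and operations: the back end authenticates a caller and offers only a fixed menu of functions, with no notion of domain membership.

```python
# Sketch (hypothetical tokens and operations): a back end that offers a
# limited menu of functions to authenticated callers. There is no domain
# membership; anything outside the menu is simply not reachable.

VALID_TOKENS = {"abc123": "jsmith"}  # tokens issued out-of-band, e.g. at sign-on

ALLOWED_OPERATIONS = {
    "view_balance": lambda user: "balance for " + user,
    "list_reports": lambda user: "reports for " + user + ": quarterly, annual",
}

def handle_request(token, operation):
    user = VALID_TOKENS.get(token)
    if user is None:
        return "401 unauthorized"
    handler = ALLOWED_OPERATIONS.get(operation)
    if handler is None:
        return "403 operation not offered"  # off the menu, so unreachable
    return handler(user)

print(handle_request("abc123", "view_balance"))     # balance for jsmith
print(handle_request("abc123", "format_disk"))      # 403 operation not offered
print(handle_request("bad-token", "view_balance"))  # 401 unauthorized
```

Anything not on the menu is unreachable by construction -- the same reason an infected home PC can use a bank's web site without endangering the bank's network.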

I recognize that changing the current (large) suite of business applications to smartphone apps is a daunting task. Lots of applications have been architected for large, multi-window screens. Many business processes assume that users can store files on their own PCs. Moving these applications and processes to smartphone apps (or tablet apps) requires thought, planning, and expertise. It is a large job, larger than installing "mobile device management" packages and adding new layers of security bureaucracy for mobile devices.

A large job, yet a necessary one. Going the route of "device management" locks us into the existing architecture of domain-controlled devices. In the current scheme, all new devices and innovations must be added to the model of centralized security.

Better to keep security through user authentication and isolate corporate hardware from the user hardware. Moving applications and business processes to tablet apps, separating the business task from the underlying hardware, gives us flexibility and freedom to move to other devices in the future.

And that is how we can get to "bring your own device".

Friday, March 23, 2012

The default solution

For decades, mainframes were the default solution to computing problems. When you needed something done, you did it on a mainframe, unless you had a compelling reason for a different platform.

For decades, IBM called the shots in the computer industry. The popularity of IBM hardware gave IBM the ability to strongly influence (some might say dictate) hardware and software standards. That power diminished with the rise of personal computers (ironically helped by the IBM PC). IBM ceded the control of software to Microsoft, first with DOS and later with Windows.

For decades, PCs were the default solution to computing problems. When you needed something done, you did it on a PC, unless you had a compelling reason for a different platform.

For decades, Microsoft called the shots. The popularity of Windows and Office gave Microsoft the ability to strongly influence (some might say dictate) hardware and software standards. That power diminished with the rise of hand-held computers (specifically iPods and iPhones). Microsoft ceded the market to Apple, after several failed attempts at moving Windows to hand-sized devices.

Now, smartphones and tablets are the default solution to computing problems. When you need something done, you do it on a smartphone or tablet, unless you have a compelling reason for a different platform.

The popular platforms are the default solutions, and the company with the dominant platform can set the standards and the direction of the technology. Notice that it is the popular platform that defines the default solution, not the most cost-effective or the most reliable. The default solution is defined by the market, specifically what customers are buying. It is not a democracy, but neither is it an inherited rank. A company has a leadership role because the market gives that company the role.

And the market can take away that role.

The change in the market from mainframe to PC was an expansion, not a revolution.

The events that unseated IBM were not market revolutions, in which one competitor replaced another. IBM the mainframe manufacturer was not ousted by another mainframe manufacturer. IBM defended itself against competitors, but failed to expand to new markets.

The PC revolution expanded the market. (It may have killed dedicated word processing systems, but overall it expanded the market.) The new market of word processing software, spreadsheets, and even primitive databases was something that IBM did not pursue with mainframes. It is possible that IBM was unable to pursue that market, as the PCs were small, inexpensive, and purchased by people who did not have a squadron of lawyers to review purchase and support contracts.

The market expanded but mainframes stayed constant, and that allowed PCs to become the default solution.

We have a similar situation with PCs and tablets.

The smartphone revolution (along with tablets) is expanding the market. The new market of location-aware apps, easy-to-install apps, and touchscreen interfaces is a market that Microsoft is only now beginning to pursue with Windows 8 and the Metro UI, and this effort is by no means guaranteed. (Many long-time supporters of Microsoft are grumbling at Windows 8.)

The market is expanding and PCs are mostly staying constant. That allows smartphones to become the default solution.

But PCs are not simply sitting still. PCs, and more specifically PC operating systems, are adopting the ideas of the smartphone market. Microsoft's Windows 8 is the most prominent example of this effect, with its new GUI and the new Microsoft Windows App Store. Apple's "Lion" release of OSX brings it closer to smartphone operating systems. Some Linux distributions are morphing their user interfaces into something closer to smartphones and are simplifying their package managers.

In the end, I think PCs will have a limited role. Data centers have never been fond of the tower-style units, preferring rack-mounted servers and now preferring virtual PCs running on mainframes, of all things! Home users will find smartphones and tablets less expensive, easier to use, and good enough to get the job done. Corporate users are the last bastion of PCs, and even they are looking at smartphones and tablets in the "Bring Your Own Device" movement.

PCs won't die out. Some tasks are handled better by PCs than by tablets. (Just as some tasks are handled better by mainframes than by PCs, even today.) Some people will keep PCs because they are "tried and true" solutions; others will be unwilling to move to different platforms. Hobbyists will keep them out of nostalgia.

But they won't be the default solution.

Tuesday, May 31, 2011

Fast food software

Our industry of software development has used the "large software" model for development. By "large software", I mean software that costs a lot. Not just in purchase cost (or support licenses) but in care and feeding.

I recently attended a science fiction convention. I noticed that the con used little in the way of tech. The folks running the con used computers, but few attendees used them. The con made no effort to interact with people via tech. The con had a static web site and a PDF with a list of events.

In contrast, the O'Reilly conferences use tech to involve people. They have a web site. You can create a profile. The web site has a schedule of sessions, and you can create your personal schedule, picking those sessions that interest you.

The difference between the two organizations is technical firepower. O'Reilly has the wherewithal to hire folks and make the interaction happen. Science fiction conventions are often volunteer-run and have little technical expertise on staff.

I suspect that a lot of small organizations, companies, and government agencies are in similar situations. They probably use PCs with Windows, MS-Office, and Internet Explorer... because that's the software that is on the PC when they take it out of the box. Small organizations don't have IT support teams and cannot adopt today's high-tech solutions (the effort outstrips the one guy who sets up the PCs).

Small shops are too often run on a bunch of PCs, a router, and shared spreadsheets. (Or worse, passed-around spreadsheets, and we're not really sure who has the latest changes.) They don't have the expertise to install and support custom-made systems. (I suspect that many small shops don't have the experience to install MS-Office and track licenses and activation codes.)

Smartphones and tablets (and the cloud) are a possible solution here. The model provided by smartphones and tablets is a good fit for these small organizations. Installation of software is handled by a single click. (No license, no activation code, no special instructions.) Smartphone and tablet software is also priced within the budget of a small organization. The local Mom-and-Pop store will never, ever, purchase and install an ERP system (nor should they) but they can record expenses on their phones.

The business model for small software is different from the business model for large software. Large software is like a dinner at a high-end restaurant: You have personal service and your meal is prepared according to your specifications. Small software is fast food: You pick a set of items from a limited menu and your meal is handed to you from a stack of prepared items, with little or no personalization. It is not elegant... but it is fast, predictable, and cheap.

With the right combination of front-end software (on tablets and phones) and back-end software (in the cloud), small organizations will be able to use tech and involve their customers. (And involve their customers in a way that is easy for customers to use.)

There is a market here. It's different from the current market, in terms of tech and business. It is a high-volume, low-margin market. And for those who exploit it capably, very profitable.