The advent of "bring your own device" (BYOD) poses a problem for companies and their system administrators and support teams. What hardware/software combination shall we choose as the company standard?
In the "old world" of desktop PCs, the choice was easy: Microsoft. Microsoft was the dominant player with the largest market share and the largest set of technologies. Microsoft was also the familiar choice; companies had a lot of Microsoft technology and companies knew that new offerings from Microsoft would work with the existing tech.
The brave new world of tablet/cloud technologies offers us choices, but there is no easy choice. The dominant vendors are not the familiar ones. Microsoft, the familiar choice, has some offerings in cloud technologies but is not the dominant player. (That role is held by Amazon.com.) With tablets, the leader is Apple. With cell phones, the leader is Apple (or Google with Android, depending on how you look at the numbers). In none of the new tech -- phones, tablets, or cloud -- is Microsoft the leader.
If one cannot choose Microsoft, can one at least choose a new single vendor for all the tech? Yes, with some complications. First, the new tech will coordinate poorly -- if at all -- with the existing (probably Microsoft) equipment. Second, the technology lead is split among vendors. Apple may lead in the cell phone and tablet market, but its iCloud offering is significantly weaker than Amazon.com's AWS offerings. Amazon.com offers the most popular cloud environment, but its tablets are designed for e-books and it has no cell phones. Google has a strong contender with Android for cell phones and tablets, but its cloud offerings are limited compared to AWS and Microsoft's Azure.
The situation is similar to the game of "rock-paper-scissors": no one vendor wins in all categories. If the game depends on cloud tech, then Amazon.com is the safest bet. If instead the game depends on tablets, you may want to use Apple technology. But who can tell the win conditions for the game?
My position is that the choice of a new standard for BYOD is a false one. Or at least the choice of a single set of technology across the company is the wrong choice.
The idea of a single standard, a single set of hardware (and limited software) is a hold-over from the desktop PC model of hardware. In that model, companies purchased, supplied, and supported the hardware and software. When you purchase and supply the equipment, you have to support it. Support costs money (you have to train people) and more configurations means higher costs. Fewer configurations means lower support costs, which is why Southwest Airlines operates only Boeing 737 aircraft.
But with BYOD, the support is shifted from the company to the employee. The employee purchases the equipment. They purchase the software. They maintain the device. Forcing a single configuration (or a limited number) is possible, but it saves you nothing. If anything, it will irritate the employees who will see no reason for the restrictions. (It is similar to insisting that your employees all drive cars made by Ford. Unless you are the Ford Motor Company, you have no moral standing for such an edict.)
I recognize that the different tablets and phones run on different platforms. Apps written for iOS must be rewritten to run on Android. There are some tools to assist with multi-platform apps. Some apps are best converted to HTML5 and JavaScript, with style sheets for the different platforms. But this question of platforms applies only to your custom, in-house applications. The "big" apps, the popular, general-business apps, will move to all of the platforms, just as Facebook and Twitter run on the major platforms. You care about platforms only for your custom applications.
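As a sketch of that HTML5/JavaScript approach: a single codebase can detect the platform at runtime and load a platform-specific style sheet, keeping the application logic shared. The platform names, file paths, and element id below are illustrative assumptions, not any particular framework's API.

```javascript
// Minimal sketch: one HTML5/JS app, styled per platform via a swapped
// stylesheet. Platform names and file paths are illustrative assumptions.
function detectPlatform(userAgent) {
  if (/iPad|iPhone|iPod/.test(userAgent)) return "ios";
  if (/Android/.test(userAgent)) return "android";
  return "desktop";
}

function stylesheetFor(platform) {
  // The application logic stays shared; only the styling varies.
  return "styles/" + platform + ".css";
}

// In a browser, one might apply it like this (hypothetical element id):
// document.getElementById("platform-css").href =
//     stylesheetFor(detectPlatform(navigator.userAgent));
```

The custom, in-house logic lives once; only the thin styling layer varies per platform.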
Transitioning from the "we supply everything" model to the "bring your own device" model requires changes for the company and for the employees. It is more complex than a simple "go get your own device" memo. Employees have to take ownership of their devices. Employers must let go of certain decisions and some control. Don't make it harder than it needs to be.
Sunday, July 29, 2012
Saturday, July 28, 2012
Centralizing or decentralizing? What's happening now?
One might be confused about the direction of today's PC technologies. Are they becoming more distributed, or are they becoming more centralized?
There are two large-scale changes in today's technologies. Depending on which you look at, you will see the trend for distributed control or the trend for centralized control. (Kind of like those optical illusions of stairs that go up, or down, depending on your visual perception.)
The distribution of PC applications is becoming centralized: Apple's business model is centralized. Applications for iPhones, iPods, and iPads are distributed through iTunes and through iTunes only. This is very different from the approach used by Microsoft for PC-DOS and Windows applications, in which anyone could write and distribute an application, and anyone could purchase and install an application (provided they had administrator privileges) without any involvement or supervision from Microsoft. Apple has complete control over iTunes and one can distribute an iPad/iPhone/iPod app only with Apple's permission.
Apple has declared its intention to move the Mac world from the "open distributor" model to an "Apple centric" model with the "App Store". Indeed, applications that use iCloud must be distributed through the App Store. Applications distributed via the "open distributor" model cannot use iCloud services.
Microsoft is considering a closed distribution model with Windows 8 and the Metro environment.
The selection and ownership of PCs are becoming decentralized: The "bring your own device" fad (for now, let us call it a fad) shifts the ownership of PCs from employers to employees. The previous model saw employers specifying, providing, and provisioning PCs for workers. Often a company would have a standard configuration of hardware and software, issued to all employees. A standard configuration reduced support costs, since there was one (or a limited number) of hardware and software combinations.
With the "bring your own device" fad, employees pick the device, employees own the device, and employees provision and maintain the device. One person may pick a Windows laptop PC, another may pick a MacBook, and a third may pick a Linux tablet. This is clearly a decentralization of decisions -- although the employer retains decisions for the development of company-specific applications. An employer may develop custom software for their employees and build it for one or a limited number of platforms. (Such as custom software for insurance adjusters that runs only on iPads. You can be sure that insurance adjusters at that company will select iPads.)
The two shifts are symptoms of larger changes. First, software is becoming a commodity, with the important packages running on multiple platforms (or equivalents running on different platforms). Second, power is shifting from PC customers (large user corporations) to PC platform manufacturers (Apple, Google, Microsoft).
Software is a commodity, and the different packages offer no compelling advantages. For word processors, Microsoft Office is just as good as Libre Office. And Libre Office is just as good as Microsoft Office. In the past, Microsoft Office did offer compelling advantages: it ran on Windows, it ran efficiently and reliably, and it used proprietary formats that demanded that new users have the same software. Those advantages have disappeared, for various reasons.
Software is a commodity, and current products are "good enough". There is little to be gained by adding new features to a word processor, to a spreadsheet, to an e-mail/calendar application. I may sound a little like the "we don't need a patent office because we have invented everything" argument, but bear with me.
The core packages used to run offices (word processors, spreadsheets, e-mail, calendars, presentations, etc.) are good enough, the data is interchangeable (or convertible), and the user interfaces are easy enough to understand.
If we need additional functionality in an office, a company will get it (either by building it or buying it). But they will do so with extra software, not through extensions to the core packages. (The one possible exception might be spreadsheet macros.) The core office packages are commodities.
Apple knows this. It spends no effort building its own version of office tools. I suspect that Microsoft understands this too, and is preparing for the day when lots of customers move away from Microsoft Office.
Apple and Microsoft are building new mechanisms to extract value from their customers: walled gardens in which they are the gatekeepers. The application software will be less important and the walls around the garden will be more important. Apple uses its iTunes as a tollbooth, extracting a percentage of every sale. I expect Microsoft to do the same.
What this means for "the rest of us" (individuals, user companies, developers, etc.) remains to be seen.
Sunday, July 22, 2012
NoSQL is no big deal -- and that is a big deal
Things move fast in tech, or so they say. I saw this effect in action at the OSCON conference, just last week.
I attended this conference last year, so I can compare the topics of interest for this year against last year.
Last year, NoSQL was the big thing. People were talking about it. Vendors were hawking their NoSQL databases. Developers were talking (incessantly, at times) about their projects that use NoSQL databases. Presenters gave classes on the concepts and proper use of NoSQL databases.
This year, no one was talking about it. (Well, a few people were, but they were a tiny minority.) It wasn't that people had rejected NoSQL databases. In fact, quite the opposite: people had accepted them as normal technology. NoSQL databases are now considered to be "just another tool in our set".
So in the space of twelve months, NoSQL has gone from "the cool new thing" to "no big deal".
And that, I think, is a big deal.
Tuesday, July 17, 2012
How big is "big"?
A recent topic of interest in IT has been "big data", sometimes spelled with capitals: "Big Data". We have no hard and fast definition of big data, no specific threshold to cross from "data" to "big data". Does one terabyte constitute "big data"? If not, what about one petabyte?
This puzzle is similar to the question of "real time". Some systems must perform actions in "real time", yet we do not have a truly standard definition of the term. If I design a dashboard system for an automobile and equip the automobile with sensors that report data every two seconds, then a real-time dashboard system must process all of the incoming data, by definition. Should I replace the sensors with units that report data every 1/2 second and the dashboard cannot keep up with the faster rate, then the system is not "real time".
But this means that the definition of "real time" depends not only on the design of the processing unit, but also on the devices with which it communicates. The system may be considered "real time" until we change a component; then it is not.
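The dashboard example reduces to a one-line check: the same processing unit passes or fails the "real time" test depending solely on the sensor interval. A minimal sketch, with the sensor intervals taken from the example above and a processing time of 1 second per reading assumed for illustration:

```javascript
// "Real time" depends on the whole system: the processor keeps up only if it
// finishes each reading before the next one arrives.
function isRealTime(sensorIntervalSeconds, processingSecondsPerReading) {
  return processingSecondsPerReading <= sensorIntervalSeconds;
}

// The same processing unit (assume 1 second per reading) under two sensor rates:
isRealTime(2.0, 1.0); // sensors report every 2 s → true ("real time")
isRealTime(0.5, 1.0); // faster sensors, same processor → false (not "real time")
```

Nothing about the processor changed between the two calls; only the peripheral component did, and the verdict flipped.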
I think that the same logic holds for "big data" systems. Today, we consider multiple petabytes to be "big data". Yet in 1990, when PCs had disks of 30 megabytes, a data set of one gigabyte would be considered "big data". And in the 1960s, a data set of one megabyte would be "big data".
I think that, in the end, the best we can say is that "big" is as big as we want to define it, and "real time" is as fast as we want to define it. "Big data" will always be larger than the average organization can comfortably handle, and "real time" will always be fast enough to process the incoming transactions.
Which means that we will always have some systems that handle big data (and some that do not), and some systems that run in real time (and some that do not). Using the terms properly will rely not on the capabilities of the core components alone, but on our knowledge of the core and peripheral components. We must understand the whole system to declare it to be "big data" or "real time".
Sunday, July 15, 2012
A little knowledge...
Let's start with knowledge. Knowledge on a project can cover a lot of ground, from basic procedures (how to build the programs) to the corporate history and mission and how it affects the design of programs. Some knowledge can be encoded in written procedures, step-by-step lists of operations that provide a specific result. Other knowledge is subtle, and requires good judgement.
This second type of knowledge is hard to acquire and cannot be easily transferred to another person. It is learned only through experience, through trial-and-error, through guidance and mentoring. Knowledge of the first type, in contrast, can be transferred by simply handing a document to a person.
This division of knowledge is not limited to development projects or even IT. Groups such as the Boy Scouts cope with these two types of knowledge. The scout manual covers the basics, but the "good judgement" skills require experience and leadership from other scouts and scoutmasters.
So here is my "great insight":
With a team that has high turnover, the only knowledge that one can expect is the knowledge of the first type, the easily documented procedures. Members of a high-turnover team do not invest enough time in the project to learn the "judgement stuff". Therefore, members of a high-turnover team cannot be expected to "use good judgement" to make decisions and resolve issues; they can only use the documented procedures.
This has ramifications in two areas. One is the documented procedures; the other is the tasks that can be assigned. Documented procedures must be clear and simple, and grounded in an expected base of experience. They must be of a limited size, something that can be read in a reasonable time. They cannot be a large tome of all possible conditions and desired outcomes -- people will not be able to absorb all of it or navigate to the proper section.
Tasks assigned to a high-turnover team are limited. One cannot assign tasks that require judgement based on knowledge of the corporate culture and project history -- team members will never have this knowledge. (If the outsourcing provider claims that the team can handle such tasks, then the provider must show that the team will have this long-learned knowledge. But I suspect providers will limit their claims to procedure-driven tasks such as coding and execution of test scripts.)
Knowing these limits on outsourcing guides the client company's strategy. With a high-turnover team, the client company must prepare clear and comprehensive procedures for specific tasks. The client can assign procedure-driven tasks but not judgement-driven tasks.
If the client wants to outsource judgement-driven tasks (project management, UI design, test script composition) then the outsourcing company must provide long-term, low-turnover teams. If they cannot, then the client cannot assign those tasks. (At least, not if they want to be successful.)
Friday, July 13, 2012
This time it will be different... for Microsoft, for Apple, and all of us
This month saw another lackluster performance in PC sales. Some folks have indicated that this plateau of sales marks the start of the "Post-PC Era". I think that they are right, except that we will call it the "Tablet/cloud Era". We didn't call the PC Era the "Post Mainframe Era".
This new market is different for Microsoft. They enter as a challenger, not as the leader. They have entered other markets as a challenger (the Xbox is the most obvious example), but mostly they have lived in a market in which they were the top banana. Microsoft struck a fortuitous deal with IBM to supply PC-DOS for the IBM PC, won the OS/2-Windows battle, and had advantages in the development of applications for Windows. (Some have charged that Microsoft had unfair advantages, in that they had detailed technical knowledge of the inner workings of Windows and used that knowledge to build superior offerings, but that debate belongs in the 1990s.)
In the tablet/cloud market, Microsoft enters as a late-comer, after Apple and Google. Microsoft cannot rely on the automatic support of the business market. Success will depend not only on their ability to build hardware and operating systems, but to grow an ecosystem of apps and app developers.
This new market is different for Apple. They enter as the leader, not as a niche provider. Will they keep their lead? Apple has experience as the "odd guy" with a small fraction of the market. Apple is the only manufacturer from the pre-IBM PC age that is still around. Prior to the IBM PC, Apple had a large share but was not a leader. After the IBM PC, Apple kept a small share but remained a side player.
In the tablet/cloud market, Apple is a leader. They provide the most-coveted hardware. Companies copy their designs (and get sued by Apple). Success will require Apple to keep providing new, cool hardware and easy-to-use software.
For Apple and Microsoft, this time it really is different.
Wednesday, July 11, 2012
A keyboard just for me
We've had the QWERTY keyboard for ages. I learned to type on a real, honest-to-goodness manual typewriter with the QWERTY layout. I learned to program in BASIC on a Teletype ASR-33 (with a QWERTY keyboard), and 8080 assembly on a Heathkit H-89 (with a QWERTY keyboard). All of these devices had the keyboards built in, part of the device itself.
The IBM PC came with a QWERTY keyboard (at least in the US). Unlike the previous devices, it had a detachable keyboard, and one could replace it with a different layout. (Not that many people did.)
I am sure that some folks still use the Dvorak layout. Some have called for a new standard.
With the invention of the smartphone and the tablet, we now have virtual keyboards. They appear on the display and are composed of programmable bits.
It strikes me that we don't really need a standard keyboard layout. When keyboards were physical things, hard-wired to the main device, a standard layout made sense. (Even when keyboards were mechanical components of manual typewriters, a standard layout made sense.) Physical keyboards could not be changed, and a standard let people easily move from one device to another.
With virtual keyboards we can create individual keyboards and let them follow us from device to device. When keyboards are programmed bits on a screen, it is easy to program those bits for our preferences. We don't need a standard keyboard that everyone agrees to use; better to have our custom keyboard appear when we use the device.
Those custom keyboards can be any layout. They can be QWERTY. They can be Dvorak. They can be Dextr. They can be any of the above with slight changes. They can be wildly different. They can be programmed with additional keys for characters and glyphs outside of our normal set. (While I use the US layout, I often write the name "Mylène Farmer" and I need the accented 'è'.)
Beyond characters, we can add commonly used words. Android devices often add a key for ".com". We could add custom keys for e-mail addresses. When writing code, we could have special keyboards with keywords of the language ('if', 'while', etc.). Keyboards might interact with development environments. (We see something of this with the smart matching that suggests words as we type.)
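One way to picture such a personal keyboard is as plain data that follows the user from device to device. A minimal sketch; the profile fields and the renderer helper are assumptions for illustration, not any platform's actual API.

```javascript
// A per-user keyboard profile as plain data. Field names are illustrative.
const myKeyboard = {
  baseLayout: "qwerty",              // could be "dvorak", "dextr", ...
  extraCharacters: ["è", "é", "ü"],  // glyphs outside the user's normal set
  shortcutKeys: {
    dotCom: ".com",                  // a key that types ".com"
    email: "someone@example.com",    // a key that types a whole address
  },
};

// A renderer would place one virtual key per entry beyond the base layout.
function extraKeyCount(profile) {
  return profile.extraCharacters.length + Object.keys(profile.shortcutKeys).length;
}
```

Because the profile is just data, any device that can render it gives you your keyboard, not the manufacturer's.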
I see little need to stay with the QWERTY layout. (Designed, long ago, to prevent people from typing too quickly and jamming the mechanical keys in a manual typewriter.)
Let my keyboard layout be mine!