The introduction of the IBM PC was market-wrenching. Overnight, the small, rough-and-tumble market of microcomputers with diverse designs from various small vendors became large and centered around the PC standard.
From 1981 to 1987, IBM was the technology leader: it led in sales and defined the computing platform.
IBM's leadership fell to Compaq in 1987, when IBM introduced the PS/2 line with its new (incompatible) hardware. Compaq delivered old-style PCs, notably with the Intel 80386 processor, and later championed the faster EISA bus. (IBM stayed with the older 80286 and 8086 processors, eventually consenting to provide 80386-based PS/2 units.) Compaq even worked with Microsoft to deliver newer versions of MS-DOS that recognized larger disk capacities and optical disc readers.
But Compaq did not remain the leader. Its leadership declined gradually, ceding ground to the clone makers, especially Dell, HP, and Gateway.
The mantle of leadership moved from a PC manufacturer to the Microsoft-Intel duopoly. The popularity of Windows, along with marketing skill and software development prowess, led to a stable configuration for Microsoft and Intel. Together, they out-competed IBM's OS/2, Motorola's 68000 processor, DEC's Alpha processor, and Apple's Macintosh line.
That configuration held for two decades, roughly from 1990 to 2010, as Apple's iPhone (introduced in 2007) took hold. The genius move was not the iPhone hardware but the App Store and iTunes, which let people easily find, install, and pay for apps on their phones.
Now Microsoft and Intel have the same problem: after years of competing in a well-defined market (the corporate PC market), they struggle to move into the world of mobile computing. Microsoft's attempts at mobile devices (Zune, Kin, Surface RT) have flopped. Intel is desperately attempting to design and build processors suitable for low-power devices.
I don't expect either Microsoft or Intel to disappear. (At least not for several years, possibly decades.) The PC market is strong, and Intel can sell a lot of its traditional (heat radiators that happen to compute data) processors. Microsoft is a competent player in the cloud arena with its Azure services.
But I will make an observation: for the first time in the PC era, we find that there is no clear leader for technology. The last time we were leaderless was prior to the IBM PC, in the "microcomputer era" of Radio Shack TRS-80 and Apple II computers. Back then, the market was fractured and tribal. Hardware ruled, and your choice of hardware defined your tribe. Apple owners were in the Apple tribe, using Apple-specific software and exchanging data on Apple-specific floppy disks. Radio Shack owners were in the Radio Shack tribe, using software specific to the TRS-80 computers and exchanging data on TRS-80 diskettes. Exchanging data between tribes was one of the advanced arts, and changing tribes was extremely difficult.
There were some efforts to unify computing: CP/M was the most significant. Built by Digital Research (a software company with no interest in hardware), CP/M ran on many different configurations. Yet even that effort could not span the differences in processors, memory layout, and video configurations.
Today we see tribes forming around multiple architectures. For cloud computing, we have Amazon.com's AWS, Microsoft's Azure, and Google's App Engine. With virtualization we see VMware, Oracle's VirtualBox, the aforementioned cloud providers, and newcomer Docker as a rough analog of CP/M. Mobile computing sees Apple's iOS, Google's Android, and Microsoft's Windows RT as a (very) distant third.
With no clear leader and no clear standard, I expect each vendor to enhance their offerings and also attempt to lock in customers with proprietary features. In the mobile space, Apple's Swift and Microsoft's C# are both proprietary languages. Google's choice of Java puts them (possibly) at odds with Oracle -- although Oracle seems to be focused on databases, servers, and cloud offerings, so there is no direct conflict. Things are a bit more collegial in the cloud space, with vendors supporting OpenStack and Docker. But I still expect proprietary enhancements, perhaps in the form of add-ons.
All of this means that the technology world is headed for change. Not just change from desktop PC to mobile/cloud, but changes within mobile/cloud itself. The competition among vendors will lead to enhancements and changes, possibly significant changes, in cloud computing and mobile platforms. The mobile/cloud platform will be a moving target, with revisions as each vendor attempts to out-do the others.
Those changes mean risk. As platforms change, applications and systems may break or fail in unexpected ways. New features may offer better ways of addressing problems and the temptation to use those new features will be great. Yet re-designing a system to take advantage of new infrastructure features may mean that other work -- such as new business features -- waits for resources.
One cannot ignore mobile/cloud computing. (Well, I suppose one can, but that is probably foolish.) But one cannot, with today's market, depend on a stable platform with slow, predictable changes like we had with Microsoft Windows.
With such an environment, what should one do?
My recommendations:
Build systems of small components. This is the Unix mindset: small tools that perform specific tasks. Avoid large, monolithic systems.
Use standard interfaces. Connect components into larger systems with web services (either SOAP or REST). Exchange data with JSON and Unicode, not proprietary formats.
Hedge your bets. Gain experience in at least two cloud platforms and two mobile platforms. Resist the temptation of "corporate standards". Standards are good with a predictable technology base; the current base is not predictable, and placing your eggs in one vendor's basket is risky.
Change your position. After a period of use, examine your systems, your tools, and your talent. Change vendors -- not for everything, but for small components. (You did build your system from small, connected components, right?) Migrate some components to another vendor; learn the process and the difficulties. You'll want to know them when you are forced to move to a different vendor.
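The small-component, standard-interface idea can be sketched in miniature. Below, a hypothetical "inventory" component exposes one task and exchanges only JSON text with its callers; the component name and payload shape are invented for illustration, but the pattern -- one small task, one vendor-neutral data format -- is the point:

```python
import json

# A hypothetical "inventory" component: one small task behind one
# standard interface. It accepts and returns JSON text (always Unicode),
# so any caller -- on any platform, from any vendor -- can use it
# without sharing code or proprietary formats.
def count_items(request_json: str) -> str:
    request = json.loads(request_json)
    items = request.get("items", [])
    return json.dumps({"count": len(items)})

# A separate "reporting" component talks to it only through the interface.
reply = count_items(json.dumps({"items": ["disk", "cpu", "ram"]}))
print(json.loads(reply)["count"])  # prints 3
```

Because the contract is just JSON in, JSON out, the component behind the interface can be rewritten, rehosted, or moved to another vendor without touching its callers.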
Many folks involved in IT have been living in the "golden age" of a stable PC platform. They may have weathered the change from desktop to web -- which saw a brief period of uncertainty. More than likely, they think that the stable world is the norm. All that is fine -- except we're not in the normal world with mobile/cloud. Be prepared for change.
Tuesday, August 26, 2014
Sunday, November 28, 2010
Measuring the load
If I worked for a large consulting house, and offered you a new process for your software development projects, would you buy it?
Not without asking a few other questions, of course. Besides the price (and I'm amazed that companies will pay good money for a process-in-a-box) you'll ask about the phases used and the management controls and reports. You'll want to be assured that the process will let you manage (read that as "control") your projects. You'll want to know that other companies are using the process. You'll want to know that you're not taking a risk.
But here's what you probably won't ask: How much of a burden is it to your development team?
It's an important question. Every process requires that people participate in the data entry and reporting for the project: project definition, requirements, task assignment, task status updates, defects, defect corrections, defect verifications, build and test schedules, ... the list goes on. A process is not a thing as much as a way of coordinating the team's efforts.
So how much time does the process require? If I offered you a process that required four hours a day from each developer, seven hours a day from requirements analysts, and nine hours a day from architects, would you buy my process? Certainly not! Involvement in the process takes people away from their real work, and putting such a heavy "load" on your teams diverts too much effort away from the business goal.
Managers commit two sins: failure to ask the question and failure to measure the load. They don't ask the salesmen about the workload on their teams. Instead, they worry about licensing costs and compatibility with existing processes. (Both are important, and neither should be ignored, but that view is too limited. One must worry about the effect on the team.) Managers should research the true cost of the process, like any other purchase.
After implementing the process, managers should measure the load the process imposes on their teams. That is, they should not take the salesman's word for it, assuming the salesman was able to supply an estimate. An effective manager will measure the "load" several times, since the load may change as people become familiar with the process. The load may also change during the development cycle, with variations at different phases.
Not measuring is bad, but assuming that the process imposes no load is worse. Any process will place some load on your staff. Using a process and thinking that it is "free" is self-delusion and will cause distortions in your project schedule. Even with a light load of one hour per day, the assumption of zero load introduces an error of 12.5%: you think you have eight hours available from each person, when in fact you have only seven. That missing hour will eventually catch up with you.
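The arithmetic is simple enough to verify directly. This quick sketch uses the figures from the paragraph above (an eight-hour day and a one-hour-per-day process load):

```python
# A sketch of the scheduling error that comes from assuming a process
# is "free". The numbers match the example in the text: an eight-hour
# day and a process load of one hour per day.
assumed_hours = 8.0   # capacity the schedule assumes per person
process_load = 1.0    # hours per day the process itself consumes
actual_hours = assumed_hours - process_load

error = process_load / assumed_hours  # fraction of assumed capacity lost
print(f"actual capacity: {actual_hours:.0f} hours/day")  # 7 hours/day
print(f"schedule error:  {error:.1%}")                   # 12.5%
```

Swap in your own measured load to see how far off a "zero-load" schedule really is.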
It's said that the first step of management is measurement. If true, then the zeroth step of management is acknowledgment. You must acknowledge that a process has a cost, a non-zero cost, and that it can affect your team. Only after that can you start to measure the cost, and only then can you start to manage it.