Showing posts with label rate of change. Show all posts

Tuesday, June 10, 2014

Slow and steady wins the race -- or does it?

Apple and Google run at a faster pace than their predecessors. Apple introduces new products often: new iPhones, new iPad tablets, new versions of iOS; Google does the same with Nexus phones and Android.

Apple's and Google's quicker pace is not limited to the introduction of new products. They also drop items from their product lines.

The "old school" was IBM and Microsoft. These companies moved slowly, introduced new products and services after careful planning, and supported their customers for years. New versions of software were backwards compatible. New hardware platforms supported the software from previous platforms. When a product was discontinued, customers were offered a path forward. (For example, IBM discontinued the System/36 minicomputers and offered the AS/400 line.)

IBM and Microsoft were the safe choices for IT, in part because they supported their customers.

Apple and Google, in contrast, have dropped products and services with no alternatives. Apple dropped .Mac. Google dropped their RSS reader. (I started this rant when I learned that Google dropped their conversion services from App Engine.)

I was about to chide Google and Apple for their inappropriate behavior when I thought of something.

Maybe I am wrong.

Maybe this new model of business (fast change, short product life) is the future?

What are the consequences of this business model?

For starters, businesses that rely on these products and services will have to change. These businesses can no longer rely on long product lifetimes. They can no longer rely on a guarantee of "a path forward" -- at least not with Apple and Google.

Yet IBM and Microsoft are not the safe havens of the past. IBM is out of the PC business, and getting out of the server business. Microsoft is increasing the frequency of operating system releases. (Windows 9 is expected to arrive in 2015; Windows 8's two-year life is much shorter than Windows XP's decade.) The "old school" suppliers of PC technology are gone.

Companies no longer have the comfort of selecting technology and using it for decades. Technology will "rev" faster, and the new versions will not always be backwards compatible.

Organizations with large IT infrastructures will find that their technologies are less homogeneous. Companies can no longer select a "standard PC" and purchase it over a period of years. Instead, every few months will see new hardware.

Organizations will see software change too. New versions of operating systems. New versions of applications. New versions of online services (software as a service, platform as a service, infrastructure as a service, web services) will occur -- and not always on a convenient schedule.

More frequent changes to the base upon which companies build their infrastructure will mean that companies spend more time responding to those changes. More frequent changes to the hardware will mean that companies have more variations of hardware (or they spend more time and money keeping everyone equipped with the latest).

IT support groups will be stressed as they must learn the new hardware and software, and more frequently. Roll-outs of internal systems will become more complex, as the target base will be more diverse.

Development groups must deliver new versions of their products on a faster schedule, and to a broader set of hardware (and software). It's no longer acceptable to deliver an application for "Windows only". One must include MacOS, the web, tablets, and phones. (And maybe Kindle tablets, too.)

Large organizations (corporations, governments, and others) have developed procedures to control the technology within (and to minimize costs). Those procedures often include standards, centralized procurement, and change review boards (in other words, bureaucracy). The outside world (suppliers and competitors) cares not one whit about a company's internal bureaucracy and is changing.

The slow, sedate pace of application development is a thing of the past. We live in faster times.

"Slow and steady" used to win. The tortoise would, in the long run, win over the hare. Today, I think the hare has the advantage.

Sunday, July 22, 2012

NoSQL is no big deal -- and that is a big deal

Things move fast in tech, or so they say. I saw this effect in action at the OSCON conference, just last week.

I attended this conference last year, so I can compare the topics of interest for this year against last year.

Last year, NoSQL was the big thing. People were talking about it. Vendors were hawking their NoSQL databases. Developers were talking (incessantly, at times) about their projects that use NoSQL databases. Presenters gave classes on the concepts and proper use of NoSQL databases.

This year, no one was talking about it. (Well, a few people were, but they were a tiny minority.) It wasn't that people had rejected NoSQL databases. In fact, quite the opposite: people had accepted them as normal technology. NoSQL databases are now considered to be "just another tool in our set".
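The "just another tool" mindset is easy to illustrate. The sketch below uses Python's standard-library shelve module as a stand-in for a schema-less key-value store (the store and the sample data are illustrative choices, not anything from the conference): no table definitions, no migrations, just documents in and documents out.

```python
# A minimal sketch of schema-less storage treated as "just another tool":
# store heterogeneous documents under keys, with no declared schema.
import os
import shelve
import tempfile

path = os.path.join(tempfile.mkdtemp(), "posts")

with shelve.open(path) as db:
    db["post-1"] = {"title": "NoSQL is no big deal", "tags": ["nosql", "oscon"]}
    db["post-2"] = {"title": "The rate of change"}  # documents need not share fields

with shelve.open(path) as db:
    print(db["post-1"]["title"])   # prints: NoSQL is no big deal
    print(len(db))                 # prints: 2
```

Nothing here is novel, which is exactly the point: a key-value store used this casually is a commodity, not a conference headliner.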

So in the space of twelve months, NoSQL has gone from "the cool new thing" to "no big deal".

And that, I think, is a big deal.

Sunday, March 18, 2012

Windows 8 means a faster treadmill

The release of Windows 8 marks a change in Microsoft's approach to backwards compatibility. Microsoft has shifted its position from "compatible at just about everything" to "things change a lot and your old things may not work".

Windows 8 and its Metro interface re-define the programming of applications. On x86 processors, legacy applications can run in "Windows 7 mode". On ARM processors, legacy applications... cannot run. And while Windows 8 offers Windows 7 mode, Microsoft has made no promise of such a feature in future releases.

With the shift to WinRT and Metro, Microsoft has started a countdown clock for the lives of all Windows applications in existence.

In the past, Microsoft maintained compatibility for just about every application. Even vintage DOS applications would run under Windows XP (and probably still run under Windows 7). That compatibility came at no small expense, not only in development and testing costs, but also in opportunity costs. (New development was constrained by the design decisions of previous releases.)

Users, developers, and support teams are on a treadmill, with new technologies and releases arriving faster than before. The good old days of decade-long technology planning have been replaced with a range of two or three years.

People can get upset about the faster pace of the treadmill, but they have nowhere to go.

Apple has "revved" its platform a number of times, changing the processor, the operating system, the user interface, and the device form factor. The folks working on Linux are working on similar changes.

If Microsoft believes that it can be more profitable in a new market, or that the current market is not profitable, then I believe that it will move to the new market. Its customers' problems with the lack of backward compatibility are not Microsoft's problem.

Interestingly, corporations long ago lobbied for shorter depreciation schedules for computing equipment. They successfully got the depreciation for equipment reduced to ... three years. Now Apple and Microsoft seem to be agreeing, indicating that equipment really is obsolete after three years. (Except that they include software in the definition of "equipment".)



I'm not sure that this faster pace is a good thing. I'm also not sure that I like it. But I do know this: it's happening. The question is not how to stop it, or how to avoid it, but how to cope with it. How do we live in a world when technology changes (dramatically) every three or maybe two years?


Sunday, March 13, 2011

The rate of change

In the good old days, technology changed. (Yeah, it still changes.) But it changed at a slow rate. When computers were large, hulking beasts that filled entire rooms, changes occurred over years and were small. You might get a new FORTRAN or COBOL compiler -- but not a new language -- every half-decade or so. (FORTRAN-66 was followed by FORTRAN-77, for example.)

Today, computers get faster and more powerful every six months. (Consider the Apple iPad, with the second version released less than a year after the first.) Microsoft has released compilers for C# in 2001, 2003, 2005, 2008, and 2010 -- about one every two years. And not only do we get new compilers, but we also get new languages. In 1995, Java was the new thing. In 2001, it was C#. Now we have Ruby, Python, Erlang, Haskell, Scala, Lua, and a bunch more. Not all of these will be the "next big thing", but one (or possibly more) will be.

Organizations can absorb change at a certain rate, and no faster. Each organization has its own rate. Some companies are faster than others. Larger companies take more time, since they have more people involved in decisions and more legacy applications. Small companies with fewer people and fewer "software assets" can adopt new technologies more quickly. Start-ups with a handful of employees and a few lines of code can move the fastest.

We're in a situation where technology changes faster than most companies can absorb the change. In most (big-ish) companies, managers don't really work with technology but make decisions about it. They make decisions based on their experience, which they got when they were managers-to-be and still working with technology. So managers in today's organizations think that tech works like a C++ compiler (or maybe a Java JVM), and senior managers think that tech works like a COBOL compiler and IMS database. (I imagine that MBA graduates who have no direct experience with tech believe that tech works as a series of commoditized black boxes that are replaceable at the proper cost.)

This is a big deal. If managers cannot value technology and make good judgements, then the decision-making process within companies becomes political, with different groups pushing for their positions and advocating certain directions. Solutions are selected for perceived benefits and the results can be vastly different from the desired outcome. Mistakes can be very expensive for a company, and possibly fatal to the project or the company.

So what is a company to do? One could hire managers who have deep technical knowledge and keep abreast of changes, but such managers are hard to find and hard to keep. One could create a separate team of technologists to set a technical direction for the company, but this can devolve into a special interest group within the company and create additional politics.

I think the best thing a company can do is set a general direction that keeps the company technically capable, expect all employees to be technically aware, and reward the teams that demonstrate the ability to manage technology changes. Instead of dictating a specific solution, look for and encourage specific behaviors.