Saturday, May 22, 2010

Yet it moves

Managers and programmers worry about different things. Programmers worry about code and becoming technically stale. Managers worry about delivering an acceptable solution on time and within budget.

The differences extend to the selection of tools. When picking the programming language for a project, a programmer worries about the power of the tool and its fitness for the work at hand. Managers worry about these things too (or so they will claim), but they also worry about staffing and risk. Can they hire programmers who are familiar with the language? (Or replace programmers who leave?)

Since managers avoid risk, they prefer tools with a proven track record, especially within their own organization. By using commonly available languages and development environments, managers gain a ready supply of workers -- the rest of their organization. And if other teams in the organization have selected tools based on the market, then the pool of potential candidates expands to programmers across the entire industry.

So when starting a new project, many managers will make the safe choice and pick the popular language. It doesn't really matter that the language is not the best choice for the programming task; the ability to swap in new staff outweighs the difference in language performance.

Yet managers make an assumption, one that leads to problems in the long term. They assume that once started, the project will remain in the chosen language.

There are clear market leaders for programming languages. COBOL was (and perhaps still is) the leader for accounting and financial applications. FORTRAN rocks for scientific applications. The picture for office applications is less consistent.

In the 1980s, the languages of choice for office applications were Pascal and Microsoft BASIC. The Macintosh Toolbox API was defined in Pascal, and the early Microsoft Windows API used the Pascal calling convention.

During the 1990s, the popular languages were C and later C++ and Visual Basic.

At the turn of the century, Java ascended.

Here in 2010 we like C# and .NET.

This is, of course, an extremely simplified history. In each period there were other languages. Applications started in an earlier age often stayed in their initial language, with only a small percentage moving to a later one. But I think the simple version captures the gist of language popularity.

Here's the point: the "popular" language, the "safe" language, changes. The safe language (for a manager) of 1990 is not a safe language in 2010. In fact, managers who recommend that a new project use the C language would be asked for a justification for such an unusual choice. (They may have a specific need, but they would be asked. Managers recommending C# would receive nods of approval without the need for supporting justifications.)

An efficient project needs a ready supply of programmers, and this is what managers worry about.

Yet if the criterion for an efficient project is a ready supply of staff, why don't managers plan for future changes in technology? Today, managers start a project with a language, and that language is locked in for the entire life of the project. There is no plan to re-tool the project ten years hence. (It seems that we pick new languages every decade.)

Instead of planning for the (seemingly inevitable) change, managers wait until they are forced to move to a new language. They may want a new platform (C#/.NET replacing C++/MFC) or perhaps recognize a dearth of programming talent. Managers assume that a project will start in a language and continue to use it... forever.

I'm not asking that managers pick the exact time to re-code the project, nor am I asking that they specify the new language in advance. That would require extraordinary powers of forecasting.

I am asking that managers plan on a change in technology, somewhere between eight and twelve years into the life of the project. I am asking that managers work with their technical staff to reduce the work of that transition.

Some would say that planning for such a change is impossible -- we cannot predict the rise of new technology, nor plan for it. I agree with the statement that we cannot make specific (accurate) predictions about new technology. (Who could have predicted the iPhone in 1990? Or the cloud?)

I believe that we can build systems and applications that are more easily converted to new technology. We've been converting systems for the past fifty years. In that time, we have learned some key ideas: keep I/O separate from processing, organize logic into layers, make code readable... the list goes on.
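
As a rough illustration, here is a minimal sketch (in Java; the names PayrollCalculator, ReportSink, and ConsoleSink are hypothetical, not drawn from any particular system) of "keep I/O separate from processing". The calculation never touches the console, and the console code never touches the calculation:

    // Pure business logic: no knowledge of where results go.
    final class PayrollCalculator {
        // Gross pay minus a flat tax rate.
        double netPay(double hours, double hourlyRate, double taxRate) {
            double gross = hours * hourlyRate;
            return gross * (1.0 - taxRate);
        }
    }

    // I/O hidden behind a small interface; swap implementations
    // when the platform changes (console today, web page tomorrow).
    interface ReportSink {
        void write(String line);
    }

    final class ConsoleSink implements ReportSink {
        public void write(String line) {
            System.out.println(line);
        }
    }

    public class PayrollReport {
        public static void main(String[] args) {
            PayrollCalculator calc = new PayrollCalculator();
            ReportSink sink = new ConsoleSink(); // the only platform-specific choice

            double[][] timesheets = { {40.0, 25.0}, {38.5, 30.0} };
            for (double[] t : timesheets) {
                sink.write(String.format("net pay: %.2f",
                        calc.netPay(t[0], t[1], 0.20)));
            }
        }
    }

Converting a system built this way means translating pure functions and re-implementing a few small interfaces; the platform-specific code stays confined to classes like ConsoleSink.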

If managers were concerned about the future technology shift (specific or general), they would be assessing the system architecture and ensuring that it was well-organized and prepared for change. Yet they don't.

I can hear the cries of managers throughout the industry: "Yes we do care! We require system documentation and code reviews and project planning meetings! How can you say we don't care about the future?"

To this I respond: Then why is it such an effort to move your systems to new platforms? Why is it so hard to extract business logic from the system? Why is it so hard to assess the quality of your systems? And show me a project plan that includes the task "change platform". (Not a plan developed late in the life of the system, but one written early on, scheduling the task several years in the future.)

If the system were prepared to move to a new language (or platform), if the code were well-organized and readable, these tasks would be easy.

Our systems have been built on the assumption of a firm, non-moving foundation. Yet it moves.

