Software has a life cycle: It is born, it grows, and finally dies. That's not news, or interesting. What is interesting is that the big changes in software happen early in its life.
Let's review some software: PC-DOS, Windows, and Visual Basic.
PC-DOS saw several versions, from 1.0 to 6.0. There were intermediate versions, such as versions 3.3 and 3.31, so there were more than six versions.
Yet the big changes happened early. The transition from 1.0 to 2.0 saw big changes in the API, allowing new device types and especially subdirectories. Moving an application from version 1.0 to 2.0 required almost a complete rewrite. Moving applications from version 2.0 to later versions required changes, but none as significant. The big changes happened early.
Windows followed a similar path. Moving from Windows 1 to Windows 2 was a big deal, as was moving from Windows 2 to Windows 3. The transition from Windows 3 to Windows NT was big, as was the change from Windows 3.1 to Windows 95, but later changes were small. The big changes happened early.
Visual Basic versions 1, 2, and 3 all saw significant changes. Visual Basic 4 had some changes but not as many, and Visual Basic 5 and 6 were milder. The big changes happened early. (The change from VB6 to VB.NET was large, but that was a change to another underlying platform.)
There are other examples, such as Microsoft Word, Internet Explorer, and Visual Studio. The effect is not limited to Microsoft. Lotus 1-2-3 followed a similar arc, as did dBase, R:Base, the Android operating system, and Linux.
Why do big changes happen early? Why do the big jumps in progress occur early in a product's life?
I have two ideas.
One possibility is that the makers and users of an application have a target in mind, a "perfect form" of the application, and each generation of the product moves closer to that ideal form. The first version is a first attempt, and successive versions improve upon previous versions. Over the life of the application, each version moves closer to the ideal.
Another possibility is that changes to an application are constrained by the size of the user population. A product with few users can see large changes; a product with many users can tolerate only minor changes.
Both of these ideas explain the effect, yet they both have problems. The former assumes that the developers (and the users) know the ideal form and can move towards it, albeit in imperfect steps (because one never arrives at the perfect form). My experience in software development allows me to state that most development teams (if not all) are not aware of the ideal form of an application. They may think that the first version, or the current version, or the next version is the "perfect" one, but they rarely have a vision of some far-off version that is ideal.
The latter has the problem of evidence. While many applications grow their user base over time and also see smaller changes over time, not all do. Two examples are Facebook and Twitter. Both have grown to large user bases, and both have continued to see significant changes.
A third possibility, one that seems less theoretical and more mundane, is that as an application grows, and its code base grows, it is harder to make changes. A small version 1 application can be changed a lot for version 2. A large version 10 application has oodles of code and oodles of connected bits of code; changing any bit can cause lots of things to break. In that situation, each change must be reviewed carefully and tested thoroughly, and those efforts take time. Thus, the older the application, the larger the code base and the slower the changes.
That may explain the effect.
Some teams go to great lengths to keep their code well-organized, which allows for easier changes. Development teams that use Agile methods will refactor code when it becomes "messy" and reduce the coupling between components. Cleaner code allows for bigger and faster changes.
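To make that concrete, here is a minimal sketch in Python of the kind of decoupling such refactoring produces. The names (SalesReport, CsvSalesSource) and the one-number-per-line CSV file are invented for illustration, not taken from any particular product.

    # Before: the report reads the file itself, so a change in storage
    # (say, from a CSV file to a database) forces a change here too.
    class CoupledSalesReport:
        def total(self):
            with open("sales.csv") as f:   # storage detail baked in
                return sum(float(line) for line in f)

    # After: the report depends only on a narrow interface (any object
    # with an amounts() method), so storage can change without touching
    # the report at all.
    class CsvSalesSource:
        def __init__(self, path):
            self.path = path

        def amounts(self):
            with open(self.path) as f:
                return [float(line) for line in f]

    class SalesReport:
        def __init__(self, source):
            self.source = source

        def total(self):
            return sum(self.source.amounts())

    # Usage: report = SalesReport(CsvSalesSource("sales.csv"))
    #        print(report.total())

In the first version, moving from a CSV file to a database means editing the report; in the second, it means writing a new source class and leaving the report alone. That is the difference between a change that ripples and a change that stays put.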
If changes are constrained not by large code but by messy code, then as more development teams use Agile methods (and keep their code clean) we will see more products with large changes not only early in the product's life but throughout its life.
Let's see what happens with cloud-based applications. These are distributed by nature, so there is already an incentive for smaller, independent modules. Cloud computing is also younger than Agile development, so all cloud-based systems could have been (I stress the "could") developed with Agile methods. It is likely that some were not, but it is also likely that many were -- more than desktop applications or web applications.
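As a rough illustration (using only Python's standard library; the service and its price endpoint are hypothetical), a cloud-style module tends to look like this: one small job, reachable only through a network interface, so the code behind it can be rewritten without its callers noticing.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PriceHandler(BaseHTTPRequestHandler):
        # The module's entire contract is this one JSON response;
        # callers depend on its shape, not on how it is computed.
        def do_GET(self):
            body = json.dumps({"price": 9.99}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Serve until interrupted; in a real deployment this would sit
        # behind whatever routing the cloud platform provides.
        HTTPServer(("localhost", 8080), PriceHandler).serve_forever()

As long as the contract holds, the module can change as aggressively as a version-1 program, because the coupling lives in one small, explicit interface.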