We have legacy web applications. We have legacy Windows desktop applications. We have legacy DOS applications (albeit few). We have legacy mainframe applications (possibly the first type to be named "legacy").
Will we have legacy cloud applications? I see no reason why not. Any technology that changes over time (which is just about every technology) has legacy applications. Cloud technology changes over time, so I am confident that, at some time, someone, somewhere, will point to an older cloud application and declare it "legacy".
What makes a legacy application a legacy application? Why do we consider some applications "legacy" and others not?
Simply put, the technology world changed and the application did not. There are multiple aspects to the technology world, and any one of them, when left unchanged, may cause us to view an application as legacy.
It may be the user interface. (Are we using an old version of HTML and CSS? An old version of JavaScript?) It may be the database. (Are we using a relational database and not a NoSQL database?) The back-end code may be difficult to read. The back-end code may be in a language that has fallen out of favor. (Perl, or Visual Basic, or C, or maybe an early version of Java?)
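To make the JavaScript case concrete, here is a minimal sketch (the endpoint and function names are hypothetical) contrasting the callback-and-XMLHttpRequest style common in the mid-2000s with the fetch-and-async style of today:

    // Older style: callback-based XMLHttpRequest, typical of mid-2000s web code.
    function getUserLegacy(id, callback) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/api/users/" + id); // "/api/users" is a made-up endpoint
        xhr.onload = function () {
            callback(JSON.parse(xhr.responseText));
        };
        xhr.send();
    }

    // Modern style: fetch with async/await and a template literal.
    async function getUser(id) {
        const response = await fetch(`/api/users/${id}`);
        return response.json();
    }

Both functions do the same work; the older one simply looks dated next to current practice, and that gap is exactly what makes code read as "legacy".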
One can ask similar questions about legacy Windows desktop applications or mainframe applications. (C++ and MFC? COBOL and CICS?)
But let us come back to cloud computing. Cloud computing has been around since 2006. (There was an earlier use of the term "cloud computing", but for our purposes the year 2006 is sufficient.)
So let's assume that the earliest cloud applications were built in 2006. Cloud computing has changed since then. Have all of these applications kept up with those changes? Or have some of them languished, retaining their original design and techniques? If they have not kept up with changing technology, we can consider them legacy cloud applications.
Which means, as owners or custodians of applications, we now have more than legacy mainframe applications, legacy web applications, and legacy desktop applications to worry about. We can add legacy cloud applications to the list.
Cloud computing is a form of computing, but it is not magical. It evolves over time, just like other forms of computing. Those who look after applications must either make the effort to modify cloud applications over time (to keep up with the mainstream) or live with legacy cloud applications. That effort is an expense.
Like any other expense, it is really a business decision: invest time and money in an old (legacy) application or invest the time and money somewhere else. Both paths have benefits and costs; managers must decide which has the greater merit. Choosing to let an old system remain old is an acceptable decision, provided you recognize the cost of maintaining that older technology.