Files have been the common element of computing since at least the 1960s. Files existed before disk drives and file systems, as one could put multiple files on a magnetic tape.
MS-DOS used files. Windows used files. OS/2 used files. (Even the p-System used files.)
Files were the unit of data storage. Applications read data from files and wrote data to files. Applications shared data through files. Word processor? Files. Spreadsheet? Files. Editor? Files. Compiler? Files.
The development of databases opened another channel for sharing data. Databases were (and still are) used in specialized applications. Relational databases are good for consistently structured data, and they provide transactions to update multiple tables at once. Microsoft hosts its Team Foundation Server on top of SQL Server. (Git, in contrast, uses files exclusively.)
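That transactional guarantee -- multiple tables updated together, or not at all -- is the key thing files don't give you. A minimal sketch using Python's built-in sqlite3 module (an illustrative toy, not tied to any product named above):

```python
import sqlite3

# Two tables that must stay consistent with each other.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.execute("CREATE TABLE ledger (entry TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 0)])

# The 'with conn' block is one transaction: both balance updates and
# the ledger entry commit together.  If any statement fails, sqlite3
# rolls back all three, so the tables never disagree.
with conn:
    conn.execute("UPDATE accounts SET balance = balance - 25 "
                 "WHERE name = 'alice'")
    conn.execute("UPDATE accounts SET balance = balance + 25 "
                 "WHERE name = 'bob'")
    conn.execute("INSERT INTO ledger VALUES ('alice pays bob 25')")

print(conn.execute("SELECT balance FROM accounts "
                   "WHERE name = 'bob'").fetchone()[0])
```

With plain files, an application that crashed between writing the "accounts" file and the "ledger" file would leave the two out of sync; the database makes that window disappear.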
Despite the advantages of databases, the main method for storing and sharing data remains files.
Until now. Or in a little while.
Cloud computing and web services are changing the picture. Web services are replacing files. Web services can store data and retrieve data, just as files do. But web services are cloud residents; files are for local computing. Using URLs, one can think of a web service as a file with a rather funny name.
Web services are also dynamic. A file is a static collection of bytes: what you read is exactly what was written. A web service can provide a set of bytes that is constructed "on the fly".
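The contrast is easy to see in code. Here is a small sketch: the file returns exactly the stored bytes, while a stand-in GET handler (a hypothetical function, with no real HTTP machinery) builds its answer fresh on each request:

```python
import datetime
import pathlib
import tempfile

# A file is static: reading it back yields exactly what was written.
path = pathlib.Path(tempfile.mkdtemp()) / "greeting.txt"
path.write_text("hello")
print(path.read_text())  # the same bytes, every time

# A web service can be dynamic.  This handler is a hypothetical
# stand-in for a real GET endpoint; its response is constructed
# "on the fly" and differs from one call to the next.
def handle_get(url: str) -> str:
    if url == "/time":
        return datetime.datetime.now().isoformat()
    return "not found"

print(handle_get("/time"))
```

Reading the file twice gives identical bytes; calling `handle_get("/time")` twice gives two different timestamps.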
Applications that use local computing -- desktop applications -- will continue to use files. Cloud applications will use web services.
Those web services will be, at some point, reading and writing files, or database entries, which will eventually be stored in files. Files will continue to exist, as the basement of data storage -- around, but visited by only a few people who have business there.
At the application layer, cloud applications and mobile applications will use web services. The web service will be the dominant method of storing, retrieving, and sharing data. It will become the dominant method because the cloud will become the dominant location for storing data. Local computing, long the leading form, will fall to the cloud.
The default location for data will be the cloud; new applications will store data in the cloud; everyone will think of the cloud. Local storage and local computing will be the oddball configuration. Legacy systems will use local storage; modern systems will use the cloud.
Monday, September 25, 2017
Monday, September 18, 2017
What Agile and Waterfall have in common
Agile and Waterfall are often described in contrasts: Waterfall is large and bureaucratic, Agile is light and nimble. Waterfall is old, Agile is new. Waterfall is... you get the idea.
But Waterfall and Agile have things in common.
First and most obvious, Waterfall and Agile are both used to manage development projects. They both deliver software.
I'm sure that they are both used to deliver things other than software. They are tools for managing projects, not limited to software projects.
But those are the obvious common elements.
An overlooked commonality is the task of defining small project steps. For Waterfall, this is the design phase, in which requirements are translated into system design. The complete set of requirements can paint a broad picture of the system, providing a general shape and contours. (Individual requirements can be quite specific, with details on input data, calculations, and output data.)
Breaking down the large idea of the system into smaller, code-able pieces is a talent required for Waterfall. It is how you move from requirements to coding.
Agile also needs that talent. In contrast to Waterfall, Agile does not envision the completed system and does not break that large picture into smaller segments. Instead, Agile asks the team to start with a small piece of the system (most often a core function) and build that single piece.
This focus on a single task is, essentially, the same as the design phase in Waterfall. It converts a requirement (or user story, or use case, or whatever small unit is convenient) into design for code.
The difference between Waterfall and Agile is obvious: Waterfall converts all requirements in one large batch before any coding is started, and Agile performs the conversions serially, seeing one requirement all the way to coding and testing (or more properly, testing and then coding!) before starting the next.
So whether you use Waterfall or Agile, you need the ability to "zoom in" from requirements to design, and then on to tests and code. Waterfall and Agile are different, but the differences are more in the sequence of performing tasks and not the tasks themselves.
Monday, September 11, 2017
Legacy cloud applications
We have legacy web applications. We have legacy Windows desktop applications. We have legacy DOS applications (albeit few). We have legacy mainframe applications (possibly the first type to be named "legacy").
Will we have legacy cloud applications? I see no reason why not. Any technology that changes over time (which is just about every technology) has legacy applications. Cloud technology changes over time, so I am confident that, at some time, someone, somewhere, will point to an older cloud application and declare it "legacy".
What makes a legacy application a legacy application? Why do we consider some applications "legacy" and others not?
Simply put, the technology world changed and the application did not. There are multiple aspects to the technology world, and any one of them, when left unchanged, may cause us to view an application as legacy.
It may be the user interface. (Are we using an old version of HTML and CSS? An old version of JavaScript?) It may be the database. (Are we using a relational database and not a NoSQL database?) The back-end code may be difficult to read. The back-end code may be in a language that has fallen out of favor. (Perl, or Visual Basic, or C, or maybe an early version of Java?)
One can ask similar questions about legacy Windows desktop applications or mainframe applications. (C++ and MFC? COBOL and CICS?)
But let us come back to cloud computing. Cloud computing has been around since 2006. (There was an earlier use of the term "cloud computing", but for our purposes the year 2006 is sufficient.)
So let's assume that the earliest cloud applications were built in 2006. Cloud computing has changed since then. Have all of these applications kept up with those changes? Or have some of them languished, retaining their original design and techniques? If they have not kept up with changing technology, we can consider them legacy cloud applications.
Which means, as owners or custodians of applications, we now not only have to worry about legacy mainframe applications and legacy web applications and legacy desktop applications. We can add legacy cloud applications to our list.
Cloud computing is a form of computing, but it is not magical. It evolves over time, just like other forms of computing. Those who look after applications must either make the effort to modify cloud applications over time (to keep up with the mainstream) or live with legacy cloud applications. That effort is an expense.
Like any other expense, it is really a business decision: invest time and money in an old (legacy) application or invest the time and money somewhere else. Both paths have benefits and costs; managers must decide which has the greater merit. Choosing to let an old system remain old is an acceptable decision, provided you recognize the cost of maintaining that older technology.
Monday, September 4, 2017
Agile can be cheap; Waterfall is expensive
Agile can be cheap, but Waterfall will always be expensive.
Here's why:
Waterfall starts its process with an estimate. The Waterfall method uses a set of phases (analysis, design, coding, testing, and deployment) which are executed according to a fixed schedule. Many Waterfall projects assign specific times to each phase. Waterfall needs this planning because it makes a promise: deliver a set of features on a specific date.
But notice that Waterfall begins with an estimate: the features that can be implemented in a specific time frame. That estimate is crucial to the success of the project. What is necessary to obtain that estimate?
Only people with knowledge and experience can provide a meaningful estimate. (One could, foolishly, ask an inexperienced person for the estimate, but that estimate has no value.)
What knowledge does that experienced person need? Here are some ideas:
- The existing code
- The programming language and tools used
- The different teams involved in development and testing
- The procedures and techniques used to coordinate efforts
- The terms and concepts used by the business
With knowledge of these, a person can provide a reasonable estimate for the effort.
These areas of knowledge do not come easily. They can be learned only by working on the project and in different capacities.
In other words, the estimate must be provided by a senior member of the team.
In other words, the team must have at least one senior member.
Waterfall relies on team members having knowledge about the business, the code, and the development processes.
Agile, in contrast, does not rely on that experience. Agile is designed to allow inexperienced people to work on the project.
Thus, Agile projects can get by without senior, experienced team members, but Waterfall projects must have at least one (and probably more) senior team members. Since senior personnel are more expensive than junior, and Waterfall requires senior personnel, we can see that Waterfall projects will, on average, cost more than Agile projects. (At least in terms of per-person costs.)
Do not take this to mean that you should run all projects with Agile methods. Waterfall may be more expensive, but it provides different value. It promises a specific set of functionality on a specific date, a promise that Agile does not make. If you need the promises of Waterfall, it may be worth the extra cost (higher wages). This is a business decision, similar to using proprietary tools over open-source tools, or leasing premium office space in the suburbs over discount office space in a not-so-nice part of town.
Which method you choose is up to you. But be aware that they are not the same, not only in terms of deliverables but in staffing requirements. Keep those differences in mind when you make your decision.