The notion of a "full stack" developer has been with us for a while. Some say it is a better way to develop and deploy systems; others take the view that it is a way for a company to build systems at lower cost. Despite their differing opinions on its value, everyone agrees on the definition: a "full stack" developer (or engineer) is a person who can "do it all", from analysis to development and automated testing, from database design to web site deployment.
But here is a question: Why was there a split in functions? Why did we have separate roles for developers and system administrators? Why didn't we have combined roles from the beginning?
Well, at the very beginning of the modern computing era, we did have a single role. But things became complicated, and specialization was profitable for the providers of computers. Let's go back in time.
We're going way back in time, back before the current cloud-based, container-driven age. Back before the "old school" web age. Before the age of networked (but not internet-connected) PCs, and even before the PC era. We're going further back, before minicomputers and before commercial mainframes such as the IBM System/360.
We're going back to the dawn of modern electronic computing. This was a time before operating systems: anyone who wanted to use a computer had to write their own code (machine code, not a high-level language such as COBOL), and those programs managed memory and manipulated input-output devices such as card readers and line printers. A program had total control of the computer -- there was no multiprocessing -- and it ran until it finished. Only when one programmer was finished with the computer could the next programmer use it.
In this age, the programmer was a "full stack" developer, handling memory allocation, data structures, input and output routines, and business logic. There were no databases, no web servers, and no authentication protocols, but the programmer "did it all", including scheduling time on the computer with other programmers.
Once organizations developed programs that they found useful, especially programs that had to be run on a regular basis, they dedicated a person to the scheduling and running of those tasks. That person's job was to ensure that the important programs were run on the right day, at the right time, with the right resources (card decks and magnetic tapes).
Computer manufacturers provided people for those roles, and also provided training for client employees to learn the skills of the "system operator". There was a profit for the manufacturer -- and a cost to be avoided (or at least minimized) by the client. Hence, only a few people were given the training.
Of the five "waves" of computing technology (mainframes, minicomputers, personal computers, networked PCs, and web servers), most started with a brief period of "one person does it all" and then shifted to a model that divided labor among specialists. Mainframes split the work between programmers and system operators (and, later, database administrators). Personal computers, by their very nature, started with one person doing everything, but later saw specialists for word processing, databases, and desktop publishing. Networked PCs saw specialization too, with enterprise administrators (such as Windows domain administrators) and programmers each learning different skills.
It was the first specialization of tasks, in the early mainframe era, that set the tone for later specializations.
Today, we're moving away from specialization. I suspect that the "full stack" engineer is desired by managers who have tired of the arguments between specialists. Companies don't want to hear sysadmins and programmers bickering about who is at fault when an error occurs; they want solutions. Forcing sysadmins and programmers to "wear the same hat" eliminates the arguments. (Or so managers hope.)
The specialization of tasks on the different computing platforms happened because it was more efficient. The different jobs required different skills, and it was easier (and cheaper) to train some individuals for some tasks and other individuals for other tasks, and to manage the two groups.
Perhaps the relative costs have changed. Perhaps, with our current technology, it is more difficult (and more expensive) to manage groups of specialists, and it is cheaper to train full-stack developers. That may say more about management skills than it does about technical skills.
1 comment:
Interesting argument, John. Another argument for full stack comes from my own experience as a consultant. I had the luck to end up performing a variety of different tasks: system administrator, build-master, tester, programmer in lots of different languages, even architecture roles. What I noticed was that programmers who didn't get to wear those other hats generally weren't as good as those who had had to do the other tasks. This was most obvious when working with packaged software "configurators". The full stack engineer role, by forcing all those skills together, creates the environment that would forge architects and world-class programmers.