Sunday, November 21, 2021

CPU power and developer productivity

Some companies have issued M1-based MacBooks to developers. Their reason? To improve productivity.

Replacing "regular" PCs with M1-based MacBooks is nice, and it certainly provides developers with more CPU power, but does it really increase their productivity?

In a very simple way, yes, it does. The more powerful MacBooks will let developers compile faster, build deployment packages faster, and run tests faster.

But the greater leaps in developer productivity have not come from performing existing steps faster.

Increases in developer productivity come not from raw CPU power, or more memory, or faster network connections, or higher-resolution displays. Meaningful increases come from better tools, which in turn are built on CPU power, memory, network connections, and displays.

The tools that have helped developers become productive are many. They include programming languages, compilers and interpreters, editors, debuggers, version control systems, automated test systems, and communication systems (such as e-mail or chat or streamed messages like Slack).

We build better tools when we have the computing power to support them. In the early days of computing, prior to the invention of the IDE (integrated development environment), the steps of editing and compiling were distinct and handled by separate programs.

Early editors could not hold the entire text of a long program in memory at once, so they had special commands to "page" through the file. (One opened the editor and started with the first "page" of text, made changes and got things right, and then moved to the next "page". It sounds like the common "page down" operation in modern editors, except that a "page" in the old editors was longer than the screen, and -- note this -- there was no "page up" operation. Paging was a one-way street. If you wanted to go back, you had to page to the end of the file, close the file, and then run the editor again.)

Increased memory ended the need for "page" operations.

The first integrated environment may have been the UCSD p-System, which offered editing, compiling, and running of programs. The availability of 64K of memory made this system possible. Unfortunately, the CPUs of the time were too slow to run its virtual processor (much like the later JVM for Java) efficiently, and the p-System never achieved popularity.

The IBM PC, with a faster processor and more memory, made the IDE practical. In addition to editing and compiling, there were add-ons for debugging.

The 80386 processor, along with network cards and cables, made Windows practical, which made network applications possible. That platform allowed for e-mail (at least within an organization) and that gave a boost to lots of employees, developers included. Networks and shared file servers also allowed for repositories of source code and version control systems. Version control systems were (and still are) an immense aid to programming.

Increases in computing power (CPU, memory, network, or whatever) let us build tools to be more effective as developers. An increase of raw power, by itself, is nice, but the payoff is in the better tools.

What will happen as a result of Apple's success with its M1 systems?

First, people will adopt the new, faster M1-based computers. This is already happening.

Second, competitors will adopt the system-on-a-chip design. I'm confident that Microsoft is already working on a design (probably multiple designs) for its Surface computers, and as a reference for other manufacturers. Google is probably working on designs for Chromebooks.

Third, once the system-on-a-chip design has been accepted in the Windows environment, people will develop new tools to assist programmers. (It won't be Apple, and I think it won't be Google either.)

Whatever the source, what kinds of tools can we expect? That's an interesting question, and the answer is not obvious. But let us consider a few things:

First, the increase in power is in CPU and GPU capacity. We're not seeing an increase in memory or storage, or in network capacity. We can assume that innovations will be built on processing, not communication.

Second, innovations will probably help developers in their day-to-day jobs. Developers perform many tasks. Which are the tasks that need help? (The answer is probably not faster compiles.)

I have a few ideas that might help programmers.

One idea is to analyze the run-time performance of programs and identify "hot spots" -- areas in the code that take a lot of time. We already have tools to do this, but they are difficult to use, and they run the system in a slow mode, such that a simple execution can take minutes instead of seconds. (A more complex task can run for an hour under "analyze mode".) A faster CPU can help with this analysis.
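As a rough sketch of what such analysis looks like today, here is a minimal hot-spot profile using Python's standard cProfile module. The functions slow_sum and fast_sum are hypothetical stand-ins for real application code:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately inefficient: converts each number through a string.
    total = 0
    for i in range(n):
        total += int(str(i))
    return total

def fast_sum(n):
    # The efficient equivalent, using the closed-form formula.
    return n * (n - 1) // 2

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
fast_sum(100_000)
profiler.disable()

# Report functions sorted by cumulative time; the "hot spot" rises to the top.
buffer = io.StringIO()
stats = pstats.Stats(profiler, stream=buffer)
stats.sort_stats("cumulative").print_stats(5)
report = buffer.getvalue()
print(report)
```

Running this prints a table of functions sorted by cumulative time, with the expensive function near the top. A faster CPU makes it practical to leave this kind of instrumentation on during ordinary runs.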

Another idea is for debugging. The typical debugger allows a programmer to step through the code one line at a time, and to set breakpoints and run quickly to those points in the code. What most debuggers don't allow is the ability to go backwards. Often, a programmer stepping through the code gets to a point that is known to be a problem, and needs to identify the steps that got the program to that point. The ability to "run in reverse" would let a programmer "back up" to an earlier point, where he (or she) could see the decisions and the data that led to the problem point. Computationally, this is a difficult task, but we're looking at a significant increase in computational power, so why not?
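Tools of this kind exist in limited forms (gdb's reverse execution, Mozilla's rr recorder), and the core idea can be sketched in a few lines: record a snapshot of the program state at every step, and "step back" by restoring an earlier snapshot. The class and the little program being traced here are invented for illustration:

```python
# Toy sketch of "reverse debugging" via record-and-replay: snapshot the
# program state after every step, so we can step backward by restoring
# an earlier snapshot. Real tools are far more sophisticated.

class ReplayDebugger:
    def __init__(self):
        self.history = []  # list of (step_name, state_snapshot) pairs

    def record(self, step_name, state):
        # dict(state) copies the snapshot so later mutation can't alter it.
        self.history.append((step_name, dict(state)))

    def step_back(self, n=1):
        # Return the (name, state) as it was n steps before the latest one.
        return self.history[-(n + 1)]

# A small "program" whose steps we record.
dbg = ReplayDebugger()
state = {"x": 0}
for step in range(3):
    state["x"] += 10
    dbg.record(f"step {step}", state)

name, snapshot = dbg.step_back(1)  # back up one step
print(name, snapshot["x"])         # prints: step 1 20
```

The expensive part in real life is the recording itself -- capturing enough state, often at the instruction level, without slowing the program to a crawl. That is exactly where a large jump in CPU power would pay off.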

A third possibility is the analysis of source code. Modern IDEs perform some analysis, often marking code that is syntactically incorrect. With additional CPU power, could we identify code that is open to other kinds of errors? We have tools that identify SQL injection vulnerabilities, memory errors, and other poor practices. These tools could be added to the IDE as a standard feature.

In the longer term, we may see new programming languages emerge. Just as Java (and later, C#) took advantage of faster CPUs to execute byte-code (the UCSD p-System was a good idea, merely too early), new programming languages may do more with code. Perhaps we will see a shift to interpreted languages (Python and Ruby, or their successors). Or maybe we will see a combination of compiled and interpreted code, with some code compiled and other code interpreted at run-time.

More powerful computers let us do things faster, but more importantly, they let us do things differently. They expand the world of computation. Let's see how we use the new world given to us with system-on-a-chip designs.

Thursday, November 11, 2021

M1 is not the only option for productivity

A number of companies have announced that they are equipping their developers with M1 MacBooks, to improve performance of tasks such as builds and, I presume, tests.

The thinking runs along these lines: the tasks developers perform are important, some of these tasks take a long time, the new M1 MacBooks perform these tasks quickly, therefore providing developers with M1 MacBooks is an investment that improves productivity. (And an increase in productivity is a good thing.)

The supporting arguments often cite the time to run a build, or the time to perform automated tests. The M1 MacBooks, according to the argument, can perform these tasks much faster than the current equipment. The implied benefit is often described as a reduction in expenses, which I believe is incorrect. (The company will continue to pay its developers, so its annual expenses will not change as a result of the new MacBooks -- except for the cost of the MacBooks themselves.)

But there is another aspect to this "rush to faster computers" that I think has been overlooked. That aspect is cloud computing.

If one is using laptops or desktops with Windows, one can move that work to virtual instances of Windows in the cloud. Microsoft's "Windows 365" service offers Windows in the cloud, with different options for processor power, memory, and storage. One can rent a fast processor and get the same improvement in computing.

Let's look at some numbers. A new M1 MacBook Pro with a 14-inch screen costs $2000 and with a 16-inch screen costs $2500. (There are multiple configurations; these are the lowest prices.)

If those MacBooks last 3 years (a reasonable assumption) then the amortized costs are $56 per month or $69 per month.

Now let's consider an alternative: Windows virtual machines in the cloud. Microsoft's "Windows 365" offers different configurations for prices ranging from $31 per month to $66 per month.

Of course, one still needs a local PC to access the cloud-based Windows, so let's add that cost, too. But we don't need a high-end laptop: The local PC is simply a fancy terminal: a device to accept keystrokes and mouse clicks and send them to the cloud-based PC, and accept screen updates and display them to the user. We don't need a lot of processing power for that.

One can get a decent 14-inch laptop for $600 (less if you hunt for bargains) and a decent 15.6-inch laptop for about the same. Assuming a purchase cost of $600, the monthly addition is $17, which pushes the monthly cost of the cloud-based configuration to somewhere between $48 and $83. The high end is a bit more than the cost of the local MacBook, but not that much more. And keep in mind that with Windows 365, Microsoft handles some tasks for you, such as updates.
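The arithmetic in this comparison can be checked with a few lines of Python, using the prices quoted above and a three-year (36-month) amortization for purchased hardware:

```python
MONTHS = 36  # three-year amortization, as assumed above

# Local MacBook Pro options: purchase price amortized per month.
macbook_14 = round(2000 / MONTHS)  # 14-inch model
macbook_16 = round(2500 / MONTHS)  # 16-inch model

# Cloud alternative: Windows 365 subscription plus a modest local laptop.
laptop = round(600 / MONTHS)       # local "fancy terminal" PC
cloud_low = 31 + laptop            # cheapest Windows 365 configuration
cloud_high = 66 + laptop           # priciest Windows 365 configuration

print(macbook_14, macbook_16, cloud_low, cloud_high)  # prints: 56 69 48 83
```

So the cloud-based configurations work out to $48 to $83 per month, against $56 or $69 per month for the local MacBooks -- close enough that the decision turns on factors other than price.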

I don't consider a cloud-based solution for MacBooks, because cloud-based MacBooks are different from cloud-based Windows PCs. Windows PCs are often virtualized instances running on high-end hardware; MacBooks in the cloud are Mac computers in a datacenter -- not virtualized instances. A MacBook in the cloud is really just a MacBook at a remote location.

My point is not that cloud-based Windows PCs are better than MacBooks, or that local MacBooks are better than local Windows PCs.

My point is that one has different options for computing. Local MacBooks are one option. Local Windows PCs are another option. Cloud-based Windows PCs are an option. (And if you insist, cloud-based Macs are an option.)

Some companies are pursuing a strategy of local MacBooks. That strategy may be good for them, but it does not automatically follow that the same strategy is good for everyone. (Nor does it follow that the strategy is good for them; time will tell.)

My advice is to consider the different options for computing, review your needs and your finances, and select a strategy that works for you.