Sunday, January 23, 2022

Will Microsoft Change Windows to Linux?

People, from time to time, ask about Microsoft changing from Windows to Linux. When they do, lots of people respond. The responses fall into two general categories: Microsoft will switch to Linux because it is the superior operating system, and Microsoft will stick with Windows because it is the superior operating system.

The rebuttals are always -- always -- in the technical realm. Linux is better at this, and Windows is better at that.

I have a different response.

Microsoft will switch from Windows to Linux if, and when, it is in Microsoft's interest to switch.

In the 1990s and 2000s, Windows was a key part of Microsoft's strategy. Microsoft sold software (or licenses for software, which amounts to the same thing) and it used Windows as a base for its other products. Office ran on Windows (there were versions for Mac OS, but they were a special case). SQL Server ran on Windows. Internet Explorer ran on Windows. Outlook ran on Windows, and talked to Exchange, which also ran on Windows. Visual Studio ran on Windows. SourceSafe ran on Windows (and Unix, because it had been developed by an independent company and later sold to Microsoft).

During that period, Microsoft would never consider switching from Windows to Linux. Such a move would destroy Microsoft's strategy of "everything on Windows".

Today, Microsoft offers services that extend beyond Windows, and some of them use Linux. Azure provides cloud services. One can provision Linux servers as well as Windows servers (and pay Microsoft for both). Microsoft has less incentive to force customers to use Windows.

In addition, Microsoft is moving its apps into the cloud and onto the web. One can open and edit Word documents and Excel spreadsheets in a browser. (The online versions of Word and Excel are limited compared to the locally-installed versions. I expect the online versions to improve over time.) Microsoft has also created a cloud-based, web version of Visual Studio Code, which lets programmers collaborate across multiple operating systems.

Microsoft has dropped the "everything on Windows" strategy in favor of a "sell services and subscriptions" strategy. The new strategy doesn't require Windows to be at the center of the customer experience.

Will Microsoft replace Windows with Linux? The proper way to look at the question is not in the technical realm, but in the financial realm. If Microsoft can make more money with Linux than with Windows, it should (and probably will) offer Linux.

Windows provides an income stream, in the form of licenses. Microsoft is moving from a "buy once until you upgrade" approach to an annual subscription. The latter is more predictable, for both Microsoft and customers, and seems to provide higher revenue to Microsoft. But the point is that Windows provides income to Microsoft.

Windows is also an expense for Microsoft. The development, maintenance, and support for Windows requires time and effort in significant quantities.

The question then becomes: which is the higher number? Does revenue cover expenses (and then some)? Or does Windows cost more to maintain than it brings in revenue?

The current capabilities of Microsoft's cloud-based web applications are limited enough that locally-installed applications still provide more to the customer. Some day that may change. Until it does, those advantages translate into an incentive to support Windows.

Technical arguments can be fun. They can also be heated. But they are not the way to convince Microsoft to switch to Linux. Or to stay with Windows. The decision is a financial one, not a technical one.


Wednesday, January 12, 2022

Successful programming languages

The IT world has seen a number of programming languages. Some became popular and some did not.

With more than half a century of experience, we can see some patterns in languages that became popular. First, let's review some of the popular programming languages.

FORTRAN and COBOL were successful because they met a business need (or two, perhaps). The primary need was for a programming language that was easier to understand than assembly language. The secondary need was for a programming language that could be used across computers of different manufacturers, allowing companies to move programs from one vendor's hardware to another. Both FORTRAN and COBOL met those needs.

BASIC was modestly successful on minicomputers and wildly successful on personal computers. It was possible to run BASIC on the small PCs (sometimes with as little as 4K of memory!). It was easy to use, and amateur programmers could write and run programs relatively easily. It filled the technical space of timesharing on minicomputers, and the technical space of personal computers.

C became popular in the Unix world because it was included in Unix distributions. If you ran Unix, you most likely programmed in C. One could say that it was pushed upon the world by AT&T, the owners of Unix.

SQL became successful in the 1980s, just as databases became available and popular. Prior to databases, computer systems offered "file managers" and "file access libraries" which allowed basic operations on records but not on tables. Each library had its own set of capabilities and its own API. SQL provided a common set of operations and a common API. It allowed businesses to move easily from one database to another, guaranteed a core set of operations, and permitted a broad set of programmers to work on multiple systems.

C++ became popular because it solved a problem with C programs, namely the organization of large programs. C++ offered object-oriented concepts of classes, inheritance, encapsulation, and polymorphism, each of which helps organize code.

Visual Basic became popular because it provided an easy way to write programs for Windows. Microsoft's earlier Visual C++ required knowledge of the Windows API and lots of discipline. Visual Basic required neither, hiding the Windows API from the programmer and providing safety in the programming language.

Objective-C became popular because Apple used it for programming applications for Macintosh computers. (Later, when Apple switched to the Swift programming language, interest in Objective-C plummeted.)

Java became popular because it promised that people could write programs once and then run them anywhere. It did a good job of delivering on that promise, too.

C# is Microsoft's version of Java (its second version, after Visual J++) and is popular only because Microsoft pushes it. If Microsoft were to disappear overnight, interest in C# would drop dramatically.

Swift is Apple's language for development of applications for the iPhone, iPad, and other Apple products. It is successful because Apple pushes it upon the world.

JavaScript became popular because it was ubiquitous. Much like BASIC on all PCs, JavaScript was in all browsers, and the combination of HTML, the DOM, and JavaScript allowed for web applications with powerful processing in the browser.

Python became popular because it was a better Perl. Python had a simpler syntax and also had object-oriented programming built in.

Notice that these languages became popular in new spaces; they did not replace an existing popular language. (The possible exception is Rust replacing C or C++, which I discuss below.) BASIC didn't replace COBOL or FORTRAN; it became popular in the new spaces of timesharing and personal computers. C# didn't replace Java; it joined Visual Basic in the Microsoft space and slowly gained in popularity as Microsoft supported it more than Visual Basic.

So we can see that there are a few reasons that languages become popular:

  • A vendor pushes it
  • It solves a commonly-recognized business problem
  • It fills a technical space

If we accept these as the reasons that languages become popular, we can make some predictions about which new languages will become popular. We can say that if a major vendor pushed a language for its projects (a vendor such as Amazon, for example), then that language would become popular.

Or, we could say that a new technical space would allow a new language to become popular.

Or, if there is a commonly recognized business problem with our current set of programming languages, a new language that solves that problem would become popular.

So what do we see?

If we accept that C and C++ have problems (memory management, buffer overflows) and we accept that those problems are commonly recognized, then we can expect the replacement of C and C++ with a different language, one that addresses those problems. The prime contender is Rust. We may see a gradual shift from C and C++ programming to Rust, as more and more people develop confidence in Rust and grow wary of memory management and buffer overrun issues in C and C++.
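
For readers who have not worked in Rust, here is a minimal sketch (my own illustration, not taken from any particular project) of how the language addresses those two problem areas. The bounds checking and the borrow checking shown are real language behavior; the names and values are invented for the example.

    // A small illustration of the two safety checks mentioned above.
    fn main() {
        let values = vec![10, 20, 30];

        // Buffer overruns: indexing is bounds-checked. An out-of-range access
        // panics (or, with .get(), returns None) instead of silently reading
        // or writing past the end of the buffer as C or C++ would allow.
        match values.get(10) {
            Some(v) => println!("found {}", v),
            None => println!("index 10 is out of bounds"),
        }

        // Memory management: ownership and borrowing are checked at compile time.
        let text = String::from("hello");
        let borrowed = &text;     // an immutable borrow of `text`
        // drop(text);            // uncommenting this line fails to compile:
        //                        // `text` cannot be freed while it is borrowed
        println!("{}", borrowed); // no dangling reference, no use-after-free
    }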

One technical space that could provide an opportunity for a new programming language is the Internet of Things (IoT). Appliances and devices must communicate with each other and with servers. I suspect that the IoT space is more in need of protocols than of programming languages, but perhaps there is room for a programming language that works with new protocols to establish trusted connections and, more importantly, trusted updates.

A third area is a teaching language. BASIC was designed, initially, for people not familiar with programming. Pascal was designed to teach structured programming. Do we have a language designed to teach object-oriented programming? To teach functional programming? Do we even have a language to teach basic programming? BASIC and Pascal have long been discarded by the community, and introductory courses now often use Java, JavaScript, or Python, all of which are rather complex languages. A new, simple language could gain popularity.

As we develop new devices for virtual reality, augmented reality, and other aids, we may need new programming languages.

Those are the areas that I think will see new programming languages.

Tuesday, January 11, 2022

Predictions for 2022

Let's start with some simple, obvious predictions:

Apple will announce its series of M2 chips (or at least the first chip of the M2 series) and will focus on products for individuals such as heads-up displays and appliances for the home.

Microsoft will focus on security and collaboration. It has already announced its "Pluton" chip that integrates into CPUs. It already has collaboration in its office and development products (Word, Excel, Visual Studio). I expect that Microsoft will enhance those capabilities. Oh, and deliver a revised version of Windows 11.

Facebook will deal with its current scandals (anti-trust actions, harmful content with no mitigation) and possibly some new ones. I expect to see changes in senior management and in the organization of the company. Mark Zuckerberg will either leave or be forced out.

Amazon... will continue to do what it has been doing. It will continue to offer online sales, cloud services, and a smattering of consumer devices.

Google will continue to develop Android and ChromeOS, and will continue to merge the two. Google's cloud services run a distant third after Amazon and Microsoft; I think we will see development and advertising for Google's offerings.

In other words, things will be fairly boring.

Some less obvious predictions:

No new programming languages. At least, no new popular programming languages will arise in 2022. The current set of programming languages (C#, Java, Python, JavaScript, C++, SQL) is meeting the needs of industry. There will be lots of new languages, of course. People invent programming languages and each week sees an announcement of a new language. But the popular languages will remain popular.

No major changes to hardware. Laptop and desktop computers, smart phones, and tablets are supplying the needed computing power. There is no compelling reason to change. (Microsoft and PC vendors may envy Apple's system-on-chip offerings, but they won't be in a position to offer comparable hardware for some time. Instead, look to Intel to provide bigger and faster chips, as it has just announced this month.)

Work-from-home, or remote work, will continue to be important in 2022. But 2022 will be different, in that companies will commit more firmly to remote work. It will be CFOs that lead this change; they will insist that companies stop paying for office space that remains vacant. CEOs may stall, but the finances are very persuasive. Corporate boards will look at the numbers and side with the CFOs on this issue.

NFTs (non-fungible tokens) will see a slow decline of interest. Current interest is driven by novelty and potential; the future may see useful functions associated with them.

The "Internet of Things" or IOT (sometimes IoT) will see some growth as multiple companies make internet-connected appliances and devices. We already have light bulbs, door cameras, home temperature systems, and tracking tags. Apple's idea of iPhones as connection points will be copied by Google, and Android phones will also serve as connection points. (Although the two systems will be incompatible, much like the early days of telephone service which saw multiple companies with independent networks serving the same town or city.)

All in all, 2022 will be a year of modest changes, small improvements, and few surprises.

Or so I think.

Monday, January 3, 2022

The biggest gains of Apple's M processors are behind us

Improvements in hardware are not linear. If we look at the performance of hardware over time, we can see that performance improvements follow a pattern: a sharp rise in performance followed by a period of little improvement. (A graph looks like a staircase, with the pattern of "rise, flat, rise, flat".)

The first implementation of a change provides significant increases. Over time, we refine the improvements and gain additional increases. But those later increases are smaller. Eventually, subsequent refinements provide minimal improvements. So we move on to other ideas.

The history of personal computers follows this pattern. We have made a number of changes to hardware that have improved performance. Each of those changes yielded a large initial gain, and then gradually diminishing improvements.

We have increased the clock speed.

We have changed memory technology from "core" memory (ferrite rings) to transistor-based memory (static at first, and then dynamic memory).

We have added caches, to store values in the processor, reducing the dependence on memory. We liked the idea of caches so much that we did it more than once. Processors now have three levels of caches.

We have off-loaded work to (smarter) devices. Devices now have their own processors and can perform tasks independently of the CPU.

We have increased the number of CPU cores, which improves performance for systems with multiple processes and multiple threads. (Which is just about every system we have today.)

Each of these changes improved performance. A large step up at first, and then smaller increases.

Now Apple has used another method to improve performance: reducing the distance between components with a system-on-a-chip design. The M1 chips combine the major components of the computer (CPU, GPU, memory, and more) in a single package.

Yet the overall pattern of improvements will hold with this new design. The first M1 chip showed significant improvements over the older design of discrete components. The M1 Pro and M1 Max improve on the M1 chip, but their gains are not larger than the initial M1 gains.

Later chips, such as the M2, M2 Pro, and even the M3, will have gains, but less and less (in terms of percentages) than the previous chips. The performance curve, after a sharp rise with the M1 chip, will flatten. Apple will have entered the "plain of modest gains" phase.

Apple's M1 chips are nice. They provide good performance. Newer versions will be better: faster, more powerful. But the biggest increases, I think, are already behind us.