
Monday, December 21, 2020

Measuring the success of .NET

Microsoft released .NET (and the language C#) in the early 2000s. With almost 20 years of experience, can we say that .NET has been a success?

Microsoft, I am certain, will claim that .NET has been a success and continues to be a success. But they have a vested interest in the success of .NET, so their opinion is possibly biased.

It strikes me that Apple's move to ARM processors has provided us with a measure for the success of .NET. How can that be? Well, let's look at programming languages.

Programming languages come in three flavors: compiled, interpreted, and byte-code. Each has a different way of executing instructions to perform the desired calculations.

Compiled languages convert source code into machine code, and that machine code is directly executable by the processor. In today's world, that usually means the compiler produces code for the Intel x86 processor, although other processors can be the "target". Once compiled, the executable form of the program is usable only on a system with the appropriate processor. Code compiled for the Intel x86 runs only on the Intel x86 (or a compatible processor such as one from AMD). Notably, the compiled code cannot be used on a different processor, such as the ARM processors. (Apple can run Intel code on its new ARM processors because it emulates the Intel processor. The emulator pretends to be an Intel processor, and the code doesn't know that it is running on an emulator.)

Interpreted languages take a different approach. Instead of converting the source code to executable code, the interpreter parses each line of source code and executes it directly. The importance of this is that a program written in an interpreted language can be run on any processor, provided you have the interpreter.

BASIC, one of the earliest interpreted languages, ran on different processors, including mainframes, minicomputers, and microcomputers. All of them could run the same program, without changes, thus a BASIC program was quite "portable".
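To make the interpreted approach concrete, here is a minimal sketch of a BASIC-style line-by-line interpreter. It is written in Python purely for illustration; the tiny `LET`/`PRINT` dialect and the `run` function are invented for this example, not any real BASIC. The point is that the interpreter parses and executes each source line directly, with no compilation step, so the same program text runs on any machine that has the interpreter.

```python
# A toy line-by-line interpreter, in the spirit of classic BASIC.
# (Hypothetical mini-dialect, for illustration only.)
def run(source):
    variables = {}
    for line in source.splitlines():
        line = line.strip()
        if not line:
            continue
        keyword, rest = line.split(" ", 1)
        if keyword == "LET":            # LET name = expression
            name, expr = rest.split("=", 1)
            variables[name.strip()] = eval(expr, {}, variables)
        elif keyword == "PRINT":        # PRINT expression
            print(eval(rest, {}, variables))

program = """
LET X = 6
LET Y = 7
PRINT X * Y
"""
run(program)   # prints 42
```

Nothing here depends on the underlying processor; port the interpreter, and every program written for it comes along for free. That is the portability the text describes.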

In between compiled languages and interpreted languages are the byte-code languages, which are a little bit interpreter and a little bit compiler. Byte-code languages are compiled, but not to a specific processor. Or rather, they are compiled to an imaginary processor, one that does not exist. The code produced by the compiler (often called "byte code") is interpreted by a small run-time system. The idea is that the run-time system can be created for various processors, and the same byte-code can run on any of them.

Java uses the byte-code approach, as does C#, VB.NET (all the .NET languages, actually), and the languages Python, Ruby, Perl, and Raku (and a bunch more).

The languages C, C++, Objective-C, Go, and Swift are compiled to executable code.

I can think of no programming languages that use the pure interpreted approach, at least not nowadays. The two languages that used an interpreted approach were BASIC and FORTH, and there were some implementations of BASIC that were byte-code and some that were even fully compiled. But that's not important in 2020.

What is important in 2020 is that Apple is moving from Intel processors to ARM processors, and Microsoft may be considering a similar move. Of the two companies, Microsoft may be in the better position.

Apple uses the languages Objective-C and Swift for its applications, and encourages third-party developers to use those languages. Both of those languages are compiled, so moving from Intel processors to ARM processors means that programs originally compiled for Intel processors must be recompiled to run on the ARM processors.

Unless, of course, you run the Intel code on an ARM processor with an emulator. Not every application is ready to be recompiled, and not every third-party developer is ready to recompile their applications, so the emulator is an easy way to move applications to the ARM processors. The emulator is important to Apple, and I'm sure that they have spent a lot of time and effort on it. (And probably will spend more time and effort in the next year or two.)

Microsoft, in contrast, uses a large set of languages and encourages third-party developers to use its .NET framework. That puts it in a different position than Apple. While Apple's applications must be recompiled, Microsoft's applications do not. Microsoft can provide an ARM-based Windows and an ARM-based .NET framework, and all of the applications written in .NET will run. (And they will run without an emulator.)

Microsoft has a nice, simple path to move from Intel to ARM. Alas, the world is not so simple, and Microsoft's situation is not so simple.

The migration from Intel to ARM is easy (and low-risk) only for applications that are written completely in .NET. But applications are not always written completely in .NET. Sometimes, applications are written partially in .NET and partially in "native code" (code for the underlying processor). There are two reasons for such an approach: performance and legacy code. Native code tends to run faster than .NET byte-code, so applications that require high performance tend to be written in native code.

The other reason for native code is legacy applications. Large applications written prior to the introduction of .NET (and yes, that is twenty years ago) were written in a language that compiled to Intel code, typically C or C++. Converting that code from C or C++ to a .NET language (C#, VB.NET, or the not-quite-C++ that was "managed C++") was a large effort, and entailed a large risk. Better to avoid the effort and avoid the risk, and maintain the application in its original language.

With all of that as a prelude (and I admit it is a long prelude), we can now look at the success of .NET.

If Microsoft releases a version of Windows for the ARM processor, we can see how many applications can migrate from Intel-based Windows to ARM-based Windows, and how many applications cannot. The applications that can move will be those applications written completely in .NET. The applications that cannot migrate (or that need an emulator) are those applications that are written all or partially in native code.

The proportion of total (or pure) .NET applications can be a measure of the success of .NET. The more applications that use .NET (and no native code), the more success we can attribute to .NET.

The degree of native-code applications (partial or total), on the other hand, indicates a failure of .NET. It indicates choices made to not use .NET and use a different platform (Windows API, MFC, etc.) instead.

That is my measurement of the success of .NET: How many applications can move from Intel to ARM without recompiling. If Microsoft announces Windows for ARM, let's see which applications can move immediately and without the assistance of an emulator.


Thursday, July 2, 2020

Rough seas ahead for Intel

Apple has announced that it is switching processors from Intel to ARM. (For the MacBook. Apple has used ARM for the iPhone and iPad for years.) This announcement indicates problems for Intel.

But first, some thoughts on "market leaders".

We like to think that markets have leaders, and those leaders set the direction and pace for the market, much as a squad leader in the military sets the direction and pace for the squad. Let's call this the "true leader".

There is another type of leader, one that instead of setting the direction and pace, looks at the crowd and runs out in front, and then claims leadership. Let's call this the "opportunistic leader".

So is Intel a true leader or an opportunistic leader? I think it is the latter.

One could argue that Intel lost its ability to lead the market in the mid-1980s, with the specific culprit being the IBM PC. Prior to the IBM PC, the market for personal computers was fractured, with different manufacturers using different processors. Apple and Commodore used the 6502, Tandy used the Zilog Z-80 in the TRS-80, and lots of others used Intel's 8080. Intel had the 8088 and 8086 processors, but very few manufacturers used them.

In the 1980s, Intel had a plan for the future of microprocessors - the iAPX line. It was a plan that built on the 16-bit 8086 but expanded to different architectures as the processors became more powerful. There were several specifications for processors, culminating with the iAPX 432, a 32-bit processor. (At the time, this was impressive.) The more powerful processors were not compatible with the 8-bit 8080, or (notably) the 16-bit 8086. It was a nice plan.

The IBM PC and the market for the IBM PC invalidated that plan. People wanted the IBM PC, which meant the Intel 8088. Later, it meant the IBM PC AT with the Intel 80286, which was not an iAPX processor.

This shows that Intel was not leading the market in the "true leader" sense.

Want another example? Let's look at Itanium, Intel's attempt at a 64-bit architecture. (Also known as "IA-64".) It was a processor that was not compatible with the existing 32-bit software. Worse, it had a design that pushed work to the compiler, and some of that work was not determinable at compile time. AMD countered with x86_64, a design compatible with existing 32-bit code, and customers wanted that instead of Intel's offering. They wanted it so much that Intel eventually adopted the x86_64 design and abandoned IA-64.

I think these two examples show that Intel is more of the opportunistic leader than the true leader. Intel can design processors, and can design and manufacture the chips and build the support circuitry, but its business strategy is to run out in front of the market.

Which brings us to Apple and ARM processors.

Apple has announced that it will switch from Intel processors to ARM processors for its MacBook laptops.

Is Apple a true market leader? Or are they opportunistic, running out in front of the crowd?

It doesn't matter.

It may be that Apple's decision will "move the market". It may be that Apple's "spidey sense" has detected a desire to shift to ARM processors, and Apple is first in the market to make the change. Whatever the reason, Apple is doing it.

I think that not only Apple will shift from Intel to ARM, but the rest of the market will, too. For two reasons: cost and efficiency.

People will want computers with ARM processors because they (the computers) will have a lower price. 

People will want computers with ARM processors because they (the computers) will be more energy-efficient. (Which means a longer battery life.)

People who buy computers generally don't care about the processor. They care about running applications, and ARM can deliver that.

Windows is ready to move to ARM. Linux is ready to move to ARM. Apple is not only ready to move to ARM but is in fact moving to ARM.

What will happen to Intel?

Intel has a definite problem. They have lost the low-end market for processors. They are losing the mid-range market for processors.

One might think that Intel may find a haven in high-end processors, running servers and other high-power computers. And there may be some space for them, but there is a problem: IBM.

IBM has staked out the top end of the market, with System/z processors (the successor to IBM's System/360, System/370, and System/390 processors) running the big servers that host virtual machines. IBM has a competent business there, one that will not be easily displaced by Intel's x86 processors.

Intel has a market that is being eaten by ARM at the bottom end and limited by IBM at the top end. Not an enviable position.

And certainly not a market leader, in any sense.