Sunday, August 8, 2021

Apple and the photographs

There has been a lot of discussion about Apple's plan to identify photographs that contain illegal material. Various comments have been made on the "one in one trillion" estimate for false positives. Others have commented on the ethics of such a search.

One idea I have not seen discussed is the reason for Apple to do this. Why would Apple choose to identify these images? Why now?

It doesn't help sell Apple products.

It doesn't help sell Apple services.

It doesn't improve Apple's reputation.

Yet Apple made the effort to identify illegal photographs, which included requirements, design, coding, and testing. It cost Apple money to do this.

If Apple gains no revenue, gains no reputation, has no material gain at all, then why should they do it? Why make the effort and why expend the money?

Perhaps there is a gain, but one that Apple has not revealed to us. What unrevealed reason could Apple have to examine and identify photographs on Apple equipment? (Or more specifically, in Apple iCloud?)

The one thought that comes to mind is an agreement with law enforcement agencies. In such an agreement, Apple scans photographs and reports illegal material to law enforcement agencies. In exchange, Apple gets... what? Something from the government? Payment? Or perhaps the benefit is something Apple avoids from the government, such as a lawsuit or regulatory interference.

I'm speculating here. I have no knowledge of Apple's motivations. Nor do I have any knowledge of such a deal between government and Apple -- or any company. (And keep in mind that there are governments other than the US government that may make such requests.)

But if there is a deal, then perhaps Apple is not the last company to perform such an action. We may see other companies announce similar efforts to identify illegal material. Worse, we may learn that some companies have implemented such programs without informing their customers.

Thursday, August 5, 2021

The roaring success of Windows 365

Microsoft announced Windows 365, its "Windows as a Service" offering that lets one (if one is a business) create and use virtual Windows desktops. And just as quickly, Microsoft announced that it was suspending new accounts, because too many had signed up for the service.

A few thoughts:

First (and obvious) is that Microsoft underestimated the demand for Windows 365. Microsoft hosts the service on its Azure framework, and if the demand for Windows 365 outstrips the available servers, then it is popular indeed.

Second (and perhaps less obvious) is that Microsoft is (probably) kicking themselves for pricing the service as they did. With strong demand, Microsoft apparently "left money on the table" and could have charged higher rates.

Third (and also not so obvious) is that Microsoft's business customers (that is, most businesses) really want to move from their current arrangement for PC hardware to a different one. They either want to move to Windows 365 or they want to try it -- which indicates that they are not happy with their current PC platform. (That current platform might be physical, real-world PCs on employee desks, or it might be virtual PCs accessed by Remote Desktop or Citrix or some other mechanism.)

The desire of customers to try a different solution is, in my mind, a warning for Microsoft. It means that customers are not happy with the current state of Windows and its support -- new versions, upgrades, security patches, and administration. It could mean that customers will entertain other solutions, including Linux and Apple.

A few other thoughts:

With demand from the business community much stronger than expected, Microsoft will probably focus on business customers. In other words, I expect Microsoft to offer no comparable service for individuals or families -- at least not for the next two years. I expect the cost of providing a consumer product to be higher than the cost of providing a commercial product, and the revenue from a consumer product to be lower than the revenue from a business product.

Microsoft may leverage demand for the Windows 365 service to spur Windows 11 sales. They have announced Windows 365 with access from various devices, and the opportunity is to provide additional services for Windows 11 clients. (Better security, better integration between access computer and virtual desktop, and possibly lower costs.)

Finally, there is the outside chance that Microsoft will provide a special edition of Windows 11, one that is stripped down and usable only to access Windows 365. (Much like ChromeOS is suitable only to run Chrome.) Microsoft may even design and sell hardware to support a Windows 11 "C" mode ("C" for "Connectivity"?).

The strong demand for Windows 365 shows that lots of people are interested in it. Microsoft won't ignore that. Be prepared for more announcements for this new service.

Wednesday, July 28, 2021

Linux is a parasite, and it may be our future

Linux is a parasite.

So is Unix.

The first Unix ran on a DEC PDP-7. But DEC did not sell PDP-7s to run Unix; it sold them to run its own operating system called "DECsys".

Later Unix versions ran on PDP-11s. But DEC did not sell PDP-11s to run Unix; it sold them to run its own operating systems, such as RT-11, RSX-11, and RSTS.

DEC's minicomputers were simple, compared to today's PCs. They would load and run just about any program. On many models, the loader program (what we would call the bootstrap code) was entered by hand on a front panel.

There was no trusted platform, no TPM, no signed code. It was easy to load Unix onto a DEC minicomputer. The success of Unix was due, in part, to the openness of those minicomputers.

But to be honest, Unix was a parasite. It took advantage of the hardware that was available.

Linux is in the same way a parasite on PCs. PCs are sold to run Windows. (Yes, a few are sold with Linux. But PCs are designed to run Windows, and the vast majority are sold with Windows.)

PC hardware has been, from the original IBM PC, open and well-documented. Linux took advantage of that openness, and has enjoyed a modicum of success.

Linux is a parasite on Apple PCs too, taking advantage of the hardware that Apple designed.

But the life of a parasite is not easy.

As Apple changes its hardware and bolsters security, it becomes harder to run Linux on an Apple PC. It is possible to run Linux on an M1 MacBook. I expect that the effort will increase over the next few years, as Apple introduces more changes to defend against malware.

Microsoft is making similar changes to Windows and the PC platform. Microsoft designs and builds a small number of PCs, and issues a specification for the hardware to run Windows. That specification is changing to defend against malware. Those changes also make it harder to install Linux.

Will we see a day when it is impossible to install Linux on a PC? Or on a MacBook? I think we will, probably with Apple equipment first. Devices such as the iPhone and Apple Time Capsule require signed code to boot an operating system, and Apple is not divulging the signing keys. It is not possible to install Linux on them. I think a similar fate awaits Apple's MacBook and iMac lines. Once that happens, Linux will be locked out of Apple hardware.

Chromebooks look for code signed by Google, although in developer mode they can boot code that has been signed by others. (The Chromebook boot code looks for a signed kernel, but it doesn't care who signed it.)

Microsoft is moving towards signed code. Windows version 11 will require signed code and a TPM (Trusted Platform Module) in the PC. There are ways to load Linux on these PCs, so Linux has not yet been locked out.

I think Microsoft recognizes the contributions that Linux makes to the ecosystem, and is taking steps to ensure that Linux will be available on future PCs. Apple, I think, sees no benefit from Linux and is willing to lock Linux out of Apple devices. Microsoft sees value in letting Linux run on PCs; Apple doesn't.

It might be that Microsoft is preparing a radical change. It may be that Microsoft is getting ready to limit Windows to virtual systems, and drop support for "real" PCs. The new "Windows 365" product (virtual computers running Windows accessible from a browser) could be the future of Windows.

In this fantasy world I am constructing, Microsoft provides Windows on virtual hardware and not anywhere else. Access to Windows is available via browser, but one must acquire the hardware and operating system to run the browser. That could be an old PC running an old (in the future) version of Windows 10 or Windows 11, or it could mean a Chromebook running ChromeOS, or it could mean a desktop PC running Linux.

This would be a big change -- and I'm not saying that it will happen, only that it may happen -- and it would have profound effects on the IT world. There are some thoughts that come to mind:

First, performance becomes less important for the physical PC running the browser. The heavy CPU work is on the server side. The PC hosting the browser is a fancy terminal, displaying the results of the computation but not performing the computation. The race for speed shifts to the servers hosting the virtual instances of Windows. (And there is less pressure to update local PCs every three years.)

Second, the effort to develop and support Windows drops significantly. A lot of work for Microsoft is maintaining compatibility with hardware. Windows works with just about every piece of hardware going back decades: printers, video cards, disk drives, cameras, phones, ... you name it, Windows supports it. If Microsoft shifts to a virtual-server-only version of Windows, a lot of that work disappears from Microsoft's queue. The work doesn't vanish; it shifts to the people building the non-virtual PCs that run the browsers. But the work (and the expense) does vanish from Microsoft's accounts.

Third, this change is one that Apple cannot follow. Apple has built its strategy of privacy on top of a system of local processing -- a secure box. They don't send data to remote servers -- doing so would let your personal data escape the secure box. Apple has no way to offer virtual instances of macOS that correspond to Windows 365 without breaking that secure box. (And just as Windows 365 allows for longer lifespans of local PCs, virtual macOS would allow for longer lifespans of Macs and MacBooks -- something that Apple would prefer not to see, as they rely on consumers replacing their equipment every so often.)

If Microsoft does make this change, the prospects for Linux improve. If Microsoft pulls Windows off of the market, then PC manufacturers must offer something to run on their hardware. That something cannot be macOS, and it certainly won't be FreeDOS. (As good as it is, FreeDOS is not what we need.)

The operating system that comes with PCs may be Linux, or a variant of Linux built for laptop makers. There could be two versions: a lightweight version that is close to ChromeOS (just enough to run a browser) and a heavier version that is close to today's Linux distros.

If Microsoft makes this change -- and again, I'm not sure that they will -- then we really could see "the year of the Linux desktop". Oh, and it would mean that Linux would no longer be a parasite.

Tuesday, July 20, 2021

Debugging

Development consists of several tasks; analysis, design, coding, testing, and deployment are the ones typically listed. There is one more: debugging, and that is the task I want to talk about.

First, let me observe that programmers, as a group, like to improve their processes. Programmers write the compilers and editors and operating systems, and they build tools to make tasks easier.

Over the years, programmers have built tools to assist in the tasks of development. Programmers were unhappy with machine coding, so they wrote assemblers which converted text codes to numeric codes. They were unhappy with those early assemblers because they still had to compute locations for jump targets, so they wrote symbolic assemblers that did that work.

Programmers wrote compilers for higher-level languages, starting with FORTRAN and FLOW-MATIC and COBOL. We've created lots of languages, and lots of compilers, since.

Programmers created editors to allow for creation and modification of source code. Programmers have created lots of editors, from simple text editors that can run on a paper-printing terminal to the sophisticated editors in today's IDEs.

Oh, yes, programmers created IDEs (integrated development environments) too.

And tools for automated testing.

And tools to simplify deployment.

Programmers have made lots of tools to make the job easier, for every aspect of development.

Except debugging. Debugging has not changed in decades.

There are three techniques for debugging, and they have not changed in decades.

Desk check: Not used today. Used in the days of mainframe and batch processing, prior to interactive programming. To "desk check" a program, one looks at the source code (usually on paper) and checks it for errors.

This technique was replaced by tools such as lint and techniques such as code reviews and pair programming.

Logging: Modify the code to print information to a file for later examination. Also known as "debug by printf()".

This technique is in use today.
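As a small illustration (in Python; the average() function is an invented example), here is the same "debug by printf()" idea done with the logging module, which lets the trace statements stay in the code and be switched on or off by level:

```python
import logging

# Configure logging so DEBUG messages are shown; in production the
# level would be raised to WARNING or read from configuration.
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger(__name__)

def average(values):
    # These log calls play the role of the classic printf() statements.
    log.debug("average() called with %r", values)
    total = sum(values)
    log.debug("total=%s count=%s", total, len(values))
    return total / len(values)

print(average([2, 4, 6]))   # prints 4.0; the DEBUG lines go to stderr
```

The advantage over bare print() is that the statements can be left in place and silenced by changing one configuration line.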

Interactive debugging: This technique has been around since the early days of Unix. It was available in 8-bit operating systems like CP/M (the DDT program). The basic idea: Run the program under the debugger, pausing its execution at some point. The debugger keeps the program loaded in memory, and one can examine or modify data. Some debuggers allow you to modify the code (typically with interpreted languages).

This technique is in use today. Modern IDEs such as Visual Studio and PyCharm provide interactive debuggers.
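To show what an interactive debugger does under the hood, here is a minimal sketch using Python's bdb module (the base class that pdb is built on). Instead of pausing for keyboard input, it records a snapshot of the local variables each time execution reaches a breakpoint; buggy_sum and the breakpoint offset are invented for the example:

```python
import bdb

def buggy_sum(values):
    total = 0
    for v in values:
        total += v   # breakpoint target: we want to watch 'total' here
    return total

# A miniature non-interactive "debugger" built on bdb, the same
# machinery that pdb uses. user_line() is called whenever execution
# pauses; we record the local variables and immediately continue.
class SnapshotDebugger(bdb.Bdb):
    def __init__(self, code, line_offset):
        super().__init__()
        self.snapshots = []
        self.set_break(code.co_filename, code.co_firstlineno + line_offset)

    def user_line(self, frame):
        if self.break_here(frame):
            self.snapshots.append(dict(frame.f_locals))
        self.set_continue()

# Break on the 'total += v' line (3 lines below the 'def' line).
dbg = SnapshotDebugger(buggy_sum.__code__, 3)
dbg.run("buggy_sum([1, 2, 3])", globals())
print([s["total"] for s in dbg.snapshots])   # 'total' at each pause
```

A real debugger adds a command loop and stepping on top of exactly this hook; the "keep the program loaded, stop, inspect" core has not changed.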

Those are the three techniques. They are fairly low-level technologies, and require the programmer to keep a lot of knowledge in his or her head.

These techniques gave us Kernighan's quote:

"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?"

— The Elements of Programming Style, 2nd edition, chapter 2

These debugging techniques are the equivalent of assemblers. They allow programmers to do the job, but put a lot of work on the programmers. They assist with the mechanical aspect of the task, but not the functional aspect. A programmer, working on a defect and using a debugger, usually follows this procedure:

- understand the defect
- load the program in the debugger
- place some breakpoints in the source code, to pause execution at points that seem close to the error
- start the program running, wait for a breakpoint
- examine the state of the program (variables and their contents)
- step through the program, one line at a time, to see which decisions are made ('if' statements)

This process requires the programmer to keep a model of the program inside his or her head. It requires concentration, and interruptions or distractions can destroy that model, requiring the programmer to start again.

I think that we are ready for a breakthrough in debugging. A new approach that will make it easier for the programmer.

That new approach, I think, will be innovative. It will not be an incremental improvement on the interactive debuggers of today. (Those debuggers are the result of 30-odd years of incremental improvements, and they still require lots of concentration.)

The new debugger may be something completely new, such as running two (slightly different) versions of the same program and identifying the points in the code where execution varies.

Or possibly new techniques for visualizing the data of the program. Today's debuggers show us everything, with limited ways to specify the items of interest (and to hide the items we don't care about and don't want to see).

Or possibly visualization of the program's state, which would be a combination of variables and executed statements.
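The first of those ideas, comparing the execution paths of two versions of a program, can be prototyped in a few lines with Python's sys.settrace hook. The version_a and version_b functions here are invented stand-ins for two slightly different builds:

```python
import sys

def trace_lines(func, *args):
    # Record the sequence of executed line numbers, relative to the
    # start of the function, using the same hook a debugger uses.
    executed = []
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            executed.append(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def version_a(n):
    if n % 2 == 0:
        return "even"
    return "odd"

def version_b(n):
    if n % 2 == 1:   # the (deliberate) difference between versions
        return "even"
    return "odd"

path_a = trace_lines(version_a, 4)
path_b = trace_lines(version_b, 4)
diverge = next((i for i, (x, y) in enumerate(zip(path_a, path_b))
                if x != y), None)
print("execution paths diverge at step", diverge)
```

A production tool would need to align paths across renamed or moved code, which is the hard part; but the raw data -- which lines ran, in what order -- is already available to us.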

I will admit that the effort to create a debugger (especially a new-style debugger) is hard. I have written two debuggers in my career: one for 8080 assembly language and another for an interpreter for BASIC. Both were challenges, and I was not happy with the results for either of them. I suspect that to write a debugger, one must be twice as clever as when writing the compiler or interpreter.

Yet I am hopeful that we will see a new kind of debugger. It may start as a tool specific to one language. It may be for an established language, but I suspect it will be for a newer one. Possibly a brand-new language with a brand-new debugger. (I suspect that it will be an interpreted language.) Once people see the advantages of it, the idea will be adopted by other language teams.

The new technique may be so different that we don't call it a debugger. We may give it a new name. So it may be that the new debugger is not a debugger at all.

Wednesday, July 14, 2021

Windows 11 is change, which is not new

Microsoft announced Windows 11, and with it a set of requirements for the hardware that is required to run Windows 11. This is not new; all versions of Windows have had a list of "minimum required hardware". Yet some folks are quite upset about the requirements. Why are they so upset?

Looking back over the history of PCs (and going back to the first IBM PC, before the days of Windows), we can see a steady pattern of improvements to hardware and operating systems that took advantage of those improvements. New versions often required better hardware.

The first IBM PCs came without hard disks, and floppy disks were an option. DOS, the PC operating system before Windows, required floppy disks. IBM's PC XT included a hard disk, and DOS version 2 took advantage of the hard disk. (And DOS 2 was required in order to use the hard disk.) One could run DOS 2 on a floppy-only PC -- if you had enough memory -- but it provided little advantage. Systems with insufficient memory were not supported.

Windows 3.0, the first version of Windows to achieve popularity, would run on a PC with an 8088 processor, but it required a hard drive, and the multimedia operations required an 80286 processor and a CD drive. Here we see that older, less capable systems are not supported.

Windows NT and each of its successors have set requirements for processor, memory, graphics, and disk space. Windows 2000, Windows XP, Windows Vista, Windows 8, and Windows 10 all have requirements for hardware.

So we should be used to the idea that new operating systems will not support older systems.

But I keep coming back to the question: why are people so upset about this version of Windows? What is it with Windows 11 that makes people complain?

I can think of several reasons:

First, this announcement was a surprise. Microsoft has, for the past several years, released Windows 10 and kept the hardware requirements unchanged. Those requirements allowed for a broad swath of PCs to run Windows 10. (I myself have PCs from 2007 and 2012 that are running Windows 10.) There has been nothing in the messages from Microsoft that Windows 10 would be replaced, or that hardware requirements would change. Until now.

Second, the new requirements have dropped support for a lot of PCs, and perhaps folks are still using these older PCs. By raising the hardware bar for Windows, Microsoft has declared some (okay, lots of) PCs are "unworthy". If a person happens to have one of those PCs, they may consider this an insult.

But the reason I truly suspect is a different one.

Past updates and changes to hardware requirements have had clear benefits. When Windows/386 wanted a VGA card, we understood that the graphics capabilities of earlier video cards were not sufficient for the desired experience. When an operating system required a 16-bit card for the network interface, we understood that the transfer speeds of the older 8-bit cards were not sufficient. When Windows NT required an Intel 386 processor, we understood that the older 8088 and 80286 processors were not sufficient to provide multitasking the way we wanted it.

With past upgrades, we understood the reasons for the required hardware. That's not true with Windows 11.

Windows 11 needs a certain amount of memory and disk space; that's understood. It also needs the TPM 2 chip; we understand that. But Windows 11 has requirements for a certain, not-well-understood subset of Intel processors. (It's not clear that Microsoft understands the subset, either.)

Part of the problem is Intel's product line. Intel has gobs of processor models. It has so many that the old names of "8088" and "80286" or "Pentium 1" and "Pentium 3" don't work. Instead, Intel uses letters and numbers, something like i7-6550 and i5-5204. (Those aren't real models; I made them up. Or maybe they are real, maybe I hit on actual product numbers. But you get the idea.)

Intel has shipped, over the past decade, possibly thousands of different processor models, each with different features. Most people don't care about most of the differences. The typical person looks at the processor clock speed and the number of cores, and little else. Hardware enthusiasts and game players may look at socket type and cache size.

Only the folks who write operating systems and low-level drivers go beyond those to look at the arcane aspects of the different processors. Those aspects can include the handling of interrupts, privileged execution of certain instructions, fixes to errors in the instruction set, virtual memory, and virtual machines.

It is these differences that are important to Microsoft. Windows has to work with all of those processors. It has to handle the quirks of each processor. It has to "know" that it can trust an instruction on some processors and not trust it on others. All of those quirks add up, and they can interact in strange and subtle ways.

On top of that, Microsoft has to test each of those configurations (preferably on real processors, not simulations). That means that Microsoft has to maintain a large collection of hardware.

By limiting the processors to those designed and shipped in the past three years, Microsoft eliminates the older processors and in so doing reduces the variation that they cause. The reduced set of processors allows for (relatively) simpler code for Windows, and a simpler test process.

But none of this is obvious. Microsoft has not said "we're limiting the supported processors to those we can test on", nor have they said "we're limiting the supported processors to those that have these (insert arcane aspect) features".

All we have is a vague announcement. (And I will say that the whole "Windows 11" announcement seems rushed. It doesn't have the depth and details of previous announcements from Microsoft. But that's another topic.)

That vague announcement does not give us understanding. And because we don't understand the reasons, we resent the change. That's basic psychology.

I will close with a few thoughts:

- Microsoft, I think, has thought about Windows 11 and its requirements, and has made a good decision.
- That decision is not available to us, so we see the change as arbitrary.
- It is easy to resent what we do not understand.
- Microsoft was probably surprised by the reaction to the announcement, and may be working on more announcements.
- While I don't understand Microsoft's decision, I have faith that they have a good process.

A poor message can hide a good process; let's wait for more information.

Also - Microsoft is not alone in changing hardware requirements. Apple has done so with every new version of macOS (I think). Even Linux drops support for older systems. I have an old 32-bit MacBook running Ubuntu 16.04 with no way to upgrade because Ubuntu now requires 64-bit processors.

Monday, July 5, 2021

Returning to the office, or not

As the COVID-19 pandemic wanes, companies must consider their future. Do they require all employees to return to the office? Do they continue a "work from home" policy?

Some companies have announced "work from home forever" policies. 

Some companies have required employees to return to the office. The response has often been one of protest, with some workers quitting their jobs instead of returning to the office.

Other companies have announced hybrid schedules. Apple is probably the largest company to announce such a schedule. Its employees protested such a plan.

Why such hostility to returning to normal? I have a few ideas.

Work-life balance: The shift to work-from-home has given workers experience with a different work-life balance. Parents have been able to watch their children. Workers have been able to not only get a cup of coffee but also do the laundry while working.

Companies, for a long time, have asked their employees to think about work-life balance, probably so the company can boast high scores for employee engagement and quality of workplace. Companies want to rank high on the "best places to work" surveys. With work-from-home experience, employees now have a new understanding of "work-life balance". It is now more than an item on an annual survey. Employees have more control and a better work-life balance when working from home, and they like it.

The commute: Employees have found that a "commute" from the bedroom to the kitchen table (or wherever they set up their workspace) is much more pleasant than the daily drive to the office. (Or the bus ride. Or the carpool.) To companies, this is an externality; employees bear the entire cost and effort of the commute, and must allow for delays and side trips. Asking people to switch from a 20-second walk to a 30-minute drive (and pay for gasoline and tolls) is imposing a cost, and employees are aware of the cost. It's easy to understand the advantages of the work-from-home commute.

Privacy: The modern office, with either an open-floor plan or even cubicles, provides little in the way of privacy. Interruptions can be frequent. Distractions can be many. Working at home provides interruptions and distractions too, but they can be moderated. Some people have even found ways to work in a spare room, with a door that can be closed.

I suspect that most employers are asking their workers to return to the (unmodified) offices with open-floor plans. Employees know that this will mean more noise, more distractions, and less privacy.

Lighter supervision: Employees working at home have zero chance of a manager walking past their workspace. Managers must make an extra effort to contact an employee. This means that employees can spend more time working, which can mean higher productivity. (Or at least longer periods of focus on work and less on management.)

If companies want employees to return to the office, they can mandate it (and workers will grudgingly return but they won't be happy) or they can request it (but they must make a good case to persuade workers to return).

A winning presentation to workers will have to address all of the above ideas. Saying "we'll all be in one place" won't cut it, nor will "we work better when we're together". Workers have had some degree of independence, and they (for the most part) like it.


Tuesday, June 29, 2021

Windows 11 is for the enterprise

Microsoft's recent announcement of Windows 11 has gotten a lot of people asking questions. Why now? Why the change in minimum requirements? And why was the announcement so plain and unassuming?

I think the answer lies in Microsoft's customers for Windows. So let's look at the few different types of customers for Windows.

Enterprises: Large companies with lots of computers. They authenticate with Exchange. They use Microsoft Office, SQL Server, and other Microsoft products. They buy lots of licenses. They pay for support. And -- importantly -- they depreciate computers over a three-year schedule, and they frequently replace computers every three years. They have dedicated IT support teams (possibly outsourced or contractors) and they have discussions and plans for IT.

We can consider large non-profit organizations and large government agencies in this group, as long as they replace their computers every three years.

Small businesses: Companies with fewer computers (probably fewer than 100). They don't use Exchange for authentication; they assign everyone a computer with a password and share data via workgroups. They use the software that comes with the computer (Windows and Office). And they don't replace their computers every three years; they keep them longer.

Small businesses do not (typically) have plans for IT, other than "keep things running and replace computers when they fail". They let their computers age in place, with no specific plans to upgrade Windows or applications.

We can consider small non-profit organizations and small government agencies in this group, as long as they don't have formal plans to replace computers every three years.

Typical individuals: Like small businesses, they have few computers, they use the software that comes with the computer (possibly Office 365), and they keep their computers for longer than three years. They, too, let their computers age in place.

Enthusiasts: These are individuals who enjoy tinkering with hardware or software. Like the typical individual, they have a few computers. Unlike the typical individual, they take a more active interest in IT. They probably have more computers than the typical individual, and they tend to have some computers with the latest versions of Windows. (They may also have older computers with older versions of Windows, just for fun.)

Enthusiasts were important in the early days of Windows. They downloaded beta versions, showed Windows to their friends, and learned how to make Windows work on different types of hardware. They were an important part of the "Windows revolution" over DOS.

Gamers: These individuals have few computers. They take an interest in hardware, and in software when it helps their gaming experience. They use powerful computers, either built by themselves or bought off-the-shelf and upgraded with custom video cards and disks. They may replace equipment every three years; the timing is driven not by depreciation schedules but by hardware and game software.

Browsers: Individuals who use Windows like a Chromebook. That is, they have a computer running Windows but they use only web apps. They don't use local applications (not even Office). Like typical users, they have no plans for upgrades and tend to use computers for a long time.

With these different groups in mind, we can gain some insight into Microsoft's motivations.

Microsoft's announcement for Windows 11, and specifically the requirements for 64-bit processors and TPM 2.0, limits Windows 11 to recent computers. This is going to cause some problems for some users, because the equipment they currently have will not support Windows 11. But look at the groups, and see which will be affected:

Small businesses, typical individuals, and browsers will not be affected by Windows 11. They probably do not run the latest version of Windows 10, and may be running Windows 8.1 or even Windows 7. (The latter is unlikely due to the lack of support for Internet Explorer.)

Enterprise businesses will not be affected (much) by Windows 11. They will have equipment that is ready to run Windows 11 (thanks to their policy of replacing computers every three years) and they have an IT support team who can coordinate the installation of the new version. (That IT group may not be happy about a new version of Windows, but they can handle the task.)

The groups most affected by Windows 11 will be gamers and enthusiasts. Gamers will have to review the benefits of Windows 11, and will probably replace older PCs when games come out that are for Windows 11 only. Enthusiasts will be the hardest hit: their curated older hardware that is running Windows 10 (because it can) will not be able to run Windows 11. They will have to pony up for new hardware (and find space for it, while keeping their older PCs).

So my conclusion is this: Windows 11 is for the enterprise. Microsoft is targeting enterprise customers (the ones who pay lots of licensing fees) and keeping them happy. (Enterprises love security!)

The other types of users are going along for the ride. Small businesses and typical individuals won't be affected (they already have hardware, and when they buy new PCs they will come with Windows 11).

The folks most affected will be the enthusiasts who won't be able to install Windows 11 on their old hardware. (And probably won't be able to install Windows 10 after its end-of-life in 2025.) That's a small crowd, and they are less important today than they were in the early days of Windows.

Microsoft cannot support old hardware forever. The advantages of increased security are obvious and necessary. A special version of Windows 11 ("Windows 11 minus"? "Windows for the tinkerers"?) that supports older (less secure) hardware would require a lot of time and effort, and the return for that time and effort would be very small.

The enthusiasts and tinkerers need another home, one that is not dominated by the concerns (and economics) of the enterprise.