Tuesday, March 3, 2026

AI and the mortgage debt crisis of 2008

In 2008, investment banks suffered tremendous losses caused by defaults on mortgages. It wasn't just the mortgages themselves; investment companies had bundled and repackaged mortgage loans into securities and sold those securities to other investors. Demand for these mortgage-backed securities was high (they paid good interest), and that demand spurred demand for mortgages, which spurred banks to offer (and originate) mortgages to a large number of people, including many to whom they would normally not lend. The problem came when interest rates rose, causing mortgage payments to increase (many were adjustable-rate mortgages), and many mortgage holders could not afford the higher payments. They defaulted on their loans, which triggered failures through the entire chain of investments.

The end products, the mortgage-backed securities, were supposedly top quality. The mortgages upon which they were based were not; the investment bankers had convinced themselves that a combination of mixed-grade mortgages could support a top-grade investment product.

It was a system that worked, until it didn't.

What does this have to do with AI? Keep in mind the notion of building top-grade products from a composite of mixed-grade products.

AI -- at least AI for programming -- works by building a large dataset of programs and then using that dataset to generate requested programs. The results are, in a sense, averages of certain selected items in the provided data (the "training data").

The quality of the output depends on the quality of the input. If I train an AI model on a large set of incorrect programs, the results will match those flawed programs. By training on large sets of programs, AI providers are betting on the "knowledge of the masses"; they assume that a very large collection of programs will be mostly correct. Scanning open source repositories is a common way to build such datasets. Companies with large datasets of their own (such as Microsoft) can use those private datasets for training an AI model.

I think that averaging to correctness works for most requests, but not necessarily for all requests.
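The "knowledge of the masses" bet can be made concrete with a toy simulation (a deliberately crude sketch of my own; real models do far more than take a majority vote, but the underlying statistics are similar). For a common task with thousands of mostly-correct examples, the most frequent answer is almost certainly the right one. For a rare task with only a handful of examples of uncertain quality, the most frequent answer is little better than a coin flip:

```python
import random
from collections import Counter

def majority_vote(samples):
    """Return the most common answer among the sampled 'programs'."""
    return Counter(samples).most_common(1)[0][0]

random.seed(42)

# Common task: thousands of training examples, most of them correct.
common = ["correct" if random.random() < 0.8 else "buggy"
          for _ in range(10_000)]

# Rare task: only a handful of examples, and quality is a coin flip.
rare = ["correct" if random.random() < 0.5 else "buggy"
        for _ in range(3)]

print(majority_vote(common))  # almost certainly "correct"
print(majority_vote(rare))    # could go either way
```

With a large, mostly-right sample, the vote reliably recovers the correct answer; with a tiny sample, it recovers whatever happened to be in the pile.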

I expect that simpler code is more available in code repositories, and complex and domain-specific code is less common. We can see lots and lots of "hello, world" programs, in almost any programming language. We can see lots of simple classes for a customer address (again, in almost any programming language).

We don't see lots of code for obscure applications, or very large applications. There are few publicly available applications to run oil rigs, for example. Or large, multinational accounting systems. Or perhaps even control software for a consumer-grade microwave oven.

There may be a few large, complex programs available in AI training data. But a few (or one) is not drawing on "the knowledge of the masses". It is not averaging a large set of mostly right code into a correct set of code.

Here we can see the parallel of AI for coding to the mortgage securities industry. The latter built (what it thought were) top-grade investment products from mixed-grade mortgages. The former builds (what it and its users believe is) quality code from mixed-grade existing code.

So I won't be surprised to learn that AI coding models work for small, simple code and fail for large, complex code.

In other words, AI coding works -- until it doesn't.

Friday, February 6, 2026

Microsoft doesn't know how customers want to use AI

Microsoft has pushed its "Copilot" AI in a lot of places. It's in Windows. It's in Office (excuse me, "Microsoft 365") applications. It's in Visual Studio Code, Visual Studio, and GitHub. If Microsoft has a property, Microsoft has injected Copilot into it.

Little of this (if any) has gone over well with customers. Combined with the injection of advertising, the push of AI has created so much dissatisfaction that customers are leaving Windows for Mac or (gasp) Linux.

A lot has been written (or recorded and posted on YouTube) about this. I won't rehash the arguments here.

What I will ask is this: Why is Microsoft doing this? Why is Microsoft putting Copilot into its products and services willy-nilly, much like it did with the ".NET" label for product names?

I have an idea:

Microsoft doesn't know how customers will use AI, or what they want to do with it.

This is a change for Microsoft. For much of its life, Microsoft has played "catch-up" with technology. After its lead with BASIC, and its fortunate contract with IBM for PC-DOS, Microsoft has been following others. It followed Apple's Macintosh computers with Windows. It followed a number of database providers with SQL Server. It followed Netscape with Internet Explorer. It followed Java with C#. It followed the iPod with the Zune (look it up). It followed Amazon AWS with Azure.

Now Microsoft is following other AI providers with its Copilot. But those other AI providers are different from Apple and Netscape and Sun Microsystems (the makers of Java). They all knew what their customers wanted, and they provided a solution that met those wants.

Today's providers of AI don't know what their customers want. They don't know how to make a profit from AI. But they are popular, and Microsoft is following them, which means that Microsoft doesn't know what its customers want from AI and Microsoft doesn't know how to make a profit from AI.

I find all of this rather unsettling.


Thursday, January 22, 2026

A flood of used GPUs

It seems to me that we will soon be inundated with a large number of used GPUs. I'm not sure what we are going to do with them, but I suspect that some creative people will devise uses for them.

My idea starts with the data centers used for AI. These are large facilities with lots (thousands, probably tens of thousands) of servers, each with one or more GPUs. Some are being built as I write this, some have been just recently "turned on", and some are getting old (in terms of technology).

GPUs don't last forever. They suffer from two forms of obsolescence. The first is wear. While GPU chips last quite a long time, other parts of the GPU wear more quickly. Fan motors, capacitors, and other discrete electronic components degrade after significant use.

The second form is capacity, or more specifically the availability of a newer, faster, more efficient GPU. We've seen this in the PC gaming market, with new GPUs announced every year. I myself have benefited from this phenomenon. A while back, when I was living in an apartment in Oregon, someone "donated" an old PC to the recycling area. The PC was mostly complete, with case, power supply, motherboard, memory, and -- interestingly -- a GPU. (The former owner had removed all disk drives, but left everything else.)

GPUs are not cheap, but the former owner thought that the GPU in this PC had little value. I estimate that the former owner had used this PC for about five years.

So let's take that five-year figure and apply it to data centers, specifically data centers for AI, because they use lots of GPUs.

What happens after a data center has been online for five years? Technology advances, and there will be newer, faster, more efficient GPUs on the market. The owners of the data center will look at those new GPUs with envy. (Especially the "more efficient" aspect of the new GPUs.)

I predict that some data centers will see their older GPUs replaced. (Perhaps the entire server, not just the GPU.) Which means that the big tech owners of data centers will have a large pile of used GPUs (or servers) sitting on the side.

What to do with those old GPUs? One could recycle them, and I suspect that many will be, but that costs money. They could be buried in a landfill, but that costs money too.

Which leaves another option: sell them.

Selling used GPUs is tricky. You cannot label them as new (not legally). But I suspect that there will be a market for used, recent-model GPUs. We might see a large number of them on the market, which means the price will be relatively low.

Buying used GPUs is also tricky. Used GPUs may fail quickly, and there is usually no warranty.

If you have been pondering a project that uses a GPU (or a number of GPUs), this may be your opportunity to start it.