Monday, October 30, 2023

Apple M3

I found Apple's presentation "Scary Fast" to be not scary but somewhat disturbing. Or perhaps "disappointing" is the better adjective.

Apple's M3 processors are impressive, but perhaps not as impressive as the "Scary Fast" presentation implies. I'm not saying that Apple is lying or presenting false information, but they are picking the information very carefully.

Apple compares the M3 to the M1 processor, and the MacBook Pro (with an M3 Pro) to the fastest Intel-based MacBook.

I find those comparisons odd. Why compare the M3 to the old M1 processor? Why not compare it to the M2? And comparing an M3-based MacBook Pro to an Intel-based MacBook seems even odder. (Is anyone still using Intel-based MacBooks? Other than me?)

But Apple's cherry-picking for performance comparisons is not the major disappointment.

The big issue, the one issue that I think Apple misses completely, is that its hardware is outrunning its software. Apple's M3 processors are fast and capable, but so were Apple's M2 processors. The M2 processors were so powerful that the low-end, plain M2 processor was more than enough for almost everyone. If I were equipping an office with Apple devices, I would give everyone a MacBook Air, or possibly a low-end MacBook Pro. Those are enough for just about all typical office work. (Folks editing video or running large test sets might benefit from the higher-end processors, but they are a small portion of the audience.)

Apple's hardware is faster than everyone else's, just as high-end sports cars are faster than the average automobile. But for most people, average automobiles are good enough. Most people don't want the expenses of a high-end sports car, nor can they take advantage of the speed. Apple's M3 processors are fast, but pure speed translates into performance for only a few users. It is quite likely that today (that is, with no M3 processors in the field) most people have computers that are more than fast enough for their needs.

Apple concentrates on hardware and invests little in a number of other areas:

- Cloud-based computing
- Artificial intelligence
- Design of programming languages, multi-threaded applications, parallel tasks, and coordination of distinct processes
- Programming tools, from command-line tools to IDEs
- Automated testing of GUI programs
- Voice recognition

That's not to say that Apple has done nothing in these areas. My point is that Apple has done a small amount and relies on others to do the work in these areas. And that work isn't getting done. Apple's obsession with hardware is costing them opportunities in these areas. It holds Apple back, preventing it from growing the technology. It also holds us back, because we have to wait for Apple.

Tuesday, October 10, 2023

Apple, TSMC, and 3nm chips

The news from a few months back is that Apple has purchased all of TSMC's capacity for 3nm chips for one year. That's a pretty impressive deal. It gives Apple exclusive access to TSMC's latest chip technology, locking out all other PC manufacturers. It also shows that Apple is planning on a lot of sales in the coming year.

Yet I see a dark side to this arrangement.

First, it places a cap on Apple's sales in the year. Apple has "maxed out" its chip source; it cannot get more from TSMC. Apple's growth is now constrained by TSMC's growth, which has been less than planned. (TSMC's new fabrication plant in Arizona has been delayed and cannot produce multi-chip assemblies.)

With a cap on production, Apple must choose carefully which chips it wants from TSMC. What percentage will be M2 chips? M2 Pro chips? A17 chips for iPhones? If Apple guesses wrong, it could have a lot of unsold inventory for one product and be unable to meet sales demand for another.

Second, it makes allies of PC manufacturers (anyone who isn't Apple) and chip manufacturers (anyone who isn't TSMC). TSMC may have difficulty winning business from Lenovo, Dell, and even Microsoft. The arrangement probably doesn't help Apple's relationships with Intel and Samsung, either.

Third, it shows that Apple's latest processors are not second-sourced. (Second-sourcing was a common practice in the 1980s. It reduced risk to the customer and to the primary manufacturer.) Not having a second source for its processors means that any disruption to manufacturing will directly affect the finished products. If TSMC cannot deliver, Apple has nowhere to turn.

It may be that Apple's chips cannot be second-sourced. I don't know the details, but it may be that Apple provided specifications for the chips, and TSMC designed the chip layout. If that is the case, then it is most likely that TSMC owns the layouts, not Apple, and for Apple to get chips from Intel or Samsung those companies would have to start with the specifications and design their own chips. That's a lengthy process, and might take longer than the expected lifetime of the chips. (The M1 chip is all but obsolete, and the M3 is already replacing the M2 chip. The "A" series chips have similarly rapid turnover.)

So Apple purchasing all of TSMC's capacity for a year sounds impressive -- and it is -- but it also reveals weaknesses in Apple's position.


Monday, September 11, 2023

Google raises prices, which may be a good thing

Google has raised prices for several of its services. The annual rates for Workspace, YouTube Premium, and Nest are all going up. The internet is not happy, of course. Yet I see a benefit in these price increases, and not just to Google. I think consumers may benefit from them.

It does sound odd. How can consumers -- who pay these prices -- benefit from increases? Wouldn't they benefit more from decreases in prices?

My answer is: not necessarily.

My thinking is this:

While Google is a large company, with many products and services, most of its revenue comes from advertising. One could say that Google is an advertising company with a few side projects supported by that advertising revenue.

The model was: make a lot of money in advertising and offer other services.

Google was -- and is -- wealthy enough that it could give away e-mail and storage. When Google first offered its Gmail service, it allowed up to one gigabyte of storage per user, an amount that was unheard-of at the time.

It's tempting to want this model to continue. It gives us "something for nothing". But letting advertising pay for everything else has a downside.

When a service depends on revenue from advertising, it is natural for the company to expect that service to help advertising. If the service doesn't, then that service is either changed or discontinued. (Why continue to offer a service that costs money to maintain but doesn't help with revenue?)

Google has a reputation for cancelling projects. Perhaps those projects were cancelled because they did not provide revenue via advertising -- or didn't help the advertising group gain customers, or better market data, or something else.

When a service is funded by advertising, that service is beholden to advertising.

In contrast, when a service has its own revenue -- enough revenue to generate a profit -- then that service is somewhat isolated from the advertising budget. If YouTube Premium costs $X to run and brings in $Y in revenue (and Y is greater than X) then YouTube Premium has a good argument to continue, despite what the folks in advertising want.

The same goes for other services like Nest and Google's cloud storage.

I expect that no one enjoys increasing prices. I certainly don't. But I recognize the need for services to be independent, and free of the influence of other lines of business. Higher revenue leads to services that are stronger and longer-lasting. (Or so I like to think.)

I may grumble about the increase in prices for services. But I grumble with restraint.

Thursday, August 10, 2023

The Apple Mac Pro is a Deluxe Coffee Maker

Why does Apple offer the Mac Pro? It's expensive and it offers little more than the Mac Studio. So why is it in the product line? Several have asked this question, and I have an idea.

But before discussing the Apple Mac Pro, I want to talk about coffee makers. Specifically the marketing of coffee makers.

Coffee makers are, at best, a commodity. One puts coffee, water, and electricity in, and after a short time coffee comes out. The quality of the coffee is, I believe, dependent on the coffee and the water, not the mechanism.

As bland as they are, there is one legend about coffee makers. It has to do with the marketing of coffee makers, and it goes something like this:

(Please note that I am working from a dim memory and much -- if not all -- of this legend may be wrong. But the idea will serve.)

A company that made and sold coffee makers was disappointed with sales, and wanted to increase its profits. They brought in a consultant to help. The consultant looked at the product line (there were two models, a basic model and a fancy model), sales figures, sales locations, advertising, and various other things. The consultant then met with the big-wigs at the company and presented recommendations.

The executives at the company were expecting to hear about marketing strategies, advertising, and perhaps pricing. And the consultant did provide recommendations along those lines.

"Add a third coffee maker to your product line," he said. "Make it a deluxe model with all of the features of your current models, and a few more, even features that people won't use. Sell it for an expensive price."

The executives were surprised to hear this. How could a third coffee maker, especially an expensive one, improve sales? Customers were not happy with the first two; a third would be just as bad.

"No," said the consultant. "The deluxe model will improve sales. It won't have many sales itself, but it will encourage people to buy the fancy (not deluxe) model. Right now your customers see that fancy model as expensive, and a poor value. A third model, with lots of features and a high price, will convince customers that the fancy (not deluxe) model is a bargain."

The company tried this strategy ... and it worked! Just as the consultant said. Sales of the deluxe model were dismal, but sales of the (now) middle-tier fancy (not deluxe) model perked up. (Pun intended.)

We often forget that sales is about psychology as well as features.

Now let's consider Apple and the Mac Pro. The Mac Pro is not a good bargain. It performs only slightly better than the Mac Studio, yet it carries a much higher price tag. The Mac Pro has features that are ... questionable at best. (PCI slots that won't take graphics cards. Don't forget the wheels!)

Perhaps -- just perhaps -- Apple is using the Mac Pro to boost sales of the Mac Studio. Pricing the Mac Pro the way Apple does, it makes the Mac Studio a much more attractive option.

I suspect that if Apple had no Mac Pro and put the Mac Studio at the top of its product line, then a lot of people would argue for the Mac Mini as the better option. Those same people can make the same argument with the Mac Pro and convince themselves to buy the Mac Studio.

So maybe the Mac Pro isn't a Mac Pro at all. Maybe it is a deluxe coffee maker.

Thursday, August 3, 2023

We'll always have source code

Will AI change programming? Will it eliminate the need for programmers? Will we no longer need programs, or rather, the source code for programs? I think we will always have source code, and therefore always have programmers. But perhaps not as we think of programmers and source code today.

But first, let's review the notions of computers, software, and source code.

Programming has been with us almost as long as we have had electronic computers.

Longer than that, if we include the punch cards used by the Jacquard loom. But let's stick to electronic computers and the programming of them.

The first digital electronic computers were built in the 1940s. They were programmed not by software but by wires -- connecting various wires to various points to perform a specific set of computations. There was no concept of a program -- at least not one for the computer. There were no programming languages, and there was no notion of source code.

The 1950s saw the introduction of the stored-program computer. Instead of wiring plug-boards, program instructions were stored in cells inside the computer. We call these instructions "machine code". When programming a computer, machine code is slightly more convenient than wiring plug-boards, but not by much. Machine code consists of a number of instructions, each of which resides at a distinct, sequential location in memory. The processor executes the program by simply reading one instruction from a starting location, executing it, and then reading the next instruction at the next memory address.
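That fetch-and-execute cycle can be sketched in a few lines of Python. This is a toy illustration only -- the opcodes and addresses are invented for this sketch and belong to no real processor:

```python
# A toy stored-program machine: instructions live in memory cells,
# and the processor fetches and executes them one after another.

# Hypothetical opcodes, invented for this sketch.
LOAD, ADD, STORE, HALT = 1, 2, 3, 0

# A tiny "memory": the program at addresses 0-6, data at 100-102.
memory = {0: LOAD,  1: 100,   # load the value at address 100
          2: ADD,   3: 101,   # add the value at address 101
          4: STORE, 5: 102,   # store the result at address 102
          6: HALT,
          100: 7, 101: 35, 102: 0}

pc = 0             # program counter: address of the next instruction
accumulator = 0
while memory[pc] != HALT:
    opcode, operand = memory[pc], memory[pc + 1]
    if opcode == LOAD:
        accumulator = memory[operand]
    elif opcode == ADD:
        accumulator += memory[operand]
    elif opcode == STORE:
        memory[operand] = accumulator
    pc += 2        # advance to the next instruction in sequence

print(memory[102])  # prints 42
```

Notice what the sketch makes plain: the program is just numbers in memory, and the processor marches through them address by address.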

Building a program in machine code took a lot of time and required patience and attention to detail. Changing a program often meant inserting instructions, which meant that the programmer had to recalculate all of the destination addresses for loops, branches, and subroutines. With stored-program computers, there was the notion of programming, but not the notion of source code.

Source code exists to be processed by a computer and converted into machine code. We first had source code with symbolic assemblers. Assemblers were (and still are) programs that read a text file and generate machine code. Not just any text file, but a text file that follows specific rules for content and formatting, and specifies a series of machine instructions as text rather than as numbers. The assembler did the grunt work of converting "mnemonic" codes to numeric machine codes. It also converted numeric and text data to the proper representation for the processor, and calculated the destinations for loops, branches, and subroutines. Revising a program written in assembly language was much easier than revising machine code.
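The grunt work an assembler performs -- mnemonics to numeric opcodes, symbolic labels to addresses -- can be sketched as a two-pass translator. The instruction set here is invented for illustration, not taken from any real assembler:

```python
# A toy two-pass assembler: pass 1 records where each label lands,
# pass 2 emits numeric machine code with labels resolved to addresses.

OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "JUMP": 4, "HALT": 0}

source = [
    "start: LOAD 100",
    "       ADD 101",
    "       STORE 102",
    "       JUMP start",   # the programmer writes a label, not an address
]

# Pass 1: assign each instruction an address and record label positions.
labels, address = {}, 0
for line in source:
    if ":" in line:
        label, _ = line.split(":", 1)
        labels[label.strip()] = address
    address += 2            # each instruction occupies two memory cells

# Pass 2: emit numeric machine code, resolving labels to addresses.
machine_code = []
for line in source:
    text = line.split(":", 1)[-1]
    mnemonic, operand = text.split()
    value = labels[operand] if operand in labels else int(operand)
    machine_code.extend([OPCODES[mnemonic], value])

print(machine_code)  # [1, 100, 2, 101, 3, 102, 4, 0]
```

The two-pass design is exactly why revising assembly was so much easier than revising machine code: insert an instruction, and the assembler recalculates every destination address for you.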

Later languages such as FORTRAN and COBOL converted higher-level text into machine code. They, too, had source code.

Early C compilers converted code into assembly code, which then had to be processed by an assembler. This last sequence looked like this:

    C source code --> [compiler] --> assembly source code --> [assembler] --> machine code

I've listed both the C code and the assembly code as "source code", but in reality only the C code is the source code. The assembly code is merely an intermediate form of the code, something generated by machine and later read by machine.

A better description of the sequence is:

    C source code --> [compiler] --> assembly code --> [assembler] --> machine code

I've changed the "assembly source code" to "assembly code". The adjective "source" is not really correct for it. The C program (at the left) is the one and only source.

Later C compilers omitted this intermediate step and generated machine code directly. The sequence became:

    C source code --> [compiler] --> machine code

Now let's consider AI. (You didn't forget about AI, did you?)

AI can be used to create programs in two ways. One is to enhance a traditional programming IDE with AI, and thereby assist the programmer as he (or she) is typing. That's no different from our current process; all we have done is made the editor a bit smarter.

The other way is to use AI directly and ask it to create the program. In this method, a programmer (or perhaps a non-programmer) provides a prompt text to an AI engine and the AI engine creates the entire program, which is then compiled into machine code. The sequence looks like this:

    AI prompt text --> [AI engine] --> source code --> [compiler] --> machine code

Notice that the word "source" has sneaked back into the middle of the stream. The term doesn't belong there; that code is intermediate and not the source. A better description is:

    Source AI prompt text --> [AI engine] --> intermediate code --> [compiler] --> machine code

This description puts the "source" back on the first step of the process. That prompt text is the true source code. One may argue that a prompt text is not really source code, that it is not specific enough, or not Turing-complete, or not formatted like a traditional program. I think that it is the source code. It is created by a human and it is the text used by the computer to generate the machine code that we desire. That makes it the source.

Notice that in this new process with AI, we still have source code. We still have a way for humans to instruct computers. I've been writing about source code as if it were written. Source code has always been written (or typed, or keypunched) in the past. It is possible that future systems recognize human speech and build programs from that (much like on several science fiction TV programs). If so, those spoken words will be the source code.

AI may change the programming world. It may upend the industry. It may force many programmers to learn new skills, or to retire. But humans will always want to express their desires to computers. The way they express them may be through text, or through speech, or (in some far-off day) through direct neural links. Those thoughts will be source code, and we will always have it. The people who create that source code are programmers, so we will always have them.

We will always have source code and programmers, but source code and programming will change over time.

Thursday, July 20, 2023

Hollywood's blind spot

Hollywood executives are probably correct in that AI will have a significant effect on the movie industry.

Hollywood executives are probably underestimating the effect that AI will have on the movie industry.

AI, right now, can create images. Given some prompting text, an AI engine can form an image that matches the description in the text. The text can be simple, such as "a zombie walking in an open field", or it can be more complex.

It won't be long before AI can make not a single image but a video. A video is nothing more than a collection of images, each different from the previous in minor ways. When played back at 24 frames per second, the human mind perceives the images not as individual images but as motion. (This is how movies on film work, and how movies on video tape work.) I'm sure people are working on "video from AI" right now -- and they may already have it.

A movie is, essentially, a collection of short videos. If AI can compose a single video, then AI can compose a collection of videos. The prompting text for a movie might resemble a traditional movie script -- with some formatting changes and additional information about costumes, camera angles, and lighting.

Thus, with enough computing power, AI can start with an enhanced, detailed script and render a movie. Let's call this a "script renderer".

A script renderer makes the process of moviemaking cheap and fast. It is the word processor of the twenty-first century. And just as word processors upended the office jobs of the twentieth century, the script renderer will upend the movie jobs of this century. Word processors (the software on commonplace computers) replaced people and equipment: secretaries, proofreaders, typewriters, carbon paper, copy machines, and Wite-out erasing fluid.

Script renderers (okay, that's a clumsy term and we'll probably invent something better) will do similar things for movies. If an AI can make a movie from a script, then movie makers don't need equipment (cameras, lights, costumes, sets, props, microphones) and the people who handle that equipment. It may be possible for a single individual to write a script, send it through a renderer, and get a movie. What's more, just as word processors let one print a document, review it, make changes, and print it again, a script renderer will let one render a movie, view it, make changes, and render it again -- perhaps all in a few hours.

Hollywood executives, if they have seen this far ahead, may be thinking that their studios will be much more profitable. They won't need to pay actors, or camera operators, or build sets, or ... lots of other things. All of those expenses disappear, but the revenue from the movies remain.

But here's what they don't see: Making a movie will simply be a matter of computing power. Anyone with a computer and access to a sufficiently powerful AI will be able to convert a script into a movie.

Today, anyone can start a newsletter. Or print invitations to a party. Or their own business cards.

Tomorrow, anyone will be able to make a movie. It won't be easy; one still needs a script with the right details, and one should have a compelling story and good dialog. But it will be much easier than it is today.

And create movies they will. Not just movies, but TV episodes, miniseries, and perhaps even short videos like the old Flash Gordon serials.

I suspect that the first wave of "civilian movies" will be built on existing materials. Fans of old "Star Trek" shows will create new episodes with new stories but using the likenesses of the original actors. The studios will sue, of course, but it won't be a simple case of copyright infringement. The owners of the old shows will have to build a case on different grounds. (They will probably prevail, if only because the amateurs cannot pay the court costs.)

The second wave will be different. It will be new material, away from the copyrighted and trademarked properties. But it will still be amateurish, with poor dialog and awkward pacing.

The third wave of non-studio movies will be better, and will be the real threat to today's movie studios. These movies will have higher quality, and will obtain some degree of popularity. That will get the attention of Hollywood executives, because now these "civilian" movies will compete with "real" movies.

Essentially, AI removes the moat around movie studios. That moat is the equipment, sound stages, and people needed to make a movie today. When the moat is gone, lots of people will be able to make movies. And lots will.


Thursday, July 13, 2023

Streaming services

Streaming services have a difficult business model. The cost of producing (or licensing) content is high, and the revenue from subscriptions or advertisements is low. Fortunately, the ratio of subscribers to movies is high, and the ratio of advertisements to movies is also high. Therefore, the streaming services can balance revenue and costs.

Streaming services can increase their revenue by adjusting subscription fees. But the process is not simple. Raising subscription fees does raise income per subscriber, but it may cause some subscribers to cancel their subscription. Here, economics comes into play, with the notion of the "demand curve", which measures (or attempts to measure) the willingness of customers to pay at different price levels.
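The tradeoff is easy to see with a back-of-the-envelope calculation. The numbers here are invented for illustration, not any service's actual figures:

```python
# Sketch of the fee-increase tradeoff: a higher fee raises revenue per
# subscriber but loses some subscribers. Whether total revenue rises
# depends on how many cancel -- the slope of the demand curve.

subscribers = 1_000_000
old_fee = 10             # dollars per month (hypothetical)
old_revenue = subscribers * old_fee

new_fee = 12             # a 20% increase
churn_rate = 0.10        # suppose 10% of subscribers cancel
new_revenue = subscribers * (1 - churn_rate) * new_fee

print(old_revenue)       # 10000000
print(new_revenue)       # 10800000.0 -- the increase pays off at 10% churn

# The increase becomes a wash when (1 - churn) * new_fee == old_fee:
break_even_churn = 1 - old_fee / new_fee
print(round(break_even_churn, 3))   # 0.167
```

In this sketch the service comes out ahead as long as fewer than about one in six subscribers cancels; the demand curve is an attempt to estimate that number before committing to the price.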

Streaming services can decrease their costs by removing content. For licensed content (that is, movies and shows that are made by other companies) the streaming service pays a fee. If they don't "carry" those movies or shows, then they don't have to pay. Cancelling their license reduces their cost.

For content that the service produces, the costs are more complex. There is the cost of production, which is a "sunk cost" -- the money has been spent, whether the service carries the movie/show or not. There are also ongoing costs, in the form of residual payments, which are paid to the actors, writers, and other contributors while the movie or show is made available. Thus, a service that has produced a movie can reduce its costs (somewhat) by not carrying said movie.

That's the basic economics of streaming services, a very simplified version. Now let's look at streaming services and the value that they provide to viewers.

I divide streaming services into two groups. Some services make their own content, and other services don't. The situation is somewhat more complicated, because the content-making services also license content from others. Netflix, Paramount+, and Roku all run streaming services, all make their own content, and all license other content to show on their service. Tubi, Frndly, and Pluto TV make no content and simply license content from others.

The content-producing services, in my mind, are the top-tier services. Disney+ makes its own content and buys (permanently) other content to add to its library, and is recognized as a top-tier service. Netflix, Paramount+, and Peacock create their own content (and license some) and I consider them top-tier services.

The services that don't produce content, the services that simply license content and then make it available, are the second-tier services. They are second-tier because their content is available for a limited time. They don't own content; they can only rent it. Therefore, content will be available for some amount of time, and then disappear from the service. (Roku, for example, had the original "Bionic Woman" series, but it is not available now.)

For second-tier services, content comes and goes. There is no guarantee that a specific movie or show will be available in the future. Top-tier services, in contrast, have the ability to keep movies and shows available. They don't, and I think that damages their brand.

Services damage their brand when they remove content, in three ways.

First, they reduce the value of their service. If a service reduces the number of movies and shows available, then they have reduced their value to me. This holds in an absolute sense, and also in a relative sense. If Disney+ removes movies, and Paramount+ keeps its movies, then Disney+ drops in value relative to Paramount+.

Second, they break their image of "all things X". When Paramount+ dropped the series "Star Trek: Prodigy", they lost the right to claim to be home to all things Star Trek. (I don't know that Paramount+ has ever made this claim. But they cannot make it now.)

Third, the services lose the image of consistency. On a second-tier service, which lives off of licensed (essentially rented) content, I expect movies and shows to come and go. I expect a top-tier service to be predictable. If I see that it has a movie available this month, I expect them to have it next month, and six months from now, and a year from now. I expect the Disney+ service to have all of the movies that Disney has made over the years, now and in the future. I expect the Paramount+ service to have all of the Star Trek movies and TV shows, now and in the future.

By dropping content, the top-tier services become more like the second-tier services. When Netflix, or Max, or Peacock remove content, they become less reliable, less predictable, less... premium.

Which they may want to consider when setting their monthly subscription rates.