Tuesday, May 2, 2023

The long life of the hard disk

It was in 1983 that IBM introduced the IBM PC XT -- an IBM PC with a built-in hard disk. While hard disks had been around for years, the IBM PC XT made them visible to the vast PC market. Hard disks were expensive, so there was a lot of advertising, and also a lot of articles about the benefits of hard disks.

Those benefits were: faster operations (booting PC-DOS, loading programs, and reading and writing files), more security (you can't lose a hard disk the way you can lose a floppy), and greater reliability compared to floppy disks.

The hard disk didn't kill the floppy disk; floppies remained popular for some time. Apple dropped the floppy drive when it introduced the iMac G3 in 1998, but despite Apple's move, floppies lingered for years afterward.

Floppy disks did gradually lose market share, and in 2010 Sony stopped manufacturing floppy disks. But hard disks remained a staple of computing.

Today, in 2023, the articles are now about replacing hard disks with solid-state disks. The benefits? Faster boot times, faster program loading times, faster reading and writing. (Sound familiar?) Reliability isn't an issue, nor is the possibility of losing media.

Apple again leads the market in moving from older storage technology. Their product line (from iPhones and iPads to MacBooks and iMacs) all use solid-state storage. Microsoft is moving in that direction too, pressuring OEMs and individuals to configure PCs to boot Windows from an internal SSD rather than a hard disk. It won't be long before hard disks are dropped completely and not even manufactured.

But consider: the hard disk (with various interfaces) was the workhorse of storage for PCs from 1983 to 2023 -- forty years.

Floppy disks (in the PC world) were used from 1977 to 2010 -- about 33 years. But they were used prior to PCs, so their full span was also roughly forty years.

Does that mean that SSDs will be used for forty years? We've had them since 1978 (if you count the very early versions) but they moved into the mainstream of computing in 2017 with Intel's Optane products. Forty years after 2017 puts us in 2057. But that would be the end of SSDs -- their replacement should arrive earlier than that, possibly fifteen years earlier.
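
The arithmetic here is simple enough to sketch in a few lines of Python. The dates are the ones used above; the forty-year span is just the pattern being extrapolated, not a law of nature:

```python
# Service-life arithmetic for PC storage, using the dates from this post.
hdd_span = 2023 - 1983        # hard disks as the PC workhorse
floppy_span = 2010 - 1977     # floppies in the PC world
ssd_mainstream = 2017         # my marker: Intel's Optane products
ssd_projected_end = ssd_mainstream + 40

print(hdd_span)                  # 40
print(floppy_span)               # 33
print(ssd_projected_end)         # 2057
print(ssd_projected_end - 15)    # 2042: a possible arrival for the replacement
```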

Tuesday, April 25, 2023

Chromebook life spans

At the start of the Covid pandemic, back in 2020, lots of schools needed a way for students to attend from home. They selected Chromebooks. Chromebooks were less expensive than Windows PCs or MacBooks, easier to administer, and less vulnerable to hacking (by bad guys and students alike).

Schools bought a lot of them.

And now, those same schools are learning that Chromebooks come with expiration dates. Many of them have three-year life spans.

The schools -- or rather, the people who manage budgets for schools -- are not happy.

There is a certain amount of caveat emptor here -- due diligence that the school IT and budget administrators failed to perform -- but I would rather focus on the life spans of Chromebooks.

Three years isn't all that long in the IT world. How did Google (which designs the Chromebook specification) select that term?

(We should note that not all Chromebooks have three-year life spans. Some Chromebooks expire after five or even seven years. It is the schools that selected the three-year Chromebooks that are unhappy. But let's focus on the three-year term.)

(We should also note that the Chromebook life span is for updates. The Chromebooks continue to work; Google simply stops updating ChromeOS and the Chrome browser. That may or may not be an issue; I myself used an old Chromebook for years after its expiration date. Eventually, web sites decided that the old version of Chrome was not worth talking to, and I had to replace the Chromebook.)

I have an idea about the three-year life span. I don't work at Google, and have no contacts there, so I'm speculating. I may be wrong.

It seems to me that Google selected the three-year life span to tailor Chromebooks not to schools but to large corporations. Large corporations (or maybe IT vendors), back in the 1990s, convinced Congress to adjust the depreciation schedules for IT equipment, reducing the expected life to three years. This change had two effects. First, the accelerated schedule lets corporations write off the expense of IT equipment faster. Second, IT vendors convinced large corporations to replace their IT equipment every three years. (The argument was that it was cheaper to replace old PCs rather than maintain them.)
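
A straight-line write-off makes the effect of the shorter schedule easy to see. (Actual tax depreciation rules are more complicated than this, and the dollar figures below are invented for illustration.)

```python
def straight_line_depreciation(cost, years):
    """Split an asset's cost into equal annual write-offs -- a simplified
    model; real tax schedules (such as MACRS) use uneven percentages."""
    return [cost / years] * years

# A hypothetical $1,500 PC under a 3-year versus a 5-year schedule.
print(straight_line_depreciation(1500, 3))  # [500.0, 500.0, 500.0]
print(straight_line_depreciation(1500, 5))  # [300.0, 300.0, 300.0, 300.0, 300.0]
```

The shorter schedule lets a corporation deduct the full cost of the equipment two years sooner, which is exactly the incentive described above.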

With corporations replacing PCs every three years, it made sense for Google to build their Chromebooks to fit that schedule. While PCs did not have built-in expiration dates, corporations were happy to replace their PCs on that three-year schedule.

A three-year expiration gave Google several advantages. They could design Chromebooks with less expensive components. They could revise the ChromeOS operating system rapidly, without worrying about backwards compatibility. And Google could sell the idea of planned obsolescence to the makers of Chromebooks (HP, Dell, Lenovo, etc.) as a guarantee of steady demand.

Again, this is all speculation. I don't know that Google planned any of this.

But it is consistent.

Schools are upset that Chromebooks have such a short supported life. Google made Chromebooks with those short support life spans because the target was corporations. Corporations replaced IT equipment every three years because of tax laws and the perceived costs of maintaining older hardware.

If we take away anything from this, perhaps we should note that Google was focused on the corporate market. Other users, such as schools, non-profits, and individuals, were possibly not considered in its calculations.

Monday, March 27, 2023

The Long Tail

One of the ideas from the dot-com boom was that of the "long tail". If one ranks customers by sales, then the best customers -- the ones that order the most -- are clumped to the left, and the customers that order the least form a long, thin grouping to the right. That long, thin grouping is "the long tail".

This idea came about in the mid-2000s. It was considered revolutionary by some (mostly those who pushed the idea) and it was, in retrospect, a different way of doing business. It relied on a reduction of cost to serve customers.

Prior to "the long tail", businesses marked certain customers as unprofitable, and built their pricing to discourage those customers. (Banks, for instance, require a minimum balance when opening a new account. There is a cost to manage the account, and the profit from a low-balance account falls below that cost.)

The "long tail" advocates recognized that computers and the internet allowed for low-cost interactions with customers, and saw profit in small purchases. The efficiencies of self-service, or fully automated service, allowed for customers with smaller or fewer purchases. New businesses were formed, and became successful.

Yet there were some aspects of long-tail businesses that weren't predicted. Those aspects are the absence of customer service and the treatment of customers as disposable entities.

Consider Google. The tech giant has several businesses in the long tail of computer services. They include e-mail, data storage, processing, and office tools such as word processors and spreadsheets. Companies can sign up for paid plans (the "fat" part of the tail) and individuals can sign up for free or nominally free plans (the "long" part of the tail).

But notice that the plans for individuals (free e-mail and spreadsheets, for example) have nothing in the way of customer support. There is no help line to call. There is no support by e-mail. (There are web pages with documentation, and there is a web forum staffed by volunteers, which has some answers to some questions.)

Customer support is expensive, and the individual plans (either free or low-cost) generate so little profit for Google (on a per-user basis) that a single 10-minute support call would wipe out years of profit.
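
A back-of-envelope calculation shows the problem. (The numbers here are invented; Google publishes no per-user figures.)

```python
# Hypothetical economics of one support call for a free-plan user.
annual_profit_per_free_user = 5.00   # assumed ad-revenue profit, per year
support_cost_per_minute = 1.00       # assumed loaded cost of a support agent
call_minutes = 10

call_cost = support_cost_per_minute * call_minutes
years_of_profit_consumed = call_cost / annual_profit_per_free_user
print(years_of_profit_consumed)  # 2.0 -- one call erases two years of profit
```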

Long-tail businesses don't offer real-person support because they cannot afford to.

But the headaches for the customers in long-tail products and services don't end there.

Google has, on more than one occasion, cancelled a person's account. The common explanation is that the person did something that violated the terms of service. What, exactly, the person did is not specified, and no steps are given to correct the problem.

I am picking on Google here, but this happens to users of other services, too.

What makes it bad for Google's customers (and Google, indirectly) is Google's wide variety of services. A violation of terms and conditions in one service can cause Google to suspend a person's account, which removes access to all of Google's services, including e-mail, spreadsheets, data storage, and more. Google provides no information to the customer, and the customer is effectively cut off from all services.

Google has discarded the customer. It seems a harsh resolution, but it is a low-cost one. Google does not spend time discussing options with the customer; the cost of cutting off service is, essentially, zero. Google loses the minimal profit from that one customer, but that profit doesn't cover the cost of investigations and discussions. It is cheaper to dispose of the customer than to retain them. There are lots more customers.

Long-tail businesses don't value customer retention.

These are two results of long-tail business models. I won't say that they are wrong. The economic conditions seem to require their existence.

And the market does offer alternatives. Microsoft, for example, offers support (limited, but more than Google) for its Microsoft 365 services. In doing so, they move the customer from the thin part of the tail to a thicker part.

The old adage "You get what you pay for" seems to apply here.

Wednesday, February 22, 2023

Paying for social media

Twitter has implemented a monthly charge for its "Twitter Blue" service. Facebook (or Meta) has announced something similar.

Apple introduced its "App Tracking Transparency" initiative (which allows users to disable tracking by apps), and that changed the market. Advertisers are apparently paying less to Facebook (and possibly Twitter) because of this change.

It was perhaps inevitable that Twitter and Facebook would look to replace that lost revenue. And where to look? To the users, of course! Thus the subscription fees were invented.

Twitter's fee is $8 per month, and Meta's is $12 per month. (Both are higher when purchased on an Apple device.)

Meta's price seems high. I suspect that Meta will introduce a second tier, with fewer features, and with a lower monthly rate.

Twitter and Meta must be careful. They may think that they are competing with streaming services and newspaper subscriptions. Streaming services have a range of pricing, from ad-supported services that charge nothing to Netflix and HBOmax, which charge $15 per month (or thereabouts).

But newspapers and streaming services are different from social media. Netflix, HBOmax, and the other streaming services create content (or buy the rights to content) and provide it to viewers. Newspapers create content (or buy the rights) and present it to readers. For both, the flow of information is one-way: from the service to the user.

Social media operates differently. Users create the content, with posts and updates. That information is of interest to family, friends, and colleagues. The value to users is not merely in content, but in the network of connections. A social media site with lots of your friends is interesting to you; a site with only a few is less interesting, and a site with no friends is of no interest.

Meta and Twitter face a different challenge than Netflix and HBOmax. If streaming services raise prices or do other things to drive away customers, the value for the remaining customers remains the same. But if Facebook or Twitter drive away users, then they are reducing the value of the service to the remaining users. Meta and Twitter (and any other social media site) must act carefully when introducing changes.
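
A toy model captures the asymmetry: a streaming catalog's value to a subscriber doesn't depend on how many other subscribers exist, while a social network's value to each user grows with the number of their connections on the site (Metcalfe's law is the classic, if rough, statement of this). All numbers here are invented:

```python
def streaming_value(total_subscribers):
    # The catalog is the same no matter how many others subscribe.
    return 10.0

def social_value(friends_on_site):
    # Network-effect model: value scales with reachable connections.
    return 0.5 * friends_on_site

# Losing subscribers doesn't change the streaming service's value to you...
print(streaming_value(1_000_000), streaming_value(500_000))  # 10.0 10.0
# ...but losing half your friends halves the social site's value to you.
print(social_value(200), social_value(100))  # 100.0 50.0
```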

I tend to think that these new fees are the result of necessity, and not of simple greed. That is, Twitter and Facebook need the revenue. If that is the case, then we users of web sites and social media may be in for more fees. It seems that simple, non-targeted advertising doesn't work for web sites, and targeted advertising (with no data sent to advertisers) doesn't work either.

Advertisements coupled with detailed user information did work, in that they provided enough revenue to web sites. That arrangement was ended by Apple's "App Tracking Transparency" initiative.

We're now in a "next phase" of social media, one in which users will pay for the service. (Or some users will pay, and other users will pay higher amounts for additional services, and some users may pay nothing.)

Thursday, February 16, 2023

Unstoppable cannon balls, immovable posts, and Apple

In the mid-20th century, Martin Gardner wrote a series of articles for Scientific American. His column was called "Mathematical Games"; the content was less math and more puzzles, riddles, and brain teasers. One such brain teaser went something like this:

"Assume that there are unstoppable cannon balls. These cannon balls are different from the normal variety in that once shot from a cannon, they do not stop. They push aside any object in their way. Also assume that there are immovable posts. These posts are different from the normal variety in that they do not move, for any reason. Now, what happens when an unstoppable cannon ball strikes an immovable post?"

Readers of Mr. Gardner's columns had to wait for answers, which appeared in the magazine's next issue. I won't make you, dear readers, wait that long. The answer to the riddle of the unstoppable cannon ball and the immovable post is simple: they cannot exist together. If one has an unstoppable cannon ball, then by definition the universe cannot have an immovable post. Or, if one has an immovable post, then again by definition one cannot have an unstoppable cannon ball.

While that answer may be disappointing, it has a certain wisdom. That wisdom may help Apple.

With the introduction of the M1 and M2 processor lines, Apple has entered into the realm of brain teasers. They don't have an unstoppable cannon ball or an immovable post, but they have built similar things in their product line.

The problem for Apple is the Mac Pro computer. The Mac Pro is Apple's premium computer; it sports the best processor, the fastest disks, the speediest memory, and -- of course -- the highest price tag. But it has one thing that other computers in Apple's product line no longer have: the ability to replace components. The Mac Pro is the only computer that lets the user replace memory, add disks, and add GPU cards. Apple computers (not phones, not tablets) have in the past allowed for upgrades. My vintage Apple Powerbook G4 allows one to replace memory, disk drives, and the battery. The original Macbook allowed for the same.

Over the years, Apple changed their products and gradually removed the ability to change components. Today's Macbook laptops and non-Pro Mac computers are all fully encased; there is no way to open them and swap components. (At least not for the average user.)

The M1 and M2 system-on-chip processors make such upgrades impossible. Nearly everything is in the package: CPU, GPU, memory, and more. (Storage sits in separate chips, but they are soldered to the board, so they are equally fixed.)

The benefit of the everything-on-one-chip design is performance. When components are housed in separate chips (such as the CPU in one, memory in another, and the GPU in yet another), one must provide connecting wires. These wires (or traces on the system board) run from one component to the next. Driving the signals across these wires requires extra circuitry -- dedicated transistors to raise the voltage of signals from the on-chip levels to the levels for the system board. Corresponding receiver circuits adapt the signals from board-level voltages to on-chip voltages. Each of those drivers and receivers slows the signal. (It's not much, but at the frequencies of today's computers, those small delays add up to significant delays.)

The distance from one component to another also causes delays. Again, each delay is small, but over the billions of operations, they add up.
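
The magnitudes are easy to estimate. A signal on a circuit-board trace travels at very roughly half the speed of light, and at multi-gigahertz clock rates even a few centimeters cost a meaningful fraction of a cycle. A rough sketch, using typical textbook values rather than measurements of any particular board:

```python
# Rough estimate: board-trace propagation delay vs. one clock period.
C = 3.0e8                    # speed of light in a vacuum, m/s
signal_speed = 0.5 * C       # rough signal speed on a circuit-board trace
trace_length = 0.10          # 10 cm between two chips, in meters
clock_hz = 3.0e9             # a 3 GHz clock

delay_s = trace_length / signal_speed
period_s = 1.0 / clock_hz
print(delay_s / period_s)    # ~2: the one-way trip spans about two clock cycles
```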

Which brings us back to the unstoppable cannon ball and the immovable post.

The older designs with discrete components are the immovable post. By itself, this is not a problem.

With the system-on-chip designs of M1 and M2, Apple has built, essentially, an unstoppable cannon ball. They have left the universe of swappable components and entered the universe of system-on-chip.

You cannot have both. You cannot have a computer that has all components on a single chip, and still allows for pieces to be upgraded.

Now, you can have some computers in your line with swappable components, and others with system-on-chip designs. In that sense, you can have both.

But you cannot have a single computer with both. A computer is either totally integrated or it has replaceable components. Keep in mind that the total integration design has the much better performance.

Apple wants the Mac Pro to be the top of its product line, with replaceable components and the best performance in the line (and the priciest of price tags). I don't see a way to make this happen.

The performance of Apple's M2 Ultra processor is good. Really good. Better than the old, Intel-based, swappable component Mac Pro. A new, Intel-based, swappable component Mac Pro (using the latest processors and memory chips) could be faster than the old one, but not by much. It *may* be a little faster than the M2 Ultra, but it won't be *much* faster. It certainly won't be the flagship product that Apple wants.

Apple can build computers based on the M1 or M2 processor, and they will have top performance, but they won't have replaceable components. (The unstoppable cannon ball.) Apple can build computers with replaceable components (either Intel or AMD processors, or discrete processors based on the M1/M2 CPU) but they won't have the performance. (The immovable post.)

The idea of a top-tier computer system with replaceable parts is now a thing of the past. It probably has always been a thing of the past, as high performance computers have always integrated as much as possible. The notion of replaceable parts came from the hobbyist market and the original IBM PC, which wisely traded performance for flexibility. In the 1980s, when we had a poor understanding of what we wanted from computers, flexibility was the better choice.

Today, we have very definite ideas about our computers. We don't need to experiment with different video cards and memory configurations. We don't need to add network cards to some but not all computers. (Our manufacturers also have much better processes, and computer components are much more reliable. Computers run, and we have little need to replace a failed component.)

Apple could offer a Mac computer that has replaceable parts. It would be a low-end computer, not the high-end Mac Pro. I suspect that Apple will not make such a computer. It would be more expensive to produce, have a larger support effort (customers making mistakes and asking questions), and have limited appeal in the Apple fan base.

But Apple cannot build a high-end Mac Pro with replaceable components. It won't have the performance, and the Mac Pro is all about performance.

I think that Apple will build a Mac Pro, but with the "Extreme" variant of an M2 (or possibly M3) processor. The Mac Pro will be the only computer in Apple's line with the "Extreme" variant; other computers will use the plain, "Pro", "Max", or "Ultra" version of its processors. The new Mac Pro won't have replaceable parts, but it will have superior performance. People may be surprised, but I won't be one of them.

Thursday, February 9, 2023

AI answers may improve traditional search

Isaac Asimov, the writer of science and science fiction, described his experience with publishing houses as a writer. People had warned him to stay away from the publishing world, telling him that it was full of unscrupulous opportunists who would take advantage of him. Yet his experience was a good one; the publishers, editors, and others he worked with were (for the most part) honest, hard-working, and ethical.

Asimov had a conjecture about this. He surmised that for some time prior to his arrival as a writer, the publishing industry did have a large number of unscrupulous opportunists, and they gave the industry a bad reputation. He further theorized that when he started as an author, those individuals had moved on to a different industry. Not because of his arrival, but because there was a newer, larger, and more lucrative industry to take advantage of individuals. It was the movie industry that provided a better "home" for those individuals. Once they saw that movies were the richer target, they abandoned the publishing industry, and left the ethical people (who really wanted to work in publishing) behind.

I don't recall that Asimov proved his conjecture, but it has a good feel to it.

What does this have to do with software? Well, not much for the programming world, but maybe a lot for the online search world.

Search engines (Google, Bing, DuckDuckGo, and others) make a valiant attempt to provide good results, but web sites use tricks to raise their rankings in the search engines. The result is that today, in 2023, many searches work poorly. Searches to purchase something work fairly well, and some searches for answers (when does the Super Bowl start) tend to be relevant, but many queries return results that are not helpful.

As I see it, web site operators, in their efforts to increase sales, have hired specialists to optimize their rankings in search engines, leading to an endless race to outdo the competition. The result is that search engines provide little in the way of "organic" results and too many "sponsored" or "optimized" responses.

The situation with search engines is, perhaps, similar to the pre-Asimov era of publishing: full of bad operators that distort the product.

So what happens with the new AI-driven answer engines?

If people switch from the old search engines to the new answer engines, we can assume that the money will follow. That is, the answer engines will be popular, and lead to lots of ad revenue. When the revenue shifts from search engines to answer engines, the optimizations will also shift to answer engines. Which means that the efforts to game search engines will stop, and search engines can drift back to organic results.

This change occurs only if the majority of users switch to the answer engines. If a sizable number of people stay on the older search engines, then the gains from optimizing results will remain, and the optimization games will continue.

I'm hoping that most people do switch to the new answer engines, and a small number of people -- just enough for search engines to remain in business -- keep using the older engines.

Wednesday, February 1, 2023

To build and to maintain

I had the opportunity to visit another country recently (which one doesn't matter) and I enjoyed the warmer climate and the food. I also had the opportunity to observe another country's practices for building and maintaining houses, office buildings, roads, bridges, and other things.

The United States is pretty good at building things (roads, bridges, buildings, and such) and also good at maintaining them. The quality of construction and the practices for maintenance vary, of course, and overall governments and large corporations are better at them than small companies or individuals.

In the country I visited, the level of maintenance was lower. The culture of the country is such that people in the country are good at building things, but less concerned with maintaining them. This was apparent in things like signs in public parks: once installed they were left exposed to the elements where they faded and broke in the sun and wind.

My point is not to criticize the country or its culture, but to observe that maintaining something is quite different from building it.

That difference also applies to software. The practices of maintaining software are different from the practices of constructing software.

Software does not wear or erode like physical objects. Buildings expand and contract, develop leaks, and suffer damage. Software, stored in bits, does not expand and contract. It does not develop leaks (memory leaks aside). It is impervious to wind, rain, and fire. So why do I say that software needs maintenance?

I can make two arguments for maintenance of software. The first argument is a cyber-world analog of damage: The technology platform changes, and sometimes the software must change to adapt. A Windows application, for example, may have been designed for one version of Windows. Windows, though, is not a permanent platform; Microsoft releases new versions with new capabilities and other changes. While Microsoft makes a considerable effort to maintain compatibility, there are times when changes are necessary. Thus, maintenance is required.

The second argument is less direct, but perhaps more persuasive. The purpose of maintenance (for software) is to ensure that the software continues to run, possibly with other enhancements or changes. Yet software, when initially built, can be assembled via shortcuts and poor implementations -- what we commonly call "technical debt". Often, those choices were made to allow for rapid delivery.

Once the software is "complete" -- or at least functional -- maintenance can be the act of reducing technical debt, with the goal of allowing future changes to be made quickly and reliably. This is not the traditional meaning of maintenance for software, yet it seems to correspond well with the maintenance of "real world" objects such as automobiles and houses. Maintenance is work performed to keep the object running.

If we accept this definition of maintenance for software, then we have a closer alignment of software with real-world objects. It also provides a purpose for maintenance: to ensure the long-term viability of the software.

Let's go back to the notions of building and maintaining. They are very different, as anyone who has maintained software (or a house, or an automobile) can attest.

Building a thing (software or otherwise) requires a certain set of skills and experience.

Maintaining that thing requires a different set of skills and experience. Which probably means that the work for maintenance needs a different set of management techniques, and a different set of measurements.

And building a thing in such a way that it can be maintained requires yet another set of skills and experience. And that implies yet another set of management techniques and measurements.

All of this may be intuitively obvious (like solutions to certain mathematics problems were intuitively obvious to my professors). Or perhaps not obvious (like solutions to certain math problems were to me). In either case, I think it is worth considering.