Thursday, February 17, 2022

My guesses about the Metaverse

Facebook is committed to the Metaverse. They are so committed that they changed the name of the company from "Facebook" to "Meta".

But what, exactly, is the metaverse? Facebook -- excuse me, Meta -- has provided only vague descriptions.

I have a few ideas. I start with some assumptions:

First, the metaverse, for Meta, will be a source of income. Meta will make money -- somehow -- with the metaverse offering.

Second, that income will probably come from advertising. Advertising is what Meta knows. I expect them to use that expertise.

Third, the metaverse will run on Meta devices, and not Apple phones (or Android phones). Meta will do this to avoid the tax that Apple collects on transactions, and to collect data on its own users. Apple's recent moves to increase privacy on its phones will provide an incentive for Meta to build its own platform.

Fourth, the platform will be an "immersive" (we're going to see that word a lot, I fear) one that uses an over-the-eyes display, headphones, and a microphone. There may be a few other pieces, but the display, headphones, and microphone are the important parts.

Given those assumptions, what will we see in the metaverse?

To sell advertising, Meta needs users. It needs users who spend a lot of time on the platform. The more time a user spends in the metaverse, the more opportunities Meta has to show them advertisements. Therefore, the content in the metaverse will be designed to attract and retain attention.

I expect that the metaverse will be closer to a video game than a web page. Instead of text and photographs, the metaverse will rely on animation and sound.

But the metaverse won't look and feel like a typical video game. Video games require too much attention, and if one is concentrating on the game then one is not paying attention to advertisements. (Also, not everyone wants to play action-packed video games.)

I think the metaverse will have a mix of fast-paced and slow-paced attractions. It may have video games (especially multi-player video games), and it may have pastoral activities such as a walk in a virtual park. (A walk which you can take with friends, and during which you can meet people.) It can have real locations and fictional locations. One could visit the Eiffel Tower in Paris, for example, or the Grand Canyon. Or maybe one could visit a completely imaginary place such as Middle Earth and Hobbiton.

The metaverse may even have group sessions for things like virtual yoga classes or virtual bird watching.

How will Meta build all of these virtual locations? My guess is that they will build some, and rely on others to build more. They may ask game companies, which have experience with virtual locations, to build games for the metaverse, or to assist others in building virtual-world counterparts to real-world locations such as museums and tourist destinations, or entirely fictional worlds.

Advertisements will be video, too. Instead of static text that pops up, or simple photographs, advertisements will be interactive video, fitted into the current virtual world. They may be delivered by avatars. When walking through a virtual park, one may encounter a talking squirrel that mentions a movie, or a book, or a restaurant.

That's my vision of the metaverse. Meta faces some challenges in this effort.

One challenge is getting other content providers on board. The creation of a virtual world is a significant effort, much larger than that of building a web page.

Another is interaction. The equipment needed to access the metaverse (an over-the-eyes display, headphones, and a microphone, in my guess) allows for limited input. Voice recognition seems the least clumsy approach, although I'm not sure that the technology is quite ready. (Also, a room full of people all in the metaverse and speaking into their microphones will be ... noisy.) Another approach is gestures, but over-the-eyes displays can detect little more than head turns and perhaps nods and shakes. For complex input, something more is needed. I'm not sure what that will be.

The biggest challenge may be non-technical. Facebook was successful because of the network effect. Once a person joined, they sent e-mails to their friends, asking them to join. Facebook got big from this effect. So big that it surpassed its predecessor, MySpace (which surpassed its predecessor, Friendster).

Facebook, at the moment, has a lot of users but little in the way of goodwill. It will take a lot of convincing to get people to join this new metaverse. Meta will handle this with... advertising.

So those are my guesses. The metaverse will be a platform for delivering advertisements, attracting users with interactive video content. Meta will use its own platform, bypassing Apple and Google and their taxes, restrictions, and rules.

Let's see what Meta delivers!

Sunday, January 23, 2022

Will Microsoft Change from Windows to Linux?

People, from time to time, ask about Microsoft changing from Windows to Linux. When they do, lots of people respond. The responses fall into two general categories: Microsoft will switch to Linux because it is the superior operating system, and Microsoft will stick with Windows because it is the superior operating system.

The rebuttals are always -- always -- in the technical realm. Linux is better at this, and Windows is better at that.

I have a different response.

Microsoft will switch from Windows to Linux if, and when, it is in Microsoft's interest to switch.

In the 1990s and 2000s, Windows was a key part of Microsoft's strategy. Microsoft sold software (or licenses for software, which amounts to the same thing) and used Windows as the base for its other products. Office ran on Windows (the versions for Mac OS were a special case). SQL Server ran on Windows. Internet Explorer ran on Windows. Outlook ran on Windows, and talked to Exchange, which also ran on Windows. Visual Studio ran on Windows. SourceSafe ran on Windows (and Unix, because it had been developed by an independent company and sold to Microsoft).

During that period, Microsoft would never consider switching from Windows to Linux. Such a move would destroy Microsoft's strategy of "everything on Windows".

Today, Microsoft offers services that extend beyond Windows, and some of them use Linux. Azure provides cloud services. One can provision Linux servers as well as Windows servers (and pay Microsoft for both). Microsoft has less incentive to force customers to use Windows.

In addition, Microsoft is moving its apps into the cloud and onto the web. One can open and edit Word documents and Excel spreadsheets in a browser. (The online versions of Word and Excel are limited compared to the locally-installed versions. I expect the online versions to improve over time.) Microsoft has also created a cloud-based, web version of Visual Studio Code, which lets programmers collaborate across multiple operating systems.

Microsoft has dropped the "everything on Windows" strategy in favor of a "sell services and subscriptions" strategy. The new strategy doesn't require Windows to be at the center of the customer experience.

Will Microsoft replace Windows with Linux? The proper way to look at the question is not in the technical realm, but in the financial realm. If Microsoft can make more money with Linux than Windows, it should (and probably will) offer Linux.

Windows provides an income stream, in the form of licenses. Microsoft is moving from a "buy once until you upgrade" approach to an annual subscription. The latter is more predictable, for both Microsoft and customers, and seems to provide higher revenue to Microsoft. But the point is that Windows provides income to Microsoft.

Windows is also an expense for Microsoft. The development, maintenance, and support for Windows requires time and effort in significant quantities.

The question then becomes: which is the higher number? Does revenue cover expenses (and then some)? Or does Windows cost more to maintain than it brings in revenue?

For now, Microsoft's cloud-based web applications offer less than the locally-installed applications; the local versions provide more to the customer. Some day that may change. Until it does, those advantages translate into incentives to support Windows.

Technical arguments can be fun. They can also be heated. But they are not the way to convince Microsoft to switch to Linux. Or to stay with Windows. The decision is a financial one, not a technical one.


Wednesday, January 12, 2022

Successful programming languages

The IT world has seen a number of programming languages. Some became popular and some did not.

With more than half a century of experience, we can see some patterns in languages that became popular. First, let's review some of the popular programming languages.

FORTRAN and COBOL were successful because they met a business need (or two, perhaps). The primary need was for a programming language that was easier to understand than assembly language. The secondary need was for a programming language that could be used across computers of different manufacturers, allowing companies to move programs from one vendor's hardware to another. Both FORTRAN and COBOL met those needs.

BASIC was modestly successful on minicomputers and wildly successful on personal computers. It was possible to run BASIC on the small PCs (sometimes with as little as 4K of memory!). It was easy to use, and amateur programmers could write and run programs with little effort. It filled the technical spaces of timesharing on minicomputers and of personal computers.

C became popular in the Unix world because it was included in Unix distributions. If you ran Unix, you most likely programmed in C. One could say that it was pushed upon the world by AT&T, the owners of Unix.

SQL became successful in the 1980s, just as databases became available and popular. Prior to databases, computer systems offered "file managers" and "file access libraries" which allowed basic operations on records but not on tables. Each library had its own set of capabilities and its own API. SQL provided a common set of operations and a common API. It allowed businesses to move easily from one database to another, guaranteed a core set of operations, and permitted a broad set of programmers to work on multiple systems.
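
To make that concrete, here is a minimal sketch (my own, with invented table and column names) using Python's built-in sqlite3 module. The point is the portability: the CREATE, INSERT, and SELECT statements are standard SQL, and would run essentially unchanged on most SQL databases; only the connection setup differs.

    # A sketch of SQL's common API, using Python's built-in sqlite3 module.
    import sqlite3

    conn = sqlite3.connect(":memory:")   # an in-memory database for the demo
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 19.95), (2, 42.00), (3, 7.50)])

    # The portable part: a standard SQL query, the same on most databases.
    for row in conn.execute("SELECT id, amount FROM orders WHERE amount > 10"):
        print(row)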

C++ became popular because it solved a problem with C programs, namely the organization of large programs. C++ offered object-oriented concepts of classes, inheritance, encapsulation, and polymorphism, each of which helps organize code.

Visual Basic became popular because it provided an easy way to write programs for Windows. Microsoft's earlier Visual C++ required knowledge of the Windows API and lots of discipline. Visual Basic required neither, hiding the Windows API from the programmer and providing safety in the programming language.

Objective-C became popular because Apple used it for programming applications for Macintosh computers. (Later, when Apple switched to the Swift programming language, interest in Objective-C plummeted.)

Java became popular because it promised that people could write programs once and then run them anywhere. It did a good job of delivering on that promise, too.

C# is Microsoft's version of Java (its second version, after Visual J++) and is popular largely because Microsoft pushes it. If Microsoft were to disappear overnight, interest in C# would drop dramatically.

Swift is Apple's language for development of applications for the iPhone, iPad, and other Apple products. It is successful because Apple pushes it upon the world.

JavaScript became popular because it was ubiquitous. Much like BASIC on all PCs, JavaScript was in all browsers, and the combination of HTML, the DOM, and JavaScript allowed for web applications with powerful processing in the browser.

Python became popular because it was a better Perl. Python had a simpler syntax and also had object-oriented programming built in.
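
As a small illustration (a toy of my own, not drawn from any real migration): classes, inheritance, and polymorphism are part of the Python language itself, with a fairly light syntax.

    # Object-oriented programming built into the language: a base class,
    # a subclass, and polymorphic behavior, in a handful of lines.
    class Animal:
        def __init__(self, name):
            self.name = name

        def speak(self):
            return f"{self.name} makes a sound"

    class Dog(Animal):           # inheritance
        def speak(self):         # polymorphism: override the base behavior
            return f"{self.name} says woof"

    for animal in [Animal("Generic"), Dog("Rex")]:
        print(animal.speak())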

Notice that, with the possible exception of Rust replacing C or C++ (more on that below), new languages become popular in new spaces. They don't replace an existing popular language. BASIC didn't replace COBOL or FORTRAN; it became popular in the new spaces of timesharing and personal computers. C# didn't replace Java; it joined Visual Basic in the Microsoft space and slowly gained in popularity as Microsoft supported it more than Visual Basic.

So we can see that there are a few reasons that languages become popular:

  • A vendor pushes it
  • It solves a commonly-recognized business problem
  • It fills a technical space

If we accept these as the reasons that languages become popular, we can make some predictions about which new languages will become popular. We can say that, if a major vendor pushed a language for its projects (a vendor such as Amazon, for example), then that language would become popular.

Or, we could say that a new technical space would allow a new language to become popular.

Or, if there is a commonly recognized business problem with our current set of programming languages, a new language that solves that problem would become popular.

So what do we see?

If we accept that C and C++ have problems (memory management, buffer overflows) and we accept that those problems are commonly recognized, then we can foresee the replacement of C and C++ with a different language, one that addresses those problems. The prime contender is Rust. We may see a gradual shift from C and C++ to Rust, as more and more people develop confidence in Rust and grow wary of memory-management and buffer-overrun issues.
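
For concreteness, the bug class at issue is the out-of-bounds access: in C or C++, writing past the end of a buffer can silently corrupt neighboring memory. A memory-safe language checks every access. Here is a tiny sketch of the safe behavior (shown in Python to keep one language across these examples; Rust rejects most such errors at compile time and stops the program safely at run time for the rest):

    # A memory-safe language checks every index; an out-of-bounds access
    # raises an error instead of silently corrupting adjacent memory.
    buffer = [0] * 8

    try:
        buffer[8] = 42           # one element past the end: a classic overrun
    except IndexError as error:
        print("caught:", error)  # the program stops safely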

One technical space that could provide an opportunity for a new programming language is the Internet of Things (IoT). Appliances and devices must communicate with each other and with servers. I suspect that the IoT space is more in need of protocols than of programming languages, but perhaps there is room for a programming language that works with new protocols to establish trusted connections and, more importantly, trusted updates.
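
As an example of what "trusted updates" might mean (my own sketch, not any real IoT protocol; the key and function names are invented): a device accepts new firmware only if it can verify that the firmware came from the vendor. A minimal version with a shared secret and an HMAC could look like this; a real system would use public-key signatures.

    # A sketch of a "trusted update" check: accept an update only if its
    # HMAC matches one computed with the vendor's key. Real systems would
    # use public-key signatures rather than a shared secret.
    import hashlib
    import hmac

    VENDOR_KEY = b"not-a-real-key"   # hypothetical; provisioned at manufacture

    def sign_update(firmware: bytes) -> bytes:
        return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()

    def verify_update(firmware: bytes, signature: bytes) -> bool:
        expected = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)  # constant-time compare

    firmware = b"new firmware image"
    signature = sign_update(firmware)
    print(verify_update(firmware, signature))          # True: install it
    print(verify_update(firmware + b"!", signature))   # False: reject it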

A third area is a teaching language. BASIC was designed, initially, for people not familiar with programming. Pascal was designed to teach structured programming. Do we have a language designed to teach object-oriented programming? To teach functional programming? Do we even have a language to teach basic programming? BASIC and Pascal have long been discarded by the community, and introductory courses now often use Java, JavaScript, or Python, all of which are rather complex languages. A new, simple language could gain popularity.

As we develop new devices for virtual reality, augmented reality, and other aids, we may need new programming languages.

Those are the areas that I think will see new programming languages.

Tuesday, January 11, 2022

Predictions for 2022

Let's start with some simple, obvious predictions:

Apple will announce its series of M2 chips (or at least the first chip of the M2 series) and will focus on products for individuals such as heads-up displays and appliances for the home.

Microsoft will focus on security and collaboration. It has already announced its "Pluton" chip that integrates into CPUs. It already has collaboration in its office and development products (Word, Excel, Visual Studio). I expect that Microsoft will enhance those capabilities. Oh, and deliver a revised version of Windows 11.

Facebook will deal with its current scandals (anti-trust actions, harmful content with no mitigation) and possibly some new ones. I expect to see changes in senior management and in the organization of the company. Mark Zuckerberg will either leave or be forced out.

Amazon... will continue to do what it has been doing. It will continue to offer online sales, cloud services, and a smattering of consumer devices.

Google will continue to develop Android and ChromeOS, and will continue to merge the two. Google's cloud services run a distant third after Amazon and Microsoft; I think we will see development and advertising for Google's offerings.

In other words, things will be fairly boring.

Some less obvious predictions:

No new programming languages. At least, no new popular programming languages will arise in 2022. The current set of programming languages (C#, Java, Python, JavaScript, C++, SQL) is meeting the needs of industry. There will be lots of new languages, of course. People invent programming languages and each week sees an announcement of a new language. But the popular languages will remain popular.

No major changes to hardware. Laptop and desktop computers, smart phones, and tablets are supplying the needed computing power. There is no compelling reason to change. (Microsoft and PC vendors may envy Apple's system-on-chip offerings, but won't be in a position to offer comparable hardware for some time. Instead, look to Intel to provide bigger and faster chips, as it announced just this month.)

Work-from-home, or remote work, will continue to be important in 2022. But 2022 will be different, in that companies will commit more firmly to remote work. It will be CFOs that lead this change; they will insist that companies stop paying for office space that remains vacant. CEOs may stall, but the finances are very persuasive. Corporate boards will look at the numbers and side with the CFOs on this issue.

NFTs (non-fungible tokens) will see a slow decline of interest. Current interest is driven by novelty and potential; the future may see useful functions associated with them.

The "Internet of Things" or IOT (sometimes IoT) will see some growth as multiple companies make internet-connected appliances and devices. We already have light bulbs, door cameras, home temperature systems, and tracking tags. Apple's idea of iPhones as connection points will be copied by Google, and Android phones will also serve as connection points. (Although the two systems will be incompatible, much like the early days of telephone service which saw multiple companies with independent networks serving the same town or city.)

All in all, 2022 will be a year of modest changes, small improvements, and few surprises.

Or so I think.

Monday, January 3, 2022

The biggest gains of Apple's M processors are behind us

Improvements in hardware are not linear. If we look at the performance of hardware over time, we can see that performance improvements follow a pattern: a sharp rise in performance followed by a period of little improvement. (A graph looks like a staircase, with the pattern of "rise, flat, rise, flat".)

The first implementation of a change provides significant increases. Over time, we refine the improvements and gain additional increases. But those later increases are smaller. Eventually, subsequent refinements provide minimal improvements. So we move on to other ideas.

The history of personal computers follows this pattern. We have made a number of changes to hardware that have improved performance. Each of those changes yielded a large initial gain, and then gradually diminishing improvements.

We have increased the clock speed.

We changed memory technology from "core" memory (ferrite rings) to transistor-based memory (static at first, and then dynamic memory).

We have added caches, to store values in the processor and reduce the dependence on memory. We liked the idea of caches so much that we did it more than once: processors now have three levels of cache.

We have off-loaded work to (smarter) devices. Devices now have their own processors and can perform tasks independently of the CPU.

We have increased the number of CPU cores, which improves performance for systems with multiple processes and multiple threads. (Which is just about every system we have today.)

Each of these changes improved performance. A large step up at first, and then smaller increases.
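
Of these changes, the benefit of multiple cores is the easiest to demonstrate. A toy sketch (my own, with made-up work): the same CPU-bound task, run first in a single process and then split across four worker processes using Python's standard library. On a multi-core machine, the split version finishes in a fraction of the time.

    # A toy demonstration of multi-core speedup: the same CPU-bound work,
    # first in one process, then split across four worker processes.
    import time
    from concurrent.futures import ProcessPoolExecutor

    def busy_work(n):
        # Made-up CPU-bound work for the demonstration.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        chunks = [5_000_000] * 4

        start = time.perf_counter()
        serial = [busy_work(n) for n in chunks]           # one core
        print("serial:  ", time.perf_counter() - start)

        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=4) as pool:
            parallel = list(pool.map(busy_work, chunks))  # several cores
        print("parallel:", time.perf_counter() - start)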

Now Apple has used another method to improve performance: reducing the distance between components with system-on-a-chip designs. The M1 chips include most components of the computer: CPU, GPU, memory, and more.

Yet the overall pattern of improvements will hold with this new design. The first M1 chip will have significant improvements over the older design of discrete components. The M1 Pro and M1 Max will have improvements over the M1 chip, but not larger than the initial M1 gains.

Later chips, such as the M2, M2 Pro, and even the M3, will have gains, but less and less (in terms of percentages) than the previous chips. The performance curve, after a sharp rise with the M1 chip, will flatten. Apple will have entered the "plain of modest gains" phase.
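
To put rough numbers on that intuition (the scores below are entirely made up, for illustration only): even when each chip is faster than the last, the percentage gain can shrink with each generation.

    # Hypothetical performance scores illustrating the flattening staircase:
    # each chip is faster, but the percentage gain shrinks.
    scores = [("M1", 100), ("M1 Pro", 130), ("M2", 145), ("M3", 152)]

    for (_, previous), (chip, score) in zip(scores, scores[1:]):
        gain = (score - previous) / previous * 100
        print(f"{chip}: +{gain:.0f}% over the previous chip")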

Apple's M1 chips are nice. They provide good performance. Newer versions will be better: faster, more powerful. But the biggest increases, I think, are already behind us.

Wednesday, December 29, 2021

Something is rotten in the state of Apple

Apple recently announced a set of (rather large) retention payments for some engineers. (I assume that these are senior engineers.) The payments are in the form of restricted stock units which vest over a four-year period. Apple is making these payments to prevent employees from leaving for other companies. It's not clear, but these appear to be one-time payments. (That is, this is not a new, recurring payment to employees.)

This is not a good idea.

If employees are leaving Apple, that is a problem. And the problem has two (for this post) possible causes: employees are unhappy with wages, or employees are unhappy with something else.

If employees are unhappy with wages, it is quite likely that wages are not in line with other companies. If that is the problem, then the correct action is to adjust wages to competitive levels. A one-time payment (even spread over years) will not fix the problem.

I tend to think that Apple's wages are competitive.

If employees are unhappy with something other than wages, then a monetary payment does not fix the problem. Employees could be unhappy about many things: working conditions, opportunities to work on fun projects, opportunities for professional growth, political involvement of the company, or projects deemed unethical by employees. (And more!)

Monetary compensation does not fix any of those problems. The problems remain, and the unhappiness of employees remains.

I suspect that in the middle of the COVID-19 pandemic, Apple requested (demanded?) that employees return to the office, and a number of employees -- senior-level employees -- refused. Not only did they refuse, they refused with "and if you make me return to the office, I will leave Apple and work for someone else". (This is all speculation. I am not employed by Apple, nor have I spoken with any Apple employees.)

Apple has not one but two problems.

The first problem is that employees are leaving, and Apple has addressed that problem with bonus payments.

The second problem is that Apple thinks this tactic is a good one. (They must think it is a good idea. Otherwise they would not have done it.)

Payments of this type must have been discussed at the highest levels of Apple. I suspect that even members of the board of directors were consulted. Did anyone object to these payments?

In the end, Apple decided to pay employees to address issues that are not related to compensation. It is the wrong solution to the problem, and it will probably make the situation worse. Bonus payments were offered to some employees. The other employees will resent that. Bonus payments vest over four years. After the last payment, employees who were given these payments will see their compensation drop significantly. They may resent that.

Apple has a problem. Retention payments don't address the underlying issues. Is Apple addressing the underlying issues? Possibly. (Apple would not advertise that.)

Let's see what happens at Apple over the next year. I expect to hear about changes in the organization, with the resignations of several senior executives.

I also expect that Apple will continue on its current course for its products and services. Those are sound, and have a good reputation with customers. I see no need to change them, other than the typical improvements Apple makes each year.

Tuesday, December 21, 2021

Moving fast and going far are not the same thing

There is an old saying: If you want to go fast, go alone; if you want to go far, go in a group.

One significant difference between Apple and Microsoft is that Apple manages product lines and Microsoft manages an ecosystem. That difference matters. Apple is, essentially, moving alone. It can (now) design its own hardware and software. Apple does still need raw materials, fabrication for its chips, manufacture of its cases and boxes, and assembly of components into finished goods. But Apple deals with only two types of entities: suppliers (the companies that supply raw materials, chips, etc.) and customers (the people and companies that purchase computers and services).

Microsoft, in contrast, lives in an ecosystem that includes suppliers, PC manufacturers, developers, and customers (both individual and organizational). While Microsoft does design its Surface tablets and laptops, those tablets and laptops are a small part of the larger market. The laptops and desktops made by Dell, Lenovo, HP, and others are a large portion of the market.

Apple can move quickly, changing its processors from Intel to Apple-designed ARM in less than two years. Microsoft, on the other hand, must move more cautiously. It cannot dictate that Windows will shift from Intel to ARM because Microsoft does not control the manufacturers of PCs.

If Microsoft wants to shift personal computers from the current designs of discrete components to system-on-chip designs (and I believe that they do) then Microsoft must persuade the rest of the ecosystem to move in that direction. Such persuasion is not easy -- PC makers have a lot invested in the current designs, and are familiar with gradual changes to improve PCs. For the past three decades, Microsoft has guided PC design through specifications that allow PCs to run Windows, and those specifications have changed gradually: faster processors here, faster bus connections there, faster memory at some times, better interfaces to graphics displays at other times. The evolution of personal computers has been a slow, predictable process, with changes that can be absorbed into the manufacturing processes of the PC makers.

The Microsoft "empire" of PC design has been, for all intents and purposes, successful. For thirty years we have benefitted from computers in the office and in the home, and those computers have (for the most part) been usable and reliable.

Apple benefitted from that PC design too. The Intel-based Mac and MacBook computers were designed in the gravity field of Windows. Those Mac computers were Windows PCs, capable of running Windows (and Linux) because they used the same processors, video chips, and bus interfaces as Windows PCs. They had to use those chips; custom chips would have been too expensive and risky to make.

Apple has now left that empire. It is free of the "center of gravity" that Windows provides in the market. Apple can now design its own processor, its own video chips, its own memory, its own storage. Apple is free! Free to move in any direction it likes, free to design any computer it wants.

I predict that Apple computers will move in their own direction, away from the standard design for Windows PCs. Each new generation of Apple computers will be less and less "Windows compatible". It will be harder and harder to run Windows (or Linux) on Apple hardware.

Microsoft has a new challenge now. They must answer Apple's latest M1 (and M2) system-on-chip designs. But they cannot upend the ecosystem. Nor can they abandon Intel and shift everything to ARM designs. Apple has leveraged its experience with its 'A' series chips in phones to build the 'M' series chips for computers. Microsoft doesn't have that experience, but it has something Apple doesn't: an ecosystem.

I predict that Microsoft will form alliances with other companies to build system-on-chip designs. Probably with IBM, to leverage virtual machine technology (and patents) and possibly Intel to leverage chip fabrication. (Intel recently announced that it was open to sharing its fabrication plants for non-Intel designs.)

[I hold stock in both Microsoft and IBM. That probably biases my view.]

Microsoft needs to build experience with system-on-chip designs, and alliances can provide that experience. But alliances require time, so I'm not expecting an announcement from Microsoft right away. The first system-on-chip designs may be tablets and simple laptops, possibly competing with Chromebooks. Those first simple laptops may take two years of negotiation, experimentation, design, assembly, and testing before anything is ready for market. (And even then, they may have a few problems.)

I think Microsoft can achieve the goal of system-on-chip designs. I think that they will do it with the combined effort of multiple companies. I think it will take time, and the very first products may be disappointing. But in the long run, I think Microsoft can succeed.

If you want to move fast, go alone; if you want to go far, go in a group.