
Thursday, February 17, 2022

My guesses about the Metaverse

Facebook is committed to the Metaverse. They are so committed that they changed the name of the company from "Facebook" to "Meta".

But what, exactly, is the metaverse? Facebook -- excuse me, Meta -- has provided only vague descriptions.

I have a few ideas. I start with some assumptions:

First, the metaverse, for Meta, will be a source of income. Meta will make money -- somehow -- with the metaverse offering.

Second, that income will probably come from advertising. Advertising is what Meta knows. I expect them to use that expertise.

Third, the metaverse will run on Meta devices, and not Apple phones (or Android phones). Meta will do this to avoid the tax that Apple collects on transactions, and to collect data on its users. Apple's recent moves to increase privacy on its phones will provide an incentive for Meta to build its own platform.

Fourth, the platform will be an "immersive" (we're going to see that word a lot, I fear) one that uses an over-the-eyes display, headphones, and a microphone. There may be a few other pieces, but the display, headphones, and microphone are the important parts.

Given those assumptions, what will we see in the metaverse?

To sell advertising, Meta needs users. It needs users who spend a lot of time on the platform. The more time a user spends in the metaverse, the more opportunities Meta has to show them advertisements. Therefore, the content in the metaverse will be designed to attract and retain attention.

I expect that the metaverse will be closer to a video game than a web page. Instead of text and photographs, the metaverse will rely on animation and sound.

But the metaverse won't look and feel like a typical video game. Video games require too much attention, and if one is concentrating on the game then one is not paying attention to advertisements. (Also, not everyone wants to play action-packed video games.)

I think the metaverse will have a mix of fast-paced and slow-paced attractions. It may have video games (especially multi-player video games), and it may have pastoral activities such as a walk in a virtual park. (A walk which you can take with friends, and in which you can meet people.) It can have real locations and fictional locations. One could visit the Eiffel Tower in Paris, for example, or the Grand Canyon. Or maybe one could visit a completely imaginary place such as Middle Earth and Hobbiton.

The metaverse may even have group sessions for things like virtual yoga classes or virtual bird watching.

How will Meta build all of these virtual locations? My guess is that they will build some, and rely on others to build more. They may ask game companies, who have experience with virtual locations, to build games for the metaverse or to assist others to build virtual-world counterparts to real-world locations such as museums, tourist destinations, or fantasy worlds.

Advertisements will be video, too. Instead of pop-up text and simple photographs, advertisements will be interactive video that fits into the current virtual world. They may be delivered by avatars. When walking through a virtual park, one may encounter a talking squirrel that mentions a movie, or a book, or a restaurant.

That's my vision of the metaverse. Meta faces some challenges in this effort.

One challenge is getting other content providers on board. The creation of a virtual world is a significant effort, far greater than building a web page.

Another is interaction. The equipment needed to access the metaverse (over-the-eyes display, headphones, and microphone, in my guess) allows for limited input. Voice recognition seems the least clumsy approach, although I'm not sure that the technology is quite ready. (Also, a room full of people all on the metaverse and speaking into their microphones will be ... noisy.) Another approach is gestures, but over-the-eyes displays can detect little more than head turns, nods, and shakes. For complex input, something more is needed. I'm not sure what that will be.

The biggest challenge may be non-technical. Facebook was successful because of the network effect. Once a person joined, they sent e-mails to their friends, asking them to join. Facebook got big from this effect. So big that it surpassed its predecessor, MySpace (which surpassed its predecessor, Friendster).

Facebook, at the moment, has a lot of users but little in the way of goodwill. It will take a lot of convincing to get people to join this new metaverse. Meta will handle this with... advertising.

So those are my guesses. The metaverse will be a platform for delivering advertisements, attracting users with interactive video content. Meta will use their own platform, bypassing Apple and Google and their taxes, restrictions, and rules.

Let's see what Meta delivers!

Wednesday, January 8, 2020

Predictions for 2020

I have some predictions for 2020. They may (or may not) be correct.

Hardware: Virtual desktops and the cloud

I expect 2020 to be the "year of the cloud", in a sense. While cloud computing is popular, I see the phrase "the cloud" becoming more popular in the upcoming year, even more popular than cloud computing itself. How can this be? How can the term be used more than the actual thing?

I expect that lots of people will use the term "the cloud" to mean online (or web-based) computing, even in situations that do not use actual, honest-to-goodness cloud computing.

We will see an interest in virtual desktops, specifically for Windows. Today's PCs are real PCs with operating systems and applications. In 2020, look for a push (by Microsoft) for virtual desktops -- instances of Windows hosted on servers and accessed by a browser or by Microsoft's Remote Desktop program.

Most folks will call this "Windows in the cloud" or "cloud computing". The former is the more accurate term, but the latter is not too wrong. Virtual desktops will be hosted on servers, with some applications built for cloud computing and others not. Microsoft's Office products (Word, Excel, and Outlook) will all reside in the cloud as true cloud applications. Applications from other vendors will run on virtual desktops but won't be true cloud applications.

Virtual Windows desktops offer several advantages to Microsoft: First, they are paid for as subscriptions, so Microsoft gets a steady cash flow. Second, Microsoft can move customers to the latest version of Windows 10. Third, and perhaps most importantly, Apple is not prepared to offer a matching service. (Apple remains in the world of "fat desktops" which run the operating system and the applications. They cannot move that experience to virtual workstations hosted in the cloud.)

Cloud-based Windows is not for everyone. Some will be unwilling to move to virtual desktops, and some will be unable to move. Anyone who insists on running an older version of Windows will remain in "fat desktop" land. And anyone who cannot install their software (perhaps because they have mislaid the install CD-ROM) will stay with their current desktop computers.

Those who do move to virtual desktops will be able to use simpler, lightweight computers (perhaps Chromebooks, perhaps computers with "Windows in S mode") that run little more than the software necessary to access the virtual desktop. The physical desktop computer will be a "terminal" to the virtual desktop. The benefits for those users will be cheaper hardware, more reliable desktops, invisible backups, versioning of data files, and the ability to shift work from one lightweight desktop "terminal" to another.

Programming languages: More Python, less Perl

Perl was perhaps the first programming language to be designated a "scripting language". (It won the designation, although other languages such as Awk and Csh predate it.)

But Perl has fallen on hard times. Developers have left Perl for Python, and the "Perl 6" effort, delayed and poorly advertised, has now re-branded itself as "Raku" to allow Perl 5 to continue without the cloud of eventual shutdown. The change comes late, and I fear that many have abandoned Perl in favor of Python.

Ruby is in a similar situation, with Python appealing to many developers and interest in Ruby waning.

Python is winning the "scripting race". Many schools teach it, and many experienced developers recommend it as a first language to learn. (I'm in that group.) It has good support with libraries, tools, and documentation. Microsoft supports it in "Visual Studio Code" and in "Visual Studio".

I expect Python to gain in popularity, and Perl and Ruby to decline.

Programming languages: Smaller languages

I expect to see a shift from object-oriented languages (Java, C#, and C++) to smaller scripting languages (Python and perhaps Ruby).

Object-oriented languages are effective for large, complex systems. They were just what we needed for the large, complex applications that ran on PCs and servers -- before cloud computing. With cloud computing, we build systems (large or small) out of services, and we strive to make services small. (A large application can be built of many small services.)

Java, C#, and C++ can be used to build small services, but often Python and other languages can be a better fit. The "big" object-oriented languages carry a lot of baggage to make object-oriented programming work; the smaller languages carry less.
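As a sketch of how little a small service needs in one of those smaller languages, here is a minimal HTTP service in plain Python, using only the standard library. The endpoint shape (`/greet/<name>`) and the JSON response are my own invented example, not from any particular system:

```python
# A minimal service sketch in Python, standard library only.
# The /greet endpoint and response format are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_response(path: str) -> dict:
    """Pure service logic, kept separate from the HTTP plumbing."""
    name = path.rstrip("/").split("/")[-1] or "world"
    return {"greeting": f"hello, {name}"}

class GreetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Build the JSON body from the pure logic function above.
        body = json.dumps(build_response(self.path)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run the service:
#   HTTPServer(("", 8000), GreetHandler).serve_forever()
```

Separating the pure logic (`build_response`) from the HTTP plumbing keeps the service easy to test, which is part of the appeal of the lightweight approach.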

Two possible contenders for building services may be Swift and Visual Basic. Swift is relatively new, and still undergoing changes. Visual Basic has evolved through several generations and Microsoft may create a smaller, lighter version for services. (These are guesses; I have no indication from Apple or Microsoft that such projects are underway or even under consideration.)

Programming languages: Safer languages

The languages Rust and Go are getting a lot of attention. Both are compiled languages, letting one build fast and efficient applications.

Rust and Go may challenge C and C++ as the premier languages for high-performance systems. C and C++ have had a good run, from the 1970s to today. But more and more, safety in programs is becoming important. The design of languages such as Rust and Go gives them an advantage over C and C++.

C and C++ will stay with us, of course. Many large-scale and popular projects are written in them. I don't expect those projects to convert to Rust, Go, or any other language.

I do expect new projects to consider Rust and Go as candidate languages. I also expect teams converting existing systems from their current technology (which could be COBOL, Fortran, or even C and C++) to think about Rust and Go.

Containers: Nice for deployment, little else

Containers were quite popular in the past year, and I expect that they will remain popular. They are convenient ways of deploying (or sharing with others) a complete application, all ready to go.

Containers offer no benefits beyond that, however, and they are helpful only to the people who must deploy applications. Therefore, containers, like virtual machines, will quietly move to the realm of sysadmins.

Development: better tech than ever

For the year 2020, the picture for development looks rather nice. We have capable languages (and lots of them); stable networks, servers, and operating systems; powerful tools for testing, deployment, and monitoring; and excellent tools for communication and collaboration. Problems in development will not be technical in nature -- which means that the challenges we face will be with people and interactions.

For developers (and project managers), I expect to see interest in collaboration tools. That includes the traditional tools such as version control, item tracking, and messaging tools such as Slack. We might see interest in new tools that have not yet been introduced.

Summary

I see a bright future for development. We have good tools and good practices. The challenges will be people-oriented, not technology-oriented. Let's enjoy this year!

Thursday, January 10, 2019

Predictions for tech in 2019

Predictions are fun! They allow us to see into the future -- or at least claim that we can see into the future. They also allow us to step away from the usual topics and talk about almost anything we want. Who could resist making predictions?

So here are my predictions for 2019:

Programming languages: The current "market" for programming languages is fractured. There is no one language that dominates. The ten most popular languages (according to Tiobe) are Java, C, Python, C++, VB.NET, C#, JavaScript, PHP, SQL, and Objective-C. The top ten are not evenly distributed; Java and C are in a "lead group" and the remaining languages are in a second group.

O'Reilly lists Python, Java, Go, C#, Kotlin, and Rust as languages to watch for 2019. Notice that this list is different from Tiobe's "most popular" -- Rust and Kotlin show on that index in positions 34 and 36, respectively. Notably absent from O'Reilly's list are C++ and Perl.

For 2019, I predict that the market will remain fragmented. Java will remain in the lead group unless Oracle, who owns Java, does something that discourages Java development. (And even then, so many systems are currently written in Java that Java will remain in use for years. Java will be the COBOL of the 2020s: used in important business systems but not liked very much by younger developers.) C will remain in the lead group. (The popularity of C is hard to explain. But whatever C has, people like.)

Fragmentation makes life difficult for managers. Which languages should their teams use? A single leader makes the decision easy. The current market, with multiple capable languages, allows for debates about development languages. An established project provides the argument of sticking with the current programming language; a new project (with no existing code) makes the decision somewhat harder. (My advice: pick a popular language that gets the job done for you. Don't worry about it being the best language. Good enough is... good enough.)

Operating systems: Unlike the "market" for programming languages, the "market" for operating systems is fairly uniform. I should say "markets": we can consider the desktop/laptop segment, the server segment, and possibly a cloud segment. For the desktop, Windows is dominant, and will remain dominant in 2019. Windows 10 is capable and especially good for large organizations who want centralized administration. MacOS is used in a number of shops, especially smaller organizations and startups, and will continue to have a modest share.

For servers, Linux dominates and will continue to dominate in 2019. Windows runs some servers, and will continue to, especially in organizations who consider themselves "Microsoft shops".

The interesting future for operating systems is the cloud segment. Cloud services run on operating systems, usually Linux or Windows, but this is changing on two fronts. The first is the hypervisor, which sits below the virtual operating system in a cloud environment; the second is containers, which sit above the virtual operating system (and which contain an application).

Hypervisors are well-understood and well established. Containers are new (well, new-ish) and not as well understood, but gaining acceptance. Between the two sits the operating system, which is coming under pressure as hypervisors and containers perform tasks that were traditionally performed by operating systems.

In the long run, hypervisors, containers, and operating systems will achieve a new equilibrium, with operating systems doing less than they have in the past. The question will not be "Which operating system for my cloud application?" but instead "Which combination of hypervisor, operating system, and container for my cloud application?". And even then, there may be large shops that use a mixture of hypervisor, operating system, and container for their applications.

Virtual reality and augmented reality: Both will remain experimental. We have yet to find a "killer app" for augmented reality, something that combines real-world and supplied visuals in a compelling application.

Cloud services: Amazon.com dominates the market, and I see little to change that. Microsoft and Google will maintain (and possibly increase) their market shares. Other players (IBM, Dell, Oracle) will remain small.

The list of services available from cloud providers is impressive and daunting. Amazon is in a difficult position; its services are less consistent than Microsoft's and Google's. Both Microsoft and Google came into the market after Amazon and developed their offerings more slowly. The result has been a smaller market share but a more consistent set of services (and, I dare say, a better experience for the customer). Amazon may change some services to make things more consistent.

Phones: Little will change in 2019. Apple and Android will remain dominant. 5G will get press and slow roll-out by carriers; look for true implementation and wide coverage in later years.

Tablets: 2019 may be the "last year of the tablet" -- at least the non-laptop convertible tablet. Tablet sales have been anemic, except for iPads, and even those are declining. Apple could introduce an innovation to the iPad which increases its appeal, but I don't see that. (I think Apple will focus on phones, watches, earphones, and other consumer devices.)

I see little interest in tablets from other manufacturers, probably due to the lack of demand by customers. As Android is the only other (major) operating system for tablets, innovation for Android tablets will have to come from Google, and I see little interest from Google in tablets. (I think Google is more interested in phones, location-based services, and advertising.)

In sum, I see 2019 as a year of "more of the same", with few or no major innovations. I suspect that the market for tech will, at the end of 2019, look very much like the market for tech at the beginning of 2019.

Wednesday, May 31, 2017

How many computers?

Part of the lore of computing discusses the mistakes people make in predictions. Thomas J. Watson (president of IBM) predicted the need for five computers -- worldwide. Ken Olson, founder and president of DEC, thought that no one would want a computer in their home.

I suspect that we repeat these stories for the glee that they bring. What could be more fun than seeing important, well-known people make predictions that turn out to be wrong?

Microsoft's goal, in contrast to the above predictions, was a computer in every home and on every desk, and each of them running Microsoft software. A comforting goal for those who fought in the PC clone wars against the mainframe empire.

But I'm not sure that T. J. Watson was wrong.

Now, before you point out that millions (billions?) of PCs have been sold, and that millions (billions?) of smartphones have been sold, and that those smartphones are really computers, hear me out.

Computers are not quite what we think they are. We tend to think of them as small, stand-alone, general-purpose devices. PCs, laptops, smartphones, tablets... they are all computers, right?

Computers today are computing devices, but the boundary is not so clear. Computers are useful when they are part of a network, and connected to the internet. A computer that is not connected to the internet is not so useful. (Try an experiment: Take any computer, smartphone, or tablet and disconnect it from the network. Now use it. How long before you become bored?)

Without e-mail, instant messages, and web pages, computers are not that interesting -- or useful.

The boxes we think of as computers are really only parts of a larger construct. That larger construct is built from processors and network cards and communication equipment and disks and server rooms and software and protocols. That larger "thing" is the computer.

In that light, we could say that the entire world is running on one "computer" which happens to have lots of processors and multiple operating systems and many keyboards and displays. Parts of this "computer" are powered at different times, and sometimes entire segments "go dark" and then return. Sometimes individual components fail and are discarded, like dead skin cells. (New components are added, too.)

So maybe Mr. Watson was right, in the long run. Maybe we have only one computer.

Sunday, January 2, 2011

Predictions for 2011

Happy New Year!

The turning of the year provides a time to pause, look back, and look ahead. Looking ahead can be the most fun, since we can make predictions.

Here are my predictions for computing in the coming year:

Tech that is no longer new

Virtualization will drop from the radar. Virtualization for servers is no longer exciting -- some might say that it is "old hat". Virtualization for the desktop is "not quite fully baked". I expect modest interest in virtualization, driven by promises of cost reductions, but no major announcements.

Social networks in the enterprise are also "not quite fully baked", but here the problem is with the enterprise and its ability to use them. Enterprises are built on command-and-control models and don't expect individuals to comment on each other's projects. When enterprises shift to results-oriented models, enterprise social networks will take off. But this is a change in psychology, not technology.

Multiple Ecosystems

The technologies associated with programming are a large set, and not a single bunch. Programmers seem to enjoy "language wars" (C# or C++? Java or Visual Basic? Python or Ruby?) and the heated debates continue in 2011. But beyond languages, the core technologies are bunched: Microsoft has its "stack" of Windows, .NET, C#, and SQL Server; Oracle with its purchase of Sun has Solaris, JVM, Java, Oracle DB, and MySQL; and so forth.

We'll continue to see the fracturing of the development world. The big camps are Microsoft, Oracle, Apple, Google, and open source. Each has their own technology set and the tools cross camps poorly, and I expect the different ecosystems will continue to diverge. Since each technology set is too large for a single person to learn, individuals must pick a camp as their primary skill area and forgo other camps. Look to see experts in one environment (or possibly two) but not all.

Companies will find that they are consolidating their systems into one of the big five ecosystems. They will build new systems in the new technology, and convert their old systems into the new technology. Microsoft shops will convert Java systems to C#, Oracle shops will convert Visual Basic apps to Java, and everyone will want to convert their old C++ systems to something else. (Interestingly, C++ was the one technology that spanned all camps, and it is being abandoned or at least deprecated by employers and practitioners.)

Microsoft will keep .NET and C#, and continue to "netify" its offerings. Learning from Apple, it will shift away from web applications in a browser to internet applications that run locally and connect through the network. Look for more "native apps" and fewer "web apps".

Apple will continue to thrive in the consumer space, with new versions of iPads, iPods, and iPhones. The big hole in their tech stack is the development platform, which compiles to the bare processor and not to a virtual machine. Microsoft uses .NET, Oracle uses JVM, and the open source favorites Perl, Python, and Ruby also use interpreters and virtual machines.

Virtual processors provide three advantages: 1) superior development tools, 2) improved security, and 3) independence from physical processors. Apple needs this independence; look for a new development platform for all of their devices (iPhone, iPad, iPod, and Mac). This new platform will require a relearning of development techniques, and may possibly use a new programming language.

Google has always lived in the net and does not need to "netify" its offerings. Unlike Microsoft, it will stay in the web world (that is, inside a browser). I expect modest improvements to things such as Google Search, Google Docs, and Google Chrome, and major improvements to Google cloud services such as the Google App Engine.

The open source "rebel alliance" will continue to be a gadfly with lots of followers but little commercial clout. Linux will be useful in the data center but it will not take over the desktop. Perl will continue its slow decline; Python, Ruby, and PHP will gain. Open source products such as Open Office may get a boost from the current difficult economic times.

Staffing

Companies will have a difficult time finding the right people. They will find lots of the "not quite right" people. When they find "the right person", that right person may want to work from home. Companies will have four options:

1) Adjust policies and allow employees to work from home or alternate locations. This will require revision to management practices, since one must evaluate on delivered goods and not on appearance and arrival time.

2) Keep the traditional management policies and practices and accept "not quite right" folks for the job.

3) Expand the role of off-site contractors. Companies that use off-site contractors but insist that employees show up to the office every day and attend Weekly Status Meetings of Doom will be in a difficult situation: How to justify the "work in the office every day" policy when contractors are not in the office?

4) Defer hiring.

How companies deal with staffing in an up market, after so many years of down markets, will be interesting and possibly entertaining.

New tech

Cloud computing will receive modest interest from established shops, but it will take a while longer for those shops to figure out how to use it. More interest will come from startups. The benefits of cloud computing, much like the PC revolution of the early 1980s, will be in new applications, not in improving existing applications.

We will see an interest in functional programming languages. I dislike the term "functional" since all programming languages let you define functions and are functional in the sense that they perform, but the language geeks have their reason for the term and we're stuck with it. The bottom line: Languages such as Haskell, Erlang, and even Microsoft's F# will tick up on the radar, modestly. The lead geeks are looking into these languages, just as they looked into C++ in the mid 1980s.

The cloud suppliers will be interested in functional programming. Functional languages are a better fit in the cloud, where processes can be shuffled from one processor to another. C# and Java can be used for cloud applications, but such efforts require a lot more discipline and careful planning.

Just as C++ was a big jump up from C and Pascal, functional languages are a big jump up from C++, Java, and C#. Programming in a functional language (Haskell, Erlang, or F#) requires a lot of up-front analysis and thought.

The transition from C to C++ was driven by Windows and its event-driven model. The transition from object-oriented to functional programming will be driven by the cloud and its new model. The shift to functional languages will take time, possibly decades. Complicating the transition will be the poor state of object-oriented programming. Functional programming assumes good-to-excellent knowledge of object-oriented programming, and a lot of shops use object-oriented languages but not rigorously. These shops will have to improve their skills in object-oriented programming before attempting the move to functional programming.
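To make the contrast concrete, here is a small sketch of the functional style in Python: pure functions and no shared mutable state, so each piece of work could in principle be shuffled to any processor, which is the property the cloud rewards. The word-count pipeline here is my own illustration, not from any real system:

```python
# A functional-style sketch: pure functions, no shared mutable state.
from functools import reduce

def word_counts(line: str) -> dict:
    """Pure function: the result depends only on the input line."""
    counts = {}
    for word in line.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def merge(a: dict, b: dict) -> dict:
    """Pure merge step: combines two partial results without mutating either."""
    merged = dict(a)
    for word, n in b.items():
        merged[word] = merged.get(word, 0) + n
    return merged

lines = ["the cat", "the dog", "the cat and the dog"]

# Because each step is pure, the map work could run on any processor,
# and the reduce step combines the partial results in any grouping.
total = reduce(merge, map(word_counts, lines), {})
```

The up-front thought goes into designing the pure functions and the merge step; once that is done, the runtime (or the cloud) is free to schedule the pieces however it likes.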

These are my predictions for the coming year. I've left out quite a few technologies, including:

Ruby on Rails
Silverlight
Windows Phone 7
NoSQL databases
Perl 6
Microsoft's platforms (WPF, WWF, WCF, and whatever they have introduced since I started writing this post)
Google's "Go" language
Android phones
Salesforce.com's cloud platform

There is a lot of technology out there! Too much to cover in a single post. I've picked those items that I think will be the big shakers. Let's see how well I do! We can check in twelve months.