What will we see as the next big thing?
Let's look at the history of computer technology -- or rather, a carefully curated version of the history of computer technology.
The history of computing can be divided into eras: the mainframe era, the minicomputer era, the micro/PC era, and so forth. And, with careful editing, we can see that these eras have similar durations: about fifteen years each.
Let's start with mainframe computers. We can say that the mainframe era ran from 1950 to 1965. Mainframe computers were (and still are) large, expensive computers capable of significant processing, housed in rooms with climate control and dedicated power. Significantly, people used mainframe computers only indirectly. In the mainframe age, programmers submitted punch cards containing source code; an operator (one of the few people allowed in the computer room) fed the cards into the computer; the computer compiled the code and ran the program; output was usually on paper, delivered to the programmer some time later. Mainframe computers also ran batch jobs to read and process data (usually financial transactions). Data was often read from magnetic tape, and output could go to magnetic tape (updated data) or paper (reports).
Minicomputers were popular from 1965 to 1980. Minicomputers took advantage of newer technology; they were smaller, less expensive, and, most importantly, allowed multiple users on terminals (either paper-based or CRT-based). The user experience on minicomputers was very different from the experience on mainframes. Hardware, operating systems, and programming languages let users interact with the computer in "real time": one could type a command and get an immediate response.
Microcomputers and Personal Computers (with text displays and without networking) dominated from 1980 to 1995. It was the age of the Apple II and the IBM PC, computers small enough (and inexpensive enough) for an individual to own. They inherited the interactive experience of minicomputers, but now the user was the owner and could change the computer at will: add memory, add disks, upgrade the operating system.
Personal Computers (with graphics and networking) made their mark from 1995 to 2010. Networking made the internet available to ordinary people, and graphics made computers easier to use.
Mobile/cloud computing became dominant in 2010. Mobile devices without networks were not enough; the Palm Pilot and the Windows-based pocket computers never gained much traction. Even networked devices such as the original iPhone and the Nokia N800 saw limited acceptance. It was the combination of networked mobile devices and cloud services that became the dominant computing model.
That's my curated version of computing history. It omits a lot, and it fudges some of the dates. But it shows a trend, one that I think is useful to observe.
That trend is: computing models rise and fall, with a typical life of fifteen years.
How is this useful? Looking at the history, we can see that the mobile/cloud computing model has been dominant for slightly less than fifteen years. In other words, its time is just about up.
More interesting is that, according to this trend (and my curated history is too pretty to ignore), something new should come along and replace mobile/cloud as the dominant form of computing.
Let's say that I'm right -- that there is a change coming. What could it be?
It could be any of a number of things. Deep-fake technology allows the construction of convincing images of any subject. It could be virtual reality, or augmented reality. (The difference is nontrivial: virtual reality renders a complete, synthetic scene, while augmented reality overlays images on the scene around us.) It could be watch-based computing.
My guess is that it will be augmented reality. But that's a guess.
Whatever the new thing is, it will offer a different experience from the current mobile/cloud model. Each era of computing had its own experience. Mainframes imposed separation: users worked through operators. Minicomputers offered an interactive experience, although someone else controlled the computer. Personal computers offered interaction on a machine the user owned. Mobile/cloud let people hold computers in their hands and use them on the move.
Also, the next big thing does not eliminate the current big thing. Mobile/cloud did not eliminate web-based systems. Web-based systems did not eliminate desktop applications. Even text-mode interactive applications continue to this day. The next big thing expands the world of computing.