Tuesday, January 29, 2019

Intelligence, real and artificial

We humans have been working on artificial intelligence for a long time. At least fifty years, by my count, and through most of that time, true artificial intelligence has been consistently "twenty years away".

Of course, one should not talk solely about artificial intelligence. We humans have what we style as real intelligence. Perhaps the term "organic intelligence" is more appropriate, as human intelligence evolved organically over the ages. Let's not argue too much about terms.

The human mind is a strange thing. It is the only thing in the universe that can examine itself. (At least, it's the only one that we humans know about.)

There are many models of the human mind. We have studied the anatomy, the physiology, the chemistry ... and we still understand little about how it works. Freud studied the human psyche (close enough to the mind for this essay) and Skinner studied animal behavior with reward systems, yet we still know little about the mind.

But there is one model that strikes me as useful when developing artificial intelligence: the notion of the human brain as two different but connected processors.

In this model, humans have not one but two processors: one slow and linear, the other fast and parallel. The slow, linear processor gives us analytical thought, math, logic, and language. The parallel side gives us intuition, using a pattern-matching system.

The logical side is easy for us to examine. It is linear and relatively slow, and since it has language, it can talk to us. We can follow a chain of reasoning and understand how we arrive at an answer. (We can also explain our reasoning to another person, or write it down.)

The intuitive side is difficult to examine. It is parallel and relatively fast, and since it does not have language, it cannot explain how it arrives at an answer. We don't know why we get the results we get.

From an evolutionary point of view, it is easy to see how we developed the intuitive (pattern-matching) side. Our ancestors were successful when they identified a rabbit (and ate it) and identified a tiger (and ran away from it). Pattern matching is quite useful for survival.

It is less clear how we evolved the linear-logical side of our brain. Slow, analytic thought may be helpful for survival, but perhaps not as helpful as avoiding tigers. Communication is clearly a benefit when living in groups. No matter how it arose, we have it.

These two sides make up our brain. (Yes, I am aware that there are various levels of the brain, all the way down to the brain stem, but bear with me.)

Humans are successful, I believe, because we have both the logical and the intuitive processors. We use both in our everyday lives: we recognize other humans and breakfast cereals, and we think about business strategies and algebra homework. We pick the right processor for the problem at hand.

Now let's shift from our human intelligence to ... not artificial intelligence but computer intelligence, such as it is.

Traditional computing is our logical, math-oriented brain with a turbocharger. Computers are fast, and can perform calculations rapidly and reliably, but they don't have "common sense", or the intuition that we humans use. Yet fast as they are, we can examine a program and understand how the computer arrives at a result.

Artificial intelligence, on the other hand, corresponds to the intuitive side of human intelligence. It can solve problems (relatively quickly), often through pattern-matching techniques. And, just as the human intuitive, pattern-matching brain cannot explain how it arrives at a result, neither can artificial intelligence systems. We cannot simply examine the program and look at some variables to understand how the result was determined.
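To make the contrast concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: a toy explicit rule standing in for traditional programming, and a toy perceptron standing in for a pattern-matching system. The rule can be read and understood directly; the trained weights are just numbers that offer no explanation.

    # Traditional computing: an explicit rule. We can read it and see
    # exactly why it gives the answer it gives.
    def is_large_animal(length, weight):
        return length > 50 and weight > 100  # the reasoning is right here

    # Pattern matching: a toy perceptron trained on labeled examples.
    # After training, its "reasoning" is two weights and a bias;
    # inspecting them tells us little about why an input is classified
    # one way or the other.
    def train_perceptron(examples, epochs=100, rate=0.01):
        w1 = w2 = bias = 0.0
        for _ in range(epochs):
            for (x1, x2), label in examples:
                guess = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
                error = label - guess
                w1 += rate * error * x1
                w2 += rate * error * x2
                bias += rate * error
        return w1, w2, bias

    examples = [((60, 120), 1), ((20, 30), 0), ((55, 150), 1), ((10, 5), 0)]
    w1, w2, bias = train_perceptron(examples)
    print("learned weights:", w1, w2, bias)  # numbers, not an explanation
    print("classify (40, 80):", 1 if (w1 * 40 + w2 * 80 + bias) > 0 else 0)

Both functions can classify; only the first can tell us why.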

So now we have two artificial systems: one logical and one intuitive. These two types of "intelligence" mirror the two types in humans.

The real advance will be to combine the traditional computing systems (the logical systems) with artificial intelligence (the pattern-matching systems), just as our brains combine the two. Bringing the two disparate systems into one will be necessary for true, Skynet-class, Forbin-class, HAL-9000-class artificial intelligence.
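What might such a combination look like? Here is one small, hypothetical sketch, again in Python with invented names and data: a fast pattern matcher proposes an answer, and an explicit logical rule verifies it before we accept it. Real hybrid ("neuro-symbolic") systems are far more elaborate; this only illustrates the division of labor.

    # Hypothetical sketch: the intuitive processor proposes,
    # the logical processor verifies.
    def pattern_matcher(query, memory):
        # "Intuitive" side: return the remembered value most similar
        # to the query. Fast, but offers no account of why it is best.
        return min(memory, key=lambda item: abs(item - query))

    def logical_checker(query, answer, tolerance=5):
        # "Logical" side: an explicit, inspectable rule for accepting
        # or rejecting the proposed answer.
        return abs(answer - query) <= tolerance

    memory = [3, 17, 42, 99]
    query = 40
    guess = pattern_matcher(query, memory)
    if logical_checker(query, guess):
        print("accepted:", guess)  # prints "accepted: 42"
    else:
        print("rejected; fall back to slower, exhaustive reasoning")

The pattern matcher is quick and opaque; the checker is slower and transparent. Each covers the other's weakness, which is the point of the combination.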

I expect that joining the two will be quite the challenge. We understand little about our human brains and how the logical and intuitive processors coordinate their work. Getting the logical and intuitive computer systems to work together will be (I think) a long effort.

But when we get it -- watch out!

Wednesday, January 23, 2019

Apple improves, but does not invent

That headline is a bit strong, and fighting words to Apple devotees.

My point is that Apple examines a market, finds a product, and builds a better version. It has built the proverbial better mousetrap, and the world has beaten a path to Apple's door.

Let's look at the history.

The iPhone, Apple's signature and most successful product, was (and is) a hand-held computer that can act as a cell phone. The iPhone was not the first cell phone, nor was it the first hand-held computer. Prior to the iPhone we had the Palm series of computers, and Microsoft's attempt at hand-held computers with a stripped-down version of Windows named, unfortunately, WinCE. (And "wince" is how many of us reacted to the low-resolution screens of the Windows hand-held computers.)

Apple looked at the market of hand-held computers and built a better version. It just happened to have a phone.

Apple's success goes beyond the iPhone.

The iPod was Apple's music player. It was not the first music player, but it was better than existing models. The physical design was better, but more importantly, the iTunes software that let people easily (and consistently) purchase music (rather than download from shady web sites) made the iPod a success.

Apple looked at the market of music players and music downloads and built a better version.

The MacBook Air was (and still is) Apple's design for a laptop computer. Slim, light, and capable, it was better than the competing laptops of its time. Other manufacturers have since adopted the MacBook Air design in their own product lines.

Apple looked at the market of laptop computers and built a better version.

The Apple Watch is a product with modest success. It was not the first digital watch, but its connectivity to the iPhone makes it easier to use and more capable than other digital watches, including those with apps that you install on your phone.

Apple looked at the market of digital watches and built a better version.

The Macintosh was Apple's second major success in the computer market. As with the iPod, the hardware was good and the software was better. It was the operating system, with its graphical interface, that made the Macintosh a success. The "Mac" was easier to use than PCs or clones running PC-DOS.

Apple looked at the market of desktop computers (and operating systems) and built a better version.

The history shows that Apple does not invent successful products, but instead improves on existing products. Apple often brings disparate elements together (the iPhone was a combination of cell phone and hand-held computer, the Macintosh was a combination of PC and graphical operating system) into a usable design. That's a good skill, and somewhat rare. But it has its weakness. (More on that later.)

Apple can invent; they designed and built the Apple II in the early microcomputer days. While the Apple II was not the first home computer, it was among the first. The Apple II was built for the consumer market, much like a television. Apple invented the home computer -- perhaps at the same time as others -- and deserves credit for its work.

The problem with Apple's strategy (building better mousetraps) is that there must be mousetraps (of the non-better kind) to begin with. One cannot build the better mousetrap until mousetraps exist and one has ideas for a better version.

The challenge for Apple, now, is to find a market and build a better product. I'm not sure where Apple can go. There is AI, which is a nascent market with a few early products (much like the microcomputer market before the IBM PC) but it does not fit well into Apple's overall strategy of selling hardware.

Another possibility is virtual reality and augmented reality. Microsoft offers the HoloLens heads-up display, which is admired but not used all that much. (A few games and experimental applications, but no "killer" app.) Apple could design and sell a heads-up display of their own, but they may encounter the same dearth of applications that stymies Microsoft. Apple would have to create (or entice others to create) content. It could be done, but it doesn't fit the historical pattern.

Apple could move into games with a gaming console of their own. There are already games available, and the market has a ready set of customers. But the competition is stiff, and Apple would face the challenge of designing the "better mousetrap" of a game console and convincing game producers to create versions for Apple's console. Also, gamers -- serious players -- like to modify and enhance their hardware. Apple products are usually sealed shut and closed to modification.

Self-driving cars? The idea has been discussed. But self-driving cars are not commercially available. Apple likes to let others develop the first products and then bring out a better mousetrap.

Apple has avoided the cloud services market. Apple may use cloud technologies for things like Siri and backup, but they don't sell services the way Amazon and Microsoft do. (Mostly, I think, because cloud services move processing off the local device, and Apple wants to sell expensive local devices.)

As tablet sales decline and phone sales plateau, Apple has some interesting challenges. I don't know where they are going to go. (Which means I will be surprised when they do.) I'm not bearish on Apple -- I think they have a bright future -- and I hope to be pleasantly surprised.

Thursday, January 10, 2019

Predictions for tech in 2019

Predictions are fun! They allow us to see into the future -- or at least claim that we can see into the future. They also allow us to step away from the usual topics and talk about almost anything we want. Who could resist making predictions?

So here are my predictions for 2019:

Programming languages: The current "market" for programming languages is fractured. There is no one language that dominates. The ten most popular languages (according to Tiobe) are Java, C, Python, C++, VB.NET, C#, JavaScript, PHP, SQL, and Objective-C. The top ten are not evenly distributed; Java and C are in a "lead group" and the remaining languages are in a second group.

O'Reilly lists Python, Java, Go, C#, Kotlin, and Rust as languages to watch in 2019. Notice that this list differs from Tiobe's "most popular" -- Rust and Kotlin appear on that index at positions 34 and 36, respectively. Notably absent from O'Reilly's list are C++ and Perl.

For 2019, I predict that the market will remain fragmented. Java will remain in the lead group unless Oracle, who owns Java, does something that discourages Java development. (And even then, so many systems are currently written in Java that Java will remain in use for years. Java will be the COBOL of the 2020s: used in important business systems but not liked very much by younger developers.) C will remain in the lead group. (The popularity of C is hard to explain. But whatever C has, people like.)

Fragmentation makes life difficult for managers. Which languages should their teams use? A single dominant language would make the decision easy; the current market, with multiple capable languages, allows for debates about development languages. An established project provides an argument for sticking with its current programming language; a new project (with no existing code) makes the decision somewhat harder. (My advice: pick a popular language that gets the job done for you. Don't worry about it being the best language. Good enough is... good enough.)

Operating systems: Unlike the "market" for programming languages, the "market" for operating systems is fairly uniform. I should say "markets": we can consider the desktop/laptop segment, the server segment, and possibly a cloud segment. For the desktop, Windows is dominant, and will remain dominant in 2019. Windows 10 is capable and especially good for large organizations who want centralized administration. MacOS is used in a number of shops, especially smaller organizations and startups, and will continue to have a modest share.

For servers, Linux dominates and will continue to dominate in 2019. Windows runs some servers, and will continue to, especially in organizations who consider themselves "Microsoft shops".

The interesting future for operating systems is the cloud segment. Cloud services run on operating systems, usually Linux or Windows, but this is changing on two fronts. The first is the hypervisor, which sits below the virtual operating system in a cloud environment; the second is containers, which sit above the virtual operating system (and which contain an application).

Hypervisors are well-understood and well established. Containers are new (well, new-ish) and not as well understood, but gaining acceptance. Between the two sits the operating system, which is coming under pressure as hypervisors and containers perform tasks that were traditionally performed by operating systems.

In the long run, hypervisors, containers, and operating systems will achieve a new equilibrium, with operating systems doing less than they have in the past. The question will not be "Which operating system for my cloud application?" but instead "Which combination of hypervisor, operating system, and container for my cloud application?". And even then, large shops may use a mixture of hypervisors, operating systems, and containers across their applications.

Virtual reality and augmented reality: Both will remain experimental. We have yet to find a "killer app" for augmented reality, something that combines real-world and supplied visuals in a compelling application.

Cloud services: Amazon.com dominates the market, and I see little to change that. Microsoft and Google will maintain (and possibly increase) their market shares. Other players (IBM, Dell, Oracle) will remain small.

The list of services available from cloud providers is impressive and daunting. Amazon is in a difficult position; its services are less consistent than Microsoft's and Google's. Both Microsoft and Google came into the market after Amazon and developed their offerings more slowly. The result has been a smaller market share but a more consistent set of services (and, I dare say, a better experience for the customer). Amazon may change some services to make things more consistent.

Phones: Little will change in 2019. Apple and Android will remain dominant. 5G will get press and slow roll-out by carriers; look for true implementation and wide coverage in later years.

Tablets: 2019 may be the "last year of the tablet" -- at least the non-laptop convertible tablet. Tablet sales have been anemic, except for iPads, and even those are declining. Apple could introduce an innovation that increases the iPad's appeal, but I don't see that happening. (I think Apple will focus on phones, watches, earphones, and other consumer devices.)

I see little interest in tablets from other manufacturers, probably due to the lack of demand by customers. As Android is the only other (major) operating system for tablets, innovation for Android tablets will have to come from Google, and I see little interest from Google in tablets. (I think Google is more interested in phones, location-based services, and advertising.)

In sum, I see 2019 as a year of "more of the same", with few or no major innovations. I suspect that the market for tech will, at the end of 2019, look very much like the market for tech at the beginning of 2019.