Tuesday, February 12, 2019

Praise for Microsoft

I am not Microsoft's biggest fan. I disliked their products and strategies in the 1990s, when they had a virtual monopoly on desktop operating systems, office software, and development tools. Yet I must give them credit for two recent products: OneDrive and Visual Studio Code.

OneDrive

OneDrive synchronizes files across multiple devices. I can store a file in OneDrive on computer A and later retrieve it on computer B. OneDrive stores data on Microsoft's servers and associates it with my account. If I log in to a Windows computer with my ID and password, I can see all of my files on OneDrive. The files are not copied to the local computer; they are simply available for me to view, change, or delete.

OneDrive also provides storage for online services such as Office Online. This lets me use any computer, even a public one in a library. (I think. I have yet to try this. But it makes sense for Microsoft to do things this way.)

Visual Studio Code

The other product that deserves credit is Visual Studio Code.

Microsoft advertises Visual Studio Code as an editor, yet it is much more. It edits, color-highlights, checks syntax, refactors, debugs (at least with Python), and integrates with git. It has an impressive array of features in a small package. What is significant is that the features are just the right set -- at least for me, and I suspect a large number of developers. It is not weighed down with all of the features of Microsoft's classic Visual Studio package. Visual Studio Code omits the templates and the auto-generation. It replaces the package manager with a series of lightweight plug-ins. It seems to ignore Team Foundation Server (and services), although I could be mistaken about that. (Perhaps there is an enterprise version of VS Code that connects to TFS.)
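As a small illustration of how lightweight that plug-in approach is: debugging a Python script takes only a few lines in a launch.json file. (This is a typical sketch for the Python extension, not the only way to set it up.)

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Current File",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal"
        }
    ]
}
```

Drop that in the project's .vscode folder and the debugger runs whatever file is open -- no project templates, no generated scaffolding.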

Beyond the feature set, Visual Studio Code... works. It's a competent product, one that feels good to use. It has just enough to get the job done, and it gets the job done well. I feel comfortable using it. (And that's a rare thing with me and Microsoft products.)

Visual Studio Code is a departure from the traditional Microsoft approach to software. The old Microsoft built software for Windows -- and Windows only. (A few exceptions were made for Mac OS.) Visual Studio Code breaks from that tradition: it is available for Windows, Mac OS, and Linux. This is indeed a ground-breaking project.

OneDrive and Visual Studio Code make for a pleasant experience when developing code. Microsoft deserves credit for bold choices and good tools. If you have not tried them, I recommend that you do.

What have you got to lose?

Tuesday, January 29, 2019

Intelligence, real and artificial

We humans have been working on artificial intelligence for a long time. At least fifty years, by my count, and through most of that time, true artificial intelligence has been consistently "twenty years away".

Of course, one should not talk solely about artificial intelligence. We humans have what we style as real intelligence. Perhaps the term "organic intelligence" is more appropriate, as human intelligence evolved organically over the ages. Let's not argue too much about terms.

The human mind is a strange thing. It is the only thing in the universe that can examine itself. (At least, it's the only one that we humans know about.)

There are many models of the human mind. We have studied the anatomy, the physiology, the chemistry, ... and we still understand little about how it works. Freud studied the human psyche (close enough to the mind for this essay) and Skinner studied animal behaviors with reward systems, and we still know little about the mind.

But there is one model that strikes me as useful when developing artificial intelligence: the notion of the human brain as two different but connected processors.

In this model, humans have not one but two processors: one slow and linear, the other fast and parallel. The slow, linear processor gives us analytical thought, math, logic, and language. The parallel side gives us intuition, using a pattern-matching system.

The logical side is easy for us to examine. It is linear and relatively slow, and since it has language, it can talk to us. We can follow a chain of reasoning and understand how we arrive at an answer. (We can also explain our reasoning to another person, or write it down.)

The intuitive side is difficult to examine. It is parallel and relatively fast, and since it does not have language, it cannot explain how it arrives at an answer. We don't know why we get the results we get.

From an evolution point of view, it is easy to see how we developed the intuitive (pattern-matching) side. Our ancestors were successful when they identified a rabbit (and ate it) and identified a tiger (and ran away from it). Pattern matching is quite useful for survival.

It is less clear how we evolved the linear-logical side of our brain. Slow, analytic thought may be helpful for survival, but perhaps not as helpful as avoiding tigers. Communication is clearly a benefit when living in groups. No matter how it arose, we have it.

These two sides make up our brain. (Yes, I am aware that there are various levels of the brain, all the way down to the brain stem, but bear with me.)

Humans are successful, I believe, because we have both the logical and the intuitive processors. We use both in our everyday life: we recognize other humans and breakfast cereals, and we think through business strategies and algebra homework. We pick the right processor for the problem at hand.

Now let's shift from our human intelligence to ... not artificial intelligence but computer intelligence, such as it is.

Traditional computing is our logical, math-oriented brain with a turbocharger. Computers are fast, and can perform calculations rapidly and reliably, but they don't have "common sense", or the intuition that we humans use. Yet for all that speed, we can examine the program and understand how a computer arrives at a result.

Artificial intelligence, on the other hand, corresponds to the intuitive side of human intelligence. It can solve problems, often quickly, through pattern-matching techniques. And, just as the human intuitive, pattern-matching brain cannot explain how it arrives at a result, neither can artificial intelligence systems. We cannot simply examine the program and look at some variables to understand how the result was determined.

So now we have two artificial systems, one logical and one intuitive. These two types of "intelligence" correspond to the two types in humans.

The real advance will be to combine the traditional computing systems (the logical systems) with artificial intelligence (the pattern-matching systems), just as our brains combine the two. Bringing the two disparate systems into one will be necessary for true, Skynet-class, Forbin-class, HAL-9000-class, artificial intelligence.

I expect that joining the two will be quite the challenge. We understand little about our human brains and how the logical and intuitive processors coordinate their work. Getting the logical and intuitive computer systems to work together will be (I think) a long effort.

But when we get it -- watch out!

Wednesday, January 23, 2019

Apple improves, but does not invent

That headline is a bit strong, and fighting words to Apple devotees.

My point is that Apple examines a market, finds a product, and builds a better version. It has built the proverbial better mousetrap, and the world has beaten a path to Apple's door.

Let's look at the history.

The iPhone, Apple's signature and most successful product, was (and is) a hand-held computer that can act as a cell phone. The iPhone was not the first cell phone, nor was it the first hand-held computer. Prior to the iPhone we had the Palm series of computers, and Microsoft's attempt at hand-held computers with a stripped-down version of Windows named, unfortunately, WinCE. (And "wince" is how many of us had to look at the low-resolution screens of the Windows hand-held computers.)

Apple looked at the market of hand-held computers and built a better version. It just happened to have a phone.

Apple's success goes beyond the iPhone.

The iPod was Apple's music player. It was not the first music player, but it was better than existing models. The physical design was better, but more importantly, the iTunes software that let people easily (and consistently) purchase music (rather than download from shady web sites) made the iPod a success.

Apple looked at the market of music players and music downloads and built a better version.

The MacBook Air was (and still is) Apple's design for a laptop computer. Slim, light, and capable, it was better than the competing laptops of the time. Manufacturers have now adopted the MacBook design into their own product lines.

Apple looked at the market of laptop computers and built a better version.

The Apple watch is a product with modest success. It was not the first digital watch, but its connectivity to the iPhone makes it easy to use and more capable than other digital watches, including those with apps that you install on your phone.

Apple looked at the market of digital watches and built a better version.

The Macintosh was Apple's second major success in the computer market. As with the iPod, the hardware was good and the software was better. It was the operating system, with its graphical interface, that made the Macintosh a success. The "Mac" was easier to use than PCs or clones running PC-DOS.

Apple looked at the market of desktop computers (and operating systems) and built a better version.

The history shows that Apple does not invent successful products, but instead improves on existing products. Apple often brings disparate elements together (the iPhone was a combination of cell phone and hand-held computer, the Macintosh was a combination of PC and graphical operating system) into a usable design. That's a good skill, and somewhat rare. But it has its weakness. (More on that later.)

Apple can invent; they designed and built the Apple II in the early microcomputer days. While the Apple II was not the first home computer, it was among the first. The Apple II was built for the consumer market, much like a television. Apple invented the home computer -- perhaps at the same time as others -- and deserves credit for its work.

The problem with Apple's strategy (building better mousetraps) is that there must be mousetraps (of the non-better kind) to begin with. One cannot build the better mousetrap until mousetraps exist and one has ideas for a better version.

The challenge for Apple, now, is to find a market and build a better product. I'm not sure where Apple can go. There is AI, which is a nascent market with a few early products (much like the microcomputer market before the IBM PC) but it does not fit well into Apple's overall strategy of selling hardware.

Another possibility is virtual reality and augmented reality. Microsoft offers the HoloLens heads-up display, which is admired but not used all that much. (A few games and experimental applications, but no "killer" app.) Apple could design and sell a heads-up display of their own, but they may encounter the same dearth of applications that stymies Microsoft. Apple would have to create (or entice others to create) content. It could be done, but it doesn't fit the historical pattern.

Apple could move into games with a gaming console of their own. There are already games available, and the market has a ready set of customers. The competition is stiff, and Apple will have a challenge to design the "better mousetrap" of a game console, and convince game producers to create versions for Apple's console. Also, gamers -- serious players -- like to modify and enhance their hardware. Apple products are usually sealed shut and closed to modification.

Self-driving cars? The idea has been discussed. But self-driving cars are not commercially available. Apple likes to let others develop the first products and then bring on a better mousetrap.

Apple has avoided the cloud services market. Apple may use cloud technologies for things like Siri and backup, but they don't sell services the way Amazon and Microsoft do. (Mostly, I think, because cloud services move processing off the local device, and Apple wants to sell expensive local devices.)

As tablet sales decline and phone sales plateau, Apple has some interesting challenges. I don't know where they are going to go. (Which means I will be surprised when they do.) I'm not bearish on Apple -- I think they have a bright future -- and I hope to be pleasantly surprised.

Thursday, January 10, 2019

Predictions for tech in 2019

Predictions are fun! They allow us to see into the future -- or at least claim that we can see into the future. They also allow us to step away from the usual topics and talk about almost anything we want. Who could resist making predictions?

So here are my predictions for 2019:

Programming languages: The current "market" for programming languages is fractured. There is no one language that dominates. The ten most popular languages (according to Tiobe) are Java, C, Python, C++, VB.NET, C#, JavaScript, PHP, SQL, and Objective-C. The top ten are not evenly distributed; Java and C are in a "lead group" and the remaining languages are in a second group.

O'Reilly lists Python, Java, Go, C#, Kotlin, and Rust as languages to watch for 2019. Notice that this list is different from Tiobe's "most popular" -- Rust and Kotlin show on that index in positions 34 and 36, respectively. Notably absent from O'Reilly's list are C++ and Perl.

For 2019, I predict that the market will remain fragmented. Java will remain in the lead group unless Oracle, who owns Java, does something that discourages Java development. (And even then, so many systems are currently written in Java that Java will remain in use for years. Java will be the COBOL of the 2020s: used in important business systems but not liked very much by younger developers.) C will remain in the lead group. (The popularity of C is hard to explain. But whatever C has, people like.)

Fragmentation makes life difficult for managers. Which languages should their teams use? A single leader makes the decision easy. The current market, with multiple capable languages, allows for debates about development languages. An established project provides the argument for sticking with the current programming language; a new project (with no existing code) makes the decision somewhat harder. (My advice: pick a popular language that gets the job done for you. Don't worry about it being the best language. Good enough is... good enough.)

Operating systems: Unlike the "market" for programming languages, the "market" for operating systems is fairly uniform. I should say "markets": we can consider the desktop/laptop segment, the server segment, and possibly a cloud segment. For the desktop, Windows is dominant, and will remain dominant in 2019. Windows 10 is capable and especially good for large organizations who want centralized administration. MacOS is used in a number of shops, especially smaller organizations and startups, and will continue to have a modest share.

For servers, Linux dominates and will continue to dominate in 2019. Windows runs some servers, and will continue to, especially in organizations who consider themselves "Microsoft shops".

The interesting future for operating systems is the cloud segment. Cloud services run on operating systems, usually Linux or Windows, but this is changing on two fronts. The first is the hypervisor, which sits below the virtual operating system in a cloud environment; the second is containers, which sit above the virtual operating system (and which contain an application).

Hypervisors are well-understood and well established. Containers are new (well, new-ish) and not as well understood, but gaining acceptance. Between the two sits the operating system, which is coming under pressure as hypervisors and containers perform tasks that were traditionally performed by operating systems.
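To make the container idea concrete: a container image bundles a single application with just the files it needs, described by a short build file. A minimal sketch (the application name app.py is hypothetical):

```dockerfile
# Start from a small base image rather than a full operating system.
FROM python:3.7-slim

# Copy in the single application this container exists to run.
COPY app.py /app/app.py

# The one process that runs when the container starts.
CMD ["python", "/app/app.py"]
```

Notice how little operating system is left in the picture: the base image supplies a thin file system, the hypervisor below supplies the hardware abstraction, and the container runs one process.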

In the long run, hypervisors, containers, and operating systems will achieve a new equilibrium, with operating systems doing less than they have in the past. The question will not be "Which operating system for my cloud application?" but instead "Which combination of hypervisor, operating system, and container for my cloud application?". And even then, there may be large shops that use a mixture of hypervisor, operating system, and container for their applications.

Virtual reality and augmented reality: Both will remain experimental. We have yet to find a "killer app" for augmented reality, something that combines real-world and supplied visuals in a compelling application.

Cloud services: Amazon.com dominates the market, and I see little to change that. Microsoft and Google will maintain (and possibly increase) their market shares. Other players (IBM, Dell, Oracle) will remain small.

The list of services available from cloud providers is impressive and daunting. Amazon is in a difficult position; its services are less consistent than Microsoft's and Google's. Both Microsoft and Google came into the market after Amazon and developed their offerings more slowly. The result has been a smaller market share but a more consistent set of services (and, I dare say, a better experience for the customer). Amazon may change some services to make things more consistent.

Phones: Little will change in 2019. Apple and Android will remain dominant. 5G will get press and slow roll-out by carriers; look for true implementation and wide coverage in later years.

Tablets: 2019 may be the "last year of the tablet" -- at least the non-laptop convertible tablet. Tablet sales have been anemic, except for iPads, and even those are declining. Apple could introduce an innovation to the iPad which increases its appeal, but I don't see that. (I think Apple will focus on phones, watches, earphones, and other consumer devices.)

I see little interest in tablets from other manufacturers, probably due to the lack of demand by customers. As Android is the only other (major) operating system for tablets, innovation for Android tablets will have to come from Google, and I see little interest from Google in tablets. (I think Google is more interested in phones, location-based services, and advertising.)

In sum, I see 2019 as a year of "more of the same", with few or no major innovations. I suspect that the market for tech will, at the end of 2019, look very much like the market for tech at the beginning of 2019.

Thursday, December 6, 2018

Rebels need the Empire

The PC world is facing a crisis. It is a silent crisis, one that few people understand.

That crisis is the evil empire, or more specifically, the lack of an evil empire.

For the entire age of personal computers, we have had an evil empire. The empire changed over time, but there was always one. And that empire was the unifying force for the rebellion.

The first empire was IBM. Microcomputer enthusiasts were fighting this empire of large, expensive mainframe computers. We fought it with small, inexpensive (compared to mainframes) computers. We offered small, interactive, "friendly" programs written in BASIC in opposition to batch mainframe systems written in COBOL. The rebellion used Apple II, TRS-80, and other small systems to unite and fight for liberty. This rebellion was successful. So successful that IBM decided to get in on the personal computer action.

The second empire was also IBM. The IBM PC became the standard for computing, and the diverse set of computers prior to the IBM model 5150 was wiped out. Rebels refused to use IBM PCs and attempted to keep non-PC-compatible computers financially viable. That struggle was lost, and the IBM design became the standard design. Once Compaq introduced a PC-compatible (and didn't get sued) other manufacturers introduced their own PC compatibles. The one remnant of this rebellion was Apple, who made non-compatible computers for quite some time.

The third empire was Microsoft. The makers of IBM-compatible PCs needed an operating system and Microsoft was happy to sell them MS-DOS. IBM challenged Microsoft with OS/2 (itself a joint venture with Microsoft) but Microsoft introduced Windows and made it successful. Microsoft was so successful that its empire was, at times, considered larger and grander than IBM's mainframe empire. The rebellion against Microsoft took some time to form, but it did arise as the "open source" movement.

But Microsoft has fallen from its position as evil empire. It still holds a majority of desktop computer operating systems, but the world of computing has expanded to web servers, smartphones, and cloud systems, and these are outside of Microsoft's control.

In tandem with Microsoft's decline, open source has become accepted as the norm. As such, it is no longer the rebellion. The operating system market is now tripartite: Windows, macOS, and Linux. Each is an acceptable choice.

Those two changes -- Microsoft no longer the evil empire and open source no longer the rebellion -- mean that, at the moment, there is no evil empire.

Some companies have large market shares of certain segments. Amazon.com dominates the web services and cloud market -- but competitors are reasonable and viable options. Microsoft dominates the desktop market, especially the corporate desktop market, but Apple is a possible choice for the corporate desktop.

No one vendor controls the hardware market.

Facebook dominates in social media, but is facing significant challenges in areas of privacy and "fake news". Other media channels like Twitter are looking to gain at Facebook's expense.

Even programming languages have no dominant player. According to the November report from Tiobe, Java and C have been the two most popular languages, and neither is gaining significantly. The next three (C++, Python, and VB.NET) are close, as are the five following (C#, JavaScript, PHP, SQL, and Go). No language is emerging as a dominant language, as we had with BASIC in the 1980s and Visual Basic in the 1990s.

A world without an evil empire is a new world for us. Personal computers were born under an evil empire, operating systems matured under an evil empire, and open source became respectable under an evil empire. I like to think that such innovations were driven (or at least inspired) by a rebellion, an active group of people who rejected the market leader.

Today we have no such empire. Will innovation continue without one? Will we see new hardware, new programming languages, new tools? Or will the industry stagnate as major players focus more on market share and less on innovation?

If the latter, then perhaps someday a new market leader will emerge, strong enough to win the title of "evil empire" and rebels will again drive innovation.

Thursday, November 8, 2018

Why is there still a MacBook Air?

This week, Apple introduced upgrades to a number of its products. They showed a new Mac Mini and a new MacBook Air. The need for a new Mac Mini I understand. The need for a new MacBook Air I do not.

The original MacBook Air was revolutionary in that it omitted a CD/DVD reader. So revolutionary that Apple needed a way for a MacBook Air to "borrow" a CD/DVD reader from another computer (another Apple computer) to install software.

The MacBook Air stunned the world with its thinness and its low weight -- hence the adjective "Air". Compared to laptops of the time, even Apple's MacBooks, the MacBook Air was almost weightless.

But that was then. This is now.

Apple has improved the MacBook (without the "Air") to the point that MacBooks and MacBook Airs are indistinguishable. They are both thin. They are both lightweight. They both have no CD/DVD reader.

Yes, there are some minor points and one can tell a MacBook from a MacBook Air. MacBooks are slightly smaller and have only one USB C port, whereas MacBook Airs are larger and have multiple ports.

But in just about every respect, the MacBook Air is a new and improved MacBook. When you consider the processor, the memory and storage, the display, and the capabilities of the two devices, the MacBook Air is simply another member of the MacBook line. So why keep it? Why not just call it a MacBook?

Apple could certainly have two MacBooks. They have two MacBook Pro computers, a 13-inch model and a 15-inch model. They could have a 12-inch MacBook and a 13-inch MacBook. Yet they keep the "Air" designation. Why?

It's possible that the "MacBook Air" name has good market recognition, and Apple wants to leverage that. If so, we can expect to see other "Air" products, much like the iPad Air.

Monday, October 29, 2018

IBM and Red Hat Linux

The news that IBM had an agreement to purchase Red Hat (the distributor and supporter of a Linux distro for commercial use) was followed quickly by a series of comments from the tech world, ranging from anger to disappointment.

I'm not sure that the purchase of Red Hat by IBM is a bad thing.

One can view this event in the form of two questions. The first is "Should Red Hat sell itself (to anyone)?". The second is "Given that Red Hat is for sale, who would be a good purchaser?".

The negative reaction, I think, is mostly about the first question. People are disappointed (or angered) by the sale of Red Hat -- to anyone.

But once you commit to a sale, the question changes and the focus is on the buyer. Who are possible buyers for Red Hat?

IBM is, of course, a possibility. Many people might object to IBM, and if we think of the IBM from its monopoly days, with its arrogance and incompatible hardware designs, then IBM would be a poor choice. (Red Hat would be a poor acquisition for that IBM, too.)

But IBM has changed quite a bit. It still sells mainframes, its S/36 line has mutated into servers, and it sold its PC business long ago. It must compete in the cloud arena with Amazon.com, Microsoft, and Google (and Dell, and Oracle, and others). Red Hat helps IBM in this area. I think IBM is not so foolish as to break Red Hat or make many changes.

One possibility is that IBM purchased Red Hat to prevent others from doing so. (You buy something because you need it or because you want to keep it from others.) Who are the others?

Amazon.com and Microsoft come quickly to mind. They both offer cloud services, and Red Hat would help both with their offerings. The complainers may consider this; would they prefer Red Hat to go to Amazon or Microsoft? (Of the two, I think Microsoft would be the better owner. It is expanding its role with Linux and moving its business away from Windows and Windows-only software to a larger market of cloud services that support both Windows and Linux.)

There are other possible purchasers. Oracle has been mentioned by critics (usually as a "could be worse, could be Oracle" comment). Red Hat fills a gap in Oracle's product line between hardware and its database software, and also provides a platform for Java (another Oracle property).

Beyond those, there are Facebook, Dell, and possibly Intel, although I consider the last to be a long shot. None of them strike me as a good partner.

Red Hat could be purchased by an equity/investment company, which would probably doom Red Hat to partitioning and sales of individual components.

In the end, IBM seems quite a reasonable purchaser. IBM has changed from its old ways and it supports Linux quite a bit. I think it will recognize value and strive to keep it. Let's see what happens.