Showing posts with label innovation.

Thursday, July 23, 2020

IBM and Apple innovate in different ways

Two of the most influential companies in the PC universe are IBM and Apple. There are others, including Microsoft. But I want to compare just IBM and Apple. These two companies have similarities, and differences.

IBM and Apple both built their reputations on hardware. Apple is still a hardware company, although its main business is now phones rather than computers. IBM is now more of a services company; it was a hardware company in 1981, when it introduced the IBM PC.

Both companies innovated and both companies created designs that influence the market today.

IBM introduced the detached keyboard (other systems were all-in-one designs or keyboard-and-CPU units with a separate display). IBM also introduced the internal hard drive, the original 8-inch floppy disk, the ThinkPad TrackPoint (the "pointing stick"), and the VGA display and video card, and it brought the 3.5-inch floppy disk (a Sony design) into the PC mainstream with its PS/2 line.

Apple introduced the mouse for personal computers (the original mouse was two decades earlier and for a system much larger than a PC), the PowerBook laptop (a year before the first ThinkPad), AppleTalk, touchscreen for iPhones, and (notably) iTunes which gave consumers a reliable, legal way to load music onto their devices.

Apple stands out in that it innovates not just by adding features, but by removing them. Apple was first in delivering a computer without a floppy drive, and then a computer without a CD drive. Apple famously removed the headphone jack from its phones. It also omitted the Home, End, PgUp, and PgDn keys on its laptops, going back as far as the first PowerBook. (As the PowerBook was not compatible with the IBM PC, it had no need of those keys.)

Apple, more than IBM or any other hardware supplier, has innovated by removing things. The makers of Windows PCs and laptops have typically followed Apple's lead. They have, over time, removed floppy drives and CD drives, and most laptops now require multi-key combinations for Home, End, PgUp, and PgDn.

IBM innovated by adding features. Apple innovates by trimming away features. That's quite a difference.

Of course, one can remove only so much. Apple has trimmed the phone to a simple slab with a single port for charging and data transfer. It has trimmed the Macbook to a thin wedge that opens to a screen, keyboard, and trackpad. There is very little left to remove, which means that Apple has little room to innovate along its traditional methods.

But notice that Apple's innovation-by-reduction has been in hardware. Its operating systems are not the equivalent of a slim wedge. To the contrary, Apple's macOS and iOS are somewhat bulky, which is something Apple shares with Windows and Linux.

Of the major operating systems, macOS probably has the best chance of slimming. Apple has the "remove things to make it better" mindset, which helps to remove features. Apple also has close integration between operating system and hardware, and it drops support for older hardware, which lets it remove drivers for older devices. Windows and Linux, in contrast, want to support as much hardware as they can, which means adding drivers and allowing for older devices.

Let's see if Apple's "less is more" approach works its way into macOS and into Swift, Apple's favored language for development.

Wednesday, January 23, 2019

Apple improves, but does not invent

That headline is a bit strong, and fighting words to Apple devotees.

My point is that Apple examines a market, finds a product, and builds a better version. It has built the proverbial better mousetrap, and the world has beaten a path to Apple's door.

Let's look at the history.

The iPhone, Apple's signature and most successful product, was (and is) a hand-held computer that can act as a cell phone. The iPhone was not the first cell phone, nor was it the first hand-held computer. Prior to the iPhone we had the Palm series of computers, and Microsoft's attempt at hand-held computers with a stripped-down version of Windows named, unfortunately, WinCE. (And "wince" is how many of us reacted to the low-resolution screens of the Windows hand-held computers.)

Apple looked at the market of hand-held computers and built a better version. It just happened to have a phone.

Apple's success goes beyond the iPhone.

The iPod was Apple's music player. It was not the first music player, but it was better than existing models. The physical design was better, but more importantly, the iTunes software that let people easily (and consistently) purchase music (rather than download from shady web sites) made the iPod a success.

Apple looked at the market of music players and music downloads and built a better version.

The MacBook Air was (and still is) Apple's design for a laptop computer. Slim, light, and capable, it was better than the competing laptops of the time. Manufacturers have now adopted the MacBook design into their own product lines.

Apple looked at the market of laptop computers and built a better version.

The Apple Watch is a product with modest success. It was not the first smartwatch. But its tight connectivity to the iPhone makes it easier to use and more capable than other smartwatches, including those that rely on apps installed on your phone.

Apple looked at the market of smartwatches and built a better version.

The Macintosh was Apple's second major success in the computer market. As with the iPod, the hardware was good and the software was better. It was the operating system, with its graphical interface, that made the Macintosh a success. The "Mac" was easier to use than PCs or clones running PC-DOS.

Apple looked at the market of desktop computers (and operating systems) and built a better version.

The history shows that Apple does not invent successful products, but instead improves on existing products. Apple often brings disparate elements together (the iPhone was a combination of cell phone and hand-held computer, the Macintosh was a combination of PC and graphical operating system) into a usable design. That's a good skill, and somewhat rare. But it has its weakness. (More on that later.)

Apple can invent; they designed and built the Apple II in the early microcomputer days. While the Apple II was not the first home computer, it was among the first. The Apple II was built for the consumer market, much like a television. Apple invented the home computer -- perhaps at the same time as others -- and deserves credit for its work.

The problem with Apple's strategy (building better mousetraps) is that there must first be mousetraps (of the non-better kind). One cannot build the better mousetrap until mousetraps exist and one has ideas for improving them.

The challenge for Apple, now, is to find a market and build a better product. I'm not sure where Apple can go. There is AI, which is a nascent market with a few early products (much like the microcomputer market before the IBM PC) but it does not fit well into Apple's overall strategy of selling hardware.

Another possibility is virtual reality and augmented reality. Microsoft offers the HoloLens heads-up display, which is admired but not used all that much. (A few games and experimental applications, but no "killer" app.) Apple could design and sell its own heads-up display, but it may encounter the same dearth of applications that stymies Microsoft. Apple would have to create (or entice others to create) content. It could be done, but it doesn't fit the historical pattern.

Apple could move into games with a gaming console of its own. There are already games available, and the market has a ready set of customers. But the competition is stiff, and Apple would face a challenge in designing the "better mousetrap" of a game console and in convincing game producers to create versions for it. Also, gamers -- serious players -- like to modify and enhance their hardware. Apple products are usually sealed shut and closed to modification.

Self-driving cars? The idea has been discussed. But self-driving cars are not commercially available. Apple likes to let others develop the first products and then bring out the better mousetrap.

Apple has avoided the cloud services market. Apple may use cloud technologies for things like Siri and backup, but they don't sell services the way Amazon and Microsoft do. (Mostly, I think, because cloud services move processing off the local device, and Apple wants to sell expensive local devices.)

As tablet sales decline and phone sales plateau, Apple has some interesting challenges. I don't know where they are going to go. (Which means I will be surprised when they do.) I'm not bearish on Apple -- I think they have a bright future -- and I hope to be pleasantly surprised.

Thursday, December 6, 2018

Rebels need the Empire

The PC world is facing a crisis. It is a silent crisis, one that few people understand.

That crisis is the evil empire, or more specifically, the lack of an evil empire.

For the entire age of personal computers, we have had an evil empire. The empire changed over time, but there was always one. And that empire was the unifying force for the rebellion.

The first empire was IBM. Microcomputer enthusiasts were fighting this empire of large, expensive mainframe computers. We fought it with small, inexpensive (compared to mainframes) computers. We offered small, interactive, "friendly" programs written in BASIC in opposition to batch mainframe systems written in COBOL. The rebellion used Apple II, TRS-80, and other small systems to unite and fight for liberty. This rebellion was successful. So successful that IBM decided to get in on the personal computer action.

The second empire was also IBM. The IBM PC became the standard for computing, and the diverse set of computers prior to the IBM model 5150 was wiped out. Rebels refused to use IBM PCs and attempted to keep non-PC-compatible computers financially viable. That struggle was lost, and the IBM design became the standard design. Once Compaq introduced a PC-compatible (and didn't get sued), other manufacturers introduced their own PC compatibles. The one remnant of this rebellion was Apple, which made non-compatible computers for quite some time.

The third empire was Microsoft. The makers of IBM-compatible PCs needed an operating system, and Microsoft was happy to sell them MS-DOS. IBM challenged Microsoft with OS/2 (itself a joint venture with Microsoft), but Microsoft introduced Windows and made it successful. Microsoft was so successful that its empire was, at times, considered larger and grander than IBM's mainframe empire. The rebellion against Microsoft took some time to form, but it did arise as the "open source" movement.

But Microsoft has fallen from its position as evil empire. It still holds a majority of desktop computer operating systems, but the world of computing has expanded to web servers, smartphones, and cloud systems, and these are outside of Microsoft's control.

In tandem with Microsoft's decline, open source has become accepted as the norm. As such, it is no longer the rebellion. The operating system market is now a triumvirate: Windows, macOS, and Linux. Each is an acceptable solution.

Those two changes -- Microsoft no longer the evil empire and open source no longer the rebellion -- mean that, at the moment, there is no evil empire.

Some companies have large market shares of certain segments. Amazon.com dominates the web services and cloud market -- but competitors are reasonable and viable options. Microsoft dominates the desktop market, especially the corporate desktop market, but Apple is a possible choice for the corporate desktop.

No one vendor controls the hardware market.

Facebook dominates in social media, but is facing significant challenges in areas of privacy and "fake news". Other media channels like Twitter are looking to gain at Facebook's expense.

Even programming languages have no dominant player. According to the November report from Tiobe, Java and C have been the two most popular languages and neither is gaining significantly. The next three (C++, Python, and VB.net) are close, as are the five following (C#, JavaScript, PHP, SQL, and Go). No language is emerging as a dominant language, as we had with BASIC in the 1980s and Visual Basic in the 1990s.

A world without an evil empire is a new world for us. Personal computers were born under an evil empire, operating systems matured under an evil empire, and open source became respectable under an evil empire. I like to think that such innovations were driven (or at least inspired) by a rebellion, an active group of people who rejected the market leader.

Today we have no such empire. Will innovation continue without one? Will we see new hardware, new programming languages, new tools? Or will the industry stagnate as major players focus more on market share and less on innovation?

If the latter, then perhaps someday a new market leader will emerge, strong enough to win the title of "evil empire" and rebels will again drive innovation.

Thursday, July 7, 2016

DOS, Windows, sharing, and mobile

Windows, when it arrived on the scene, changed the world of computing. Prior to Windows, DOS ruled the computing world, and it was a limited world. Windows expanded that world with new capabilities. In some ways, mobile operating systems (Android and iOS) move us back towards the ways of DOS.

IBM PCs (or compatibles, or near-compatibles) running DOS were simple devices. They could run one program at a time; running multiple programs at once was not possible. (Technically, it was possible with a "terminate and stay resident" function call, but such programs were few. For this essay, I'll stick to the "regular" programs.)

Windows brought us an expanded view of computing. Instead of running a single program at a time, Windows allowed for multiple. Windows provided a common way to present text and graphics on the screen, to print to printers, and to share data. Windows was a large step upward from the world of DOS.

Mobile operating systems -- Android and iOS -- provide a different experience. Instead of multiple applications and multiple windows, these operating systems present one application (or "app") at a time. Multiple apps may run, but only one has the screen. Thus, you can listen to music, check e-mail, and get a text message all at the same time. Mobile apps keep the multitasking aspect, but reduce the interaction to one app at a time.

Reducing the number of interactive apps to one is a reduction in capabilities, although it is a simpler experience, and one that makes sense for a phone. (I think it also makes sense for a tablet.)

What I don't see in the mobile operating systems (and what I don't see in Windows, either) is improvements in the ability to share data across applications. DOS had files and pipes (concepts lifted from Unix). Windows added the clipboard, and then Dynamic Data Exchange (DDE) and later, drag-and-drop. The clipboard was popular and is still used today. DDE never got traction, and drag-and-drop is limited to files.

Sharing data across applications is difficult. Each application has its own ideas about data. Word processors hold characters, words, sentences, paragraphs, and documents. Spreadsheets hold numeric values, formulas, cells, rows, columns, and sheets. Databases hold rows and columns -- or "documents" (different from word processor documents) for NoSQL databases. The transfer of data from one application to another is not obvious, and therefore the programming of such transfers is not obvious.
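The Windows clipboard copes with this mismatch by letting the copying application post the same data in several formats at once, and letting the pasting application pick the richest format it understands. Here is a toy sketch of that negotiation in Python; the `Clipboard` class and the format names are illustrative, not any real clipboard API:

```python
# A toy model of multi-format clipboard negotiation: the producer posts
# several representations of the same data; the consumer asks for the
# first format on its preference list that is available.

class Clipboard:
    def __init__(self):
        self._formats = {}  # format name -> data

    def put(self, representations):
        """Replace the clipboard contents with a dict of format -> data."""
        self._formats = dict(representations)

    def get(self, preferred_formats):
        """Return (format, data) for the first available preferred format."""
        for fmt in preferred_formats:
            if fmt in self._formats:
                return fmt, self._formats[fmt]
        return None, None


clipboard = Clipboard()

# A spreadsheet copies a small range: it offers a structured form
# and a plain-text fallback.
clipboard.put({
    "text/csv":   "Qty,Price\n3,1.50",
    "text/plain": "Qty Price\n3   1.50",
})

# A word processor only understands plain text...
fmt, data = clipboard.get(["text/plain"])

# ...while another spreadsheet prefers the structured form.
fmt2, data2 = clipboard.get(["text/csv", "text/plain"])
```

Each application still has its own ideas about data, but the negotiation lets the pair settle on the richest representation they share, falling back to plain text when nothing better is in common.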

But Windows has had the clipboard for thirty years, and DDE and drag-and-drop for almost as long. Have we had no ideas in that time?

Perhaps our current mobile operating systems are the DOSes of today, waiting for a new, bold operating system to provide new capabilities.

Wednesday, January 28, 2015

Not just innovation, but successful innovation

We've heard of "The Innovator's Dilemma", Clayton Christensen's book that describes how market leaders focus on customer needs and are surpassed by other, more innovative companies. That certainly seems to apply to the IT industry. Consider:

IBM, in the 1960s, introduced the System/360, an innovation. (It was a general-purpose computer when most computers were designed and built for specific purposes.) Later, it offered the System/370 which was a bigger, better version of the System/360. They were successful, yet IBM missed the innovation of minicomputers from DEC, Wang, and Data General. IBM offered the old design, yet customers wanted innovation. (To be fair, IBM later offered its own minicomputers, different from its mainframes, and customers did buy lots of them.)

IBM introduced the IBM PC and instantly became the leader for the PC market, eclipsing Apple, Radio Shack, Commodore, and the dozens of other manufacturers. IBM enjoyed success for a number of years. The IBM PC XT was a better version of the PC, and the IBM PC AT was a bigger, better version of the IBM PC (and PC XT). Yet IBM stumbled with the PCjr and the PS/2 lines, and never recovered. Compaq took the lead with its Deskpro line of PCs. IBM was offering innovation, yet customers wanted the old designs.

DEC was successful in the minicomputer business, yet failed in the PC business. Its early personal computers were smaller versions of its minicomputers; compared to PCs they were expensive and complicated. DEC was offering the old designs, yet customers wanted someone else's design.

Commodore built success with its PET, CBM, and especially its C-64 models. It failed with its Amiga, a very innovative computer. Commodore offered innovation, yet customers wanted the IBM PC.

It seems that successful innovation requires two components: a break from past designs and a demand from customers.

Apple innovates -- successfully. From the Apple II to the Macintosh to the iPod (the innovation there was really iTunes) to the iPad, Apple has introduced products that break from its past *and* that meet customer demand. The genius of Apple is that many of its products don't fill a demand from customers, but create the demand. Few people realized that they wanted iPhones or iPads (or Macintosh computers) until they saw one.

There is a lesson here for Microsoft. The PC market is changing; innovation is needed. Microsoft, if it wants to remain a market leader, must innovate successfully. It must introduce new products and services that will meet a customer demand.

The same lesson holds for other technology companies. IBM, Oracle, and even Red Hat should look to successful innovation.

Tuesday, December 9, 2014

Open source .NET is less special and more welcoming

The Microsoft "toolchain" (the CLR, the .NET framework libraries, and the C# compiler) was special. It was Microsoft's property, guarded jealously and subject to Microsoft's whims. It was also the premier platform and toolset for development in Windows and for Windows. If you were serious about application development (for Windows), you used the Microsoft tools.

There were other toolchains. The Java set includes the JVM and the Java compiler. The major scripting languages (Perl, Python, Ruby, and PHP) each have their own runtime engines and class libraries. None were considered special in the way that the Microsoft toolchain was special. (The other toolchains were -- and still are -- considered good, and some people considered them superior to the Microsoft toolchain, but even most non-Microsoft fans would admit that the Microsoft toolchain was of high quality.)

Microsoft's announcement that it is opening the .NET framework and the C# compiler changes that status. Microsoft wants to expand .NET to the Linux and MacOS platforms. They want to expand their community of developers. All reasonable goals; Microsoft clearly sees opportunities beyond the Windows platform.

What interests me is my reaction to the announcement. For me, opening the .NET framework and moving it to other platforms reduces the "specialness" of .NET. The Microsoft toolchain becomes just another toolchain. It is no longer the acknowledged leader for development on Windows.

The demotion of the Microsoft toolchain is accompanied by a promotion of the Java toolchain. Before, the Microsoft toolchain was the "proper" way to develop applications for Windows. Now, it is merely one way. Before, the Java toolchain was the "rebel" way to develop applications for Windows. Now, it is on par with the Microsoft toolchain.

I feel more comfortable developing a Java application to run on Windows. I also feel comfortable developing an application in .NET to run on Windows or Linux. (Yes, I know that the Linux version of .NET is not quite ready. But I'm comfortable with the idea.)

I think other folks will be comfortable with the idea. Comfortable enough to start experimenting with the .NET framework as people have experimented with the Java toolchain. Folks have created new languages to run under the JVM. (Clojure, Scala, and Groovy are popular ones, and there are lots of obscure ones.) I suspect that people avoided experimenting with the Microsoft toolchain because they feared changes or repercussions from Microsoft. Perhaps we will see experiments with the CLR and the .NET framework. (Perhaps new versions of the IronPython and IronRuby projects, too.)

By opening their toolchain, Microsoft has made it more accessible, technically and psychologically. They have reduced the barriers to innovation. I'm looking forward to the results.

Monday, October 6, 2014

Innovation in mobile and cloud; not in PCs

The history of IT is the history of innovation. But innovation is not evenly distributed, and it does not stay with one technology.

For a long time, innovation focussed on the PC. The "center of gravity" was first the IBM PC and PC-DOS; later it became the PC (not necessarily from IBM) and Windows. Windows NT, Windows 2000, and Windows XP all saw significant expansions of features.

With the rise of the Web, the center of gravity shifted to web servers and web browsers. I think that it is no coincidence that Microsoft let Windows XP stand for years with no significant changes. People accepted Windows XP as "good enough" and looked for innovation in other areas -- web browsers, web servers, and databases.

This change broke Microsoft's business model. That business model (selling new versions of Windows and Office to individuals and corporations every so often) was broken when users decided that Windows XP was good enough, that Microsoft Office was good enough. They moved to newer versions reluctantly, not expectantly.

Microsoft is changing its business model. It is shifting to a subscription model for Windows and Office. It has Azure for cloud services. It developed the Surface tablet for mobile computing. Microsoft's Windows RT was an attempt at an operating system for mobile devices, an operating system that had reduced administrative tasks for the user. These are the areas of innovation.

We have stopped wanting new desktop software. I know of no new projects that target the desktop. I know of no new projects that are "Windows only" or "PC only". New projects are designed for mobile/cloud, or possibly web browsers and servers. With no demand for new applications on the desktop, there is no pressure to improve the desktop PC - or its operating system.

With no pressure to improve the desktop, there is no need to change the hardware or operating system. We see changes in three areas: larger memory and disks (mostly from inertia), smaller form factors, and prettier user interfaces (Windows Vista and Windows 8 "Metro"). With each of these changes, users can (rightfully) ask: what is the benefit to me?

It is a question that newer PCs and operating systems have not answered. But tablets and smartphones answer it quite well.

I think that Windows 10 is the "last hurrah" for Windows -- at least the desktop version. Innovations to Windows will be modifications for mobile/cloud technologies: better interactions with virtualization hypervisors and container managers. Aside from those, look for little changes in desktop operating systems.

Tuesday, November 26, 2013

Microsoft generally doesn't innovate

The cool reception of Microsoft's Surface tablets, Windows 8, and Windows RT has people complaining about Microsoft's (in)ability to innovate. Somehow, people have gotten the idea that Microsoft must innovate to stay competitive.

Looking at the history of Microsoft products, I cannot help but think that innovation has been only a small part of their success. Microsoft has been successful due to its marketing, its contracts, its monopoly on Windows, and its proprietary formats.

Still not convinced? Consider these Microsoft products, and their "innovativeness":

  • MS-DOS: purchased from another company
  • C compiler: purchased from Lattice, initially
  • Windows: a better version of OS/2, or MacOS, or AmigaDOS
  • SourceSafe: purchased from another company
  • Visual Studio: a derivation of their earlier IDE, cloned from Borland's Turbo Pascal
  • C# and .NET: a copy of Java and the JVM
  • Windows RT: a variant of Apple's iOS
  • Azure: a product to compete with Amazon.com's web services and cloud
  • the Surface tablet: a variant on Apple's iPad
  • Word: a better version of WordPerfect
  • Excel: a better version of Microsoft's Multiplan, made to compete with Lotus 1-2-3
  • the Xbox: a game console to compete with Sony and Nintendo game consoles

I argue that Microsoft is an excellent copier of ideas, not an innovator. None of these products were Microsoft innovations.

Some might observe that the list of Microsoft products is much longer than the one I have presented. Others may observe that Microsoft continually improves its products, especially after purchasing one from another company. (That is certainly the case for its C compiler, Visual SourceSafe, and C#.)

To be fair, I should list the innovative products from Microsoft:

  • Microsoft BASIC
  • Visual Basic

Microsoft is not devoid of innovation. But innovation is not Microsoft's big game. Microsoft is better at copying existing products and technologies, re-casting them into Microsoft's own product line, and improving them over time. Those are their strengths.

People may decry Microsoft's lack of innovation. But this is not a new development. Over its history, Microsoft has focussed on other strategies, and gotten good results.

I don't worry about Windows 8 and the Surface tablets being "non-innovative". They are useful products, and I have confidence in Microsoft's abilities to make them work for customers.

Saturday, October 8, 2011

What Microsoft's past can tell us about Windows 8

Microsoft Windows 8 changes a lot of assumptions about Windows. It especially affects developers. The familiar Windows API has been deprecated, and Microsoft now offers WinRT (the "Windows Runtime").

What will it be like? What will it offer?

I have a guess.

This is a guess. As such, I could be right or wrong. I have seen none of Microsoft's announcements or documentation for Windows 8, so I might be wrong at this very moment.

Microsoft is good at building better versions of competitors' products.

Let's look at Microsoft products and see how they compare to the competition.

MS-DOS was a bigger, better CP/M.

Windows was a better (although perhaps not bigger) version of IBM's OS/2 Presentation Manager.

Windows 3.1 included a better version of Novell's NetWare.

Word was a bigger version of WordStar and WordPerfect.

Excel was a bigger, better version of Lotus 1-2-3.

Visual Studio was a bigger, better version of Borland's Turbo Pascal IDE.

C# was a better version of Java.

Microsoft is not so much an innovator as it is an "improver", one who refines an idea.


It might just be that Windows 8 will be not an Innovative New Thing, but instead a Bigger Better Version of Some Existing Thing -- and not a bigger, better version of Windows 7, but a bigger, better version of someone else's operating system.

That operating system may just be Unix, or Linux, or NetBSD.

Microsoft can't simply take the code to Linux and "improve" it into WinRT; doing so would violate the Linux license.

But Microsoft has an agreement with Novell (yes, the same Novell that saw its NetWare product killed by Windows 3.1), and Novell holds the copyright to Unix. That may give Microsoft a way to use Unix code.

It just may be that Microsoft's WinRT will be very Unix-like, with a kernel and a separate graphics layer, modules and drivers, and an efficient set of system calls. WinRT may be nothing more than a bigger, better version of Unix.

And that may be a good thing.