Monday, May 31, 2010

Apple neuters the notion of open source

Quietly, Apple has out-maneuvered the open source movement.

The iTunes/AppStore combination works against open source, or at least neuters it on the iPhone/iPod/iPad platform. All applications must be approved by Apple, and therefore Apple has control over all applications (and all possible applications) that can be installed on any of the devices.

Open source relies on a number of assumptions. The primary assumption is the openness of source code. This openness drives many of the licenses.

Open source also relies on the availability of a compiler (or run-time engine in the case of scripting languages). I can release open source projects for C, C++, Java, Perl, Python, Ruby, and lots of other languages and people could use them. If I created a private language and kept the compiler to myself (the compiler itself not just the source for the compiler), then I have prevented anyone else from creating an open source project in that language.

Open source also relies on the ability to install an application. If I have the source code for a project and a compiler to build the application, yet I cannot install a program onto my computer, then open source does not help me.

It is this last case that Apple has used with its consumer platform (the iPhone and its ilk). I can write programs as much as I want, but I cannot install them on my iPhone. Only "blessed" applications can be installed. I could go to the trouble of getting the application approved by Apple and added to its App Store, but that still blocks open source -- no one else can install their (modified) version of the app unless they get that modified version approved.

So we can see that open source is about more than the source code. A successful open source movement requires the source code, the ability to turn the source into a runnable program, the ability to install the application on your computer, and the ability to run the program -- all without approval from another party.

These issues are at the heart of the "trusted computing" platform of a few years ago. With that proposal, the "choke point" was on the execution of the code. The trusted computing platform would run only code that had been signed (approved) by the appropriate authority.

Since Apple has locked the open source movement out of its consumer platform, and Microsoft is doing little to encourage or enable open source on Windows (or Windows Mobile), it seems that open source will be limited on desktops to Linux and limited on phones to Android. Microsoft and Apple will succeed (for a while) with their walled gardens. In the long run, I think a lot of the creative work will occur in the open source world and Apple and Microsoft will be hard-pressed to keep up with it.

Sunday, May 30, 2010

The End of the PC Era

Repent for the end is nigh!

Repentance may not be necessary, but the end is near. The end of the PC era, that is.

I'm going to take some liberty with the phrase "PC" and start the PC era in 1977 with the introduction of the Apple II computer. (Or the "Apple ][", as we wrote it then.) I'm going to ignore the marketing for the IBM PC in 1981.

The Apple II (and the Radio Shack TRS-80, and the Commodore PET, and the Heathkit H-8) began the era of personal computers. (Yes, technically the IMSAI 8080 could be considered the beginning, but that was strictly for serious hackers. The Apple and Radio Shack and Commodore were made for "real people".)

Why do I say that the end is near? I suspect that soon Apple will discontinue the Macintosh line. (All of it, including the iMac, the MacBook, and the Mac Mini.)

With the popularity of the iPhone, the iPod, and the iPad, consumers have no need for a full-blown PC like the Macintosh, nor the patience to put up with the work that personal computers demand. The iPhone/Pod/Pad collection is good enough, and much simpler to use.

When Apple announces the end of the Macintosh, a few things will happen:

1) The Microsoftians will cheer
2) The Linux crowd will look up from their screens and see nothing to cheer (or boo)
3) A lot of graphic designers will be rather unhappy

Graphic designers will be unhappy because their platform of choice will disappear. They will have to move to either Windows or Linux. The iPad will not support their work. But neither Windows nor Linux supports it the way the Macintosh did. Yet this is inevitable -- the market for graphics work is too small to interest Apple.

The Linux crowd will be neutral. They don't need the Mac or Mac OSX -- or so they think.

The Microsoft crowd will be ecstatic -- they will think that they have "won" some mighty battle against the forces of non-Microsoft darkness. Some will proudly (and loudly) proclaim that this decision by Apple proves the superiority of the Microsoft .NET platform. Sadly, they will be wrong.

By abandoning the Macintosh, Apple will be free to pursue the consumer market. The iPhone/iPod/iPad set offers a much better consumer experience than the Macintosh. Apple has every incentive to move people to the "triple-IP set" rather than keep them on the Macintosh.

Microsoft ends up with the corporate desktop, something they already own. Yet the corporate desktop is changing, moving to the virtual world. The corporate world will abandon desktop hardware for virtual desktops, and the equipment on the desktop will be a minimal system, just enough to run the virtual environment. In many ways, this configuration is the old IBM 3270 terminal on a desk. (But with spiffier graphics.)

These two moves bode poorly for desktop manufacturers. People will buy iPhones and not desktops. Corporations will buy virtual PCs and servers but not real PCs. Even the small "Mom and Pop" operations will buy virtual PCs from the cloud.

I predict that by 2015, the PC market will collapse. Sales of real PCs will plummet. (Sales of gaming consoles will remain robust, though.) And by then, everyone will acknowledge the end of the PC era.

Unfortunately for the Linux crowd, the demise of PCs will also mean a demise for hackers. The Linux world will have to move to virtual PCs on servers. I'm not sure how that is going to work for the average hacker.


Saturday, May 22, 2010

Yet it moves

Managers and programmers worry about different things. Programmers worry about code and becoming technically stale. Managers worry about delivering an acceptable solution on time and under cost.

The differences extend to the selection of tools. When picking the programming language for a project, a programmer will worry about the power of the tool and the ability to perform the necessary work. Managers worry about these things (or so they will claim) but they also worry about staffing and risk. Can they hire programmers who are familiar with the language? (Or replace programmers who leave?)

Since managers avoid risk, they prefer tools that have a proven track record, especially in their own organization. By using commonly available languages and development environments, managers have a ready supply of available workers -- the rest of their organization. And if other teams in the organization have selected tools based on the market, then the pool of potential candidates includes programmers in the entire industry.

So when starting a new project, many managers will make the safe choice and pick the popular language. It doesn't really matter that the language is not the best choice for the programming task; the ability to swap in new staff outweighs the difference in language performance.

Yet managers make an assumption, one that leads to problems in the long term. They assume that once started, the project will remain in the chosen language.

There are clear market leaders for programming languages. COBOL was (and perhaps still is) the leader for accounting and financial applications. FORTRAN rocks for scientific applications. The picture for office applications is less consistent.

In the 1980s, the languages of choice for office applications were Pascal and Microsoft Basic. The Macintosh OS interface was based on Pascal, and so was the interface for Microsoft Windows.

During the 1990s, the popular languages were C and later C++ and Visual Basic.

At the turn of the century, Java ascended.

Here in 2010 we like C# and .NET.

This history is an extremely simplified version. In each period, there were other languages. Applications started in an earlier age often stayed in their initial language, with a small percentage moving to a later language. But I think the simple version captures the basic gist of language popularity.

Here's the point: the "popular" language, the "safe" language, changes. The safe language (for a manager) of 1990 is not a safe language in 2010. In fact, managers who recommend that a new project use the C language would be asked for a justification for such an unusual choice. (They may have a specific need, but they would be asked. Managers recommending C# would receive nods of approval without the need for supporting justifications.)

An efficient project needs a ready supply of programmers, and this is what managers worry about.

Yet if the criterion for an efficient project is a ready supply of staff, why don't managers plan for future changes in technology? Today, managers start a project with a language and that language is locked in for the entire life of the project. There is no plan to re-tool the project ten years hence. (It seems that we pick new languages every decade.)

Instead of planning for the (seemingly inevitable) change, managers wait until they are forced to move to a new language. They may want a new platform (C#/.NET replacing C++/MFC) or perhaps recognize a dearth of programming talent. Managers assume that a project will start in a language and continue to use it... forever.

I'm not asking that managers pick the exact time to re-code the project, nor am I asking that they specify the new language in advance. That would require quite the forecasting powers.

I am asking that managers plan on a change in technology, somewhere between eight and twelve years into the life of the project. I am asking that managers work with their technical staff to reduce the work of that transition.

Some would say that planning for such a change is impossible -- we cannot predict the rise of new technology, nor plan for it. I agree with the statement that we cannot make specific (accurate) predictions about new technology. (Who could have predicted the iPhone in 1990? Or the cloud?)

I believe that we can build systems and applications that are more easily converted to new technology. We've been converting systems for the past fifty years. In that time, we have learned some key ideas: keep I/O separate from processing, use multiple levels of logic, make code readable, ... the list goes on.
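The first of those ideas -- keeping I/O separate from processing -- can be sketched in a few lines. This is a minimal illustration with hypothetical names, not a description of any particular system: the core calculation knows nothing about files or screens, so only the thin I/O layer needs rewriting when the platform changes.

```python
def compute_invoice_total(line_items):
    """Pure business logic: no file, screen, or network access.
    Takes a list of (quantity, unit_price) pairs."""
    return sum(qty * price for qty, price in line_items)

def read_line_items(path):
    """I/O layer: the only part tied to the platform.
    Expects lines of the form 'qty,price'."""
    items = []
    with open(path) as f:
        for line in f:
            qty, price = line.split(",")
            items.append((int(qty), float(price)))
    return items

if __name__ == "__main__":
    # The core function can be exercised (and later ported)
    # without touching any I/O at all.
    print(compute_invoice_total([(2, 3.50), (1, 10.00)]))
```

The point of the split is that `compute_invoice_total` would move to a new language or platform essentially unchanged, while `read_line_items` is expected to be rewritten.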

If managers were concerned about the future technology shift (specific or general), they would be measuring the system architecture and ensuring that it was well-organized and prepared for change. Yet they don't.

I can hear the cries of managers throughout the industry: "Yes we do care! We require system documentation and code reviews and project planning meetings! How can you say we don't care about the future?"

To this I respond: Then why is it such an effort to move your systems to new platforms? Why is it so hard to extract business logic from the system? Why is it so hard to assess the quality of your systems? And show me a project plan that includes the task "change platform". (Not a plan developed late in the life of the system, but early on and scheduling the task several years in the future.)

If the system were prepared to move to a new language (or platform), if the code were well-organized and readable, these tasks would be easy.

Our systems have been built on the assumption of a firm, non-moving foundation. Yet it moves.


Thursday, May 20, 2010

Is Microsoft still relevant?

The latest dust-up in the cyber-world has been between Apple and Google, with Adobe playing the role of football. Apple has all but kicked Adobe to the curb, refusing to accept Flash on the iPhone/iPod/iPad platform. Google has recently made noise about accepting Flash on the Droid platform.

Microsoft is nowhere in this debate. While Microsoft has announced Windows Phone 7, they have avoided the Flash debate. Possibly because they are pushing their Silverlight tech and not Flash. (I've seen Silverlight, and am impressed with the idea, but scared by the frequency of major updates.)

By remaining silent, Microsoft cedes ground to Google. (Apple loses to Google in my opinion.)

The fastest expanding market is for Droid phones. Apple iPhone/iPod/iPad tech follows in second. Microsoft is a distant third (if they are third -- Symbian may have that slot) and accounts for a very small slice of the market.

For phone apps, Apple has the existing market, and Google has the expanding market. Microsoft has... a nice tool (Silverlight) with a weakly-accepted OS (WinPhone 7) and a questionable name ("Zune").

In this market, I would invest in Apple and Google. I don't know which will win the handset battle, but I am confident it will be one of the two.

Microsoft can go back to working on new releases of SQL Server and MS Office. Not that anyone will notice.


Monday, May 10, 2010

Not two but five

We in the industry often partition vendors into two camps. It doesn't matter what the camps are; two is the number that we often pick. So we have the Microsoft-Apple division, the Microsoft-Open Source division, or perhaps the Microsoft-Google division.

It seems to me that there are five major clusters of technology: Microsoft, Oracle/Sun/Java, Apple, Google, and Open Source. Minor clusters include Amazon.com and Yahoo.

Microsoft has by far the most complete collection of technologies for its cluster: C# as the primary language (with C++ and Visual Basic as secondaries), .NET and CLR for its framework and run-time, Visual Studio as the IDE, SQL Server for the database, Internet Explorer as the browser, ASP (sorry -- ASP.NET) and Silverlight for web applications, Azure for the cloud, IIS for a web server, and Office for word processing and spreadsheets, and Xbox for games. The holes in its technology set include a scripting language (VBscript is not strong enough to fill this slot) and a retail store for books and music.

Oracle has a less complete set. It has Java for the language, the JVM and multiple class libraries for run-time support, Eclipse as an IDE, the Oracle and MySQL databases, JSP for web apps, and StarOffice for word processing and spreadsheets.

Apple is a strong competitor for Microsoft. It has Objective-C for language (with C and C++ as secondaries), the Carbon and Cocoa run-times, the Safari browser, Microsoft Office, and iTunes for books and music. It lacks a database (although the Unix/Linux databases work) but doesn't seem to miss it, since it focuses on retail applications.

Google has web search, Google Docs (which offer collaboration), the App Engine for cloud apps, online books, and Chrome. It lacks an IDE, an SQL database, and a web server.

Open source has multiple languages (C++, Perl, Python, Ruby, and several others), the Eclipse IDE, the MySQL and Postgres SQL databases, the CouchDB NoSQL database, Open Office for word processing and spreadsheets, and the Apache web server.

The corporate entities have their markets, too. Microsoft and Oracle are vying for the large-scale enterprise customers. Apple is fighting (successfully) for the individual consumer. Google seems to be looking at the small business. Open source is not looking to acquire customers but contributors, which makes comparisons to vendors difficult.

Where will they be in a few years? My guess is not very far from where they are today. The fragmentation of the market seems a stable configuration. I expect that large companies will continue to purchase Microsoft and Oracle solutions. Here I think the bifurcation of the market into Microsoft and Oracle will continue, with companies using one or the other and sometimes both.

I expect Apple to keep its lead in the retail market (despite its alienation of hackers). The current set of individual developers will be replaced by development shops. As companies enter the Apple iPhone and iPad market, look to see an increase in the complexity (and price) of development tools.

I expect that the development of large-scale applications will become more complex, with Microsoft and Oracle offering more expensive solutions for development. I also expect individuals to leave this market, driven out by the high cost of the development tools and the complexity of the toolsets and run-time packages. As it is, an individual is hard-pressed to understand all of the technology necessary to build and deploy a Windows-compliant application. (Even considering the "drag and drop" deployment that Microsoft offers.)

The only place for an individual will be in open source. Open source provides opportunities for individuals, small groups, and large groups. Right now, there are opportunities for individuals, either working alone or working with existing projects. A few open source projects have figured out how to monetize their work; if more projects can do this, the demand for individuals will increase, to the detriment of the proprietary projects.

So I don't see the development world in two groups. I don't see it as "Microsoft vs. Apple", or "Microsoft vs. Google". I see it as a set: Microsoft, Oracle, Apple, Google, or open source.

Thursday, May 6, 2010

What do open source folks worry about?

I'm looking at the session lists for two upcoming open source conferences, and I've noticed something: There are few sessions for Java and no -- none at all -- sessions for C or C++. The sessions cover a variety of topics, from Perl and Python to NoSQL databases, from project leadership to legal issues.

But nothing on C or C++.

There are no sessions on C# or .NET technologies, not even the 'mono' project.

This dearth of C sessions might be due to a bias on the part of the conference organizers. They select the presentations for the conference and may have decided to avoid C and C++ topics.

Yet the conference organizers must select sessions from a pool of submissions. If no one submits presentations on C or C++, then the organizers cannot select them. Also, the organizers must select sessions that will attract attendees, lest they have a very small conference.

So I conclude that there is little interest in the open source community for C and C++. Or perhaps I should conclude that there is *more* interest in other topics, such as the latest version of Perl and techniques for building communities.

If we accept this conclusion, then we can make some other deductions:

People using open source have little concern for hard-core run-time performance. C and C++ provide programs that run fast. Despite the efforts of Microsoft with C# and .NET, and Sun with Java and its JVM, and even the folks maintaining Perl and Python and Ruby, C and C++ have the capability of creating fast programs.

But that doesn't mean that C and C++ let you create programs quickly. C and C++ offer high performance at run-time, at the price of careful design and construction. Java and C# (combined with their IDEs) let you develop programs faster, although the programs themselves won't run as fast.

The same goes for Perl and Python and Ruby. One can develop programs faster, but the programs will run slower than their C counterparts.

Besides program performance, folks in open source are more interested in languages for individuals. C++, C#, and Java are languages for large teams. The supporting libraries and development tools are built for teams of people. (Especially so for the Microsoft stack.) Languages such as Perl and Ruby are designed for individuals and small teams. The libraries are smaller, the development stacks are smaller and lighter.

Interestingly, I believe cost is not an issue. Perl, Python, and Ruby are available for free. Yet so are Java and its Eclipse IDE. And so are C and C++ in the GCC compiler set. Microsoft offers "Express" versions of Visual Studio for free, so one can use C# at no cost.

So to summarize: people working with open source are not concerned (overly) with run-time performance, nor are they concerned about building large development teams, nor are they concerned with the cost of their tools.

So what are they thinking?