Friday, July 31, 2009

RIP Software Development Conference

I am behind the times. Not in the loop. Uninformed.

Techweb killed the SD conferences. These were the "Software Development" conferences that I liked. (So much that I would pay my own way to attend them.)

Techweb killed them back in March, shortly after the "SD West 2009" con.

Here's an excerpt of the announcement that Techweb sent to exhibitors.

Due to the current economic situation, TechWeb has made the difficult decision to discontinue the Software Development events, including SD West, SD Best Practices and Architecture & Design World. We are grateful for your support during SD's twenty-four year history and are disappointed to see the events end.

Developers remain important to TechWeb, and we encourage you to participate in other TechWeb brands, online and face-to-face, which include vibrant developer communities:
...
Again, please accept our sincerest gratitude for your time, effort and contributions over the years. It is much appreciated.


The full text is posted on Alan Zeichick's blog.

The SD shows were inspirational. They brought together people with the one common interest of writing software. The shows were not sponsored by a single company, nor did they focus on one technology. People came from different industries to discuss and learn about all aspects of software. As one fellow attendee said: the conferences were "ecumenical".

While I'm saddened at the loss (and bemused at my ignorance of their demise), I'm disappointed with Techweb's approach. Their announcement is bland and uninspiring. The brutal utility of the message tells us of Techweb's view of its mission: running conferences efficiently (and profitably). Their announcement can be paraphrased: "SD was not profitable, so we killed it. We've got these other shows; please spend money on them."

In contrast, the O'Reilly folks have a very different mission: building a community. They run conferences for the community, not as their means of existence. (They also publish books, host web sites, and run other events.) If a conference should become unprofitable, then it becomes a drag on their mission of building community and I would expect them to cancel it. But here's the difference: I would expect O'Reilly to provide another means for people to meet and discuss and learn, and I would expect O'Reilly to phrase their announcement in a more positive and inspirational light. Something along the lines of:

We've been running the (fill in the name) conference for (number) years, bringing people together and building the community. In recent years, the technology and the environment have changed, and the conference is no longer the best way to meet the needs of the practitioners, presenters, and exhibitors. We're changing our approach, and creating a new (whatever the new thing is) to exchange experiences and learn from each other. We invite you to participate in this new aspect of our community.

OK, that's not the perfect announcement, but it's much closer to what I want from conference organizers.

I'm not a marketing expert; I'm a programmer. But I know what I want: Someone who listens. O'Reilly does that. I'm not sure that Techweb does.

Tuesday, July 28, 2009

Last century's model?

When it comes to processes, commercial software development is living in the industrial age. And it doesn't have to be that way.

Lots of development shops use the standard "integration" approach to building software. Small teams build components, which are then integrated into assemblies, which are then integrated into subsystems, which then become systems, which are then assembled into the final deliverable. At each step, the component/assembly/subsystem/system is tested and then "released" to the next higher team. Nothing is released until it passes the group's quality assurance process.

This model resembles (one might say "duplicates") the process used for physical entities by defense contractors, automobile assembly plants, and other manufacturers. It leads to a long "step" schedule, with each stage dependent on the release of the pieces below it.

But does it really have to be that way?

For the folks building a new fighter plane, the model makes sense. If I'm working on a new jet engine, you can't have it, because it can be in only one physical place at any given time. I need it until I'm done. There's no way I can work on it and let you have a copy. (Oh, we could build two of them, but that would add a great deal of expense, and the synchronization problems would be significant.) We are limited to the "build the pieces and then bring them together" process.

Once the complete prototype is proven, we can change to an asynchronous assembly process, one that creates a large number of the components and assembles them as needed. But for software, that amounts to duplicating the "golden CD" or posting the files on the internet.

Software is different from physical assemblies. You *can* have a copy while I work on it. Software is bits, and easily copied. Rather than hold a component in hiding, you can make the current version available to everyone else on the project. Other teams can integrate the latest version into their component, test it, and give you feedback. You would get feedback faster, since you don't have to wait for the "integration phase" (when you've committed to the design of your component).

And this is in fact what many open source projects do. They make their latest code available... to anyone. The software projects that I have seen have two sets of available code: the latest build (usually marked as "development" or "unstable") and the most recent "good" release (marked as "stable").

If you're on a project (or running a project) that uses the old "build pieces independently and then bring them together" process, you may want to think about it.

Sunday, July 26, 2009

Just what is a cloud, anyway?

The dominant theme at last week's OSCON conference was cloud computing. So what can I say about cloud computing?

As I see it, "cloud computing" is a step towards the commoditization of computing power. The cloud model moves server hardware and base software out of corporate data centers and into provider data centers. Clients (mostly corporations) can use cloud computing services "on demand", paying more as they use more and less as they use less.

Cloud computing services fill the second tier, between the front-end browser and the legacy back-end processing systems. This is where web processing is today.

The big players have signed on to this new model. Microsoft has its "Azure" offering, Amazon.com has EC2 and S3, and Google has its App Engine.

Cloud providers are similar to the early electric companies. They build and operate the generators and transmission lines, and insist on meters and bills.

Moving into the cloud requires change. Your applications must be ready to work in the cloud. They have to talk to the cloud APIs and be designed to run in multiple instances. Indeed, one of the features of the cloud is that new instances of your application can come on-line as you request more computing power.

Like the early electricity companies, each provider has its own API. Apps for the Amazon.com platform cannot be (easily) transferred to Google. And Microsoft not only has its own API but uses the .NET platform with its development languages. I suspect that common APIs will emerge, but only after time and possibly with government assistance. (Just as the government set standards for control pedals in automobiles.)

I suspect that apps on the cloud will be different from today's apps. Mainframes had their standard apps: accounting and finance, mainly. When minicomputers arrived, people ported the accounting apps to minis with some success but also created new applications such as word processing. Later, PCs arrived and absorbed the word processing market but also saw new apps such as spreadsheets. Networked PCs created e-mail but left the old apps in stand-alone mode. The web saw new applications like LiveJournal and Facebook. (OK, yes, I know that e-mail existed prior to networked PCs. But networked PCs made e-mail possible for most people. It was the killer app for networks.)

Each new platform sees new applications. Maybe the new platform is created to serve the new apps; maybe the new apps are created as a response to the new environment. I don't know the direction of causality, but I'm leaning towards the former. With cloud computing, expect to see new applications, things that don't work on PCs or on the current web. The apps will meet new needs and have advantages (and risks) beyond today's apps.

Sunday, July 19, 2009

Skating to the puck

I recently reviewed an internal study of competing technologies for a new, large-scale, Windows application. It was the typical survey of the possible platforms for the application, summarizing the strengths and weaknesses of each. This kind of analysis has been done hundreds (or thousands, or tens of thousands) of times by companies and organizations around the world and back to the dawn of computing. The thinking is: Before we set out on this project, let's review the possible technologies and pick the best one.

As I was reading the study, I realized that the study was wrong.

Not wrong in the sense of improperly evaluating technologies, or wrong in the sense that the authors ignored a possible platform. They had included the major platforms and listed the strengths and weaknesses in an unbiased presentation.

It was wrong in the sense of time. The report had the wrong tense. It looked at the present capabilities of the platforms. It should be looking at the future.

The project is a long term project. The development is planned for five years, with a lifetime of ten years after that development. (That's a simplified version of the plan. There will be releases in the five year development phase, and enhancements and maintenance during the ten-year follow-on phase.)

For such a project, one needs a view of the future, not a view of the present. Or, as Wayne Gretzky learned from his father: "skate to where the puck is going to be, not to where it has been."

The study looked at the major technologies (.NET, Java, Silverlight, Air, and Flash) and reviewed their current capabilities. The authors made no projections of the possible futures for these platforms.

I understand that predictions are hard. (Especially predictions about the future.) But the majority of the development effort will be made in the future, from two to three years out, and continuing for a decade. Looking at the current state of the technologies and making a fifteen-year decision on that basis is insufficient. You have to look at where each technology is going.

This study was part of a new development effort, to replace an existing product that was difficult to maintain. The existing product was about ten years old, with parts going back fifteen years. The difficulties were due to the technology and design decisions that had been made at the inception of the project and during its life, some only a couple of years ago.

This team is embarking on a repetition of their previous development effort. Instead of creating a long-lasting design on robust technology, they are building a system that, in a few years, will be difficult to maintain.

Pucks move and technology changes. If you always skate to where the puck currently is, you will always be behind.

Tuesday, July 14, 2009

Not with a bang

Has the Age of Windows passed? I think it has. Not only that, I think the age of the web application is passing. We're now entering the age of the smartphone app.

In the past, transitions from one technology to another have been sharp and well-defined. When IBM released the first PC (the model 5150), companies jumped onto the PC-DOS bandwagon. Products were either ported from CP/M to PC-DOS or made for PC-DOS with no attempt at compatibility with the older systems. (A few kept compatibility, perhaps, but the vast majority of applications were made for PC-DOS and the IBM PC.)

When Microsoft introduced Windows 3.1, manufacturers climbed onto the bandwagon and created Windows applications and abandoned MS-DOS. Windows 3.1 was a "windows thing" on top of the "DOS thing", so the old MS-DOS applications still ran, but all new applications were for Windows.

(I'm ignoring certain technologies, such as OS/2, CP/M-86, and the UCSD p-System. I'm also ignoring Macintosh, although I suspect that the Apple II/Macintosh transition was also fairly swift.)

Back to Windows. Since the rise of Windows, we've had one major transition and we're in the midst of another major transition. The first was the shift from Windows (or client/server) applications to web applications. The second, occurring now, is from Windows and desktop web applications to mobile web applications.

The shift from client/server to web app occurred slowly. There was no "killer app" for the web, no counterpart to Lotus 1-2-3 that pulled people to PCs or network support that pulled people to Windows 3.1. The transition was much slower. (One could argue that the killer app for the web was Google, or YouTube, but it is a difficult case.)

Back to Windows. I think that the Age of Windows is over. Think about it: all new applications are designed for either the web or a mobile phone (usually the iPhone).

Don't believe me? Try this test: Name a major commercial application designed for Windows that has been released in the past year. Not applications that run in a browser, but on Windows (and only on Windows). I was going to rule out applications from Microsoft, but I cannot think of new applications for Windows, even from them. (New versions of products don't count.)

I cannot think of any new applications. I can think of new applications for the web (Facebook, Twitter, DOPPLR, etc.) but these live in the browser, and none of them are tied to Internet Explorer. The iPhone has oodles of new applications, including games such as Labyrinth and the Ocarina player. I don't know of anything significant in the commercial space that is specific to the iPhone, but I suspect that it is coming.

But it's more than just a move away from Windows. The market has moved, quietly, from Windows to web apps. Windows applications join their older cousins written in COBOL for "big iron" in the maintenance yard. That's old news.

The market is moving again.

As I see it, Facebook is the last major desktop web application. By "desktop web application" I mean an application that was designed for a web browser on a desktop PC. Facebook was certainly designed that way, with the iPhone version as an afterthought.

Twitter, on the other hand, was designed out of the gate as an iPhone application, with the Windows client as a concession to the technical laggards.

The creativity has shifted to the smart phone. New applications will be made for the iPhone and other smart phones. The talented developers are thinking about smartphones, not desktop web, and certainly not Windows. Older platforms such as the desktop web and Windows are now "mature" platforms. Their applications will be maintained, but the platforms get nothing new. The new applications will be designed for smartphones. A few apps may carry over from the smartphone to the older platforms, but it will be difficult: smartphones have mobility, position awareness, and a degree of intimacy not available to desktop users.

Not with a bang, but a whimper, does the curtain fall on Windows. And desktop web applications.

Sunday, July 12, 2009

Upper bounds

Sometimes, our environment limits us. Sometimes our physical capabilities limit us. It's good to know which, because a limit in one can reduce the usefulness of plenitude in the other.

For example, our visual system limits our ability to channel surf on a cable network. These limits create an upper bound on the number of channels that we can use on a cable network.

Why? Because we can surf only so fast, and therefore surf a finite number of stations in a given period of time. Let's assume that I can surf from channel to channel, spending one half second on each channel. If I start at channel 2 and work my way upwards, it will take me some amount of time to reach the end and "wrap" back to channel 2. With twenty channels, it takes ten seconds. With two hundred channels, almost two minutes. With a thousand channels, more than eight minutes.

If we had ten thousand channels, it would take well over an hour to surf them. By the time we decided on a channel, the program would be long over.

Here's the interesting observation: With a thousand channels, by the time one surfs the entire collection, more than a quarter of the original program (if a thirty-minute show) has already passed. We have spent too much time surfing and not enough time watching. Our physical capabilities (the ability to process a video signal and decide to stay or go) create an upper limit on the number of channels.
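That surfing arithmetic fits in a few lines of Python. This is just a sketch of the example above; the half-second dwell time per channel is the assumption doing all the work:

```python
# Time to surf an entire channel lineup, assuming a fixed
# dwell time of half a second per channel.
DWELL_SECONDS = 0.5

def surf_time_minutes(channels: int) -> float:
    """Minutes needed to sample every channel exactly once."""
    return channels * DWELL_SECONDS / 60

for channels in (20, 200, 1_000, 10_000):
    print(f"{channels:>6} channels: {surf_time_minutes(channels):5.1f} minutes to surf")
```

Changing DWELL_SECONDS models a faster or slower surfer; the upper bound moves, but it never goes away.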

That limit holds for the strategy of surfing. If we use a different strategy (perhaps looking for specific programs or types of programs, or using an index, or relying on a TiVo-like prediction system) then we can use a larger cable network.

This effect comes into play with a lot of things, not just cable television. The web is a large collection of channels. So are the applications in the Apple iTunes store, the books in a bookstore, and the videos on YouTube (or Hulu).

Books and web videos don't have the time-limit of television programs, yet we all have finite time, finite resources to spend on viewing, listening, processing, or reading. We are all constrained to make choices in finite time.

The trick is knowing our limits.

Sunday, July 5, 2009

Revisiting Babel

Is it possible to move faster by going slower? The concept defies our intuition, yet such a thing may be possible.

For example, on one recent project a team made significant progress. The details were related to me at the Open Source Bridge 2009 conference. I won't go into them here, as they are not important. The important point is that the team completed its task, faster than expected, and under budget.

This progress was surprising, as the team members came from different countries and spoke different native languages. They all spoke English, and used it as the common language for the project, but none were fluent in it.

How could such a team make any progress, never mind rapid progress?

The speculation is that since English was a second language, team members spent more than the usual amount of time listening to other members. Processing a second language is often harder; one must pay more attention and often ask for clarification.

The simple acts of listening and asking for clarification may have made the difference.