Monday, September 9, 2013

Microsoft is not DEC

Some have compared Microsoft to DEC, the long-ago champion of minicomputers.

The commonalities seem to be:

  • DEC and Microsoft were both large
  • DEC and Microsoft had strong cultures
  • DEC missed the PC market; Microsoft is missing the mobile market
  • DEC and Microsoft changed their CEOs

Yet there are differences:

DEC was a major player; Microsoft set the standard DEC had a successful business in minicomputers but was not a standard-setter (except perhaps for terminals). There were significant competitors in the minicomputer market, including Data General, HP, and even IBM. Microsoft, on the other hand, has set the standard for desktop computing for the past two decades. It has an established customer base that remains loyal to and locked into the Windows ecosystem.

DEC moved slowly; Microsoft is moving quickly DEC took cautious steps toward microcomputers, introducing the PRO-325 and PRO-350 computers, which were small versions of PDP-11 processors running a variant of RT-11, a proprietary and (more importantly) non-PC-DOS operating system. DEC also offered the Rainbow, which ran MS-DOS but did not offer the "100 percent PC compatibility" required by most software. Neither the PRO nor the Rainbow saw much popularity. Microsoft, in contrast, is offering cloud services with Azure and seeing market acceptance. Microsoft's Surface tablets and Windows Phones (considered quite good by those who use them, and quite bad by those who don't) do parallel DEC's offerings in their popularity, and this will be a problem for Microsoft if it chooses to keep offering hardware.

The IBM PC set a new standard; mobile/cloud has no standard The IBM PC defined a new standard for microcomputers (the new market). Overnight, businesses settled on the PC as the unit of computing, with PC-DOS as the operating system and Lotus 1-2-3 as the spreadsheet. The mobile/cloud environment has no comparable standard hardware or software. Apple and Android are competing for the hardware market (Apple has higher revenue while Android has higher unit sales), and Amazon.com is dominant in cloud services but is not a standard-setter. (The industry is not cloning the AWS interface.)

PCs replaced minicomputers; mobile/cloud complements PCs Minicomputers were expensive, and PCs (except for the very early microcomputers) were able to perform the same functions as minicomputers. PCs could perform word processing, numerical analysis with spreadsheets (a bonus, actually), data storage and reporting, and development in common languages such as BASIC, FORTRAN, Pascal, C, and even COBOL. Tablets do not replace PCs; data entry, numeric analysis, and software development remain on the PC platform. The mobile/cloud technology expands the set of solutions, offering new possibilities.

Comparing Microsoft to DEC is a nice thought experiment, but the situations are different. Was DEC under stress, and is Microsoft under stress? Undoubtedly. Can Microsoft learn from DEC's demise? Possibly. But Microsoft's situation is not identical to DEC's, and the lessons from the former must be read with care.

Sunday, September 8, 2013

The coming problem of legacy Big Data

With all the fuss about Big Data, we seem to have forgotten about the problems of legacy Big Data.

You may think that Big Data is too new to have legacy problems. Legacy problems affect old systems, systems that were designed and built by Those Who Came Before And Did Not Know How To Plan For The Future. Big Data cannot possibly have those kinds of problems, because 1) the systems are new, and 2) they have been built by us.

Big Data systems are new, which is why I say that the problems are coming. The problems are not here now. But they will arrive, in a few years.

What kind of problems? I can think of several.

Data formats Newer tools (or newer versions of existing tools) change the formats of data and cannot read old formats. (For example, Microsoft Excel, which cannot read Lotus 1-2-3 files.)
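One defense against format drift is to detect the old format before a newer tool rejects it. A minimal sketch, assuming a hypothetical file signature (these "magic" bytes are invented for illustration, not the real Lotus 1-2-3 or Excel values):

```python
# Hypothetical legacy-format signature; real formats use their own magic bytes.
LEGACY_MAGIC = b"LGCY"

def needs_conversion(first_bytes: bytes) -> bool:
    """Return True when the data begins with the legacy signature,
    signalling that a conversion step should run before import."""
    return first_bytes.startswith(LEGACY_MAGIC)

print(needs_conversion(b"LGCY\x01\x02 old payload"))   # True
print(needs_conversion(b"CURR\x01\x02 new payload"))   # False
```

The point is the discipline, not the four bytes: a shop that sniffs and converts old files keeps its archive readable as tools move on.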

Data value codes Values used in data to encode specific ideas change over time. These might be account codes, product categories, or status codes. The problem is not that you cannot read the files, but that the values mean something other than what you think.
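The usual remedy is to record which meanings were in effect when each record was written. A minimal sketch with invented codes and dates (the table, codes, and cutover date are all hypothetical):

```python
from datetime import date

# Hypothetical status codes whose meaning changed on 2015-01-01:
# before the change, code "03" meant "on hold"; afterwards, "cancelled".
STATUS_MEANINGS = [
    (date(2010, 1, 1), {"01": "open", "02": "closed", "03": "on hold"}),
    (date(2015, 1, 1), {"01": "open", "02": "closed", "03": "cancelled"}),
]

def decode_status(code: str, recorded_on: date) -> str:
    """Decode a code using the meanings in effect when it was recorded."""
    meaning = "unknown"
    for effective, table in STATUS_MEANINGS:
        if recorded_on >= effective:
            meaning = table.get(code, "unknown")
    return meaning

print(decode_status("03", date(2012, 6, 1)))   # on hold
print(decode_status("03", date(2016, 6, 1)))   # cancelled
```

Without such an effective-dated table, a query across old and new records silently mixes two different meanings of the same code.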

Missing or lost data Non-Big Data (should that be "Small Data"?) can be easily stored in version control systems or other archiving systems. Big Data, by its nature, doesn't fit well in these systems. Without an easy way to back up or archive Big Data, many shops will take the easy way and simply not make copies.

Inconsistent data Data sets of any size can hold inconsistencies. Keeping traditional data sets consistent requires discipline and proper tools. Finding inconsistencies in larger data sets is a larger problem, requiring the same discipline and mindset but perhaps more capable tools.
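A sketch of what a "more capable tool" for this might look like: a check that streams records one at a time (so it scales past what fits in memory) and flags two common inconsistencies. The record fields and rules here are hypothetical, for illustration only:

```python
def find_inconsistencies(records):
    """Scan records one at a time, flagging duplicate ids and
    totals that disagree with price * quantity (hypothetical rules)."""
    problems = []
    seen_ids = set()
    for i, rec in enumerate(records):
        if rec["id"] in seen_ids:
            problems.append((i, "duplicate id"))
        seen_ids.add(rec["id"])
        if rec["total"] != rec["price"] * rec["quantity"]:
            problems.append((i, "total does not match price * quantity"))
    return problems

records = [
    {"id": 1, "price": 2.0, "quantity": 3, "total": 6.0},
    {"id": 1, "price": 1.0, "quantity": 2, "total": 5.0},  # both problems
]
print(find_inconsistencies(records))
```

The discipline is the same as for small data; only the requirement to process records incrementally (rather than loading everything) changes.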

In short, the problems of legacy Big Data are the same problems as legacy Small Data.

The savvy shops will be prepared for these problems. They will put the proper checks in place to identify inconsistencies. They will plan for changes to formats. They will ensure that data is protected with backup and archive copies.

In short, the solutions to the problems of legacy Big Data are the same solutions to the problems of legacy Small Data.

Thursday, September 5, 2013

Measure code complexity

We measure many things on development projects, from the cost to the time to user satisfaction. Yet we do not measure the complexity of our code.

One might find this surprising. After all, complexity of code is closely tied to quality (or so I like to believe) and also an indication of future effort (simple code is easier to change than complicated code).

The problem is not in the measurement of complexity. We have numerous techniques and tools, spanning the range from "lines of code" to function points. There are commercial tools and open source tools that measure complexity.
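To show how little machinery a basic measurement needs, here is a rough cyclomatic-complexity counter built on Python's standard `ast` module: one plus the number of branching constructs. This is a simplification of the full metric (real tools also weight boolean operators, comprehensions, and so on), offered as a sketch rather than a production measure:

```python
import ast

# Branching constructs that each add one path through the code.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    """Rough cyclomatic complexity: 1 + count of branch nodes in the source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

sample = """
def classify(n):
    if n < 0:
        return "negative"
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "prime-ish"
"""
print(cyclomatic_complexity(sample))   # 4: one path plus if, for, if
```

Open source tools package the same idea with far more polish, which reinforces the point: the obstacle is not tooling.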

No, the problem is not in techniques or tools.

It is a matter of will. We don't measure complexity because, in short, we don't want to.

I can think of a few reasons that discourage the measurement of source code complexity.

- The measurement of complexity is a negative one. That is, more complexity is worse. A result of 170 is better than a result of 270, and this inverted scale is awkward. We are trained to like positive measurements, like baseball scores. (Perhaps the golf enthusiasts would see more interest if they changed their scoring system.)

- There is no direct way to connect complexity to cost. While we understand that a complicated code base is harder to maintain than a simple one, we have no way of converting that extra complexity into dollars. If we reduce our complexity from 270 to 170 (or 37 percent), do we reduce the cost of development by the same percentage? Why or why not? (I suspect that there is a lot to be learned in this area. Perhaps several Masters theses can be derived from it.)

- Not knowing the complexity shifts risk from managers to developers. In organizations with antagonistic relations between managers and developers, a willful ignorance of code complexity pushes risk onto developers. Estimates, if made by managers, will ignore complexity. Estimates made by developers may be optimistic (or pessimistic) but may be adjusted by managers. In either case, schedule delays will be the fault of the developer, not the manager.

- Developers (in shops with poor management relations) may avoid the use of any metrics, fearing that they will be used for performance evaluations.

Looking forward, I can see a time when we do measure code complexity.

- A company considering the acquisition of software (including the source code), may want an unbiased opinion of the code. They may not completely trust the seller (who is biased towards the sale) and they may not trust their own people (who may be biased against 'outside' software).

- A project team may want to identify complex areas of their code, to identify high-risk areas.

- A development team may wish to estimate the effort for maintaining code, and may include the complexity as a factor in that effort.

The tools are available.

I believe that we will, eventually, consider complexity analysis a regular part of software development. Perhaps it will start small, like the adoption of version control and automated testing. Both of those techniques were at one time considered new and unproven. Today, they are considered 'best practices'.

Wednesday, September 4, 2013

Javascript is the new BASIC

The idea that Javascript is the new BASIC is not unique; others have made the same observation. I will add my humble thoughts.

BASIC, not Microsoft's Visual Basic but the elder brother with line numbers, was the popular language at the beginning of the personal computer era.

The popularity of BASIC is not surprising. BASIC was easy to learn and just about every microcomputer had it, from the Apple II to the Commodore PET to the Radio Shack TRS-80. Books and magazine articles discussed it.

Alternate languages were available, for some computers. The system I used (a Heathkit H-89) ran the HDOS and CP/M operating systems, and there were compilers for FORTRAN, COBOL, C, and Pascal. But these other languages were expensive: a FORTRAN compiler cost $150 and COBOL $395 (in 1980 dollars).

The biggest competitor to BASIC was assembly language. Assemblers were modestly priced, but the work necessary for the simplest of tasks was large.

BASIC was available, and we used it.

BASIC wasn't perfect. It was designed as a teaching language and had limited programming constructs. While it had an 'IF' statement, most variants had no 'IF/ELSE' and none had a 'WHILE' loop. Variable names were a single letter and an optional digit. It had no support for object-oriented programming. It was interpreted, which carried the double drawbacks of poor performance and visible source code. Your programs were slow, and the only way to distribute them was by giving away the source (which, given BASIC's limitations, was unreadable for a program of any significant size).

BASIC was popular and reviled at the same time. Dijkstra famously declared "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." But it wasn't just him; we all, deep down, knew that BASIC was broken.

We were forced to program around BASIC's limitations. We learned some good habits and lots of bad ones, some of which haunt us to this day. Yet its limitations forced us to think about the storage of data, the updating of files, and the complexity of calculations.

We also looked forward to new versions of BASIC. While some computers had BASIC baked into ROM (the Commodore C-64 and the IBM PC), other computers had ways of using new versions (the IBM PC had a 'BASICA' that came with PC-DOS).

BASIC was not just the language of the day but the language of the future.

Today, Javascript is the language that is available and easy to learn. It is not baked into ROMs (well, not usually) but it is baked into browsers. Information about Javascript is available: there are lots of web pages and books.

Like BASIC, Javascript is not perfect. No one (to my knowledge) has claimed that learning Javascript will permanently stunt your programming skills, but the feeling I get from Javascript programmers is similar to the feeling I remember about BASIC programmers: They use the language and constantly hope for something better. And they are working around Javascript's limitations.

BASIC was the language that launched the PC revolution. What will Javascript bring?


Friday, August 30, 2013

The collision between BYOD and enterprise

There is a conflict looming in the business world. A new meme is rising to challenge the concept of enterprise software.

That meme is "Bring Your Own Device" (BYOD).

First, a short discussion of enterprise software. What, exactly, is "enterprise software"?

It can be hard to define enterprise software. What is it about a specific application that makes it an "enterprise" application? Is it simply the applications used in business (like Microsoft Word and Excel)? Is it the ability to integrate with Microsoft's Active Directory? Is it an expensive support contract?

Here's my definition: Enterprise software is selected by one person and used by another. In most (if not all) large organizations (business, non-profit, and government) someone, perhaps a committee of someones, selects the "standard software" used within the organization. The chosen software is then foisted upon the troops, who must use the specified software or face the wrath of the standards committee, senior managers, and the Human Resources department.

Companies have reasons to standardize software. The reasons are varied, yet generally devolve into one of these:

  • Common data formats to exchange information
  • The company can buy licenses at a discount
  • Support teams can reduce costs by focusing on a limited set of software
  • People can easily move from one project to another

These are all valid reasons for a company to standardize on software.

But look at what happens in the new world of "Bring Your Own Devices" (with the implication of bringing your own software):

  • The employee is buying the license, not the company. No volume discounts!
  • The employee supports himself (or herself). No support team needed!

The advantage of "Bring Your Own Device (and software)" to the company is that the cost of acquisition and maintenance shifts to the employee. Yes, companies will claim that they implement BYOD(S) to improve workforce morale, but really the accountants have looked at the numbers and blessed the decision based on the reduction of expenses.

Once a company implements BYOD(S), the justifications for standardized software become:

  • Common data formats to exchange information
  • People can easily move from one project to another

The former has some merit, but is weak. In the days of Wordstar, WordPerfect, and MS-Word (three word processors with very different data formats) the ability to exchange information across products was limited. Today's word processors (and spreadsheets, and even virtualization managers) have moved to common, open formats.

That leaves only one reason to enforce standard software: moving people from one project to another. And that argument is difficult to sustain, given that employees assigned to any project will own their devices and software. They still pick the software for their devices!

So there you have it. The BYOD(S) movement, which forces people to purchase their own hardware and software, also moves us away from enterprise software.

* * * * *

Of course, the situation is not that simple. The logic works for my definition of enterprise software, which was simplistic. Enterprise software is selected by someone and used by another, but it is much more. Enterprise software is designed for collaboration among employees, with common data and often restrictions on visibility and operations upon that data, based on user identity.

The significant enterprise systems (ERP systems, calendaring, and the legacy accounting systems) will remain enterprise systems. Employees will use apps to access them, from tablets or smartphones or even PCs.

The other "enterprise systems", the desktop apps that standards committees often argue about, will, on the other hand, evaporate into the realm of employee-owned devices. Word processors, spreadsheets, presentation software, and many other applications will cease to be a worry of the standards committees. Instead, the standards committees will worry about data formats and storage locations.

Monday, August 26, 2013

Microsoft loses ability to FUD

Something interesting happened in recent years, something that has gone unnoticed.

Microsoft has lost its ability to "FUD".

"FUD" is an acronym for "Fear, Uncertainty, and Doubt". It is a marketing technique, used (if not invented) by IBM in the mainframe era.

FUD can be used only by a large, established company. It is a bullying technique, in which the large company prevents customers from purchasing products or services from smaller competitors.

Here's how it works:

Step 1: Company A, the large established company, has a dominant position with some set of products or services.

Step 2: Company B, a smaller company, introduces a new product that changes the market. Perhaps it is a new product, one that uses a different paradigm from the existing products (which happen to be offered by Company A).

Step 3: Company A, perceiving Company B as a threat, talks with customers and claims to have a product or service that will perform the same functions as Company B's product. Company A tells the customer to wait for their product, and delay any purchase of Company B's product.

It's a neat arrangement. Company A can use their reputation to throttle the sales for Company B, simply by claiming to have a similar product. Of course, Company A's product will be compatible with the rest of Company A's product line, to which many customers are already committed. Company B loses potential sales, while customers wait for the offering from Company A.

Some may call it unfair. Perhaps it is. It allows a large company to keep market share, at the expense of small companies.

FUD works solely on reputation. A large company can use it, because they have a credible claim to deliver the promised product or services. A small company cannot use FUD; customers judge them only after they deliver.

IBM used this technique. Microsoft used this technique.

But Microsoft cannot use this technique today.

Legal challenges to Microsoft have forced it to use open formats for documents and spreadsheets, which allows competing products to read and write the files from Microsoft products. New companies have introduced new products that do not compete with Microsoft but exist in a different space (think "Facebook" and "Twitter"). Microsoft no longer dominates the market.

Customers can purchase products from Apple, Asus, and Samsung, and services from Amazon.com, Google, and even IBM. People and enterprises can purchase competing products for Microsoft Word and Microsoft Excel, without fear of incompatibility.

Customers have lost their fear of Microsoft. Now, Microsoft has to prove their worth to customers.

Sunday, August 25, 2013

The coming split of Windows

Microsoft has had a "Windows on every PC" philosophy since the dawn of Windows. (Prior to the first release of Windows, the philosophy was "Microsoft software on every PC".) The new releases of Windows 8 for PCs and smartphones continue that philosophy.

I think that philosophy is changing.

The different platforms of tablets, desktop PCs, and servers, are, well, different. They have different needs, and they serve different purposes. This is especially visible with the Surface RT, which doesn't run Windows EXE files. (Technically, it does. It runs the Office EXE files for Word and Excel. Windows RT does not allow you to install EXE files.)

In the mobile/cloud world, the tablet and server run different programs, with different purposes and with very different designs. Tablets focus on the user interface, and servers handle data storage and calculations. Even today, Windows for the desktop is different from Windows Server.

I expect Microsoft to move away from "one Windows on every platform" and introduce variants of Windows for each platform. Microsoft will keep the "Windows" name; it is a recognized and trusted brand. But the Windows for tablets (Windows RT) will be different from the Windows for servers (Windows Server), and both will be different from Windows for the desktop. I expect Windows Phone to be very similar to Windows RT.

Such a product line matches Apple's offerings, and in a sense matches Google's. Apple's iOS powers phones and tablets; MacOS powers their desktops. Google offers Android for phones and tablets, ChromeOS powers their laptops, and they have a number of cloud offerings for server-based computing.

A split along hardware lines also makes sense technically. The three platforms offer different capabilities and must provide different types of services. Mobile devices must be location-aware; servers must provide fail-over and reliability and probably run as virtual servers. Forcing one Windows API on all platforms is wasteful.

Microsoft could handle this split by forking the code base and developing different products, but they have another option: modules. They may choose to make Windows modular, using a single kernel with modules that can be added to build different configurations. A modular Windows is not that far-fetched; Windows already has a kernel/module design. In addition, it creates possibilities for other combinations, such as a specific Windows for embedded systems, another Windows for gaming, and yet another Windows for high-reliability systems.

I think the future of Windows is one of multiple variants, each serving a specific need.