Thursday, July 23, 2020

IBM and Apple innovate in different ways

Two of the most influential companies in the PC universe are IBM and Apple. There are others, including Microsoft. But I want to compare just IBM and Apple. These two companies have similarities, and differences.

IBM and Apple both made their names as hardware companies. Apple is still a hardware company, although its main business is now phones rather than computers. IBM is now more of a services company; it was a hardware company in 1981 when it introduced the IBM PC.

Both companies innovated and both companies created designs that influence the market today.

IBM introduced the detached keyboard (other systems were all-in-one designs, or keyboard-and-CPU units with a separate display). IBM also introduced the internal hard drive on PCs, the original 8-inch floppy disk, the 3.5-inch floppy disk as the PC standard, the ThinkPad TrackPoint (the "pointing stick"), and the VGA display and video card.

Apple introduced the mouse for personal computers (the original mouse appeared two decades earlier, on a system much larger than a PC), the PowerBook laptop (a year before the first ThinkPad), AppleTalk, the touchscreen on the iPhone, and (notably) iTunes, which gave consumers a reliable, legal way to load music onto their devices.

Apple stands out in that it innovates not just by adding features, but by removing them. Apple was first in delivering a computer without a floppy drive, and then a computer without a CD drive. Apple famously removed the headphone jack from its phones. It also omitted the Home, End, PgUp, and PgDn keys on its laptops, going back as far as the first PowerBook. (As the PowerBook was not compatible with the IBM PC, it had no need of those keys.)

Apple, more than IBM or any other hardware supplier, has innovated by removing things. The makers of Windows PCs and laptops have typically followed Apple's lead: over time they have removed floppy drives and CD drives, and most laptops now require multi-key presses for Home, End, PgUp, and PgDn.

IBM innovated by adding features. Apple innovates by trimming away features. That's quite a difference.

Of course, one can remove only so much. Apple has trimmed the phone to a simple slab with a single port for charging and data transfer. It has trimmed the MacBook to a thin wedge that opens to a screen, keyboard, and trackpad. There is very little left to remove, which means that Apple has little room to innovate along its traditional lines.

But notice that Apple's innovation-by-reduction has been in hardware. Its operating systems are not the equivalent of a slim wedge. To the contrary, Apple's macOS and iOS are somewhat bulky, something Apple shares with Windows and Linux.

Of the major operating systems, macOS probably has the best chance of slimming down. Apple has the "remove things to make it better" mindset, which makes it easier to drop features. Apple also has close integration between operating system and hardware, and it drops support for older hardware, which lets it remove drivers for older devices. Windows and Linux, in contrast, want to support as much hardware as they can, which means adding drivers and allowing for older devices.

Let's see if Apple's "less is more" approach works its way into macOS and into Swift, Apple's favored language for development.

Thursday, July 16, 2020

Low code and no code still require thinking

One of the latest fads in tech is the "low code" (sometimes called "no code") approach to application development. This approach lets one construct an application without programming. The implication is that anyone can do it, and that one does not need to study programming or hire a programmer.

In a sense, all of this is true. And yet, it is not quite true.

Nor is it new.

Let's look at the "low code is really old" idea.

We have seen "low code" development for decades. It comes and goes, a cycle almost in sync with cicadas (which emerge from the ground every seventeen years).

It has been called different things in the past: "Report Program Generator", "Powerbase", "Fourth Generation Languages", and possibly even "COBOL" (which was hawked as a way for managers and non-programmers to read -- although not write -- code).

Each incarnation of low-code solutions uses contemporary technology and runs on mid-grade, generally available hardware. The idea is to make these solutions available to everyone, or at least a large portion of the population, without requiring expensive or specialized equipment. The 2020 version of "low code development" includes cloud-based, web-based, multiuser applications that run on servers and are accessed through a web browser.

There are multiple vendors offering different solutions. Some solutions are little more than multi-user spreadsheets with templates for data entry and charts. Others are more sophisticated. But all share a common trait, one that has been in all low-code solutions over time: they build what they build, with little room for expansion or customization. Low-code solutions offer a room with walls, and you can move anywhere in that room -- until you reach a wall.

(Tech-savvy folks will observe that *all* programming solutions, including those built with high-end hardware and sophisticated programming languages, are "rooms with constraining walls". Computers are not infinite, nor are compilers, processors, and databases. You can do only so much, and then you must stop. All programming methods, high-code and low-code, have limits. But the low-code methods have more restrictive limits.)

My complaint is not about the limits, though. My complaint is about the unstated assumption that anyone can quickly build an application.

Any programming solution, high-code or low-code, requires more than programming. It requires an understanding of the data, and of how that data is used within the organization. It requires that the person building the application know the range of data values, the properties of individual data elements, and the ways in which those elements can be combined.

In other words, you still have to know what you want to do, you have to understand your data, and you have to think. Even for a low-code solution. Building a solution without the proper analysis and thought can lead to the wrong solution.
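
To make that concrete, here is a minimal sketch in Python; the field names and the business rules are hypothetical, invented only to illustrate the kind of knowledge that no tool, low-code or otherwise, can supply for you:

    # A minimal sketch (not tied to any particular low-code product) of the
    # kind of data knowledge any solution requires. The field names and
    # rules here are hypothetical examples.

    def validate_order(order):
        errors = []

        # Property of an individual element: quantity must be a positive integer.
        if not isinstance(order.get("quantity"), int) or order["quantity"] <= 0:
            errors.append("quantity must be a positive integer")

        # Range of a data value: discounts run from 0 to 40 percent in this example.
        if not (0.0 <= order.get("discount", 0.0) <= 0.40):
            errors.append("discount must be between 0% and 40%")

        # Combination of elements: a rush order cannot also ship by ground.
        if order.get("rush") and order.get("ship_method") == "ground":
            errors.append("rush orders cannot ship by ground")

        return errors

    print(validate_order({"quantity": 5, "discount": 0.50, "rush": True, "ship_method": "ground"}))

The point is not the code. Whether you express those rules in Python, in a low-code form builder, or in a spreadsheet, someone has to know them, and that someone is you.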

Low-code solutions are good for simple tasks, when those tasks are well-defined, have few (or no) exceptions, and do not have high performance requirements.

The high-code solutions require more planning and more coding (of course!), but offer greater expansion capabilities and more customization. You can detect specific conditions and add special processing for unusual cases. High-code languages also offer support for performance analysis such as profiling. 
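
As an illustration, here is a minimal Python sketch (the pricing function and its "unusual case" are hypothetical) of two things a high-code solution makes straightforward: special processing for an exceptional condition, and performance analysis with a standard profiler such as Python's cProfile:

    import cProfile

    # Hypothetical pricing routine with special processing for an unusual case.
    def price_order(quantity, unit_price, customer_type="retail"):
        total = quantity * unit_price
        # Special case: wholesale customers ordering in bulk get a negotiated rate.
        if customer_type == "wholesale" and quantity >= 1000:
            total *= 0.85
        return total

    def run_batch():
        # Simulate a batch of orders to give the profiler something to measure.
        return sum(price_order(q, 2.50, "wholesale") for q in range(1, 5000))

    # Profile the batch to see where the time goes -- the kind of tuning
    # a low-code tool rarely exposes.
    cProfile.run("run_batch()")

A low-code tool may let you express the first; it rarely gives you anything like the second.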

Another unstated assumption is that once you have built your solution (low-code or otherwise), you are done. Experienced programmers know that programs and systems are not static, that they grow over time as people think of new uses for them. Low-code solutions tend to have little headroom, little upward potential. (They do what they do, and no more.) High-code solutions can be built in ways that let them expand easily, but such expansion is not guaranteed. I have seen any number of custom, high-code solutions that could not expand to meet the growing demands of the business.

Low-code systems have a place in the world. One can argue that spreadsheets are the extreme of low-code solutions (if we omit macros written in VBA). Yet even spreadsheets require us to understand the data.

Selecting a low-code solution over a high-code solution is about trade-offs: ease of use (in the low-code solutions) against expansion and tuning (in the high-code solutions). Which is better for you will depend on the task at hand, the people you have and their skills, and the future uses of the data and the developed system.

I favor simplicity, and consider solutions in terms of their complexity. I try to use the simplest technology that will do the job:

First, consider using a spreadsheet. If a spreadsheet will handle the data and do what you need, use it.

Second, consider a low-code solution. If a low-code app will handle the data and do what you need, use it.

Finally, consider a high-code solution. If the simpler solutions don't do what you need, use the high-code solution.

But think about what you need, about your data, and about the limits on different solutions.

Then decide.

Wednesday, July 8, 2020

Remote work was a perk, now it may be the norm

Before the 2020 COVID-19 pandemic, remote work was unusual, and office work was the norm. After the pandemic, which will be the perk and which the norm?

During the pandemic, for many companies -- especially those which develop software -- remote work has become the norm. But that is during the pandemic. I am thinking about the period after the pandemic.

I am quite certain that companies that have switched from the pre-pandemic configuration of a large office building full of analysts, programmers, testers, and system administrators to the pandemic configuration of a large empty office building with analysts, programmers, testers, and system administrators working at home have noticed that their large office building is ... empty. I am also certain that someone in those companies has "done the math" and calculated the savings in keeping those employees working from home and moving the necessary employees to a much smaller office building.

(I am also certain that most office buildings are leased, and the owners of said buildings have looked at the possibility that their clients will want to move to smaller quarters. But back to the client companies.)

It is clear that after the pandemic, at least some companies will switch to a permanent "work from home" strategy. Twitter and a few other notable companies have made such announcements. But will all companies do so?

The 2020 COVID-19 pandemic may bring about a change to corporate culture. The pre-pandemic mode of thought (employees work in the office except for a few favored individuals) could switch to a new mode of thought, namely "employees work from home except for a few favored individuals".

It is possible, although unlikely, that all companies will change to this new mindset of remote work as the norm. It is more likely that some companies will change and some companies will stay with the original idea of remote work as a perk.

And it is quite possible that some of the former will adopt a second mindset, one of "office work as a perk" -- the mirror image of what we had prior to the pandemic. Instead of remote work being the perk, granted to a few individuals, some companies may treat office work as the perk, granted to a few individuals.

What's more, the job market may be fractured between companies that want remote workers and companies that want in-office workers. This may lead to additional details in job postings ("remote only", "remote but in the same time zone", "in-office only", "in-office at least three days per week"). The one standard arrangement of working in an office five days a week may no longer apply.

In the long run, I think the "remote as norm" strategy will dominate. The advantages of "work in the office" seem to be informal conversations and better communication through body language, neither of which can be expressed in dollars. The advantages of remote work can, in contrast, be calculated in dollars (the savings from not leasing an office building with its desks and chairs, minus the cost of securing communications for remote workers).

Looking at past performance, hard factors (dollars) win over soft factors (non-dollars). We can look to past labor movements and labor laws. Child labor was prevalent -- until outlawed. Workdays were ten hours, six days a week, until the forty-hour work week was enforced (or at least strongly encouraged) by labor law. Discrimination by age, gender, and race was the practice, until outlawed.

Before anti-discrimination laws, one may have argued that it made sense to hire the person with the best qualifications regardless of race or gender, but companies somehow hired mostly white males for the better jobs. Practices that just "make sense" weren't -- and aren't -- necessarily adopted.

Work in the office may "make sense" for some things, but the economics are clear. I expect companies to take advantage of the economics, and switch workers to remote work. I also expect that remote work will be assigned by a person's position in the corporate hierarchy, with executives and managers working in offices and "worker bee" employees working from home.

Some companies may even offer work in the office as a perk for certain positions.

Thursday, July 2, 2020

Rough seas ahead for Intel

Apple has announced that it is switching processors from Intel to ARM. (For the MacBook; Apple has used ARM in the iPhone and iPad for years.) This announcement indicates problems for Intel.

But first, some thoughts on "market leaders".

We like to think that markets have leaders, and those leaders set the direction and pace for the market, much as a squad leader in the military sets the direction and pace for the squad. Let's call this the "true leader".

There is another type of leader, one that, instead of setting the direction and pace, looks at where the crowd is headed, runs out in front, and then claims leadership. Let's call this the "opportunistic leader".

So is Intel a true leader or an opportunistic leader? I think it is the latter.

One could argue that Intel lost its ability to lead the market in the mid-1980s, with the specific culprit being the IBM PC. Prior to the IBM PC, the market for personal computers was fractured, with different manufacturers using different processors. Apple and Commodore used the 6502, Tandy used the Zilog Z-80 in the TRS-80, and lots of others used Intel's 8080. Intel had the 8088 and 8086 processors, but very few manufacturers used them.

In the 1980s, Intel had a plan for the future of microprocessors -- the iAPX line. It was a plan that built on the 16-bit 8086 but expanded to new architectures as the processors became more powerful. There were several specifications for processors, culminating in the iAPX 432, a 32-bit processor. (At the time, this was impressive.) The more powerful processors were not compatible with the 8-bit 8080, or (notably) with the 16-bit 8086. It was a nice plan.

The IBM PC and the market for the IBM PC invalidated that plan. People wanted the IBM PC, which meant the Intel 8088. Later, it meant the IBM PC AT with the Intel 80286, a continuation of the 8086 line rather than a step toward the iAPX 432.

This shows that Intel was not leading the market in the "true leader" sense.

Want another example? Let's look at Itanium, Intel's attempt at a 64-bit architecture. (Also known as "IA-64".) It was a processor that was not compatible with existing 32-bit software. Worse, it had a design that pushed work to the compiler, and some of that work was not determinable at compile time. AMD countered with x86_64, a design compatible with existing 32-bit code, and customers wanted that instead of Intel's offering. They wanted it so much that Intel eventually adopted the x86_64 design and abandoned IA-64.

I think these two examples show that Intel is more of an opportunistic leader than a true leader. Intel can design processors, and can design and manufacture the chips and build the support circuitry, but its business strategy is to run out in front of the market.

Which brings us to Apple and ARM processors.

Apple has announced that it will switch from Intel processors to ARM processors for its MacBook laptops.

Is Apple a true market leader? Or are they opportunistic, running out in front of the crowd?

It doesn't matter.

It may be that Apple's decision will "move the market". It may be that Apple's "spidey sense" has detected a desire to shift to ARM processors, and Apple is simply first in the market to make the change. Whichever the reason, Apple is doing it.

I think that not only will Apple shift from Intel to ARM, but the rest of the market will, too. For two reasons: cost and efficiency.

People will want computers with ARM processors because they (the computers) will have a lower price. 

People will want computers with ARM processors because they (the computers) will be more energy-efficient. (Which means a longer battery life.)

People who buy computers generally don't care about the processor. They care about running applications, and ARM can deliver that.

Windows is ready to move to ARM. Linux is ready to move to ARM. Apple is not only ready to move to ARM but is in fact moving to ARM.

What will happen to Intel?

Intel has a definite problem. They have lost the low-end market for processors. They are losing the mid-range market for processors.

One might think that Intel could find a haven in high-end processors, running servers and other high-powered computers. And there may be some space for them, but there is a problem: IBM.

IBM has staked out the top end of the market, with System/z processors (the successor to IBM's System/360, System/370, and System/390 processors) running the big servers that host virtual machines. IBM has a solid business there, one that will not be easily displaced by Intel's x86 processors.

Intel has a market that is being eaten by ARM at the bottom end and limited by IBM at the top end. Not an enviable position.

And certainly not a market leader, in any sense.