Saturday, April 21, 2018

Why the ACM is stuck in academe

The Association for Computing Machinery (ACM) is a professional organization. Its main web page claims it is "Advancing Computing as a Science & Profession". In 2000, it recognized that it was focused exclusively on the academic world and that it had to expand. It has struggled with that expansion for the past two decades.

I recently found an example of its failure.

The flagship publication, "Communications of the ACM", is available on paper or on-line. (So far, so good.) It is available to all comers, with only some articles locked behind a paywall. (Also good.)

But the presentation is bland, almost stifling.

The Communications web site follows a standard "C-clamp" layout, with content in the center and links and administrative items wrapped around it on the top, left, and bottom. An issue's table of contents has titles (links) and descriptions for the individual articles of the magazine. This is a reasonable arrangement.

Individual articles are presented with a header and footer but without the left-side links; they do not use the C-clamp layout. (Also good.)

The fonts and colors are appealing, and they conform to accessibility standards.

But the problem that shows how the ACM fails to "get it" is the comments. Its articles still accept comments (which is good), but very few people comment -- so few that many articles have no comments at all. How does the ACM present an article with no comments? How does it convey this to the reader? With a single, mechanical phrase under the article text:

No entries found

That's it. Simply the text "no entries found". It doesn't even have a header describing the section as a comments section. (There is a horizontal rule between the article and this phrase, so the reader has some inkling that "no entries found" is distinct from the article. But nothing indicates that the phrase refers to comments.)

Immediately under the title at the top of the page there is a link to comments (labelled "Comments"), which is a simple intra-page link to the empty, unlabelled comments section.

I find the phrase "no entries found" somewhat embarrassing. In the year 2018, we have the technology to provide text such as "no comments found" or "no comments" or perhaps "be the first to comment on this article". Yet the ACM, the self-proclaimed organization that "delivers resources that advance computing as a science and a profession", cannot bring itself to use any of those phrases. Instead, it allows the underlying CMS driving its web site to bleed through to the user.
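The fix is trivial. As a rough sketch (the function name and markup here are my own invention, not the ACM's actual templates), whatever renders the comment section needs only a small special case for the empty state:

    # A minimal, hypothetical comment-section renderer; the names and
    # markup are invented for illustration, not taken from the ACM's CMS.
    def render_comments(comments):
        parts = ["<h2>Comments</h2>"]
        if not comments:
            # A friendlier empty state than "no entries found".
            parts.append("<p>Be the first to comment on this article.</p>")
        else:
            for c in comments:
                parts.append("<p><b>%s</b>: %s</p>" % (c["author"], c["text"]))
        return "\n".join(parts)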

A darker thought is that the ACM cares little for comments. It knows that it has to have them, to satisfy some need for "user engagement", but it doesn't really want them. That philosophy is consistent with the academic mindset of "publish and cite", in which citations to earlier publications are valued, but comments from random readers are not.

Yet the rest of the world (that is, people outside of academe) cares little for citations and references. It cares about opinions and information (and the ad profits they bring). Comments are an ongoing problem for web sites; few are informative, many are insulting, and quite a few sites have abandoned comments altogether.

ACM hasn't disabled its comments, but it hasn't encouraged them either. It sits in the middle.

This is why the ACM struggles with its outreach to the non-academic world.

Thursday, April 19, 2018

Why no language to replace SQL?

The history of programming is littered with languages. Some endure for ages (COBOL, C, Java) and some live briefly (Visual J++). We often develop new languages to replace existing ones (Python, for example, has largely displaced Perl).

Yet one language has endured and has seen no replacements: SQL.

SQL, invented in the 1970s and popularized in the 1980s, has lived a good life with no apparent challengers.

It is an anomaly. Every language I can think of has a "challenger" language. FORTRAN was challenged by BASIC. BASIC was challenged by Pascal. C++ was challenged by Java; Java was challenged by C#. Unix shell programming was challenged by AWK, which in turn was challenged by Perl, which in turn has been challenged by Python.

Yet there have been no (serious) challengers to SQL. Why not?

I can think of several reasons:
  • Everyone loves SQL and no one wants to change it.
  • Programmers think of SQL as a protocol (specialized for databases) and not a programming language. Therefore, they don't invent a new language to replace it.
  • Programmers want to work on other things.
  • The task is bigger than a programming language. Replacing SQL means designing the language, creating an interpreter (or compiler?), building command-line tools (these are programmers, after all), writing bindings to other languages (Python, Ruby, and Perl at minimum), and providing data access routines -- all while matching the features of SQL, including triggers, access controls, transactions, and audit logs. (The sketch after this list gives a sense of how deeply SQL is embedded through such bindings.)
  • SQL gets a lot of things right, and works.
I'm betting on the last. SQL, for all of its warts, is effective, efficient, and correct.
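To see how deeply SQL is woven into other languages, here is a small sketch using Python's built-in sqlite3 binding. A replacement language would need an equivalent of every piece of this: the query language itself, the binding, and the transaction behavior.

    # SQL reached through a language binding (Python's built-in sqlite3
    # module). A challenger to SQL would need to replace all of this.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    conn.execute("INSERT INTO accounts (balance) VALUES (?)", (100.0,))
    conn.execute("INSERT INTO accounts (balance) VALUES (?)", (50.0,))

    # Transfer funds inside a transaction: both updates commit, or neither does.
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 25 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 25 WHERE id = 2")

    for row in conn.execute("SELECT id, balance FROM accounts ORDER BY id"):
        print(row)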

But perhaps there is a challenger to SQL: NoSQL.

In one sense, NoSQL is a replacement for SQL. But it is a replacement of more than the language -- it is a replacement of the notion of data structure. NoSQL "databases" store documents and photographs and other things, but they are rarely used to process transactions. NoSQL databases don't replace SQL databases; they complement them. (Some companies move existing data from SQL databases to NoSQL databases, but this is data that fits poorly in the relational structure. They move some, but not all, of their data out of the SQL database. These companies are fixing a problem, not replacing the SQL language.)

NoSQL is a complement of SQL, not a replacement (and therefore not a true challenger). SQL handles part of our data storage and NoSQL handles a different part.
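A toy illustration of that division of labor (the table and document below are invented for the example): uniform, transactional records fit naturally into a relational table, while an irregular, nested document does not.

    # Invented example: data that fits the relational model well (orders)
    # next to data that fits it poorly (a free-form user profile).
    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Homogeneous records: every order has the same columns.
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
    conn.execute("INSERT INTO orders (customer, total) VALUES (?, ?)", ("Alice", 19.95))

    # An irregular document: nested, optional fields, no fixed schema.
    # A document store keeps it as-is; forcing it into tables would be awkward.
    profile = {
        "user": "Alice",
        "photos": ["IMG_0001.jpg", "IMG_0002.jpg"],
        "preferences": {"theme": "dark"},
    }
    print(json.dumps(profile, indent=2))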

It seems that SQL will be with us for some time. It is tied to the notion of relational organization, which is a useful mechanism for storing and processing homogeneous data.

Wednesday, April 11, 2018

Big enough is big enough

Visual Studio has a macro capability, but you might never have used it. You might not even know that it exists.

You see, you cannot use it as Visual Studio comes "out of the box". The feature is disabled. You have to take action before you can use it.

First, there is a setting inside of Visual Studio to enable macros.

Second, there is a setting inside of Windows to allow Visual Studio macros. Only system administrators can enable it.

Yes, you read that right. There are two settings to enable macros in Visual Studio, and both must be enabled to run macros.

Why? I'm not sure, but my guess is that the Visual Studio setting was there all along, allowing macros if users wanted them. The second setting (inside Windows) was added later, as a security feature.

The second setting was needed because the macro language inside Visual Studio is powerful. It can call Windows API functions, instantiate COM objects, and talk to .NET classes -- all in addition to the "insert some text" and "move the insertion point" operations we expect of a text editor macro.

Visual Studio's macro language is the equivalent of an industrial-strength cleaning solvent: So powerful that it can be used only with great care. And one is always at risk of a malevolent macro, sent from a co-worker or stranger.

But macros don't have to be this way.

The Notepad++ program, a text editor for Windows (not an IDE), also has macro capabilities. Its macro capability is much simpler than Visual Studio's: it records keystrokes and plays them back. It can do anything you, the user, can do in the program, and no more.

Which means, of course, that Notepad++'s macro capabilities are safe. They can do only the "normal" operations of a text editor.

It also means that a malevolent macro is essentially impossible to create -- or to send or receive. (I suppose the most malicious macro would be a "select all, delete, save-file" macro. It would be a nuisance but little else.)

The lesson? Macros that are "powerful enough" are, well, powerful enough. Macros that are "powerful enough to do anything" are, um, powerful enough to do anything, including things that are dangerous.

Notepad++ has macros that are powerful enough to do meaningful work. Visual Studio has macros that can do all sorts of things, much more than Notepad++, and apparently so powerful that they must be locked away from the "normal" user.

So Notepad++, with its relatively small macro capabilities, is usable, and Visual Studio, with its impressive and all-powerful capabilities (okay, that's a bit strong, but you get the idea), is *not* usable. Visual Studio's macros are too powerful for the average user, so you can't use them.
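The difference can be sketched in a few lines of hypothetical code (this is an illustration of the two designs, not the actual implementation of either product). A keystroke-recording engine replays an allow-listed set of editor commands; a full-language engine hands the macro an interpreter -- and with it, the keys to the machine.

    # A hypothetical sketch of two macro designs; not the real code of
    # Notepad++ or Visual Studio.

    # Notepad++-style: a macro is a recorded list of editor commands,
    # restricted to an allow-list of safe operations.
    SAFE_COMMANDS = {"insert_text", "move_cursor", "delete_selection", "find"}

    def run_recorded_macro(editor, steps):
        for command, args in steps:
            if command not in SAFE_COMMANDS:
                raise ValueError("unknown editor command: " + command)
            getattr(editor, command)(*args)  # can do only what the editor can do

    # Visual Studio-style: a macro is a program handed to an interpreter.
    def run_script_macro(source):
        exec(source)  # can do anything the user can do -- including damage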

Something to think about when designing your next product.

Wednesday, April 4, 2018

Apple to drop Intel chips (or not)

The romance between Apple and Intel has come to an end.

In 2005, Apple announced that it was switching to Intel processors for its desktop and laptop computers. Previously it had used PowerPC chips, and the laptops were called "PowerBooks". The first Intel-based laptops were called "MacBooks".

Now, Apple has announced plans to design its own processors. I'm certain that the folks over at Intel are less than happy.

Looking forward, I think a number of people will be unhappy with this change, from open source advocates to developers to even Apple itself.

Open source advocates may find that the new Apple-processor MacBooks are unable to run operating systems other than Apple's, which means that Linux will be locked out of the (new) Apple hardware. While only a minuscule number of people actually replace macOS with Linux (disclosure: I'm one), those who do may be rather vocal about the change.

Apple MacBooks are popular with developers. (Exactly why this is the case, I am not sure. I dislike the MacBook's keyboard and display, and prefer other equipment for my work. But maybe I have preferences different from most developers.)

Getting back to developers: They like Apple MacBooks. Look inside any start-up or small company, and MacBooks dominate the office space. I'm sure that part of this popularity comes from Apple building macOS on a BSD-derived Unix foundation (Darwin), which lets MacBook users run much of the open-source software they would otherwise run on Linux.

When Apple switches from Intel to its own (probably proprietary) processors, will that software still be available?

The third group affected by this change will be Apple itself. It may find that the development of processors is harder than it expects, with delays and trade-offs necessary for performance, power efficiency, security, and interfaces to other system components. Right now, Apple outsources those headaches to Intel. Apple may not like the decisions that Intel makes (after all, Intel serves other customers and must accommodate their needs as well as Apple's), and it may feel that control over the design will reduce those headaches.

In-sourcing the design of processors may reduce headaches... or it may simply move them. If Apple has been dissatisfied with Intel's delivery schedule for new chips, the new arrangement may simply mean that Apple management will be dissatisfied with its internal division's delivery schedule for new chips. Owning the design process may give Apple more control over the process, but not total control over it.

The move from standard, well-known processors to proprietary and possibly not well-understood processors moves Apple away from the general market and into its own space. Apple desktops and laptops may become proprietary and secret, with Apple processors and Apple systems-on-a-chip and Apple operating systems and Apple drivers and Apple software... and only Apple able to upgrade, repair, or modify them.

That's a bit of a long shot, and I don't know that it will happen. Apple management may find the idea appealing, hoping for increased revenue. But it is a move towards isolationism, away from the "free trade" market that has made PCs popular and powerful. It's also a move back to the market before the IBM PC, when small computers were not commodities but very different from each other. I'm not sure that it will help Apple in the long run.