Monday, August 30, 2021

Apple employees and iCloud

Recent stories about Apple employees have exposed some of the conditions that Apple requires for employment. One of those conditions is to link one's personal iCloud account to the Apple-assigned employee iCloud account.

Why do this? It seems an odd request, to link one's personal iCloud account to the corporate iCloud account. Apple has not explained the reason for this policy (I did not ask Apple for comment), and Apple employees have not stated a reason either.

I have an idea. It's not that Apple is being evil. Instead, Apple is trying to keep secrets, secret.

Apple, more than most companies, is concerned with corporate secrets. It carefully guards its designs for new products. It limits communication between employees to only those groups that are part of the work assignment.

Apple also requires employees to agree to the "Apple can search you or your equipment whenever on Apple property" clause in its hiring contracts.

Apple does not want secrets to leak out, and it takes steps to keep secrets, well, secret.

I expect that Apple requires employees to use the corporate e-mail system and not personal e-mail accounts such as Yahoo, Gmail, or Outlook.com. Apple can scan messages in its internal e-mail system and identify leaked information, but it cannot scan external services. It probably blocks those web sites on its internal network.

Apple probably also blocks social media services such as Facebook and Twitter. When you're at work for Apple, you're at work for Apple, and not allowed to use such sites. Information could be shared via those sites, something Apple wants to prevent.

File sharing sites such as Dropbox, Box, and Microsoft OneDrive are other services that could allow for the sharing of information. Apple probably blocks those sites, too.

Apple may go as far as preventing people from attaching USB drives to their work computers.

So Apple has built an electronic fence around employees, to keep corporate data inside and not allow designs for new products to escape. That all makes sense.

The one problem in this scheme is iCloud.

Apple cannot block iCloud. Its employees use it to share data and collaborate on projects. Blocking iCloud would isolate employees and limit the sharing of information to e-mail, which is far less effective for sharing data. iCloud lets multiple people work on the same file, at the same time, and always with the latest version of the file. Forcing people to share data via e-mail would eliminate real-time collaboration and allow for older versions of files -- and the confusion that would result.

So Apple must use iCloud. But iCloud allows people to switch from one iCloud account to another, and employees could switch from the corporate iCloud account to their personal account, send files out (just what Apple does not want), and then switch back to the corporate iCloud account. This is a "hole" in the security "fence".

The answer to this last problem is simple: require employees to link their personal iCloud accounts to their corporate iCloud accounts. I imagine that the personal account, once linked, becomes part of the monitoring process for the corporate iCloud account, and closes the "hole" in the "fence".

This is why (I think) Apple requires employees to link their personal iCloud account to their corporate iCloud account. Linking accounts is a hack to close a hole in Apple's security.


Friday, August 27, 2021

Replacing a language

Do programming languages ever get replaced? Languages are introduced, and some rise and fall in popularity (and some rise after a fall), but very few ever disappear completely.

There are some languages that have disappeared. Two are FLOW-MATIC and NELIAC, languages from the 1950s.

Some languages have fallen to very low levels of usage. These include BASIC, PL/I, PL/M, and DIBOL.

Some languages never die (or at least haven't died yet). The go-to examples are COBOL and FORTRAN. Organizations continue to use them. Fortran is used for some parts of the R implementation, so anyone using R is in a sense using Fortran. We can include SQL in this list of "immortal" programming languages.

Some languages are replaced with new versions: C# version 6 replaced a very similar C# version 5, for example. VB.NET replaced VB6. And while a few people still use the old Visual Basic 6, their numbers are very small.

The replacement of a language isn't so much about the language as it is about popularity. Several groups measure the popularity of languages: TIOBE, Stack Overflow, the Popularity of Programming Languages (PYPL)... They all rank the languages and show pretty tables and charts.

The mechanics of popularity are such that the "winners" (the most popular languages) are shown, and the "losers" (the less popular languages) are omitted from the table, or shown in a small note at the bottom. (Not unlike the social scene in US high schools.)

If we use this method, then any language that enters the "top 10" or "top 40" (or "top whatever") chart is popular, and any language that is not in the list is not popular. A language that never enters the "top" list is never really replaced, because it never "made it".

A language that does enter the "top" list and then falls out of it has been replaced. That language may still be used, perhaps by thousands, yet it is now considered a "loser".

By this measure (using the TIOBE index), languages that have been replaced include: Objective-C, COBOL, Prolog, and Ada. They were popular, once. But now they are not as popular as other languages. (I can almost hear the languages protesting a la Norma Desmond: "We are big! It's the programs that got small!")

Sometimes we can state that one specific language replaced another. VB.NET replaced VB6, because Microsoft engineered and advertised VB.NET to replace VB6. The same holds for Apple and Swift replacing Objective-C. But can we identify a specific language that replaced COBOL? Or a single language that replaced FORTRAN? Did C++ replace C? Is Rust replacing C++?

We can certainly say that, for a specific project, a new language replaces an older one. Perhaps we can say the same for an organization, if it embarks on a project to rewrite its code in a new language. I'm not sure that we can make the assertion for the entire IT industry. IT is too large, with too many companies and too many individuals, to verify such a claim in absolute terms. All we can rely on is popularity.

But popularity is not the measure of a language. It doesn't measure the language's capabilities, or its reliability, or its ability to run on multiple platforms.

We don't care about popularity because of the technical aspects of a language. We care about a language's popularity for our own, practical reasons.

Product managers care about the popularity of a language because of staffing. A popular language will have lots of people who know it, and therefore the manager will have a (relatively) easy time of finding candidates to hire. An obscure language will have few candidates, and they may demand a higher wage.

Individuals care about the popularity of a language because it means that there will be lots of companies hiring for that skill. Few companies will hire for an obscure programming language.

This essay has meandered from replacing a language to popularity of languages to the concerns of hiring managers and candidates. That's probably not a coincidence. Economic activity drives a lot of behavior; I see no reason that programming should be exempt. When thinking about a programming language, think about the economics, because that will contribute a lot to the language's life.

Tuesday, August 24, 2021

After the PC

Is the M1 Mac a PC? That is, is the new MacBook a personal computer?

To answer that question, let's take a brief trip through the history of computing devices.

Mainframes were the first commercially viable electronic computing devices. They were large and consisted of a processing unit and a few peripherals. The CPU was the most important feature.

Minicomputers followed mainframes, and had an important difference: They had terminals as peripherals. The most important feature was not the CPU but the number of terminals.

PCs were different from minicomputers in that they had integrated video and an integrated keyboard. They did not have terminals. In this sense, M1 Macs are PCs.

Yet in another sense, M1 Macs are something different: They cannot be modified.

IBM PCs were designed for modification. The IBM PC had slots for accessory cards. The information for those slots was available. Other manufacturers could design and sell accessory cards. Consumers could open the computer and add hardware. They could pick their operating system (IBM offered DOS, CP/M-86, and UCSD p-System).

The openness of the IBM PC was somewhat unusual, although not unique: the Apple II (Apple ][) had a pop-off lid and expansion slots. Macs, though, were not openable (you needed a special tool), and other computers of the era were typically built in a way that discouraged modifications.

The IBM PC set the standard for personal computers, and that standard was "able to be opened and modified".

The M1 Macs are systems on a chip. The hardware cannot be modified. The CPU, memory, disk (well, let's call it "storage", to use the old mainframe-era term) are all fixed. Nothing can be replaced or upgraded.

In this sense, the M1 Mac is not a PC. (Of course, if it is not a PC, then what do we call it? We need a name. "System-on-a-chip" is too long, "SoC" sounds like "sock" and that won't do, either.)

I suspect that the folks at Apple will be happy to refer to their products with a term other than "PC". Apple fans, too. But I don't have a suitable, generic, term. Apple folks might suggest the term "Mac", as in "I have a Mac in my backpack", but the term "Mac" is not generic. (Unless Apple is willing to let the term be generic. If so, when I carry my SoC Chromebook, I can still say "I have a Mac in my backpack." I doubt Apple would be happy with that.)

Perhaps the closest thing to the new Apple M1 Macs is something so old that it predates the IBM PC: The electronic calculator.

Available in the mid-1970s, electronic calculators were quite popular. Small and yet capable of numeric computations, they were useful to all sorts of people. Like the Apple M1 Mac, they were designed to remain unopened by the user (except perhaps to replace batteries) and they were not modifiable.

So perhaps the Apple M1 MacBooks are descendants of those disco-era calculators.

* * * * *

I am somewhat saddened by the idea that personal computers have evolved into calculators, that PCs are not modifiable. I gained a lot of experience with computers by modifying them: adding memory, changing disk drives, or installing new operating systems.

A parallel change occurred in the automobile industry. In the 1950s, lots of people bought cars and tinkered with them. They replaced tires and shock absorbers, adjusted carburetors, installed superchargers and turbochargers, and replaced exhaust pipes. But over time, automobiles became more complex and more computerized, and now very few people get involved with their cars. (There are some enthusiastic car-hackers, but they are few in number.)

We lost something with that change. We lost the camaraderie of learning together, of car clubs and amateur competitions.

We lost the same thing with the change in PCs, from open, modifiable systems to closed, optimized boxes.

Wednesday, August 18, 2021

Apple's trouble with CSAM is from its approach to new features

Apple's method, its modus operandi, is to design a product or process and present it to its customers. There is no discussion (at least none outside of Apple), there is no gathering of opinion, there is no debate. Apple decides, implements, and sells. Apple dictates, customers follow.

We can see this behavior in Apple's hardware.

Apple removed the floppy disk from the Macintosh. Apple removed the CD drive from the iMac. Apple removed the audio port on the iPhone.

We can also see this behavior in Apple's software.

Apple designs the UIs for iOS and iPadOS and has very definite ideas about how things ought to be.

This method has served Apple well. Apple designs products, and customers buy them.

Yet this method failed with CSAM.

Apparently, some customers don't like being dictated to.

One aspect of the CSAM change is that it is a change to an existing service. If Apple changes the design of the iPhone (removing the audio port, for example), customers still have their old audio-port-equipped iPhone. They can choose to replace that iPhone with a later model -- or not.

If Apple had introduced a new file storage service, and made CSAM scanning part of that service, customers could choose to use that new service -- or not.

But Apple made a change to an existing service (scanning photos in iCloud), and to existing iPhones (since Apple scans the photos on the phone itself). That presents a different calculation to customers. Instead of choosing to use a new service or not, customers must now choose to stop using a service, or continue to use the modified service. Changing from one cloud storage service to another is a larger task. (And it may not even be possible with iPhones and iPads.)

Customers don't have the option of "buying in" to the new service. The changes for CSAM are not something that one can put off for a few months. One cannot look at the experience of other customers and then decide to participate.

No, this change is Apple flexing its muscles, altering the deal that was already established. ("Pray that I do not alter it further," quipped Darth Vader in a movie from a long time ago.)

I think that the negative reaction to Apple's CSAM strategy is not so much the scanning of photos, but the altering of the existing service. I think that people now realize that Apple can, and will, alter services. Without the consent, or even the advice, of the customers. I think that is what is bothering people.

Sunday, August 15, 2021

COBOL and Elixir

Someone has created a project to transpile (their word) COBOL to Elixir. I have some thoughts on this idea. But first, let's look at the example they provide.

A sample COBOL program:

      >>SOURCE FORMAT FREE
IDENTIFICATION DIVISION.
PROGRAM-ID. Test1.
AUTHOR. Mike Binns.
DATE-WRITTEN. July 25th 2021.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 Name     PIC X(4) VALUE "Mike".
PROCEDURE DIVISION.

DISPLAY "Hello " Name

STOP RUN.

This is "hello, world" in COBOL. Note that it is quite a bit longer than equivalent programs in most languages. Also note that while long, it is still readable. Even a person who does not know COBOL can make some sense of it.

Now let's look at the same program, transpiled to Elixir:

defmodule ElixirFromCobol.Test1 do
  @moduledoc """
  author: Mike Binns
  date written: July 25th 2021
  """

  def main do
    try do
      do_main()
    catch
      :stop_run -> :stop_run
    end
  end 

  def do_main do
    # pic: XXXX
    var_Name = "Mike"
    pics = %{"Name" => {:str, "XXXX", 4}}
    IO.puts "Hello " <> var_Name
    throw :stop_run
  end
end

That is ... a lot of code. More than the code for the COBOL version! Some of that is due to the exception handling for "STOP RUN", which in this small example seems excessive. Why wrap the core function inside a main() that exists simply to trap the exception? (There is a reason. More on that later.)

I'm unsure of the reason for this project. If it is a side project made on a whim, and used for the entertainment (or education) of the author, then it makes sense.

But I cannot see this as a serious project, for a couple of reasons.

First, the produced Elixir code is longer, and in my opinion less readable, than the original COBOL code. I may be biased here, as I am somewhat familiar with COBOL and not at all familiar with Elixir, so I can look at COBOL code and say "of course it does that" but when I look at Elixir code I can only guess and think "well, maybe it does that". Elixir seems to follow the syntax for modern scripting languages such as Python and Ruby, with some unusual operators.

Second, the generated Elixir code contains some constructs which are not used. This is, perhaps, an artifact of generated code. Code generators are good, up to a point. They tend to be non-thinking; they read input, apply some rules, and produce output. They don't see the bigger picture. In the example, the transpiler has produced code that contains the variable "pics", which holds information about the COBOL program's PICTURE clauses, but this "pics" variable is never used.

The "pics" variable hints at a larger problem, which is that the transpiled code is not running an equivalent program but is instead interpreting data to achieve the same output. The Elixir program is, in fact, a tuned interpreter for a specific COBOL program. As an interpreter, its performance will be less than that of a compiled program. Where a COBOL compiler can generate code to handle the PICTURE clauses at compile time, the Elixir code must look up the PICTURE clause at runtime, decode it, and then take action.
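
To make that runtime cost concrete, here is a small sketch of what PICTURE handling might look like. This is my own illustration, not code from the transpiler project; the module name and function are invented, and the tuple format simply mirrors the "pics" map in the generated code above.

# Hypothetical sketch of runtime PICTURE handling; not from the
# transpiler project. The {:str, pic, length} tuple format mirrors
# the "pics" map in the generated code above.
defmodule PicRuntime do
  # Fit a value into an alphanumeric PIC X(n) field: pad short
  # values with spaces, truncate long ones.
  def format({:str, _pic, len}, value) do
    value
    |> String.pad_trailing(len)
    |> String.slice(0, len)
  end
end

pics = %{"Name" => {:str, "XXXX", 4}}
IO.puts "Hello " <> PicRuntime.format(pics["Name"], "Mike")

Every DISPLAY of a variable would pay this lookup-and-format cost at runtime; compiled COBOL does that work once, at compile time.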

My final concern is the most significant. The Elixir programming language is not a good match for the COBOL language. Theoretically, any program written in a Turing-complete language can be re-written in a different Turing-complete language. That's a nice theory, but in practice converting from one language to another can be easy or can be difficult. Modern languages like Elixir have functional and structured programming constructs. COBOL predates those constructs and has procedural code and global variables.

We can see the impedance mismatch in the Elixir code to catch the "stop run" exception. A COBOL program may contain "STOP RUN" anywhere in the code. The Elixir transpiler project has to build extra code to duplicate this capability. I'm not sure how the transpiler will handle global variables, but the method will probably be equally tortured. Converting code from a non-structured language to a structured programming language is difficult at best, and results in odd-looking code.
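
To illustrate, here is a sketch of how a "STOP RUN" buried deep in the program has to be simulated with throw and catch. Again, this is my own illustration with invented names; only the throw/catch pattern is taken from the generated code shown earlier.

# Hypothetical sketch: simulating COBOL's "STOP RUN" from a nested
# paragraph. The names are invented; the throw/catch pattern mirrors
# the generated code shown earlier.
defmodule StopRunSketch do
  def main do
    try do
      do_main()
    catch
      :stop_run -> :stop_run
    end
  end

  def do_main do
    first_paragraph()
    IO.puts "never reached"   # first_paragraph throws :stop_run
  end

  def first_paragraph do
    IO.puts "in first paragraph"
    throw :stop_run           # COBOL: STOP RUN, deep in the call tree
  end
end

StopRunSketch.main()

The catch in main() is the only place that can stop the whole program from an arbitrary point, which is presumably why the transpiler wraps every program in that main/do_main pair.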

My point here is not to insult or to shout down the transpiler project. It will probably be an educational experience, teaching the author about Elixir and probably more about COBOL.

My first point is that programs are designed to match the programming language. Programs written in object-oriented languages have object-oriented designs. Programs written in functional languages have functional designs. Programs written in non-structured languages have... non-structured designs. The designs from one type of programming language do not translate readily to a programming language of a different type.

My second point is that we assume that modern languages are better than older languages. We assume that object-oriented languages like C++, C#, and Java are better than (non-OO) structured languages like Pascal and Fortran-77. Some of us assume that functional languages are better than object-oriented languages.

I'm not so sure about those assumptions. I think that object-oriented languages are better at some tasks than merely structured languages, and older structured-only languages are better at other tasks. Object-oriented languages are useful for large systems; they let us organize code into classes and functions, and even larger constructs through inheritance and templates. Dynamic languages like Python and Ruby are good for some tasks but not others.

And I must conclude that even older, non-functional, non-dynamic, non-object-oriented, non-structured programming languages are good for some tasks.

One analogy for programming languages is the carpenter's toolbox: full of various tools for different purposes. COBOL, one of the oldest languages, might be considered the hammer, one of the oldest tools. Hammers cannot do the work of saws, drills, tape measures, or levels, but carpenters still use them, when the task is appropriate for a hammer.

Perhaps we can learn a thing or two from carpenters.

Sunday, August 8, 2021

Apple and the photographs

There has been a lot of discussion about Apple's plan to identify photographs that contain illegal material. Various comments have been made on the "one in one trillion" estimate for false positives. Others have commented on the ethics of such a search.

One idea I have not seen discussed is the reason for Apple to do this. Why would Apple choose to identify these images? Why now?

It doesn't help sell Apple products.

It doesn't help sell Apple services.

It doesn't improve Apple's reputation.

Yet Apple made the effort to identify illegal photographs, which included requirements, design, coding, and testing. It cost Apple money to do this.

If Apple gains no revenue, gains no reputation, has no material gain at all, then why should they do it? Why make the effort and why expend the money?

Perhaps there is a gain, but one that Apple has not revealed to us. What unrevealed reason could Apple have to examine and identify photographs on Apple equipment? (Or more specifically, in Apple iCloud?)

The one thought that comes to mind is an agreement with law enforcement agencies. In such an agreement, Apple scans photographs and reports illegal material to law enforcement agencies. In exchange, Apple gets... what? Something from the government? Payment? Or perhaps Apple avoids getting something from the government, such as a lawsuit or regulatory interference.

I'm speculating here. I have no knowledge of Apple's motivations. Nor do I have any knowledge of such a deal between government and Apple -- or any company. (And keep in mind that there are governments other than the US government that may make such requests.)

But if there is a deal, then perhaps Apple is not the last company to perform such actions. We may see other companies announce similar efforts to identify illegal material. Worse, we may learn that some companies have implemented such programs without informing their customers.

Thursday, August 5, 2021

The roaring success of Windows 365

Microsoft announced Windows 365, its "Windows as a Service" offering that lets one (if one is a business) create and use virtual Windows desktops. And just as quickly, Microsoft announced that it was suspending new accounts, because too many had signed up for the service.

A few thoughts:

First (and obvious) is that Microsoft underestimated the demand for Windows 365. Microsoft hosts the service on its Azure framework, and if the demand for Windows 365 outstrips the available servers, then it is popular indeed.

Second (and perhaps less obvious) is that Microsoft is probably kicking itself for pricing the service as it did. With strong demand, Microsoft apparently "left money on the table" and could have charged higher rates.

Third (and also not so obvious) is that Microsoft's business customers (that is, most businesses) really want to move from their current arrangement for PC hardware to a different one. They either want to move to Windows 365 or they want to try it -- which indicates that they are not happy with their current PC platform. (That current platform might be physical, real-world PCs on employee desks, or it might be virtual PCs accessed by Remote Desktop or Citrix or some other mechanism.)

The desire of customers to try a different solution is, in my mind, a warning for Microsoft. It means that customers are not happy with the current state of Windows and its support -- new versions, upgrades, security patches, and administration. It could mean that customers will entertain other solutions, including Linux and Apple.

A few other thoughts:

With demand from the business community much stronger than expected, Microsoft will probably focus on business customers. In other words, I expect Microsoft to offer no comparable service for individuals or families -- at least not for the next two years. I expect the cost of a consumer product to be higher than the cost of a business product, and the revenue for a consumer product to be lower than the revenue for a business product.

Microsoft may leverage demand for the Windows 365 service to spur Windows 11 sales. They have announced Windows 365 with access from various devices, and the opportunity is to provide additional services for Windows 11 clients. (Better security, better integration between access computer and virtual desktop, and possibly lower costs.)

Finally, there is the outside chance that Microsoft will provide a special edition of Windows 11, one that is stripped down and usable only to access Windows 365. (Much like ChromeOS is suitable only to run Chrome.) Microsoft may even design and sell hardware to support a Windows 11 "C" mode ("C" for "Connectivity"?).

The strong demand for Windows 365 shows that lots of people are interested in it. Microsoft won't ignore that. Be prepared for more announcements for this new service.