
Technology, Constraint, and Control

A lack of physical constraint grants digital technologies incredible flexibility. It also severs crucial informational pathways, without us noticing they're gone.
[Image: "Rusty Doorknob" by Matthew Fauver, CC BY 2.0. A close-up of a brown doorknob on a rusty door.]

Technologies are built with a model of the world in mind. A hammer presumes a nail, a screwdriver a screw, and so on. But there is, necessarily, a gap between the model and the reality, a gap between how a technology is meant to be used and how it actually functions in the world.

Are new kinds of technologies widening that gap?

In her book Engineering a Safer World, Nancy Leveson describes how mechanical systems are designed to give direct physical feedback, feedback which renders mismatches between model and reality immediately visible:

When controls were primarily mechanical and were operated by people located close to the operating process, proximity allowed sensory perception of the status of the process via direct physical feedback such as vibration, sound, and temperature.  Displays were directly linked to the process and were essentially a physical extension of it. For example, the flicker of a gauge needle in the cab of a train indicated that (1) the engine valves were opening and closing in response to slight pressure fluctuations, (2) the gauge was connected to the engine, (3) the pointing indicator was free, and so on.

According to Leveson, newer technologies allow a greater distance between the operator and the process they're controlling. This physical distance can easily turn into an informational distance, as feedback is lost:

The introduction of electromechanical controls allowed operators to control processes from a greater distance (both physical and conceptual) than possible with pure mechanically linked controls. That distance, however, meant that operators lost a lot of direct information about the process [...] Accidents started to occur due to incorrect feedback. For example, major accidents (including Three Mile Island) have involved the operators commanding a valve to open and receiving feedback that the valve had opened, when in reality it had not. In this case and others, the valves were wired to provide feedback indicating that power had been applied to the valve, but not that the valve had actually opened.
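The failure mode Leveson describes can be stated very plainly in code: the indicator was driven by the command, not by the plant. Here is a toy sketch, with invented names and no real control logic, of the difference between the two kinds of feedback:

```python
# Toy sketch of Leveson's valve example; the names and logic are invented
# for illustration, not taken from any real control system.

class Valve:
    def __init__(self):
        self.power_applied = False   # what the control circuit did
        self.actually_open = False   # what the physical valve did

    def command_open(self):
        self.power_applied = True
        # In a healthy valve the mechanism follows the command.
        # In a stuck valve it doesn't, and nothing here would tell us.

def indicator_from_command(valve):
    # Wiring like this reports that power was applied to the valve...
    return valve.power_applied

def indicator_from_position_sensor(valve):
    # ...while wiring like this reports what the valve actually did.
    return valve.actually_open

stuck_valve = Valve()
stuck_valve.command_open()
print(indicator_from_command(stuck_valve))          # True: "the valve opened"
print(indicator_from_position_sensor(stuck_valve))  # False: it never moved
```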

Leveson attributes the increase in errors to the loosening of constraints:

Electromechanical controls relaxed constraints on the system design allowing greater functionality. At the same time, they created new possibilities for designer and operator error that had not existed or were much less likely in mechanically controlled systems. The later introduction of computer and digital controls afforded additional advantages and removed even more constraints on the control system design—and introduced more possibility for error.

Increased flexibility allows new technologies to accomplish an astonishing array of things. But the lack of constraint also severs crucial informational pathways—often without us even noticing they're gone.

In other words: with great power comes great risk.

Or: the price of freedom is an increased likelihood of fucking things up.

iFire

Fire is the earliest technology. We've come a long way from sitting around an open fire—we've invented torches and lanterns, ovens and steam engines. Why not take things further? Why not create a cell phone app to tend your fire for you?

If I'm tending a fire, I can see, feel, hear, and even smell when it's getting low. The flames dim, the air on my skin cools, the crackling sounds of the log grow quiet, and the scent of fire dissipates.

If I'm using an app on my phone to keep a fire going while I'm out shopping, all that information needs to be communicated to me somehow. We need sensors near the fireplace, their readings digitized and sent to me through the communications network.
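To make that concrete, here is a deliberately simplistic sketch of what the app would actually "know". The names are hypothetical; the point is that every sensory channel I had by the fireside collapses into whatever the last sensor report said:

```python
import time
from dataclasses import dataclass

@dataclass
class SensorReport:
    temperature_c: float      # thermocouple near the firebox
    flame_detected: bool      # optical flame sensor
    reported_at: float        # unix timestamp of the reading

def check_fire(report: SensorReport, max_age_s: float = 60.0) -> str:
    age = time.time() - report.reported_at
    if age > max_age_s:
        # A dead sensor, a WiFi outage, and a perfectly healthy fire all look
        # the same from here: no fresh data. The feedback pathway is gone.
        return "unknown"
    if not report.flame_detected or report.temperature_c < 150:
        return "fire is low"
    return "fire is fine"
```

Sight, sound, heat, and smell have been replaced by two numbers and a timestamp, and the app has to guess what their absence means.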

I can guess your concerns: A fire-tending app is a terrible idea! Who cares if the fire goes out? If the app has bugs or there's a problem with the WiFi, your whole house could go up in flames!

All I can say is, where's your spirit of innovation? What happened to "move fast and burn things"?

More seriously, though...many technologies are as recklessly designed as this fire-tending app, and for the same reason: because they have cut themselves off from feedback.

Computers Are General Purpose Machines

In her keynote Once Again, The Doorknob, Olia Lialina critiques the design philosophy of Don Norman. Norman rose to prominence in the 1980s as the founder of "user-centered design" and worked as a designer at Apple during the 1990s. He argued that a computer program should be like a doorknob, the details of its interface so intuitive as to be invisible:

“A door has an interface – the doorknob and other hardware – but we should not have to think of ourselves using the interface to the door: we simply think about ourselves as going through the door or closing or opening the door.”

Norman, “Why Interfaces Don’t Work,” p. 218

But, Lialina goes on to say, a computer is not a doorknob.  Immediately after making this comparison, Norman admits, “The computer really is special: it is not just another mechanical device.”

Lialina:

No one ever wants to refer to this moment of weakness; already in the next phrase Norman says that the metaphor applies anyway and the computer’s purpose is to simplify lives.

But this “not just another mechanical device” is the most important thing I like to make students aware of: the complexity and beauty of general purpose computers.

"General purpose" machines can do almost anything. That what the name 'general purpose' means. While they do face some constraints from physical reality (the speed of light, the limitations on dissipating heat, the costs of their component parts, etc) most constraints on computer programs are imposed by human designers.

To treat a computer like a doorknob—as if its current, specific purpose were its only possible purpose—is to ignore the vast gap between model and reality.

“A knob can open a door because it is connected to a latch. However in a digital world, an object does what it does because a developer imbued it with the power to do something […] On a computer screen though, we can see a raised three dimensional rectangle that clearly wants to be pushed like a button, but this doesn’t necessarily mean that it should be pushed. It could literally do almost anything.”

Alan Cooper, Robert Reimann, and David Cronin, quoted by Lialina
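In code, Cooper's point is almost embarrassingly literal. Here is a small sketch using Python's built-in tkinter, purely for illustration: the "button" is just pixels, and what it does is whatever a developer attached to it.

```python
import tkinter as tk

root = tk.Tk()

def handler():
    # Nothing about the button's appearance constrains this function.
    # It could open a smart lock, delete your files, or phone home.
    print("the raised rectangle did... something")

tk.Button(root, text="Open door", command=handler).pack()
root.mainloop()
```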

A digital doorknob is not constrained by reality, by wooden frames or metal locks and keys. It is constrained by its designers—by people.

To make these constraints invisible, to make them seem as natural and inevitable as a doorknob, is to hide the politics of design. This not only disempowers the user; it conceals from the user that they've been disempowered.

How can a user give feedback on a design that they assume is natural and inevitable? How can a user contest the decisions of product managers and company owners if they're not even aware that decisions are being made?

Capitalism Makes This Problem Worse, Not Better

Even in the most ideal society, the increased flexibility of universal computing machines, and the vast physical and cognitive distances they allow for, would increase the likelihood of errors.

And even in the most ideal society, there would remain a tension between "empowering users by making the complexity of technology available to them" and "disempowering users by overwhelming them with the complexity of technology available to them". (More on this in a later post.)

But alas, we don't live in an ideal society. Rather than focusing on how to balance usability and complexity, or flexibility and resilience to error, most software designers are pushed towards profit maximization. This often requires the purposeful discarding of feedback and the intentional disempowering of users. I will cover what this means, practically, in my next post.

For now, I will leave you with this quote from Cory Doctorow's talk, The Coming War on General Computation, about the dystopian lengths to which this approach is already being taken:

[T]oday we have marketing departments who say things like “we don't need computers, we need... appliances. Make me a computer that doesn't run every program, just a program that does this specialized task, like streaming audio, or routing packets, or playing Xbox games, and make sure it doesn't run programs that I haven't authorized that might undermine our profits”. And on the surface, this seems like a reasonable idea – just a program that does one specialized task [...]

[B]ut that's not what we do when we turn a computer into an appliance. We're not making a computer that runs only the “appliance” app; we're making a computer that can run every program, but which uses some combination of rootkits, spyware, and code-signing to prevent the user from knowing which processes are running, from installing her own software, and from terminating processes that she doesn't want. In other words, an appliance is not a stripped-down computer – it is a fully functional computer with spyware on it out of the box.

A doorknob is just a doorknob. But a computer that's been locked down and loaded with every kind of spyware in a desperate attempt to force its users to treat it as only a doorknob—that's as political an object as you can find in the world.

Edited to add (June 4, 2023):

I recently came across a video by Technology Connections about electric car braking systems that exemplifies Leveson's point about digital systems introducing distance and error to previously mechanical systems. From the video:

"You didn’t used to have to think about what makes a car’s brake lights come on. They came on whenever you depressed the brake pedal. And since the brake pedal was the only thing that could slow the car enough to warrant the brake lights coming on, a simple switch on that pedal was all we needed.

But this is swiftly changing. Modern cars often slow down all on their own – maybe you’re using radar adaptive cruise control, or maybe the car has automatic emergency braking. Or perhaps you have an electric car with a one-pedal driving mode. And with these new features, we now have no choice but to control the brake lights with…software.

As the old saying goes, one man’s software is another man’s nightmare, and somebody at Hyundai (and quite possibly Kia) really dropped the ball. The way the Hyundai Ioniq 5 handles its brake lights, or at least the US-market 2022 Hyundai Ioniq 5 with software as-delivered, is so astonishingly bad that I think it warrants a recall."
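What "controlling the brake lights with software" means is, in outline, something like the sketch below. This is an illustrative rule with a made-up threshold, not any manufacturer's actual logic; real regulations specify precise deceleration values.

```python
def brake_lights_on(pedal_pressed: bool, deceleration_ms2: float) -> bool:
    # Old world: a switch on the pedal was the whole decision.
    if pedal_pressed:
        return True
    # New world: adaptive cruise, automatic emergency braking, and one-pedal
    # regenerative braking can all slow the car with no pedal input, so the
    # decision becomes a threshold someone had to choose.
    SIGNIFICANT_DECELERATION = 1.3  # m/s^2, illustrative value only
    return deceleration_ms2 >= SIGNIFICANT_DECELERATION
```

Choose the threshold badly, or read the wrong signal, and the driver behind you gets no warning at all: exactly the kind of decision a pedal switch used to make for free.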