Interstice of Seconds
To control a physical system, a person has to enact a change for the system to follow. For example, to control the direction of a car, the driver has to turn the wheel.
There is always a “lag time” between enacting a change (turning the wheel, the input) and the change itself (the car turning, the output); the input does not immediately result in an output. There is still an interstice of seconds between turning the wheel and the car beginning to turn. And it is in these split seconds that variability creeps into the system. In the example of a car, this nuance might seem irrelevant, as we are talking about fractions of a second. However, in more complicated systems, like adding chemicals to an ongoing chemical reaction, the lag time might be longer, making it harder to connect input with output.
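The effect of lag can be sketched in a few lines of code. The `simulate` function below is hypothetical, a minimal model I am assuming for illustration: the output at each step follows the input from `delay` steps earlier, with optional noise standing in for the variability that creeps in during the wait.

```python
import random

def simulate(steps, delay, noise=0.0, seed=0):
    """Toy model of a lagged system: the output at time t follows
    the input from `delay` steps ago, plus optional noise that
    stands in for variability accumulated during the lag."""
    random.seed(seed)
    inputs = [1.0] * steps          # the driver holds the wheel turned
    outputs = []
    for t in range(steps):
        # output reflects the input from `delay` steps in the past
        past_input = inputs[t - delay] if t >= delay else 0.0
        outputs.append(past_input + random.gauss(0.0, noise))
    return outputs

# With no lag, the output tracks the input immediately.
print(simulate(5, delay=0))   # [1.0, 1.0, 1.0, 1.0, 1.0]
# With a 3-step lag, the first few outputs show no response at all.
print(simulate(5, delay=3))   # [0.0, 0.0, 0.0, 1.0, 1.0]
```

With a nonzero `noise`, the longer the delay, the harder it becomes to tell which past input produced which output, which is exactly the difficulty with the chemical-reaction example.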
In biology we can find several examples where lag time is substantial. For instance, when we take medicine (input), it might take hours or days for us to feel better (output). Systems with longer lag times are more complex and harder to understand and control.
On the other hand, systems with short—almost negligible—lag times are easier to work with. The prime example here is the computer, where code (input) almost immediately results in an output. This is generally the case for electronics, where electric signals travel in fractions of a second.
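It is easy to see just how small software's lag is by timing it. A minimal sketch: run a computation and measure the gap between input (running the code) and output (getting the result).

```python
import time

# Time a simple computation: from input (running the code)
# to output (getting the result).
start = time.perf_counter()
result = sum(i * i for i in range(100_000))
elapsed = time.perf_counter() - start

# On typical hardware this lag is on the order of milliseconds,
# negligible next to the hours or days of a biological system.
print(f"result = {result}, lag = {elapsed * 1000:.2f} ms")
```

The exact number depends on the machine, but the point stands regardless: the interstice between input and output in software is vanishingly small.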
We need biology, chemistry, and complex physical systems. We have learned how to work with these systems to build today’s society. If you feed the cow, eventually you will be able to get meat. But the reason behind why “software is eating the world” might just be that it has smaller lag times and is easier to control, which means it is easier to build with.
I am not claiming software is easy, but rather that when dealing with software, one can overlook the dynamics of a physical system that lead to lag times when trying to control such a system. After all, computers are physical systems. They use energy and generate heat. But how often do we worry about the rate at which our laptops use energy when running code? Usually never, unless it is a very niche application. We just plug in the laptop and keep running the code until we get our output. That is not the case for other physical systems, where you have to wait around a bit (pun not intended) until you get your output. You have to wait for the cow to grow to get meat after feeding it. You have to wait for the car to respond after turning the wheel. And it is in that waiting time, that interstice, that randomness is added as a result of your system’s dynamics. This complicates whatever you are trying to do in the physical world.
To reduce the interstice of seconds is to make a system easier to work with. Unsurprisingly, this usually involves adding electronics to whatever we are trying to control. This is why you control your hotel room temperature with a small digital screen on the wall or why you turn on the lights in your house with a switch controlling the flow of electricity.
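The hotel thermostat is a nice example of this pattern: a slow physical system (room temperature) wrapped in fast electronics that close the loop for you. A minimal sketch, assuming a simple bang-bang controller with a hysteresis band (the names and thresholds here are illustrative, not any particular device's logic):

```python
def thermostat_step(current_temp, setpoint, heater_on, band=0.5):
    """One step of a bang-bang thermostat: switch the heater based on
    the temperature reading, with a hysteresis band to avoid rapid
    on/off cycling around the setpoint."""
    if current_temp < setpoint - band:
        return True        # too cold: heater on
    if current_temp > setpoint + band:
        return False       # too warm: heater off
    return heater_on       # inside the band: keep the current state

print(thermostat_step(18.0, 21.0, heater_on=False))  # True
print(thermostat_step(22.0, 21.0, heater_on=True))   # False
```

The room still warms up slowly; the electronics do not remove the physical lag. What they do is sense and decide in fractions of a second, so the human only has to state the goal instead of managing the interstice themselves.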
I wonder what other systems still contain generous interstices of seconds. I am very curious about what we still have to optimize in this arena; systems we still have to “electrify” or “computerize,” if you will.