The hardware revolution is already here. With Arduinos, BeagleBones, and Raspberry Pis cheap to purchase and easy to set up, it’s no wonder they are so popular among hobbyists. On the other end of the spectrum, smartphones sit as sleek computers, decked out in sensors recording the world. Innovation is everywhere in between, from the bizarre shapes of wearables to crazy uses of quadcopters, hacked together or factory-made.

The allure of these kits, like the software packages they use, is the freedom to mix and match components to fit the project. It should be no surprise that hardware will only grow more popular as it becomes easier for developers to use. But just as software leans on computer science, more sophisticated uses of hardware require more comprehensive theory. The only issue is that hardware deals with reality, and reality is not theoretical.

Hardware is, first of all, what executes software. Let’s call the details of how the hardware executes programs the “instruction interface”. This is the usual dichotomy between hardware and software, with assembly on one side and transistors on the other.

On the software side, many layers of abstraction let developers ignore the gritty details of the instruction interface. The operating system exists so that individual programs don’t need to worry about what other programs are running, and so on. Computer science is focused on these higher levels of abstraction: time complexity doesn’t worry about clock speeds or cache misses. It is a simplified model of reality. And it must be a good model, because the software works! Mostly.
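
To make the gap between model and machine concrete, here is a minimal C sketch: both loops below are O(n²), yet on most machines the column-major one runs several times slower because it misses the cache on nearly every access. The array size and the use of clock() are illustrative choices, nothing canonical.

```c
/* Two loops with identical time complexity but different realities:
 * the row-major loop walks memory sequentially, while the column-major
 * loop jumps 4096 ints per step and misses the cache almost every time. */
#include <stdio.h>
#include <time.h>

#define N 4096

static int grid[N][N];   /* ~64 MB, static so it doesn't blow the stack */

int main(void) {
    long sum = 0;
    clock_t t0;

    t0 = clock();
    for (int i = 0; i < N; i++)        /* row-major: cache-friendly */
        for (int j = 0; j < N; j++)
            sum += grid[i][j];
    printf("row-major:    %.3fs\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (int j = 0; j < N; j++)        /* column-major: cache-hostile */
        for (int i = 0; i < N; i++)
            sum += grid[i][j];
    printf("column-major: %.3fs\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    return (int)(sum & 1);             /* stop the compiler eliding the loops */
}
```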

What about the hardware that is controlled by software? Wait. We forgot about sensors, which are how software knows about reality! Of course, hardware is also how software communicates with the outside world. Let’s call this the “physical interface”. Mice, keyboards, and accelerometers are inputs to this interface, while screens, speakers, and even engines are outputs. The constraints of these channels play an important role in the physical interface.
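
On Linux, even the humble mouse shows how narrow these channels are. The sketch below reads raw packets from /dev/input/mice, a legacy PS/2-style stream where each 3-byte packet carries button flags and signed movement deltas. The device path and packet layout are assumptions about one common setup, and reading the device usually needs elevated permissions.

```c
/* Reading the physical interface directly: each 3-byte packet from
 * /dev/input/mice holds button bits and signed dx/dy deltas.
 * Note what the channel gives us: relative motion, not position. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/input/mice", O_RDONLY);
    if (fd < 0) { perror("open /dev/input/mice"); return 1; }

    unsigned char pkt[3];
    for (int i = 0; i < 50; i++) {                /* sample 50 packets, then quit */
        if (read(fd, pkt, sizeof pkt) != sizeof pkt) break;
        int left = pkt[0] & 0x01;                 /* bit 0: left button state */
        int dx = (signed char)pkt[1];             /* horizontal delta */
        int dy = (signed char)pkt[2];             /* vertical delta */
        printf("dx=%4d dy=%4d left=%d\n", dx, dy, left);
    }

    close(fd);
    return 0;
}
```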

In many instances, a user is providing the input and receiving the output. When we know this is the case, we have the “user interface”, which is how software communicates with the user. User interfaces are difficult to build because they require understanding the user, with some help from psychology and cognitive science. But much of it is obvious, like text that is too small or sounds that are too soft.

Programs deal with more than just users. For example, cruise control uses a computer to control how much gas is fed into the engine based on how fast the wheels are spinning. Instead of understanding the user, it needs to understand the relationship between gas and speed. It needs to understand reality. It’s amazing that a cruise control system works given how little information it has — we still trust it when driving downhill!
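
A toy version of that loop fits in a few lines. The sketch below is a bare proportional controller against a crude one-line car model (throttle adds acceleration, drag bleeds it off); every constant is made up for illustration, and a real system would read wheel-speed sensors instead of simulating them.

```c
/* A toy cruise control: all the controller sees is the speed error,
 * and all it does is push the throttle proportionally harder.
 * The "physics" is a stand-in so the sketch is self-contained. */
#include <stdio.h>

int main(void) {
    const double target = 30.0;   /* desired speed, m/s */
    const double kp     = 0.5;    /* proportional gain, tuned by hand */
    const double dt     = 0.1;    /* control period, seconds */
    double speed        = 20.0;   /* current speed, m/s */

    for (int step = 0; step <= 300; step++) {
        double error    = target - speed;        /* the only input */
        double throttle = kp * error;            /* P-control */
        if (throttle < 0.0) throttle = 0.0;      /* no brakes, just less gas */
        if (throttle > 1.0) throttle = 1.0;

        /* stand-in plant: throttle accelerates, drag decelerates */
        speed += (4.0 * throttle - 0.05 * speed) * dt;

        if (step % 60 == 0)
            printf("t=%5.1fs  speed=%5.2f m/s  throttle=%.2f\n",
                   step * dt, speed, throttle);
    }
    return 0;
}
```

Notice it settles just below the target: a pure proportional controller carries a steady-state error, which is one reason real cruise controls add an integral term.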

Clearly, communication is at the heart of software. It is why we speak of programming “languages” in the first place. Developers are translators who convert ideas into code understood by machines while juggling the constraints of the computer, the stupidity of the user, and the complexity of the physical world. These all exist outside of software’s control. Let’s call the boundary between software and everything else the “reality interface”.

Hardware is the reality interface. It’s easy to forget that when screen resolution and mouse polling are automatically accounted for by a UI toolkit. Or when concurrency and memory management are handled by a virtual machine. Yet developers still need to actively worry about the details when working with physical space.

Physics is the theory we use to model reality. Unfortunately, there is no framework that hides engineering details the same way the operating system hides processor cores. Luckily, libraries go a long way in reducing the tedium. Hardware kits, in the form of shields and connectors, have tamed much of the mechanical variation in hardware projects.
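
GPIO libraries are a good example of that reduction. Underneath a library’s one-line “set pin high” call, something like the sketch below is happening. This uses the legacy Linux sysfs interface, where pins are plain files; pin 17 is an arbitrary choice, and newer kernels prefer the character-device API wrapped by libraries such as libgpiod.

```c
/* What a GPIO library saves you from: blinking an LED through the
 * legacy Linux sysfs interface, one file write at a time. */
#include <stdio.h>
#include <unistd.h>

static void write_str(const char *path, const char *value) {
    FILE *f = fopen(path, "w");
    if (!f) { perror(path); return; }
    fputs(value, f);
    fclose(f);
}

int main(void) {
    write_str("/sys/class/gpio/export", "17");             /* claim pin 17 */
    write_str("/sys/class/gpio/gpio17/direction", "out");  /* set as output */

    for (int i = 0; i < 10; i++) {                         /* blink five times */
        write_str("/sys/class/gpio/gpio17/value", i % 2 ? "1" : "0");
        usleep(500 * 1000);                                /* 500 ms */
    }

    write_str("/sys/class/gpio/unexport", "17");           /* release the pin */
    return 0;
}
```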

Back to cruise control. Aren’t control systems a great abstraction that allows a program to act without knowing all the details? Yes, in some instances. But real behavior is often more chaotic, and most systems require tuning, which in turn demands an understanding, and a simplification, of the underlying physical process. Cruise control isn’t for F1 cars.
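
Rerunning the earlier toy controller makes the point: keep the gain that felt smooth on a sedan, hand it a car with fifteen times the power, and the loop overshoots the target and hunts around it. The two “plants” below are as made up as before; the lesson is only that gains are tuned to a particular physical process.

```c
/* One controller, two cars: the gain that converges smoothly on a
 * sluggish plant oscillates on a powerful one. All constants invented. */
#include <stdio.h>

static double step(double speed, double kp, double power, double dt) {
    double throttle = kp * (30.0 - speed);   /* aim for 30 m/s */
    if (throttle < 0.0) throttle = 0.0;
    if (throttle > 1.0) throttle = 1.0;
    return speed + (power * throttle - 0.05 * speed) * dt;
}

int main(void) {
    double sedan = 20.0, racecar = 20.0;
    for (int i = 0; i <= 200; i++) {
        sedan   = step(sedan,   0.5,  4.0, 0.1);  /* settles near target */
        racecar = step(racecar, 0.5, 60.0, 0.1);  /* overshoots and hunts */
        if (i % 40 == 0)
            printf("t=%4.1fs  sedan=%5.2f  racecar=%5.2f\n",
                   0.1 * i, sedan, racecar);
    }
    return 0;
}
```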

We’re stuck right now. Building more sophisticated devices requires a deeper understanding of engineering, not just physics. It is the sort of multidisciplinary endeavor previously reserved for robots and space stations. But imagine the possibilities once it all comes together.