Americans spend more than 10 hours every day staring at screens. But researchers are developing cutting-edge interfaces that could change the way we interact with digital media: interfaces that, in many cases, don't require screens at all. Instead, your skin is the interface. Objects around you are the interface. Architecture itself is the interface.

The evolution (or devolution) of the interface was prominently on display this week at the Association for Computing Machinery's 2018 CHI Conference on Human Factors in Computing Systems (ACM CHI, for short). The conference is a hub where the world's top minds share the latest breakthroughs in human-computer interaction. The papers and projects presented at ACM CHI act as a barometer for what the future of computers might look like, and this year they suggested that our computers will be increasingly embedded in the world around us.

Here are six of the conference’s most fascinating prototypes.


Who needs a screen when your house can do double duty? This project, a collaboration between Disney Research and Carnegie Mellon's Future Interfaces Group, transforms a simple wall into a touchscreen using inexpensive materials: layers of conductive paint and copper tape that cost only about $20 per square meter (10.75 square feet). So what can you do with a smart wall, anyway? You could turn certain spots on your bedroom wall into buttons that switch your lights on or off, for instance, or program a specific action to serve as a password that unlocks a door.


[Image: Tangible Media Group/MIT]

A new project out of MIT Media Lab's Tangible Media Group uses a technique called "electrowetting" to move water droplets around on a surface, essentially creating a water-based computer interface. The team calls it a "calm interface." Its video, which won the conference's award for best video demo, shows how the droplets could be used in art and interactive gaming. In one demo, the user steers a droplet by tilting the game board, and the other droplets are programmed to flee when it comes near. The team also imagines the droplets as a medium for communication: A woman writes the message "have a nice day" on her phone, and it appears in the mirror of her home, where her partner is brushing his teeth. That's way sweeter than a text message.
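The "flee" behavior in the game demo is simple to state: each programmed droplet moves directly away from the player's droplet whenever it gets too close. A toy sketch of that rule (my own illustration, not MIT's code; the threshold and step values are invented):

```python
# Hypothetical sketch of the droplet game's "flee" rule: a programmed
# droplet steps away from the player's droplet when it comes within a
# threshold distance, and otherwise stays put.
import math

def flee_step(droplet, player, threshold=2.0, step=0.5):
    """Return the droplet's next (x, y) position."""
    dx, dy = droplet[0] - player[0], droplet[1] - player[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist >= threshold:
        return droplet  # player too far away (or directly on top): no move
    # Move `step` units along the direction pointing away from the player.
    return (droplet[0] + step * dx / dist, droplet[1] + step * dy / dist)
```

In the real system, the electrowetting grid would then actuate the droplet toward that new position.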


Who said health tracking has to be serious? BioFidget is an augmented fidget spinner that also happens to pack a heart-rate-variability sensor and a respiration sensor. Researchers from the department of industrial design at Eindhoven University of Technology designed it to help reduce stress: You blow on the protrusions of the spinner to make it spin, and the slow repetition of this practice is meant to calm your physiological stress response. White and red lights on the device let you know when your heart rate and breathing have slowed. Most important, the experience doesn't require any sensors stuck to the user's skin, unlike many medical tests and health-tracking devices. BioFidget is more evidence that today's smartest interfaces are turning up in the most unexpected places.


Another project out of Carnegie Mellon's Future Interfaces Group offers a solution to the small size of a smartwatch screen: Why not project onto the skin of your arm instead? As the researchers acknowledge in their paper, this is "a long-standing yet elusive goal, largely written off as science fiction." But the LumiWatch can actually project a working touchscreen onto your skin. It's an exciting development, bringing some of the more far-flung ideas about turning your skin into an interface closer to something you could buy in a store.

Other researchers at ACM CHI focused on skin, too. A group from the Korea Advanced Institute of Science and Technology (KAIST) presented a concept for a smartwatch that communicates through your wrist, in addition to its screen. They created a wearable with four small fans paired with four tiny vibration motors on the back, which blow and vibrate against your skin to convey haptic information. Their proposal turns your arm into a secondary interface, one with a specific language of vibrations and air movements that can speak subtly and specifically.


Imagine if you could read your email without ever touching your phone, because you could feel the words on your skin. That might sound horrible (who wants more email in their life?), but Facebook has created a working sleeve system that translates words into tactile vibrations that users can decode. The system is basic so far: Users have to learn to associate particular patterns of vibration with phonemes, the basic sounds of a language. It's a bit like Braille for your arm. The most promising applications are for people with sensory impairments who are looking for new ways to read and communicate. The research is a fascinating example of an interface built for accessibility, though it's unclear how it fits within Facebook's business model or what it might ultimately be used for. The company did not respond to a request for comment. Perhaps one day this kind of sensor could be embedded in the sleeves of your shirt, allowing your clothes to talk to you.
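The core idea of such a system is a haptic "alphabet": each phoneme maps to a distinct vibration pattern, and a word is played as the concatenation of its phonemes' patterns. A toy sketch of that encoding (my own illustration, not Facebook's system; the phoneme symbols and patterns are invented):

```python
# Hypothetical sketch of a phoneme-to-vibration encoding. Each pattern
# is a list of (actuator_index, duration_ms) pulses that the wearer
# learns to associate with one phoneme.
PHONEME_PATTERNS = {
    "k":  [(0, 80)],
    "ae": [(1, 80), (2, 80)],
    "t":  [(3, 80)],
}

def encode_word(phonemes):
    """Flatten a phoneme sequence into the vibration pulses to play."""
    pulses = []
    for p in phonemes:
        pulses.extend(PHONEME_PATTERNS[p])
    return pulses

# "cat" -> /k ae t/ -> a sequence of four actuator pulses
```

Decoding runs the other way: the wearer feels the pulse sequence and, with training, maps it back to phonemes and then words.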