What if Instead of Living with Computers, We Lived in a Computer?

It’s a world where physical spaces and objects interact with technology. The computer is not a phone in your pocket or a screen on your desk; it’s all around you. That world is inevitable, and researchers at UCalgary’s Programmable Reality Lab are helping to create it.

Written by Jaelyn Molyneux, BA’05

Four centuries ago, William Shakespeare wrote, “All the world’s a stage.” In the 21st century, all the world is a computer. Or, at least, it will be.

Dr. Ryo Suzuki, PhD, leads the Programmable Reality Lab at the University of Calgary. It’s one of eight labs that form the large and collaborative Interactions Lab within the Department of Computer Science, where researchers explore human-computer interaction and information visualization.

For its part, the Programmable Reality Lab is exploring how to merge the physical and digital worlds so that they interact with each other. Instead of holding a computer in your hand or having one on your desk, the computer is all around you. Your entire environment is dynamic. It sounds intense, and maybe a little scary, but it doesn’t have to be.

It could unlock creativity and recover some of what was lost with the introduction of present-day computers, which are essentially designed to have us stare at a rectangular screen. Suzuki’s interest in programmable reality grew out of frustration with how physically limiting that design can be.

“That technology confines our potential,” he says. “More creative work happens in the physical world.” It’s how humans created for centuries, until our way of working shifted to desktop computers and mobile phones.

“It is time for more-immersive computers. It is an inevitable shift,” says Suzuki. “Almost everywhere can be a computer.”

Instead of using paper, the air around you becomes a canvas for art or a whiteboard for brainstorming. When you gaze at a magazine, annotations pop up off to the side, replacing our current reflex to grab a phone and Google for more background information. A collection of tabletop robots could mimic the movements and feel of factory equipment, allowing you to train and develop muscle memory before executing a task in a high-stakes situation. It’s all a blend of the physical and the digital.

Suzuki says programmable reality is still in its exploratory phase, similar to when searching the internet and looking at emails pushed the limits of the first smartphones.

“It was essentially a desktop computer in your hand,” says Suzuki. “It took time to explore the interface and expand it to include location-based information, for example.” That expansion led to the creation of platforms like Uber and changed the game for how our phones function.

Poised on the edge of cracking the potential of programmable reality, researchers in Suzuki’s lab combine existing hardware and software, and develop their own, to test theories and explore ideas. They work with equipment like tablets, touchscreens and the Microsoft HoloLens 2 headset. CNC machines, 3D printers, spools of wire and a makerspace are available in the lab to help with building prototypes. The researchers can take their ideas into the real world to gauge user experience, identify the limitations, and see how the technology could work in the future.

They anticipate a future where the technology continues to improve, making room for bigger ideas and smoother applications. Increased battery power to allow the equipment to run longer would be nice. Equipment that expands the field of view would also be great. Instead of a clunky mixed-reality headset, they see us carrying lightweight glasses to pull out when needed.

For now, here are a few of the ideas from the Programmable Reality Lab that are being explored and tested as the researchers prove concepts and lay the foundation for a future that turns the entire physical world into one big computer.


RealityCanvas

The canvas is in the air all around you. With RealityCanvas, a camera captures you as you stand in front of it. As you scribble sketches into the air, they appear on the screen. Those sketches move with you and respond to your actions. Maybe you draw the shape of an umbrella; you can then virtually grab the handle and the umbrella moves with you, protecting you from the virtual raindrops that start to fall. It’s all live and improvised. Storytelling just got more interesting. A classroom presentation becomes more dynamic. And your TikTok video steps up a level.


Teachable Reality

Teachable Reality explores how easily anyone could combine everyday objects with augmented reality. While it uses interactive machine learning, you don’t have to be a programming pro to turn a plate into a steering wheel for a virtual car, pinch your fingers together to make a virtual object smaller, or create an exercise counter that follows along as you work out. Its interface lets anyone create their own on-demand augmented reality, triggered by real-world actions.


ChameleonControl

What if your teacher could give you hands-on instruction from a different location? It’s called teleoperation, and it can be done with ChameleonControl, which uses synchronized mixed reality. While you wear a headset, the instructor can see what you see; in turn, you can see the instructor’s hands virtually in front of you. Synchronize your hands with theirs to mirror an action. For example, if you have a machine to assemble in front of you, you match your real hand to the instructor’s virtual hand as they guide you through putting the pieces together.


RealityChat

RealityChat is all about adding information to a conversation. You’re sitting across from a friend, chatting. Both of you have headsets on. As your friend talks, information triggered by keywords appears around them. For example, you might be making lunch plans, and an image of a restaurant your friend mentions pops up beside their face, along with its menu and address. It’s all happening in real time and in your eyeline. You never have to pull out your phone and type the question in. The conversation keeps flowing, with images, videos, text, news headlines, weather reports and more popping up around your friend.


Virtual Reality Haptics at Home

What if you could easily turn objects in your house into something else? With Virtual Reality Haptics, you can. Using a controller, a pillow becomes a cat that you can pet, or a table becomes a whack-a-mole game to play. It’s virtual reality that you can touch, feel and interact with.
