August 29, 2024

by

Tor Odland

Robots & Humans: Why we cheer on little robots

We were sitting at an outdoor restaurant in Tempe, Arizona, enjoying the warm evening air. Across the street, a small, cooler-shaped delivery robot was trying to navigate its way across the busy road. It would start, stop, turn back, and hesitate again as cars and pedestrians passed by.

After several minutes of this back-and-forth, something remarkable happened. Everyone on the patio started rooting for the little robot from Starship, a food delivery company. We wanted it to take the plunge, to dare to cross the street. We had all, without a word to each other, anthropomorphized this machine, assigning it a hesitant, almost shy personality. Interestingly, everyone was on the little robot’s side; no one was hoping it would get hit by a car or otherwise come to harm. When it finally made it across, the whole restaurant erupted in cheers.

The little food delivery robot from Starship Technologies

This experience highlighted something fascinating about human nature: our tendency to project human-like qualities onto machines. But it also raised a deeper question: What does the world actually look like through the eyes of a robot?

Machine perception: not what meets the human eye

Machine perception is how robots and other intelligent systems "see" and interpret the world around them. It involves a complex interplay of sensors, algorithms, and data processing that enables these systems to identify objects, navigate environments, and even interact with humans, giving the machines a kind of physical intelligence.

However, machine perception is fundamentally different from human perception. We rely on our senses – sight, hearing, touch, taste, and smell – to create a rich, subjective experience of the world. Robots, on the other hand, rely on sensors like cameras, LiDAR, radar, and ultrasound to collect raw data about their surroundings. This data is then processed by algorithms to extract meaningful information and make decisions. The little robot in Tempe was clearly receiving plenty of input from its sensors, but perhaps needed a bit of refinement in the routines that help it decide what to do with that input.
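To make that pipeline a little more concrete, here is a deliberately simplified sketch of the sense-interpret-decide loop described above. The function names, thresholds, and single-sensor setup are our own illustrative assumptions, not Starship’s actual logic or any particular sensor’s API; a real robot fuses many sensor streams and far more sophisticated decision rules.

```python
# A minimal, hypothetical sketch of a sense -> interpret -> decide loop.
# All names and thresholds are illustrative, not any vendor's real API.

from dataclasses import dataclass
from typing import List


@dataclass
class Obstacle:
    distance_m: float  # how far away the detected object is, in meters


def interpret(raw_echoes: List[float]) -> List[Obstacle]:
    """Turn raw sensor readings (e.g. ultrasonic echo distances in meters)
    into a list of obstacles. Real systems fuse cameras, LiDAR, radar and
    ultrasound; here we simply wrap the raw distances."""
    return [Obstacle(distance_m=d) for d in raw_echoes]


def safe_to_cross(obstacles: List[Obstacle], min_gap_m: float = 5.0) -> bool:
    """Decide whether to cross: only if every detected obstacle is
    farther away than the minimum safe gap."""
    return all(o.distance_m > min_gap_m for o in obstacles)


# Example: echoes at 2.1 m, 7.4 m and 12.0 m -> one obstacle is too close, so wait.
readings = [2.1, 7.4, 12.0]
if safe_to_cross(interpret(readings)):
    print("Crossing the street")
else:
    print("Waiting – an obstacle is too close")
```

A conservative threshold like this is also one plausible reason a delivery robot starts, stops, and hesitates at a busy crossing: every passing car or pedestrian briefly trips the "too close" condition.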

Why understanding machine perception matters

Understanding how robots "see" is becoming increasingly important in our technology-driven world.

From self-driving cars to robotic assistants in our homes, the way machines perceive the world is already shaping the way we live, work, and interact. We seem to be rushing headlong into these technologies, without stopping to think very much about the broader implications. 

In this series of blog posts, we’re going to be taking a closer look at how robots see and, even more importantly, how they interact with humans.

We’ll be talking to people like Rebekka Soma, a recent Ph.D. graduate from the University of Oslo, who are at the forefront of research in Human-Robot Interaction (HRI). And we’ll be sharing our thoughts as well. As a maker of sensors that help robots "see," we are keenly interested in the topic.

By understanding machine perception, we can better appreciate the capabilities and limitations of these technologies, and ensure they are developed and used responsibly.

Come along for the ride as we delve deeper into the world of machine perception, exploring the technologies that make it possible and the challenges that remain. We will also discuss the ethical implications of machine perception and how it is shaping our understanding of intelligence itself. And of course, we’ll be talking about how our 3D ultrasonic sensors fit into this whole world of machine perception.
