Horses are likely not the first thing that comes to mind when thinking about how people and autonomous vehicles will interact. They are, however, on the minds of Alex Mankowsky and Sabine Engelhardt, futurists at Daimler (the parent company of Mercedes-Benz). When they offered us the opportunity to spend an afternoon together along with Peepa, Point and Moritz, their three ponies, we eagerly agreed.
One of the most unsettling aspects of innovation is how frequently it omits human operation and interaction. The onset of autonomous vehicles, for instance, will eliminate almost all need for a human driver; it also reduces the ability of nearby pedestrians, bicyclists and bystanders to understand a car’s behavior. What if the two weren’t mutually exclusive? For the futurists (or “mobility philosophers”) behind Mercedes-Benz’s Future Technologies Department, finding a way to build a world where autonomous technology accommodates human existence and welcomes instinctual feedback is best achieved in the absence of technology.
For Engelhardt, this project dates back to 2015. At Ars Electronica in Linz, Austria, an audience member asked her partner, Mankowsky, who says his job is to “connect people from the inside and the outside,” whether humans would have to learn a new language in order to communicate their intentions to autonomous technology. At first she was shocked and appalled. Then she realized that if no one was actively working to ease the transition between current models and autonomous ones, perhaps they needed to.
She reflected on this during her long walks with their three ponies, noticing how they reacted to noises, distractions and obstacles, their ears and heads moving in response. And that’s how SLAP (See Like a Pony) began. “SLAP is a way of studying how the indicators of body language can be applied to a vehicle, so that our sensory system perceives the intention and does not miss, or misinterpret, any signals,” Mankowsky says.
Using their personal farm as a lab, the pair affix GoPro cameras to their three ponies and to themselves, then take the animals for walks around the area. With cameras on both parties, they can track instinctual reactions to passersby, sudden movements and sounds, including those that only a horse (and, eventually, an autonomous vehicle) could pick up.
“Empathy is a term we use to describe precisely this sensitivity to other people, other situations or animals. One current theory suggests that we can sense a person’s intentions by replicating their situation within ourselves, analyzing our own feelings and then mentally transposing these back—assuming that we have at some point experienced those feelings ourselves. As just one example of this, we are constantly trying to read the facial expression of the person we are talking to,” Mankowsky explains. “Empathy is the engine of harmonious mobility.”
“How often do we say that someone is driving aggressively or with a lack of confidence on the freeway? How do we know this? We can’t see the people at all. Just from the way the vehicle is moving and our observation of the style of driving, we determine how a person is apparently behaving,” Engelhardt adds. After we determine their behavior, we adjust ours accordingly.
In an act of defense, or to convey fear or worry, we can slow down or get out of the driver’s way. But how would an autonomous car fare if the roles were reversed? How would one react to a child on a skateboard popping out between parked cars or a runaway dog? “We have to integrate them into our fabric of life, in our processes, in our activities—that’s the task,” Mankowsky says. “As humans, we have three statuses: flight, approach and freeze.” But autonomous cars will never freeze in shock, and therefore can avoid many accidents.
One of the areas the team has been working on is creating indicators that people outside the autonomous car can observe, so they feel confident in how a situation will be handled. We saw some of these innovations in the recently debuted ESF concept car, with its enhanced safety features. Imagine if the side of the car or its glass could illuminate with a warning letting you know that it is aware of the kid on the skateboard; knowing this would reduce anxiety and build trust between people and autonomous vehicles.
Humans can easily tell when a horse intends to slow down or has recognized their presence. Mankowsky and Engelhardt are attempting to translate the cues horses give us into signals a vehicle can send, allowing humans to become more comfortable with an autonomous vehicle being responsible for recognizing, avoiding and perhaps even apologizing.
“The ‘I can see you’ signal, in particular, should be visible but not cause us alarm,” Mankowsky says. “Sensory awareness also needs to be made visible in automated vehicles,” Engelhardt adds. “You see, for people to start putting their trust in the machine, they need to be able to recognize immediately and intuitively what the autonomous vehicle is going to do.”
Watch our interview with Mankowsky (and the ponies) to learn more.
Images by Evan Orensten