Could a horse whisperer teach engineers about building better robots?

05 May 2023

(Image: Shutterstock)

Researchers from the University of Florida argue that the age-old relationship between humans and horses could inform the design of robots built to improve our lives.

Humans and horses have enjoyed a strong working relationship for nearly 10,000 years – a partnership that transformed how food was produced, people were transported and even how wars were fought and won. 


Today, we look to horses for companionship, recreation and as teammates in competitive activities like racing, dressage and showing.


Can these age-old interactions between people and their horses teach us something about building robots designed to improve our lives? Researchers at the University of Florida say yes.


"There are no fundamental guiding principles for how to build an effective working relationship between robots and humans," said Eakta Jain, an Associate Professor of Computer and Information Science and Engineering at UF's Herbert Wertheim College of Engineering. 


"As we work to improve how humans interact with autonomous vehicles and other forms of AI, it occurred to me that we've done this before with horses. This relationship has existed for millennia but was never leveraged to provide insights for human-robot interaction."


Jain, who did her doctoral work at the Robotics Institute at Carnegie Mellon University, conducted a year of fieldwork observing the special interactions among horses and humans at the UF Horse Teaching Unit in Gainesville, Florida. She will present her findings today at the ACM Conference on Human Factors in Computing Systems in Hamburg, Germany.


As horses did thousands of years before, robots are entering our lives and workplaces as companions and teammates. They vacuum our floors, help educate and entertain our children, and studies are showing that social robots can be effective therapy tools to help improve mental and physical health. 


Increasingly, robots are found in factories and warehouses, working collaboratively with human workers, and sometimes called cobots.


As a member of the UF Transportation Institute, Jain was leading the human factors subgroup, which examines how humans should interact with autonomous vehicles, or AVs.


"For the first time, cars and trucks can observe nearby vehicles and keep an appropriate distance from them as well as monitor the driver for signs of fatigue and attentiveness," Jain said. 


"However, the horse has had these capabilities for a long time. I thought why not learn from our partnership with horses for transportation to help solve the problem of natural interaction between humans and AVs."


Looking at our history with animals to help shape our future with robots is not a new concept, though most studies have been inspired by the relationship humans have with dogs. 


Jain and her colleagues in the College of Engineering and UF Equine Sciences are the first to bring together engineering and robotics researchers with horse experts and trainers to conduct on-the-ground field studies with the animals.


The multidisciplinary collaboration involved expertise in engineering, animal sciences and qualitative research methodologies, Jain explained. She first reached out to Joel McQuagge, from UF's equine behaviour and management programme, who oversees the UF Horse Teaching Unit. He hadn't thought about this connection between horses and robots, but he provided Jain with full access, and she spent months observing classes.


She interviewed and observed horse experts, including thoroughbred trainers and devoted horse owners. Christina Gardner-McCune, an Associate Professor in UF's Department of Computer and Information Science and Engineering, provided expertise in qualitative data analysis.


Data collected through observations and thematic analyses resulted in findings that can be applied by human-robot interaction researchers and robot designers.


"Some of the findings are concrete and easy to visualise, while others are more abstract," she says. "For example, we learned that a horse speaks with its body. You can see its ears pointing to where something caught its attention. 


"We could build in similar types of nonverbal expressions in our robots, like ears that point when there is a knock on the door or something visual in the car when there's a pedestrian on that side of the street."


A more abstract and groundbreaking finding is the notion of respect. When first working with a horse, a trainer looks for signs that the animal respects its human partner.


"We don't typically think about respect in the context of human-robot interactions," Jain says. "In what ways can a robot show you that it respects you? Can we design behaviours similar to what the horse uses? Will that make the human more willing to work with the robot?"


Jain, originally from New Delhi, says she grew up with robots the way people grow up with animals. Her father is an engineer who made educational and industrial robots, and her mother was a computer science teacher who ran her school's robotics club.


"Robots were the subject of many dinner table conversations," she says, "so I was exposed to human-robot interactions early."


Horses, however, were new to her. During her year-long study of the human-horse relationship, she learned how to ride and says she hopes to one day own a horse.


"At first, I thought I could learn by observing and talking to people," she says. "There is no substitute for doing, though. I had to feel for myself how the horse-human partnership works. From the first time I got on a horse, I fell in love with them."

