As the world braces for the arrival of machines that can perceive, reason and interact like humans, a few companies are looking past the limits of today's technology toward what may come next.
Robot-vision company Hanson Robotics is developing a device that can detect facial features and even track where a person is looking to learn more about the human condition.
That capability could ultimately help doctors pinpoint exactly where a patient's gaze falls, a critical piece of the puzzle for vision therapy and other vision treatments.
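Hanson Robotics has not published how its device works, so as a purely hypothetical sketch: given eye-corner and pupil landmarks from any face detector, horizontal gaze can be estimated from the pupil's normalized position within the eye. All coordinates, names and thresholds below are invented for illustration.

```python
# Illustrative sketch only -- not Hanson Robotics' actual method.
# Inputs are (x, y) pixel coordinates from a hypothetical landmark detector.

def gaze_ratio(inner_corner, outer_corner, pupil):
    """Return 0.0 when the pupil sits at the inner eye corner, 1.0 at the outer."""
    span = outer_corner[0] - inner_corner[0]
    if span == 0:
        raise ValueError("degenerate eye landmarks")
    return (pupil[0] - inner_corner[0]) / span

def gaze_direction(ratio, centre_band=(0.40, 0.60)):
    """Coarse gaze label from the horizontal ratio; the band is an assumption."""
    lo, hi = centre_band
    if ratio < lo:
        return "inner"
    if ratio > hi:
        return "outer"
    return "centre"

# Example: pupil 26 px from the inner corner of a 40-px-wide eye.
r = gaze_ratio((100, 50), (140, 50), (126, 50))
print(round(r, 2), gaze_direction(r))  # 0.65 outer
```

A production eye tracker would also model head pose and the vertical axis; this one-dimensional ratio is just the core geometric idea.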
The company’s device is expected to hit the market this year, but it’s not the only one in development.
Other companies are building robots that mimic human behaviors, and many of those efforts involve creating robotic models to help surgeons see how human beings interact with one another.
Robots are also becoming increasingly adept at understanding the emotions of people, and a new study published last month in the journal Nature Communications offers some tantalizing hints at what the future holds.
The study, which is also being presented at a conference this week, examined the brain activity of about 400 people watching videos of two humans interacting, and found that some participants readily identified the emotion of the person they were watching while others did not.
Researchers used fMRI scans of the participants’ brains to determine how they were processing emotional stimuli in the videos.
They found that brain regions involved in processing emotion were more active in the right hemisphere, while regions involved in analytical thinking were more active in the left.
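The article does not describe the study's analysis pipeline, but left-right asymmetries like this are conventionally summarized with a lateralization index computed over mirrored regions of interest. The following sketch uses invented numbers and a generic index definition purely to illustrate the comparison.

```python
# Illustrative sketch, not the study's actual pipeline: compare mean
# activation between mirrored left/right regions of interest (ROIs).

def lateralization_index(left_vals, right_vals):
    """(R - L) / (R + L): positive means right-dominant activation."""
    l = sum(left_vals) / len(left_vals)
    r = sum(right_vals) / len(right_vals)
    return (r - l) / (r + l)

# Hypothetical per-subject mean activations in an emotion-related ROI pair.
left = [0.8, 0.9, 0.7]
right = [1.4, 1.3, 1.5]
li = lateralization_index(left, right)
print(round(li, 2))  # 0.27
```

A positive index, as here, would correspond to the right-hemisphere dominance the study reports for emotion processing; a real analysis would test the index against zero across subjects.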
That raised the question of whether the asymmetry reflects dedicated circuitry for emotion or simply differences in how each participant's brain engaged with the task.
That question remains open.
The team that conducted the study leans toward the latter interpretation: rather than pointing to a distinct emotion-processing system, the differences more likely reflect general variation in how the brain operates during emotion processing.
“We know that when we look at people and ask them to identify someone, we tend to look at them in a way that is similar to how we would identify someone from a picture,” lead author Daniel Zieminski, a neuroscience researcher at the University of Wisconsin-Madison, said in a statement.
“We think that is a way of understanding what we call emotion processing.”
“It would seem that emotions are a way to understand what we would like to know about people, or how we might want to change people’s behavior,” he continued.
The study is available online, and the results are being shared with researchers at the National Institutes of Health.
Researchers also plan to examine the brain’s reaction to another emotion, sadness, to learn whether this might also be an indication of emotion processing in humans.
“In the future, we could use this to identify people who are in different emotional states and use that to make inferences about how we should treat them,” Zieminski said.
Zieminski said the study also suggests that the brain readily detects facial expressions but does not automatically interpret them.
That is, even people who know the person in a video are not necessarily able to read that person's expression, which suggests that detecting a facial expression and interpreting it are separate steps.
“When you are a child, you can see that the people in the room are smiling or frowning, but you don’t automatically know what the smile or frown means,” Zieminski said.
“So we know that this is a problem that we will need to work on, and it might help us understand the brain more.”