The benefit was supposed to be that all of those messy human emotions would be banished and they would do exactly what management wanted with a ruthless, implacable efficiency that would make any internal auditor's heart go pitter-patter.
So the scientists couldn't leave well enough alone and had to program robots with the capability of developing feelings? Next they're going to be joining the argument over who gets the company seats at the baseball stadium. The good news is that if the robots become too obnoxious, managers may decide humans are preferable, after all. JL
James Young reports in the Wall Street Journal:
As humans and robots interact more regularly, companies will have to adapt to the inevitable social interaction between man and machine. Humans almost inevitably will treat the machines as living things. So companies need to prepare for all sorts of social issues that might crop up.
Move over and make room for your new colleague: a robot.
Robots are starting to move out of their isolated assembly lines to work increasingly alongside people and even collaborate with them as team members or assistants. Robots, for instance, are handing people tools or parts, while people are grasping robotic arms and moving them to show a robot how to perform custom tasks.
These human-robot teams leverage the skills of both. Robots can lift and move heavy objects, make precision movements and use advanced computer vision to inspect work. People, meanwhile, are extremely flexible, particularly for unpredictable or highly custom work, and have the manual dexterity and knowledge required to closely manage the robots. In the near future, we will have even more dynamic human-robot teams, where products and parts are handed back and forth for specialized work, and robots and people ask each other for advice.
Not just a machine
On the one hand, these robots are just another modern machine to enter the factory. However, new research shows that people don’t treat them that way; rather, they treat them as living things.
In some sense, that isn't surprising: Humans are hard-wired for social interaction with the world. We use emotions and personalities to describe the weather, or even how machines such as our cars act. Research has shown this tendency to be exceptionally strong with robots as they autonomously move around people. It is not necessary for robots to look like a person; even a small disc robot moving around a factory floor, with intentions and goals of its own, quickly comes to be seen by the humans around it as a living thing. Even people who vehemently object to the idea of assigning emotions and personalities to robots start to do exactly that after minimal time.

All of which means that as humans and robots interact more regularly, companies will have to adapt to the inevitable social interaction between man and machine. Here are four specific actions companies should take, based on my team's research:
Give robots social skills to improve work efficiency. People are naturally skilled at quickly analyzing social interaction, and this is as true of reading robots as it is of reading other people. For example, instead of surprising co-workers with unexpected movements, robots can use their eyes (where they look and how long they gaze) or gestures to signal intention before acting, maximizing team-worker awareness.
Our team found that people interpret how a robot moves (fast, slow, smooth or jerky) in emotional terms. Robots should use this to broadcast states such as fatigue (batteries running low or maintenance soon needed), excitement (important information to share) or curiosity (more information needed). That way, people can intuitively notice these things while working, rather than having to monitor a computer screen, tablet or readout.
We even found that a small robotic tail (like a dog's) can communicate similar things. No one would suggest replacing detailed technical diagnostics or work reports with a smile or gesture, but robots should use social cues as a simple way to maintain team-worker awareness, the same way people do.
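To make the idea concrete, here is a minimal sketch (in Python) of how a controller might map internal diagnostics to the expressive motion cues described above. The state names, thresholds and MotionProfile fields are illustrative assumptions for this sketch, not an API from the research:

    from dataclasses import dataclass

    # Hypothetical motion parameters; the field names and ranges are
    # illustrative assumptions, not a real robot API.
    @dataclass
    class MotionProfile:
        speed_scale: float  # 0.0 (sluggish) to 1.0 (brisk)
        smoothness: float   # 0.0 (jerky) to 1.0 (fluid)
        gaze_target: str    # where the robot "looks" before acting

    def expressive_profile(battery_pct: float, has_urgent_info: bool,
                           needs_input: bool) -> MotionProfile:
        """Map internal diagnostics to movement cues people read as emotion."""
        if battery_pct < 20:
            # "Fatigue": slow, fluid motion signals low battery or upcoming
            # maintenance without anyone checking a readout.
            return MotionProfile(0.3, 0.9, "floor")
        if has_urgent_info:
            # "Excitement": quick, slightly jerky motion draws attention.
            return MotionProfile(0.9, 0.5, "teammate")
        if needs_input:
            # "Curiosity": moderate pace, oriented toward the teammate.
            return MotionProfile(0.5, 0.8, "teammate")
        return MotionProfile(0.7, 0.8, "task")  # neutral working state

The point of such a mapping is that co-workers pick up the robot's state peripherally, through movement quality, rather than by polling a dashboard.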
Be careful when robots give people commands and recommendations. Our work has shown how robots can pressure people to do things they would rather not do. While computers and machines regularly give us recommendations or directions, such as automatic emails or alerts when a task is due, the same direction from robots can have a much stronger impact given the social-interaction elements that come into play.
In our research, we showed how a simple, small robot could pressure people to continue a highly tedious task—even after the people expressed repeated desire to quit—simply with verbal prodding.
Most surprising was not that people obeyed the robot, but the strategies they employed to try to resist the pressure. People tried arguing and rationalizing with the robot, or appealing to an authority who wasn't present (a researcher), but they either continued their work anyway or gave up only when the robot gave permission.
One could imagine a robot giving seemingly innocuous direction such as to make a bolt tighter, change a tool setting or pressure level, or even to change which electronic parts are used. However, what if the robot is wrong (for example, due to a sensor error) and yet keeps insisting? Will people doubt themselves given robots’ advanced knowledge and sensor capability?
This has important implications for a workplace: Who is responsible for damage if a robot pressures a person into making a mistake? Companies should be aware of the persuasive power of directions coming from a social robot, and should create policies and procedures for identifying human-robot disagreements and for resolving them when they arise.
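One way a company might operationalize such a policy is to record every case where a worker overrides, or defers to, a robot's instruction, so disputed calls can be audited later. A minimal sketch in Python; the event fields and file format are assumptions for illustration, not part of any real system:

    import json
    import time

    # Hypothetical audit record for a human-robot disagreement; all field
    # names here are illustrative assumptions.
    def log_disagreement(robot_id: str, instruction: str,
                         worker_action: str, worker_id: str,
                         path: str = "disagreements.jsonl") -> None:
        """Append a disagreement event so responsibility can be audited later."""
        event = {
            "timestamp": time.time(),
            "robot_id": robot_id,
            "instruction": instruction,      # what the robot directed
            "worker_action": worker_action,  # e.g. "complied", "overrode", "escalated"
            "worker_id": worker_id,
        }
        with open(path, "a") as f:
            f.write(json.dumps(event) + "\n")

    # Example: a worker overrides a torque instruction they believe is wrong.
    log_disagreement("arm-07", "tighten bolt to 45 Nm", "overrode", "w-1123")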
Develop a strategy for empathy toward robots. People feel bad when bad things happen to robots. It is not surprising that most people are unhappy when expensive equipment breaks or malfunctions, but a growing body of research shows that people feel empathy for robots when they are harmed, even by something as minor as losing their memory.
There are reports from military situations, for example, where soldiers have demanded that their robots be repaired rather than replaced, or are hesitant to place their robots in danger.
There are a couple of ways to think about this in a workplace environment. On the one hand, sentiment toward these robots might help reduce equipment abuse: if people feel bad when robots are damaged, they may be less likely to harm them.
On the other hand, people might hesitate, at the cost of productivity, to put robots in hard situations where they may be damaged, or might put off repairing or replacing damaged robots despite reduced output. Maintenance or replacement cycles in which people feel bad for the robots may also affect morale and mental health.
Companies need to be aware of these possibilities, monitor for impacts, and establish practices and procedures that are sensitive to people's emotions and potential reactions. For example, companies might want to separate robots that do collaborative tasks (and thus build rapport) from those that do dangerous ones (and are at risk of damage). That would minimize opportunities for workers to directly observe robots being mangled or disassembled.
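As a toy illustration of that separation, a dispatcher could keep rapport-building robots and hazard-exposed robots in distinct pools; the pool names and the selection rule below are purely hypothetical:

    # Illustrative fleet-assignment rule: collaborative robots build rapport
    # with workers, so hazardous jobs go to a separate pool that workers
    # rarely interact with. All identifiers are assumptions.
    COLLABORATIVE_POOL = {"cobot-01", "cobot-02"}
    HAZARD_POOL = {"rig-09", "rig-10"}

    def assign_robot(task_is_hazardous: bool) -> str:
        """Pick a robot from the pool matching the task's risk profile."""
        pool = HAZARD_POOL if task_is_hazardous else COLLABORATIVE_POOL
        return sorted(pool)[0]  # trivial pick; real dispatch would balance load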
Watch for social issues between people and robots. While the idea of people having social issues with a robot might seem bizarre, remember that as we treat robots more as social actors, we bring expectations and preconceptions to our interactions with them.
Research has shown people feel less comfortable around robots who break social norms, such as by having shifty eyes or mismatched facial expressions. A robot’s personality, voice pitch or even the use of whispering can affect feelings of trust and comfort.
Working with a robot that always grins while criticizing you, stares at your feet while giving recommendations, gazes off into space at random or sounds sarcastic while providing positive feedback would be awkward and uncomfortable, and would make it hard to develop trust in the machine.
Gender roles
These are clear examples, but the social structures around human-robot interaction can get quite complex and difficult to predict. People even attribute gender stereotypes to robots perceived as male or female and apply traditional gender roles to them; men and women likewise interact differently with "male" or "female" robots.
Robots that have a level of social intelligence and a well-thought-out personality design are more likely to be comfortable and effective to work with. But many robots won't have that level of design, so companies should watch for social issues and take worker complaints seriously.
It is still early days for robots collaborating with people, and it is difficult to tell for certain exactly how these interactions will unfold. The early indicators, however, suggest that robots will occupy a unique role unlike that of any other technology we have worked with. Companies that recognize this, and take early steps to integrate robots with the social and emotional aspects in mind, will have an advantage in maximizing how well people work with robots while avoiding the problems that are likely to arise.