Socializing robots

Description

Why should robots have artificial social intelligence? According to Heather Knight, assistant professor of computer science, if robots are going to help in hospitals or work with people in factories, they will need to be adapted to our social conventions.

Heather Knight and her team in the CHARISMA Robotics Lab at Oregon State are working on developing artificial social intelligence for robots.

Season number
Season 9
Episode number
4
Transcript

[Clip from Star Wars: The Last Jedi. Walt Disney Studios Motion Pictures, 2017.]

LUKE: Artoo?

ARTOO: [Beeping]

LUKE: Artoo! Yes. Yeah, I know. Hey, it's a sacred island, watch the language. Old friend. I wish I could make you understand.

ROBERTSON: It’s amazing how believable that relationship is between Luke and Artoo. Okay, it’s fiction, but we forget that Artoo is just software and hardware and start thinking of him as a creature. I find this so fascinating.

I’m your host Rachel Robertson and in this episode we are continuing the theme of robotics and AI by exploring the social side of robotics.

[MUSIC: “The Ether Bunny,” by Eyes Closed Audio, used with permission of a Creative Commons Attribution License.]

ROBERTSON: From the College of Engineering at Oregon State University, this is “Engineering Out Loud.”

ROBERTSON: The interactions between robots and people are something Heather Knight finds fascinating too. She is an assistant professor of computer science here at Oregon State. In a research study, she found evidence that people anthropomorphize robots in real life. Maybe not to the same extent that we see in Star Wars, but still, it’s like we are wired to do it.

KNIGHT: One of the most defining features of being human is sociability. The way our entire society is structured is through these kinds of relationships and mutual respect. Otherwise we would all have, like, our own little huts and never see each other. But, this idea that we have common roads, that we can even have multiple people living in the same building. Even that we live in families, like, from our infancy, we are always around other people and constantly moderating and communicating and structuring. And so basically if robots can't do that, they can't be around us.

ROBERTSON: That’s Heather. A large part of her research program is designing socially intelligent behavior systems for robots. She and her team go by the name of the CHARISMA Robotics Lab. Charisma is such a great word: not only does it apply to her work to make robots more charismatic, but, as I think you’ll agree, it also applies to her. Here she is speaking on the Science Pub stage in Corvallis.

KNIGHT: So the field of robots is in transition. There was a lot of magic, or not magic depending on how you think about it, that has happened in terms of automation. And there are robots everywhere. They sort our mail. They go to planets that we can't yet reach. They help make a decent amount of manufacturing, less than you'd think. I think it's still like 60% human, in terms of production lines. But the big shift today, even in manufacturing, is towards people working side by side with machines. So we have collaborative robots, we have robot competitions, lots of kids … anyone ever do a robot competition? Yeah. Know a relative or a friend? Yeah. I think Corvallis has sort of taken over. Yeah. High five. But yeah, so there's a lot of amazing robotic applications and some of them are purely functional. This is TUG. TUG delivers linens and medical samples in hospitals.

ROBERTSON: Heather goes on to show other examples of collaborative robots, but I found the TUG robot to be pretty compelling because it’s easy to imagine the usefulness of a robot that can take over some of the mundane delivery tasks in a hospital, freeing up workers’ time for more important work. These are real robots that are currently in use — in fact, the company’s website claims that 450 of them are installed in hospitals. But Heather points out there could be some barriers to widespread use.

KNIGHT: So the challenge in making this transition from robots that are behind a wall that we can't see to robots in our everyday life is that these robots need to know how to behave, right? In the same way that we teach toddlers not to do incredibly rude things. We don't want to have that TUG robot that's delivering medical samples to block the surgeon from getting to his or her patient. Right? So that's, that's logistical. But then there's also just a lot of rules of the road of how we socialize and interact with each other. And we don't want these robots to be frustrating and we don't want these robots to be rude, right? Because if we don't trust them, we can't collaborate with them and we won't buy them.

ROBERTSON: Heather’s area of research is called social robotics.

[MUSIC: “Kitten” by Podington Bear, licensed from Sound of Picture.]

She was introduced to this area when she was an undergraduate at MIT, where she worked with Cynthia Breazeal, an associate professor of media arts and sciences, who is considered to be the pioneer of social robotics. Heather went on to get her master’s degree at MIT and another master’s and doctorate at Carnegie Mellon. Along the way, she has forged her own path in social robotics that derives inspiration from entertainment. This has taken her in some pretty interesting directions — she has performed on the TED stage with her joke-telling robots. She is the founder of Marilyn Monrobot, a theater company that features robotic actors, and she has worked with Syyn Labs to create technology-based artwork such as the Rube Goldberg machine in a music video by OK Go for their song “This Too Shall Pass.” She is also the executive director of the Robot Film Festival and was the artist in residence at X, the research lab at Google’s parent company, Alphabet. It sounds like fun, right? But let’s get back to the field of social robotics and why she diverged into entertainment for inspiration.

KNIGHT: In the basically 20 years that this field has existed, most approaches have come from psychology. So people will look at how to do user studies and controlled user studies in a very psychology-oriented way. There's some literature that we can use directly from psychology to design the machines, but usually we have to run our own studies. But basically, after being in the field for, like, 16 years, I felt like it was hitting a wall and that maybe psychology wasn't always telling us how to actually make things. It, like, helps us actually measure and track things, but it doesn't tell us where the behavior is coming from or how to construct it. Whereas something like acting or performance is much more formative. So, in acting and performance, when you're trying to create a new character or think about how someone will react to this, like, horrible situation they're in, they actually have an entire training program for this. Where people will practice being different kinds of people and they will form this entire approach and methodology for how to create and craft compelling, consistent characters that have relationships over time. And I thought that that would be really helpful, as we're starting to make these artificial relationships between robots and people.

ROBERTSON: I mentioned at the beginning of this podcast that Heather conducted a research study where people who interacted closely with robots viewed them as some kind of creature. This project was with Clearpath Robotics, a company that makes autonomous, mobile transport robots to operate in warehouses and factories. Imagine a sleek, industrial-sized Roomba that can transport parts between workstations at a jogging pace.

[MUSIC: “Vibe Drive” by Podington Bear, licensed from Sound of Picture.]

The first part of the research project was to study how the employees interact with the robots, so they could figure out what kinds of social intelligence would be the most useful to add to the systems. To get help with this, she worked with Boh Chun, a graduate student in anthropology at Oregon State.

KNIGHT: It's been really fun because basically we found that there's about 150 people at the company, at least onsite. And they have this huge area where they'd go and test the robots, and the people that have their desks on the test floor -- the test team -- they have all kinds of crazy, like, stories about these robots throwing a party. They'll bring their families in on the weekends to come visit the robots. There was one robot that they would call "problem child." It had, like, duct tape on it because it ran into the side of the building sometimes and stuff. And so, like, they had all these complex personas for these robots.

Whereas these other people, like the software team that worked at the opposite end of the building, didn't really get to know individual robots. They were more like working with aggregate data sets. They did not have that kind of, what we would call, anthropomorphization at all. Like, they did not attribute human characteristics. They thought all of them were exactly the same, which is factually incorrect because they behave differently, right? But they just didn't see that. And so, one of the things we were trying to understand is when is it actually helpful to integrate sociability into robots, and in what situations is it not? And so basically what we're finding there, which I think would be a good preview for the factories and warehouses, is if the people aren't really spending that much time with the robots, like, I wouldn't waste any time trying to build any relationships between the robot and those people.

[MUSIC: “Vibe Drive” by Podington Bear, licensed from Sound of Picture.]

ROBERTSON: It makes sense that the software team would not view the robots as creatures since they don’t interact with them directly. But Clearpath’s clients are more similar to the testing team members. In the warehouses and factories they sell to, the robots operate as coworkers with the human employees. For that reason, Clearpath is motivated to figure out how to modify their robots to be more acceptable to people who are working with them directly. In the study, Heather found that people wanted the robot actions to be more human-like, even in how the two different kinds of robots in the factory interact with each other.

KNIGHT: The smaller robots apparently cut off the bigger robots a lot. They see them more like small dogs, like chihuahuas or terriers that are, like, high energy. And the larger robots, like Great Danes. They use a lot of dog metaphors rather than human metaphors. You start thinking, like, ‘Oh, it's just, like, small dog syndrome. It, like, needs to show off in front of the big ones.’ But the big ones, like, they're confident enough in themselves, so they don't need to prove anything. So we hear things like that. So it's been interesting how something as simple as the type of sensors these two different robots have and how fast they can accelerate and slow down makes them suddenly develop these larger personas. So with a little bit more manipulation of that from a conceptual side, if we wanted the robots that were smaller to seem more responsible, even though it's not necessary, you could also have them kind of slow down their accelerations and be a little bit more, not necessarily cautious, but convey confidence rather than this emergent, more erratic, sort of playful, chew-at-your-ankle kind of impression. And that could actually be impactful to the people around it, in terms of being like: if this thing is too playful and childlike, it's probably not taking its job seriously, or, like, I won't trust it as much, or it's going to cut me off, or I wouldn't want that thing to be around the forklifts.

So it's sort of interesting. On the other hand, if the Great Dane is too chill, right, you're just like, I don't know if it really cares about this job enough. And so it could actually be a little bit helpful for it to express that, for example, especially if the factory line is running behind, because they have very explicit targets that you can track. It would be neat for their motion style to change to convey that they are also trying to meet targets. And some of that will happen automatically, because expression often comes from exaggerated versions of the functional thing. But yeah, you could also make it more explicit.

[MUSIC: “Vibe Drive” by Podington Bear, licensed from Sound of Picture.]

ROBERTSON: Expressive motion is something that Heather has studied in the past as well. When she was a graduate student at Carnegie Mellon, there was an autonomous mobile robot that they would dress up for Halloween to do reverse trick-or-treating — delivering candy to people’s offices.

KNIGHT: And so I decided one year to make that a research experiment, and basically I had the robots vary their motion style. They would vary how quickly they were moving and also how they would orient towards people when they got to the office. And it turns out that if you have candy, the orientation doesn't matter that much. People like the candy all the time. But that velocity feature, like how quickly it was moving, had a massive impact on whether people would take candy from the robots. So basically, when the robot was moving down the hallway quickly, people thought it was trying to get somewhere, and so they would be much less likely to interrupt the robot in the hallway than they would be if it were moving slowly, where it just kind of looks like it's, ‘Oh, I'm here, I got some candy, who wants candy? Happy Halloween.’ It's not saying those things, but it felt like it was saying those things when people watched it moving.

ROBERTSON: In fact, people were more than twice as likely to take candy from a slow moving robot as they were from a fast moving robot.

KNIGHT: Motion is really important. And even something as simple as changing the robot speed can massively impact how people will choose to interact with it or even how people will choose to navigate around it.

ROBERTSON: We’ve only scratched the surface of Heather’s research. There are chair robots, jellyfish robots, robot comedians. If you want to learn more, you can check out the bonus content at engineeringoutloud.oregonstate.edu. But for now, I’ll let her sum up broadly what her research is about.

KNIGHT: All of my research tends to contribute to this concept of how can robots succeed in human environments, and also what opportunities are there for robots in our everyday life. So whether that be at work, at home, on the streets, when we're exercising, on shared sidewalks. And so I guess I want to make robots that, like, empower people, not subjugate them. But yeah, so I think that it's really important to design technology with social goals and also evaluate that technology with the people you're supposedly helping. Like, I think user-centric design is really helpful. One of the things I'm really excited about is ethnography and understanding the values that we should design into machines by actually releasing it into the world and talking to people about it or observing their behaviors with these systems. And we have a lot of other people at Oregon State that are also contributing to some of these problems. There's 13 faculty in robotics right now, and at least four of us do social robotics or human-robot interaction research at least part of the time. And so we have, like, you know, a little powerhouse here. So a lot of us also care about answering this question of how robots can actually add value to human society.

[MUSIC: “Chimera” by Podington Bear, licensed from Sound of Picture.]

ROBERTSON: If you have been listening to this whole season on robotics and AI, this is a sentiment that you’ve heard more than once. Our researchers here want the work they do to be a force for good. But they also realize that the public has concerns, and they have concerns themselves. Here’s Heather answering a question from the Science Pub audience.

AUDIENCE MEMBER: So, um, there's a, I think you had a quote about how robots should support things we care about, but then there's also potential conflicts with that. I'm thinking about the movie Robot and Frank.

KNIGHT: I love that movie. Thank you for mentioning it. You should all watch this movie, guys. Unfortunately, it's no longer free on Netflix, but you can get it on Amazon Prime. You can watch it for $2.99, ’cause I made my class watch it this past quarter. Okay. Sorry. Robot and Frank.

AUDIENCE MEMBER: In terms of the robots trying to support something he cares about, but, you know, it involves breaking other rules.

KNIGHT: Yeah. Is adding technology to our life always absolutely perfect and good? No! Should we think about the future and the different ways that technology will impact the world? Yes. Yes. I think that's really important. And, I mean, I am optimistic about technology, but I think blind optimism is not in service to anyone, and we should absolutely think about what we want from technology and design towards that future. And if we're getting something really different, we should revise. And, you know, I was trying to brainwash some of the computer science undergrads this past term. I was teaching an ethics and social issues in computer science course, and I was like, ‘Guys, you are the future! Like, you are going to know this technology much better than anyone in Congress. Like, guaranteed. So, like, be responsible. It needs to come from all directions.’

[MUSIC: “Chimera” by Podington Bear, licensed from Sound of Picture.]

This episode was produced by me, Rachel Robertson. Audio editing by Molly Aton. Our intro music is “The Ether Bunny” by Eyes Closed Audio on SoundCloud, used with permission of a Creative Commons Attribution License. Other music and effects in this episode were also used with appropriate licenses. You can find the links on our website. For more episodes, visit engineeringoutloud.oregonstate.edu or subscribe by searching “Engineering Out Loud” on your favorite podcast app.

I’m going to let Jon the Robot take us out. This clip is from the show “Singu-hilarity: A Robot Comedy Variety Show” organized by Naomi Fitter, assistant professor of robotics. Keep your eye out for upcoming shows at the Majestic Theatre in Corvallis.

JON THE ROBOT: I auditioned for the role of C-3PO. They said I wasn't tall enough. I auditioned for the role of Rosie in The Jetsons reboot. They said I wasn't thick enough. I auditioned for the role of WALL-E. They said I was too put together. These beauty standards are impossible. I'm thinking of getting plastic surgery, where they epoxy more plastic to my exoskeleton. You see, because I am a robot and made of plastic, any modification to my exterior is plastic surgery.
