Teaming with robots


Description

How do you talk to a robot? How about 250 robots? Julie A. Adams, professor of computer science, describes her research on human-robot interaction and the benefits and challenges of drone swarms.

Season number
Season 5
Episode number
1
Transcript

NARRATOR: From the College of Engineering at Oregon State University, this is Engineering Out Loud.

ROBERTSON: Welcome to Season 5, everyone. I’m Rachel Robertson.

[MOVIE CLIP (Star Wars): I am C-3PO, human-cyborg relations, and this is my counterpart, R2-D2. Beep, beep, beep. Hello.]

ROBERTSON: That’s a throwback to the original Star Wars for you, because today our topic is human-robot interaction. Over 40 years ago, George Lucas imagined a world where robots and humans interacted with ease. But in reality, we are still far from creating C-3PO. In fact, human-robot interaction is an incredibly complex task. And if you consider trying to communicate with a large number of robots all at once, it becomes even more complex. It takes a person with a diverse set of skills to tackle this area of research.

ADAMS: I'm Julie Adams. I am a fairly new professor at Oregon State. I'm a professor of computer science with courtesy appointments in mechanical engineering and industrial engineering.

ROBERTSON: Julie is also an associate director of the Collaborative Robotics and Intelligent Systems Institute (known as CoRIS), which we will talk more about later in the podcast.

Our theme this season is collaborations and partnerships which turns out to be pretty important for research in robotics. But first let’s learn more about what Julie does. I asked her how she would explain her research to someone she just met.

ADAMS: Oftentimes I start out by talking about the fact that I work with robots.

[MUSIC: “Harps Uplifting” by Mortal Thing used with permission of the artist.]

And that then typically leads to a conversation about: What about robots do you do? I often then talk about the fact that I work with ground, aerial, marine robots, typically in teams. And I'm curious about how to create the teams so that they are capable of doing useful tasks that help their human partners. And also how humans can interact with their robotic teammates, either directly like you and I would interact to move a table around, or indirectly meaning that the team is located, perhaps, on the other side of campus or maybe even downtown on Second Avenue, and we're sitting here and I want to be able to interact with the team so we call that a remote interaction.

ROBERTSON: The robots that Julie works with are often called drones, and a team of them would be a drone swarm.

ADAMS: So, when you say drones most people think of unmanned aerial systems.

ROBERTSON: If you are having trouble conjuring up an image of an unmanned aerial system, think of a small remote-controlled helicopter with four blades, known as a quad-rotor. Many of you may have bought one for your kids, or yourself, for Christmas -- which hopefully hasn’t crashed yet or flown off on its own and gotten lost at sea.

[SOUND EFFECT: Drone take off, used with permission of Creative Commons Public Domain]

ADAMS: In my world, drone swarms are not just aerial systems; they could be combinations of ground vehicles -- from small mobile robots to autonomous cars -- combined with aerial systems. It could also be out in the middle of the ocean: you could have underwater gliders combined with autonomous surface vehicles combined with aerial vehicles. Your drone swarm in that case is what we call a heterogeneous swarm, meaning that you have different types of vehicles and different capabilities.

[SOUND EFFECTS: Car door close engine start, Drone take off, Submarine, used with permission of Creative Commons Public Domain]

ROBERTSON: Swarms are usually 50 or more vehicles that have both local communication with each other and distributed decision making -- like you see with schools of fish that move as a group to get away from a shark, for example. But what would you do with a drone swarm?
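[EDITOR’S NOTE: The local-rules behavior Rachel describes -- each vehicle reacting only to nearby neighbors, with no central controller -- is the classic flocking idea. Below is a minimal, purely illustrative Python sketch, not code from Julie’s research; the radius and gain values are arbitrary assumptions.]

```python
def step(positions, velocities, radius=5.0, cohesion=0.01, separation=0.05):
    """One update of a minimal 2-D 'spatial swarm': each agent reacts only
    to neighbors within `radius` (local communication, no central control)."""
    new_vels = []
    for i, (x, y) in enumerate(positions):
        neighbors = [(px, py) for j, (px, py) in enumerate(positions)
                     if j != i and (px - x) ** 2 + (py - y) ** 2 < radius ** 2]
        vx, vy = velocities[i]
        if neighbors:
            # Steer toward the local center of the neighbors the agent can "see".
            cx = sum(p[0] for p in neighbors) / len(neighbors)
            cy = sum(p[1] for p in neighbors) / len(neighbors)
            vx += cohesion * (cx - x)
            vy += cohesion * (cy - y)
            # Steer away from any neighbor that is uncomfortably close.
            for px, py in neighbors:
                if (px - x) ** 2 + (py - y) ** 2 < 1.0:
                    vx += separation * (x - px)
                    vy += separation * (y - py)
        new_vels.append((vx, vy))
    new_pos = [(x + vx, y + vy) for (x, y), (vx, vy) in zip(positions, new_vels)]
    return new_pos, new_vels
```

Run `step` repeatedly and initially scattered agents drift into a cohesive group, the way a school of fish closes ranks -- all from purely local rules.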

ADAMS: There are so many different applications. In our oceans, you can imagine using swarms to monitor what's happening in the ocean climate, for mapping the ocean…you could also use smaller ones that are with the firefighters where they could put them up and get some idea...you might want to use them to monitor the salmon runs... looking at how do you do pollination in the future… You could think of being over off the coast of Africa and wanting to be monitoring for pirates things of that nature…

ROBERTSON: Okay, so you get the idea. Lots of applications. But how do we get there? How can we create the robots of our imagination?

ADAMS: Collaborations in general for robotics are key. As a roboticist you tend to specialize in a sub area and then you really have to work with others who have capabilities in other areas. So, for example, my sub areas are the human interaction and the artificial intelligence -- more of the computer programming aspects. While I know enough about mechanical control to be dangerous I am not someone like Jonathan Hurst who has created Cassie.

ROBERTSON: If you haven’t heard of Cassie yet, go to the show notes page at engineeringoutloud.oregonstate.edu to learn more and see Cassie in action. Basically, Cassie is a bipedal robot that is designed for agility. Cassie’s designer, Jonathan Hurst, is an associate professor of mechanical engineering here.

ADAMS: Jonathan has all of the excellence and fundamental understanding of controls and mechanical systems that I don't have. It's really important to create these relationships with others so that you can develop these more complex systems that are capable of doing things in the real world. And of course within CoRIS and the robotics program at Oregon State, one of our key things is that we want systems that are going out in the real world and are doing things that help people. In order to achieve that goal you really have to bring people together with different specialties.

ROBERTSON: Julie just mentioned CoRIS, the Collaborative Robotics and Intelligent Systems Institute at Oregon State, which was formed to foster collaborations across disciplines. So, the hope is that eventually the next version of Cassie will be more than a fabulous pair of legs, but will also perceive its environment and react appropriately.

Beyond Oregon State, Julie has long-established collaborations with other universities, government agencies and companies. She has been collaborating with Mike Goodrich at Brigham Young University so long, they couldn’t remember exactly how long it’s been.

ADAMS: It was the project for the cognitive task analysis wasn’t it, Mike? For UAVs for the wilderness search and rescue?

GOODRICH: Yes, we were using unmanned aerial vehicles to try to do wilderness search and rescue and we were working with search and rescue experts, and I didn't know how to figure out what they were doing. And so Julie was willing to come out for a little bit and interview the search and rescue experts and she did a really deep analysis for how they accomplish their tasks. And then we just used the heck out of that.

ADAMS: Actually it was my student Curtis Humphrey that went out and interviewed them but we did the analysis together. And that was probably what? 10-12 years ago?

GOODRICH: That’s got to be at least 12 years.

[MUSIC: “Harps Uplifting” by Mortal Thing used with permission of the artist.]

ADAMS: Yeah.

ROBERTSON: It was a collaboration that clicked and they have continued to find new projects that they can work on together. One place that their interests converge is using biological models to inform how they design systems of human interaction with drone swarms. Their current project is funded by the Office of Naval Research.

ADAMS: We are looking at having hubs of vehicles. So think of something like a beehive or an ant nest. You’ve got these distributed hubs of vehicles; the hubs could either be stationary or they could be mobile. And you also have distributed sets of users who need to share those resources and want to be able to use them for missions. Do you want to say something about that explanation, Mike?

GOODRICH: Yeah, if I could. There's a lot of people who are looking at what Julie and I call spatial swarms. And we are familiar with those in nature like schools of fish, flocks of birds where all the animals are kind of cohesive, they're about in the same neighborhood. And there are many people who are working on how humans could influence those and there's some really great work being done but not very many people are looking at other biological models. Julie just did a great job describing these hub-based colonies. Not a lot of people are looking at those. And those are really important because now you can have these collective systems that do things when they don't have to be neighbors with each other. So think of robots that leave a hub and go over the hill and they don't communicate for a while and come back. Well, that … you can see all sorts of applications for that. And what we want to know is how does an individual human influence those hub-based colonies? And then more crazily, we want to understand how organizations of humans might be able to influence those. So we're really looking at problems that are 10, 15, 20 years into the future and trying to understand the basic concepts and organizational principles for organizations of humans influencing these hub-based colonies.

[Music: “Space Station” by Mortal Thing used with permission of the artist.]

ROBERTSON: So, an application of a hub-based system could be monitoring pirates off the coast of Africa as Julie mentioned. You could imagine having swarms of underwater gliders and/or aerial vehicles that are using a Navy ship as a hub. You would obviously want people directing those swarms, and perhaps even more than one group would be required. Maybe there are two or three Navy ships that are strategically located to help direct the swarms. And you can see why hub-based colonies (like honeybees) versus spatial swarms (like fish) could be a better biological model to base an artificial system on, since many of the applications for drone swarms would need to have at least one hub.
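[EDITOR’S NOTE: To make the contrast with spatial swarms concrete, here is a toy Python sketch of the hub-based pattern Mike describes: robots check tasks out of a hub, work out of communication range, and only report back when they return. All names and numbers are illustrative assumptions, not the project’s actual software.]

```python
import random

def forage(hub_tasks, n_robots=5, success_rate=0.7, rng=None):
    """Hub-based colony sketch: robots leave the hub with a task, work out of
    contact ('over the hill'), and only report results when they return.
    Contrast with a spatial swarm, where agents continuously sense neighbors."""
    rng = rng or random.Random(0)  # fixed seed so the sketch is repeatable
    results = []
    for _robot in range(n_robots):
        if not hub_tasks:
            break
        task = hub_tasks.pop(0)        # check a task out from the hub
        # ... robot is now out of communication range; no coordination happens ...
        done = rng.random() < success_rate
        if done:
            results.append(task)       # reported only on return to the hub
        else:
            hub_tasks.append(task)     # unfinished task goes back in the queue
    return results, hub_tasks
```

Because nothing is shared while the robots are away, a human (or several ships’ worth of humans) influences the colony only through the hub: what goes into the task queue and what is done with the results.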

To create systems based on biological models, Julie spends a lot of time studying biological systems to figure out what features she can use.

ADAMS: We take insights from this literature that we read and we derive new ideas for algorithms and a lot of it is like creating a recipe from scratch. Of, I think this will work, and that will work, let's put these approaches together and test them. Typically we test in simulation. So, graphical simulation or some other type of simulated system before we’ll actually put it on real robots. We want to tweak that algorithm or recipe, if you will, to make sure that it does the things that we expect it to do in simulation and then we'll go out and put it on the real robots to see what happens.

ROBERTSON: In addition to researching literature on biological systems, Julie has partnered with Ramesh Sagili, assistant professor of apiculture in the College of Agriculture at Oregon State, to study the behavior of honeybee colonies.

[SOUND EFFECT: Bees in lavender, used with permission of Creative Commons Public Domain]

[MUSIC: “Buzz Box,” Podington Bear, used with permission of a Creative Commons Attribution-NonCommercial License]

In this project, headed by Smart Information Flow Technologies and funded by DARPA (the Defense Advanced Research Projects Agency), they are looking at bee behaviors that could have applications for drone swarms. For example, when a bee is sick or has been contaminated by pesticides, it will sometimes leave the hive to protect the colony. Julie sees an application of this behavior for drone swarms.

ADAMS: We've all seen the reports of systems getting hacked, right? And even today we're starting to see hacks that occur on our phones, viruses that get on our phones. Well, those are the same types of computers that are on drones. So if you have a swarm of drones, how can you potentially protect that swarm from an infection? Let's say there's a similar vehicle that flies into this swarm that has this virus, and through those local communications it starts spreading the virus through the swarm. Well, that now means that your swarm can be taken over, or perhaps doesn't do what you expect it to do. By looking at this self-removal behavior from the honeybees -- can we have the swarm members, the local members, identify that this drone is infected and make it leave, or shut it down, or something of that nature to keep the virus from spreading?
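[EDITOR’S NOTE: A hedged sketch of how that honeybee-inspired self-removal might look in software: neighbors that observe anomalous behavior file reports, and a drone that accumulates enough reports is quarantined from the swarm. The voting scheme, threshold, and names below are invented for illustration and are not from the researchers’ systems.]

```python
def quarantine(swarm, votes_needed=2):
    """Honeybee-style self-removal sketch: `swarm` maps each drone id to the
    set of neighbors that have reported it as behaving anomalously. A drone
    with at least `votes_needed` independent reports is removed before any
    infection can spread further through local communications."""
    healthy = {}
    removed = []
    for drone, anomaly_reports in swarm.items():
        if len(anomaly_reports) >= votes_needed:
            removed.append(drone)      # isolate the suspected-infected drone
        else:
            healthy[drone] = anomaly_reports
    return healthy, removed
```

Requiring multiple independent reports is one (assumed) way to keep a single faulty or malicious neighbor from ejecting a healthy drone on its own.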

[MUSIC: “Buzz Box,” Podington Bear, used with permission of a Creative Commons Attribution-NonCommercial License]

ROBERTSON: Creating a functioning system to control drone swarms is the ultimate goal for Julie’s research. But it can’t be done alone. As she mentioned, collaborations are key. She took a big step towards that goal this year by working with a team that received a $7.1 million contract to develop a drone swarm infrastructure for up to 250 drones. The contract is part of DARPA’s OFFSET program which is short for Offensive Swarm-Enabled Tactics.

ADAMS: The work with the offset program is really: How do we create these systems that we can actually use outside and increase the numbers and solve some of the challenging interactions there -- not only from hardware and software interaction perspective but also the human interaction perspective?

ROBERTSON: To tackle such a big project will take a big team. In fact, DARPA has enlisted two teams, one led by Raytheon BBN Technologies and the other by Northrop Grumman Corporation. Julie is on Raytheon’s team, which also includes the research and development firm Smart Information Flow Technologies. What DARPA is interested in developing is technologies for the U.S. military.

ADAMS: One of the primary examples of military usage would be in these urban environments. So if you look at the environments in which our military has been deployed over the last few years, and where they envision the majority of future deployments to occur, it's probably going to be urban environments -- megacities.

[SOUND EFFECTS: City ambience, City ambiance Kansas city, and Honking traffic used with permission of Creative Commons Public Domain]

You have these environments that now become much more dangerous for humans. If you can take the swarm and sweep a neighborhood and understand what's happening in that neighborhood and make it safer for the humans to go in, then that is something that we want to be able to use these technologies for.

ROBERTSON: What are the main challenges for getting swarms to work?

ADAMS: There are lots of challenges right now. One is the communications amongst the swarm, and having the swarm actually be able to do things in the real world. Another challenge is just actually fielding these systems -- trying to put 50, 100, 200 vehicles into the air, or get them all working at the same time on the ground, is extremely challenging with limited personnel.

[SOUND EFFECTS: Drone take off, Machine blades wind remote control helicopter, used with permission of Creative Commons Public Domain]

It’s just not an easy thing to do. Now, in theory, if you have 50 quad-rotors, you could program them and just hit a button and they could all take off, but that is not allowable under FAA regulations.

ROBERTSON: Ah yes, FAA regulations. That’s a can of worms that we won’t go into today, but I will say that Julie, in her role as associate director for CoRIS, will be working to develop rules and regulations relating to policy and safety issues with robots and artificial intelligence. Additionally, she is Oregon State’s technical point of contact for the FAA’s Center of Excellence for Unmanned Aerial Systems, and will engage Oregon State faculty in support of efforts to integrate drones into the U.S. commercial airspace.

Although we will skip over the FAA regulations I did ask Julie to comment on the ethical concerns people have about robots and artificial intelligence.

ADAMS: The reality is that technology is coming. We can say in the United States that we're not going to do that, that's fine, but I can tell you that there are other countries in the world that are going to do it anyway. My attitude has always been: while I don't want the systems to be able to control us in a hundred years, we need to understand the technology or others are going to impact us negatively. One of the biggest concerns is making sure we understand the technology, and that we put proper safeguards in place to ensure that we as humans can maintain control of that technology. And I think that's the best that we can do as a society, because we also want to maintain our lead in the world on this technology. The other aspect that often comes up is people's concerns about jobs, and to me this robot revolution, if you will, or AI revolution, is no different in many respects than things like the Agricultural Revolution. And what we've seen if you look back in history is that: yes, people will lose their jobs, people will need to retrain, but typically as a society we come out better on the other side.

ROBERTSON: So, what do you feel is the coolest thing about drone swarms?

ADAMS: That's a hard question. I think the coolest thing will be in the future, when we are able to have drone swarms that are what I will call a hybrid, meaning that they have the capabilities of these very simple biological systems like schools of fish and colonies of bees and ants. But they also have the capability to do something more advanced that is really going to be useful to society -- be it pollinating our crops, or monitoring our weather, or doing a survey after a massive hurricane to understand where there are potentially people in need of help. I think having these technologies is going to make life better, but also safer, for humans.

[MUSIC: “Harps Uplifting” by Mortal Thing used with permission of the artist.]

ROBERTSON: That’s it, my friends. I hope you enjoyed learning more about the future of drone swarms. Stay tuned for more episodes this season on research partnerships. AND, if you’d like to hear more on the topic of human-robot interaction, check out a really interesting podcast from Oregon State Science Pub featuring social roboticist Heather Knight. Just search your podcast app for Oregon State Science Pub, or go to engineeringoutloud.oregonstate.edu where we have links for bonus material. This episode was produced and hosted by me, Rachel Robertson. Audio magic was performed by Brian Blythe, who also wrote some of the music for the show. Thanks, Brian! Our intro music is “The Ether Bunny” by Eyes Closed Audio on SoundCloud, used with permission of a Creative Commons Attribution License. Other music and effects in this episode were also used with appropriate licenses. You can find the links on our website. For more episodes, visit engineeringoutloud.oregonstate.edu, or subscribe (and never miss a single episode) by searching “Engineering Out Loud” on your favorite podcast app.

ADAMS: Thanks, Mike. Bye, Mike.

ROBERTSON: Bye, bye.

GOODRICH: Bye.

ROBERTSON: Alright, well, that was fun.

Featured Researchers