
Robotics Open House 2020 (Remote)

Thank you to our sponsor!

Thank you to the middle and high school students, families, and teachers who attended USC's Remote Robotics Open House 2020!

Monday May 18: 3:30 - 4:30 pm
Space Robots

PhD Student: Isabel Rayas
Prof. Sukhatme’s Robotics Embedded Systems Lab  
Title: Mars 2020 Mobility Testing
Description: This is a talk about what goes into testing a Mars rover before its launch. I'll discuss my work with a team at NASA Jet Propulsion Laboratory, where our goal is to make sure the rover can use its science instruments correctly while it drives along the surface of Mars.


PhD Student: Lilly Clark
Bhaskar Krishnamachari’s Autonomous Networks Research Group and Konstantinos Psounis’ Networked Systems Performance and Design Lab  
Title: Moon Exploration with a Network of Driving Robots
Description: I will present an approach to making maps of underground tunnels on the moon with four or more small, driving robots. The main challenge is localization, or the ability of each robot to autonomously determine its own location. The key idea of my research is to equip each robot with an ultra-wideband radio and to use bounced signals to estimate its location.
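The core idea behind radio-based localization can be sketched with simple trilateration: if a robot can measure its distance to a few beacons at known positions, it can solve for its own position. This toy example is illustrative only (not the lab's actual method, and the beacon positions and numbers are made up); it linearizes the circle equations and solves them by least squares, which also handles noisy range measurements.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Estimate (x, y) from distances to 3+ anchors at known positions."""
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    x0, y0 = anchors[0]
    d0 = dists[0]
    # Subtracting the first circle equation from the others cancels the
    # quadratic terms, leaving a linear system A @ [x, y] = b.
    A = 2 * (anchors[1:] - anchors[0])
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - x0**2 - y0**2)
    # Least squares gives a best-fit position even with measurement noise.
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical example: a robot at (2, 3) ranges to three beacons.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([2.0, 3.0])
ranges = [np.linalg.norm(true_pos - np.array(bc)) for bc in beacons]
print(trilaterate(beacons, ranges))  # recovers approximately [2, 3]
```

Real ultra-wideband systems estimate those distances from radio signal flight times, and the robots must also cope with signals that bounce off tunnel walls.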


Tuesday May 19: 3:30 - 4:30 pm
Virtual Environments and Marine Environments

PhD Student: Tom Groechel
Prof. Matarić’s Interaction Lab  
Title: Live Mixed Reality Demo and how it Applies to Socially Assistive Robotics
Description: I'll be doing a live demo of the HoloLens 2 and showing how we design mixed reality applications for an educational robot tutor.


PhD Student: Chris Denniston
Prof. Sukhatme’s Robotics Embedded Systems Lab  
Title: Robots from the Black Lagoon! How Robots can Help Understand What's Happening in Lakes and Oceans.
Description: I will discuss how robots are transforming our understanding of, and our ability to do science in, lakes and oceans. Robots are a fast-growing tool in many scientists' toolkits for studying processes in marine environments. They are so important that over 3,000 oceanographic papers rely exclusively on data gathered by robots. The talk will begin with an overview of the world of underwater robots and then turn to the research questions our team is currently working on.


Wednesday May 20: 3:30 - 4:30 pm
Physical Assistance and Physical Therapy

PhD Student: Nathan Dennler
Prof. Matarić’s Interaction Lab and Prof. Nikolaidis’ lab   
Title: Robot-Assisted Hair Combing
Description: When people acquire a disability, they can't do some things as easily as they used to. Someone might need more assistance getting ready in the morning, for example, but they may not be able to (or may not want to) have someone always watching over them. We are working on a robot that brushes people's hair, so that people who can no longer brush their hair easily can still get ready in the morning and have some choice in how they look. By learning more about how robots can physically interact with people, we can see how robots might want to behave in situations where they are closely interacting with people, and how they should behave to be both helpful and safe!


PhD Student: Lauren Klein
Prof. Matarić’s Interaction Lab
Title: Encouraging Babies to Kick with Socially Assistive Robots
Description: Practicing motor skills at an early age is important for infants' development. Lauren Klein will talk about a robotic companion that encourages infants to kick more often by rewarding their kicking behavior.


Thursday May 21: 3:30 - 4:30 pm
Emotion Detection

PhD Student: Zhonghao Shi
Prof. Matarić’s Interaction Lab
Title: Emotional Intelligence for Robots
Description: In this presentation, we will discuss how robots model and perceive emotions and social behaviors using information from different modalities such as video, audio and text.


PhD Student: Leena Mathur
Prof. Matarić’s Interaction Lab
Title: Emotion AI: Developing Robots and Machines with Emotional Intelligence
Description: How can robots and machines automatically recognize people's emotions when interacting with them? This talk will discuss how machines can use a person's facial expressions, body movements, and voice quality to detect their emotions. There will also be a demo video from my research on automatic emotion recognition.


Friday May 22
Robot Simulators

PhD Student: David Millard
Prof. Sukhatme’s Robotics Embedded Systems Lab
Title: Robot Simulators: Virtual Worlds Where Robots Safely Practice Their Skills
Description: Robots are being used to perform new tasks every day. Before a robot can safely be trusted to carry out a new skill, engineers test it in a virtual world called a physics simulator. I'll talk about how physics simulators are used in robotics engineering, and also about new techniques that use simulators to help robots teach themselves about the real world.
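At its heart, a physics simulator just advances a virtual world forward one tiny time step at a time. This toy sketch (illustrative only, far simpler than any real robot simulator) steps a bouncing ball using semi-implicit Euler integration; the gravity, time step, and restitution values are made-up constants.

```python
G = -9.81          # gravity, m/s^2
DT = 0.001         # simulation time step, seconds
RESTITUTION = 0.8  # fraction of speed kept after each bounce

def simulate(height, seconds):
    """Return the ball's height (m) after `seconds`, starting at rest."""
    y, vy = height, 0.0
    for _ in range(int(seconds / DT)):
        vy += G * DT      # gravity changes velocity
        y += vy * DT      # velocity changes position
        if y < 0.0:       # the ball hit the ground: bounce, losing energy
            y = 0.0
            vy = -vy * RESTITUTION
    return y

print(simulate(1.0, 2.0))  # height after 2 s of bouncing from 1 m
```

Real simulators do the same kind of stepping for whole robots, with many joints, motors, and contact forces, which is why they let engineers try out risky behaviors without breaking hardware.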


PhD Student: Matt Rueben
Prof. Sukhatme’s Robotics Embedded Systems Lab
Title: Robot Olympics: Learning how a new robot works by playing a game with it
Description: When you first take a new robot out of the box, what do you do to figure out how it works? What kinds of sensors does it have? What can it do? We made a game to help people learn about a new robot – a mix of AI, psychology, and fun with a real robot!
