This project is funded by the US Office of Naval Research Global (ONRG).
In this project, we explore how a team of diverse robots can collaboratively monitor complex environments, such as bustling seaports or major city events. Equipped with varied sensors like cameras and microphones, each robot gathers data from its unique perspective. While progress has been made on robots reaching consensus over low-level features, integrating high-level reasoning with heterogeneous sensing remains a challenge. Our goal is to bridge this gap using advanced control and recognition techniques. With our algorithms, robots will not only identify key elements and events but also optimize data acquisition and inter-robot communication, ensuring a consistent scene understanding across the team while relying only on local sensing and minimal robot-to-robot communication.
Learning scalable and efficient communication policies for multi-robot collision avoidance
In Autonomous Robots 47, 1275-1297,
2023.
Active Classification of Moving Targets with Learned Control Policies
In IEEE Robotics and Automation Letters (RA-L),
2023.
With Whom to Communicate: Learning Efficient Communication for Multi-Robot Collision Avoidance
In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS),
2020.