Session: Autonomous AI Navigation Systems
Autonomous navigation is more than just self-driving cars—it’s poised to transform industries and everyday life. From robots operating on industrial floors and in restaurants to AI systems helping the visually impaired navigate their daily environments, this field has vast potential to reshape how we interact with the world. In this talk, we’ll explore the complexities of autonomous navigation systems, their wide-ranging applications, and key challenges such as real-time, on-device decision-making, environmental variability, and safety. We’ll also take a closer look at the AI powering these systems, with a motivating case study of the AI Guide Dog, a blind-assistance system that brings AI navigation to life.
Bio
Aishwarya is a Machine Learning Engineer at Waymo and a former AI Engineer at Tesla Autopilot. She has worked in the autonomous vehicles industry at the intersection of Computer Vision and Large-Scale ML Engineering, helping advance the future of Autonomous Driving and Robotics systems. Previously a Machine Learning Engineer at Google Ads and a Technical Lead at Morgan Stanley, Aishwarya has over 4 years of experience in Large-Scale Deep Learning and Systems. She holds a Master’s degree from the School of Computer Science at Carnegie Mellon University, where she conducted research on Navigation Assistance Systems for the Blind using Multimodal Learning and Deep Vision approaches. She also serves as a Technical Mentor for various research initiatives at CMU.