How can a robot learn to use its different body parts for interaction?
How can we go beyond proprioception for robust mobile manipulation?
What abstractions are necessary to describe multiple tasks?

In search of answers to these questions, I am currently a PhD student at ETH Zurich, advised by Marco Hutter, and a Research Scientist at NVIDIA Research.

Over the course of my career, I have had the opportunity to work with some amazing robotics groups on many different robotic platforms. I have been a visiting student researcher at the Vector Institute, a research intern at NNAISENSE, and a part-time research engineer at ETH Zurich. During my undergrad at IIT Kanpur, I was a visiting student at the University of Freiburg, Germany, working closely with Abhinav Valada and Wolfram Burgard. I also founded the AUV-IITK team, where I worked on various hardware and software aspects of building an autonomous underwater vehicle.

If you have questions or would like to discuss ideas, feel free to reach out through email!

news

Feb 1, 2024 Four papers (task symmetry in RL, pedipulation, semantic navigation and surgical benchmark) accepted to ICRA 2024
Apr 20, 2023 Our paper on ‘Orbit: A Unified Simulation Framework for Interactive Robot Learning Environments’ is accepted to IEEE RA-L and will be presented at IROS 2023
Jul 1, 2022 Our papers on articulated object and in-hand manipulation are accepted to IROS 2022
Jan 31, 2022 Our paper on ‘A Collision-Free MPC for Whole-Body Dynamic Locomotion and Manipulation’ is accepted to ICRA 2022
Oct 7, 2021 Joined Marco Hutter’s group at ETH Zurich as a PhD student

publications

  1. Symmetry Considerations for Learning Task Symmetric Robot Policies. Mayank Mittal, Nikita Rudin, Victor Klemm, Arthur Allshire, and Marco Hutter. ICRA 2024. [Abs] [arXiv]
  2. Pedipulate: Enabling Manipulation Skills using a Quadruped Robot’s Leg. Philip Arm, Mayank Mittal, Hendrik Kolvenbach, and Marco Hutter. ICRA 2024. [Abs] [arXiv] [Video] [Website]
  3. ViPlanner: Visual Semantic Imperative Learning for Local Navigation. Pascal Roth, Julian Nubert, Fan Yang, Mayank Mittal, and Marco Hutter. ICRA 2024. [Abs] [arXiv] [Video] [Code]
  4. ORBIT: A Unified Simulation Framework for Interactive Robot Learning Environments. Mayank Mittal, Calvin Yu, Qinxi Yu, Jingzhou Liu, Nikita Rudin, David Hoeller, and others. IEEE RA-L 2023. [Abs] [arXiv] [Website] [Code]
  5. A Collision-Free MPC for Whole-Body Dynamic Locomotion and Manipulation. Jia-Ruei Chiu, Jean-Pierre Sleiman, Mayank Mittal, Farbod Farshidian, and Marco Hutter. ICRA 2022. [Abs] [arXiv] [Video]
  6. Transferring Dexterous Manipulation from GPU Simulation to a Remote Real-World TriFinger. Arthur Allshire, Mayank Mittal, Varun Lodaya, Viktor Makoviychuk, Denys Makoviichuk, Felix Widmaier, Manuel Wüthrich, Stefan Bauer, Ankur Handa, and Animesh Garg. IROS 2022. [Abs] [arXiv] [Website] [Code]
  7. Articulated Object Interaction in Unknown Scenes with Whole-Body Mobile Manipulation. Mayank Mittal, David Hoeller, Farbod Farshidian, Marco Hutter, and Animesh Garg. IROS 2022. [Abs] [arXiv] [Website]
  8. Neural Lyapunov Model Predictive Control. Mayank Mittal, Marco Gallieri, Alessio Quaglino, Seyed Sina Mirrazavi Salehian, and Jan Koutnik. (Under Review) [Abs] [arXiv]
  9. Learning Camera Miscalibration Detection. Andrei Cramariuc, Aleksandar Petrov, Rohit Suri, Mayank Mittal, Roland Siegwart, and Cesar Cadena. ICRA 2020. [Abs] [arXiv] [Code]
  10. Vision-based Autonomous UAV Navigation and Landing for Urban Search and Rescue. Mayank Mittal, Rohit Mohan, Wolfram Burgard, and Abhinav Valada. ISRR 2019. [Abs] [arXiv] [Website]
  11. Vision-based Autonomous Landing in Catastrophe-Struck Environments. Mayank Mittal, Abhinav Valada, and Wolfram Burgard. Workshop on Vision-based Drones: What’s Next?, IROS 2018. [Abs] [arXiv] [Video] [PDF]