Chahat Deep Singh

  • Assistant Professor
  • Robotics and System Design, Computer Vision, Artificial Intelligence, Sensing
Address

Office Location: ECES 152
Lab Location: ECES 1B14 and DLC 2B40

Research Interests

Autonomous Drones, Computer Vision, Neuromorphic Perception, and Computational Imaging

We work at the intersection of perception, robotics, AI, and computational imaging to push the boundaries of robot autonomy. Our Minimal Perception framework holds the key to developing small yet intelligent and efficient robot navigation systems. We focus on active and bio-inspired perception algorithms, validating our theories by developing and deploying them on robots. At PRAISe, we pioneer a minimal perception framework that enables tiny robot autonomy through novel onboard sensing and computation. We strive to employ our robotics solutions to do good and to make meaningful contributions to humanity and nature.

Background

Chahat Deep Singh is an Assistant Professor in Robotics in the Paul M. Rady Department of Mechanical Engineering. He graduated with doctoral and master's degrees in Computer Science and Robotics, respectively, from the University of Maryland. Singh's research focuses on developing minimal perception architectures to enable onboard autonomy on robots as small as a credit card. His postdoctoral research was funded by the Army Research Laboratory and the Maryland Robotics Center (2023-2024). He was also awarded the Ann G. Wylie Fellowship for outstanding dissertation (2022-2023), the Future Faculty Fellowship (2022-2023), and the University of Maryland's Dean's Fellowship (2018-2020). His work has been featured on the cover of Science Robotics and covered by the BBC, IEEE Spectrum, Voice of America, NVIDIA, Futurism, Maryland Today, TechCrunch, and many other outlets. Singh has also served as a reviewer for T-PAMI, T-CI, RA-L, T-ASE, CVPR, ICRA, IROS, ICCV, and RSS, among other top journals and conferences.

Selected Publications

  • Singh, C.D.*, Sanket, N.J.*, Ganguly, K., Fermüller, C. and Aloimonos, Y., 2018. Gapflyt: Active vision based minimalist structure-less gap detection for quadrotor flight. IEEE Robotics and Automation Letters, 3(4), pp.2799-2806. (*Equal Contribution)
  • Singh, C.D.*, Sanket, N.J.*, Fermüller, C. and Aloimonos, Y., 2023. Ajna: Generalized deep uncertainty for minimal perception on parsimonious robots. Science Robotics, 8(81), p.eadd5139. (*Equal Contribution)
  • Shah, S.*, Rajyaguru, N.*, Singh, C.D., Metzler, C. and Aloimonos, Y., 2024. CodedVO: Coded Visual Odometry. IEEE Robotics and Automation Letters. (*Equal Contribution)
  • Singh, C.D., He, B. and Aloimonos, Y., 2024. Minimal perception: enabling autonomy in resource-constrained robots. Frontiers in Robotics and AI, 11, p.1431826.
  • Sanket, N.J., Parameshwara, C.M., Singh, C.D., Kuruttukulam, A.V., Fermüller, C., Scaramuzza, D. and Aloimonos, Y., 2020, May. Evdodgenet: Deep dynamic obstacle dodging with event cameras. In 2020 IEEE International Conference on Robotics and Automation (ICRA) (pp. 10651-10657). IEEE.