Integrating Multimodal Perception into Ground Mobile Robots

Abstract

Multimodal perception systems enhance the robustness and adaptability of autonomous mobile robots by integrating heterogeneous sensor modalities, improving long-term localisation and mapping in dynamic environments, as well as human-robot interaction. Current mobile platforms often focus on specific sensor configurations and prioritise cost-effectiveness, which can limit the user's flexibility to extend the original robots further. This paper presents a methodology to integrate multimodal perception into a ground mobile platform, incorporating wheel odometry, 2D laser scanners, 3D Light Detection and Ranging (LiDAR), and RGBD cameras. The methodology covers the electronics design to power the devices, the firmware, computation, and networking architecture, and the mechanical mounting of the sensory system based on 3D printing, laser cutting, and sheet-metal bending processes. Experiments demonstrate the use of the revised platform in 2D and 3D localisation and mapping and in pallet pocket estimation applications. All the documentation and designs are accessible in a public repository.

Keywords: Light Detection and Ranging (LiDAR), mobile robot, multimodal perception, open-source, RGBD camera.

This repository contains all the documentation associated with the modifications made by INESC TEC on the Hangfa Discovery Q2 mobile platform to make it compatible with multimodal perception. These modifications also enable the platform to be integrated into the Robot Operating System (ROS), facilitating its use in research topics such as perception, localisation and mapping, multi-robot coordination (when more than one platform is available to the user), and Artificial Intelligence (AI) applied to autonomous mobile robotics, among others.
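As a quick illustration of what the ROS integration enables, the minimal sketch below subscribes to the platform's wheel odometry and 2D laser scan data. It assumes a ROS 2 installation and the conventional topic names /odom and /scan; the actual ROS version, topic names, and message types used by the platform are the ones documented in this repository.

# Minimal ROS 2 sketch: listen to the platform's odometry and 2D laser data.
# Topic names (/odom, /scan) are assumptions based on common ROS conventions,
# not confirmed names from this repository.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from sensor_msgs.msg import LaserScan


class PerceptionListener(Node):
    def __init__(self):
        super().__init__('perception_listener')
        # Wheel odometry published by the platform's base driver (assumed name).
        self.create_subscription(Odometry, '/odom', self.on_odom, 10)
        # 2D laser scanner data (assumed name).
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_odom(self, msg: Odometry):
        p = msg.pose.pose.position
        self.get_logger().info(f'odom: x={p.x:.2f} y={p.y:.2f}')

    def on_scan(self, msg: LaserScan):
        self.get_logger().info(f'scan: {len(msg.ranges)} ranges')


def main():
    rclpy.init()
    node = PerceptionListener()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()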

Furthermore, the main goal of this website and the respective GitHub repository is to help researchers interested in modifying their Hangfa mobile platforms or in applying the modifications we have made to other similarly small mobile platforms. As a result, the website includes the following information:

  • Platform: brief presentation of the Hangfa Discovery Q2 mobile robot platform
  • Bill Of Materials (BOM): summary of the components used for the modifications to the platform
  • Electronics: presentation of the electronics redesign (battery management and power budgets, motor drivers, encoder reading, and external DC power output for the user); a hedged power-budget sketch follows this list
  • Single Board Computer (SBC): computing units considered in this work and their configuration in terms of Operating System (OS), Robot Operating System (ROS) setup, remote access, development environment, and firmware communication
  • Network: TBC
  • Sensors: TBC
  • ... To Be Completed (TBC)
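Since the Electronics page covers power budgets, the following minimal sketch illustrates the kind of arithmetic involved: summing worst-case power draws of the onboard devices to estimate an ideal battery runtime. All device names, voltages, and currents below are hypothetical placeholders, not measurements from this platform.

# Hypothetical power-budget sketch: device names and electrical values are
# illustrative placeholders, not values measured on the Hangfa platform.
DEVICES = {                         # (voltage [V], worst-case current [A])
    "single-board computer": (19.0, 4.7),
    "3D LiDAR":              (12.0, 1.0),
    "2D laser scanner":      (24.0, 0.4),
    "RGBD camera":           (5.0, 0.7),
}

total_power_w = sum(v * i for v, i in DEVICES.values())

battery_voltage_v = 24.0    # nominal battery voltage [V]
battery_capacity_ah = 10.0  # battery capacity [Ah]

# Ideal runtime estimate (ignores converter efficiency and motor load).
runtime_h = (battery_voltage_v * battery_capacity_ah) / total_power_w
print(f"total sensing/computation power: {total_power_w:.1f} W")
print(f"ideal runtime: {runtime_h:.1f} h")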

Lastly, this work falls within the scope of the Mobile Robotics Development Team (MRDT) in the national project GreenAuto: Green innovation for the Automotive Industry. The MRDT is a Research & Development (R&D) team from CRIIS - Centre for Robotics in Industry and Intelligent Systems at iiLab - Industry and Innovation Laboratory.

Videos

History

The Hangfa Discovery Q2 platforms were first used at CRIIS - Centre for Robotics in Industry and Intelligent Systems from INESC TEC - Institute for Systems and Computer Engineering, Technology and Science by a student in 2016. This student, Fernando Jorge Marques de Sá, was enrolled in the Master of Science (MSc) course in Electrical and Computers Engineering (ECE) at the Faculty of Engineering, University of Porto (FEUP). The modifications made to the Hangfa platform to make it compatible with perception and Simultaneous Localization and Mapping (SLAM) on mobile ground robots were not developed during Fernando's MSc thesis. Nevertheless, those modifications were possible given the knowledge gained about the platform's limitations and usage.

Citation

Fernando Jorge Marques de Sá (2016). Sistema de navegação para plataforma móvel omnidirecional [Navigation system for an omnidirectional mobile platform] [Master's thesis, Faculty of Engineering, University of Porto (FEUP)]. Open Repository of the University of Porto. URI: https://hdl.handle.net/10216/85263

Document

Contacts

If you have any questions or want to know more about this work, please contact one of the following contributors:

Institutions

  • INESC TEC
  • FEUP

Acknowledgements

Funding

GreenAuto: Green innovation for the Automotive Industry

Citation

Plain Text

R.B. Sousa, H.M. Sobreira, J.G. Martins, P.G. Costa, M.F. Silva and A.P. Moreira, "Integrating Multimodal Perception into Ground Mobile Robots," 2025 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC2025), Madeira, Portugal, 2025, pp. TBD, doi: TBD [Manuscript accepted for publication]. [github] [preprint]

BibTeX

@INPROCEEDINGS{sousa2025icarsc,
  author    = {Ricardo B. Sousa and Héber Miguel Sobreira and João G. Martins and Paulo G. Costa and Manuel F. Silva and António P. Moreira},
  booktitle = {2025 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC)},
  title     = {Integrating Multimodal Perception into Ground Mobile Robots},
  year      = {2025},
  volume    = {},
  number    = {},
  pages     = {--},
  doi       = {},
  note      = {Manuscript accepted for publication},
}