Tightly coupled visual-inertial-LiDAR SLAM for real-time applications
Since Amazon Robotics expanded the use of drones to package delivery, drone applications have spread to many industries thanks to their ability to perform various tasks autonomously. The fundamental technology behind drone autonomy is the ability to perceive the surroundings, build a map from onboard sensors, and estimate the vehicle's location within that map. This technology, known as Simultaneous Localization and Mapping (SLAM), has seen growing adoption, especially in the mining and construction industries, where it enables more efficient site surveying and mapping; consequently, much research has been devoted to improving robotic SLAM. Although various sensor suites have been studied to improve SLAM performance, this project focuses on the novel contribution of developing a robust and accurate 3D SLAM system by jointly optimizing stereo camera, IMU, and LiDAR measurements. The project will not only advance the field of autonomous navigation but will also help ARA Robotique remain competitive in the UAV market.
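To make the notion of "jointly optimizing" concrete, a common tightly coupled formulation (a sketch of the standard approach, not necessarily the exact objective this project will adopt) casts SLAM as a nonlinear least-squares problem over the set of robot states $\mathcal{X}$, where residuals from all three sensors share the same state variables and are minimized together:

```latex
\hat{\mathcal{X}} = \arg\min_{\mathcal{X}} \;
\underbrace{\lVert r_{0} \rVert^{2}_{\Sigma_{0}}}_{\text{prior}}
+ \sum_{k} \underbrace{\lVert r_{\mathcal{I}}(x_{k}, x_{k+1}) \rVert^{2}_{\Sigma_{\mathcal{I}}}}_{\text{IMU preintegration}}
+ \sum_{(i,j)} \underbrace{\lVert r_{\mathcal{C}}(x_{i}, \ell_{j}) \rVert^{2}_{\Sigma_{\mathcal{C}}}}_{\text{stereo reprojection}}
+ \sum_{m} \underbrace{\lVert r_{\mathcal{L}}(x_{m}) \rVert^{2}_{\Sigma_{\mathcal{L}}}}_{\text{LiDAR scan registration}}
```

Here $x_k$ denotes the pose, velocity, and IMU-bias state at time $k$, $\ell_j$ a visual landmark, and each $\Sigma$ the covariance of the corresponding measurement; the symbols are illustrative placeholders. The "tightly coupled" property comes from solving one optimization over all residuals at once, rather than fusing per-sensor pose estimates after the fact.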