Tightly-coupled Visual-Inertial-LiDAR SLAM

Since Amazon Robotics expanded the use of drones to package delivery, drone applications have spread to many industries, driven by drones' ability to perform various tasks autonomously. The fundamental technology behind drone autonomy is the ability to perceive the surroundings, build a map from onboard sensors, and estimate the drone's location within that map. This technology, known as Simultaneous Localization and Mapping (SLAM), has seen growing adoption, particularly in the mining and construction industries for surveying and mapping sites more efficiently; consequently, much research has been devoted to improving robotic SLAM. Although various sensor suites have been studied to improve SLAM performance, this project focuses on the novel contribution of developing a robust and accurate 3D SLAM system by jointly optimizing stereo camera, IMU, and LiDAR measurements. TO BE CONT'D
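At a high level, tightly-coupled fusion of this kind is typically posed as a single joint optimization rather than fusing each sensor's estimate separately. As a rough sketch only, not the project's specific formulation, the trajectory and map states can be recovered by minimizing a sum of stereo reprojection residuals, IMU preintegration residuals, and LiDAR scan-matching residuals, each weighted by its measurement covariance:

\hat{\mathcal{X}} = \arg\min_{\mathcal{X}} \;
\sum_{i} \big\| \mathbf{r}_{\mathrm{cam},i}(\mathcal{X}) \big\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{cam}}}
+ \sum_{j} \big\| \mathbf{r}_{\mathrm{imu},j}(\mathcal{X}) \big\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{imu}}}
+ \sum_{k} \big\| \mathbf{r}_{\mathrm{lidar},k}(\mathcal{X}) \big\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{lidar}}}

Here each term uses the covariance-weighted (Mahalanobis) norm, \| \mathbf{r} \|^{2}_{\boldsymbol{\Sigma}} = \mathbf{r}^{\top} \boldsymbol{\Sigma}^{-1} \mathbf{r}. Solving all residuals in one problem, instead of combining per-sensor pose estimates afterward, is what distinguishes a tightly-coupled design from a loosely-coupled one.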

Faculty Supervisor:

James Richard Forbes; David Meger

Student:

Kyungmin Jung

Partner:

ARA Robotique

Discipline:

Engineering - mechanical

Sector:

Manufacturing

University:

Program:

Accelerate
