Multi-Camera Calibration and Stitching Under Automated Scenarios
Many vehicles use multiple cameras to provide the operator with an unobstructed view and with information about the surrounding environment. Because these cameras are placed sparsely around the vehicle, the video streams do not map easily onto a regular surface, and the distortion introduced by this irregular mapping yields an inadequate reproduction of the exterior environment. In addition, currently employed mapping methods align the views using points at infinity; as a result, background objects are rendered largely correctly, but foreground objects become distorted, especially in the transition from one camera to another. Our objective is to develop a projection and stitching system that maps images into a view space that appears geometrically correct to the operator, while remaining easy to use, low-power, and capable of real-time operation.
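The distortion of foreground objects under an infinity-based alignment can be illustrated numerically. The sketch below (a simplified model with hypothetical intrinsics and baseline, not values from this work) warps a second camera's pixels through the infinite homography H = K R K⁻¹ and measures the residual misalignment: for a distant point the views line up almost exactly, while a nearby point exhibits large parallax ghosting, which is the artifact described above.

```python
import numpy as np

# Hypothetical camera parameters for illustration only.
f = 800.0                          # focal length in pixels
K = np.array([[f, 0.0, 320.0],
              [0.0, f, 240.0],
              [0.0, 0.0, 1.0]])    # shared intrinsic matrix
b = 0.5                            # baseline between the two cameras, metres
R = np.eye(3)                      # cameras share orientation for simplicity

def project(K, X):
    """Pinhole projection of a 3-D point X (camera coordinates) to pixels."""
    x = K @ X
    return x[:2] / x[2]

# The infinite homography aligns the two views for points at infinite depth;
# with identical K and R it reduces to the identity.
H_inf = K @ R @ np.linalg.inv(K)

def warp_error(Z):
    """Pixel misalignment after H_inf warping for a point at depth Z metres."""
    X = np.array([1.0, 0.5, Z])               # point in camera-1 coordinates
    p1 = project(K, X)                        # pixel seen by camera 1
    p2 = project(K, X - np.array([b, 0, 0]))  # pixel seen by camera 2
    p2_h = H_inf @ np.append(p2, 1.0)         # warp camera-2 pixel into view 1
    return float(np.linalg.norm(p1 - p2_h[:2] / p2_h[2]))

print(warp_error(1000.0))  # distant point: ~0.4 px, nearly aligned
print(warp_error(2.0))     # nearby point: ~200 px of parallax ghosting
```

The residual here is simply the stereo disparity f·b/Z, which shows why an alignment chosen for Z → ∞ leaves errors that grow without bound as objects approach the cameras.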