Projected Texture Fusion
Proceedings of the 10th International Symposium on Image and Signal Processing and Analysis (accepted), July 2017
Active consumer-grade depth sensors have motivated recent research on volumetric depth map fusion, leading to new, efficient, video-rate integration and tracking methods. These approaches still suffer from the geometric inaccuracies of the depth maps that consumer-grade sensors produce. This paper presents a practical stereo system that combines highly accurate, robust projected texture stereo with efficient volumetric integration, making it easy to capture accurate 3D models of indoor scenes. We describe a stereo method that is optimized for random dot projection patterns and delivers complete, robust results, and we present the complementary hardware setup that produces accurate, complete depth maps. Results on a real-world scene are compared to ground truth data.
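The volumetric integration mentioned in the abstract is commonly realized as a truncated signed distance function (TSDF) update in the style of KinectFusion: each depth map is projected into a voxel grid and fused as a weighted running average. The sketch below is a minimal NumPy illustration of that general technique, not the authors' implementation; the function name, grid layout, and the camera-to-world pose convention are all assumptions.

```python
import numpy as np

def tsdf_update(tsdf, weights, depth, K, cam_pose, voxel_origin, voxel_size, trunc=0.05):
    """Fuse one depth map into a TSDF volume (KinectFusion-style running average).

    Illustrative sketch only: assumes a pinhole intrinsic matrix K and a
    camera-to-world pose matrix cam_pose.
    """
    res = tsdf.shape
    # World coordinates of every voxel centre.
    ii, jj, kk = np.meshgrid(*(np.arange(r) for r in res), indexing="ij")
    pts = np.stack([ii, jj, kk], axis=-1).reshape(-1, 3) * voxel_size + voxel_origin
    # Transform voxel centres into the camera frame.
    R, t = cam_pose[:3, :3], cam_pose[:3, 3]
    cam = (pts - t) @ R
    z = cam[:, 2]
    zs = np.where(z > 0, z, 1.0)  # avoid divide-by-zero for voxels behind the camera
    u = np.round(cam[:, 0] / zs * K[0, 0] + K[0, 2]).astype(int)
    v = np.round(cam[:, 1] / zs * K[1, 1] + K[1, 2]).astype(int)
    h, w = depth.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    valid &= d > 0
    # Truncated signed distance along the viewing ray, normalized to [-1, 1].
    sdf = np.clip(d - z, -trunc, trunc) / trunc
    keep = valid & (sdf > -1.0)  # skip voxels far behind the observed surface
    idx = np.flatnonzero(keep)
    flat_tsdf, flat_w = tsdf.reshape(-1), weights.reshape(-1)
    # Weighted running average with per-observation weight 1.
    flat_tsdf[idx] = (flat_tsdf[idx] * flat_w[idx] + sdf[idx]) / (flat_w[idx] + 1.0)
    flat_w[idx] += 1.0
```

A surface mesh can then be extracted from the fused volume (e.g. with marching cubes); the paper's contribution is to feed such an integration step with the more accurate depth maps obtained from projected texture stereo rather than raw consumer sensor depth.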