Please use this identifier to cite or link to this item: http://localhost/handle/Hannan/606199
Title: Handling Occlusion and Large Displacement Through Improved RGB-D Scene Flow Estimation
Authors: Yucheng Wang;Jian Zhang;Zicheng Liu;Qiang Wu;Philip A. Chou;Zhengyou Zhang;Yunde Jia
Subject: rotation|scene flow|large displacement motion|occlusion|red-green-blue-depth (RGB-D) data
Year: 2016
Publisher: IEEE
Abstract: The accuracy of scene flow estimation is limited by several challenges, such as occlusion and large displacement motion. When occlusion occurs, points inside the occluded regions lose their corresponding counterparts in the preceding and succeeding frames. Large displacement motion increases the complexity of motion modeling and computation. Moreover, occlusion and large displacement motion are closely related problems in scene flow estimation; for example, large displacement motion often produces large occluded regions in the scene. This paper proposes an improved dense scene flow method based on red-green-blue-depth (RGB-D) data. To handle occlusion, we model the occlusion status of each point in our problem formulation and jointly estimate the scene flow and the occluded regions. To deal with large displacement motion, we employ an over-parameterized scene flow representation that models both the rotation and translation components of the motion, since large displacement motion cannot be well approximated by translational motion alone. Furthermore, we employ a two-stage optimization procedure for this over-parameterized representation. In the first stage, we propose a new RGB-D PatchMatch method, applied mainly in the RGB-D image space, to reduce the computational complexity introduced by large displacement motion. In a quantitative evaluation on the Middlebury data set, our method outperforms other published methods. The improved performance is also confirmed on real data acquired by a Kinect sensor.
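The over-parameterized representation described in the abstract assigns each point its own rigid motion (a rotation plus a translation) rather than a single translation vector. Below is a minimal Python sketch of that idea, assuming an axis-angle rotation parameterization and NumPy; the paper's actual parameterization, energy terms, and two-stage solver are not reproduced here.

import numpy as np

def axis_angle_to_matrix(r):
    # Convert an axis-angle vector r (shape (3,)) to a 3x3 rotation
    # matrix via the Rodrigues formula.
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)
    k = r / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def apply_scene_flow(points, rotations, translations):
    # Move each 3-D point by its own rigid motion, p' = R p + t, and
    # return both the displaced points and the induced flow p' - p.
    #   points:       (N, 3) 3-D points back-projected from an RGB-D frame
    #   rotations:    (N, 3) per-point axis-angle rotation parameters
    #   translations: (N, 3) per-point translation vectors
    moved = np.empty_like(points)
    for i, (p, r, t) in enumerate(zip(points, rotations, translations)):
        moved[i] = axis_angle_to_matrix(r) @ p + t
    return moved, moved - points

# Example: one point rotating 45 degrees about the z-axis plus a small
# translation; the printed vector is its full 3-D scene flow.
pts = np.array([[1.0, 0.0, 0.5]])
rot = np.array([[0.0, 0.0, np.pi / 4]])
trn = np.array([[0.0, 0.1, 0.0]])
moved, flow = apply_scene_flow(pts, rot, trn)
print(flow)

A per-point rotation lets neighboring points share one smooth rigid motion even when their translation vectors differ, which is why a purely translational model underfits large displacements.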
URI: http://localhost/handle/Hannan/137659
http://localhost/handle/Hannan/606199
ISSN: 1051-8215
1558-2205
Volume: 26
Issue: 7
Appears in Collections: 2016

Files in This Item:
File         Size     Format
7172518.pdf  2.64 MB  Adobe PDF