Please use this identifier to cite or link to this item: http://localhost/handle/Hannan/606199
Full metadata record
DC Field / Value / Language
dc.contributor.author: Yucheng Wang (en_US)
dc.contributor.author: Jian Zhang (en_US)
dc.contributor.author: Zicheng Liu (en_US)
dc.contributor.author: Qiang Wu (en_US)
dc.contributor.author: Philip A. Chou (en_US)
dc.contributor.author: Zhengyou Zhang (en_US)
dc.contributor.author: Yunde Jia (en_US)
dc.date.accessioned: 2020-05-20T09:02:46Z
dc.date.available: 2020-05-20T09:02:46Z
dc.date.issued: 2016 (en_US)
dc.identifier.issn: 1051-8215 (en_US)
dc.identifier.issn: 1558-2205 (en_US)
dc.identifier.other: 10.1109/TCSVT.2015.2462011 (en_US)
dc.identifier.uri: http://localhost/handle/Hannan/137659 (en_US)
dc.identifier.uri: http://localhost/handle/Hannan/606199
dc.description.abstract: The accuracy of scene flow estimation is limited by several challenges, such as occlusion and large displacement motion. When occlusion occurs, points inside the occluded regions lose their corresponding counterparts in the preceding and succeeding frames. Large displacement motion increases the complexity of motion modeling and computation. Moreover, occlusion and large displacement motion are closely related problems in scene flow estimation; for example, large displacement motion often produces sizable occluded regions in the scene. This paper proposes an improved dense scene flow method based on red-green-blue-depth (RGB-D) data. To handle occlusion, we model the occlusion status of each point in our problem formulation and jointly estimate the scene flow and the occluded regions. To deal with large displacement motion, we employ an over-parameterized scene flow representation that models both the rotation and translation components of the scene flow, since large displacement motion cannot be well approximated by translational motion alone. Furthermore, we employ a two-stage optimization procedure for this over-parameterized representation. In the first stage, we propose a new RGB-D PatchMatch method, applied mainly in the RGB-D image space to reduce the computational complexity introduced by large displacement motion. In the quantitative evaluation on the Middlebury data set, our method outperforms other published methods. The improved performance is also comprehensively confirmed on real data acquired by a Kinect sensor. (en_US)
dc.publisher: IEEE (en_US)
dc.relation.haspart: 7172518.pdf (en_US)
dc.subject: rotation | scene flow | large displacement motion | occlusion | red-green-blue-depth (RGB-D) data (en_US)
dc.title: Handling Occlusion and Large Displacement Through Improved RGB-D Scene Flow Estimation (en_US)
dc.type: Article (en_US)
dc.journal.volume: 26 (en_US)
dc.journal.issue: 7 (en_US)
dc.journal.title: IEEE Transactions on Circuits and Systems for Video Technology (en_US)
Appears in Collections: 2016

Files in This Item:
File / Size / Format
7172518.pdf / 2.64 MB / Adobe PDF
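
As an aid to the abstract above, here is a minimal, illustrative sketch of the over-parameterized scene flow representation it describes: each 3-D point is assigned a local rigid motion (a rotation plus a translation) instead of a pure translation, and the per-point scene flow is the displacement that rigid motion induces. This is not the authors' implementation; the function names, the axis-angle rotation parameterization, and the toy data below are assumptions made for clarity only.

# Hypothetical sketch of an over-parameterized scene flow representation:
# each 3-D point carries its own rigid motion (rotation + translation), and
# the scene flow is the displacement X' - X with X' = R X + t.
# Not the paper's code; names and parameterization are illustrative assumptions.
import numpy as np

def axis_angle_to_matrix(w):
    """Convert an axis-angle vector w (shape (3,)) to a 3x3 rotation matrix
    using the Rodrigues formula."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def rigid_scene_flow(points, rotations_aa, translations):
    """Per-point scene flow induced by per-point rigid motions.

    points:        (N, 3) 3-D points from the first RGB-D frame
    rotations_aa:  (N, 3) per-point rotations in axis-angle form
    translations:  (N, 3) per-point translations
    returns:       (N, 3) scene flow vectors X' - X, where X' = R X + t
    """
    flow = np.empty_like(points)
    for i, (X, w, t) in enumerate(zip(points, rotations_aa, translations)):
        R = axis_angle_to_matrix(w)
        flow[i] = R @ X + t - X
    return flow

# Toy usage: one point rotated 90 degrees about the z-axis and shifted slightly.
pts = np.array([[1.0, 0.0, 0.5]])
rot = np.array([[0.0, 0.0, np.pi / 2.0]])
trn = np.array([[0.0, 0.1, 0.0]])
print(rigid_scene_flow(pts, rot, trn))  # approximately [[-1.0, 1.1, 0.0]]

The point the abstract makes is that a per-point rotation-plus-translation model like this one can represent large, locally rigid displacements that a translation-only flow field approximates poorly; the paper's two-stage optimization and RGB-D PatchMatch initialization concern how such parameters are estimated efficiently and are not sketched here.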