Abstract
This paper presents a pooling-based hierarchical model to extract a dense matching set for optical flow estimation. The proposed model down-samples basic image features (gradient and colour) with min and max pooling, so that distinctive visual features are preserved from the original resolution down to the most heavily down-sampled layers. Patch descriptors are then extracted from the pooled features for coarse-to-fine patch matching. In the matching process, the locally optimal correspondence of patches is found with a four-step search and then refined by a velocity propagation algorithm. This paper also presents a method to detect matching outliers by checking the consistency of motion-based and colour-based segmentation. We evaluate the proposed method on two benchmarks, MPI Sintel and KITTI 2015, using two criteria: the matching accuracy and the accuracy of the resulting optical flow estimation. The results indicate that the proposed method is more efficient, produces more matches than existing algorithms, and significantly improves the accuracy of optical flow estimation.
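To make the pooling-based hierarchy concrete, the following is a minimal NumPy sketch of the idea described above: basic colour and gradient channels are repeatedly down-sampled with both min and max pooling so that extreme feature values survive into the coarse layers. The function names, the 2x2 pooling window, and the exact choice of feature channels are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (assumptions: 2x2 windows, RGB + gradient channels).
import numpy as np

def pool2x2(channel, mode="max"):
    """Down-sample a 2-D feature channel by a factor of 2 with min or max pooling."""
    h, w = channel.shape
    h, w = h - h % 2, w - w % 2                       # crop to an even size
    blocks = channel[:h, :w].reshape(h // 2, 2, w // 2, 2)
    return blocks.max(axis=(1, 3)) if mode == "max" else blocks.min(axis=(1, 3))

def build_pyramid(image, n_levels=4):
    """Build a feature hierarchy from colour and gradient channels.

    At each level every channel is pooled with both min and max, so distinctive
    (extremal) responses are kept all the way to the coarsest layer.
    """
    gray = image.mean(axis=2)
    gy, gx = np.gradient(gray)                        # simple gradient features
    levels = [np.dstack([image, gx, gy])]
    for _ in range(n_levels - 1):
        prev = levels[-1]
        pooled = [np.dstack([pool2x2(prev[..., c], "max"),
                             pool2x2(prev[..., c], "min")])
                  for c in range(prev.shape[2])]
        levels.append(np.dstack(pooled))
    return levels

# Example: a 4-level hierarchy from a random 128x128 RGB image.
pyramid = build_pyramid(np.random.rand(128, 128, 3), n_levels=4)
print([level.shape for level in pyramid])
```

In this sketch the channel count grows at each level because every channel yields both a min- and a max-pooled response; patch descriptors for the coarse-to-fine matching stage would then be read from these pooled layers.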
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Tang, X., Phung, S.L., Bouzerdoum, A., Tang, V.H. (2019). Pooling-Based Feature Extraction and Coarse-to-fine Patch Matching for Optical Flow Estimation. In: Jawahar, C., Li, H., Mori, G., Schindler, K. (eds.) Computer Vision – ACCV 2018. Lecture Notes in Computer Science, vol. 11364. Springer, Cham. https://doi.org/10.1007/978-3-030-20870-7_37
DOI: https://doi.org/10.1007/978-3-030-20870-7_37
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-20869-1
Online ISBN: 978-3-030-20870-7
eBook Packages: Computer Science (R0)