A Modified ICP Algorithm Based on FAST and Optical Flow for 3D Registration
Konrad Koniarski, Andrzej Myśliński
DOI: http://dx.doi.org/10.15439/2022F28
Citation: Proceedings of the 17th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 30, pages 531–534 (2022)
Abstract. This paper presents a modified Iterative Closest Point (ICP) algorithm that relies on a suitable selection of initial points and on local optical flow to speed up the registration of static scenes while preserving high accuracy. The main drawbacks of the standard ICP algorithm are the need for an appropriate initialization and the costly point matching step performed in each iteration. The proposed modification addresses these problems and optimizes the method for augmented reality applications. Since such an application processes a sequence of RGB-D images, the changes between consecutive key-frames are small. Therefore only a small subset of the source image key-points is selected, using the scale-space pyramid and FAST approaches, which significantly reduces the number of processed image points. Moreover, because point correspondences are obtained with local optical flow, the costly point matching procedure can be omitted in each ICP optimization step. The proposed approach has been validated by numerical examples.
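The sketch below illustrates the kind of pipeline the abstract describes: FAST key-points detected on the source key-frame, pyramidal Lucas-Kanade optical flow to track them into the next key-frame, back-projection to 3D using the depth channel, and a closed-form SVD step for the rigid transform. It is only a minimal illustration assuming OpenCV and NumPy; the function names, parameter values, and the single one-shot alignment step are our assumptions, not the authors' implementation, where this estimate would feed the iterative ICP refinement rather than replace it.

```python
# Minimal sketch (assumed, not the paper's code): FAST key-points + pyramidal
# Lucas-Kanade tracking + one closed-form rigid-alignment step of ICP.
import cv2
import numpy as np

def detect_fast_keypoints(gray, max_points=500):
    # FAST corners on the source key-frame, strongest responses first.
    fast = cv2.FastFeatureDetector_create(threshold=25, nonmaxSuppression=True)
    kps = sorted(fast.detect(gray, None), key=lambda k: -k.response)[:max_points]
    return np.float32([k.pt for k in kps]).reshape(-1, 1, 2)

def track_with_lk(prev_gray, next_gray, pts):
    # Pyramidal Lucas-Kanade flow supplies the correspondences, so no
    # per-iteration nearest-neighbour matching is needed.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None, winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    return pts[good].reshape(-1, 2), nxt[good].reshape(-1, 2)

def backproject(pts2d, depth, fx, fy, cx, cy):
    # Lift tracked 2D points to 3D with the depth channel of the RGB-D frame.
    pts3d = []
    for u, v in pts2d:
        z = float(depth[int(v), int(u)])
        pts3d.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
    return np.asarray(pts3d)

def rigid_transform(P, Q):
    # Closed-form SVD step of ICP: find R, t minimizing ||R @ p + t - q||.
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # correct an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp
```

Because `calcOpticalFlowPyrLK` already operates on an image pyramid, the multi-level tracking here stands in for the scale-space selection mentioned in the abstract.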
References
- Q. Y. Zhou, J. Park, and V. Koltun, “Fast global registration,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9906 LNCS. Springer Verlag, 2016. http://dx.doi.org/10.1007/978-3-319-46475-6_47. ISBN 9783319464749. ISSN 16113349 pp. 766–782.
- A. W. Fitzgibbon, “Robust registration of 2D and 3D point sets,” Image and Vision Computing, vol. 21, no. 13-14, pp. 1145–1153, 12 2003. http://dx.doi.org/10.1016/J.IMAVIS.2003.09.004
- C. Kerl, J. Sturm, and D. Cremers, “Dense visual SLAM for RGB-D cameras,” in IEEE International Conference on Intelligent Robots and Systems, 2013. http://dx.doi.org/10.1109/IROS.2013.6696650. ISBN 9781467363587. ISSN 21530858 pp. 2100–2106.
- Y. He, B. Liang, J. Yang, S. Li, and J. He, “An Iterative Closest Points Algorithm for Registration of 3D Laser Scanner Point Clouds with Geometric Features,” Sensors, vol. 17, no. 8, p. 1862, 8 2017. http://dx.doi.org/10.3390/S17081862. [Online]. Available: https://www.mdpi.com/1424-8220/17/8/1862
- E. Marchand, H. Uchiyama, and F. Spindler, “Pose Estimation for Augmented Reality: A Hands-On Survey,” IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 12, pp. 2633–2651, 12 2016. http://dx.doi.org/10.1109/TVCG.2015.2513408
- E. Rosten and T. Drummond, “Fusing points and lines for high performance tracking,” Proceedings of the IEEE International Conference on Computer Vision, vol. II, pp. 1508–1515, 2005. http://dx.doi.org/10.1109/ICCV.2005.104
- K. Koniarski, “Augmented reality using optical flow,” Proceedings of the 2015 Federated Conference on Computer Science and Information Systems, FedCSIS 2015, pp. 841–847, 10 2015. http://dx.doi.org/10.15439/2015F202. [Online]. Available: https://fedcsis.org/proceedings/2015/drp/202.html
- Mahesh and M. V. Subramanyam, “Automatic feature based image registration using SIFT algorithm,” in 2012 3rd International Conference on Computing, Communication and Networking Technologies, ICCCNT 2012, 2012. http://dx.doi.org/10.1109/ICCCNT.2012.6396024
- A. Fontes and J. E. B. Maia, “Visual Odometry for RGB-D Cameras,” 3 2022. http://dx.doi.org/10.48550/arxiv.2203.15119. [Online]. Available: https://arxiv.org/abs/2203.15119v1
- J. Sturm, N. Engelhard, F. Endres, W. Burgard, and D. Cremers, “A benchmark for the evaluation of RGB-D SLAM systems,” in IEEE International Conference on Intelligent Robots and Systems, 2012. http://dx.doi.org/10.1109/IROS.2012.6385773. ISBN 9781467317375. ISSN 21530858 pp. 573–580.
- J. Park, Q. Y. Zhou, and V. Koltun, “Colored Point Cloud Registration Revisited,” Proceedings of the IEEE International Conference on Computer Vision, vol. 2017-October, pp. 143–152, 12 2017. http://dx.doi.org/10.1109/ICCV.2017.25
- R. A. Newcombe, S. Izadi, O. Hilliges, D. Molyneaux, D. Kim, A. J. Davison, P. Kohli, J. Shotton, S. Hodges, and A. Fitzgibbon, “KinectFusion: Real-time dense surface mapping and tracking,” in 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011, 2011. http://dx.doi.org/10.1109/ISMAR.2011.6092378. ISBN 9781457721830 pp. 127–136. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.221.100
- K. Koniarski and A. Myśliński, “Feature Point Cloud Based Registration in Augmented Reality,” Lecture Notes in Networks and Systems, vol. 364 LNNS, pp. 418–427, 12 2021. http://dx.doi.org/10.1007/978-3-030-92604-5_37. [Online]. Available: https://link.springer.com/chapter/10.1007/978-3-030-92604-5_37
- S. Akpinar and F. N. Alpaslan, “Optical flow-based representation for video action detection,” Emerging Trends in Image Processing, Computer Vision and Pattern Recognition, pp. 331–351, 1 2015. http://dx.doi.org/10.1016/B978-0-12-802045-6.00021-1
- B. D. Lucas and T. Kanade, “An Iterative Image Registration Technique with an Application to Stereo Vision,” in Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI), vol. 2, 1981, pp. 674–679. [Online]. Available: https://www.researchgate.net/publication/215458777_An_Iterative_Image_Registration_Technique_with_an_Application_to_Stereo_Vision_IJCAI
- B. K. Horn and B. G. Schunck, “Determining optical flow,” Artificial Intelligence, vol. 17, no. 1-3, pp. 185–203, 8 1981. http://dx.doi.org/10.1016/0004-3702(81)90024-2
- A. Bruhn, J. Weickert, and C. Schnörr, “Lucas/Kanade meets Horn/Schunck: Combining local and global optic flow methods,” International Journal of Computer Vision, vol. 61, no. 3, pp. 1–21, 2 2005. http://dx.doi.org/10.1023/B:VISI.0000045324.43199.43