
A*-FastIsomap: An Improved Performance of Classical Isomap Based on A* Search Algorithm

Neural Processing Letters

Abstract

Nonlinear Dimensionality Reduction (NLDR) is a well-known manifold-learning approach for transforming data from a high-dimensional space to a low-dimensional one. After studying the various techniques proposed for NLDR, we find that their performance still needs improvement. We therefore focus on classical Isomap, whose main bottlenecks are the Shortest Path Distance (SPD) computation and its high computational cost, both of which stem from its use of the Dijkstra algorithm. This paper presents A*-FastIsomap, which addresses the SPD problem by combining the A* search algorithm with the Double Buckets algorithm. We compare A*-FastIsomap with classical Isomap on high-dimensional datasets to verify its efficiency and accuracy. The results demonstrate that the proposed A*-FastIsomap is faster and more accurate than classical Isomap and that it reduces the computation time on large, high-dimensional datasets.
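The core idea, replacing the Dijkstra-based geodesic-distance step of Isomap with a goal-directed A* search over the neighbourhood graph, can be sketched as follows. This is only an illustrative sketch, not the authors' implementation: it uses scikit-learn's kneighbors_graph to build the k-NN graph, a plain binary heap in place of the Double Buckets priority structure described in the paper, and the Euclidean distance in the ambient space as the A* heuristic (admissible because every graph path between two points is at least as long as the straight line between them).

    # Illustrative sketch only; not the authors' implementation.
    import heapq
    import numpy as np
    from sklearn.neighbors import kneighbors_graph

    def astar_geodesic(X, adjacency, source, target):
        # Approximate geodesic distance from `source` to `target` on the k-NN graph.
        # Heuristic: straight-line (Euclidean) distance to the target, which never
        # overestimates the graph distance, so A* remains admissible.
        n = X.shape[0]
        h = lambda i: np.linalg.norm(X[i] - X[target])
        g = np.full(n, np.inf)          # best known cost from the source
        g[source] = 0.0
        open_heap = [(h(source), source)]
        closed = np.zeros(n, dtype=bool)
        while open_heap:
            _, u = heapq.heappop(open_heap)
            if u == target:
                return g[u]             # goal reached: geodesic estimate found
            if closed[u]:
                continue
            closed[u] = True
            row = adjacency.getrow(u)   # sparse row: neighbours of u and edge lengths
            for v, w in zip(row.indices, row.data):
                if g[u] + w < g[v]:
                    g[v] = g[u] + w
                    heapq.heappush(open_heap, (g[v] + h(v), v))
        return np.inf                   # target not reachable in this neighbourhood graph

    # Toy usage: 500 random 10-dimensional points, 8-nearest-neighbour graph.
    X = np.random.rand(500, 10)
    A = kneighbors_graph(X, n_neighbors=8, mode='distance')
    print(astar_geodesic(X, A, source=0, target=499))

In the full Isomap pipeline, such single-pair queries would be repeated over the required point (or landmark) pairs to assemble the geodesic-distance matrix, which is then embedded with classical MDS.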



Funding

This work is supported by the Strategic Priority Research Program of the Chinese Academy of Sciences, Grant No. XDA19020102.

Author information

Corresponding authors

Correspondence to Mahwish Yousaf or Li Jing.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Rehman, T.U., Yousaf, M. & Jing, L. A*-FastIsomap: An Improved Performance of Classical Isomap Based on A* Search Algorithm. Neural Process Lett 55, 12719–12736 (2023). https://doi.org/10.1007/s11063-022-10941-3
