REFERENCES
1. Frank JT, Unke OT, Müller KR, Chmiela S. A Euclidean transformer for fast and stable machine learned force fields. Nat Commun 2024;15:6539.
2. Choung S, Park W, Moon J, Han JW. Rise of machine learning potentials in heterogeneous catalysis: developments, applications, and prospects. Chem Eng J 2024;494:152757.
3. Tang D, Ketkaew R, Luber S. Machine learning interatomic potentials for heterogeneous catalysis. Chem A Eur J 2024;30:e202401148.
4. Damewood J, Karaguesian J, Lunger JR, et al. Representations of materials for machine learning. Annu Rev Mater Res 2023;53:399-426.
5. Song Z, Chen X, Meng F, et al. Machine learning in materials design: algorithm and application*. Chinese Phys B 2020;29:116103.
6. Dieb S, Song Z, Yin W, Ishii M. Optimization of depth-graded multilayer structure for x-ray optics using machine learning. J Appl Phy 2020;128:074901.
7. Cheng G, Gong XG, Yin WJ. Crystal structure prediction by combining graph network and optimization algorithm. Nat Commun 2022;13:1492.
8. Zendehboudi S, Rezaei N, Lohi A. Applications of hybrid models in chemical, petroleum, and energy systems: a systematic review. Appl Energy 2018;228:2539-66.
9. Leukel J, Scheurer L, Sugumaran V. Machine learning models for predicting physical properties in asphalt road construction: a systematic review. Constr Build Mater 2024;440:137397.
10. Musaelian A, Batzner S, Johansson A, et al. Learning local equivariant representations for large-scale atomistic dynamics. Nat Commun 2023;14:579.
11. Batzner S, Musaelian A, Sun L, et al. E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nat Commun 2022;13:2453.
12. Thölke P, De Fabritiis G. Equivariant transformers for neural network based molecular potentials. In: International Conference on Learning Representations; 2022.
13. Wang G, Wang C, Zhang X, Li Z, Zhou J, Sun Z. Machine learning interatomic potential: bridge the gap between small-scale models and realistic device-scale simulations. iScience 2024;27:109673.
14. Noda K, Shibuta Y. Prediction of potential energy profiles of molecular dynamic simulation by graph convolutional networks. Comput Mater Sci 2023;229:112448.
15. Yu H, Zhong Y, Hong L, et al. Spin-dependent graph neural network potential for magnetic materials. Phys Rev B 2024;109:014426.
16. Vandenhaute S, Cools-Ceuppens M, Dekeyser S, Verstraelen T, Van Speybroeck V. Machine learning potentials for metal-organic frameworks using an incremental learning approach. npj Comput Mater 2023;9:1-8.
17. Song K, Zhao R, Liu J, et al. General-purpose machine-learned potential for 16 elemental metals and their alloys. Available from: http://arxiv.org/abs/2311.04732. [Last accessed on 27 Dec 2024].
18. Sun H, Zhang C, Tang L, Wang R, Xia W, Wang C. Molecular dynamics simulation of Fe-Si alloys using a neural network machine learning potential. Phys Rev B 2023;107:224301.
19. Kostiuchenko TS, Shapeev AV, Novikov IS. Interatomic interaction models for magnetic materials: recent advances. Chinese Phys Lett 2024;41:066101.
20. Fan Z, Chen W, Vierimaa V, Harju A. Efficient molecular dynamics simulations with many-body potentials on graphics processing units. Comput Phys Commun 2017;218:10-6.
21. Zhong Y, Yu H, Gong X, Xiang H. A general tensor prediction framework based on graph neural networks. J Phys Chem Lett 2023;14:6339-48.
22. Zhong Y, Yu H, Su M, Gong X, Xiang H. Transferable equivariant graph neural networks for the Hamiltonians of molecules and solids. npj Comput Mater 2023;9:182.
23. Zhong Y, Yu H, Yang J, Guo X, Xiang H, Gong X. Universal machine learning Kohn-Sham Hamiltonian for materials. Chinese Phys Lett 2024;41:077103.
24. Li H, Wang Z, Zou N, et al. Deep-learning density functional theory hamiltonian for efficient ab initio electronic-structure calculation. Nat Comput Sci 2022;2:367-77.
25. Zhong Y, Liu S, Zhang B, et al. Accelerating the calculation of electron-phonon coupling strength with machine learning. Nat Comput Sci 2024;4:615-25.
26. Zhang C, Zhong Y, Tao ZG, et al. Advancing nonadiabatic molecular dynamics simulations for solids: achieving supreme accuracy and efficiency with machine learning. Available from: https://arxiv.org/html/2408.06654v1. [Last accessed on 27 Dec 2024].
27. Xie T, Grossman JC. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys Rev Lett 2018;120:145301.
28. Choudhary K, Decost B. Atomistic line graph neural network for improved materials property predictions. npj Comput Mater 2021;7:185.
29. Choudhary K, Garrity K. Designing high-TC superconductors with BCS-inspired screening, density functional theory, and deep-learning. npj Comput Mater 2022;8:244.
30. Choudhary K, Garrity KF, Sharma V, Biacchi AJ, Walker ARH, Tavazza F. High-throughput density functional perturbation theory and machine learning predictions of infrared, piezoelectric and dielectric responses. npj Comput Mater 2020;6:64.
31. Clayson IG, Hewitt D, Hutereau M, Pope T, Slater B. High throughput methods in the synthesis, characterization, and optimization of porous materials. Adv Mater 2020;32:e2002780.
32. Wang R, Yu H, Zhong Y, Xiang H. Identifying direct bandgap silicon structures with high-throughput search and machine learning methods. J Phys Chem C 2024;128:12677-85.
33. Stergiou K, Ntakolia C, Varytis P, Koumoulos E, Karlsson P, Moustakidis S. Enhancing property prediction and process optimization in building materials through machine learning: a review. Comput Mater Sci 2023;220:112031.
34. Cybenko G. Approximation by superpositions of a sigmoidal function. Math Control Signal Syst 1989;2:303-14.
35. Hornik K, Stinchcombe M, White H. Multilayer feedforward networks are universal approximators. Neural Netw 1989;2:359-66.
36. Liu Z, Wang Y, Vaidya S, et al. KAN: Kolmogorov-Arnold Networks. Available from: http://arxiv.org/abs/2404.19756. [Last accessed on 27 Dec 2024].
37. Braun J, Griebel M. On a constructive proof of Kolmogorov's superposition theorem. Constr Approx 2009;30:653-75.
38. Arnol’d VI. On the representation of functions of several variables as a superposition of functions of a smaller number of variables. In: Givental AB, Khesin BA, Marsden JE, Varchenko AN, Vassiliev VA, Viro OY, Zakalyukin VM, editors. Collected Works. Berlin: Springer Berlin Heidelberg; 2009. pp. 25-46.
39. Li Z. Kolmogorov-Arnold Networks are radial basis function networks. Available from: http://arxiv.org/abs/2405.06721. [Last accessed on 27 Dec 2024].
40. Bozorgasl Z, Chen H. Wav-KAN: Wavelet Kolmogorov-Arnold Networks. Available from: https://arxiv.org/abs/2405.12832. [Last accessed on 27 Dec 2024].
41. Xu J, Chen Z, Li J, et al. FourierKAN-GCF: Fourier Kolmogorov-Arnold Network - an effective and efficient feature transformation for graph collaborative filtering. Available from: http://arxiv.org/abs/2406.01034. [Last accessed on 27 Dec 2024].
42. Aghaei AA. fKAN: Fractional Kolmogorov-Arnold Networks with trainable Jacobi basis functions. Available from: http://arxiv.org/abs/2406.07456. [Last accessed on 27 Dec 2024].
43. Reinhardt EAF, Dinesh PR, Gleyzer S. SineKAN: Kolmogorov-Arnold Networks using sinusoidal activation functions. Available from: http://arxiv.org/abs/2407.04149. [Last accessed on 27 Dec 2024].
44. Nagai Y, Okumura M. Kolmogorov-Arnold Networks in molecular dynamics. Available from: https://arxiv.org/abs/2407.17774. [Last accessed on 27 Dec 2024].
45. Genet R, Inzirillo H. TKAN: Temporal Kolmogorov-Arnold Networks. Available from: https://arxiv.org/abs/2405.07344. [Last accessed on 27 Dec 2024].
46. Kiamari M, Kiamari M, Krishnamachari B. GKAN: Graph Kolmogorov-Arnold Networks. Available from: http://arxiv.org/abs/2406.06470. [Last accessed on 27 Dec 2024].
47. Inzirillo H, Genet R. SigKAN: Signature-Weighted Kolmogorov-Arnold Networks for time series. Available from: http://arxiv.org/abs/2406.17890. [Last accessed on 27 Dec 2024].
48. Bresson R, Nikolentzos G, Panagopoulos G, Chatzianastasis M, Pang J, Vazirgiannis M. KAGNNs: Kolmogorov-Arnold Networks meet graph learning. Available from: http://arxiv.org/abs/2406.18380. [Last accessed on 27 Dec 2024].
49. Wang Y, Sun J, Bai J, et al. Kolmogorov-Arnold-informed neural network: a physics-informed deep learning framework for solving forward and inverse problems based on Kolmogorov-Arnold Networks. Comput Methods Appl Mech Eng 2025;433:117518.
50. Batatia I, Kovacs DP, Simm GNC, Ortner C, Csanyi G. MACE: higher order equivariant message passing neural networks for fast and accurate force fields. 2022. Available from: https://openreview.net/forum?id=YPpSngE-ZU. [Last accessed on 27 Dec 2024].
51. Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE. Neural message passing for quantum chemistry. In: Proceedings of the 34th International Conference on Machine Learning. PMLR; 2017. pp. 1263-72.
52. Blealtan/efficient-kan. Available from: https://github.com/Blealtan/efficient-kan. [Last accessed on 27 Dec 2024].
54. Perdew JP, Burke K, Ernzerhof M. Generalized gradient approximation made simple. Phys Rev Lett 1997;78:1396.
55. Thompson AP, Aktulga HM, Berger R, et al. LAMMPS-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Comput Phys Commun 2022;271:108171.
56. Wu J, Zhang Y, Zhang L, Liu S. Deep learning of accurate force field of ferroelectric HfO2. Phys Rev B 2021;103:024108.
57. Deringer VL, Csányi G. Machine learning based interatomic potential for amorphous carbon. Phys Rev B 2017;95:094203.
58. Wang J, Wang Y, Zhang H, et al. E(n)-equivariant cartesian tensor message passing interatomic potential. Nat Commun 2024;15:7607.
59. Fan Z, Wang Y, Ying P, et al. GPUMD: a package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations. J Chem Phys 2022;157:114801.
60. Mumuni A, Mumuni F. Data augmentation: a comprehensive survey of modern approaches. Array 2022;16:100258.
61. Lu Y, Shen M, Wang H, Wang X, van Rechem C, Fu T, Wei W. Machine learning for synthetic data generation: a review. Available from: https://arxiv.org/abs/2302.04062. [Last accessed on 27 Dec 2024].
62. Farahani A, Voghoei S, Rasheed K, Arabnia HR. A brief review of domain adaptation. In: Stahlbock R, Weiss GM, Abou-Nasr M, Yang C, Arabnia HR, Deligiannidis L, editors. Advances in data science and information engineering. Cham: Springer International Publishing; 2021. pp. 877-94.
63. Zhuang F, Qi Z, Duan K, et al. A comprehensive survey on transfer learning. Proc IEEE 2021;109:43-76.
64. Chen C, Ong SP. A universal graph deep learning interatomic potential for the periodic table. Nat Comput Sci 2022;2:718-28.
65. Deng B, Zhong P, Jun K, et al. CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling. Nat Mach Intell 2023;5:1031-41.
66. Arabha S, Aghbolagh ZS, Ghorbani K, Hatam-lee SM, Rajabpour A. Recent advances in lattice thermal conductivity calculation using machine-learning interatomic potentials. J Appl Phys 2021;130:210903.
67. Qian X, Yang R. Machine learning for predicting thermal transport properties of solids. Mater Sci Eng R Rep 2021;146:100642.
68. Mortazavi B, Zhuang X, Rabczuk T, Shapeev AV. Atomistic modeling of the mechanical properties: the rise of machine learning interatomic potentials. Mater Horiz 2023;10:1956-68.
69. Mortazavi B, Podryabinkin EV, Roche S, Rabczuk T, Zhuang X, Shapeev AV. Machine-learning interatomic potentials enable first-principles multiscale modeling of lattice thermal conductivity in graphene/borophene heterostructures. Mater Horiz 2020;7:2359-67.
70. Luo Y, Li M, Yuan H, Liu H, Fang Y. Predicting lattice thermal conductivity via machine learning: a mini review. npj Comput Mater 2023;9:964.
71. Kim Y, Yang C, Kim Y, Gu GX, Ryu S. Designing an adhesive pillar shape with deep learning-based optimization. ACS Appl Mater Interfaces 2020;12:24458-65.
72. Yu CH, Chen W, Chiang YH, et al. End-to-end deep learning model to predict and design secondary structure content of structural proteins. ACS Biomater Sci Eng 2022;8:1156-65.
73. Zhang Z, Zhang Z, Di Caprio F, Gu GX. Machine learning for accelerating the design process of double-double composite structures. Compos Struct 2022;285:115233.