REFERENCES

1. Zhang TY, Liu XJ. Informatics is fueling new materials discovery. J Mater Inf 2021;1:6.

2. Hara K, Yamada S, Kurotani A, Chikayama E, Kikuchi J. Materials informatics approach using domain modelling for exploring structure-property relationships of polymers. Sci Rep 2022;12:10558.

3. Kuz’min V, Artemenko A, Ognichenko L, et al. Simplex representation of molecular structure as universal QSAR/QSPR tool. Struct Chem 2021;32:1365-92.

4. Keyvanpour MR, Shirzad MB. An analysis of QSAR research based on machine learning concepts. Curr Drug Discov Technol 2021;18:17-30.

5. Poulson BG, Alsulami QA, Sharfalddin A, et al. Cyclodextrins: structural, chemical, and physical properties, and applications. Polysaccharides 2022;3:1-31.

6. Gao H, Ji S. Graph U-Nets. IEEE Trans Pattern Anal Mach Intell 2022;44:4948-60.

7. Lee JB, Rossi R, Kong X. Graph classification using structural attention. Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining; London, United Kingdom; 2018. pp. 1666-74.

8. Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE. Neural message passing for quantum chemistry. Proceedings of the 34th International Conference on Machine Learning; Sydney, NSW, Australia; 2017. pp. 1263-72. Available from: https://proceedings.mlr.press/v70/gilmer17a [Last accessed on 7 Jun 2023]

9. Wei X, Wang C, Jia Z, Xu W. High-cycle fatigue S-N curve prediction of steels based on a transfer learning-guided convolutional neural network. J Mater Inf 2022;2:9.

10. Luo H, Xiong C, Fang W, Love PED, Zhang B, Ouyang X. Convolutional neural networks: computer vision-based workforce activity assessment in construction. Autom Constr 2018;94:282-9.

11. Zhang J, Zhang J, Wu X, Shi Z, Hwang J. Coarse-to-fine multiscale fusion network for single image deraining. J Electron Imag 2022;31:043003.

12. Wu X, Zhang Y, Li Q, Qi Y, Wang J, Guo Y. Face aging with pixel-level alignment GAN. Appl Intell 2022;52:14665-78.

13. Khurana D, Koli A, Khatter K, Singh S. Natural language processing: state of the art, current trends and challenges. Multimed Tools Appl 2023;82:3713-44.

14. Wu X, Jin Y, Wang J, Qian Q, Guo Y. MKD: Mixup-based knowledge distillation for Mandarin end-to-end speech recognition. Algorithms 2022;15:160.

15. Wu X, Tang B, Zhao M, Wang J, Guo Y. STR transformer: a cross-domain transformer for scene text recognition. Appl Intell 2023;53:3444-58.

16. Zhou J, Cui G, Hu S, et al. Graph neural networks: a review of methods and applications. AI Open 2020;1:57-81.

17. Wu Z, Pan S, Chen F, Long G, Zhang C, Yu PS. A comprehensive survey on graph neural networks. IEEE Trans Neural Netw Learn Syst 2021;32:4-24.

18. Zagidullin B, Wang Z, Guan Y, Pitkänen E, Tang J. Comparative analysis of molecular fingerprints in prediction of drug combination effects. Brief Bioinform 2021;22:bbab291.

19. Melville JL, Riley JF, Hirst JD. Similarity by compression. J Chem Inf Model 2007;47:25-33.

20. Duvenaud D, Maclaurin D, Aguilera-Iparraguirre J, et al. Convolutional networks on graphs for learning molecular fingerprints. Proceedings of the 28th International Conference on Neural Information Processing Systems; Montreal, Canada; 2015. pp. 2224-32. Available from: https://dl.acm.org/doi/10.5555/2969442.2969488 [Last accessed on 8 Jun 2023]

21. Ding Y, Chen M, Guo C, Zhang P, Wang J. Molecular fingerprint-based machine learning assisted QSAR model development for prediction of ionic liquid properties. J Mol Liq 2021;326:115212.

22. Maggiora G, Vogt M, Stumpfe D, Bajorath J. Molecular similarity in medicinal chemistry. J Med Chem 2014;57:3186-204.

23. Rogers D, Hahn M. Extended-connectivity fingerprints. J Chem Inf Model 2010;50:742-54.

24. Rush TS 3rd, Grant JA, Mosyak L, Nicholls A. A shape-based 3-D scaffold hopping method and its application to a bacterial protein-protein interaction. J Med Chem 2005;48:1489-95.

25. Durant JL, Leland BA, Henry DR, Nourse JG. Reoptimization of MDL keys for use in drug discovery. J Chem Inf Comput Sci 2002;42:1273-80.

26. Liu X, Liu C, Huang R, et al. Long short-term memory recurrent neural network for pharmacokinetic-pharmacodynamic modeling. Int J Clin Pharmacol Ther 2021;59:138-46.

27. Goulas A, Damicelli F, Hilgetag CC. Bio-instantiated recurrent neural networks: Integrating neurobiology-based network topology in artificial networks. Neural Netw 2021;142:608-18.

28. Weininger D. SMILES, a chemical language and information system. 1. Introduction to methodology and encoding rules. J Chem Inf Comput Sci 1988;28:31-6.

29. Heller SR, McNaught A, Pletnev I, Stein S, Tchekhovskoi D. InChI, the IUPAC international chemical identifier. J Cheminform 2015;7:23.

30. Lin X, Quan Z, Wang ZJ, Huang H, Zeng X. A novel molecular representation with BiGRU neural networks for learning atom. Brief Bioinform 2020;21:2099-111.

31. Feng YH, Zhang SW. Prediction of drug-drug interaction using an attention-based graph neural network on drug molecular graphs. Molecules 2022;27:3004.

32. Chuang KV, Gunsalus LM, Keiser MJ. Learning molecular representations for medicinal chemistry. J Med Chem 2020;63:8705-22.

33. Bengio Y, Courville A, Vincent P. Representation learning: a review and new perspectives. IEEE Trans Pattern Anal Mach Intell 2013;35:1798-828.

34. Zhang Z, Shao L, Xu Y, Liu L, Yang J. Marginal representation learning with graph structure self-adaptation. IEEE Trans Neural Netw Learn Syst 2018;29:4645-59.

35. Xie Y, Jin P, Gong M, Zhang C, Yu B. Multi-task network representation learning. Front Neurosci 2020;14:1.

36. Wang S, Wang Q, Gong M. Multi-task learning based network embedding. Front Neurosci 2019;13:1387.

37. Khasanova R, Frossard P. Graph-based isometry invariant representation learning. Proceedings of the 34th International Conference on Machine Learning; Sydney, NSW, Australia; 2017. pp. 1847-56. Available from: http://proceedings.mlr.press/v70/khasanova17a.html [Last accessed on 8 Jun 2023]

38. Lee S, Jo J. Scale-invariant representation of machine learning. Phys Rev E 2022;105:044306.

39. Batra R, Song L, Ramprasad R. Emerging materials intelligence ecosystems propelled by machine learning. Nat Rev Mater 2021;6:655-78.

40. Agrawal A, Choudhary A. Deep materials informatics: applications of deep learning in materials science. MRS Commun 2019;9:779-92.

41. Chen C, Ye W, Zuo Y, Zheng C, Ong SP. Graph networks as a universal machine learning framework for molecules and crystals. Chem Mater 2019;31:3564-72.

42. Park CW, Wolverton C. Developing an improved crystal graph convolutional neural network framework for accelerated materials discovery. Phys Rev Materials 2020;4:063801.

43. Sumpter BG, Noid DW. Neural networks and graph theory as computational tools for predicting polymer properties. Macromol Theory Simul 1994;3:363-78.

44. Xie T, Grossman JC. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys Rev Lett 2018;120:145301.

45. Coley CW, Jin W, Rogers L, et al. A graph-convolutional neural network model for the prediction of chemical reactivity. Chem Sci 2019;10:370-7.

46. Wu Z, Ramsundar B, Feinberg EN, et al. MoleculeNet: a benchmark for molecular machine learning. Chem Sci 2018;9:513-30.

47. Schmidt J, Marques MRG, Botti S, Marques MAL. Recent advances and applications of machine learning in solid-state materials science. npj Comput Mater 2019;5:83.

48. Defferrard M, Bresson X, Vandergheynst P. Convolutional neural networks on graphs with fast localized spectral filtering. Available from: https://arxiv.org/abs/1606.09375 [Last accessed on 8 Jun 2023].

49. Hammond DK, Vandergheynst P, Gribonval R. Wavelets on graphs via spectral graph theory. Appl Comput Harmon A 2011;30:129-50.

50. Li R, Wang S, Zhu F, Huang J. Adaptive graph convolutional neural networks. AAAI 2018;32.

51. Monti F, Boscaini D, Masci J, Rodolà E, Svoboda J, Bronstein MM. Geometric deep learning on graphs and manifolds using mixture model CNNs. Available from: https://ieeexplore.ieee.org/document/8100059 [Last accessed on 8 Jun 2023].

52. Atwood J, Towsley D. Diffusion-convolutional neural networks. Available from: https://proceedings.neurips.cc/paper_files/paper/2016/hash/390e982518a50e280d8e2b535462ec1f-Abstract.html [Last accessed on 8 Jun 2023]

53. Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need. Available from: https://proceedings.neurips.cc/paper_files/paper/2017/hash/3f5ee243547dee91fbd053c1c4a845aa-Abstract.html [Last accessed on 8 Jun 2023]

54. Veličković P, Cucurull G, Casanova A, Romero A, Liò P, Bengio Y. Graph attention networks. Available from: https://arxiv.org/abs/1710.10903 [Last accessed on 8 Jun 2023].

55. Lusci A, Pollastri G, Baldi P. Deep architectures and deep learning in chemoinformatics: the prediction of aqueous solubility for drug-like molecules. J Chem Inf Model 2013;53:1563-75.

56. Altae-Tran H, Ramsundar B, Pappu AS, Pande V. Low data drug discovery with one-shot learning. ACS Cent Sci 2017;3:283-93.

57. Segler MHS, Kogej T, Tyrchan C, Waller MP. Generating focused molecule libraries for drug discovery with recurrent neural networks. ACS Cent Sci 2018;4:120-31.

58. Rahimi A, Cohn T, Baldwin T. Semi-supervised user geolocation via graph convolutional networks. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers); Melbourne, Australia; 2018. pp. 2009-19.

59. Li G, Müller M, Thabet A, Ghanem B. DeepGCNs: can GCNs go as deep as CNNs? 2019 IEEE/CVF International Conference on Computer Vision (ICCV); Seoul, Korea (South); 2019. pp. 9266-76. Available from: https://openaccess.thecvf.com/content_ICCV_2019/html/Li_DeepGCNs_Can_GCNs_Go_As_Deep_As_CNNs_ICCV_2019_paper.html [Last accessed on 8 Jun 2023]

60. Morris C, Ritzert M, Fey M, et al. Weisfeiler and Leman go neural: higher-order graph neural networks. AAAI 2019;33:4602-9.

61. Papp PA, Martinkus K, Faber L, Wattenhofer R. DropGNN: random dropouts increase the expressiveness of graph neural networks. Available from: https://proceedings.neurips.cc/paper/2021/hash/b8b2926bd27d4307569ad119b6025f94-Abstract.html [Last accessed on 8 Jun 2023]

62. Rong Y, Huang W, Xu T, Huang J. DropEdge: towards deep graph convolutional networks on node classification. Available from: https://openreview.net/forum?id=Hkx1qkrKPr [Last accessed on 8 Jun 2023].

63. Sun Q, Li J, Peng H, et al. SUGAR: subgraph neural network with reinforcement pooling and self-supervised mutual information mechanism. Available from: https://arxiv.org/abs/2101.08170 [Last accessed on 8 Jun 2023].

64. Bevilacqua B, Frasca F, Lim D, et al. Equivariant subgraph aggregation networks. Available from: https://openreview.net/forum?id=dFbKQaRk15w [Last accessed on 8 Jun 2023].

65. Gasteiger J, Yeshwanth C, Günnemann S. Directional message passing on molecular graphs via synthetic coordinates. Available from: https://proceedings.neurips.cc/paper/2021/hash/82489c9737cc245530c7a6ebef3753ec-Abstract.html [Last accessed on 8 Jun 2023]

66. Liu Y, Wang L, Liu M, et al. Spherical message passing for 3D molecular graphs. Available from: https://par.nsf.gov/servlets/purl/10353844 [Last accessed on 8 Jun 2023].

67. Gasteiger J, Becker F, Günnemann S. GemNet: universal directional graph neural networks for molecules. Available from: https://proceedings.neurips.cc/paper/2021/hash/35cf8659cfcb13224cbd47863a34fc58-Abstract.html [Last accessed on 8 Jun 2023]

68. Vasudevan RK, Choudhary K, Mehta A, et al. Materials science in the AI age: high-throughput library generation, machine learning and a pathway from correlations to the underpinning physics. MRS Commun 2019;9. doi: 10.1557/mrc.2019.95.

69. Lu S, Zhou Q, Ouyang Y, Guo Y, Li Q, Wang J. Accelerated discovery of stable lead-free hybrid organic-inorganic perovskites via machine learning. Nat Commun 2018;9:3405.

70. Xie T, France-Lanord A, Wang Y, et al. Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties. Nat Commun 2022;13:3415.

71. Xie T, Grossman JC. Hierarchical visualization of materials space with graph convolutional neural networks. J Chem Phys 2018;149:174111.

72. Gómez-Bombarelli R, Wei JN, Duvenaud D, et al. Automatic chemical design using a data-driven continuous representation of molecules. ACS Cent Sci 2018;4:268-76.

73. Zhao H, Chen W, Huang H, et al. A robotic platform for the synthesis of colloidal nanocrystals. Nat Synth 2023;2:505-14.

74. Lee YJ, Kahng H, Kim SB. Generative adversarial networks for de novo molecular design. Mol Inform 2021;40:e2100045.

75. Patel RA, Borca CH, Webb MA. Featurization strategies for polymer sequence or composition design by machine learning. Mol Syst Des Eng 2022;7:661-76.

76. Putin E, Asadulaev A, Ivanenkov Y, et al. Reinforced adversarial neural computer for de novo molecular design. J Chem Inf Model 2018;58:1194-204.

77. Zhao Y, Al-Fahdi M, Hu M, et al. High-throughput discovery of novel cubic crystal materials using deep generative neural networks. Adv Sci 2021;8:e2100566.

78. You J, Ying R, Ren X, Hamilton W, Leskovec J. GraphRNN: generating realistic graphs with deep auto-regressive models. Proceedings of the 35th International Conference on Machine Learning; 2018. pp. 5708-17. Available from: http://proceedings.mlr.press/v80/you18a.html [Last accessed on 8 Jun 2023]

79. Lai X, Yang P, Wang K, Yang Q, Yu D. MGRNN: structure generation of molecules based on graph recurrent neural networks. Mol Inform 2021;40:e2100091.

80. Sanchez-Lengeling B, Aspuru-Guzik A. Inverse molecular design using machine learning: generative models for matter engineering. Science 2018;361:360-5.

81. Lin X, Jiang Y, Yang Y. Molecular distance matrix prediction based on graph convolutional networks. J Mol Struct 2022;1257:132540.

82. Gong S, Wang Y, Tian Y, Wang L, Liu G. Rapid enthalpy prediction of transition states using molecular graph convolutional network. AIChE J 2023;69.

83. Jain A, Ong SP, Hautier G, et al. Commentary: the materials project: a materials genome approach to accelerating materials innovation. APL Materials 2013;1:011002.

84. Im S, Kim H, Kim W, Cho M. Neural network constitutive model for crystal structures. Comput Mech 2021;67:185-206.

85. Dunn A, Wang Q, Ganose A, Dopp D, Jain A. Benchmarking materials property prediction methods: the matbench test set and automatminer reference algorithm. npj Comput Mater 2020;6:138.

86. Fung V, Zhang J, Juarez E, Sumpter BG. Benchmarking graph neural networks for materials chemistry. npj Comput Mater 2021;7:84.

87. Choudhary K, Decost B. Atomistic line graph neural network for improved materials property predictions. npj Comput Mater 2021;7:185.

88. Louis SY, Zhao Y, Nasiri A, et al. Graph convolutional neural networks with global attention for improved materials property prediction. Phys Chem Chem Phys 2020;22:18141-8.

89. Li Y, Li P, Yang X, et al. Introducing block design in graph neural networks for molecular properties prediction. Chem Eng J 2021;414:128817.

90. Trieu HL, Miwa M, Ananiadou S. BioVAE: a pre-trained latent variable language model for biomedical text mining. Bioinformatics 2022;38:872-4.

91. Zhang ZC, Zhang MY, Zhou T, Qiu YL. Pre-trained language model augmented adversarial training network for Chinese clinical event detection. Math Biosci Eng 2020;17:2825-41.

92. Lee J, Yoon W, Kim S, et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics 2020;36:1234-40.

93. Wang Y, Wang J, Cao Z, Barati Farimani A. Molecular contrastive learning of representations via graph neural networks. Nat Mach Intell 2022;4:279-87.

94. Ding Z, Chen Z, Ma T, Lu CT, Ma W, Shaw L. Predicting the hydrogen release ability of LiBH4-based mixtures by ensemble machine learning. Energy Stor Mater 2020;27:466-77.

95. Blum LC, Reymond JL. 970 million druglike small molecules for virtual screening in the chemical universe database GDB-13. J Am Chem Soc 2009;131:8732-3.

96. Ruddigkeit L, van Deursen R, Blum LC, Reymond JL. Enumeration of 166 billion organic small molecules in the chemical universe database GDB-17. J Chem Inf Model 2012;52:2864-75.

97. Ramakrishnan R, Dral PO, Rupp M, von Lilienfeld OA. Quantum chemistry structures and properties of 134 kilo molecules. Sci Data 2014;1:140022.

98. Gan Y, Zhou J, Sun Z. Prediction of the atomic structure and thermoelectric performance for semiconducting Ge1Sb6Te10 from DFT calculations. J Mater Inf 2021;1:2.

99. Elegbeleye IF, Maluta NE, Maphanga RR. Density functional theory study of optical and electronic properties of (TiO2)n=5,8,68 clusters for application in solar cells. Molecules 2021;26:955.

100. Liao R, Zhao Z, Urtasun R, Zemel R. LanczosNet: multi-scale deep graph convolutional networks. Available from: https://openreview.net/forum?id=BkedznAqKQ [Last accessed on 8 Jun 2023].

101. Louis SY, Siriwardane EMD, Joshi RP, Omee SS, Kumar N, Hu J. Accurate prediction of voltage of battery electrode materials using attention-based graph neural networks. ACS Appl Mater Interfaces 2022;14:26587-94.

102. Omee SS, Louis SY, Fu N, et al. Scalable deeper graph neural networks for high-performance materials property prediction. Patterns 2022;3:100491.

103. De Breuck PP, Heymans G, Rignanese GM. Accurate experimental band gap predictions with multifidelity correction learning. J Mater Inf 2022;2:10.

104. Shang C, Liu Q, Tong Q, Sun J, Song M, Bi J. Multi-view spectral graph convolution with consistent edge attention for molecular modeling. Neurocomputing 2021;445:12-25.

105. Tang B, Kramer ST, Fang M, Qiu Y, Wu Z, Xu D. A self-attention based message passing neural network for predicting molecular lipophilicity and aqueous solubility. J Cheminform 2020;12:15.

106. Huuskonen J. Estimation of aqueous solubility for a diverse set of organic compounds based on molecular topology. J Chem Inf Comput Sci 2000;40:773-7.

107. Micheli A. Neural network for graphs: a contextual constructive approach. IEEE Trans Neural Netw 2009;20:498-511.

108. Coley CW, Barzilay R, Green WH, Jaakkola TS, Jensen KF. Convolutional embedding of attributed molecular graphs for physical property prediction. J Chem Inf Model 2017;57:1757-72.

109. Meng M, Wei Z, Li Z, Jiang M, Bian Y. Property prediction of molecules in graph convolutional neural network expansion. 2019 IEEE 10th International Conference on Software Engineering and Service Science (ICSESS); Beijing, China; 2019.

110. Dai M, Demirel MF, Liang Y, Hu J. Graph neural networks for an accurate and interpretable prediction of the properties of polycrystalline materials. npj Comput Mater 2021;7:103.

111. Groom CR, Bruno IJ, Lightfoot MP, Ward SC. The Cambridge structural database. Acta Crystallogr B Struct Sci Cryst Eng Mater 2016;72:171-9.

112. Gražulis S, Daškevič A, Merkys A, et al. Crystallography open database (COD): an open-access collection of crystal structures and platform for world-wide collaboration. Nucleic Acids Res 2012;40:D420-7.

113. Hao Z, Lu C, Huang Z, et al. ASGN: An active semi-supervised graph neural network for molecular property prediction. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining; Virtual Event, CA, USA; 2020. pp. 731-52.

114. Park CW, Kornbluth M, Vandermause J, Wolverton C, Kozinsky B, Mailoa JP. Accurate and scalable graph neural network force field and molecular dynamics with direct force architecture. npj Comput Mater 2021;7:73.

115. Vugmeyster Y, Harrold J, Xu X. Absorption, distribution, metabolism, and excretion (ADME) studies of biotherapeutics for autoimmune and inflammatory conditions. AAPS J 2012;14:714-27.

116. Delaney JS. ESOL: estimating aqueous solubility directly from molecular structure. J Chem Inf Comput Sci 2004;44:1000-5.

117. Bergström CA, Strafford M, Lazorova L, Avdeef A, Luthman K, Artursson P. Absorption classification of oral drugs based on molecular surface properties. J Med Chem 2003;46:558-70.

118. Otsuka S, Kuwajima I, Hosoya J, Xu Y, Yamazaki M. PoLyInfo: polymer database for polymeric materials design. 2011 International Conference on Emerging Intelligent Data and Web Technologies; Tirana, Albania; 2011. pp. 22-9.

119. Li M, Ma Z, Wang YG, Zhuang X. Fast haar transforms for graph neural networks. Neural Netw 2020;128:188-98.

120. Ma H, Bian Y, Rong Y, et al. Cross-dependent graph neural networks for molecular property prediction. Bioinformatics 2022;38:2003-9.

121. Mansimov E, Mahmood O, Kang S, Cho K. Molecular geometry prediction using a deep generative graph neural network. Sci Rep 2019;9:20381.

122. Allotey J, Butler KT, Thiyagalingam J. Entropy-based active learning of graph neural network surrogate models for materials properties. J Chem Phys 2021;155:174116.

123. Bertinetto C, Duce C, Micheli A, Solaro R, Starita A, Tiné MR. Evaluation of hierarchical structured representations for QSPR studies of small molecules and polymers by recursive neural networks. J Mol Graph Model 2009;27:797-802.

124. Feinberg EN, Sur D, Wu Z, et al. PotentialNet for molecular property prediction. ACS Cent Sci 2018;4:1520-30.

125. Withnall M, Lindelöf E, Engkvist O, Chen H. Building attention and edge message passing neural networks for bioactivity and physical-chemical property prediction. J Cheminform 2020;12:1.

126. Sun FY, Hoffman J, Verma V, Tang J. InfoGraph: Unsupervised and semi-supervised graph-level representation learning via mutual information maximization. International Conference on Learning Representations; Addis Ababa, Ethiopia; 2020. pp. 1-16. Available from: https://research.aalto.fi/en/publications/infograph-unsupervised-and-semi-supervised-graph-level-representa [Last accessed on 8 Jun 2023]

127. Huang K, Xiao C, Glass LM, Zitnik M, Sun J. SkipGNN: predicting molecular interactions with skip-graph networks. Sci Rep 2020;10:21092.

128. Wang X, Li Z, Jiang M, Wang S, Zhang S, Wei Z. Molecule property prediction based on spatial graph embedding. J Chem Inf Model 2019;59:3817-28.

129. Zhang Z, Guan J, Zhou S. FraGAT: a fragment-oriented multi-scale graph attention model for molecular property prediction. Bioinformatics 2021;37:2981-7.

130. Choi JY, Zhang P, Mehta K, Blanchard A, Lupo Pasini M. Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules. J Cheminform 2022;14:70.

131. Choudhary K, Yildirim T, Siderius DW, Kusne AG, McDannald A, Ortiz-Montalvo DL. Graph neural network predictions of metal organic framework CO2 adsorption properties. Comput Mater Sci 2022;210:111388.

132. Frey NC, Akinwande D, Jariwala D, Shenoy VB. Machine learning-enabled design of point defects in 2D materials for quantum and neuromorphic information processing. ACS Nano 2020;14:13406-17.

133. Wang Z, Han Y, Cai J, Wu S, Li J. DeepTMC: A deep learning platform to targeted design doped transition metal compounds. Energy Stor Mater 2022;45:1201-11.

134. Wang R, Zou Y, Zhang C, Wang X, Yang M, Xu D. Combining crystal graphs and domain knowledge in machine learning to predict metal-organic frameworks performance in methane adsorption. Micropor Mesopor Mat 2022;331:111666.

135. Pablo-García S, Morandi S, Vargas-Hernández RA, et al. Fast evaluation of the adsorption energy of organic molecules on metals via graph neural networks. Nat Comput Sci 2023;3:433-42.

136. Tavazza F, DeCost B, Choudhary K. Uncertainty prediction for machine learning models of material properties. ACS Omega 2021;6:32431-40.

137. Kwon Y, Lee D, Choi YS, Kang S. Uncertainty-aware prediction of chemical reaction yields with graph neural networks. J Cheminform 2022;14:2.

138. Behler J, Parrinello M. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys Rev Lett 2007;98:146401.

139. Shapeev AV. Moment tensor potentials: a class of systematically improvable interatomic potentials. Multiscale Model Simul 2016;14:1153-73.

140. Artrith N, Urban A. An implementation of artificial neural-network potentials for atomistic materials simulations: Performance for TiO2. Comput Mater Sci 2016;114:135-50.

141. Wang H, Zhang L, Han J, E W. DeePMD-kit: a deep learning package for many-body potential energy representation and molecular dynamics. Comput Phys Commun 2018;228:178-84.

142. Unke OT, Chmiela S, Sauceda HE, et al. Machine learning force fields. Chem Rev 2021;121:10142-86.

143. Dral PO. Quantum chemistry in the age of machine learning. J Phys Chem Lett 2020;11:2336-47.

144. Yao N, Chen X, Fu ZH, Zhang Q. Applying classical, ab initio, and machine-learning molecular dynamics simulations to the liquid electrolyte for rechargeable batteries. Chem Rev 2022;122:10970-1021.

145. Mai H, Le TC, Chen D, Winkler DA, Caruso RA. Machine learning for electrocatalyst and photocatalyst design and discovery. Chem Rev 2022;122:13478-515.

146. Kovács DP, Oord CV, Kucera J, et al. Linear atomic cluster expansion force fields for organic molecules: beyond RMSE. J Chem Theory Comput 2021;17:7696-711.

147. Morrow JD, Gardner JLA, Deringer VL. How to validate machine-learned interatomic potentials. J Chem Phys 2023;158:121501.

148. Behler J. Four generations of high-dimensional neural network potentials. Chem Rev 2021;121:10037-72.

149. Deringer VL, Caro MA, Csányi G. A general-purpose machine-learning force field for bulk and nanostructured phosphorus. Nat Commun 2020;11:5461.

150. Jinnouchi R, Lahnsteiner J, Karsai F, Kresse G, Bokdam M. Phase transitions of hybrid perovskites simulated by machine-learning force fields trained on the fly with bayesian inference. Phys Rev Lett 2019;122:225701.

151. Margraf JT. Science-driven atomistic machine learning. Angew Chem Int Ed Engl 2023;62:e202219170.

152. Freitas R, Reed EJ. Uncovering the effects of interface-induced ordering of liquid on crystal growth using machine learning. Nat Commun 2020;11:3260.

153. Schran C, Thiemann FL, Rowe P, Müller EA, Marsalek O, Michaelides A. Machine learning potentials for complex aqueous systems made simple. Proc Natl Acad Sci U S A 2021;118:e2110077118.

154. Anwar J, Zahn D. Uncovering molecular processes in crystal nucleation and growth by using molecular simulation. Angew Chem Int Ed Engl 2011;50:1996-2013.

155. Caro MA, Deringer VL, Koskinen J, Laurila T, Csányi G. Growth mechanism and origin of high sp3 content in tetrahedral amorphous carbon. Phys Rev Lett 2018;120:166101.

156. Li J, Luo K, An Q. Activating mobile dislocation in boron carbide at room temperature via Al doping. Phys Rev Lett 2023;130:116104.

157. Cao B, Yang S, Sun A, Dong Z, Zhang T. Domain knowledge-guided interpretive machine learning: formula discovery for the oxidation behavior of ferritic-martensitic steels in supercritical water. J Mater Inf 2022;2:4.

158. Li X, Zhou Y, Dvornek N, et al. BrainGNN: interpretable brain graph neural network for fMRI analysis. Med Image Anal 2021;74:102233.

159. Ali A, Zhu Y, Zakarya M. Exploiting dynamic spatio-temporal graph convolutional neural networks for citywide traffic flows prediction. Neural Netw 2022;145:233-47.

160. Yang L, Jiang S, Zhang F. Multitask learning with graph neural network for travel time estimation. Comput Intell Neurosci 2022;2022:6622734.

161. Wu X, Li Y, Wang J, Qian Q, Guo Y. UBAR: user behavior-aware recommendation with knowledge graph. Knowl-Based Syst 2022;254:109661.

162. Zhu J, Yaseen A. A recommender for research collaborators using graph neural networks. Front Artif Intell 2022;5:881704.

163. Gatta V, Moscato V, Postiglione M, Sperli G. An epidemiological neural network exploiting dynamic graph structured data applied to the COVID-19 outbreak. IEEE Trans Big Data 2021;7:45-55.

164. Fritz C, Dorigatti E, Rügamer D. Combining graph neural networks and spatio-temporal disease models to improve the prediction of weekly COVID-19 cases in Germany. Sci Rep 2022;12:3930.
