Frontiers of Data and Computing (数据与计算发展前沿) ›› 2024, Vol. 6 ›› Issue (1): 79-93.
CSTR: 32002.14.jfdc.CN10-1649/TP.2024.01.008
doi: 10.11871/jfdc.issn.2096-742X.2024.01.008
Corresponding author: * TIAN Ziqi (E-mail: )
About the first author: LAO Sisi is a master's student at Ningbo University, currently in a joint training program at the Ningbo Institute of Materials Technology and Engineering, Chinese Academy of Sciences. Her main research interest is the application of machine learning in catalysis.
Funding:
Received: 2022-09-22
Online: 2024-02-20
Published: 2024-02-21
Abstract:
[Objective] This paper surveys the application of graph convolutional neural network (GCN) models in crystal materials research and discusses possible future directions. [Methods] We first review the development history and current state of graph convolutional neural networks, then categorize and discuss their present applications to crystal materials, and finally outline future research directions for GCNs in the crystal materials field. [Results] Graph convolutional neural networks are increasingly widely used in the natural sciences and have shown good performance in the structural design and property prediction of crystal materials. [Conclusions] Graph convolutional neural networks are gradually being applied to the design and prediction of crystal materials; improving model generalizability and extracting deeper features are the main directions for future research.
LAO Sisi, TIAN Ziqi. Progress in application of Graph Convolutional Neural Network in Crystal Material Development[J]. Frontiers of Data and Computing, 2024, 6(1): 79-93, https://cstr.cn/32002.14.jfdc.CN10-1649/TP.2024.01.008.
Table 1 Comparison of graph convolutional network models

Spectral-domain models:

| Model | Advantages | Disadvantages | Task | Time complexity |
|---|---|---|---|---|
| Spectral CNN [4] | Generalizes to non-Euclidean spaces | Operates on the whole graph, inefficient | Graph classification | O(n³) |
| ChebyNet [5] | Simple to compute | Many parameters, inflexible | Node classification | O(m) |
| GWNN [6] | Few parameters, low computational cost | Poor adaptability | Node classification | O(m) |
| DGCNN [7] | Strong feature-abstraction ability | No global features | Graph classification | — |

Spatial-domain models (materials domain):

| Model | Advantages | Disadvantages | Datasets | Key functions |
|---|---|---|---|---|
| CGCNN [19] | Flexible, fast search | Fixed number of neighbor nodes, single type of encoded information | Materials Project, OQMD | Node vector update (see Ref. [19]) |
| MEGNet [20] | Strong interpretability and composability | Focuses only on local atomic features | QM9 | Edge update: $e'_k=\phi_e(v_{sk}\oplus v_{rk}\oplus e_k\oplus u)$ (2); node update: $v'_i=\phi_v(\bar{v}^e_i\oplus v_i\oplus u)$ (3); global update: $u'=\phi_u(\bar{u}^e\oplus\bar{u}^v\oplus u)$ (4) |
| iCGCNN [21] | High recognition and success rates, fast search | Suited to a single structure type | OQMD | Node vector update with added functions (see Ref. [21]) |
| Ref. [24] | Few parameters, fast | Focuses only on local features | USPTO | — |
| GATGNN [25] | Considers global features, high accuracy | — | Materials Project | Weighted aggregation of neighboring-node information (see Ref. [25]) |
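The MEGNet update equations (2)-(4) in Table 1 can be sketched in code: edges are updated first, then nodes aggregate the mean of their incoming updated edges, and finally the global state vector u is updated from graph-level averages. This is a minimal NumPy sketch, not the published implementation; the single-layer softplus networks standing in for φe, φv, φu, the feature dimensions, and all helper names (`mlp`, `megnet_block`) are illustrative assumptions.

```python
import numpy as np

def mlp(w):
    # Hypothetical stand-in for the learned networks phi_e, phi_v, phi_u:
    # one linear layer followed by a softplus activation.
    return lambda x: np.log1p(np.exp(x @ w))

def megnet_block(V, E, u, senders, receivers, phi_e, phi_v, phi_u):
    """One MEGNet-style update following Eqs. (2)-(4).

    V: (n_nodes, dv) node features; E: (n_edges, de) edge features;
    u: (du,) global state; senders/receivers: (n_edges,) node indices.
    """
    n_nodes = V.shape[0]
    U_edge = np.tile(u, (E.shape[0], 1))
    # Eq. (2): e'_k = phi_e(v_sk ⊕ v_rk ⊕ e_k ⊕ u), ⊕ = concatenation
    E_new = phi_e(np.concatenate([V[senders], V[receivers], E, U_edge], axis=1))
    # v̄_i^e: mean of updated edges arriving at each node i
    V_bar = np.zeros((n_nodes, E_new.shape[1]))
    counts = np.zeros(n_nodes)
    np.add.at(V_bar, receivers, E_new)
    np.add.at(counts, receivers, 1)
    V_bar /= np.maximum(counts, 1)[:, None]
    # Eq. (3): v'_i = phi_v(v̄_i^e ⊕ v_i ⊕ u)
    V_new = phi_v(np.concatenate([V_bar, V, np.tile(u, (n_nodes, 1))], axis=1))
    # Eq. (4): u' = phi_u(ū^e ⊕ ū^v ⊕ u), with graph-level edge/node means
    u_new = phi_u(np.concatenate([E_new.mean(axis=0), V_new.mean(axis=0), u]))
    return V_new, E_new, u_new
```

Because each stage only concatenates local features with the shared global vector, stacking several such blocks lets information propagate beyond immediate neighbors, which is the property the table credits to MEGNet's composability.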
[1] | SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[J/OL]. Computer Science, 2014. DOI:10.48550/arXiv.1409.1556. |
[2] | SZEGEDY C, LIU W, JIA Y, et al. Going deeper with convolutions[C]. Proceedings of the Institute of Electrical and Electronics Engineers Conference on Computer Vision and Pattern Recognition. 2015: 1-9. |
[3] | SHI X, CHEN Z, WANG H, et al. Convolutional LSTM network: A machine learning approach for precipitation nowcasting[J]. Advances in neural information processing systems, 2015, 28. |
[4] | ESTRACH J B, ZAREMBA W, SZLAM A, et al. Spectral networks and deep locally connected networks on graphs[C]. 2nd International Conference on Learning Representations, 2014: 1-14. |
[5] | DEFFERRARD M, BRESSON X, VANDERGHEYNST P. Convolutional neural networks on graphs with fast localized spectral filtering[J]. Computer Research Repository, 2016. |
[6] | XU B, SHEN H, CAO Q, et al. Graph wavelet neural network[C]. Proceedings of the International Conference on Learning Representations, arXiv preprint arXiv:1904.07785, 2019. |
[7] | ZHANG M, CUI Z, NEUMANN M, et al. An end-to-end deep learning architecture for graph classification[C]. Proceedings of the Association for the Advancement of Artificial Intelligence conference on artificial intelligence. 2018, 32(1): 4438-4445. |
[8] | SHERVASHIDZE N, SCHWEITZER P, VAN LEEUWEN E J, et al. Weisfeiler-Lehman graph kernels[J]. Journal of Machine Learning Research, 2011, 12(3): 2539-2561. |
[9] | CHOUDHARY K, DECOST B, CHEN C, et al. Recent advances and applications of deep learning methods in materials science[J]. npj Computational Materials, 2022, 8(1): 1-26. doi: 10.1038/s41524-021-00695-2 |
[10] | NIEPERT M, AHMED M, KUTZKOV K. Learning convolutional neural networks for graphs[C]. International conference on machine learning. Proceedings of Machine Learning Research, 2016: 2014-2023. |
[11] | HAMILTON W, YING Z, LESKOVEC J. Inductive representation learning on large graphs[J]. Advances in neural information processing systems, 2017, 30: 1025-1035. |
[12] | CHEN J, MA T, XIAO C. FastGCN: fast learning with graph convolutional networks via importance sampling[J]. arXiv preprint arXiv:1801.10247, 2018. |
[13] | KEARNES S, MCCLOSKEY K, BERNDL M, et al. Molecular graph convolutions: moving beyond fingerprints[J]. Journal of Computer-Aided Molecular Design, 2016, 30(8): 595-608. doi: 10.1007/s10822-016-9938-8 pmid: 27558503 |
[14] | FABER F A, HUTCHISON L, HUANG B, et al. Prediction errors of molecular machine learning models lower than hybrid DFT error[J]. Journal of Chemical Theory and Computation, 2017, 13(11): 5255-5264. doi: 10.1021/acs.jctc.7b00577 pmid: 28926232 |
[15] | SCHÜTT K, KINDERMANS P J, SAUCEDA FELIX H E, et al. Schnet: A continuous-filter convolutional neural network for modeling quantum interactions[J]. Advances in Neural Information Processing Systems, 2017, 30: 992-1002. |
[16] | GILMER J, SCHOENHOLZ S S, RILEY P F, et al. Neural message passing for quantum chemistry[C]. International conference on machine learning, Proceedings of Machine Learning Research, 2017: 1263-1272. |
[17] | YANG K, SWANSON K, JIN W, et al. Analyzing learned molecular representations for property prediction[J]. Journal of Chemical Information and Modeling, 2019, 59(8): 3370-3388. doi: 10.1021/acs.jcim.9b00237 pmid: 31361484 |
[18] | SONG Y, ZHENG S, NIU Z, et al. Communicative representation learning on attributed molecular graphs[C]. International Joint Conference on Artificial Intelligence, 2020: 2831-2838. |
[19] | XIE T, GROSSMAN J C. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties[J]. Physical Review Letters, 2018, 120(14): 145301. doi: 10.1103/PhysRevLett.120.145301 |
[20] | CHEN C, YE W, ZUO Y, et al. Graph networks as a universal machine learning framework for molecules and crystals[J]. Chemistry of Materials, 2019, 31(9): 3564-3572. doi: 10.1021/acs.chemmater.9b01294 |
[21] | PARK C W, WOLVERTON C. Developing an improved crystal graph convolutional neural network framework for accelerated materials discovery[J]. Physical Review Materials, 2020, 4(6): 063801. doi: 10.1103/PhysRevMaterials.4.063801 |
[22] | CHEN P, CHEN J, YAN H, et al. Improving material property prediction by leveraging the large-scale computational database and deep learning[J]. The Journal of Physical Chemistry C, 2022, 126(38): 16297-16305. doi: 10.1021/acs.jpcc.2c03051 |
[23] | VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[J]. arXiv preprint arXiv:1710.10903, 2017. |
[24] | COLEY C W, JIN W, ROGERS L, et al. A graph-convolutional neural network model for the prediction of chemical reactivity[J]. Chemical Science, 2019, 10(2): 370-377. doi: 10.1039/c8sc04228d pmid: 30746086 |
[25] | LOUIS S Y, ZHAO Y, NASIRI A, et al. Graph convolutional neural networks with global attention for improved materials property prediction[J]. Physical Chemistry Chemical Physics, 2020, 22(32): 18141-18148. doi: 10.1039/D0CP01474E |
[26] | VENTURI V, PARKS H L, AHMAD Z, et al. Machine learning enabled discovery of application dependent design principles for two-dimensional materials[J]. Machine Learning: Science and Technology, 2020, 1(3): 035015. doi: 10.1088/2632-2153/aba002 |
[27] | KIM C, HUAN T D, KRISHNAN S, et al. A hybrid organic-inorganic perovskite dataset[J]. Scientific Data, 2017, 4(1): 1-11. |
[28] | CASTELLI I E, LANDIS D D, THYGESEN K S, et al. New cubic perovskites for one-and two-photon water splitting using the computational materials repository[J]. Energy & Environmental Science, 2012, 5(10): 9034-9043. |
[29] | RAJAN A C, MISHRA A, SATSANGI S, et al. Machine-learning-assisted accurate band gap predictions of functionalized MXene[J]. Chemistry of Materials, 2018, 30(12): 4031-4038. doi: 10.1021/acs.chemmater.8b00686 |
[30] | NOH J, GU G H, KIM S, et al. Uncertainty-quantified hybrid machine learning/density functional theory high throughput screening method for crystals[J]. Journal of Chemical Information and Modeling, 2020, 60(4): 1996-2003. doi: 10.1021/acs.jcim.0c00003 pmid: 32208718 |
[31] | JANG J, GU G H, NOH J, et al. Structure-based synthesizability prediction of crystals using partially supervised learning[J]. Journal of the American Chemical Society, 2020, 142(44): 18836-18843. doi: 10.1021/jacs.0c07384 pmid: 33104335 |
[32] | GU G H, JANG J, NOH J, et al. Perovskite synthesizability using graph neural networks[J]. npj Computational Materials, 2022, 8(1): 71. doi: 10.1038/s41524-022-00757-z |
[33] | MORDELET F, VERT J P. A bagging SVM to learn from positive and unlabeled examples[J]. Pattern Recognition Letters, 2014, 37: 201-209. doi: 10.1016/j.patrec.2013.06.010 |
[34] | SUTTON C, BOLEY M, GHIRINGHELLI L M, et al. Identifying domains of applicability of machine learning models for materials science[J]. Nature Communications, 2020, 11(1): 4428. doi: 10.1038/s41467-020-17112-9 pmid: 32887879 |
[35] | WEISS K, KHOSHGOFTAAR M T, WANG D. A survey of transfer learning[J]. Journal of Big Data, 2016, 3(1): 1345-1359. |
[36] | DAVIES D W, BUTLER K T, JACKSON A J, et al. Computational screening of all stoichiometric inorganic materials[J]. Chem, 2016, 1(4): 617-627. doi: 10.1016/j.chempr.2016.09.010 pmid: 27790643 |
[37] | REN Z K, TIAN S I P, NOH J, et al. An invertible crystallographic representation for general inverse design of inorganic crystals with targeted properties[J]. Matter, 2022, 5(1): 314-335. doi: 10.1016/j.matt.2021.11.032 |
[38] | ZUO Y X, QIN M D, CHEN C, et al. Accelerating materials discovery with Bayesian optimization and graph deep learning[J]. Materials Today, 2021, 51: 126-135. doi: 10.1016/j.mattod.2021.08.012 |
[39] | CHENG G, GONG X G, YIN W J. Crystal structure prediction via combining graph network and Bayesian optimization[J]. arXiv preprint arXiv:2011.10968, 2020. |
[40] | LEE J, ASAHI R. Transfer learning for materials informatics using crystal graph convolutional neural network[J]. Computational Materials Science, 2021, 190: 110314. doi: 10.1016/j.commatsci.2021.110314 |
[41] | NGUYEN N, LOUIS S Y V, WEI L, et al. Predicting lattice vibrational frequencies using deep graph neural networks[J]. ACS Omega, 2022, 7(30): 26641-26649. doi: 10.1021/acsomega.2c02765 pmid: 35936410 |
[42] | LAUGIER L, BASH D, RECATALA J, et al. Predicting thermoelectric properties from crystal graphs and material descriptors-first application for functional materials[J]. arXiv preprint arXiv:1811.06219, 2018. |
[43] | RICCI F, CHEN W, AYDEMIR U, et al. An ab initio electronic transport database for inorganic materials[J]. Scientific Data, 2017, 4(1): 170085. doi: 10.1038/sdata.2017.85 |
[44] | AHMAD Z, XIE T, MAHESHWARI C, et al. Machine learning enabled computational screening of inorganic solid electrolytes for suppression of dendrite formation in lithium metal anodes[J]. ACS Central Science, 2018, 4(8): 996-1006. doi: 10.1021/acscentsci.8b00229 pmid: 30159396 |
[45] | DAS K, SAMANTA B, GOYAL P, et al. CrysXPP: An explainable property predictor for crystalline materials[J]. npj Computational Materials, 2022, 8(1): 43. doi: 10.1038/s41524-022-00716-8 |
[46] | LEVÄMÄKI H, TASNÁDI F, SANGIOVANNI D G, et al. Predicting elastic properties of hard-coating alloys using ab-initio and machine learning methods[J]. npj Computational Materials, 2022, 8(1): 17. doi: 10.1038/s41524-022-00698-7 |
[47] | SUN Y J, HU W P. Novel machine learning framework for thermal conductivity prediction by crystal graph convolution embedded ensemble[J]. SmartMat, 2022, 3(3): 474-481. doi: 10.1002/smm2.v3.3 |
[48] | XU Xingyou, DU Jiangyan. Inorganic and Analytical Chemistry[M]. Nanjing: Nanjing University Press, 2017: 410. |
[49] | CHANUSSOT L, DAS A, GOYAL S, et al. Open Catalyst 2020 (OC20) dataset and community challenges[J]. ACS Catalysis, 2021, 11(10): 6059-6072. doi: 10.1021/acscatal.0c04525 |
[50] | GODWIN J, SCHAARSCHMIDT M, GAUNT A L, et al. Simple GNN regularisation for 3d molecular property prediction and beyond[C]. International Conference on Learning Representations, 2021: 1-23. |
[51] | SHUAIBI M, KOLLURU A, DAS A, et al. Rotation invariant graph neural networks using spin convolutions[J]. arXiv preprint arXiv:2106.09575, 2021. |
[52] | SRIRAM A, DAS A, WOOD B M, et al. Towards training billion parameter graph neural networks for atomic simulations[J]. arXiv preprint arXiv:2203.09697, 2022. |
[53] | GASTEIGER J, SHUAIBI M, SRIRAM A, et al. How do graph networks generalize to large and diverse molecular systems?[J]. arXiv preprint arXiv:2204.02782, 2022. |
[54] | GASTEIGER J, BECKER F, GÜNNEMANN S. GemNet: Universal directional graph neural networks for molecules[J]. Advances in Neural Information Processing Systems, 2021, 34: 6790-6802. |
[55] | KOROVIN A N, HUMONEN I S, SAMTSEVICH A I, et al. Boosting heterogeneous catalyst discovery by structurally constrained deep learning models[J]. arXiv preprint arXiv:2207.05013, 2022. |
[56] | SRIRAM A, DAS A, WOOD B M, et al. Towards training billion parameter graph neural networks for atomic simulations[J]. arXiv preprint arXiv:2203.09697, 2022. |
[57] | KIM M, YEO B C, PARK Y, et al. Artificial intelligence to accelerate the discovery of N2 electroreduction catalysts[J]. Chemistry of Materials, 2019, 32(2): 709-720. doi: 10.1021/acs.chemmater.9b03686 |
[58] | BACK S, YOON J, TIAN N, et al. Convolutional neural network of atomic surface structures to predict binding energies for high-throughput screening of catalysts[J]. The Journal of Physical Chemistry Letters, 2019, 10(15): 4401-4408. doi: 10.1021/acs.jpclett.9b01428 |
[59] | TRAN K, ULISSI Z W. Active learning across intermetallics to guide discovery of electrocatalysts for CO2 reduction and H2 evolution[J]. Nature Catalysis, 2018, 1(9): 696-703. doi: 10.1038/s41929-018-0142-1 |
[60] | GU G H, NOH J, KIM S, et al. Practical deep-learning representation for fast heterogeneous catalyst screening[J]. The Journal of Physical Chemistry Letters, 2020, 11(9): 3185-3191. doi: 10.1021/acs.jpclett.0c00634 |
[61] | LI X, CHIONG R, HU Z, et al. Low-cost Pt alloys for heterogeneous catalysis predicted by density functional theory and active learning[J]. The Journal of Physical Chemistry Letters, 2021, 12(30): 7305-7311. doi: 10.1021/acs.jpclett.1c01851 |
[62] | KLICPERA J, GROß J, GÜNNEMANN S. Directional message passing for molecular graphs[J]. arXiv preprint arXiv:2003.03123, 2020. |
[63] | KLICPERA J, GIRI S, MARGRAF J T, et al. Fast and uncertainty-aware directional message passing for non-equilibrium molecules[J]. arXiv preprint arXiv:2011.14115, 2020. |
[64] | WU S, WANG Z, ZHANG H, et al. Deep learning accelerates the discovery of two-dimensional catalysts for hydrogen evolution reaction[J]. Energy & Environmental Materials, 2021, 8: 1-7. |
[65] | WANG S H, PILLAI H S, WANG S, et al. Infusing theory into deep learning for interpretable reactivity prediction[J]. Nature Communications, 2021, 12(1): 1-9. doi: 10.1038/s41467-020-20314-w |
[66] | SRIVASTAVA N, HINTON G E, KRIZHEVSKY A. Dropout: a simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15(1): 1929-1958. |
[67] | BROWNE M W. Cross-validation methods[J]. Journal of Mathematical Psychology, 2000, 44(1): 108-132. pmid: 10733860 |
[68] | ARLOT S, CELISSE A. A survey of cross-validation procedures for model selection[J]. Statistics Surveys, 2010, 4: 40-79. |
[69] | BOTTOU L. Stochastic gradient descent tricks[M]. Neural networks: Tricks of the trade, Springer, Berlin, Heidelberg, 2012: 421-436. |
[70] | RUDER S. An overview of gradient descent optimization algorithms[J]. arXiv preprint arXiv:1609.04747, 2016. |
[71] | KHIRIRAT S, FEYZMAHDAVIAN H R, JOHANSSON M. Mini-batch gradient descent: Faster convergence under data sparsity[C]. 2017 56th Annual Conference on Decision and Control (CDC), IEEE, 2017: 2880-2887. |
[72] | SAGI O, ROKACH L. Ensemble learning: A survey[J]. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2018, 8(4): e1249. doi: 10.1002/widm.2018.8.issue-4 |
[73] | DONG X, YU Z, CAO W, et al. A survey on ensemble learning[J]. Frontiers of Computer Science, 2020, 14(2): 241-258. doi: 10.1007/s11704-019-8208-z |
[74] | MEREDIG B, ANTONO E, CHURCH C, et al. Can machine learning identify the next high-temperature superconductor? Examining extrapolation performance for materials discovery[J]. Molecular Systems Design & Engineering, 2018, 3(5): 819-825. |
[75] | OVADIA Y, FERTIG E, REN J, et al. Can you trust your model's uncertainty? evaluating predictive uncertainty under dataset shift[J]. Advances in Neural Information Processing Systems, 2019, 32: 14003-14014. |
[76] | SUTTON C, BOLEY M, GHIRINGHELLI L M, et al. Identifying domains of applicability of machine learning models for materials science[J]. Nature Communications, 2020, 11(1): 1-9. doi: 10.1038/s41467-019-13993-7 |
[77] | MOSCATO P, HAQUE M N, HUANG K, et al. Learning to extrapolate using continued fractions: Predicting the critical temperature of superconductor materials[J]. arXiv preprint arXiv:2012.03774, 2020. |
[78] | FUNG V, ZHANG J, JUAREZ E, et al. Benchmarking graph neural networks for materials chemistry[J]. npj Computational Materials, 2021, 7(1): 1-8. doi: 10.1038/s41524-020-00473-6 |