Funding: Supported by the Scientific Research Program of the Science and Technology Bureau of Chongqing City (Grant No. cstc2020jcyj-msxmX0684), the Science and Technology Research Program of Chongqing Municipal Education Commission (Grant No. KJQN202000639), and in part by the National Natural Science Foundation of China (Grant No. 12147102).
Abstract: Antiferromagnetic materials are exciting quantum materials with rich physics and great potential for applications. However, an accurate and efficient theoretical method for determining their critical transition temperatures, the Néel temperatures, is highly demanded. Graph neural networks (GNNs), which succeed in predicting material properties, lose their advantage in predicting magnetic properties because datasets of magnetic materials are small, while conventional machine learning models depend heavily on the quality of hand-crafted material descriptors. We propose a new strategy for extracting high-level material representations by self-supervised training of GNNs on large-scale unlabeled datasets. Dimensional reduction analysis shows that the learned knowledge about elements and magnetism transfers to the generated atomic vector representations. Compared with popular manually constructed descriptors and crystal graph convolutional neural networks, the self-supervised material representations yield a more accurate and efficient model for Néel temperatures, and the trained model successfully predicts antiferromagnetic materials with high Néel temperatures. Our self-supervised GNN may serve as a universal pre-training framework for various material properties.
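The pretrain-then-transfer workflow the abstract describes can be sketched in miniature: pre-train an encoder on a large unlabeled set with a self-supervised (reconstruction) objective, freeze it, then fit a small supervised model on the learned representations. This is a toy illustration, not the paper's method: a linear autoencoder stands in for the GNN encoder, and all data, dimensions, and the "Néel temperature" target are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: self-supervised pretraining on a large unlabeled set ---
n_unlabeled, d_in, d_rep = 2000, 16, 4
X_big = rng.normal(size=(n_unlabeled, d_in))  # synthetic "atomic features"

W_enc = rng.normal(scale=0.1, size=(d_in, d_rep))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(d_rep, d_in))   # decoder weights
lr = 0.05
losses = []
for _ in range(300):
    Z = X_big @ W_enc                 # latent representations
    err = Z @ W_dec - X_big           # reconstruction error
    losses.append(0.5 * np.mean(err ** 2))
    # gradients of the mean squared reconstruction loss
    grad_dec = Z.T @ err / n_unlabeled
    grad_enc = X_big.T @ (err @ W_dec.T) / n_unlabeled
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# --- Stage 2: supervised transfer on a small labeled set ---
n_small = 40
X_small = rng.normal(size=(n_small, d_in))
beta_true = rng.normal(size=d_in)     # hypothetical ground-truth relation
y = X_small @ beta_true + 0.1 * rng.normal(size=n_small)  # toy targets

Z_small = X_small @ W_enc             # frozen pretrained representations
lam = 1e-2                            # ridge regularization strength
A = Z_small.T @ Z_small + lam * np.eye(d_rep)
w = np.linalg.solve(A, Z_small.T @ y) # closed-form ridge regression
pred = Z_small @ w                    # predictions from learned features
```

The key design point mirrored here is that only Stage 2 touches labels: the small labeled set trains just the `d_rep`-dimensional ridge model, while the expensive representation learning happens once on unlabeled data.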