Funding: the Special Program for Applied Research on Super Computation of the NSFC-Guangdong Joint Fund (second phase); U.S. Department of Energy contract DE-AC02-06CH11357; General Financial Grant No. 2015M570884 and Special Financial Grant No. 2016T90009 from the China Postdoctoral Science Foundation; a Marie Skłodowska-Curie European Fellowship (EU project 656869); MoST 863 program 2012AA121701; NSFC grant 11373030; CAS grant QYZDJ-SSW-SLH017; the National Natural Science Foundation of China (Grant Nos. 11573006, 11528306, 10473002 and 11135009); the National Basic Research Program of China (973 Program) under grant No. 2012CB821804; and the Fundamental Research Funds for the Central Universities. SciNet is funded by the Canada Foundation for Innovation under the auspices of Compute Canada, the Government of Ontario, the Ontario Research Fund Research Excellence, and the University of Toronto.
Abstract: Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure, and achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high-performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world's largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges encountered during the TianNu runtime.
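The abstract states only the compressed footprint (24 bytes down to roughly 9), not the encoding itself. As an illustrative sketch — the exact bit layout below is an assumption, not taken from the paper — a cell-relative fixed-point scheme can reach 9 bytes per particle by storing each position coordinate as a 2-byte offset within its mesh cell and each velocity component as a 1-byte quantized integer (3×2 + 3×1 = 9 bytes, versus 6 single-precision floats = 24 bytes):

```python
import numpy as np

CELL = 1.0  # mesh-cell width in simulation units (illustrative choice)

def compress(pos, vel, vmax):
    """Quantize positions to uint16 offsets within their mesh cell and
    velocities to int8 fractions of a cutoff vmax.

    In a production code the cell index need not be stored per particle:
    particles can be kept in cell order so the index is implicit (an
    assumption about how the per-particle cost stays at 9 bytes)."""
    cell_idx = np.floor(pos / CELL).astype(np.int64)   # which cell each coordinate falls in
    frac = pos / CELL - cell_idx                       # fractional offset in [0, 1)
    qpos = np.round(frac * 65535).astype(np.uint16)    # 2 bytes per dimension
    qvel = np.clip(np.round(vel / vmax * 127),
                   -127, 127).astype(np.int8)          # 1 byte per dimension
    return cell_idx, qpos, qvel

def decompress(cell_idx, qpos, qvel, vmax):
    """Invert the quantization; position error is bounded by CELL/65536
    per coordinate and velocity error by vmax/127 per component."""
    pos = (cell_idx + qpos / 65535.0) * CELL
    vel = qvel / 127.0 * vmax
    return pos, vel
```

The trade-off in such a scheme is a fixed, controllable quantization error in exchange for a ~2.7x reduction in memory and I/O volume, which is what makes trillion-particle runs feasible on a fixed memory budget.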