Funding: supported by the New Century Excellent Talents in University (NCET-09-0396), the National Science & Technology Key Projects of Numerical Control (2012ZX04014-031), the Natural Science Foundation of Hubei Province (2011CDB279), and the Foundation for Innovative Research Groups of the Natural Science Foundation of Hubei Province, China (2010CDA067).
Abstract: As castings become more complicated and the demands on the precision of numerical simulation grow, the data produced by casting simulations become increasingly massive. On an ordinary personal computer, such massive data may exceed the available memory, causing rendering to fail. Based on the out-of-core technique, this paper proposes a method that effectively utilizes external storage and dramatically reduces memory usage, solving the problem of insufficient memory for massive-data rendering on ordinary personal computers. A new post-processor is developed on the basis of this method. It can visualize the filling and solidification processes of a casting, as well as thermal stress, and it provides fast interaction with the simulation results. Theoretical analysis and several practical examples show that the memory usage and loading time of the post-processor are independent of the size of the result files, depending only on the proportion of cells lying on the surface. Meanwhile, the rendering speed and the speed of fetching values at the mouse position are satisfactory, meeting the requirements of real-time interaction.
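As a rough illustration of the out-of-core idea described above (not the authors' implementation), the sketch below memory-maps a hypothetical per-cell binary result file and copies only the surface cells into RAM; the file names, index file, and cell count are all assumptions.

```python
import numpy as np

def load_surface_values(result_file, surface_index_file, n_cells, dtype=np.float32):
    """Read only the surface-cell values of one time step from a binary result file."""
    # Map the whole per-cell result array without reading it into memory.
    all_values = np.memmap(result_file, dtype=dtype, mode="r", shape=(n_cells,))
    # Indices of cells on the casting surface, assumed precomputed at meshing time.
    surface_ids = np.load(surface_index_file)
    # Fancy indexing copies only the surface subset into RAM, so memory usage
    # scales with the number of surface cells, not with the size of the file.
    return np.asarray(all_values[surface_ids])

# Hypothetical usage for one time step of a 12-million-cell casting model:
# values = load_surface_values("step_0420.bin", "surface_ids.npy", n_cells=12_000_000)
```

Because only the surface cells are ever rendered, memory usage and loading time track the surface-cell count rather than the total file size, which is the behaviour the abstract reports.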
Funding: supported by the National Natural Science Foundation of China (NSFC) (52178324).
Abstract: A large amount of data can partly ensure good fitting quality for trained neural networks. While the quantity of experimental or on-site monitoring data is often insufficient and its quality is difficult to control in engineering practice, numerical simulations can provide large amounts of controlled, high-quality data. Once neural networks are trained on such data, they can predict the properties or responses of engineering objects instantly, saving the further computing effort of simulation tools. Accordingly, a strategy for efficiently transferring the input and output data of numerical simulations to neural networks is desirable for engineers and programmers. In this work, we propose a simple image-representation strategy for numerical simulations, in which both the input and output data are represented as images. Temporal and spatial information is preserved while the data are greatly compressed. In addition, the results are readable not only by computers but also by humans. Several examples are given that demonstrate the effectiveness of the proposed strategy.
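The following sketch illustrates one possible form of such an image representation, assuming a 2D scalar field sampled at discrete time steps; the normalisation bounds, field values, and file naming are hypothetical and not taken from the paper.

```python
import numpy as np
from PIL import Image

def fields_to_images(fields, vmin, vmax, prefix="field"):
    """fields: array of shape (n_steps, height, width) holding physical values."""
    for step, field in enumerate(fields):
        # Normalise to [0, 1] with fixed bounds so all frames share one grey scale,
        # then quantise to 8-bit grey levels; vmin/vmax must be stored to invert this.
        grey = np.clip((field - vmin) / (vmax - vmin), 0.0, 1.0)
        Image.fromarray((grey * 255).astype(np.uint8), mode="L").save(f"{prefix}_{step:04d}.png")

# Synthetic example: 10 time steps of a 64x64 temperature field between 900 and 1200 K.
# fields_to_images(900 + 300 * np.random.rand(10, 64, 64), vmin=900, vmax=1200)
```

In this form the pixel grid carries the spatial information, the frame index carries the temporal information, and the 8-bit quantisation provides the compression, which mirrors the properties claimed in the abstract.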
Funding: supported in part by the Foundation for Key Program of the Chinese Ministry of Education under Grant No. 212177 and the Scientific Research Foundation of the Education Department of Shaanxi Province under Grant No. 12JK0973.
Abstract: In data post-processing for quantum key distribution, a highly efficient error reconciliation protocol is essential. Based on the key redistribution scheme, we analyze a one-way error reconciliation protocol by data simulation. The relationship between the error correction capability and the key generation efficiency of three kinds of Hamming code is demonstrated. The simulation results indicate that the key generation rate is maximized when the Hamming (31,26), (15,11), and (7,4) codes are used to correct errors for initial error rates in the ranges (0, 1.5%], (1.5%, 4%], and (4%, 11%], respectively. Based on this, we propose a modified one-way error reconciliation protocol that employs a mixed Hamming code concatenation scheme; its error correction capability and key generation rate are verified through data simulation. Using the parameters of the posterior distribution based on the tested data, a simple method for estimating the bit error rate (BER) within a given confidence interval is presented. The simulation results show that when the initial bit error rate is 10.00%, the error bits are eliminated completely after 7 rounds of error correction and the key generation rate is 10.36%; the BER expectation is 2.96×10^-10, and the corresponding BER upper limit at 95% confidence is 2.17×10^-9. By comparison, for the single (7,4) Hamming code reconciliation scheme at 95% confidence, the key generation rate is only 6.09%, the BER expectation is 5.92×10^-9, and the BER upper limit is 4.34×10^-8. Hence, the improved protocol is much better than the original one.
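As a minimal illustration of how a single Hamming block corrects one error in one-way reconciliation (a generic textbook (7,4) construction, not the paper's exact mixed concatenation scheme), the sketch below lets Bob flip the bit indicated by the difference between Alice's transmitted syndrome and his own.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i encodes position i+1 in binary,
# so the syndrome of a single-error pattern directly names the erroneous position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=np.uint8)

def syndrome(block):
    return H.dot(block) % 2  # 3-bit syndrome over GF(2)

def reconcile_block(alice_block, bob_block):
    # XOR of the two syndromes equals the syndrome of the error pattern between the blocks.
    diff = (syndrome(alice_block) + syndrome(bob_block)) % 2
    pos = int(diff[0]) * 4 + int(diff[1]) * 2 + int(diff[2])  # 0 means "no error detected"
    corrected = bob_block.copy()
    if pos:
        corrected[pos - 1] ^= 1  # flip the single bit the syndrome points to
    return corrected

# Example: a single flipped bit in Bob's 7-bit block is recovered.
# a = np.array([1, 0, 1, 1, 0, 0, 1], dtype=np.uint8); b = a.copy(); b[4] ^= 1
# assert np.array_equal(reconcile_block(a, b), a)
```

A (7,4) block corrects at most one error per seven bits, which is why the abstract matches stronger codes such as (31,26) to low initial error rates and reserves (7,4) for the highest error-rate range.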
Funding: financially supported by the National Key R&D Program of China (2018YFC1707900, 2019YFC1711000, and 2019YFC1711400), the National Natural Science Foundation of China (82003938), the State Key Program of the National Natural Science Foundation of China (81530095), and the Qi-Huang Scholar of the National Traditional Chinese Medicine Leading Talents Support Program (2018).
Abstract: Traditional Chinese medicine (TCM) has been an indispensable source of drugs for treating various human diseases. However, the inherent chemical diversity and complexity of TCM restrict the safety and efficacy of its use. Over the past few decades, the combination of liquid chromatography with mass spectrometry has contributed greatly to TCM qualitative analysis, and novel approaches have been continuously introduced to improve analytical performance, including data acquisition methods that generate large and informative datasets and data post-processing tools that extract structure-related MS information. Furthermore, fast-developing computer techniques and big data analytics have markedly enriched the data processing toolbox, bringing high efficiency and accuracy. To provide an up-to-date review of the latest techniques for TCM qualitative analysis, multiple data-independent acquisition methods and data-dependent acquisition methods (precursor ion list, dynamic exclusion, mass tag, precursor ion scan, neutral loss scan, and multiple reaction monitoring) and post-processing techniques (mass defect filtering, diagnostic ion filtering, neutral loss filtering, mass spectral tree similarity filter, molecular networking, statistical analysis, database matching, etc.) are summarized and categorized. Applications of each technique and integrated analytical strategies are highlighted, and discussion and future perspectives are provided as well.
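As a simple illustration of mass defect filtering, one of the post-processing techniques listed above, the sketch below keeps only peaks whose fractional mass lies within a fixed window around a template value; the fixed mDa window and the example masses are assumptions, and practical MDF implementations often scale the window with mass or use multiple templates per compound class.

```python
def mass_defect_filter(mz_values, template_mass, window_mda=50):
    """Keep measured m/z values whose mass defect is close to that of a template compound."""
    template_defect = template_mass - int(template_mass)  # fractional part of the template mass
    tol = window_mda / 1000.0  # convert the window from mDa to Da
    kept = []
    for mz in mz_values:
        defect = mz - int(mz)
        # Peaks far from the template's mass defect are unlikely structural analogues.
        if abs(defect - template_defect) <= tol:
            kept.append(mz)
    return kept

# Hypothetical usage: filter a peak list against an assumed template mass of 785.5013 Da.
# candidates = mass_defect_filter([785.497, 431.098, 947.555], template_mass=785.5013)
```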