There has been significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. Advanced statistical tools such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory also take an important place in the analysis of disease dynamics. Disease forecasting by simulation models of plant diseases has great potential in practical disease control strategies. Common mathematical tools such as the monomolecular, exponential, logistic, Gompertz, and linked differential equation models take an important place in growth curve analysis of disease epidemics. Box-and-whisker plots are suggested as a highly informative means of displaying a range of numerical data. Probable applications of recently advanced linear and non-linear mixed modelling tools, namely the linear mixed model, generalized linear model, and generalized linear mixed model, are presented. The most recent technologies such as microarray analysis, though cost-effective, provide estimates of gene expression for thousands of genes simultaneously and deserve attention from molecular biologists. Some of these advanced tools are well suited to different branches of rice research, including crop improvement, crop production, crop protection, the social sciences, and agricultural engineering. Rice research scientists should take full advantage of these opportunities and adopt these highly promising technologies when planning experimental designs and when collecting, analysing, and interpreting their research data sets.
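To make the growth-curve models named in this abstract concrete, the sketch below fits a logistic disease-progress curve to hypothetical severity data with SciPy; the observations, the fixed asymptote of 1, and the starting values are illustrative assumptions, not data from the paper.

```python
# Hedged sketch: fitting a logistic disease-progress curve y(t) = 1 / (1 + exp(-r*(t - t0)))
# to hypothetical disease-severity observations. Data and initial guesses are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, r, t0):
    """Logistic growth curve with rate r and inflection time t0 (asymptote fixed at 1)."""
    return 1.0 / (1.0 + np.exp(-r * (t - t0)))

t = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)     # days after inoculation (assumed)
y = np.array([0.01, 0.03, 0.10, 0.30, 0.62, 0.85, 0.95])  # disease severity (proportion, assumed)

params, cov = curve_fit(logistic, t, y, p0=[0.2, 21.0])
r_hat, t0_hat = params
print(f"estimated rate r = {r_hat:.3f} per day, inflection at t0 = {t0_hat:.1f} days")
```

Replacing the model function with a Gompertz form, y(t) = exp(-B * exp(-k * t)), would follow the same fitting pattern.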
Statistics is more crucial than ever due to the availability of huge amounts of data from domains such as finance, medicine, science, and engineering. Statistical data mining (SDM) is an interdisciplinary field that examines large existing databases to discover patterns and connections in the data. It differs from classical statistics in the size of the datasets and in the fact that the data were typically not gathered according to an experimental design but for other purposes. This paper therefore introduces an effective Statistical Data Mining for Intelligent Rainfall Prediction using Slime Mould Optimization with Deep Learning (SDMIRP-SMODL) model. In the presented SDMIRP-SMODL model, feature subset selection is performed by the SMO algorithm, which in turn reduces the computational complexity. For rainfall prediction, a convolutional neural network with long short-term memory (CNN-LSTM) is exploited. Finally, the study employs the pelican optimization algorithm (POA) as a hyperparameter optimizer. The experimental evaluation of the SDMIRP-SMODL approach uses a rainfall dataset comprising 23,682 samples in the negative class and 1,865 samples in the positive class. The comparative results demonstrate the superiority of the SDMIRP-SMODL model over existing techniques.
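The abstract does not spell out the network configuration, so the following is only a minimal, hedged sketch of a generic CNN-LSTM binary classifier for rainfall occurrence in Keras; the layer sizes and input shape are assumptions, and the SMO feature-selection and POA hyperparameter-tuning stages are omitted.

```python
# Minimal CNN-LSTM sketch for binary rainfall prediction (illustrative layer sizes,
# not the SDMIRP-SMODL configuration; SMO feature selection and POA tuning are omitted).
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, LSTM, Dense

n_steps, n_features = 24, 8           # e.g. 24 time steps of 8 weather features (assumed)
model = Sequential([
    Conv1D(32, kernel_size=3, activation="relu", input_shape=(n_steps, n_features)),
    MaxPooling1D(pool_size=2),
    LSTM(64),
    Dense(1, activation="sigmoid"),   # probability of rainfall
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data just to show the training call; real work would use the rainfall dataset.
X = np.random.rand(256, n_steps, n_features).astype("float32")
y = (np.random.rand(256) > 0.9).astype("float32")   # imbalanced, like the cited dataset
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```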
Due to the complex nature of multi-source geological data, it is difficult to rebuild every geological structure through a single 3D modeling method. The multi-source data interpretation method put forward in this analysis is based on a database-driven pattern and focuses on the discrete and irregular features of geological data. The geological data from a variety of sources, covering a range of accuracy, resolution, quantity and quality, are classified and integrated according to their reliability and consistency for 3D modeling. The new interpolation-approximation fitting construction algorithm for geological surfaces with the non-uniform rational B-spline (NURBS) technique is then presented. The NURBS technique can retain the balance among the requirements for accuracy, surface continuity and data storage of geological structures. Finally, four alternative 3D modeling approaches are demonstrated with reference to some examples, which are selected according to the data quantity and accuracy specification. The proposed approaches offer flexible modeling patterns for different practical engineering demands.
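SciPy has no rational (NURBS) surface fitter, so as a rough, simplified stand-in for the interpolation-approximation surface fitting described above, the sketch below fits a smoothing non-rational B-spline surface to synthetic horizon-depth points; all coordinates, depths, and the smoothing factor are invented.

```python
# Simplified stand-in for NURBS surface fitting: a smoothing bivariate B-spline fitted to
# scattered elevation points of a hypothetical geological horizon (all data are synthetic).
import numpy as np
from scipy.interpolate import bisplrep, bisplev

rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 200)                              # easting (m), synthetic sample locations
y = rng.uniform(0, 1000, 200)                              # northing (m)
z = -200 + 0.05 * x - 0.03 * y + rng.normal(0, 2, 200)     # horizon depth (m) with noise

# Cubic B-spline; the smoothing factor s trades fitting accuracy against surface continuity.
tck = bisplrep(x, y, z, kx=3, ky=3, s=800)
grid_x = np.linspace(0, 1000, 50)
grid_y = np.linspace(0, 1000, 50)
surface = bisplev(grid_x, grid_y, tck)                     # surface evaluated on a regular grid
print(surface.shape)                                       # (50, 50)
```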
In order to detect faults accurately and quickly, cusp catastrophe theory is used to interpret 3D coal seismic data in this paper. By establishing a cusp model, the seismic signal is transformed into the standard form of the cusp catastrophe, and catastrophe parameters, including time-domain catastrophe potential, time-domain catastrophe time, frequency-domain catastrophe potential and frequency-domain degree, are calculated. Catastrophe theory is then applied to 3D seismic structural interpretation in coal mines. The results show that the position of anomalies on the catastrophe parameter profiles or curves is related to the location of faults, and that cusp catastrophe theory is effective for automatically extracting geological information and improving interpretation precision for 3D seismic data.
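For reference, the canonical (standard) form of the cusp catastrophe onto which the seismic signal is mapped can be written as below; the symbols u and v for the two control parameters follow common convention rather than the paper's own notation.

```latex
% Canonical cusp catastrophe: potential, equilibrium surface and bifurcation set
V(x; u, v) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}u\,x^{2} + v\,x,
\qquad
\frac{\partial V}{\partial x} = x^{3} + u\,x + v = 0 \;\; \text{(equilibrium surface)},
\qquad
4u^{3} + 27v^{2} = 0 \;\; \text{(bifurcation set)}.
```

Abrupt jumps of the state variable x occur when the control parameters cross the bifurcation set, which is the kind of abrupt change the computed catastrophe parameters are meant to flag.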
Traffic tunnels include tunnel works for traffic and transport in the areas of railway, highway, and rail transit. With many mountains and nearly one fifth of the global population, China possesses numerous large cities and megalopolises with rapidly growing economies and huge traffic demands. As a result, a great number of railway, highway, and rail transit facilities are required in this country. In the past, the construction of these facilities mainly involved subgrade and bridge works; in recent years.
A statistical study is first performed of the efficiency of the technique of statistical interpretation of NWP products. The results show that the application of the technique has improved the predictability of predictors in objective forecasting of tropical cyclone motion, increased the forecasting skill of models and extended the valid period of forecasts. Some technical problems in applying the technique to motion forecasting are then discussed, with the following suggestions: a large sample of data and the perfect forecast method should be employed in constructing objective forecast models for tropical cyclone motion; predictors should be built finely enough to reflect all synoptic features and physical quantity fields; and NWP products available at multiple times should be used and corrected. Composite use of the outcomes of various forecasting models is one of the effective ways to improve the skill and stability of the forecast.
The statistical map is usually used to indicate the quantitative features of various socio-economic phenomena among regions on a base map of administrative divisions or on other base maps connected with statistical units. Making use of geographic information system (GIS) techniques and supported by AutoCAD software, the author of this paper puts forward a practical method for making statistical maps and has developed software (SMT), written in the C language, for the making of small-scale statistical maps.
The development of adaptation measures to climate change relies on data from climate models or impact models. In order to analyze these large data sets, or an ensemble of these data sets, the use of statistical methods is required. In this paper, the methodological approach to collecting, structuring and publishing the methods that have been used or developed by former or present adaptation initiatives is described. The intention is to communicate the knowledge achieved and thus support future users. A key component is the participation of users in the development process. The main elements of the approach are standardized, template-based descriptions of the methods, including the specific applications, references, and method assessment. All contributions have been quality checked, sorted, and placed in a larger context. The result is a report on statistical methods which is freely available in printed and online versions. Examples of how to use the methods are presented in this paper and are also included in the brochure.
Geomechanical data are never sufficient in quantity or adequately precise and accurate for design purposes in mining and civil engineering. The objective of this paper is to show the variability of rock properties at the sampled point in the roadway's roof, and then how the statistical processing of the available geomechanical data can affect the results of numerical modelling of the roadway's stability. Four cases were applied in the numerical analysis, using the average value (the most common choice in geomechanical data analysis), the average minus the standard deviation, the median, and the average minus the statistical error. The study shows that different approaches to the same geomechanical data set can change the modelling results considerably. The average-minus-standard-deviation case is the most conservative and least risky: it gives a displacement and yielded-element zone four times broader than the average-value scenario, which is the least conservative option. The two other cases need to be studied further; the results obtained from them lie between the most favourable and most adverse values. Taking the average values corrected by the statistical error for the numerical analysis seems to be the best solution. Moreover, the confidence level can be adjusted depending on the importance of the object and the assumed risk level.
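The four data-reduction choices compared above are straightforward to reproduce; the sketch below computes them for a hypothetical sample of roof-rock strength values, taking the standard error of the mean as the "statistical error" (both the data and that reading of the term are assumptions).

```python
# Four ways of collapsing a geomechanical sample into a single design value,
# mirroring the cases compared above (sample values are hypothetical).
import numpy as np
from scipy import stats

ucs = np.array([38.2, 41.5, 35.9, 44.1, 39.7, 36.8, 42.3, 40.0])  # e.g. roof-rock UCS, MPa (assumed)

mean = ucs.mean()
sd = ucs.std(ddof=1)                    # sample standard deviation
sem = stats.sem(ucs)                    # standard error of the mean ("statistical error")

print(f"average:                 {mean:.2f} MPa")
print(f"average - std deviation: {mean - sd:.2f} MPa   (most conservative case)")
print(f"median:                  {np.median(ucs):.2f} MPa")
print(f"average - std error:     {mean - sem:.2f} MPa   (suggested compromise)")
```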
Predicting the seeing of astronomical observations can provide hints of the quality of optical imaging in the near future and facilitate flexible scheduling of observation tasks to maximize the use of astronomical observatories. Traditional approaches to seeing prediction mostly rely on regional weather models to capture the in-dome optical turbulence patterns. Thanks to the development of data gathering and aggregation facilities at astronomical observatories in recent years, data-driven approaches are becoming increasingly feasible and attractive for predicting astronomical seeing. This paper systematically investigates data-driven approaches to seeing prediction by leveraging various big data techniques, from traditional statistical modeling and machine learning to newly emerging deep learning methods, on the monitoring data of the Large sky Area Multi-Object fiber Spectroscopic Telescope (LAMOST). The raw monitoring data are preprocessed to allow for big data modeling. We then formulate the seeing prediction task under each type of modeling framework and develop seeing prediction models using representative big data techniques, including ARIMA and Prophet for statistical modeling, MLP and XGBoost for machine learning, and LSTM, GRU and Transformer for deep learning. We perform empirical studies on the developed models with a variety of feature configurations, yielding notable insights into the applicability of big data techniques to the seeing prediction task.
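As one hedged illustration of the statistical-modeling branch listed above, the snippet below fits an ARIMA model to a synthetic seeing series with statsmodels; the (2, 1, 1) order and the data are assumptions, not the configuration used on the LAMOST monitoring data.

```python
# ARIMA sketch for seeing prediction on a synthetic series (order and data are illustrative;
# the paper's experiments also cover Prophet, MLP, XGBoost, LSTM, GRU and Transformer).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
# Synthetic hourly "seeing" values (arcsec): slow drift plus noise, purely for demonstration.
t = pd.date_range("2023-01-01", periods=500, freq="h")
seeing = pd.Series(1.2 + 0.3 * np.sin(np.arange(500) / 24.0) + rng.normal(0, 0.05, 500), index=t)

model = ARIMA(seeing, order=(2, 1, 1)).fit()
forecast = model.forecast(steps=12)      # predict the next 12 hours of seeing
print(forecast.round(3))
```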
Extracting and parameterizing ionospheric waves globally and statistically is a longstanding problem. Building on the multichannel maximum entropy method (MMEM) used in previous work to study ionospheric waves, we calculate the parameters of ionospheric waves by applying the MMEM to numerous temporally approximate and spatially close Global Positioning System radio occultation total electron content profile triples provided by the unique clustered-satellite flight between 2006 and 2007, right after the launch of the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) mission. The results show that the amplitude of ionospheric waves increases at low and high latitudes (~0.15 TECU) and decreases in the mid-latitudes (~0.05 TECU). The vertical wavelength of the ionospheric waves increases in the mid-latitudes (e.g., ~50 km at altitudes of 200–250 km) and decreases at low and high latitudes (e.g., ~35 km at altitudes of 200–250 km). The horizontal wavelength shows a similar pattern (e.g., ~1400 km in the mid-latitudes and ~800 km at low and high latitudes).
We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying the application of these methods are established. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer "Chebyshev" was employed.
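A minimal sketch of one of the methods named above, random forests, applied to synthetic genotype data is given below; the SNP coding, the non-genetic covariate, and the simulated interaction effect are all illustrative assumptions, not the authors' pipeline or data.

```python
# Minimal random-forest sketch on synthetic SNP genotypes (0/1/2 minor-allele counts)
# plus one non-genetic covariate; illustrates one of the methods named above only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n, p = 500, 20
snps = rng.integers(0, 3, size=(n, p))            # 20 SNPs coded as minor-allele counts
age = rng.normal(55, 10, size=(n, 1))             # a non-genetic risk factor
X = np.hstack([snps, age])
# Disease status depends on an interaction of two SNPs plus age (synthetic ground truth).
risk = 0.8 * (snps[:, 0] * snps[:, 1]) + 0.03 * age[:, 0] + rng.normal(0, 1, n)
y = (risk > np.median(risk)).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
clf.fit(X, y)
print("top features by importance:", np.argsort(clf.feature_importances_)[::-1][:5])
```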
We investigate the major characteristics of the occurrences, causes of, and countermeasures for aircraft accidents in Japan. We apply statistical data analysis and mathematical modeling techniques to determine the relations among economic growth, aviation demand, the frequency of aircraft/helicopter accidents, the major characteristics of the occurrence intervals of accidents, and the number of fatalities due to accidents. The statistical model analysis suggests that the occurrence intervals of accidents and the number of fatalities can be explained by probability distributions such as the exponential distribution and the negative binomial distribution, respectively. We show that countermeasures for preventing accidents have been developed for every aircraft model, and that they have contributed to a significant decrease in the number of accidents over the last three decades. We find that the major cause of accidents involving large airplanes has been weather, while accidents involving small airplanes and helicopters are mainly due to pilot error. We also discover that, with respect to accidents mainly due to pilot error, there is a significant decrease in the number of accidents due to the aging of airplanes, whereas the number of accidents due to weather has barely declined. We further determine that accidents involving small and large airplanes mostly occur during takeoff and landing, whereas those involving helicopters are most likely to happen during flight. In order to decrease the number of accidents, it is absolutely necessary to i) enhance safety and security by further developing technologies for aircraft, airports and air control radars, ii) establish and improve training methods for crew, including pilots, mechanics and traffic controllers, iii) tighten public rules, and iv) strengthen the efforts made by individual aviation-related companies.
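To illustrate the two distributions named above, the sketch below fits an exponential distribution to invented accident-occurrence intervals (maximum likelihood via SciPy) and a negative binomial to invented fatality counts by the method of moments, since SciPy's discrete distributions provide no fit method.

```python
# Hedged sketch: exponential fit to accident-occurrence intervals and a method-of-moments
# negative binomial fit to fatality counts. All numbers are invented for illustration.
import numpy as np
from scipy import stats

intervals = np.array([120, 45, 300, 210, 95, 160, 410, 75, 230, 180], dtype=float)  # days (assumed)
fatalities = np.array([0, 0, 2, 1, 0, 5, 0, 3, 1, 0, 0, 7, 2, 0, 1])                # per accident (assumed)

# Exponential: MLE of the scale (mean interval), location fixed at zero.
loc, scale = stats.expon.fit(intervals, floc=0)
print(f"mean interval between accidents ~ {scale:.1f} days")

# Negative binomial via method of moments: p = m/v, n = m*p/(1-p), valid when variance > mean.
m, v = fatalities.mean(), fatalities.var(ddof=1)
p = m / v
n = m * p / (1 - p)
print(f"negative binomial parameters: n ~ {n:.2f}, p ~ {p:.2f}")
```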
The most common way to analyze economic data is to use statistics software and spreadsheets. This paper presents the opportunities that modern Geographical Information Systems (GIS) offer for the analysis of marketing, statistical, and macroeconomic data. It considers existing tools and models and their applications in various sectors. The advantage is that statistical data can be combined with geographic views, maps, and additional data derived from the GIS. As a result, a programming system is developed that uses GIS for the analysis of marketing, statistical, and macroeconomic data, and for real-time risk assessment and prevention. The system has been successfully implemented as a web-based software application designed for use with a variety of hardware platforms (mobile devices, laptops, and desktop computers). The software is mainly written in the programming language Python, which offers good structure and support for the development of large applications. Optimization of the analysis and visualization of macroeconomic and statistical data by region for different business research tasks is achieved. The system includes a Geographical Information System for settlements in their respective countries and regions. Integration with external software packages for statistical calculations and analysis is implemented in order to share data analysis, processing, and forecasting. Technologies and processes for loading data from different sources and tools for data analysis are developed. The successfully developed system allows the implementation of qualitative data analysis.
This paper analyzes the application value of statistical analysis methods for big data in economic management from both macro and micro perspectives, and analyzes their specific application in three areas: economic trends, industrial operations, and marketing strategies.
Results of research on the statistical reasoning that six high school teachers developed in a computer environment are presented in this article. A sequence of three activities supported by the software Fathom was presented to the teachers in a course, in order to investigate the reasoning teachers develop about data analysis, particularly about the concept of distribution, which involves important concepts such as averages, variability and graphical representations. The activities were designed so that the teachers first analyzed quantitative variables separately, and later analyzed a qualitative variable versus a quantitative variable with the objective of comparing distributions, using concepts such as averages, variability, shape and outliers. The instructions in each activity directed the teachers to use all the resources of the software necessary to carry out the complete analysis and to answer certain questions intended to capture the type of representations they used. The results indicate that, despite the abundance of representations provided by the software, teachers focus on the calculation of averages to describe and compare distributions, rather than on important properties of the data such as variability, shape and outliers. Many teachers were able to build interesting graphs reflecting important properties of the data, but could not use them to support data analysis. Hence, it is necessary to extend the teachers' understanding of data analysis so they can take advantage of the cognitive potential that computer tools offer.
According to statistics of Printing and Printing Equipment Industries Association of China (PEIAC), the total output value of printing industry of China in 2007 reached 440 billion RMB, the total output value of printing equipment was
Two statistical validation methods were used to evaluate the confidence level of the Total Column Ozone (TCO) measurements recorded by satellite systems measuring simultaneously, one using the normal distribution and the other the Mann-Whitney test. First, the reliability of the TCO measurements was studied hemispherically. While similar coincidences and significance levels > 0.05 were found with the two statistical tests, an enormous variability in the significance levels throughout the year was also revealed. Then, using the same statistical comparison methods, a latitudinal study was carried out in order to elucidate the geographical distribution that gave rise to this variability. Our study reveals that the TOMS and OMI measurements in 2005 coincided at only 50% of the latitudes, which explains the variability. This implies that for 2005 the TOMS measurements are not completely reliable, except in the -50° to -15° latitude band in the Southern Hemisphere and the +15° to +50° latitude band in the Northern Hemisphere. In the case of OMI-OMPS, we observe that between 2011 and 2016 the measurements of both satellite systems are reasonably similar, with a confidence level higher than 95%. However, in 2017 a band 20° of latitude wide centered on the equator appeared in which the significance levels were much less than 0.05, indicating that one of the measurement systems had begun to fail. In 2018, the fault was not only located at the equator, but was also replicated in various bands in the Southern Hemisphere. We interpret this as evidence of irreversible failure in one of the measurement systems.
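A minimal sketch of the two comparison approaches mentioned above is given below, applied to synthetic daily total-column-ozone series from two hypothetical instruments; a two-sample t test stands in for the normal-distribution comparison, and the data are not TOMS/OMI/OMPS retrievals.

```python
# Hedged sketch of the two comparison approaches, applied to synthetic daily
# total-column-ozone values (Dobson units) from two hypothetical instruments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
tco_a = rng.normal(290, 12, 365)          # instrument A, DU (synthetic)
tco_b = rng.normal(291, 12, 365)          # instrument B, DU (synthetic)

# Parametric comparison assuming (approximate) normality of the daily values.
t_stat, p_norm = stats.ttest_ind(tco_a, tco_b)
# Distribution-free comparison with the Mann-Whitney U test.
u_stat, p_mw = stats.mannwhitneyu(tco_a, tco_b, alternative="two-sided")

for name, p in [("normal-theory t test", p_norm), ("Mann-Whitney test", p_mw)]:
    verdict = "consistent" if p > 0.05 else "significantly different"
    print(f"{name}: p = {p:.3f} -> measurements {verdict} at the 95% level")
```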
The loess plateau covering the North Shaanxi slope and Tianhuan depression consists of a regional monocline, high in the east and low in the west, with dips of less than 1°. Structural movement in this region was weak so that faults and local structures were not well developed. As a result, numerous wide and gentle noses and small traps with magnitudes less than 50 m were developed on the large westward-dipping monocline. Reservoirs, including Mesozoic oil reservoirs and Paleozoic gas reservoirs in the Ordos Basin, are dominantly lithologic with a small number of structural reservoirs. Single reservoirs are characterized as thin with large lateral variations, strong anisotropy, low porosity, low permeability, and low richness. A series of approaches for predicting reservoir thickness, physical properties, and hydrocarbon potential of subtle lithologic reservoirs was established based on the interpretation of erosion surfaces.