Some geophysical parameters, such as those related to gravitation and the geomagnetic field, could change during solar eclipses. In order to observe geomagnetic fluctuations, geomagnetic measurements were carried out in a limited time frame during the partial solar eclipse that occurred on 2011 January 4 and was observed in Canakkale and Ankara, Turkey. Additionally, records of the geomagnetic field spanning 24 hours, obtained from another observatory (in Iznik, Turkey), were also analyzed to check for any peculiar variations. In the data processing stage, a polynomial fit, following the application of a running average routine, was applied to the geomagnetic field data sets. The geomagnetic field data sets indicated a characteristic decrease at the beginning of the solar eclipse, and this decrease correlates well with previous geomagnetic field measurements taken during the total solar eclipse observed in Turkey on 2006 March 29. The behavior of the geomagnetic field is also consistent with previous observations in the literature. As a result of these analyses, it can be suggested that eclipses can cause a shielding effect on the geomagnetic field of the Earth.
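The two processing steps named here (a running average, then a polynomial fit whose residual exposes short-term fluctuations) can be sketched as below. The window length, polynomial degree, and the synthetic 24-hour series are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def smooth_and_detrend(t, b, window=5, deg=3):
    """Running average followed by a polynomial fit; the residual
    (smoothed minus trend) is what would carry eclipse-time dips."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(b, kernel, mode="same")
    coeffs = np.polyfit(t, smoothed, deg)
    trend = np.polyval(coeffs, t)
    return smoothed, trend, smoothed - trend

# synthetic stand-in for a 24-hour geomagnetic record (nT)
t = np.linspace(0.0, 24.0, 200)
b = 47000 + 5 * np.sin(t) + np.random.default_rng(0).normal(0, 1, t.size)
smoothed, trend, residual = smooth_and_detrend(t, b)
```

Because the polynomial includes a constant term, the least-squares residual averages to zero, so any localized decrease stands out against it.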
In this paper a stochastic boundary element method (SBEM) is developed to analyze moderately thick plates with random material parameters and random thickness. Based on the Taylor series expansion, the boundary integration equations concerning the mean and deviation of the generalized displacements are derived, respectively. It is found that the randomness of the material parameters is equivalent to a random load, so the mean and covariance matrices of the unknown generalized boundary displacements and tractions can be obtained. Furthermore, the mean and covariance of the generalized displacements and forces at internal points can also be obtained. A numerical example is worked out with the proposed method and the results are analyzed.
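The Taylor-expansion idea behind this abstract — propagate the mean and covariance of random inputs through a response function via its first-order expansion — can be sketched generically. The function `taylor_moments` and the linear test map are illustrative stand-ins, not the paper's boundary-element formulation.

```python
import numpy as np

def taylor_moments(f, mu, cov, h=1e-6):
    """First-order Taylor estimate of the mean and covariance of f(x)
    for a random input x with mean mu and covariance cov:
    E[f] ~= f(mu),  Cov[f] ~= J cov J^T, with J the Jacobian at mu."""
    mu = np.asarray(mu, float)
    f0 = np.atleast_1d(f(mu))
    jac = np.empty((f0.size, mu.size))
    for j in range(mu.size):
        dx = np.zeros_like(mu)
        dx[j] = h
        jac[:, j] = (np.atleast_1d(f(mu + dx)) - np.atleast_1d(f(mu - dx))) / (2 * h)
    return f0, jac @ np.asarray(cov, float) @ jac.T

# for a linear response the first-order result is exact
mean, cov_out = taylor_moments(lambda x: np.array([2 * x[0] + x[1]]),
                               [1.0, 2.0],
                               [[0.04, 0.0], [0.0, 0.01]])
```

For the linear map above, the output variance is 2^2 * 0.04 + 1^2 * 0.01 = 0.17, which the sketch reproduces.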
Changes in market conditions have gradually shifted competition among enterprises toward talent, technology, culture, and other dimensions; talent and technology have become important embodiments of an enterprise's core competitiveness. In this context, the role of human resource management is becoming more and more important, which requires timely improvement and innovation of the traditional human resource management mode so as to raise the management level and efficiency. Applying statistical analysis methods to the human resource management of modern enterprises can give full play to the role and value of various data, remedy the disadvantages and defects of traditional human resource management methods, and significantly enhance the level and efficiency of human resource management. This paper analyzes specific application strategies in detail, aiming to provide guidance and reference.
A method named the interval analysis method, which solves for the buckling load of composite laminates with uncertainties, is presented. Based on interval mathematics and the Taylor series expansion, the interval analysis method is used to deal with uncertainties. The probabilistic statistics of the uncertain variables need not be known; only limited information on the physical properties of the material is required, namely the upper and lower bounds of each uncertain variable. The interval of the structural response can therefore be obtained with less computational effort. The interval analysis method is efficient when the probabilistic approach cannot work well because of small samples and deficient statistical characteristics. For the buckling load of special cross-ply laminates and antisymmetric angle-ply laminates with all edges simply supported, calculations and comparisons between the interval analysis method and the probability method are performed.
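The first-order interval idea — inputs known only as [center - radius, center + radius], output bounds from the gradient at the center — can be sketched as follows. The response function here is an arbitrary illustrative stand-in for the buckling-load model, not the laminate equations themselves.

```python
import numpy as np

def interval_response(f, center, radius, h=1e-6):
    """First-order Taylor interval bounds on f:
    f(x) in [f(c) - sum|df/dx_i| r_i, f(c) + sum|df/dx_i| r_i]."""
    center = np.asarray(center, float)
    radius = np.asarray(radius, float)
    f0 = f(center)
    grad = np.array([(f(center + h * e) - f(center - h * e)) / (2 * h)
                     for e in np.eye(center.size)])
    spread = np.sum(np.abs(grad) * radius)
    return f0 - spread, f0 + spread

# toy response x0 * x1 with x0 = 10 +/- 0.5, x1 = 2 +/- 0.1
lo, hi = interval_response(lambda x: x[0] * x[1],
                           np.array([10.0, 2.0]), np.array([0.5, 0.1]))
```

Here the gradient at the center is (2, 10), so the spread is 2·0.5 + 10·0.1 = 2, giving the interval [18, 22].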
The extraction of high-temperature regions in active regions (ARs) is an important means to help understand the mechanism of coronal heating. An important observational tracer of high-temperature radiation in ARs is the main emission line of Fe XVIII in the 94 Å band of the Atmospheric Imaging Assembly. However, the diagnostic algorithms for Fe XVIII, including the differential emission measure (DEM) and the linear diagnostics proposed by Del Zanna based on the DEM, have long been limited, and the results obtained differ from predictions. In this paper, building on previous research, we use an outlier detection method to establish the nonlinear correlation between the 94 Å band and the 171, 193, and 211 Å bands. A neural network based on 171, 193, and 211 Å is constructed to reproduce the low-temperature emission in the ARs of the 94 Å band. The predicted results are regarded as the low-temperature component of 94 Å and are then subtracted from 94 Å to obtain its outlier component, i.e., Fe XVIII. The outlier components obtained by the neural network are then compared with the Fe XVIII obtained by the DEM and by Del Zanna's method; a high similarity is found, which supports the reliability of the neural network for extracting the high-temperature components of ARs, although many differences remain. To analyze the differences among the Fe XVIII maps obtained by the three methods, we subtract the Fe XVIII obtained by the DEM and by Del Zanna's method from that obtained by the neural network and compare the residuals with the Fe XIV results in the temperature range log T = 6.1-6.45. A great similarity is found, which shows that the Fe XVIII obtained by the DEM and Del Zanna's method still contains a large low-temperature component dominated by Fe XIV, whereas the Fe XVIII obtained by the neural network is relatively pure.
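The subtraction scheme above — predict the cool-plasma part of the 94 Å signal from the 171/193/211 Å channels, then treat the positive residual as Fe XVIII — can be sketched with a linear regression standing in for the neural network. All intensities below are synthetic and the coefficients are invented; this is the structure of the method, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# synthetic stand-ins for 171, 193, 211 A intensities per pixel
cool = rng.lognormal(0.0, 0.3, (n, 3))
hot = np.zeros(n)
hot[:50] = rng.lognormal(0.5, 0.2, 50)        # sparse hot (Fe XVIII-like) pixels
# 94 A = cool-channel combination plus the hot component
i94 = 0.3 * cool[:, 0] + 0.2 * cool[:, 1] + 0.1 * cool[:, 2] + hot

# fit the cool-dominated relation on pixels assumed free of hot plasma,
# then take the clipped residual as the outlier (Fe XVIII-like) component
A = np.hstack([cool, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(A[50:], i94[50:], rcond=None)
fe18 = np.clip(i94 - A @ coef, 0.0, None)
```

On this toy cube the residual is essentially zero on cool pixels and recovers the injected hot component on the rest, which is exactly the separation the abstract describes.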
In order to study China's bryophytes, this paper uses bibliometrics for a statistical analysis of the literature on China's bryophytes during 2005-2015. The results show that, in terms of the distribution of published articles across journals, there are 13 journals with more than 5 papers on bryophytes, accounting for 32.5%. In terms of the number of papers published per year, the count was smallest in 2005, at only 16, and reached its maximum of 33 in 2008. In terms of the number of papers per first author, most authors published fewer than 9 papers, accounting for 87.5%; only one author published exactly 9 papers, and 5 people published more than 9 papers. In terms of author-unit distribution, among the 278 articles collected, 12 units published fewer than 6 papers, accounting for 30%; the unit publishing the most papers (36) is Guizhou Normal University; 5 units published 6 papers, accounting for 12.5%; and the units publishing fewer than 6 papers account for 57.5%. In terms of research level, most papers concern basic and applied basic research (natural science), accounting for 91.2%; papers on engineering and technology (natural science) account for 5.5%; and other papers account for 3.3%.
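Bibliometric tallies of this kind (papers per year, per first author, per unit) reduce to frequency counts over the record set. A minimal sketch, with an invented five-record corpus standing in for the 278 collected articles:

```python
from collections import Counter

# toy records: (year, first_author, unit), standing in for the 2005-2015 corpus
records = [
    (2005, "A", "U1"),
    (2008, "B", "U2"),
    (2008, "A", "U1"),
    (2008, "C", "U3"),
    (2010, "A", "U1"),
]

by_year = Counter(year for year, _, _ in records)
by_author = Counter(author for _, author, _ in records)
by_unit = Counter(unit for _, _, unit in records)
peak_year = max(by_year, key=by_year.get)
```

On the toy corpus, 2008 is the peak year with 3 papers, mirroring the kind of statement made in the abstract.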
We take papers on the performance appraisal of land consolidation (including land consolidation benefits, etc.) published in core journals in the Chinese Journal Full-text Database as the study samples, and conduct analysis in terms of the number of papers, source journals, impact, research methods, research topics, and foundation projects, through literature search and statistical analysis. The results show that scholars pay more and more attention to the performance appraisal of land consolidation, and the number of papers has increased overall; core journals on agriculture, land, environment, economy, and other areas put increasing emphasis on publishing papers concerning the performance appraisal of land consolidation; in terms of citation frequency, the impact of the papers is wide, but its depth is insufficient; the research methods are increasingly diverse and the research topics concentrated; and foundation support for this research is yet to be strengthened.
This paper is devoted to the homogenization and statistical multiscale analysis of a transient heat conduction problem in random porous materials with a nonlinear radiation boundary condition. A novel statistical multiscale analysis method based on the two-scale asymptotic expansion is proposed. In the statistical multiscale formulations, a unified linear homogenization procedure is established and second-order correctors are introduced for modeling the nonlinear radiative heat transfer in random perforations, which are our main contributions. Besides, a numerical algorithm based on the statistical multiscale method is given in detail. Numerical results demonstrate the accuracy and efficiency of our method for multiscale simulation of transient nonlinear conduction and radiation heat transfer problems in random porous materials.
This paper focuses on the dynamic thermo-mechanical coupled response of random particulate composite materials. Both the inertia term and the coupling term are considered in the dynamic coupled problem. The formulation of the problem by a statistical second-order two-scale (SSOTS) analysis method and the algorithm procedure based on the finite-element difference method are presented. Numerical results of coupled cases are compared with those of uncoupled cases. They show that the coupling effects on temperature, thermal flux, displacement, and stresses are very distinct, and that the micro-characteristics of the particles affect the coupling effect of the random composites. Furthermore, the coupling effect causes a lag in the variations of temperature, thermal flux, displacement, and stresses.
Noise is a significant part of a millimeter-wave molecular line datacube. Analyzing the noise improves our understanding of its characteristics, and further contributes to scientific discoveries. We measure the noise level of a single datacube from MWISP and perform statistical analyses. We identified the major factors that increase the noise level of a single datacube, including bad channels, edge effects, baseline distortion, and line contamination. Cleaning algorithms are applied to remove or reduce these noise components. As a result, we obtained a cleaned datacube in which the noise follows a positively skewed normal distribution. We further analyzed the noise structure of a 3D mosaicked datacube in the range l = 40°.7 to 43°.3 and b = -2°.3 to 0°.3, and found that noise in the final mosaicked datacube is mainly characterized by noise fluctuation among the cells.
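One standard ingredient of the cleaning described above — measuring a per-channel noise level robustly and flagging bad channels — can be sketched on a toy cube. The cube shape, noise level, and 3-sigma-of-the-median threshold are illustrative assumptions, not the MWISP pipeline's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
cube = rng.normal(0.0, 0.3, (400, 16, 16))   # toy datacube: (channel, y, x)
cube[100] *= 10.0                            # inject one noisy "bad channel"

# per-channel robust noise: 1.4826 * MAD approximates sigma for Gaussian noise
med = np.median(cube, axis=(1, 2), keepdims=True)
channel_rms = np.median(np.abs(cube - med), axis=(1, 2)) * 1.4826

# flag channels whose noise level is far above the typical one
bad = np.where(channel_rms > 3 * np.median(channel_rms))[0]
```

The MAD-based estimate is insensitive to the outlier channel itself, so the threshold derived from the median RMS isolates exactly the injected bad channel.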
In recent years, aluminum-matrix composites (AMCs) have been widely used to replace cast iron in the aerospace and automotive industries. Machining these composite materials requires a better understanding of cutting processes regarding accuracy and efficiency. This study addresses the modeling of the machinability of self-lubricated aluminum/alumina/graphite hybrid composites synthesized by the powder metallurgy method. Multiple regression analysis (MRA) and artificial neural networks (ANN) were used to investigate the influence of several parameters on the thrust force and torque in drilling self-lubricated hybrid composite materials. The models were identified by using cutting speed, feed, and volume fraction of the reinforcement particles as input data and the thrust force and torque as output data. The two prediction methods were compared for accuracy; ANNs showed better predictability than MRA owing to their nonlinear nature. The statistical analysis, together with the artificial neural network results, showed that Al2O3, Gr, and cutting feed (f) were the most significant parameters in the drilling process, while spindle speed appeared insignificant. Since the spindle speed was insignificant, it can be set either at the highest value to obtain a high material removal rate or at the lowest value to prolong tool life, depending on the application.
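The MRA side of this comparison — fit a linear model of thrust force on the drilling parameters and read significance off the coefficients — can be sketched on synthetic data. The coefficients, parameter ranges, and the deliberately irrelevant spindle speed are invented for illustration, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
speed = rng.uniform(500, 3000, n)     # spindle speed (rpm), made irrelevant below
feed = rng.uniform(0.05, 0.3, n)      # feed (mm/rev)
vol = rng.uniform(0.0, 0.1, n)        # reinforcement volume fraction
thrust = 200 * feed + 300 * vol + rng.normal(0, 2, n)   # toy thrust force (N)

# multiple regression via least squares: thrust ~ 1 + speed + feed + vol
X = np.column_stack([np.ones(n), speed, feed, vol])
coef, *_ = np.linalg.lstsq(X, thrust, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((thrust - pred) ** 2) / np.sum((thrust - thrust.mean()) ** 2)
```

The fitted speed coefficient comes out near zero while feed and volume fraction are recovered, mirroring the abstract's finding that spindle speed is insignificant.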
In this paper, we evaluate semiempirical methods (AM1, PM3, and ZINDO), HF, and DFT (B3LYP) in different basis sets to determine which method best describes the sign and magnitude of the geometrical parameters of artemisinin in the region of the endoperoxide ring, compared with crystallographic data. We also rank these methods using statistical analysis. The PCA results were based on three principal components, explaining 98.0539% of the total variance, for the geometrical parameters C3O13, O1O2C3, O13C12C12a, and O2C3O13C12. The DFT (B3LYP) method corresponded well with the experimental data in the hierarchical cluster analysis (HCA). The experimental and theoretical angles were analyzed by simple linear regression, and statistical parameters (correlation coefficients, significance, and predictability) were evaluated to determine the accuracy of the calculations. The statistical analysis exhibited a good correlation and high predictive power for the DFT (B3LYP) method in the 6-31G** basis set.
Mitigating the increasing number of cyberattack incidents may require strategies such as reinforcing organizations' networks with honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. Effective detection and mitigation of cyberattacks typically requires both computerized and visual analyses. However, most security analysts are not adequately trained in visualization principles or methods, which are required for effective visual perception of useful attack information hidden in attack data. Additionally, honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we review visualization practices and methods commonly used in the discovery and communication of attack patterns based on honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 papers with high impact. Most honeypot papers computed summary statistics of honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data using scatter and linear plots. Papers rarely included simple yet informative graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results.
While a significant number of automated visualization tools incorporate visualization standards by default, the construction of effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally an interdisciplinary collaboration with peers. We therefore suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
Increasing contamination of water resources worldwide and in our country, declining water quality over time, and failure to meet water-use objectives have all raised the importance of water management. Monitoring water resources and evaluating the monitoring results guide efforts to control the factors that pollute water resources and reduce water quality. Nilüfer Creek is very important to the city of Bursa, both as a source of drinking and potable water and as a discharge area for wastewater. In this study, analysis results for the period 2002-2010, taken at 15 points by the General Directorate of Bursa Water and Sewerage Administration (BUWSA), were evaluated with respect to the water quality of Nilüfer Creek. Non-parametric methods were used to evaluate the water quality data because the data were not normally distributed. The parameters that best represent the water quality were identified by applying Principal Component Analysis. According to the results, the 9 most representative of the 19 water quality parameters, namely BOD5, COD, TSS, total Fe, Zn, conductivity, NO2-N, Ni, and NO3-N, load on the first two components.
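The PCA step used here — standardize the parameter table, diagonalize its correlation matrix, and rank components by explained variance — can be sketched in a few lines. The three-column toy table (two strongly correlated parameters plus one independent one) is an invented stand-in for the 19 monitored water-quality parameters.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
base = rng.normal(size=n)
# toy water-quality table: BOD5 and COD strongly correlated, conductivity independent
data = np.column_stack([
    base + 0.1 * rng.normal(size=n),   # "BOD5"
    base + 0.1 * rng.normal(size=n),   # "COD"
    rng.normal(size=n),                # "conductivity"
])

# PCA on the correlation matrix of standardized data
z = (data - data.mean(0)) / data.std(0)
corr = z.T @ z / n
eigval, eigvec = np.linalg.eigh(corr)          # ascending eigenvalues
explained = eigval[::-1] / eigval.sum()        # descending variance ratios
```

Because two of the three columns carry the same signal, the first component absorbs about two thirds of the variance, which is how "best representative parameters" are read off the leading components.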
To take into account the influence of uncertainties on the dynamic response of a vibro-acoustic structure, a hybrid modeling technique combining the finite element method (FE) and statistical energy analysis (SEA) is proposed to analyze vibro-acoustic responses with uncertainties at middle frequencies. The mid-frequency dynamic response of a framework-plate structure with uncertainties is studied with the hybrid FE-SEA method, and a Monte Carlo (MC) simulation is performed to provide a benchmark comparison. The energy response of the framework-plate structure matches the MC simulation results well, which validates the effectiveness of the hybrid FE-SEA method given both the complexity of the vibro-acoustic structure and the uncertainties in mid-frequency vibro-acoustic analysis. Based on the hybrid method, a vibro-acoustic model of a construction machinery cab with random properties is established, and the excitations of the model are measured by experiments. The sound pressure level of the cab and the vibration power spectral density of the front windscreen are calculated and compared with experimental results. At middle frequencies, the results are consistent with the tests, and the prediction error is less than 3.5 dB.
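The Monte Carlo benchmark used above amounts to sampling the random properties many times and collecting response statistics. A minimal sketch with a one-degree-of-freedom response standing in for the framework-plate model (the stiffness distribution and response formula are invented for illustration):

```python
import numpy as np

def response(k):
    """Toy frequency-response magnitude of a 1-DOF system with
    uncertain stiffness k, evaluated at a fixed drive frequency."""
    m, c, w = 1.0, 0.4, 5.0
    return 1.0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)

rng = np.random.default_rng(8)
k_samples = rng.normal(30.0, 2.0, 5000)   # sampled random stiffness
energies = response(k_samples)
mc_mean, mc_std = energies.mean(), energies.std()
```

The sample mean and spread are the reference statistics against which a hybrid FE-SEA prediction would be benchmarked; the cost of the MC loop is what motivates the hybrid method.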
Using a GIS spatial statistical analysis method, with ArcGIS software as the analysis tool and the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution of the diseased crops was analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and the downstream reaches of the main rivers. The correlation between adjacent diseased plots was small, so infection by pests and diseases was excluded, and the major cause of incidence was likely river pollution.
It is well known that nonparametric estimation of the regression function is highly sensitive to the presence of even a small proportion of outliers in the data. To handle atypical observations when the covariates of the nonparametric component are functional, robust estimates for the regression parameter and regression operator are introduced. The main purpose of the paper is to consider data-driven methods of selecting the number of neighbors, making the proposed procedures fully automatic. We use the k Nearest Neighbors (kNN) procedure to construct the kernel estimator of the proposed robust model. Under some regularity conditions, we state consistency results for the kNN functional estimators which are uniform in the number of neighbors (UINN). Furthermore, a simulation study and an empirical application to real data on octane gasoline predictions are carried out to illustrate the higher predictive performance and the usefulness of the kNN approach.
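The outlier sensitivity this abstract addresses is easy to exhibit with a scalar kNN regressor: mean aggregation over the neighbors is dragged by gross outliers, while a robust aggregate such as the median is not. The median here is a simple stand-in for the paper's robust estimators, and the data are synthetic.

```python
import numpy as np

def knn_predict(x_train, y_train, x0, k, robust=True):
    """kNN regression at x0: median (robust) or mean of the k
    nearest responses."""
    idx = np.argsort(np.abs(x_train - x0))[:k]
    return np.median(y_train[idx]) if robust else np.mean(y_train[idx])

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x)
y[:5] += 50.0                      # a few gross outliers in the responses
pred_robust = knn_predict(x, y, 0.25, k=15)
pred_mean = knn_predict(x, y, 0.25, k=15, robust=False)
```

At x0 = 0.25 the true value is sin(pi/2) = 1; the median-based prediction stays close to it as long as fewer than half of the k neighbors are contaminated.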
PLS (Partial Least Squares regression) is introduced for the automatic estimation of fundamental stellar spectral parameters. It extracts the spectral components most correlated with the parameters (Teff, log g, and [Fe/H]) and sets up a linear regression function from the spectra to the corresponding parameters. Considering the properties of stellar spectra and the PLS algorithm, we present a piecewise PLS regression method for estimating stellar parameters, composed of one PLS model for Teff and seven PLS models for log g and [Fe/H] estimation. Its performance is investigated through extensive experiments on flux-calibrated spectra and continuum-normalized spectra at different signal-to-noise ratios (SNRs) and resolutions. The results show that the piecewise PLS method is robust for spectra at the medium resolution of 0.23 nm. For low-resolution 0.5 nm and 1 nm spectra, it achieves competitive results at higher SNR. Experiments using ELODIE spectra of 0.23 nm resolution illustrate that our piecewise PLS models trained with MILES spectra are efficient for O-G stars: for flux-calibrated spectra, the systematic offsets are 3.8%, 0.14 dex, and -0.09 dex for Teff, log g, and [Fe/H], with error scatters of 5.2%, 0.44 dex, and 0.38 dex, respectively; for continuum-normalized spectra, the systematic offsets are 3.8%, 0.12 dex, and -0.13 dex, with error scatters of 5.2%, 0.49 dex, and 0.41 dex, respectively. The PLS method is rapid and easy to use, and does not rely as strongly on the tightness of a template parameter grid to reach high precision as Artificial Neural Networks or minimum-distance methods do.
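The core PLS step — project the spectra onto the direction most covariant with the parameter, then regress the parameter on that score — can be sketched with a single-component PLS1. This is a minimal sketch of the regression idea on invented "spectra", not the paper's piecewise multi-model pipeline.

```python
import numpy as np

def pls1_fit(X, y):
    """Single-component PLS1: weight vector w ~ X^T y, score t = X w,
    then least-squares slope of y on t."""
    Xc, yc = X - X.mean(0), y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)
    t = Xc @ w
    b = (t @ yc) / (t @ t)
    return w, b, X.mean(0), y.mean()

def pls1_predict(Xnew, w, b, xm, ym):
    return ym + ((Xnew - xm) @ w) * b

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 20))                      # toy "spectra": 20 flux bins
y = 3.0 * X[:, 0] + rng.normal(0, 0.1, 2000)         # toy parameter (e.g. a Teff proxy)

w, b, xm, ym = pls1_fit(X, y)
resid = y - pls1_predict(X, w, b, xm, ym)
```

Because the toy parameter depends on a single flux bin, one PLS component recovers nearly all of its variance; real spectra need the several components and piecewise models the abstract describes.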
The Chinese Space Station Telescope (CSST) spectroscopic survey aims to deliver high-quality low-resolution (R > 200) slitless spectra for hundreds of millions of targets down to a limiting magnitude of about 21 mag, distributed over a large survey area (17,500 deg^2) and covering a wide wavelength range (255-1000 nm in the three bands GU, GV, and GI). As slitless spectroscopy precludes the use of wavelength calibration lamps, wavelength calibration is one of the most challenging issues in the reduction of slitless spectra, yet it plays a key role in measuring precise radial velocities of stars and redshifts of galaxies. In this work, we propose a star-based method that can monitor and correct for possible errors in the CSST wavelength calibration using normal scientific observations, taking advantage of the facts that (i) about ten million stars with reliable radial velocities are now available thanks to spectroscopic surveys like LAMOST, (ii) the large field of view of CSST enables efficient observations of such stars in a short period of time, and (iii) the radial velocities of such stars can be reliably measured using only a narrow segment of a CSST spectrum. We demonstrate that it is possible to achieve a wavelength calibration precision of a few km s^(-1) for the GU band, and about 10 to 20 km s^(-1) for the GV and GI bands, with only a few hundred velocity standard stars. Applications of the method to other surveys are also discussed.
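The star-based monitoring idea reduces to a simple statistic: for standard stars with known catalog radial velocities, the mean of (measured - catalog) velocity is the wavelength zero-point error, and its uncertainty shrinks as 1/sqrt(N). A sketch with invented numbers (a 12 km/s injected zero-point error and 15 km/s per-star measurement scatter; not CSST's actual figures):

```python
import numpy as np

rng = np.random.default_rng(7)
n_stars = 300
true_rv = rng.normal(0, 40, n_stars)       # catalog RVs of standard stars (km/s)
# measured RVs carry a common zero-point error plus per-star noise
measured_rv = true_rv + 12.0 + rng.normal(0, 15, n_stars)

diff = measured_rv - true_rv
zero_point = diff.mean()                              # estimated calibration error
calib_err = diff.std() / np.sqrt(n_stars)             # its 1/sqrt(N) uncertainty
```

With a few hundred standards, a 15 km/s per-star scatter averages down to a sub-km/s constraint on the zero point, which is why a modest number of velocity standards suffices.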
It is a matter of course that Kolmogorov's probability theory is a very useful mathematical tool for the analysis of statistics. However, this fact never means that statistics is based on Kolmogorov's probability theory, since it is not guaranteed that mathematics and our world are connected. In order for mathematics to assert statements concerning our world, a certain theory (a so-called "world view") must mediate between mathematics and our world. Recently we proposed measurement theory (i.e., the theory of the quantum mechanical world view), which is characterized as the linguistic turn of quantum mechanics. In this paper, we assert that statistics is based on measurement theory. And, for example, we show, from the purely theoretical point of view (i.e., the measurement-theoretical point of view), that regression analysis cannot be justified without Bayes' theorem. This may imply that even the conventional division between (Fisher's) statistics and Bayesian statistics should be reconsidered.
Funding: supported by the National Natural Science Foundation of China under Grant Nos. U2031140, 11873027, and 12073077.
Abstract: The extraction of high-temperature regions in active regions (ARs) is an important means of understanding the mechanism of coronal heating. A key observational probe of high-temperature emission in ARs is the Fe XVIII emission line in the 94 Å channel of the Atmospheric Imaging Assembly. However, the diagnostic algorithms for Fe XVIII, including the differential emission measure (DEM) and the DEM-based linear diagnostics proposed by Del, have long been limited, and their results differ from predictions. In this paper, building on earlier work, we use an outlier detection method to establish a nonlinear correlation between the 94 Å channel and the 171, 193, and 211 Å channels. A neural network based on the 171, 193, and 211 Å channels is constructed to model the low-temperature emission in the 94 Å ARs. The predicted result is regarded as the low-temperature component of the 94 Å channel; subtracting it from the 94 Å data yields the outlier component of 94 Å, i.e., Fe XVIII. The outlier components obtained by the neural network are then compared with the Fe XVIII obtained by the DEM and by Del's method, and a high similarity is found, which supports the reliability of the neural network for extracting the high-temperature components of ARs, although differences remain. To analyze the differences among the Fe XVIII maps obtained by the three methods, we subtract the Fe XVIII obtained by the DEM and by Del's method from that obtained by the neural network and compare the residual with Fe XIV in the temperature range 6.1-6.45 MK. A strong similarity is found, which shows that the Fe XVIII obtained by the DEM and Del's method still contains a large low-temperature component dominated by Fe XIV, whereas the Fe XVIII obtained by the neural network is relatively pure.
Funding: supported by the Emergency Management Project of the National Natural Science Foundation (31640010), Tibet's Research and Development Project of the Characteristic Agriculture and Animal Husbandry Resources Synergy Innovation Center - Plateau Ecology, the Natural Science Foundation of the Tibet Autonomous Region (2016ZR-15-41), and the Postgraduate Innovation Project of Xizang Agriculture and Animal Husbandry College (YJS2017-01)
Abstract: To survey research on China's bryophytes, this paper applies bibliometrics to the literature on Chinese bryophytes published during 2005-2015. The results show that, in terms of the distribution of articles across journals, 13 journals published more than 5 papers on bryophytes, accounting for 32.5%; in terms of the number of papers published per year, the smallest number was 16 in 2005, while the largest was 33 in 2008; in terms of papers per first author, most authors published fewer than 9 papers, accounting for 87.5%, only one author published 9 papers, and 5 authors published more than 9; in terms of author affiliation, among the 278 articles collected, 12 institutions published fewer than 6 papers each, accounting for 30%, the institution publishing the most papers (36) was Guizhou Normal University, 5 institutions published 6 papers each, accounting for 12.5%, and institutions publishing fewer than 6 papers accounted for 57.5%; in terms of research level, papers on basic and applied basic research (natural science) were the most numerous, accounting for 91.2%, papers on engineering and technology (natural science) accounted for 5.5%, and other papers for 3.3%.
Funding: Natural Science Foundation of the Education Department of Heilongjiang Province (12511471)
Abstract: We take papers on the performance appraisal of land consolidation (including land consolidation benefits, etc.) published in core journals indexed in the Chinese Journal Full-text Database as study samples, and analyze them in terms of the number of papers, source journals, impact, research methods, research topics, and foundation support, through literature search and statistical analysis. The results show that scholars pay increasing attention to the performance appraisal of land consolidation, and the number of papers is growing overall; core journals on agriculture, land, environment, economics, and other areas put increasing emphasis on publishing such papers; judging by citation frequency, the papers' impact is broad but not deep; research methods are increasingly diverse while research topics remain concentrated; and foundation support for this research has yet to be strengthened.
Funding: This work was financially supported by the National Natural Science Foundation of China (11501449), the Fundamental Research Funds for the Central Universities (3102017zy043), the fund of the State Key Laboratory of Solidification Processing at NWPU (SKLSP201628), and the National Key Research and Development Program of China (2016YFB1100602).
Abstract: This paper is devoted to the homogenization and statistical multiscale analysis of a transient heat conduction problem in random porous materials with a nonlinear radiation boundary condition. A novel statistical multiscale analysis method based on the two-scale asymptotic expansion is proposed. In the statistical multiscale formulation, a unified linear homogenization procedure is established and second-order correctors are introduced to model the nonlinear radiative heat transfer in random perforations, which are our main contributions. A numerical algorithm based on the statistical multiscale method is also given in detail. Numerical results demonstrate the accuracy and efficiency of the method for multiscale simulation of transient nonlinear conduction and radiation heat transfer in random porous materials.
Funding: supported by the Special Funds for the National Basic Research Program of China (Grant No. 2012CB025904) and the National Natural Science Foundation of China (Grant Nos. 90916027 and 11302052)
Abstract: This paper focuses on the dynamic thermo-mechanical coupled response of random particulate composite materials. Both the inertia term and the coupling term are considered in the dynamic coupled problem. The formulation of the problem by a statistical second-order two-scale (SSOTS) analysis method and the algorithm procedure based on the finite-element difference method are presented. Numerical results for coupled cases are compared with those for uncoupled cases. They show that the coupling effects on temperature, thermal flux, displacement, and stresses are very distinct, and that the micro-characteristics of the particles affect the coupling effect of the random composites. Furthermore, the coupling effect causes a lag in the variations of temperature, thermal flux, displacement, and stresses.
Funding: supported by the National Key R&D Program of China (2017YFA0402701) and the Key Research Program of Frontier Sciences of CAS (QYZDJ-SSW-SLH047), and partially supported by the National Natural Science Foundation of China (Grant No. U2031202).
Abstract: Noise is a significant component of a millimeter-wave molecular line datacube. Analyzing the noise improves our understanding of its characteristics and further contributes to scientific discoveries. We measure the noise level of a single datacube from MWISP and perform statistical analyses. We identify the major factors that increase the noise level of a single datacube, including bad channels, edge effects, baseline distortion, and line contamination. Cleaning algorithms are applied to remove or reduce these noise components. As a result, we obtain a cleaned datacube in which the noise follows a positively skewed normal distribution. We further analyze the noise structure of a 3D mosaicked datacube in the range l = 40°.7 to 43°.3 and b = -2°.3 to 0°.3 and find that the noise in the final mosaicked datacube is mainly characterized by fluctuation among the cells.
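A common first step toward the kind of noise measurement described here is a robust per-channel sigma estimate that is insensitive to line contamination. The median-absolute-deviation (MAD) estimator below is a generic stand-in on synthetic data, not the MWISP pipeline's actual cleaning algorithm.

```python
# Robust noise-level estimate for a channel array: the MAD estimator
# down-weights outliers such as a contaminating emission line.
import random
import statistics

random.seed(0)
# Synthetic channels: Gaussian noise (sigma = 0.3) plus a bright "line".
data = [random.gauss(0.0, 0.3) for _ in range(2000)]
data[1000:1010] = [5.0] * 10  # contaminating emission feature

def mad_sigma(values):
    """Noise sigma from the median absolute deviation."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return 1.4826 * mad  # consistency factor for Gaussian noise

print(round(mad_sigma(data), 2))  # close to the true 0.3
```

A plain standard deviation over the same array would be inflated by the line pixels, which is exactly why cleaning (or robust statistics) precedes the noise-distribution analysis in the abstract.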
Abstract: In recent years, aluminum-matrix composites (AMCs) have been widely used to replace cast iron in the aerospace and automotive industries. Machining these composite materials requires a better understanding of the cutting processes regarding accuracy and efficiency. This study addresses modeling the machinability of self-lubricated aluminum/alumina/graphite hybrid composites synthesized by the powder metallurgy method. Multiple regression analysis (MRA) and artificial neural networks (ANN) were used to investigate the influence of several parameters on the thrust force and torque in drilling self-lubricated hybrid composite materials. The models were identified by using the cutting speed, feed, and volume fraction of the reinforcement particles as input data, and the thrust force and torque as output data. The two prediction methods were compared for prediction accuracy: ANNs showed better predictability than MRA owing to their nonlinear nature. The statistical analysis, together with the artificial neural network results, showed that Al2O3, Gr, and cutting feed (f) were the most significant parameters in the drilling process, while spindle speed appeared insignificant. Since the spindle speed was insignificant, it can be set either at the highest value to obtain a high material removal rate or at the lowest value to prolong tool life, depending on the application.
Abstract: In this paper, we evaluate semiempirical methods (AM1, PM3, and ZINDO), HF, and DFT (B3LYP) in different basis sets to determine which method best describes the sign and magnitude of the geometrical parameters of artemisinin in the region of the endoperoxide ring, compared with crystallographic data. We also rank these methods using statistical analysis. The PCA results were based on three principal components, explaining 98.0539% of the total variance, for the geometrical parameters C3O13, O1O2C3, O13C12C12a, and O2C3O13C12. The DFT (B3LYP) method corresponded well with the experimental data in the hierarchical cluster analysis (HCA). The experimental and theoretical angles were analyzed by simple linear regression, and statistical parameters (correlation coefficients, significance, and predictability) were evaluated to determine the accuracy of the calculations. The statistical analysis exhibited a good correlation and high predictive power for the DFT (B3LYP) method in the 6-31G** basis set.
Abstract: Mitigating the increasing number of cyberattack incidents may require strategies such as reinforcing organizations' networks with Honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and/or methods, which are required for effective visual perception of the useful attack information hidden in attack data. Additionally, Honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we reviewed visualization practices and methods commonly used in the discovery and communication of attack patterns based on Honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 with high impact. Most Honeypot papers computed summary statistics of Honeypot data based on static metrics such as IP address, port, and packet size. They visually analyzed Honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data with scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow critical evaluation of analysis results. While many automated visualization tools incorporate visualization standards by default, constructing effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally an interdisciplinary collaboration with peers.
We therefore suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
Abstract: The increasing contamination of water resources worldwide and in our country, the decline of water quality over time, and the failure to meet objectives for the utilization of water resources have all increased the importance of water management. Monitoring water resources and evaluating the monitoring results guide efforts to control the factors that pollute water resources and degrade water quality. Nilüfer Creek is very important to the city of Bursa, both as a source of drinking and potable water and as a discharge area for wastewater. In this study, analysis results for the period 2002-2010, taken at 15 points by the General Directorate of Bursa Water and Sewerage Administration (BUWSA), were evaluated with respect to the water quality of Nilüfer Creek. Because the data were not normally distributed, non-parametric methods were used in the evaluation. Principal Component Analysis was applied to identify the parameters that best represent water quality. According to the analysis, the 9 most representative of the 19 water quality parameters, drawn from the first two components, were BOD5, COD, TSS, T.Fe, Zn, conductivity, NO2-N, Ni, and NO3-N.
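The dimension-reduction step in this abstract can be illustrated in miniature. For two standardized variables, the correlation matrix has eigenvalues 1 ± r, so the first principal component explains (1 + r)/2 of the total variance, and a high explained fraction is what justifies dropping redundant parameters. The data below are synthetic stand-ins, not BUWSA measurements.

```python
# Two correlated standardized variables (stand-ins for, e.g., BOD5 and
# COD readings); the first PC's explained variance has a closed form.
import math

x = [-1.2, -0.5, 0.0, 0.4, 1.3]
y = [-1.0, -0.6, 0.1, 0.5, 1.0]

def corr(a, b):
    """Pearson correlation of two equal-length samples."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b)) / n
    sa = math.sqrt(sum((p - ma) ** 2 for p in a) / n)
    sb = math.sqrt(sum((q - mb) ** 2 for q in b) / n)
    return cov / (sa * sb)

r = corr(x, y)
# Eigenvalues of the 2x2 correlation matrix are 1 + r and 1 - r.
explained = (1 + r) / 2
print(round(explained, 2))  # ≈ 0.99: one component carries nearly all variance
```

With 19 parameters the same computation runs on a 19×19 correlation matrix, and the loadings of the first two components identify the representative parameters.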
Funding: Science and Technology Support Planning of Jiangsu Province (No. BE2014133), the Open Foundation of the Key Laboratory of Underwater Acoustic Signal Processing (No. UASP1301), and the Prospective Joint Research Project of Jiangsu Province (No. BY2014127-01)
Abstract: To account for the influence of uncertainties on the dynamic response of a vibro-acoustic structure, a hybrid modeling technique combining the finite element method (FE) and statistical energy analysis (SEA) is proposed to analyze vibro-acoustic responses with uncertainties at middle frequencies. The mid-frequency dynamic response of a framework-plate structure with uncertainties is studied by the hybrid FE-SEA method, and a Monte Carlo (MC) simulation is performed to provide a benchmark comparison. The energy response of the framework-plate structure matches the MC simulation results well, which validates the effectiveness of the hybrid FE-SEA method given both the complexity of the vibro-acoustic structure and the uncertainties in mid-frequency vibro-acoustic analysis. Based on the hybrid method, a vibro-acoustic model of a construction machinery cab with random properties is established, and the excitations of the model are measured by experiments. The sound pressure level response of the cab and the vibration power spectral density of the front windscreen are calculated and compared with experimental results. At middle frequencies, the results agree well with the tests, and the prediction error is less than 3.5 dB.
Abstract: Using a GIS spatial statistical analysis method, with ArcGIS software as the analysis tool and the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution of the diseased crops was analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and the downstream reaches of main rivers. The correlation between adjacent diseased plots was small, so infection by pests and diseases was excluded, and the major cause of incidence might be river pollution.
Abstract: It is well known that nonparametric estimation of the regression function is highly sensitive to the presence of even a small proportion of outliers in the data. To address atypical observations when the covariates of the nonparametric component are functional, robust estimates for the regression parameter and the regression operator are introduced. The main purpose of the paper is to consider data-driven methods for selecting the number of neighbors, in order to make the proposed procedures fully automatic. We use the k-Nearest Neighbors procedure (kNN) to construct the kernel estimator of the proposed robust model. Under some regularity conditions, we state consistency results for the kNN functional estimators, which are uniform in the number of neighbors (UINN). Furthermore, a simulation study and an empirical application to real data on octane gasoline predictions are carried out to illustrate the higher predictive performance and usefulness of the kNN approach.
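The "data-driven choice of k" idea can be sketched with scalar covariates standing in for the functional ones, and leave-one-out error as the selection criterion. This is a generic illustration of automatic neighbor selection, not the paper's robust functional estimator.

```python
# kNN regression with k chosen by leave-one-out (LOO) squared error.

def knn_predict(xs, ys, x0, k, exclude=None):
    """Average of the k nearest responses, optionally excluding one index."""
    order = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))
    neigh = [i for i in order if i != exclude][:k]
    return sum(ys[i] for i in neigh) / len(neigh)

def best_k(xs, ys, candidates):
    """Pick the k minimizing leave-one-out squared prediction error."""
    def loo_error(k):
        return sum((ys[i] - knn_predict(xs, ys, xs[i], k, exclude=i)) ** 2
                   for i in range(len(xs)))
    return min(candidates, key=loo_error)

xs = [i / 10 for i in range(50)]
ys = [x * x for x in xs]          # smooth, noiseless target
k = best_k(xs, ys, candidates=[2, 4, 8])
print(k)  # 2: the smallest symmetric neighborhood wins on smooth data
```

On noisy data the LOO criterion automatically trades bias against variance and favors larger k, which is the point of making the selection data-driven.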
Funding: supported by the National Natural Science Foundation of China
Abstract: PLS (Partial Least Squares regression) is introduced for the automatic estimation of fundamental stellar spectral parameters. It extracts the spectral components most correlated with the parameters (Teff, log g, and [Fe/H]) and sets up a linear regression from the spectra to the corresponding parameters. Considering the properties of stellar spectra and of the PLS algorithm, we present a piecewise PLS regression method for estimating stellar parameters, composed of one PLS model for Teff and seven PLS models for log g and [Fe/H] estimation. Its performance is investigated in extensive experiments on flux-calibrated spectra and continuum-normalized spectra at different signal-to-noise ratios (SNRs) and resolutions. The results show that the piecewise PLS method is robust for spectra at the medium resolution of 0.23 nm. For low-resolution 0.5 nm and 1 nm spectra, it achieves competitive results at higher SNR. Experiments using ELODIE spectra of 0.23 nm resolution illustrate that our piecewise PLS models trained with MILES spectra are efficient for O-G stars: for flux-calibrated spectra, the systematic offsets are 3.8%, 0.14 dex, and -0.09 dex for Teff, log g, and [Fe/H], with error scatters of 5.2%, 0.44 dex, and 0.38 dex, respectively; for continuum-normalized spectra, the systematic offsets are 3.8%, 0.12 dex, and -0.13 dex, with error scatters of 5.2%, 0.49 dex, and 0.41 dex, respectively. The PLS method is rapid and easy to use, and it does not rely as strongly on the tightness of the template parameter grid to reach high precision as artificial neural networks or minimum distance methods do.
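The extraction step PLS performs, projecting the inputs onto the direction of maximum covariance with the label and regressing on that score, can be shown with a single component. The tiny two-pixel "spectra" below are synthetic stand-ins, not stellar data, and real PLS iterates this deflation over several components.

```python
# One-component PLS regression in miniature.

def pls1_fit(X, y):
    """Single-component PLS: returns (weight vector w, coefficient b)."""
    # Weight = direction of maximum covariance between columns of X and y.
    w = [sum(X[i][j] * y[i] for i in range(len(X)))
         for j in range(len(X[0]))]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Scores t = X w, then regress y on t.
    t = [sum(X[i][j] * w[j] for j in range(len(w))) for i in range(len(X))]
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return w, b

def pls1_predict(w, b, x):
    return b * sum(xj * wj for xj, wj in zip(x, w))

# The label depends only on the first "pixel"; the second is filler.
X = [[1.0, 0.3], [2.0, 0.1], [3.0, 0.2], [4.0, 0.4]]
y = [2.0, 4.0, 6.0, 8.0]
w, b = pls1_fit(X, y)
print(round(pls1_predict(w, b, [2.5, 0.2]), 1))  # ≈ 5.0 (true value 2 × 2.5)
```

Because the weights are driven by covariance with the label, uninformative pixels receive small weight automatically, which is why PLS handles many correlated spectral channels without an explicit feature-selection step.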
Funding: supported by the National Key Basic R&D Program of China (2019YFA0405500), the National Natural Science Foundation of China (No. 11603002), and Beijing Normal University (No. 310232102).
Abstract: The Chinese Space Station Telescope (CSST) spectroscopic survey aims to deliver high-quality low-resolution (R > 200) slitless spectra for hundreds of millions of targets down to a limiting magnitude of about 21 mag, distributed within a large survey area (17,500 deg²) and covering a wide wavelength range (255-1000 nm in three bands: GU, GV, and GI). As slitless spectroscopy precludes the use of wavelength calibration lamps, wavelength calibration is one of the most challenging issues in the reduction of slitless spectra, yet it plays a key role in measuring precise radial velocities of stars and redshifts of galaxies. In this work, we propose a star-based method that can monitor and correct for possible errors in the CSST wavelength calibration using normal scientific observations, taking advantage of the facts that (i) about ten million stars with reliable radial velocities are now available thanks to spectroscopic surveys like LAMOST, (ii) the large field of view of CSST enables efficient observation of such stars in a short period of time, and (iii) the radial velocities of such stars can be reliably measured using only a narrow segment of a CSST spectrum. We demonstrate that it is possible to achieve a wavelength calibration precision of a few km s^(-1) for the GU band, and about 10 to 20 km s^(-1) for the GV and GI bands, with only a few hundred velocity standard stars. Applications of the method to other surveys are also discussed.
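The measurement underlying a star-based calibration check is a shift estimate between an observed spectrum of a velocity-standard star and its template, any systematic shift beyond the star's known radial velocity is a calibration error. Below is a generic integer-pixel cross-correlation sketch on synthetic data; the spectra, line shape, and pixel grid are made up and carry no CSST-specific scale.

```python
# Recover a wavelength zero-point shift by cross-correlating an observed
# spectrum with its template (mean-subtracted to remove the continuum).

def cross_correlate_shift(template, observed, max_lag):
    """Integer pixel lag maximizing the mean-subtracted cross-correlation."""
    tm = sum(template) / len(template)
    om = sum(observed) / len(observed)
    t = [v - tm for v in template]
    o = [v - om for v in observed]

    def score(lag):
        return sum(t[i] * o[i + lag] for i in range(len(t))
                   if 0 <= i + lag < len(o))

    return max(range(-max_lag, max_lag + 1), key=score)

# Template: one absorption line on a flat continuum.
template = [1.0] * 200
for i in range(95, 106):
    template[i] = 0.2
observed = template[3:] + [1.0] * 3  # the same line shifted by -3 pixels

lag = cross_correlate_shift(template, observed, max_lag=10)
print(lag)  # -3
```

In practice the lag is refined to sub-pixel precision and converted to a velocity through the dispersion relation, and averaging over a few hundred standard stars beats the single-star noise down to the quoted km s^(-1) level.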
Abstract: It is a matter of course that Kolmogorov's probability theory is a very useful mathematical tool for the analysis of statistics. However, this fact never means that statistics is based on Kolmogorov's probability theory, since it is not guaranteed that mathematics and our world are connected. In order for mathematics to assert statements about our world, a certain theory (a so-called "world view") must mediate between mathematics and our world. Recently we proposed measurement theory (i.e., the theory of the quantum mechanical world view), which is characterized as the linguistic turn of quantum mechanics. In this paper, we assert that statistics is based on measurement theory. For example, we show, from the purely theoretical (i.e., measurement theoretical) point of view, that regression analysis cannot be justified without Bayes' theorem. This may imply that even the conventional division between (Fisher's) statistics and Bayesian statistics should be reconsidered.
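The Bayes update the abstract appeals to is, computationally, just one line. The toy binary-hypothesis example below uses arbitrary numbers and is only a reminder of the theorem's mechanics, not of the paper's measurement-theoretic argument.

```python
# Bayes' theorem for a binary hypothesis H given evidence E.

def bayes_posterior(prior, like_h, like_not_h):
    """P(H | E) from P(H), P(E | H), and P(E | not H)."""
    num = prior * like_h
    return num / (num + (1 - prior) * like_not_h)

post = bayes_posterior(prior=0.5, like_h=0.9, like_not_h=0.3)
print(post)  # 0.75
```

In a regression setting the same update runs over a continuum of parameter values, prior times likelihood, renormalized, which is the step the paper argues cannot be dispensed with.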