A multi-objective linear programming problem is constructed from a fuzzy linear programming problem, because a fuzzy programming method is used during the solution. The multi-objective linear programming problem can be converted into a single objective function by various methods, such as Chandra Sen's method, the weighted sum method, the ranking function method, and the statistical averaging method. In this paper, both Chandra Sen's method and the statistical averaging method are used to form a single objective function from the multi-objective function. Two multi-objective programming problems are solved to verify the result: one is a numerical example and the other is a real-life example. The problems are then solved by the ordinary simplex method and by the fuzzy programming method. The fuzzy programming method gives better optimal values than the ordinary simplex method.
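To make the conversion concrete, here is a minimal sketch that collapses a two-objective LP into a single objective, comparing a Chandra Sen-style combination (each objective scaled by its own optimum) with a statistical-averaging combination (scaled by the mean of the optima). The problem data and the exact combination formulas are illustrative assumptions, not the paper's example.

```python
# Sketch: collapsing a two-objective LP into a single objective, comparing a
# Chandra Sen-style combination (each objective scaled by its own optimum) with
# a statistical-averaging combination (scaled by the mean of the optima).
# The problem data and exact formulas are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0],
              [3.0, 1.0]])          # constraints A x <= b, x >= 0
b = np.array([10.0, 15.0])
bounds = [(0, None), (0, None)]
C = np.array([[3.0, 2.0],
              [1.0, 4.0]])          # two maximization objectives z_k = c_k . x

# Step 1: optimize each objective separately (linprog minimizes, so negate).
optima = np.array([-linprog(-c, A_ub=A, b_ub=b, bounds=bounds,
                            method="highs").fun for c in C])

# Step 2: build single objectives.
sen_combined = (C / optima[:, None]).sum(axis=0)     # sum of c_k / |z_k*|
avg_combined = C.sum(axis=0) / optima.mean()         # statistical averaging

# Step 3: solve each combined single-objective LP.
for name, comb in [("Chandra Sen", sen_combined), ("statistical avg", avg_combined)]:
    res = linprog(-comb, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    zs = C @ res.x
    print(f"{name:16s} x* = {np.round(res.x, 3)}  objectives = {np.round(zs, 3)}")
```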
Aim: To improve the efficiency of fatigue material tests and the relevant statistical treatment of test data. Methods: A least squares approach and other special treatments were used. Results and Conclusion: The concepts of each phase in fatigue tests and statistical treatment are clarified. The proposed method leads to three important properties. A reduced number of specimens lowers test expenditures, and the whole test procedure has more flexibility because there is no need to conduct many tests at the same stress level, as in traditional cases.
A novel damage detection method is applied to a 3-story frame structure to obtain a statistical quantification control criterion for the existence, location, and identification of damage. The mean, standard deviation, and exponentially weighted moving average (EWMA) are applied to detect damage information according to statistical process control (SPC) theory. It is concluded that the detection is insignificant with the mean and EWMA because the structural response is neither independent nor normally distributed. On the other hand, the damage information is detected well with the standard deviation because the influence of the data distribution is not pronounced for this parameter. A suitable moderate confidence level is explored for more significant damage location and quantification detection, and the impact of noise is investigated to illustrate the robustness of the method.
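As an illustration of the SPC-style monitoring described above, here is a minimal sketch of an EWMA chart and a standard-deviation chart on a synthetic response; the smoothing constant, window length, control limits, and data are illustrative choices, not the paper's values.

```python
# Sketch: EWMA and standard-deviation control charts in the spirit of SPC-based
# damage detection. Synthetic signal; smoothing constant and control limits are
# illustrative choices, not the paper's values.
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 500)        # "undamaged" training response
monitor = rng.normal(0.0, 1.3, 500)         # "damaged" response (larger spread)

lam = 0.2                                   # EWMA smoothing constant
mu0, sigma0 = baseline.mean(), baseline.std(ddof=1)

# EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, with 3-sigma control limits.
z = mu0
ewma_alarms = 0
limit = 3.0 * sigma0 * np.sqrt(lam / (2.0 - lam))
for x in monitor:
    z = lam * x + (1.0 - lam) * z
    if abs(z - mu0) > limit:
        ewma_alarms += 1

# Standard-deviation chart: compare windowed std of the monitored response
# against the training std; damage tends to inflate the response variance.
window = 50
stds = [monitor[i:i + window].std(ddof=1) for i in range(0, len(monitor), window)]
std_alarms = sum(s > 1.2 * sigma0 for s in stds)   # illustrative threshold

print("EWMA alarms:", ewma_alarms, "std-chart alarms:", std_alarms)
```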
Identification of modal parameters of a linear structure with output-only measurements has received much attention over the past decades. In this paper, the Natural Excitation Technique (NExT) is used for acquisition of the impulse signals from the structural responses. The Eigensystem Realization Algorithm (ERA) is then utilized for modal identification. To discard the fictitious 'computational modes', a procedure, the Statistically Averaging Modal Frequency Method (SAMFM), is developed to distinguish the true modes from noise modes and to improve the precision of the identified modal frequencies of the structure. An offshore platform is modeled with the finite element method, and the theoretical modal parameters are obtained for comparison with the identified values. The dynamic responses of the platform under random wave loading are computed to provide the output signals used for identification with ERA. Simulation results demonstrate that the proposed method can determine the system modal frequencies with high precision.
In this study, geochemical anomaly separation was carried out with methods based on the distribution model, which include the probability diagram (MPD), fractal (concentration-area technique), and U-statistic methods. The main objective is to evaluate the efficiency and accuracy of these methods in separating anomalies related to shear-zone gold mineralization. For this purpose, samples were taken from the secondary lithogeochemical environment (stream sediment samples) over the gold mineralization in Saqqez, NW Iran. Interpretation of the histograms and diagrams showed that the MPD is capable of identifying two phases of mineralization. The fractal method could separate only one phase of change, based on the fractal dimension, with high-concentration areas of the Au element. The spatial analysis showed two mixed subpopulations after U=0 and another subpopulation with very high U values. The MPD analysis followed the spatial analysis, which shows the detail of the variations. Six mineralized zones detected from local geochemical exploration results were used for validating the methods mentioned above. The MPD method was able to identify more than 90% of the anomalous areas, whereas the other two methods identified at most 60% of the anomalous areas. The MPD method uses the raw data, without any estimation of the concentration, and a minimum of calculations to determine the threshold values; it is therefore more robust than the other methods. The spatial analysis identified the details of the geological and mineralization events that affected the study area. MPD is recommended as the best method, and the spatial U-analysis is the next most reliable. The fractal method could show more detail of the events and variations in the area with a symmetrical grid net and a higher density of sampling, or at the detailed exploration stage.
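A minimal sketch of the concentration-area fractal step is given below: the area above each concentration threshold is plotted against the threshold on log-log axes, and a break point is located with a crude two-segment fit. The synthetic grid and the break-search procedure are assumptions for illustration only, not the workflow used in the study.

```python
# Sketch: concentration-area (C-A) fractal analysis on gridded concentrations.
# Synthetic lognormal data; the break point search is a crude two-segment fit,
# not the procedure used in the paper.
import numpy as np

rng = np.random.default_rng(1)
grid = rng.lognormal(mean=1.0, sigma=0.8, size=(200, 200))   # ppm per cell
cell_area = 1.0                                              # km^2 per cell

thresholds = np.quantile(grid, np.linspace(0.05, 0.995, 60))
areas = np.array([(grid >= t).sum() * cell_area for t in thresholds])

logc, loga = np.log10(thresholds), np.log10(areas)

# Try every interior break point; keep the one minimizing the total squared
# error of two straight-line fits (piecewise log-log behaviour = breaks
# between background and anomalous populations).
best = None
for k in range(5, len(logc) - 5):
    r1 = np.polyfit(logc[:k], loga[:k], 1, full=True)[1][0]
    r2 = np.polyfit(logc[k:], loga[k:], 1, full=True)[1][0]
    err = r1 + r2
    if best is None or err < best[0]:
        best = (err, k)

threshold_anomaly = thresholds[best[1]]
print("estimated background/anomaly threshold ~", round(threshold_anomaly, 2), "ppm")
```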
Ag-sheathed (Bi,Pb)_(2)Sr_(2)Ca_(2)Cu_(3)O_(x) tapes were prepared by the powder-in-tube method. The influences of rolling parameters on the superconducting characteristics of Bi(2223)/Ag tapes were analyzed qualitatively with a statistical method. The results demonstrate that roll diameter and reduction per pass significantly influence the properties of Bi(2223)/Ag superconducting tapes, while roll speed has less influence and working friction the least. An optimized rolling process was accordingly achieved based on these results.
Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may be changed dramatically, from positive to negative, by omitting a third variable; this is called the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid the Yule-Simpson paradox. Surrogates and intermediate variables are often used to reduce measurement costs or duration when measurement of endpoint variables is expensive, inconvenient, infeasible, or unobservable in practice. There have been many criteria for surrogates. However, it is possible that for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied to many research fields, and it is important to discover their structures from observed data. We propose a recursive approach for discovering a causal network in which the structural learning of a large network is decomposed recursively into the learning of small networks. To further discover causal relationships, we present an active learning approach in terms of external interventions on some variables. When we focus on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover the causes that affect the outcome.
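A small worked example of the Yule-Simpson reversal, with invented counts, may make the point concrete: within each stratum the treatment looks better, but the pooled comparison reverses because treatment allocation depends on the stratum.

```python
# Sketch: a numeric Yule-Simpson reversal with invented counts.
# Within each stratum the treatment looks better, but the pooled table
# reverses the comparison because treatment allocation depends on the stratum.
recovered = {  # (stratum, treated): (recovered, total)
    ("mild",   True):  (81, 87),
    ("mild",   False): (234, 270),
    ("severe", True):  (192, 263),
    ("severe", False): (55, 80),
}

def rate(r, n):
    return r / n

for stratum in ("mild", "severe"):
    rt = rate(*recovered[(stratum, True)])
    rc = rate(*recovered[(stratum, False)])
    print(f"{stratum}: treated {rt:.2f} vs control {rc:.2f}")   # treated wins

pooled_t = rate(81 + 192, 87 + 263)     # ~0.78
pooled_c = rate(234 + 55, 270 + 80)     # ~0.83
print(f"pooled: treated {pooled_t:.2f} vs control {pooled_c:.2f}")  # control "wins"
```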
In this study, methods based on the distribution model (with and without personal opinion) were used for the separation of anomalous zones; these include two different methods, U-spatial statistics and the mean plus multiples of the standard deviation (X+nS). The primary purpose is to compare the results of these methods with each other. To increase the accuracy of the comparison, regional geochemical data were used from an area where occurrences and mineralization zones of epithermal gold have been reported. The study area is part of the Hashtjin geological map, which is structurally part of the folded and thrust belt and part of the Alborz Tertiary magmatic complex. Samples were taken from secondary lithogeochemical environments. Au element data related to epithermal gold reserves were used to investigate the efficacy of the two methods. In the U-spatial statistics method, specific criteria were used to determine the threshold; in the X+nS method, the element enrichment index of the rock units of the region was obtained by grouping these units, and the anomalous areas were identified by these criteria. The comparison of the methods considered the positions of the discovered occurrences and of the occurrences obtained from these methods, the flexibility of the methods in separating the anomalous zones, and the two-dimensional spatial correlation of the three elements As, Pb, and Ag with the Au element. The ability of the two methods to identify potential areas is acceptable. Among them, the criteria-based method appears to have a high degree of flexibility in separating anomalous regions in the case of epithermal-type gold deposits.
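For the X+nS branch, a minimal sketch is shown below: thresholds are set per rock unit at the mean of the log-concentrations plus n standard deviations. The synthetic Au values, the grouping, and the choice n = 2 are illustrative assumptions.

```python
# Sketch: X + nS anomaly thresholds per rock unit. Synthetic Au values (ppb);
# the log-transform, grouping, and the choice n = 2 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
units = {"volcanic": rng.lognormal(1.5, 0.7, 400),
         "sedimentary": rng.lognormal(1.0, 0.5, 400)}

n = 2  # number of standard deviations above the mean
for unit, au in units.items():
    logs = np.log(au)
    threshold = np.exp(logs.mean() + n * logs.std(ddof=1))
    anomalous = au >= threshold
    print(f"{unit}: threshold {threshold:.1f} ppb,",
          f"{anomalous.sum()} of {au.size} samples flagged anomalous")
```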
This paper focuses on the dynamic thermo-mechanical coupled response of random particulate composite materials. Both the inertia term and the coupling term are considered in the dynamic coupled problem. The formulation of the problem by a statistical second-order two-scale (SSOTS) analysis method and the algorithm procedure based on the finite-element difference method are presented. Numerical results of coupled cases are compared with those of uncoupled cases. The results show that the coupling effects on temperature, thermal flux, displacement, and stresses are very distinct, and that the micro-characteristics of the particles affect the coupling effect of the random composites. Furthermore, the coupling effect causes a lag in the variations of temperature, thermal flux, displacement, and stresses.
In this paper, the statistical averaging method and the new statistical averaging methods have been used to solve fuzzy multi-objective linear programming problems. These methods have been applied to form a single objective function from the fuzzy multi-objective linear programming problem. First, a numerical example of solving a fuzzy multi-objective linear programming problem is provided to validate the maximum risk reduction achieved by the proposed method. The proposed method is then applied to assess the risk of damage due to natural calamities such as floods, cyclones (e.g., Sidr), and storms in the coastal areas of Bangladesh. The proposed statistical method for solving fuzzy multi-objective linear programming problems is compared with Chandra Sen's method. The numerical results show that the proposed method maximizes the risk reduction capacity better than Chandra Sen's method.
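A closely related construction often used alongside such comparisons is the max-min fuzzy programming step, sketched below under assumed problem data: each objective receives a linear membership between its worst and best individual values, and a single LP maximizes the smallest membership. This is a generic textbook formulation, not necessarily the paper's exact model.

```python
# Sketch: max-min fuzzy programming for a two-objective LP. Each objective z_k
# gets a linear membership between its worst (L_k) and best (U_k) individual
# value, and one LP maximizes the smallest membership lambda.
# Illustrative problem data; not the formulation used in the paper.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0], [3.0, 1.0]])
b = np.array([10.0, 15.0])
bounds = [(0, None), (0, None)]
C = np.array([[3.0, 2.0], [1.0, 4.0]])      # maximization objectives

# Best value U_k of each objective and its worst value L_k over the
# individually optimal solutions (the usual pay-off-table construction).
sols = [linprog(-c, A_ub=A, b_ub=b, bounds=bounds, method="highs").x for c in C]
payoff = np.array([[c @ x for c in C] for x in sols])
U, L = payoff.max(axis=0), payoff.min(axis=0)

# Variables: (x1, x2, lambda). Maximize lambda subject to
#   c_k . x - (U_k - L_k) * lambda >= L_k   and the original constraints.
n = A.shape[1]
A_ub = np.vstack([np.hstack([A, np.zeros((A.shape[0], 1))]),          # A x <= b
                  np.hstack([-C, (U - L).reshape(-1, 1)])])           # memberships
b_ub = np.concatenate([b, -L])
res = linprog(np.r_[np.zeros(n), -1.0], A_ub=A_ub, b_ub=b_ub,
              bounds=bounds + [(0, 1)], method="highs")
print("x* =", res.x[:n], " lambda =", res.x[n])
```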
The correlation between close-in super-Earths and distant cold Jupiters in planetary systems has important implications for their formation and evolution. Contrary to some earlier findings, a recent study conducted by Bonomo et al. suggests that the occurrence of cold Jupiter companions is not excessive in super-Earth systems. Here we show that this discrepancy can be seen as a Simpson's paradox and is resolved once the metallicity dependence of the super-Earth-cold Jupiter relation is taken into account. A common feature is noticed: almost all of the cold Jupiter detections with inner super-Earth companions are found around metal-rich stars. Focusing on the Sun-like hosts with super-solar metallicities, we show that the frequency of cold Jupiters conditioned on the presence of inner super-Earths is 39_(-11)^(+12)%, whereas the frequency of cold Jupiters in the same metallicity range is no more than 20%. Therefore, the occurrences of close-in super-Earths and distant cold Jupiters appear correlated around metal-rich hosts. The relation between the two types of planets remains unclear for metal-poor hosts due to the limited sample size and the much lower occurrence rate of cold Jupiters, but a correlation between the two cannot be ruled out.
We present a study of low surface brightness galaxies (LSBGs) selected by fitting the images of all the galaxies in the α.40 SDSS DR7 sample with two kinds of single-component models and two kinds of two-component (disk+bulge) models: single exponential, single Sérsic, exponential+de Vaucouleurs (exp+deV), and exponential+Sérsic (exp+ser). Under the criteria of B-band disk central surface brightness μ_(0,disk)(B)≥22.5 mag arcsec^(-2) and axis ratio b/a>0.3, we selected four non-edge-on LSBG samples from the four models, containing 1105, 1038, 207, and 75 galaxies, respectively. There are 756 galaxies in common between the LSBGs selected by the exponential and Sérsic models, corresponding to 68.42% of the LSBGs selected by the exponential model and 72.83% of those selected by the Sérsic model; the rest of the discrepancy is due to the difference in obtaining μ_(0) between the exponential and Sérsic models. Based on the fitting, in the range 0.5≤n≤1.5, the relation between the μ_(0) values from the two models can be written as μ_(0,Sérsic)-μ_(0,exp)=-1.34(n-1). The LSBGs selected by the disk+bulge models (LSBG_(2comps)) are more massive than the LSBGs selected by the single-component models (LSBG_(1comp)), and also show a larger disk component. Although the bulges in the majority of our LSBG_(2comps) are not prominent, more than 60% of our LSBG_(2comps) would not be selected if we adopted a single-component model only. We also identified 31 giant low surface brightness galaxies (gLSBGs) from the LSBG_(2comps). They are located in the same region of the color-magnitude diagram as other gLSBGs. After comparing different criteria for gLSBG selection, we find that for gas-rich LSBGs, M_(*)>10^(10)M_⊙ is the best criterion to distinguish between gLSBGs and normal LSBGs with bulges.
Glitch activity refers to the mean increase in pulsar spin frequency per year due to rotational glitches. It is an important tool for studying super-nuclear matter using neutron star interiors as templates. Glitch events are typically observed in the spin frequency (ν) and frequency derivative (ν̇) of pulsars. The rate of glitch recurrence decreases as the pulsar ages, and the activity parameter is usually measured by linear regression of cumulative glitches over a given period. This method is effective for pulsars with multiple regular glitch events. However, due to the scarcity of glitch events and the difficulty of monitoring all known pulsars, only a few have multiple records of glitch events. This limits the use of the activity parameter in studying neutron star interiors with multiple pulsars. In this study, we examined the relationship between the activity parameters and the pulsar spin parameters (spin frequency, frequency derivative, and pulsar characteristic age). We found that a quadratic function provides a better fit for the relationship between activity parameters and spin parameters than the commonly used linear functions. Using this information, we were able to estimate the activity parameters of other pulsars that do not have records of glitches. Our analysis shows that the relationship between the estimated activity parameters and pulsar spin parameters is consistent with that of the observed activity parameters in the ensemble of pulsars.
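A minimal sketch of the quadratic-versus-linear comparison is given below, using synthetic values of a spin parameter and the activity parameter in log space; the actual variables, units, and fitting details in the study may differ.

```python
# Sketch: quadratic versus linear fit of the glitch activity parameter against
# a spin parameter (here, log of the spin-frequency derivative). Synthetic data;
# the paper's variables, units, and fitting details may differ.
import numpy as np

rng = np.random.default_rng(3)
log_nudot = np.linspace(-16, -10, 40)                       # log10 |nu_dot|
log_activity = 0.08 * log_nudot**2 + 2.6 * log_nudot + 20 + rng.normal(0, 0.3, 40)

lin = np.polyfit(log_nudot, log_activity, 1)
quad = np.polyfit(log_nudot, log_activity, 2)

rss_lin = np.sum((log_activity - np.polyval(lin, log_nudot))**2)
rss_quad = np.sum((log_activity - np.polyval(quad, log_nudot))**2)
print("RSS linear:", round(rss_lin, 2), " RSS quadratic:", round(rss_quad, 2))

# Predict the activity parameter of a pulsar without recorded glitches
# from its measured spin-frequency derivative (hypothetical value).
print("predicted log activity at log|nu_dot| = -12.5:",
      round(np.polyval(quad, -12.5), 2))
```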
A method named the interval analysis method, which solves for the buckling load of composite laminates with uncertainties, is presented. Based on interval mathematics and Taylor series expansion, the interval analysis method is used to deal with uncertainties. The probabilistic characteristics of the uncertain variables need not be known; only limited information on the physical properties of the material is needed, namely the upper and lower bounds of each uncertain variable. The interval of the structural response can therefore be obtained with less computational effort. The interval analysis method is efficient when the probabilistic approach cannot work well because of small samples and deficient statistical characteristics. For the buckling load of special cross-ply laminates and antisymmetric angle-ply laminates with all edges simply supported, calculations and comparisons between the interval analysis method and the probability method are performed.
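A minimal sketch of first-order Taylor interval propagation is shown below, with a toy response function standing in for the buckling load; real laminate buckling formulas and the paper's bounds are not reproduced.

```python
# Sketch: first-order Taylor interval propagation. The toy function f stands in
# for the buckling load as a function of material properties; real laminate
# buckling formulas are more involved.
import numpy as np

def f(x):
    e1, e2, g12 = x          # toy "stiffness" inputs
    return 2.0 * e1**0.5 + 0.7 * e2 + 0.3 * g12

x0 = np.array([140.0, 10.0, 5.0])       # nominal values (e.g., GPa)
dx = np.array([7.0, 0.5, 0.25])         # half-widths of the input intervals

# Finite-difference gradient at the nominal point.
grad = np.zeros_like(x0)
h = 1e-6
for i in range(len(x0)):
    step = np.zeros_like(x0)
    step[i] = h * max(1.0, abs(x0[i]))
    grad[i] = (f(x0 + step) - f(x0 - step)) / (2.0 * step[i])

half_width = np.sum(np.abs(grad) * dx)  # first-order interval half-width
center = f(x0)
print(f"response interval ~ [{center - half_width:.3f}, {center + half_width:.3f}]")
```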
According to the needs of project design for offshore and coastal engineering, this paper statistically analyses the annual extreme wave data acquired at 19 observation stations along the coast of China. Five kinds of distribution curves are adopted: Pearson III (P-III), Log-Extreme I (LE), Log-Normal (LN), Weibull (W), and Exponential Γ (EΓ), to check their adaptability to the long-term distribution of annual extreme waves in the China Sea areas. The New Curve Fitting (NFIT) method and the Probability Weighted Moments (PWM) method are used to estimate the distribution parameters and thereby to derive the design wave parameters with different return periods at the 19 observation stations. The test results show that by combining the EΓ distribution with NFIT parameter estimation and computer-based optimum seeking, the design wave parameters can be estimated with high accuracy, speed, and efficiency, and the randomness of the estimated results can be avoided.
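As an illustration of the distribution-fitting step, here is a minimal sketch that fits a Weibull distribution to synthetic annual maxima by plain maximum likelihood and reads off design values at several return periods; the NFIT and PWM estimators used in the paper differ from the MLE shown here.

```python
# Sketch: fit a Weibull distribution to annual-maximum wave heights and derive
# design values for given return periods. Synthetic annual maxima; the paper's
# NFIT and PWM parameter estimators differ from the plain MLE used here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
annual_max = stats.weibull_min.rvs(c=2.0, loc=1.0, scale=3.0,
                                   size=40, random_state=rng)   # metres, 40 years

c, loc, scale = stats.weibull_min.fit(annual_max)

for T in (10, 25, 50, 100):                       # return periods in years
    p = 1.0 - 1.0 / T                             # non-exceedance probability
    h_design = stats.weibull_min.ppf(p, c, loc=loc, scale=scale)
    print(f"{T:4d}-year design wave height ~ {h_design:.2f} m")
```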
We present a statistical method to derive the stellar density profiles of the Milky Way from spectroscopic survey data, taking selection effects into account. We assume that the selection function of the spectroscopic survey, which can be altered during observations and data reduction, is based on photometric colors and magnitude. The underlying selection function for a line of sight can then be recovered well by comparing the distribution of the spectroscopic stars in a color-magnitude plane with that of the photometric dataset. Subsequently, the stellar density profile along a line of sight can be derived from the spectroscopically measured stellar density profile multiplied by the selection function. The method is validated using Galaxia mock data with two different selection functions. We demonstrate that the derived stellar density profiles reconstruct the true ones well, not only for the full set of targets but also for sub-populations selected from the full dataset. Finally, the method is applied to map the density profiles of the Galactic disk and halo, using the LAMOST RGB stars. The Galactic disk extends to about R = 19 kpc, where the disk still contributes about 10% to the total stellar surface density. Beyond this radius, the disk smoothly transitions to the halo without any truncation, bending, or breaking. Moreover, no over-density corresponding to the Monoceros ring is found in the Galactic anti-center direction. The disk shows moderate north-south asymmetry at radii larger than 12 kpc. On the other hand, the R-Z tomographic map directly shows that the stellar halo is substantially oblate within a Galactocentric radius of 20 kpc and gradually becomes nearly spherical beyond 30 kpc.
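A minimal sketch of the selection-function recovery is given below: the ratio of spectroscopic to photometric counts on a color-magnitude grid defines the selection function, and its inverse weights the spectroscopic stars. The catalogs and the magnitude-dependent targeting are synthetic assumptions for illustration.

```python
# Sketch: recover a line-of-sight selection function as the ratio of
# spectroscopic to photometric counts in color-magnitude cells, then weight
# the spectroscopic stars by its inverse. Synthetic catalogs for illustration.
import numpy as np

rng = np.random.default_rng(5)
# Photometric sample: (color, magnitude) for all stars along the line of sight.
phot = np.column_stack([rng.normal(0.8, 0.3, 20000), rng.uniform(12, 18, 20000)])
# Spectroscopic targets: a magnitude-dependent subset of the photometric sample.
keep = rng.random(len(phot)) < np.clip(1.2 - 0.06 * phot[:, 1], 0.05, 1.0)
spec = phot[keep]

color_bins = np.linspace(0.0, 1.6, 17)
mag_bins = np.linspace(12, 18, 13)
n_phot, _, _ = np.histogram2d(phot[:, 0], phot[:, 1], bins=[color_bins, mag_bins])
n_spec, _, _ = np.histogram2d(spec[:, 0], spec[:, 1], bins=[color_bins, mag_bins])

selection = np.divide(n_spec, n_phot, out=np.zeros_like(n_spec), where=n_phot > 0)

# Weight each spectroscopic star by 1/S in its cell to recover photometric counts.
ci = np.clip(np.digitize(spec[:, 0], color_bins) - 1, 0, len(color_bins) - 2)
mi = np.clip(np.digitize(spec[:, 1], mag_bins) - 1, 0, len(mag_bins) - 2)
weights = 1.0 / np.maximum(selection[ci, mi], 1e-6)
print("photometric stars:", len(phot), " corrected spectroscopic sum:",
      int(weights.sum()))
```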
Groundwater is considered one of the most important sources of water supply in Iran. The Fasa Plain in Fars Province, southern Iran, is one of the major areas of wheat production using groundwater for irrigation, and a large population also uses local groundwater for drinking. Therefore, this plain was selected to assess the spatial variability of groundwater quality and to identify the main parameters affecting water quality, using multivariate statistical techniques such as Cluster Analysis (CA), Discriminant Analysis (DA), and Principal Component Analysis (PCA). Water quality was monitored at 22 wells over five years (2009-2014) for 10 water quality parameters. Using cluster analysis, the sampling wells were grouped into two clusters with distinct water qualities at different locations. The Lasso Discriminant Analysis (LDA) technique was used to assess the spatial variability of water quality. Based on the results, all of the variables except the sodium adsorption ratio (SAR) are effective in the LDA model, with all variables affording 92.80% correct assignation in discriminating between the clusters from the original 10 variables. Principal component (PC) analysis and factor analysis reduced the complex data matrix to two main components, accounting for more than 95.93% of the total variance. The first PC contained the parameters TH, Ca^(2+), and Mg^(2+); therefore, the first dominant factor was hardness. In the second PC, Cl^(-), SAR, and Na^(+) were the dominant parameters, which may indicate salinity. The extracted factors illustrate natural (the existence of geological formations) and anthropogenic (improper disposal of domestic and agricultural wastes) factors affecting the groundwater quality.
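A minimal sketch of the clustering and PCA steps on standardized water-quality parameters is shown below, using synthetic values for 22 wells; the parameter list is abbreviated and the paper's (Lasso) discriminant analysis is not reproduced.

```python
# Sketch: standardize water-quality parameters, cluster the wells, and extract
# principal components. Synthetic data for 22 wells and a few parameters; the
# paper additionally applies (Lasso) discriminant analysis.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
params = ["TH", "Ca", "Mg", "Cl", "Na", "SAR"]
scales = [20, 8, 4, 25, 20, 0.5]
wells_a = rng.normal([350, 90, 35, 180, 120, 3.0], scales, size=(11, 6))  # harder water
wells_b = rng.normal([220, 55, 20, 320, 210, 6.0], scales, size=(11, 6))  # more saline
X = np.vstack([wells_a, wells_b])

Xs = StandardScaler().fit_transform(X)

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
print("cluster labels per well:", clusters)

pca = PCA(n_components=2).fit(Xs)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
for i, comp in enumerate(pca.components_, start=1):
    top = [params[j] for j in np.argsort(np.abs(comp))[::-1][:3]]
    print(f"PC{i} dominated by:", top)
```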
In radio astronomy, radio frequency interference (RFI) is becoming more and more serious for radio observational facilities. RFI influences the search for and study of interesting astronomical objects, and mitigating it is an essential procedure in any survey data processing. The Five-hundred-meter Aperture Spherical radio Telescope (FAST) is an extremely sensitive radio telescope, so it is necessary to find an effective and precise RFI mitigation method for FAST data processing. In this work, we introduce a method to mitigate the RFI in FAST spectral observations and compile RFI statistics using ~300 h of FAST data. The details are as follows. First, according to the characteristics of FAST spectra, we propose to use the Asymmetrically Reweighted Penalized Least Squares algorithm for baseline fitting; our tests show that it performs well. Second, we flag the RFI with four strategies: flagging extremely strong RFI, flagging long-lasting RFI, flagging polarized RFI, and flagging beam-combined RFI. The tests show that all RFI above a preset threshold can be flagged. Third, we compute the probabilities of polarized XX and YY RFI in FAST observations. The statistical results tell us which frequencies are relatively quiescent, so that RFI-contaminated frequencies can be avoided in our spectral observations. Finally, based on the ~300 h of FAST data, we obtained an RFI table, which is currently the most complete database for FAST.
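A minimal sketch of an arPLS baseline fit on a synthetic spectrum is given below; it follows the published arPLS recipe (asymmetric logistic reweighting of a penalized least-squares smoother) in spirit, but the smoothness parameter, stopping rule, and test spectrum are illustrative assumptions rather than the FAST pipeline settings.

```python
# Sketch: arPLS baseline estimation for a 1-D spectrum, following the spirit of
# Baek et al. (2015). The smoothness parameter lam and the stopping criterion
# are illustrative; FAST pipelines may use different settings.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def arpls(y, lam=1e5, ratio=1e-6, n_iter=50):
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))  # 2nd diff
    H = lam * (D.T @ D)
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve(sparse.csc_matrix(W + H), w * y)   # baseline estimate
        d = y - z
        dn = d[d < 0]
        if dn.size == 0:
            break
        m, s = dn.mean(), dn.std()
        arg = np.clip(2.0 * (d - (2.0 * s - m)) / s, -500, 500)
        w_new = 1.0 / (1.0 + np.exp(arg))              # asymmetric reweighting
        if np.linalg.norm(w - w_new) / np.linalg.norm(w) < ratio:
            break
        w = w_new
    return z

# Synthetic spectrum: curved baseline + one emission line + noise.
x = np.linspace(0, 1, 2000)
rng = np.random.default_rng(7)
spectrum = 5 + 3 * x**2 + 2.0 * np.exp(-0.5 * ((x - 0.6) / 0.005) ** 2) \
           + rng.normal(0, 0.05, x.size)
baseline = arpls(spectrum)
line = spectrum - baseline   # baseline-subtracted spectrum for RFI/line work
print("residual rms off-line:", np.std(line[(x < 0.5) | (x > 0.7)]).round(3))
```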
The spread of the novel coronavirus (COVID-19) has affected almost every aspect of human life around the globe. As the emerging ground and early sufferer of the virus, Wuhan city data remain a case of multifold significance. It is also of notable importance to explore the impact of the unique and unprecedented public health response of the Chinese authorities: the extreme lockdown of the city. In this research, we investigate the statistical nature of the viral transmission with respect to social distancing, extreme quarantine, and robust lockdown interventions. We observed highly convincing and statistically significant evidence in favor of quarantine and social distancing approaches. These findings may help countries now facing, or likely to face, a wave of the virus. We analyzed Wuhan-based data on the number of deaths and confirmed cases, extracted from the China CDC weekly database, from February 13, 2020 to March 24, 2020. To estimate the underlying group structure, the assembled data were subdivided into three blocks, each consisting of two weeks. Thus, the complete dataset is studied in three phases: phase 1 (Ph 1) = February 13, 2020 to February 26, 2020; phase 2 (Ph 2) = February 27, 2020 to March 11, 2020; and phase 3 (Ph 3) = March 12, 2020 to March 24, 2020. The overall median proportion of deaths in those six weeks was 0.0127. This estimate is highly influenced by Ph 1, when the early flaws of a weak health response were still prevalent. Over time, we witnessed a median decline of 92.12% in the death proportions. Moreover, a non-parametric variability analysis of the death data estimated that the average rank of the reported proportions was 7 in Ph 3, compared with 20.5 in Ph 2 and 34.5 in the first phase. Similar patterns were observed in the confirmed-cases data. We estimated the overall median proportion of confirmed cases in Wuhan as 0.0041, which again is highly inclined towards Ph 1 and Ph 2. We also witnessed the minimum average rank proportion for Ph 3, namely 7, which was noticeably lower than those of Ph 2 (21.71) and Ph 1 (32.29). Moreover, the varying degree of clustering indicates that the effectiveness of quarantine-based policies is time-dependent. In general, the decline in coronavirus transmission in Wuhan significantly coincides with the lockdown.
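A minimal sketch of a rank-based comparison across the three phases is shown below, using the Kruskal-Wallis test as one standard non-parametric choice; the proportions are synthetic placeholders, and the paper's exact non-parametric procedure may differ.

```python
# Sketch: rank-based comparison of death proportions across the three phases,
# in the spirit of the non-parametric variability analysis described above.
# Proportions are synthetic placeholders, not the CDC figures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
ph1 = rng.normal(0.030, 0.006, 14)   # daily death proportions, phase 1
ph2 = rng.normal(0.012, 0.003, 14)   # phase 2
ph3 = rng.normal(0.003, 0.001, 14)   # phase 3

h, p = stats.kruskal(ph1, ph2, ph3)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.2g}")

# Average ranks per phase (the kind of statistic quoted in the abstract).
allv = np.concatenate([ph1, ph2, ph3])
ranks = stats.rankdata(allv)
for name, sl in [("Ph1", slice(0, 14)), ("Ph2", slice(14, 28)), ("Ph3", slice(28, 42))]:
    print(name, "mean rank:", round(ranks[sl].mean(), 1))
```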
Making use of the 2MASS Data Release, we have searched for near-infrared (JHK) counterparts to 268 blazars from Donato et al. and obtained 238 counterparts within 5'' in the area covered by 2MASS. This provides a sample with infrared data several times larger than the previous one of the same kind. Based on our sample and the sample of Donato et al., we have compared in detail the properties of HBLs, LBLs, and FSRQs in five aspects, and found that HBLs are significantly different from LBLs and FSRQs, while LBLs are not obviously different from FSRQs. Our results strongly support the division of BL Lac objects into high-frequency peaked (HBL) and low-frequency peaked (LBL) objects introduced by Padovani & Giommi, and show that HBLs and LBLs are two kinds of blazars with different physical properties.
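A minimal sketch of the positional cross-match within 5'' is given below using astropy; the coordinates are invented stand-ins for the blazar list and the 2MASS catalogue.

```python
# Sketch: positional cross-match of a blazar list against a near-infrared
# catalogue within a 5 arcsec radius, using astropy. Coordinates are invented;
# the actual work matched 268 blazars against 2MASS.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

rng = np.random.default_rng(8)
# Hypothetical blazar positions (degrees).
blazars = SkyCoord(ra=rng.uniform(0, 360, 268) * u.deg,
                   dec=rng.uniform(-30, 60, 268) * u.deg)
# Hypothetical infrared catalogue: some true counterparts plus field sources.
irc = SkyCoord(ra=np.concatenate([blazars.ra.deg[:240] + rng.normal(0, 3e-4, 240),
                                  rng.uniform(0, 360, 5000)]) * u.deg,
               dec=np.concatenate([blazars.dec.deg[:240] + rng.normal(0, 3e-4, 240),
                                   rng.uniform(-30, 60, 5000)]) * u.deg)

idx, sep2d, _ = blazars.match_to_catalog_sky(irc)
matched = sep2d < 5 * u.arcsec
print("blazars with a counterpart within 5 arcsec:", int(matched.sum()))
```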