Journal Articles
28 articles found
1. Effective data sampling strategies and boundary condition constraints of physics-informed neural networks for identifying material properties in solid mechanics (Cited by 1)
Authors: W. WU, M. DANEKER, M. A. JOLLEY, K. T. TURNER, L. LU. Applied Mathematics and Mechanics (English Edition) (SCIE, EI, CSCD), 2023, Issue 7, pp. 1039-1068 (30 pages)
Material identification is critical for understanding the relationship between mechanical properties and the associated mechanical functions. However, material identification is a challenging task, especially when the characteristic of the material is highly nonlinear in nature, as is common in biological tissue. In this work, we identify unknown material properties in continuum solid mechanics via physics-informed neural networks (PINNs). To improve the accuracy and efficiency of PINNs, we develop efficient strategies to nonuniformly sample observational data. We also investigate different approaches to enforce Dirichlet-type boundary conditions (BCs) as soft or hard constraints. Finally, we apply the proposed methods to a diverse set of time-dependent and time-independent solid mechanics examples that span linear elastic and hyperelastic material space. The estimated material parameters achieve relative errors of less than 1%. As such, this work is relevant to diverse applications, including optimizing structural integrity and developing novel materials.
Keywords: solid mechanics; material identification; physics-informed neural network (PINN); data sampling; boundary condition (BC) constraint
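The soft-versus-hard distinction for Dirichlet BCs that this abstract mentions can be illustrated without any PINN framework. A hard constraint builds the BCs into the trial solution so they hold exactly for any network output, while a soft constraint only adds a BC penalty to the loss. A minimal pure-Python sketch, where `net` is a hypothetical stand-in for a trained network and the BC values are invented:

```python
import math

def net(x):
    # stand-in for a neural network output N(x; theta)
    return math.sin(3.0 * x) + 0.5 * x * x

def u_hard(x, a=2.0, b=-1.0, L=1.0):
    """Hard-constrained trial solution on [0, L] with u(0)=a, u(L)=b.

    The transform  a + (b - a)*x/L + x*(L - x)*net(x)  satisfies the
    Dirichlet BCs exactly for ANY network output, so no BC penalty is
    needed in the loss.
    """
    return a + (b - a) * x / L + x * (L - x) * net(x)

def bc_penalty(a=2.0, b=-1.0, L=1.0):
    """Soft-constraint alternative: squared BC residuals added to the loss."""
    return (net(0.0) - a) ** 2 + (net(L) - b) ** 2

print(u_hard(0.0), u_hard(1.0))   # exactly 2.0 and -1.0
print(bc_penalty() > 0.0)         # soft penalty is nonzero for this net
```

Because the hard-constrained form satisfies the BCs identically, the optimizer never trades BC accuracy against PDE residual accuracy, which is one reason hard constraints are often preferred when the geometry allows them.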
2. Minimum Data Sampling Method in the Inverse Scattering Problem
Authors: Yu Wenhua (Res. Inst. of EM Field and Microwave Tech., Southwest Jiaotong University, Chengdu 610031, China), Peng Zhongqiu (Beijing Remote Sensing and Information Institute, Beijing 100011, China), Ren Lang (Res. Inst. of EM Field and Microwave Tech., Southwest J...). Journal of Modern Transportation, 1994, Issue 2, pp. 114-118 (5 pages)
The Fourier transform is the basis of the analysis. This paper presents a method for determining the profile of the inverted object in inverse scattering from a minimum amount of sampled data.
Keywords: inverse scattering; nonuniqueness; sampling data
3. Brittleness index predictions from Lower Barnett Shale well-log data applying an optimized data matching algorithm at various sampling densities (Cited by 1)
Author: David A. Wood. Geoscience Frontiers (SCIE, CAS, CSCD), 2021, Issue 6, pp. 444-457 (14 pages)
The capability of accurately predicting mineralogical brittleness index (BI) from basic suites of well logs is desirable, as it provides a useful indicator of the fracability of tight formations. Measuring mineralogical components in rocks is expensive and time consuming. However, the basic well-log curves are not well correlated with BI, so correlation-based machine-learning methods are not able to derive highly accurate BI predictions from such data. A correlation-free, optimized data-matching algorithm is configured to predict BI on a supervised basis from well-log and core data available from two published wells in the Lower Barnett Shale Formation (Texas). This transparent open box (TOB) algorithm matches data records by calculating the sum of squared errors between their variables and selecting the best matches as those with the minimum squared errors. It then applies optimizers to adjust the weights applied to individual variable errors to minimize the root mean square error (RMSE) between calculated and predicted BI. The prediction accuracy achieved by TOB using just five well logs (Gr, ρb, Ns, Rs, Dt) to predict BI depends on the density of data records sampled. At a sampling density of about one sample per 0.5 ft, BI is predicted with RMSE ~0.056 and R² ~0.790. At a sampling density of about one sample per 0.1 ft, BI is predicted with RMSE ~0.008 and R² ~0.995. Adding a stratigraphic height index as an additional (sixth) input variable improves BI prediction accuracy to RMSE ~0.003 and R² ~0.999 for the two wells, with only 1 record in 10,000 yielding a BI prediction error of > ±0.1. The model has the potential to be applied on an unsupervised basis to predict BI from basic well-log data in surrounding wells lacking mineralogical measurements but with similar lithofacies and burial histories. The method could also be extended to predict elastic rock properties and seismic attributes from wells and seismic data, to improve the precision of brittleness index and fracability mapping spatially.
Keywords: well-log brittleness index estimates; data record sample densities; zoomed-in data interpolation; correlation-free prediction analysis; mineralogical and elastic influences
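The data-matching step this abstract describes (sum of squared errors between variables, best matches selected, error weights then optimized) reduces to a few lines. The sketch below fixes the match count at one and the weights at unity; the normalized log values and BI targets are invented for illustration:

```python
# Minimal sketch of correlation-free record matching: score each training
# record by a weighted sum of squared errors over the input variables,
# then predict the target from the best match (a 1-nearest-record rule).
train = [
    # (GR, RHOB, DT) -> BI   (hypothetical, normalized values)
    ((0.20, 0.55, 0.30), 0.62),
    ((0.80, 0.40, 0.70), 0.35),
    ((0.25, 0.50, 0.35), 0.58),
]

def predict_bi(record, weights=(1.0, 1.0, 1.0)):
    def wsse(row):
        return sum(w * (a - b) ** 2 for w, a, b in zip(weights, record, row[0]))
    best = min(train, key=wsse)
    return best[1]

print(predict_bi((0.22, 0.54, 0.31)))  # closest to the first record -> 0.62
```

The actual TOB algorithm averages over the top-k matches and runs an optimizer over the per-variable weights to minimize RMSE against known BI values; this sketch only shows the matching core.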
4. Compressed Least Squares Algorithm of Continuous-Time Linear Stochastic Regression Model Using Sampling Data
Authors: XIE Siyu, ZHANG Shujun, WANG Ziming, GAN Die. Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2024, Issue 4, pp. 1488-1506 (19 pages)
In this paper, the authors consider a sparse parameter estimation problem in continuous-time linear stochastic regression models using sampling data. Based on the compressed sensing (CS) method, the authors propose a compressed least squares (LS) algorithm to deal with the challenges of parameter sparsity. At each sampling time instant, the proposed compressed LS algorithm first compresses the original high-dimensional regressor using a sensing matrix and obtains a low-dimensional LS estimate for the compressed unknown parameter. Then, the original high-dimensional sparse unknown parameter is recovered by a reconstruction method. By introducing a compressed excitation assumption and employing stochastic Lyapunov function and martingale estimate methods, the authors establish the performance analysis of the compressed LS algorithm under a condition on the sampling time interval, without using independence or stationarity conditions on the system signals. Finally, a simulation example is provided to verify the theoretical results by comparing the standard and the compressed LS algorithms for estimating a high-dimensional sparse unknown parameter.
Keywords: compressed excitation condition; compressed sensing; continuous-time model; least squares; linear stochastic regression; parameter identification; sampling data
5. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine (Cited by 1)
Authors: James A. Hodgson, Tiffany H. Seyler, Ernest McGahee, Stephen Arnstein, Lanqing Wang. American Journal of Analytical Chemistry, 2016, Issue 2, pp. 165-178 (14 pages)
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and side-stream smoke. Our laboratory monitors six urinary VNAs: N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR), using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to quantitate these VNAs more efficiently. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (< 10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle.
Keywords: volatile nitrosamines; automation; sample data flow; gas chromatography; tandem mass spectrometry
6. Consensus for second-order multi-agent systems with position sampled data
Authors: 王如生, 高利新, 陈文海, 戴大蒙. Chinese Physics B (SCIE, EI, CAS, CSCD), 2016, Issue 10, pp. 13-23 (11 pages)
In this paper, the consensus problem with position sampled data for second-order multi-agent systems is investigated. The interaction topology among the agents is depicted by a directed graph. Full-order and reduced-order observers with position sampled data are proposed, by which two kinds of sampled-data-based consensus protocols are constructed. With the proposed sampled protocols, the consensus convergence analysis of a continuous-time multi-agent system is equivalently transformed into that of a discrete-time system. Then, by using matrix theory and a sampled control analysis method, some sufficient and necessary consensus conditions based on the coupling parameters, the spectrum of the Laplacian matrix, and the sampling period are obtained. As the sampling period tends to zero, the established necessary and sufficient conditions degenerate to the continuous-time protocol case, consistent with existing results for the continuous-time case. Finally, the effectiveness of the established results is illustrated by a simple simulation example.
Keywords: multi-agent systems; distributed control; consensus; observer; sampled data
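The key step in this abstract, converting the continuous-time consensus problem into a discrete-time one via the sampling period, can be seen in a toy simulation. For a double integrator with zero-order-hold input over a period h, the exact discrete update is x + h·v + h²u/2 and v + h·u. The sketch below uses two agents on a complete graph and, for brevity, feeds back true velocities where the paper would use a sampled-position observer; the gains and sampling period are arbitrary choices:

```python
# Two double-integrator agents under a sampled (ZOH) consensus protocol.
# u_i = -a * (x_i - x_j) - b * v_i, held constant over each period h.
h, a, b = 0.1, 1.0, 2.0
x, v = [0.0, 4.0], [1.0, -1.0]

for _ in range(500):
    u = [-a * (x[i] - x[1 - i]) - b * v[i] for i in range(2)]
    # exact ZOH discretization of x' = v, v' = u with u constant on [kh, (k+1)h]
    x = [x[i] + h * v[i] + 0.5 * h * h * u[i] for i in range(2)]
    v = [v[i] + h * u[i] for i in range(2)]

print(abs(x[0] - x[1]), abs(v[0]))  # both shrink toward 0: consensus
```

With these gains the discrete disagreement dynamics have spectral radius below one, so consensus survives sampling; the paper's conditions characterize exactly which combinations of gains and sampling period preserve that property.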
7. Model-data-driven seismic inversion method based on small sample data
Authors: LIU Jinshui, SUN Yuhang, LIU Yang. Petroleum Exploration and Development (CSCD), 2022, Issue 5, pp. 1046-1055 (10 pages)
As sandstone layers in a thin interbedded section are difficult to identify, conventional model-driven seismic inversion and data-driven seismic prediction methods have low precision in predicting them. To solve this problem, a model-data-driven seismic AVO (amplitude variation with offset) inversion method based on a space-variant objective function is proposed. In this method, the zero-delay cross-correlation function and the F-norm are used to establish the objective function. Based on inverse distance weighting theory, the objective function is varied according to the location of the target CDP (common depth point), changing the constraint weights that the training samples, the initial low-frequency models, and the seismic data exert on the inversion. Hence, the proposed method can obtain high-resolution, high-accuracy velocity and density from inversion of small sample data, and is suitable for identifying thin interbedded sand bodies. Tests with thin interbedded geological models show that the proposed method has high inversion accuracy and resolution for small sample data, and can identify sandstone and mudstone layers about one-thirtieth of the dominant wavelength thick. Tests on field data from the Lishui sag show that the inversion results of the proposed method have small relative error with respect to well-log data, and can identify thin interbedded sandstone layers about one-fifteenth of the dominant wavelength thick with small sample data.
Keywords: small sample data; space-variant objective function; model-data-driven; neural network; seismic AVO inversion; thin interbedded sandstone identification; Paleocene; Lishui sag
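The inverse-distance-weighting idea that makes the objective function space-variant can be sketched directly: samples near the target CDP get large constraint weights, distant ones small. The CDP coordinates and exponent below are illustrative only:

```python
def idw_weights(target_cdp, sample_cdps, p=2.0, eps=1e-12):
    """Normalized inverse-distance weights; closer samples constrain more.

    eps guards against division by zero when a sample sits exactly on
    the target location.
    """
    raw = [1.0 / (abs(target_cdp - c) ** p + eps) for c in sample_cdps]
    total = sum(raw)
    return [r / total for r in raw]

w = idw_weights(105.0, [100.0, 110.0, 200.0])
print(w)  # the two nearby CDPs share most of the weight; the distant one gets little
```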
8. Quality of Life and Cannabis Use: Results from Canadian Sample Survey Data
Authors: Rawan Hassunah, James McIntosh. Health (CAS), 2016, Issue 14, pp. 1576-1588 (13 pages)
Data from the 2013 Canadian Tobacco, Alcohol and Drugs Survey and two other surveys are used to determine the effects of cannabis use on self-reported physical and mental health. Daily or almost daily marijuana use is shown to be detrimental to both measures of health for some age groups, but not all. The age-group-specific effects depend on gender: males and females respond differently to cannabis use. The health costs of regularly using cannabis are significant, but they are much smaller than those associated with tobacco use. These costs are attributed both to the presence of delta-9-tetrahydrocannabinol and to the fact that smoking cannabis is itself a health hazard because of the toxic properties of the smoke ingested. Cannabis use is costlier to regular smokers, and first use below the age of 15 or 20, or being a former user, leads to reduced physical and mental capacities which are permanent. These results strongly suggest that the legalization of marijuana be accompanied by educational programs, counseling services, and a delivery system which minimizes juvenile and young adult usage.
Keywords: marijuana; sample survey data; Canada
9. A New Economy Forecasting Method Based on Data Barycentre Forecasting Method
Authors: Jilin Zhang, Qun Zhang. Chinese Business Review, 2005, Issue 5, pp. 25-28 (4 pages)
A new and useful method of technology economics, a parameter estimation method, is presented in light of the stability of an object's centre of gravity. This method can deal with the fitting and forecasting of economic volumes and can greatly decrease the errors of the fitting and forecasting results. Moreover, the strict hypothetical conditions of the least squares method are not necessary in the presented method, which overcomes the shortcomings of least squares and expands the application of the data barycentre method. An application to forecasting steel consumption volume is presented, and the fitting and forecasting results are satisfactory. Comparison between the data barycentre forecasting method and the least squares method shows that the fitting and forecasting results of the data barycentre method are more stable than those of least squares regression forecasting, and that the computation of the data barycentre forecasting method is simpler. As a result, the data barycentre method is convenient to use in technical economy.
Keywords: data barycentre method; parameter estimation; small sample; steel; forecasting
10. The Importance of Integrating Geological Mapping Information with Validated Assay Data for Generating Accurate Geological Wireframes in Orebody Modelling of Mineral Deposit in Mineral Resource Estimation: A Case Study in AngloGold Ashanti, Obuasi Mine
Authors: Joshua Wereko Opong, Chiri G. Amedjoe, Andy Asante, Matthew Coffie Wilson. International Journal of Geosciences, 2022, Issue 6, pp. 426-437 (12 pages)
The basis of accurate mineral resource estimates is a geological model which replicates the nature and style of the orebody. Key inputs into the generation of a good geological model are the sample data and mapping information. The Obuasi Mine sample data, which carried many legacy issues, were subjected to a robust validation process and integrated with mapping information to generate an accurate geological orebody model for mineral resource estimation in Block 8 Lower. Validation of the sample data focused on replacing missing collar coordinates and missing assays, correcting the magnetic declination used to convert the downhole surveys from true to magnetic, fixing missing lithology, and finally assigning confidence numbers to all the sample data. The replaced missing coordinates ensured that the sample data plotted at their correct locations in space as intended from the planning stage. The magnetic declination, which had been kept constant throughout the years even though it changes every year, was also corrected in the validation project. The corrected magnetic declination ensured that the drillholes plotted on their accurate trajectories as per the planned azimuths and reflected the true positions of the intercepted mineralized fissure(s), which was previously not the case and had been a major flaw in the modelling of the Obuasi orebody. The incorporation of mapped data with the validated sample data in the wireframes resulted in a better interpretation of the orebody. The updated mineral resource, generated by domaining quartz from the sulphides and compared with the old resource, showed that the sulphide tonnes in the old resource estimates were overestimated by 1% and the grade overestimated by 8.5%.
Keywords: mineral resource estimation; geological models; sample data validation; assay data; geological mapping
11. Distributed Least Squares Algorithm of Continuous-Time Stochastic Regression Model Based on Sampled Data
Authors: ZHU Xinghua, GAN Die, LIU Zhixin. Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2024, Issue 2, pp. 609-628 (20 pages)
In this paper, the authors consider the distributed adaptive identification problem over sensor networks using sampled data, where the dynamics of each sensor are described by a stochastic differential equation. By minimizing a local objective function at sampling time instants, the authors propose an online distributed least squares algorithm based on sampled data. A cooperative non-persistent excitation condition is introduced, under which the convergence of the proposed algorithm is established by properly choosing the sampling time interval. An upper bound on the accumulative regret of the adaptive predictor is also provided. Finally, the authors demonstrate the cooperative effect of multiple sensors in the estimation of unknown parameters by computer simulations.
Keywords: cooperative excitation condition; distributed least squares; regret; sampled data; stochastic differential equation
12. A Survey of Data Partitioning and Sampling Methods to Support Big Data Analysis (Cited by 17)
Authors: Mohammad Sultan Mahmud, Joshua Zhexue Huang, Salman Salloum, Tamer Z. Emara, Kuanishbay Sadatdiynov. Big Data Mining and Analytics, 2020, Issue 2, pp. 85-101 (17 pages)
Computer clusters with the shared-nothing architecture are the major computing platforms for big data processing and analysis. In cluster computing, data partitioning and sampling are two fundamental strategies to speed up the computation of big data and increase scalability. In this paper, we present a comprehensive survey of the methods and techniques of data partitioning and sampling with respect to big data processing and analysis. We start with an overview of the mainstream big data frameworks on Hadoop clusters. The basic methods of data partitioning are then discussed, including three classical horizontal partitioning schemes: range, hash, and random partitioning. Data partitioning on Hadoop clusters is also discussed, with a summary of new strategies for big data partitioning, including the new Random Sample Partition (RSP) distributed model. The classical methods of data sampling are then investigated, including simple random sampling, stratified sampling, and reservoir sampling. Two common methods of big data sampling on computing clusters are also discussed: record-level sampling and block-level sampling. Record-level sampling is not as efficient as block-level sampling on big distributed data. On the other hand, block-level sampling on data blocks generated with the classical data partitioning methods does not necessarily produce good representative samples for approximate computing of big data. In this survey, we also summarize the prevailing strategies and related work on sampling-based approximation on Hadoop clusters. We believe that data partitioning and sampling should be considered together to build approximate cluster computing frameworks that are reliable in both the computational and statistical respects.
Keywords: big data analysis; data partitioning; data sampling; distributed and parallel computing; approximate computing
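Of the sampling methods surveyed, reservoir sampling is worth a concrete sketch, since it draws a uniform sample in one pass over a stream whose length is unknown in advance (Algorithm R):

```python
import random

def reservoir_sample(stream, k, seed=42):
    """One-pass uniform sample of k items from a stream of unknown length."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)     # uniform index in [0, i]
            if j < k:
                sample[j] = item         # replace: item kept with prob k/(i+1)
    return sample

s = reservoir_sample(range(1_000_000), 5)
print(s)  # 5 records, each retained with equal probability
```

Record-level sampling of distributed data effectively runs a pass like this over every block, which is one reason the survey finds it slower than block-level sampling: it must touch every record.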
13. Application of Artificial Neural Network to Battlefield Target Classification
Authors: 李芳, 张中民, 李科杰. Journal of Beijing Institute of Technology (EI, CAS), 2000, Issue 2, pp. 201-204 (4 pages)
To study the capacity of artificial neural networks (ANNs) for battlefield target classification and the resulting classification performance, an on-the-spot experiment was carried out, based on the characteristics of battlefield target acoustic and seismic signals, to record the acoustic and seismic signals of a tank and a jeep with a special experiment system. Experiment data processed by the fast Fourier transform (FFT) were used to train the ANN to distinguish the two battlefield targets. The ANN classifier was implemented by a special program based on a modified back propagation (BP) algorithm. The ANN classifier has high correct identification rates for acoustic and seismic signals of battlefield targets and is suitable for the classification of battlefield targets. The modified BP algorithm eliminates the oscillations and local minima of the standard BP algorithm and enhances the convergence rate of the ANN.
Keywords: artificial neural network; sample data; classifier; training
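The pipeline in this abstract, spectral features followed by a trained classifier, can be miniaturized. The sketch below uses a naive DFT (stdlib only) for the FFT step and a nearest-centroid rule in place of the paper's modified-BP network; the two synthetic "targets" simply differ in dominant frequency, which is an invented stand-in for real tank/jeep signatures:

```python
import cmath
import math

def dft_mag(signal):
    """Naive DFT magnitude spectrum (stand-in for FFT feature extraction)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def make_signal(freq, n=64):
    return [math.sin(2 * math.pi * freq * t / n) for t in range(n)]

# one training signature per class (synthetic)
centroids = {"tank": dft_mag(make_signal(4)), "jeep": dft_mag(make_signal(11))}

def classify(signal):
    feats = dft_mag(signal)
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(feats, centroids[label]))
    return min(centroids, key=dist)

print(classify(make_signal(4)), classify(make_signal(11)))
```

Because the magnitude spectrum is invariant to a time shift of a periodic signal, a phase-shifted copy of the 4-cycle signal still classifies as "tank"; that robustness is the point of spectral features.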
14. Fuzzy modeling of multirate sampled nonlinear systems based on multi-model method (Cited by 1)
Authors: WANG Hongwei, FENG Penglong. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2020, Issue 4, pp. 761-769 (9 pages)
Based on the multi-model principle, fuzzy identification for nonlinear systems with multirate sampled data is studied. Firstly, the nonlinear system with multirate sampled data is represented as a nonlinear weighted combination of linear models at multiple local working points. On this basis, the fuzzy model of the multirate sampled nonlinear system is built. The premise structure of the fuzzy model is determined by fuzzy competitive learning, and the conclusion parameters of the fuzzy model are estimated by a stochastic gradient descent algorithm. The convergence of the proposed identification algorithm is established by using the martingale theorem and related lemmas. The fuzzy model of the pH neutralization process of acid-base titration for hair quality detection is constructed to demonstrate the effectiveness of the proposed method.
Keywords: multirate sampled data; nonlinear system; fuzzy model; multi-model
15. Parameter estimation for dual-rate sampled Hammerstein systems with dead-zone nonlinearity (Cited by 1)
Authors: WANG Hongwei, CHEN Yuxiao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2020, Issue 1, pp. 185-193 (9 pages)
The identification of nonlinear systems with multiple sampling rates is a difficult task. The motivation of this paper is to study the parameter estimation problem of Hammerstein systems with dead-zone characteristics by using dual-rate sampled data. Firstly, the auxiliary model identification principle is used to estimate the unmeasurable variables, and a recursive estimation algorithm is proposed to identify the parameters of the static nonlinear model with the dead-zone function and the parameters of the dynamic linear system model. Then, the convergence of the proposed identification algorithm is analyzed by using the martingale convergence theorem. It is proved theoretically that the estimated parameters converge to the real values under the condition of continuous excitation. Finally, the validity of the proposed algorithm is demonstrated by the identification of dual-rate sampled nonlinear systems.
Keywords: dual-rate sampled data; dead-zone nonlinearity; Hammerstein model; system identification; convergence analysis
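The dead-zone nonlinearity at the front of a Hammerstein model is simple to write down; it is the unmeasurable intermediate signal it produces that forces the auxiliary-model trick. A sketch with hypothetical slopes, thresholds, and linear-block coefficients (the dual-rate sampling itself is not modeled here):

```python
def dead_zone(u, m1=1.2, b1=0.4, m2=0.9, b2=-0.3):
    """Static dead-zone input block: no output while u stays in [b2, b1]."""
    if u > b1:
        return m1 * (u - b1)
    if u < b2:
        return m2 * (u - b2)
    return 0.0

def linear_block_step(x_prev, w, a=0.5, b=1.0):
    """First-order linear dynamics driven by the (unmeasured) dead-zone output w."""
    return a * x_prev + b * w

x = 0.0
for u in [0.2, 1.4, -1.3, 0.1]:   # illustrative input sequence
    x = linear_block_step(x, dead_zone(u))
print(x)  # small inputs inside the zone leave the state to decay on its own
```

The identification task in the paper is to recover (m1, b1, m2, b2) and the linear-block coefficients from input and (slow-rate) output samples alone, without ever observing the dead-zone output w.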
16. Synchronization of nonlinear multi-agent systems using a non-fragile sampled data control approach and its application to circuit systems (Cited by 1)
Authors: Stephen AROCKIA SAMY, Raja RAMACHANDRAN, Pratap ANBALAGAN, Yang CAO. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2023, Issue 4, pp. 553-566 (14 pages)
The main aim of this work is to design a non-fragile sampled data control (NFSDC) scheme for the asymptotic synchronization of interconnected coupled circuit systems (multi-agent systems, MASs). NFSDC is used to conduct synchronization analysis of the considered MASs in the presence of time-varying delays. By constructing suitable Lyapunov functions, sufficient conditions are derived in terms of linear matrix inequalities (LMIs) to ensure synchronization between the MAS leader and follower systems. Finally, two numerical examples are given to show the effectiveness of the proposed control scheme and the reduced conservatism of the proposed Lyapunov functions.
Keywords: multi-agent systems (MASs); non-fragile sampled data control (NFSDC); time-varying delay; linear matrix inequality (LMI); asymptotic synchronization
17. An Elimination Algorithm of Extreme Values for Integrated Trispectrum
Authors: Liang Zongchuang, Liu Xingzhao. Journal of Electronics (China), 2002, Issue 2, pp. 146-151 (6 pages)
In this paper, an algorithm for eliminating extreme values and reducing the estimation variance of an integrated trispectrum under low signal-to-noise ratio and short data sample conditions is presented. An analysis of the results of simulations using this algorithm, and a comparison with the conventional power spectrum and integrated trispectrum methods, are presented.
Keywords: integrated trispectrum; short data sample; extreme values
18. Structural Reliability Analysis Based on Support Vector Machine and Dual Neural Network Direct Integration Method
Authors: NIE Xiaobo, LI Haibin. Journal of Donghua University (English Edition) (CAS), 2021, Issue 1, pp. 51-56 (6 pages)
Aiming at the reliability analysis of small sample data or implicit structural functions, a novel structural reliability analysis model based on the support vector machine (SVM) and the dual neural network direct integration method (DNN) is proposed. Firstly, the SVM, with its good small-sample learning ability, is used to train the small sample data, fit the structural performance function, and establish a regular integration region. Secondly, the DNN approximates the integrand to carry out the multiple integration over the integration region. Finally, the structural reliability is obtained from the DNN. Numerical examples are investigated to demonstrate the effectiveness of the present method, which provides a feasible approach to structural reliability analysis.
Keywords: support vector machine (SVM); neural network; direct integration method; structural reliability; small sample data; performance function
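For context, the quantity all such methods target is the failure probability P_f = P(g(X) ≤ 0) for a limit-state (performance) function g. A plain Monte Carlo baseline, which surrogate-plus-integration schemes like the paper's are designed to beat when only a small sample of g evaluations is affordable, looks like this; the limit-state function and input distributions are hypothetical:

```python
import random

def g(x1, x2):
    """Hypothetical limit-state function; the structure fails when g <= 0."""
    return 3.0 - x1 - x2

def failure_probability(n=200_000, seed=0):
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if g(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) <= 0.0)
    return fails / n

pf = failure_probability()
print(pf)  # close to Phi(-3/sqrt(2)) ~ 0.017 for standard normal inputs
```

Plain Monte Carlo needs on the order of 100/P_f evaluations of g for a decent estimate, which is exactly what is infeasible when each evaluation is an expensive structural analysis; hence the appeal of training an SVM surrogate from a small sample and integrating over it instead.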
19. The Design and Data Processing of the Sampling Survey of Children's Situation in China, 1987
Authors: 冯士雍, 王恩平. Acta Mathematicae Applicatae Sinica (SCIE, CSCD), 1990, Issue 4, pp. 351-360 (10 pages)
In July 1987, the Sampling Survey of Children's Situation was conducted in 9 provinces/autonomous regions of China. A stratified two-stage cluster sampling plan was designed for the survey. The paper presents the methods of stratification, of selecting n=2 PSUs (cities/counties) with unequal probabilities without replacement in each stratum, and of selecting residents'/village committees in each sampled city/county. All formulae for estimating population characteristics (especially population totals and ratios of two totals) and for estimating the variances of those estimators are given. Finally, we preliminarily analyse the precision of the survey from the results of the data processing.
Keywords: sampling survey; design and data processing; children's situation in China
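The core estimator behind such a stratified design, the population total as a weighted sum of stratum sample means, fits in a few lines. The stratum sizes and observations below are invented, and the paper's unequal-probability PSU stage is not reproduced:

```python
# Stratified estimate of a population total: T_hat = sum_h N_h * ybar_h.
strata = {
    # stratum: (N_h = number of units in the stratum, sampled y values)
    "urban": (5_000, [2.1, 1.8, 2.4, 2.0]),
    "rural": (20_000, [1.2, 0.9, 1.5, 1.0, 1.4]),
}

def stratified_total(data):
    return sum(N * (sum(ys) / len(ys)) for N, ys in data.values())

def ratio_of_totals(data_y, data_x):
    """Estimator for the ratio of two population totals, as in the paper."""
    return stratified_total(data_y) / stratified_total(data_x)

print(stratified_total(strata))  # 5000 * 2.075 + 20000 * 1.2, about 34375
```

Each stratum contributes in proportion to its population size N_h rather than its sample size, which is what keeps the estimate unbiased when strata are sampled at different rates.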
20. Bayesian Method Reliability of Flight Simulator
Authors: WANG Li, XIONG Jing. International English Education Research, 2017, Issue 1, pp. 76-78 (3 pages)
This paper introduces the basic viewpoints and characteristics of Bayesian statistics, which provide a theoretical basis for solving the small-sample problem of flight simulators with Bayesian methods. A series of formulas is derived to establish a Bayesian reliability modeling and evaluation model for flight simulation equipment. Two key problems of the Bayesian method are pointed out: obtaining the prior distribution of the Weibull parameters, and calculating the posterior distribution and parameter estimates, which have no analytic solution; corresponding solution schemes are proposed.
Keywords: small sample data; flight simulation equipment; reliability modeling; Bayesian method; Weibull parameter
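The abstract's remark that the Weibull posterior has no analytic solution is what drives the paper's numerical scheme. The one tractable special case, shape fixed at 1 (exponential lifetimes) with a conjugate Gamma prior on the failure rate, shows the small-sample mechanics; the prior hyperparameters and failure times below are invented:

```python
def gamma_exponential_posterior(alpha0, beta0, failure_times):
    """Conjugate update: a Gamma(alpha0, beta0) prior on the exponential
    failure rate plus complete failure-time data yields a Gamma posterior
    with alpha = alpha0 + n and beta = beta0 + total observed time."""
    return alpha0 + len(failure_times), beta0 + sum(failure_times)

# prior worth "2 pseudo-failures in 100 hours"; only 3 real observations
alpha, beta = gamma_exponential_posterior(2.0, 100.0, [40.0, 55.0, 70.0])
rate_mean = alpha / beta          # posterior mean failure rate
mtbf_mean = beta / (alpha - 1.0)  # posterior mean time between failures
print(alpha, beta, rate_mean, mtbf_mean)
```

With only three observations the prior still carries real weight (2 of the 5 posterior pseudo-counts), which is precisely the behavior that makes Bayesian methods attractive for small-sample reliability work; the general Weibull case keeps this structure but requires the numerical schemes the paper proposes.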