Journal Articles
902 articles found
1. Obtaining Prior Information for Ultrasonic Signal Reconstruction from FRI Sparse Sampling Data
Authors: Shoupeng Song, Yingjie Ni, Yonghua Shao. Journal of Harbin Institute of Technology (New Series), EI CAS, 2018, No. 4, pp. 65-72.
Finite rate of innovation sampling is a novel sub-Nyquist sampling method that can reconstruct a signal from sparse sampling data. The application of this method in ultrasonic testing greatly reduces the signal sampling rate and the quantity of sampling data. However, the pulse number of the signal must be known beforehand for the signal reconstruction procedure. The accuracy of this prior information directly affects the accuracy of the estimated parameters of the signal and influences the assessment of flaws, leading to a lower defect detection ratio. Although the pulse number can be pre-given by theoretical analysis, the process is still unable to assess actual complex random orientation defects. Therefore, this paper proposes a new method that uses singular value decomposition (SVD) for estimating the pulse number from sparse sampling data and avoids the shortcoming of providing the pulse number in advance for signal reconstruction. When the sparse sampling data have been acquired from the ultrasonic signal, these data are transformed to discrete Fourier coefficients. A Hankel matrix is then constructed from these coefficients, and SVD is performed on the matrix. The decomposition coefficients preserve the information of the pulse number. When the decomposition coefficients attributable to noise, determined according to the noise level, are removed, the number of the remaining decomposition coefficients is the signal pulse number. The feasibility of the proposed method was verified through simulation experiments. The applicability was tested in ultrasonic experiments using sample flawed pipelines. Results from simulations and real experiments demonstrated the efficiency of this method.
Keywords: FRI; ultrasonic signal; sparse sampling; signal reconstruction; prior information
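A minimal illustrative sketch of the pulse-number estimation step described in the abstract above, written in Python under the assumption that the sparse samples have already been converted to discrete Fourier coefficients and that a simple relative threshold on the singular values separates signal from noise (the paper's exact noise criterion is not reproduced here):

import numpy as np

def estimate_pulse_number(fourier_coeffs, noise_level):
    """Estimate the pulse number from DFT coefficients of an FRI-sampled signal.
    A Hankel matrix is built from the coefficients, SVD is applied, and singular
    values attributable to noise (below the assumed threshold) are discarded;
    the count of the remaining singular values is taken as the pulse number."""
    m = len(fourier_coeffs) // 2
    H = np.array([[fourier_coeffs[i + j] for j in range(m)] for i in range(m + 1)])
    s = np.linalg.svd(H, compute_uv=False)          # singular values, descending
    return int(np.sum(s > noise_level * s[0]))      # threshold rule is an assumption

In practice the threshold would be tied to the measured noise floor, as the abstract indicates.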
2. Fisher information for generalized Rayleigh distribution in ranked set sampling design with application to parameter estimation
Authors: SHEN Bing-liang, CHEN Wang-xue, WANG Shuo, CHEN Meng. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2022, No. 4, pp. 615-630.
In the current paper, we considered the Fisher information matrix from the generalized Rayleigh (GR) distribution in ranked set sampling (RSS). The numerical results show that the ranked set sample carries more information about λ and α than a simple random sample of equivalent size. In order to give more insight into the performance of RSS with respect to (w.r.t.) simple random sampling (SRS), a modified unbiased estimator and a modified best linear unbiased estimator (BLUE) of the scale and shape parameters λ and α from the GR distribution in SRS and RSS are studied. The numerical results show that the modified unbiased estimator and the modified BLUE of λ and α in RSS are significantly more efficient than the ones in SRS.
Keywords: ranked set sample; Fisher information number; best linear unbiased estimator
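For reference, one common parameterization of the two-parameter generalized Rayleigh (Burr Type X) density, together with the Fisher information it enters, is given below; the abstract does not state the parameterization actually used, so this form is an assumption:

f(x;\alpha,\lambda) = 2\alpha\lambda^{2} x\, e^{-(\lambda x)^{2}} \bigl(1 - e^{-(\lambda x)^{2}}\bigr)^{\alpha-1}, \qquad x > 0,\ \alpha > 0,\ \lambda > 0,

I(\alpha,\lambda)_{jk} = -\mathrm{E}\!\left[ \frac{\partial^{2} \ln f(X;\alpha,\lambda)}{\partial \theta_{j}\, \partial \theta_{k}} \right], \qquad \theta = (\alpha,\lambda).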
3. A Model-calibration Approach to Using Complete Auxiliary Information from Stratified Sampling Survey Data
Authors: WU Chang-chun, ZHANG Run-chu. Chinese Quarterly Journal of Mathematics, CSCD, PKU Core, 2006, No. 2, pp. 309-316.
In stratified survey sampling, sometimes we have complete auxiliary information. One of the fundamental questions is how to effectively use the complete auxiliary information at the estimation stage. In this paper, we extend the model-calibration method to obtain estimators of the finite population mean by using complete auxiliary information from stratified sampling survey data. We show that the resulting estimators effectively use auxiliary information at the estimation stage and possess a number of attractive features, such as being asymptotically design-unbiased irrespective of the working model and approximately model-unbiased under the model. When a linear working model is used, the resulting estimators reduce to the usual calibration estimator (or GREG).
Keywords: model-calibration; pseudo empirical likelihood; stratified sampling survey; complete auxiliary information; estimating equations; generalized linear models; superpopulation
4. Data processing of small samples based on grey distance information approach (Cited by 14)
Authors: Ke Hongfa, Chen Yongguang & Liu Yi (1. Coll. of Electronic Science and Engineering, National Univ. of Defense Technology, Changsha 410073, P. R. China; 2. Unit 63880, Luoyang 471003, P. R. China). Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2007, No. 2, pp. 281-289.
Data processing of small samples is an important and valuable research problem in the electronic equipment test. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use the traditional probability theory to process the samples and assess the degree of uncertainty. Using the grey relational theory and the norm theory, the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples, is proposed in this article. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, with their characteristics and algorithms, are introduced. The correlative problems, including the algorithm of the estimated value, the standard deviation, and the acceptance and rejection criteria of the samples and estimated results, are also proposed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare the different samples. Several examples are given to demonstrate the application of the proposed approach. The examples show that the proposed approach, which has no demand for the probability distribution of small samples, is feasible and effective.
Keywords: data processing; grey theory; norm theory; small samples; uncertainty assessments; grey distance measure; information whitening ratio
5. Selective sampling with Gromov–Hausdorff metric: Efficient dense-shape correspondence via Confidence-based sample consensus
Authors: Dvir GINZBURG, Dan RAVIV. 虚拟现实与智能硬件(中英文) (Virtual Reality & Intelligent Hardware), EI, 2024, No. 1, pp. 30-42.
Background: Functional mapping, despite its proven efficiency, suffers from a "chicken or egg" scenario, in that poor spatial features lead to inadequate spectral alignment and vice versa during training, often resulting in slow convergence, high computational costs, and learning failures, particularly when small datasets are used. Methods: A novel method is presented for dense-shape correspondence, whereby the spatial information transformed by neural networks is combined with the projections onto spectral maps to overcome the "chicken or egg" challenge by selectively sampling only points with high confidence in their alignment. These points then contribute to the alignment and spectral loss terms, boosting training and accelerating convergence by a factor of five. To ensure fully unsupervised learning, the Gromov–Hausdorff distance metric was used to select the points with the maximal alignment score, i.e. those displaying the most confidence. Results: The effectiveness of the proposed approach was demonstrated on several benchmark datasets, with reported results superior to those of spectral- and spatial-based methods. Conclusions: The proposed method provides a promising new approach to dense-shape correspondence, addressing the key challenges in the field and offering significant advantages over current methods, including faster convergence, improved accuracy, and reduced computational costs.
Keywords: dense-shape correspondence; spatial information; neural networks; spectral maps; selective sampling
6. General limited information diffusion method of small-sample information analysis in insurance (Cited by 14)
Authors: 忻莉莉, 耿辉, 王永民, 张晶晶. Journal of Shanghai University (English Edition), CAS, 2007, No. 3, pp. 259-262.
When analyzing and evaluating risks in insurance, people are often confronted with the situation of incomplete information and insufficient data, which is known as a small-sample problem. In this paper, a one-dimensional small-sample problem in insurance was investigated using the kernel density estimation method (KerM) and the general limited information diffusion method (GIDM). In particular, the MacCormack technique was applied to obtain the solutions of the GIDM equations, and the optimal diffusion solution was then acquired based on the two optimization principles. Finally, the analysis introduced in this paper was verified by treating some examples, and satisfying results were obtained.
Keywords: fuzzy mathematics; kernel density estimation; information diffusion; MacCormack technique; small-sample
7. Prediction and Optimization Performance Models for Poor Information Sample Prediction Problems
Authors: LU Fei, SUN Ruishan, CHEN Zichen, CHEN Huiyu, WANG Xiaomin. Transactions of Nanjing University of Aeronautics and Astronautics, EI CSCD, 2021, No. 2, pp. 316-324.
The prediction process often runs with small samples and under-sufficient information. To target this problem, we propose a performance comparison study that combines prediction and optimization algorithms based on experimental data analysis. Through a large number of prediction and optimization experiments, the accuracy and stability of the prediction methods and the correction ability of the optimization methods are studied. First, five traditional single-item prediction methods are used to process small samples with under-sufficient information, and the standard deviation method is used to assign weights to the five methods for combined forecasting. The accuracy of the prediction results is ranked; the mean and variance of the rankings reflect the accuracy and stability of each prediction method. Second, the error elimination prediction optimization method is proposed. The prediction results are corrected by the error elimination optimization method (EEOM), Markov optimization, and two-layer optimization separately to obtain more accurate prediction results. The degree of improvement and decline is used to reflect the correction ability of each optimization method. The results show that the accuracy and stability of combined prediction are the best among the prediction methods, and the correction ability of error elimination optimization is the best among the optimization methods. The combination of the two methods can well solve the problem of prediction with small samples and under-sufficient information. Finally, the accuracy of combining the combined prediction with error elimination optimization is verified by predicting the number of unsafe events in civil aviation in a certain year.
Keywords: small sample and poor information; prediction method performance; optimization method performance; combined prediction; error elimination optimization model; Markov optimization
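A minimal sketch of the standard-deviation weighting idea mentioned in the abstract, assuming combination weights are taken inversely proportional to each method's error standard deviation (the paper's exact weighting formula is not reproduced here):

import numpy as np

def std_dev_weights(errors):
    """errors: array of shape (n_methods, n_periods) of historical prediction errors.
    Returns combination weights inversely proportional to each method's error
    standard deviation (assumed rule), normalized to sum to one."""
    inv = 1.0 / np.asarray(errors, float).std(axis=1)
    return inv / inv.sum()

def combined_forecast(predictions, weights):
    """Weighted combination of the individual methods' forecasts."""
    return np.dot(weights, np.asarray(predictions, float))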
8. The Information Hiding Technology Based on the Similar Sample Blocks
Authors: Tzu-Chuen Lu, Siang-Ru Liao, Chun-Ming Chang. 通讯和计算机(中英文版), 2012, No. 4, pp. 434-443.
Keywords: information hiding technology; sample blocks; similarity; standard image; personal image; secret message; hiding method; grayscale image
9. Hierarchical hybrid testability modeling and evaluation method based on information fusion (Cited by 4)
Authors: Xishan Zhang, Kaoli Huang, Pengcheng Yan, Guangyao Lian. Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2015, No. 3, pp. 523-532.
In order to meet the demand of testability analysis and evaluation for complex equipment under a small sample test in the equipment life cycle, the hierarchical hybrid testability modeling and evaluation method (HHTME), which combines the testability structure model (TSM) with the testability Bayesian networks model (TBNM), is presented. Firstly, the testability network topology of complex equipment is built by using the hierarchical hybrid testability modeling method. Secondly, the prior conditional probability distribution between network nodes is determined through expert experience. Then the Bayesian method is used to update the conditional probability distribution according to history test information, virtual simulation information and similar product information. Finally, the learned hierarchical hybrid testability model (HHTM) is used to estimate the testability of the equipment. Compared with the results of other modeling methods, the relative deviation of the HHTM is only 0.52%, and the evaluation result is the most accurate.
Keywords: small sample; complex equipment; hierarchical hybrid; information fusion; testability modeling and evaluation
10. Iterative Learning Control With Incomplete Information: A Survey (Cited by 12)
Authors: Dong Shen (Senior Member, IEEE). IEEE/CAA Journal of Automatica Sinica, SCIE EI CSCD, 2018, No. 5, pp. 885-901.
This paper conducts a survey on iterative learning control (ILC) with incomplete information and associated control system design, which is a frontier of the ILC field. The incomplete information, including passive and active types, can cause data loss or fragmentation due to various factors. Passive incomplete information refers to incomplete data and information caused by practical system limitations during data collection, storage, transmission, and processing, such as data dropouts, delays, disordering, and limited transmission bandwidth. Active incomplete information refers to incomplete data and information caused by man-made reduction of data quantity and quality on the premise that the given objective is satisfied, such as sampling and quantization. This survey emphasizes two aspects: the first is how to guarantee good learning performance and tracking performance with passive incomplete data, and the second is how to balance the control performance index and data demand by active means. The promising research directions along this topic are also addressed, where data robustness is highly emphasized. This survey is expected to improve understanding of the restrictive relationship and trade-off between incomplete data and tracking performance, quantitatively, and to promote further developments of ILC theory.
Keywords: data dropout; data robustness; incomplete information; iterative learning control (ILC); quantized control; sampled control; varying lengths
11. Optimization of Well Position and Sampling Frequency for Groundwater Monitoring and Inverse Identification of Contamination Source Conditions Using Bayes' Theorem (Cited by 2)
Authors: Shuangsheng Zhang, Hanhu Liu, Jing Qiang, Hongze Gao, Diego Galar, Jing Lin. Computer Modeling in Engineering & Sciences, SCIE EI, 2019, No. 5, pp. 373-394.
Coupling Bayes' Theorem with a two-dimensional (2D) groundwater solute advection-diffusion transport equation allows an inverse model to be established to identify a set of contamination source parameters including source intensity (M), release location (X0, Y0) and release time (T0), based on monitoring well data. To address the issues of insufficient monitoring wells or weak correlation between monitoring data and model parameters, a monitoring well design optimization approach was developed based on the Bayesian formula and information entropy. To demonstrate how the model works, an exemplar problem with an instantaneous release of a contaminant in a confined groundwater aquifer was employed. The information entropy of the posterior distribution of the model parameters was used as a criterion to evaluate the monitoring data quantity index. The optimal monitoring well position and monitoring frequency were solved by the two-step Monte Carlo method and a differential evolution algorithm, given known well monitoring locations and monitoring events. Based on the optimized monitoring well position and sampling frequency, the contamination source was identified by an improved Metropolis algorithm using the Latin hypercube sampling approach. The case study results show that: 1) the optimal monitoring well position (D) is at (445, 200); and 2) the optimal monitoring frequency (Δt) is 7, provided that the number of monitoring events is set as 5. Employing the optimized monitoring well position and frequency, the mean errors of the inverse modeling results for the source parameters (M, X0, Y0, T0) were 9.20%, 0.25%, 0.0061%, and 0.33%, respectively. It was also learnt that the improved Metropolis-Hastings algorithm (a Markov chain Monte Carlo method) can make the inverse modeling result independent of the initial sampling points and achieve an overall optimization, which significantly improved the accuracy and numerical stability of the inverse modeling results.
Keywords: contamination source identification; monitoring well optimization; Bayes' Theorem; information entropy; differential evolution algorithm; Metropolis-Hastings algorithm; Latin hypercube sampling
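A minimal sketch of a generic random-walk Metropolis-Hastings sampler over the source parameters (M, X0, Y0, T0); the paper's improved variant with Latin hypercube sampling of the starting points and the transport-model likelihood are not reproduced here, and log_posterior is assumed to be supplied by the user:

import numpy as np

def metropolis_hastings(log_posterior, theta0, step_sizes, n_iter=10000, seed=0):
    """Generic random-walk Metropolis-Hastings chain over theta = (M, X0, Y0, T0)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta)
    chain = [theta.copy()]
    for _ in range(n_iter):
        proposal = theta + np.asarray(step_sizes, float) * rng.standard_normal(theta.size)
        logp_new = log_posterior(proposal)
        if np.log(rng.random()) < logp_new - logp:   # accept with probability min(1, ratio)
            theta, logp = proposal, logp_new
        chain.append(theta.copy())
    return np.array(chain)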
12. Adaptive sampling strategy for characterizing spatial distribution of soil liquefaction potential using cone penetration test (Cited by 1)
Authors: Zheng Guan, Yu Wang, Tengyuan Zhao. Journal of Rock Mechanics and Geotechnical Engineering, SCIE CSCD, 2022, No. 4, pp. 1221-1231.
Characterizing the spatial distribution of soil liquefaction potential is critical for assessing liquefaction-related hazards (e.g. building damage caused by liquefaction-induced differential settlement). However, in engineering practice, soil liquefaction potential is usually measured at limited locations in a specific site using in situ tests, e.g. cone penetration tests (CPTs), due to restrictions of time, cost and access to the subsurface space. In these cases, the liquefaction potential of soil at untested locations has to be interpreted from the limited measured data points using a proper interpolation method, leading to remarkable statistical uncertainty in liquefaction assessment. This underlines an important question of how to optimize the locations of CPT soundings and determine the minimum number of CPTs for achieving a target reliability level of liquefaction assessment. To tackle this issue, this study proposes a smart sampling strategy for determining the minimum number of CPTs and their optimal locations in a self-adaptive and data-driven manner. The proposed sampling strategy leverages information entropy and Bayesian compressive sampling (BCS). Both simulated and real CPT data are used to demonstrate the proposed method. Illustrative examples indicate that the proposed method can adaptively and sequentially select the required number and optimal locations of CPTs.
Keywords: liquefaction potential; information entropy; cone penetration test (CPT); site characterization; compressive sampling
13. An Information Diffusion Technique for Fire Risk Analysis (Cited by 1)
Authors: 刘静, 黄崇福. Journal of Donghua University (English Edition), EI CAS, 2004, No. 3, pp. 54-57.
There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
Keywords: fire risk; small sample; information distribution; fuzzy relationship
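A minimal sketch of normal information diffusion on a small sample, the general technique underlying the information distribution step described above; the diffusion coefficient rule used in the paper is not given in the abstract, so a simple bandwidth parameter h is assumed:

import numpy as np

def information_diffusion(samples, grid, h):
    """Diffuse each crisp observation onto a monitoring grid with a normal kernel,
    normalize so every observation carries unit information, and sum to obtain an
    estimated frequency distribution usable with very few data points."""
    samples = np.asarray(samples, dtype=float)
    grid = np.asarray(grid, dtype=float)
    kernel = np.exp(-((samples[:, None] - grid[None, :]) ** 2) / (2.0 * h ** 2))
    kernel /= kernel.sum(axis=1, keepdims=True)   # unit information per observation
    q = kernel.sum(axis=0)                        # information gathered at each grid point
    return q / q.sum()                            # normalized frequency estimate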
14. Fault Diagnosis of a Rotary Machine Based on Information Entropy and Rough Set (Cited by 3)
Authors: LI Jian-lan, HUANG Shu-hong. International Journal of Plant Engineering and Management, 2007, No. 4, pp. 199-206.
There exists some discord or contradiction of information during the process of fault diagnosis for rotary machines, and the traditional methods used in fault diagnosis cannot dispose of such information. A model of fault diagnosis for a rotary machine based on information entropy theory and rough set theory is presented in this paper. The model has a clear mathematical definition and can dispose of both completely unified information and completely inconsistent information of vibration faults. By using the model, decision rules for six typical vibration faults of a steam turbine and electric generating set are deduced from experiment samples. Finally, the decision rules are validated by selected samples and good identification results are acquired.
Keywords: fault diagnosis; rough set; information entropy; decision rule; sample; rotary machine
15. Synthetic security assessment for incomplete interval-valued information system (Cited by 1)
Authors: 赵亮, Xue Zhi. High Technology Letters, EI CAS, 2012, No. 2, pp. 160-166.
In order to understand the security conditions of the incomplete interval-valued information system (IIIS) and acquire the corresponding solution of security problems, this paper proposes a multi-attribute group decision-making (MAGDM) security assessment method based on the technique for order performance by similarity to ideal solution (TOPSIS). For an IIIS with preference information, combined with the dominance-based rough set approach (DRSA), the effect of incomplete interval-valued information on decision results is discussed. For imprecise judgment matrices, the security attribute weights can be obtained using Gibbs sampling. A numerical example shows that the proposed method can acquire some valuable knowledge hidden in the incomplete interval-valued information. The effectiveness of the proposed method in the synthetic security assessment for the IIIS is verified.
Keywords: security assessment; incomplete interval-valued information system (IIIS); multi-attribute group decision-making (MAGDM); technique for order performance by similarity to ideal solution (TOPSIS); dominance-based rough set approach (DRSA); Gibbs sampling
16. Short-term load forecasting based on aggregated secondary modal decomposition and Informer (Cited by 5)
Authors: 石卓见, 冉启武, 徐福聪. 电网技术 (Power System Technology), EI CSCD, PKU Core, 2024, No. 6, pp. 2574-2583, I0087-I0091.
To address the non-stationarity of regional-level loads and the low accuracy of long-sequence forecasting, this paper proposes a short-term load forecasting method based on aggregated secondary modal decomposition and Informer. First, improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) is applied to perform an initial decomposition of the load series, weakening the randomness and volatility of the original series. Second, the resulting sub-series are aggregated according to their sample entropy, and the optimal reconstruction scheme is selected by comparing different aggregation schemes. Then, variational mode decomposition is used to perform a secondary decomposition of the high-complexity aggregated mode functions. Taking full account of the influence of factors such as electricity price and weather on the load, the random forest (RF) algorithm is used for correlation analysis, so that a distinct, highly coupled feature matrix is constructed for each sub-series and fed into Informer for modeling; its multi-level encoding and sparse multi-head self-attention mechanism, among other features, improve the forecasting efficiency for the load series. Finally, a Barcelona regional-level load dataset is used for case verification. The results show that the proposed framework effectively resolves the mode mixing and high-frequency component problems in the modal decomposition process, and its long-sequence forecasting root mean square error is reduced by up to 65.28% compared with other classical deep learning models.
Keywords: short-term load forecasting; secondary decomposition; sample entropy; aggregation scheme comparison; Informer; random forest algorithm; long-sequence forecasting
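A minimal sketch of the sample entropy measure used above to group the decomposed sub-series by complexity; the tolerance r is taken as 0.2 times the series' standard deviation, a common default that is assumed rather than taken from the paper:

import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Plain SampEn(m, r): negative log of the ratio of template matches of
    length m+1 to those of length m."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()

    def count_matches(m_len):
        templates = np.array([x[i:i + m_len] for i in range(n - m)])
        # Chebyshev distance between all pairs of templates
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (d <= r).sum() - len(templates)    # exclude self-matches

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf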
17. A UAV path planning algorithm based on improved Informed RRT* in complex environments (Cited by 3)
Authors: 刘文倩, 单梁, 张伟龙, 刘成林, 马强. 上海交通大学学报 (Journal of Shanghai Jiao Tong University), EI CAS CSCD, PKU Core, 2024, No. 4, pp. 511-524.
When unmanned aerial vehicles (UAVs) plan paths in complex environments, the rapidly-exploring random tree (RRT) algorithm is prone to long planning times, redundant paths, and planning failures caused by local constraints in narrow spaces. To address these problems, an improved Informed RRT* algorithm is proposed. First, the artificial potential field method is introduced so that sampling points move toward the goal along the descending potential field, improving the purposefulness and directionality of RRT tree expansion. Then, considering the complexity of the global environment during tree expansion, an adaptive step-size adjustment strategy is introduced to increase the expansion speed of the random tree in obstacle-free regions, and relevant constraints are added during expansion to ensure the feasibility of the generated path. After the first feasible path is found, a shrinking elliptical or ellipsoidal sampling domain is used to restrict the selection of sampling points and the expansion range of the adaptive step size, accelerating convergence to the asymptotic optimum. Finally, comparative experiments between the traditional and improved algorithms in complex 2D and 3D environments show that the improved algorithm finds a better initial path in far fewer iterations and locks onto the elliptical or ellipsoidal sampling domain faster, leaving more time for path optimization and yielding better planning performance.
Keywords: path planning; Informed RRT*; artificial potential field method; adaptive step size; elliptical sampling domain
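A minimal sketch of the informed (elliptical) sampling domain used once a first feasible path is found; this is the standard Informed RRT* ellipse with foci at the start and goal, and it omits the potential-field bias and adaptive step size of the improved algorithm described above:

import numpy as np

def informed_sample_2d(start, goal, c_best, rng):
    """Uniform sample inside the ellipse whose foci are start and goal and whose
    major axis equals the current best path cost c_best (assumes c_best is at
    least the start-goal distance)."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    c_min = np.linalg.norm(goal - start)
    centre = (start + goal) / 2.0
    a1 = (goal - start) / c_min                        # unit vector along the major axis
    C = np.array([[a1[0], -a1[1]],
                  [a1[1],  a1[0]]])                    # rotation from x-axis to a1
    L = np.diag([c_best / 2.0, np.sqrt(c_best**2 - c_min**2) / 2.0])  # semi-axes
    theta = rng.uniform(0.0, 2.0 * np.pi)
    rad = np.sqrt(rng.uniform(0.0, 1.0))
    x_ball = rad * np.array([np.cos(theta), np.sin(theta)])  # uniform in the unit disc
    return C @ L @ x_ball + centre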
18. Sampling strategies for estimating forest cover from remote sensing-based two-stage inventories
Authors: Piermaria Corona, Lorenzo Fattorini, Maria Chiara Pagliarella. Forest Ecosystems, SCIE CSCD, 2015, No. 3, pp. 208-219.
Background: Remote sensing-based inventories are essential in estimating forest cover in tropical and subtropical countries, where ground inventories cannot be performed periodically at a large scale owing to high costs and forest inaccessibility (e.g. REDD projects), and are mandatory for constructing historical records that can be used as forest cover baselines. Given the conditions of such inventories, the survey area is partitioned into a grid of imagery segments of pre-fixed size, where the proportion of forest cover can be measured within segments using a combination of unsupervised (automated or semi-automated) classification of satellite imagery and manual (i.e. visual on-screen) enhancements. Because visual on-screen operations are time-expensive procedures, manual classification can be performed only for a sample of imagery segments selected at a first stage, while forest cover within each selected segment is estimated at a second stage from a sample of pixels selected within the segment. Because forest cover data arising from unsupervised satellite imagery classification may be freely available (e.g. Landsat imagery) over the entire survey area (wall-to-wall data) and are likely to be good proxies of manually classified cover data (sample data), they can be adopted as suitable auxiliary information. Methods: The question is how to choose the sample areas where manual classification is carried out. We have investigated the efficiency of one-per-stratum stratified sampling for selecting the segments and pixels where manual classification is carried out, and the efficiency of the difference estimator for exploiting auxiliary information at the estimation level. The performance of this strategy is compared with simple random sampling without replacement. Results: Our results were obtained theoretically from three artificial populations constructed from the Landsat classification (forest/non-forest) available at pixel level for a study area located in central Italy, assuming three levels of error rates of the unsupervised classification of satellite imagery. The exploitation of map data as auxiliary information in the difference estimator proves to be highly effective with respect to the Horvitz-Thompson estimator, in which no auxiliary information is exploited. The use of one-per-stratum stratified sampling provides relevant improvement with respect to the use of simple random sampling without replacement. Conclusions: The use of one-per-stratum stratified sampling with many imagery segments selected at the first stage and few pixels within each at the second stage, jointly with a difference estimator, proves to be a suitable strategy to estimate forest cover by remote sensing-based inventories.
Keywords: spatially balanced sampling; auxiliary information
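A minimal sketch of the difference estimator of the forest-cover total that exploits the wall-to-wall map data as auxiliary information, assuming design weights appropriate to the chosen sampling scheme are supplied:

import numpy as np

def difference_estimator_total(y_sample, x_sample, x_all, weights):
    """Difference estimator: total of the auxiliary (map) cover over all segments
    plus the design-weighted sum of differences between manually classified cover
    (y) and map cover (x) on the sampled segments:
        Y_hat = sum(x_all) + sum_i w_i * (y_i - x_i)."""
    y_sample = np.asarray(y_sample, float)
    x_sample = np.asarray(x_sample, float)
    x_all = np.asarray(x_all, float)
    weights = np.asarray(weights, float)   # e.g. N/n under SRSWOR, per-stratum weights otherwise
    return x_all.sum() + np.sum(weights * (y_sample - x_sample))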
19. Local Polynomial Regression Estimator of the Finite Population Total under Stratified Random Sampling: A Model-Based Approach
Authors: Charles K. Syengo, Sarah Pyeye, George O. Orwa, Romanus O. Odhiambo. Open Journal of Statistics, 2016, No. 6, pp. 1085-1097.
In this paper, auxiliary information is used to determine an estimator of the finite population total using nonparametric regression under stratified random sampling. To achieve this, a model-based approach is adopted by making use of local polynomial regression estimation to predict the nonsampled values of the survey variable y. The performance of the proposed estimator is investigated against some design-based and model-based regression estimators. The simulation experiments show that the resulting estimator exhibits good properties. Generally, good confidence intervals are seen for the nonparametric regression estimators, and use of the proposed estimator leads to relatively smaller values of RE compared to other estimators.
Keywords: sample surveys; stratified random sampling; auxiliary information; local polynomial regression; model-based approach; nonparametric regression
20. A short-term power load forecasting method combining a secondary decomposition strategy with Informer (Cited by 6)
Authors: 朱莉, 韩凯萍, 朱春强. 国外电子测量技术 (Foreign Electronic Measurement Technology), PKU Core, 2023, No. 6, pp. 23-32.
To address the low forecasting accuracy caused by the volatility and non-stationarity of electric load data, a deep learning load forecasting model with a secondary decomposition and reconstruction strategy is proposed. First, the load data undergo a secondary decomposition combining seasonal-trend decomposition using locally weighted regression (STL) with improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN), and the resulting components are reconstructed according to their sample entropy and maximum information coefficient. Then, a non-stationarity mechanism is introduced into the Informer model, which is fused with a convolutional neural network to forecast the reconstructed components. Finally, the forecasts of the individual components are summed linearly to obtain the final result. Experimental results show that the forecasting errors of the proposed method are lower than those of the compared models on all three evaluation metrics, demonstrating that the method effectively reduces the non-stationarity of the data and improves forecasting accuracy.
Keywords: short-term power load forecasting; secondary decomposition; sample entropy; maximum information coefficient; Informer model
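A minimal sketch of the first stage of the secondary-decomposition strategy, using the STL implementation in statsmodels; the subsequent ICEEMDAN decomposition, the sample-entropy and maximum-information-coefficient reconstruction, and the Informer forecaster are not reproduced here:

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

def first_stage_decomposition(load, period=24):
    """Split the load series into trend, seasonal and remainder components with STL
    (period=24 assumes hourly data). The remainder would then be decomposed again
    and the pieces regrouped before being forecast component by component and
    summed, as described in the abstract."""
    series = pd.Series(np.asarray(load, dtype=float))
    result = STL(series, period=period).fit()
    return result.trend, result.seasonal, result.resid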