Journal Articles
1,902 articles found
1. Research on aiming methods for small sample size shooting tests of two-dimensional trajectory correction fuse
Authors: Chen Liang, Qiang Shen, Zilong Deng, Hongyun Li, Wenyang Pu, Lingyun Tian, Ziyang Lin. Defence Technology, SCIE/EI/CAS/CSCD, 2024, No. 3, pp. 506-517 (12 pages).
The longitudinal dispersion of the projectile in shooting tests of two-dimensional trajectory correction fuses with fixed canards is so large that it sometimes exceeds the correction ability of the fuse actuator. The impact point then easily deviates from the target, so the correction result cannot be readily evaluated; yet shooting tests are too costly to repeat many times for data collection. To address this issue, this study proposes an aiming method for shooting tests based on a small sample size. The proposed method uses the Bootstrap method to expand the test data; repeatedly iterates and corrects the positions of the simulated theoretical impact points through an improved compatibility test; and dynamically adjusts the weight of the prior distribution of simulation results based on the Kullback-Leibler divergence, which to some extent keeps the real data from being "submerged" by the simulation data and achieves a fusion Bayesian estimation of the dispersion center. The experimental results show that when the simulation accuracy is sufficiently high, the proposed method yields a smaller mean-square deviation in estimating the dispersion center and higher shooting accuracy than the three comparison methods, which better reflects the effect of the control algorithm and helps test personnel iterate on their proposed structures and algorithms. In addition, this study provides a knowledge base for further comprehensive studies.
Keywords: two-dimensional trajectory correction fuse; small sample size test; compatibility test; KL divergence; fusion Bayesian estimation
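The Bootstrap expansion step described in the abstract can be sketched in a few lines. This is a minimal sketch, not the paper's implementation; the impact-point coordinates and function names are illustrative:

```python
import random
import statistics

def bootstrap_expand(points, n_resamples, seed=0):
    """Expand a small set of 2-D impact points by resampling with replacement."""
    rng = random.Random(seed)
    return [rng.choice(points) for _ in range(n_resamples)]

def dispersion_center(points):
    """Estimate the dispersion center as the mean impact point."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (statistics.fmean(xs), statistics.fmean(ys))

# Five hypothetical impact points from a small-sample shooting test
impacts = [(1.2, 0.8), (0.9, 1.1), (1.4, 0.7), (1.0, 1.0), (1.1, 0.9)]
center = dispersion_center(bootstrap_expand(impacts, 1000))
```

The paper goes further, fusing such resampled estimates with simulation priors weighted by KL divergence; the sketch only shows the resampling itself.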
2. Design and Research on Identification of Typical Tea Plant Diseases Using Small Sample Learning
Authors: Jian Yang. Journal of Electronic Research and Application, 2024, No. 5, pp. 21-25 (5 pages).
Tea plants are susceptible to diseases during their growth, and these diseases seriously affect the yield and quality of tea. Effective prevention and control requires accurate identification of the diseases. With the development of artificial intelligence and computer vision, automatic recognition of plant diseases from image features has become feasible. Because the support vector machine (SVM) is suitable for high-dimension, high-noise, small-sample learning, this paper uses SVM to segment the disease spots of diseased tea plants. An improved Conditional Deep Convolutional Generative Adversarial Network with Gradient Penalty (C-DCGAN-GP) was used to expand the segmented tea plant spot images. Finally, the Visual Geometry Group 16 (VGG16) deep-learning classification network was trained on the expanded tea lesion images to recognize tea diseases.
Keywords: small sample learning; tea plant disease; VGG16; deep learning
3. Yarn Quality Prediction for Small Samples Based on AdaBoost Algorithm [cited by 1]
Authors: Liu Zhiyu, Chen Nanliang, Wang Jun. Journal of Donghua University (English Edition), CAS, 2023, No. 3, pp. 261-266 (6 pages).
To solve the problems of weak prediction stability and generalization ability of neural network models in yarn quality prediction for small samples, a prediction model based on the AdaBoost algorithm (AdaBoost model) was established. A prediction model based on linear regression (LR model) and one based on a multi-layer perceptron neural network (MLP model) were established for comparison. Prediction experiments for yarn evenness and yarn strength were carried out. Determination coefficients and prediction errors were used to evaluate the prediction accuracy of these models, and K-fold cross validation was used to evaluate their generalization ability. In the experiments, the determination coefficient of the yarn evenness prediction of the AdaBoost model is 76% and 87% higher than that of the LR model and the MLP model, respectively. The determination coefficient of the yarn strength prediction of the AdaBoost model is slightly higher than that of the other two models. Considering that the yarn evenness data set has a weaker linear relationship with the cotton data set than the yarn strength data set does, the AdaBoost model has the best adaptability to nonlinear data among the three models. In addition, the AdaBoost model generally performs better in the cross-validation experiments and in the series of prediction experiments at eight different training set sample sizes. This proves that the AdaBoost model has not only good prediction accuracy but also good prediction stability and generalization ability for small samples.
Keywords: yarn quality prediction; AdaBoost algorithm; small sample; generalization ability
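The two evaluation tools this abstract leans on, the determination coefficient and K-fold cross-validation splits, have compact definitions that can be stated directly. A minimal stdlib sketch with illustrative names:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def k_fold_indices(n, k):
    """Assign indices 0..n-1 to k roughly equal folds for cross-validation."""
    return [[i for i in range(n) if i % k == f] for f in range(k)]
```

A perfect prediction gives R^2 = 1, predicting the mean gives R^2 = 0, and each fold in turn serves as the held-out validation set.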
4. Calculation of Two-Tailed Exact Probability in the Wald-Wolfowitz One-Sample Runs Test
Authors: José Moral de la Rubia. Journal of Data Analysis and Information Processing, 2024, No. 1, pp. 89-114 (26 pages).
The objectives of this paper are to demonstrate the algorithms employed by three statistical software programs (R, Real Statistics using Excel, and SPSS) for calculating the exact two-tailed probability of the Wald-Wolfowitz one-sample runs test for randomness, to present a novel approach for computing this probability, and to compare the four procedures by generating samples of 10 and 11 data points, varying the parameters n0 (number of zeros) and n1 (number of ones) as well as the number of runs. Fifty-nine samples were created to replicate the behavior of the distribution of the number of runs with 10 and 11 data points. The exact two-tailed probabilities from the four procedures were compared using Friedman's test. Given the significant difference in central tendency, post-hoc comparisons were conducted using Conover's test with the Benjamini-Yekutieli correction. It is concluded that the procedures of Real Statistics using Excel and R exhibit some inadequacies in the calculation of the exact two-tailed probability, whereas the new proposal and the SPSS procedure are more suitable. The proposed robust algorithm has a more transparent rationale than that of SPSS, albeit being somewhat more conservative. We recommend its implementation for this test and its application to others, such as the binomial and sign tests.
Keywords: randomness; nonparametric test; exact probability; small samples; quantiles
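The exact null distribution of the number of runs given n0 zeros and n1 ones has a closed combinatorial form, from which a two-tailed p-value can be built. The sketch below uses the "sum of all outcome probabilities no larger than the observed one" convention, which is only one of the conventions the compared packages differ on:

```python
from math import comb

def runs_pmf(r, n0, n1):
    """Exact probability of r runs in a random binary sequence
    with n0 zeros and n1 ones (math.comb returns 0 when k > n)."""
    total = comb(n0 + n1, n0)
    if r % 2 == 0:
        k = r // 2
        return 2 * comb(n0 - 1, k - 1) * comb(n1 - 1, k - 1) / total
    k = (r - 1) // 2
    return (comb(n0 - 1, k - 1) * comb(n1 - 1, k)
            + comb(n0 - 1, k) * comb(n1 - 1, k - 1)) / total

def two_tailed_exact(r_obs, n0, n1):
    """Two-tailed p-value by the small p-value method: sum P(r) over
    every run count r no more probable than the observed one."""
    p_obs = runs_pmf(r_obs, n0, n1)
    return sum(runs_pmf(r, n0, n1) for r in range(2, n0 + n1 + 1)
               if runs_pmf(r, n0, n1) <= p_obs + 1e-12)
```

For n0 = n1 = 5 the distribution is symmetric, so observing the minimum of 2 runs pairs with the maximum of 10 runs in the two-tailed sum.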
5. Metal Corrosion Rate Prediction of Small Samples Using an Ensemble Technique
Authors: Yang Yang, Pengfei Zheng, Fanru Zeng, Peng Xin, Guoxi He, Kexi Liao. Computer Modeling in Engineering & Sciences, SCIE/EI, 2023, No. 1, pp. 267-291 (25 pages).
Accurate prediction of the internal corrosion rate of oil and gas pipelines can be an effective way to prevent pipeline leaks. In this study, a framework for predicting corrosion rates from a small sample of laboratory metal-corrosion data was developed, offering a new perspective on pipeline corrosion when real samples are insufficient. The approach employs the bagging algorithm to construct a strong learner by integrating several KNN learners. A total of 99 data points were collected and split into training and test sets at a 9:1 ratio. The training set was used to obtain the best hyperparameters by 10-fold cross-validation and grid search, and the test set was used to evaluate model performance. The results show that the Mean Absolute Error (MAE) of this framework is 28.06% of that of the traditional model, outperforming other ensemble methods. The proposed framework is therefore suitable for metal corrosion prediction under small sample conditions.
Keywords: oil pipeline; bagging; KNN; ensemble learning; small sample size
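The bagging-of-KNN-learners idea can be illustrated with a toy one-dimensional regressor. The function names and tiny data set below are illustrative; the paper's actual inputs are multi-dimensional corrosion variables:

```python
import random

def knn_predict(train, x, k=3):
    """Predict by averaging the targets of the k nearest training
    points; train is a list of (feature, target) pairs."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return sum(t for _, t in nearest) / len(nearest)

def bagging_knn_predict(train, x, n_learners=10, k=3, seed=0):
    """Bagging: average the KNN predictions made on bootstrap
    resamples of the training set (the paper's 'strong learner')."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_learners):
        sample = [rng.choice(train) for _ in train]
        preds.append(knn_predict(sample, x, k))
    return sum(preds) / len(preds)
```

Averaging over resamples reduces the variance of the individual KNN learners, which is exactly what matters when only 99 samples are available.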
6. Community Structure and Diversity Distributions of Small Mammals in Different Sample Plots in the Eastern Part of Wuling Mountains [cited by 13]
Authors: Liu Jingyuan, Du Hong, Tian Gengbai, Yu Pinhong, Wang Shenwen, Peng Hong. Zoological Research, CAS/CSCD/PKU Core, 2008, No. 6, pp. 637-645 (9 pages).
A five-year (2000-2004) continuous study of small mammals, chiefly rodents, was carried out in seven sample plots, at three altitudes and in six ecological environment types, in the eastern part of the Wuling Mountains on the south bank of the Three Gorges of the Yangtze River in Hubei. A total of 29,297 trap-nights were accumulated; 2,271 small mammals were trapped and another 26 were captured by other means. The captured animals belonged to 24 species in 19 genera and 8 families, of which Rodentia accounted for 70.83% and Insectivora for 29.17%. Analysis of the data showed that: 1) although species richness tended to increase across the sample plots with altitude from south to north, quite a few species had a wide vertical habitat range (15 species occurred across three zones and two species across two zones), indicating strong adaptability of small mammals such as rodents at lower altitudes in most areas and a comparatively small vertical span of the mountains as a whole; 2) whether across the seven sample plots or the six ecological types, Apodemus agrarius and Rattus norvegicus were dominant below 1,200 m, while Anourosorex squamipes, Niviventer confucianus and Apodemus draco were dominant above 1,300 m; their abundances, however, showed no consistent pattern, neither increasing with altitude nor decreasing as the ecological areas changed; 3) density was obviously greater in winter than in spring and tended to increase with altitude, but density across the sample plots showed no consistent pattern, indicating that season and altitude zone had an important impact on small mammals such as rodents; 4) species diversity and evenness indices differed markedly among the seven sample plots, probably because of frequent human disturbance in the area. Comparatively, there was less human disturbance at high altitudes, where vegetation was rich, diversity and evenness indices were high, and the boundary effect and community stability were pronounced. Most ecological types at low altitudes have been seriously disturbed by excessive land clearing, leaving sparse vegetation, low diversity and evenness indices, poor community stability, and an ecosystem with weak resilience. If human interference can be reduced in those communities with low diversity and evenness indices, their biological diversity will gradually recover to levels similar to those of other ecological areas.
Keywords: small mammals; community structure; species diversity; sample plots; eastern Wuling Mountains
7. Wasserstein GAN-Based Small-Sample Augmentation for New-Generation Artificial Intelligence: A Case Study of Cancer-Staging Data in Biology [cited by 16]
Authors: Yufei Liu, Yuan Zhou, Xin Liu, Fang Dong, Chang Wang, Zihong Wang. Engineering, SCIE/EI, 2019, No. 1, pp. 156-163 (8 pages).
It is essential to utilize deep-learning algorithms based on big data for the implementation of the new generation of artificial intelligence. Effective use of deep learning relies considerably on the number of labeled samples, which restricts its application in environments with small sample sizes. In this paper, we propose an approach based on a generative adversarial network (GAN) combined with a deep neural network (DNN). First, the original samples were divided into a training set and a test set. The GAN was trained with the training set to generate synthetic sample data, which enlarged the training set. Next, the DNN classifier was trained with the synthetic samples. Finally, the classifier was tested with the test set, and the effectiveness of the approach for multi-class classification with a small sample size was validated by the evaluation indicators. As an empirical case, the approach was applied to identify the stages of cancers from a small labeled sample. The experimental results verified that the proposed approach achieved greater accuracy than traditional methods. This research is an attempt to transform the classical statistical machine-learning classification method based on original samples into a deep-learning classification method based on data augmentation, and it is expected to expand the application scenarios and effectiveness of new-generation artificial intelligence based on deep learning.
Keywords: artificial intelligence; generative adversarial network; deep neural network; small sample size; cancer
8. Data processing of small samples based on grey distance information approach [cited by 14]
Authors: Ke Hongfa, Chen Yongguang, Liu Yi (1. College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, P. R. China; 2. Unit 63880, Luoyang 471003, P. R. China). Journal of Systems Engineering and Electronics, SCIE/EI/CSCD, 2007, No. 2, pp. 281-289 (9 pages).
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use traditional probability theory to process them and assess their degree of uncertainty. Using grey relational theory and norm theory, this article proposes the grey distance information approach, based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples. The definitions of these two quantities, with their characteristics and algorithms, are introduced. Related problems, including the algorithm for the estimated value, the standard deviation, and the acceptance and rejection criteria for samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weighting algorithm and to compare different samples. Several examples demonstrate that the proposed approach, which makes no demand on the probability distribution of small samples, is feasible and effective.
Keywords: data processing; grey theory; norm theory; small samples; uncertainty assessment; grey distance measure; information whitening ratio
9. Small sample Bayesian analyses in assessment of weapon performance [cited by 6]
Authors: Li Qingmin, Wang Hongwei, Liu Jun. Journal of Systems Engineering and Electronics, SCIE/EI/CSCD, 2007, No. 3, pp. 545-550 (6 pages).
Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analysis in the small-sample circumstance should be considered, with test data supplemented by simulations. Several Bayesian approaches are discussed and some limitations are found. After the limitations of the available Bayesian approaches are analyzed, an improvement is put forward, and the improved approach is applied to the performance assessment of a new weapon.
Keywords: Bayesian approach; small sample; confidence
10. Reliability Assessment for the Solenoid Valve of a High-Speed Train Braking System under Small Sample Size [cited by 9]
Authors: Jian-Wei Yang, Jin-Hai Wang, Qiang Huang, Ming Zhou. Chinese Journal of Mechanical Engineering, SCIE/EI/CAS/CSCD, 2018, No. 3, pp. 189-199 (11 pages).
Reliability assessment of the braking system in a high-speed train under small sample size and zero-failure data is very important for safe operation. Traditional reliability assessment methods perform well only under conditions of large sample size and complete failure data, and show large deviations under small sample size and zero-failure data. To improve on this, a new Bayesian method is proposed. Based on the characteristics of the solenoid valve in the braking system of a high-speed train, the modified Weibull distribution is selected to describe the failure rate over the entire lifetime. Based on the assumption of a binomial distribution for the failure probability at each censored time, a concave method is employed to obtain the relationships between accumulated failure probabilities. A numerical simulation compares the results of the proposed method with those of maximum likelihood estimation and illustrates that the proposed Bayesian model gives a more accurate expectation value when the sample size is less than 12. Finally, the robustness of the model is demonstrated by obtaining the reliability indicators for a numerical case involving the solenoid valve of the braking system, which shows that the change in reliability and failure rate among different hyperparameters is small. The method avoids being misled by subjective information and improves the accuracy of reliability assessment under conditions of small sample size and zero-failure data.
Keywords: zero-failure data; modified Weibull distribution; small sample size; Bayesian method
11. Application of Small Sample Analysis in Life Estimation of Aeroengine Components [cited by 4]
Authors: Nie Ting. Journal of Southwest Jiaotong University (English Edition), 2010, No. 4, pp. 285-288 (4 pages).
Samples in fatigue life tests of aeroengine components are usually fewer than five, so their evaluation is a small sample analysis. The Weibull distribution is known to describe life data accurately, and the Weibayes method (developed from the Bayesian method) expands on empirical data in the small sample analysis of aeroengine fatigue life. Based on Weibull analysis, a program was developed to improve the efficiency of reliability analysis for aeroengine components; it has complete functions and offers highly accurate results. A particular turbine disk's low-cycle fatigue life was evaluated by this program. From the results, two conclusions were drawn: (a) the program can be used for engineering applications, and (b) while the lack of prior test data lowers the validity of the evaluation results, the Weibayes method keeps the results of small sample analysis from deviating far from the truth.
Keywords: aeroengine; life; small sample; Weibull distribution
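The Weibayes method mentioned in the abstract has a standard closed form: with an assumed shape parameter beta and r observed failures among test times t_i, the scale is estimated as eta = (sum t_i^beta / r)^(1/beta), with r = 1 as the conventional conservative choice when no failures occur. A minimal sketch, with illustrative values:

```python
from math import exp

def weibayes_eta(times, beta, failures):
    """Weibayes point estimate of the Weibull scale parameter eta,
    given an assumed shape beta; with zero failures the conventional
    conservative choice is to set the failure count to 1."""
    r = max(failures, 1)
    return (sum(t ** beta for t in times) / r) ** (1.0 / beta)

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull reliability function R(t) = exp(-(t/eta)^beta)."""
    return exp(-((t / eta) ** beta))

# Five hypothetical units tested to 100 cycles each with no failures
eta_hat = weibayes_eta([100.0] * 5, beta=2.0, failures=0)
r_at_100 = weibull_reliability(100.0, 2.0, eta_hat)
```

The assumed beta is exactly where the method "expands on empirical data": it injects prior experience with similar components in place of a shape estimate the tiny sample cannot support.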
12. Gabor-CNN for object detection based on small samples [cited by 4]
Authors: Xiao-dong Hu, Xin-qing Wang, Fan-jie Meng, Xia Hua, Yu-ji Yan, Yu-yang Li, Jing Huang, Xun-lin Jiang. Defence Technology, SCIE/EI/CAS/CSCD, 2020, No. 6, pp. 1116-1129 (14 pages).
Object detection models based on convolutional neural networks (CNNs) have achieved state-of-the-art performance by relying heavily on large-scale training samples. They are insufficient in specific applications, such as the detection of military objects, where large numbers of samples are hard to obtain. To solve this problem, this paper proposes Gabor-CNN for object detection based on a small number of samples. First, a feature-extraction convolution kernel library composed of multi-shape Gabor and color Gabor kernels is constructed, and the optimal Gabor convolution kernel group is obtained by training and screening; convolved with the input image, it yields feature information of objects with a strong auxiliary function. Then, the k-means clustering algorithm is adopted to construct anchor boxes of several sizes, which improves the quality of the region proposals; we call this region proposal process the Gabor-assisted Region Proposal Network (Gabor-assisted RPN). Finally, the Deeply-Utilized Feature Pyramid Network (DU-FPN) method is proposed to strengthen the feature expression of objects in the image: bottom-up and top-down feature pyramids are constructed in ResNet-50, and feature information is deeply utilized through the transverse connection and integration of features at various scales. Experimental results show that the proposed method achieves better accuracy and recall than state-of-the-art contrast models on data sets with small samples, and thus has strong application prospects.
Keywords: deep learning; convolutional neural network; small samples; Gabor convolution kernel; feature pyramid
13. Life prediction and test period optimization research based on small sample reliability test of hydraulic pumps [cited by 5]
Authors: Guo Rui, Ning Chao, Zhao Jingyi, Wang Ping, Shi Yu, Zhou Jinsheng, Luo Jing. High Technology Letters, EI/CAS, 2017, No. 1, pp. 63-70 (8 pages).
Hydraulic pumps are reliable, long-life hydraulic components, so their reliability evaluation involves a long test period, high cost, high power loss, and so on. Based on the principle of energy saving and power recovery, a small sample hydraulic pump reliability test rig is built, the service life of the hydraulic pump is predicted, and the sampling period of the reliability test is optimized. Considering the performance degradation mechanism of the hydraulic pump, the feature information of the degradation distribution of volumetric efficiency during the test is collected, an optimal degradation path model of the feature information is selected on the basis of fitting accuracy, and pseudo life data are obtained. A period-constrained optimization search strategy for the small sample reliability test of hydraulic pumps is then constructed to solve the optimization problem of the test sampling period and the tightening end threshold, and the accuracy of the minimum sampling period is verified by a non-parametric hypothesis test. Simulation results show that the approach has instructional significance and reference value for hydraulic pump reliability life evaluation and for test research and design.
Keywords: hydraulic pump; small sample test; volumetric efficiency; degradation path model; life span; period optimization
14. General limited information diffusion method of small-sample information analysis in insurance [cited by 14]
Authors: Xin Lili, Geng Hui, Wang Yongmin, Zhang Jingjing. Journal of Shanghai University (English Edition), CAS, 2007, No. 3, pp. 259-262 (4 pages).
When analyzing and evaluating risks in insurance, people are often confronted with incomplete information and insufficient data, which is known as a small-sample problem. In this paper, a one-dimensional small-sample problem in insurance was investigated using the kernel density estimation method (KerM) and the general limited information diffusion method (GIDM). In particular, the MacCormack technique was applied to solve the GIDM equations, and the optimal diffusion solution was then acquired based on two optimization principles. Finally, the analysis introduced in this paper was verified on several examples, with satisfying results.
Keywords: fuzzy mathematics; kernel density estimation; information diffusion; MacCormack technique; small sample
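The GIDM equations themselves are not given in the abstract, but the basic information-diffusion idea behind such methods (each observation spreads a normalized kernel over a grid of monitoring points instead of counting as a single point) can be sketched simply. The Gaussian kernel and all names below are illustrative assumptions, not the paper's formulation:

```python
from math import exp

def normal_diffusion(samples, grid, h):
    """Normal information diffusion: each sample spreads a Gaussian
    kernel of width h over the grid, normalized so every sample
    contributes exactly one unit of information in total."""
    q = [0.0] * len(grid)
    for x in samples:
        w = [exp(-((x - u) ** 2) / (2.0 * h * h)) for u in grid]
        s = sum(w)
        for j, wj in enumerate(w):
            q[j] += wj / s
    total = sum(q)
    return [qj / total for qj in q]  # estimated probabilities on the grid

# Three hypothetical loss observations diffused over a coarse grid
probs = normal_diffusion([0.2, 0.5, 0.6], [0.0, 0.25, 0.5, 0.75, 1.0], 0.15)
```

Compared with a raw histogram, the diffused estimate makes fuller use of each observation, which is the point of these methods in small-sample insurance risk analysis.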
15. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay
Authors: Wang Na, Wu Zhihai, Peng Li. Chinese Physics B, SCIE/EI/CAS/CSCD, 2014, No. 10, pp. 617-625 (9 pages).
In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay is proposed for heterogeneous multi-agent systems. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing that the heterogeneous multi-agent systems asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results.
Keywords: heterogeneous multi-agent systems; consensus; sampled data; small sampling delay
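For intuition, a delay-free sampled-data consensus update for identical first-order agents looks as follows. This is a simplified sketch: the paper's protocol additionally handles heterogeneous agent dynamics and the sampling delay, both of which are omitted here:

```python
def consensus_step(states, adjacency, eps):
    """One sampled-data update of a first-order consensus protocol:
    x_i <- x_i + eps * sum_j a_ij * (x_j - x_i)."""
    n = len(states)
    return [states[i] + eps * sum(adjacency[i][j] * (states[j] - states[i])
                                  for j in range(n))
            for i in range(n)]

# Complete graph on three agents; for a step size below 2 / lambda_max(L),
# where L is the graph Laplacian, the states converge to the initial average.
A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
x = [1.0, 5.0, 9.0]
for _ in range(200):
    x = consensus_step(x, A, 0.1)
```

The interesting question the paper answers is how large the sampling period and delay may be before such convergence is lost, which is what the necessary and sufficient conditions characterize.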
16. LF-CNN: Deep Learning-Guided Small Sample Target Detection for Remote Sensing Classification
Authors: Chengfan Li, Lan Liu, Junjuan Zhao, Xuefeng Liu. Computer Modeling in Engineering & Sciences, SCIE/EI, 2022, No. 4, pp. 429-444 (16 pages).
Target detection of small samples against a complex background is always difficult in the classification of remote sensing images. We propose a new small sample target detection method combining local features and a convolutional neural network (LF-CNN), with the aim of detecting small numbers of unevenly distributed ground-object targets in remote sensing images. The k-nearest neighbor method is used to construct the local neighborhood of each point, and the local neighborhoods of the features are extracted one by one from the convolution layer. All local features are aggregated by max pooling to obtain a global feature representation. The classification probability of each category is then calculated using the scaled exponential linear units (SELU) function and the fully connected layer. The experimental results show that the proposed LF-CNN method achieves high accuracy in the detection and classification of hyperspectral remote sensing data under small sample conditions. Despite drawbacks in time and complexity, it integrates the local features of ground-object samples more effectively than traditional target detection methods and improves the accuracy of target identification and detection from small samples of remote sensing images.
Keywords: small samples; local features; convolutional neural network (CNN); k-nearest neighbor (KNN); target detection
17. Model-data-driven seismic inversion method based on small sample data
Authors: LIU Jinshui, SUN Yuhang, LIU Yang. Petroleum Exploration and Development, CSCD, 2022, No. 5, pp. 1046-1055 (10 pages).
As sandstone layers in a thin interbedded section are difficult to identify, conventional model-driven seismic inversion and data-driven seismic prediction methods have low precision in predicting them. To solve this problem, a model-data-driven seismic AVO (amplitude variation with offset) inversion method based on a space-variant objective function has been worked out. In this method, the zero-delay cross-correlation function and the F-norm are used to establish the objective function. Based on inverse distance weighting theory, the objective function changes with the location of the target CDP (common depth point), altering the constraint weights of the training samples, the initial low-frequency models, and the seismic data on the inversion. Hence, the proposed method obtains high-resolution, high-accuracy velocity and density from the inversion of small sample data and is suitable for identifying thin interbedded sand bodies. Tests on thin interbedded geological models show that the method has high inversion accuracy and resolution for small sample data and can identify sandstone and mudstone layers about one-thirtieth of the dominant wavelength thick. Tests on field data from the Lishui sag show that the inversion results have small relative error with respect to well-log data and can identify thin interbedded sandstone layers about one-fifteenth of the dominant wavelength thick.
Keywords: small sample data; space-variant objective function; model-data-driven; neural network; seismic AVO inversion; thin interbedded sandstone identification; Paleocene; Lishui sag
18. Analysis method on shoot precision of weapon in small-sample case
Authors: Jiang Jun, Song Baowei, Liang Qingwei. Journal of Systems Engineering and Electronics, SCIE/EI/CSCD, 2007, No. 4, pp. 781-784 (4 pages).
Because of cost limits, weapon test data are generally scarce, so obtaining scientifically sound weapon performance analyses in the small-sample case is an important topic. Based on an analysis of distribution function characteristics and grey mathematics, a weighted grey method for the small-sample case is presented. Analysis of test data from a weapon shows that the method handles small-sample data well and has high value in weapon performance analysis.
Keywords: weapon; small sample; shooting precision; statistical characteristics
19. Reliability-Based Optimization: Small Sample Optimization Strategy
Authors: Drahomir Novak, Ondrej Slowik, Maosen Cao. Journal of Computer and Communications, 2014, No. 11, pp. 31-37 (7 pages).
The aim of the paper is to present a newly developed approach for reliability-based design optimization. It is based on a double-loop framework in which the outer loop covers the optimization part of the process and the reliability constraints are calculated in the inner loop. The innovation of the suggested approach is the application of a newly developed optimization strategy based on multilevel simulation using an advanced Latin hypercube sampling technique. This method, called aimed multilevel sampling, is designed for optimizing problems in which only a limited number of simulations can be performed because of enormous computational demands.
Keywords: optimization; reliability assessment; aimed multilevel sampling; Monte Carlo; Latin hypercube sampling; probability of failure; reliability-based design optimization; small sample analysis
20. An Improved Algorithm for Imbalanced Data and Small Sample Size Classification
Authors: Yong Hu, Dongfa Guo, Zengwei Fan, Chen Dong, Qiuhong Huang, Shengkai Xie, Guifang Liu, Jing Tan, Boping Li, Qiwei Xie. Journal of Data Analysis and Information Processing, 2015, No. 3, pp. 27-33 (7 pages).
Traditional classification algorithms do not perform well on imbalanced data sets and small sample sizes. To deal with this problem, a novel method is proposed that changes the class distribution by adding virtual samples generated by the windowed regression over-sampling (WRO) method. WRO reflects not only the additive effects but also the multiplicative effects between samples. A comparative study between the proposed method and other over-sampling methods, such as the synthetic minority over-sampling technique (SMOTE) and borderline over-sampling (BOS), on UCI data sets and a Fourier transform infrared spectroscopy (FTIR) data set is provided. Experimental results show that the WRO method achieves better performance than the other methods.
Keywords: class imbalance learning; over-sampling; high-dimensional small-sample size; support vector machine
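The abstract does not give WRO's details, but the interpolation-based over-sampling family it is compared against (SMOTE and its variants) can be sketched simply. This is a SMOTE-style sketch, not WRO itself; the minority points and names are illustrative:

```python
import random

def smote_like(minority, n_new, seed=0):
    """SMOTE-style over-sampling: each synthetic point is a convex
    combination of two randomly chosen minority-class samples."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        b = rng.choice(minority)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(ai + lam * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

# Three hypothetical minority-class feature vectors
minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new_points = smote_like(minority, 20)
```

Pure interpolation like this captures only additive structure between samples; the paper's stated motivation for WRO is to reflect multiplicative effects as well.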