Journal Articles
51 articles found
1. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay
Authors: Wang Na, Wu Zhihai, Peng Li. Chinese Physics B (SCIE, EI, CAS, CSCD), 2014, Issue 10, pp. 617-625 (9 pages).
In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive necessary and sufficient conditions guaranteeing that heterogeneous multi-agent systems asymptotically achieve stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results.
Keywords: heterogeneous multi-agent systems; consensus; sampled data; small sampling delay
2. Research on aiming methods for small sample size shooting tests of two-dimensional trajectory correction fuse
Authors: Chen Liang, Qiang Shen, Zilong Deng, Hongyun Li, Wenyang Pu, Lingyun Tian, Ziyang Lin. Defence Technology (SCIE, EI, CAS, CSCD), 2024, Issue 3, pp. 506-517 (12 pages).
The longitudinal dispersion of the projectile in shooting tests of two-dimensional trajectory correction fuses with fixed canards is so large that it sometimes exceeds the correction ability of the correction fuse actuator. The impact point easily deviates from the target, and thus the correction result cannot be readily evaluated. However, the cost of shooting tests is too high to conduct many tests for data collection. To address this issue, this study proposes an aiming method for shooting tests based on a small sample size. The proposed method uses the Bootstrap method to expand the test data; repeatedly iterates and corrects the position of the simulated theoretical impact points through an improved compatibility test method; and dynamically adjusts the weight of the prior distribution of simulation results based on Kullback-Leibler divergence, which to some extent avoids the real data being "submerged" by the simulation data and achieves a fused Bayesian estimation of the dispersion center. The experimental results show that when the simulation accuracy is sufficiently high, the proposed method yields a smaller mean-square deviation in estimating the dispersion center and higher shooting accuracy than the three comparison methods, which is more conducive to reflecting the effect of the control algorithm and helps test personnel iterate their proposed structures and algorithms. In addition, this study provides a knowledge base for further comprehensive studies in the future.
Keywords: two-dimensional trajectory correction fuse; small sample size test; compatibility test; KL divergence; fusion Bayesian estimation
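The Bootstrap expansion step described in this abstract can be sketched in a few lines: resample the observed impact points with replacement and use the spread of the resampled means to characterize the dispersion center. The impact data below are hypothetical, and this is only a minimal illustration of the resampling idea, not the paper's full fusion-Bayesian procedure.

```python
import random
import statistics

def bootstrap_center(points, n_resamples=2000, seed=0):
    """Estimate the dispersion center (mean impact point) and its
    standard error by bootstrap resampling of a small sample."""
    rng = random.Random(seed)
    n = len(points)
    centers = []
    for _ in range(n_resamples):
        resample = [points[rng.randrange(n)] for _ in range(n)]
        centers.append(statistics.fmean(resample))
    return statistics.fmean(centers), statistics.stdev(centers)

# Hypothetical longitudinal impact deviations (m) from a small shooting test
impacts = [12.1, -3.4, 8.0, 5.6, -1.2, 9.9, 4.3]
center, se = bootstrap_center(impacts)
```

The bootstrap standard error gives a variability estimate that a seven-shot sample alone cannot provide, which is what makes the subsequent fusion with simulation data possible.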
3. Design and Research on Identification of Typical Tea Plant Diseases Using Small Sample Learning
Authors: Jian Yang. Journal of Electronic Research and Application, 2024, Issue 5, pp. 21-25 (5 pages).
Tea plants are susceptible to diseases during their growth. These diseases seriously affect the yield and quality of tea. Effective prevention and control of diseases requires accurate identification of diseases. With the development of artificial intelligence and computer vision, automatic recognition of plant diseases using image features has become feasible. As the support vector machine (SVM) is suitable for high-dimension, high-noise, and small sample learning, this paper uses the SVM learning method to segment the disease spots of diseased tea plants. An improved Conditional Deep Convolutional Generative Adversarial Network with Gradient Penalty (C-DCGAN-GP) was used to augment the segmented tea plant lesion images. Finally, the Visual Geometry Group 16 (VGG16) deep learning classification network was trained on the expanded tea lesion images to realize tea disease recognition.
Keywords: small sample learning; tea plant disease; VGG16; deep learning
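The per-pixel SVM classification underlying the segmentation step can be illustrated as follows. The RGB color features, cluster centers, and labels below are invented for the sketch; the paper's actual feature set and training data are not specified here.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical per-pixel color features: (R, G, B) values scaled to [0, 1].
# Label 1 marks lesion pixels (brownish), 0 marks healthy leaf (greenish).
rng = np.random.default_rng(0)
healthy = rng.normal([0.2, 0.6, 0.2], 0.05, size=(80, 3))
lesion = rng.normal([0.5, 0.35, 0.2], 0.05, size=(80, 3))
X = np.vstack([healthy, lesion])
y = np.array([0] * 80 + [1] * 80)

# RBF-kernel SVM, suited to small, noisy training sets
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
acc = clf.score(X, y)
```

Labeling every pixel this way yields a binary lesion mask, which is the input the GAN-based augmentation then expands.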
4. Data processing of small samples based on grey distance information approach (cited 14 times)
Authors: Ke Hongfa, Chen Yongguang, Liu Yi (1. College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China; 2. Unit 63880, Luoyang 471003, China). Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2007, Issue 2, pp. 281-289 (9 pages).
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use traditional probability theory to process the samples and assess the degree of uncertainty. Using grey relational theory and norm theory, the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples, is proposed in this article. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, with their characteristics and algorithms, are introduced. Related problems, including the algorithm for the estimated value, the standard deviation, and the acceptance and rejection criteria for the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare different samples. Several examples are given to demonstrate the application of the proposed approach. The examples show that the proposed approach, which makes no demand on the probability distribution of small samples, is feasible and effective.
Keywords: data processing; grey theory; norm theory; small samples; uncertainty assessment; grey distance measure; information whitening ratio
5. Reliability Assessment for the Solenoid Valve of a High-Speed Train Braking System under Small Sample Size (cited 10 times)
Authors: Jian-Wei Yang, Jin-Hai Wang, Qiang Huang, Ming Zhou. Chinese Journal of Mechanical Engineering (SCIE, EI, CAS, CSCD), 2018, Issue 3, pp. 189-199 (11 pages).
Reliability assessment of the braking system in a high-speed train under small sample size and zero-failure data is very important for safe operation. Traditional reliability assessment methods perform well only under conditions of large sample size and complete failure data, which leads to large deviation under conditions of small sample size and zero-failure data. To improve on this problem, a new Bayesian method is proposed. Based on the characteristics of the solenoid valve in the braking system of a high-speed train, the modified Weibull distribution is selected to describe the failure rate over the entire lifetime. Based on the assumption of a binomial distribution for the failure probability at censored times, a concave method is employed to obtain the relationships between accumulated failure probabilities. A numerical simulation is performed to compare the results of the proposed method with those obtained from maximum likelihood estimation, and to illustrate that the proposed Bayesian model exhibits better accuracy for the expectation value when the sample size is less than 12. Finally, the robustness of the model is demonstrated by obtaining the reliability indicators for a numerical case involving the solenoid valve of the braking system, which shows that the change in the reliability and failure rate among the different hyperparameters is small. The method avoids being misled by subjective information and improves the accuracy of reliability assessment under conditions of small sample size and zero-failure data.
Keywords: zero-failure data; modified Weibull distribution; small sample size; Bayesian method
6. Small sample Bayesian analyses in assessment of weapon performance (cited 6 times)
Authors: Li Qingmin, Wang Hongwei, Liu Jun. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2007, Issue 3, pp. 545-550 (6 pages).
Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small sample circumstances should be considered, with the test data supplemented by simulations. Several Bayesian approaches are discussed and some limitations are found. An improvement is put forward after the limitations of the available Bayesian approaches are analyzed, and the improved approach is applied to the assessment of the performance of a new weapon.
Keywords: Bayesian approach; small sample; confidence
7. Gabor-CNN for object detection based on small samples (cited 4 times)
Authors: Xiao-dong Hu, Xin-qing Wang, Fan-jie Meng, Xia Hua, Yu-ji Yan, Yu-yang Li, Jing Huang, Xun-lin Jiang. Defence Technology (SCIE, EI, CAS, CSCD), 2020, Issue 6, pp. 1116-1129 (14 pages).
Object detection models based on convolutional neural networks (CNN) have achieved state-of-the-art performance by relying heavily on large-scale training samples. They are insufficient in specific applications, such as the detection of military objects, where a large number of samples is hard to obtain. To solve this problem, this paper proposes Gabor-CNN for object detection based on a small number of samples. First, a feature extraction convolution kernel library composed of multi-shape Gabor and color Gabor kernels is constructed, and the optimal Gabor convolution kernel group is obtained by training and screening; it is convolved with the input image to obtain feature information of objects with a strong auxiliary function. Then, the k-means clustering algorithm is adopted to construct several anchor boxes of different sizes, which improves the quality of the region proposals. We call this region proposal process the Gabor-assisted Region Proposal Network (Gabor-assisted RPN). Finally, the Deeply-Utilized Feature Pyramid Network (DU-FPN) method is proposed to strengthen the feature expression of objects in the image. Bottom-up and top-down feature pyramids are constructed in ResNet-50, and feature information of objects is deeply utilized through the transverse connection and integration of features at various scales. Experimental results show that the proposed method achieves better accuracy and recall than state-of-the-art contrast models on datasets with small samples, and thus has strong application prospects.
Keywords: deep learning; convolutional neural network; small samples; Gabor convolution kernel; feature pyramid
8. Life prediction and test period optimization research based on small sample reliability test of hydraulic pumps (cited 5 times)
Authors: Guo Rui, Ning Chao, Zhao Jingyi, Wang Ping, Shi Yu, Zhou Jinsheng, Luo Jing. High Technology Letters (EI, CAS), 2017, Issue 1, pp. 63-70 (8 pages).
Hydraulic pumps are reliable, long-life hydraulic components, and their reliability evaluation involves a long test period, high cost, and high power loss. Based on the principle of energy saving and power recovery, a small sample hydraulic pump reliability test rig is built, the service life of the hydraulic pump is predicted, and the sampling period of the reliability test is optimized. Considering the performance degradation mechanism of the hydraulic pump, feature information on the degradation distribution of hydraulic pump volumetric efficiency during the test is collected, an optimal degradation path model of the feature information is selected in terms of fitting accuracy, and pseudo-life data are obtained. Then a period-constrained optimization search strategy for the small sample reliability test of the hydraulic pump is constructed to solve the optimization problem of the test sampling period and the tightening end threshold, and the accuracy of the minimum sampling period is verified by a non-parametric hypothesis test. Simulation results show that the approach has instructive significance and reference value for hydraulic pump reliability life evaluation and for the research and design of such tests.
Keywords: hydraulic pump; small sample test; volumetric efficiency; degradation path model; life span; period optimization
9. Application of Small Sample Analysis in Life Estimation of Aeroengine Components (cited 4 times)
Authors: Nie Ting. Journal of Southwest Jiaotong University (English Edition), 2010, Issue 4, pp. 285-288 (4 pages).
The samples in fatigue life tests of aeroengine components usually number fewer than 5, so the evaluation of these samples belongs to small sample analysis. The Weibull distribution is known to describe life data accurately, and the Weibayes method (developed from the Bayesian method) expands on experiential data in the small sample analysis of fatigue life in aeroengines. Based on Weibull analysis, a program was developed to improve the efficiency of the reliability analysis for aeroengine components. This program has complete functions and offers highly accurate results. A particular turbine disk's low-cycle fatigue life was evaluated by this program. From the results, the following conclusions were drawn: (a) the program can be used for engineering applications, and (b) while a lack of prior test data lowered the validity of the evaluation results, the Weibayes method ensured the results of the small sample analysis did not deviate from the truth.
Keywords: aeroengine; life; small sample; Weibull distribution
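The Weibayes method mentioned in this abstract fixes the Weibull shape parameter β from prior experience and estimates the characteristic life η from the test times, even when no failures are observed. A minimal sketch under the standard convention (with zero failures, the denominator is set to one failure, giving a conservative estimate); the test times and β below are hypothetical:

```python
def weibayes_eta(times, beta, failures=1):
    """Weibayes estimate of the Weibull characteristic life eta with an
    assumed shape parameter beta: eta = (sum(t_i**beta) / r) ** (1/beta).
    With zero observed failures, the convention failures=1 yields a
    conservative lower-bound estimate of eta."""
    return (sum(t ** beta for t in times) / failures) ** (1.0 / beta)

# Hypothetical low-cycle-fatigue test times (cycles) for a turbine disk,
# all unfailed (censored), with beta = 3 assumed from prior experience
times = [12000, 15000, 11000, 14000]
eta = weibayes_eta(times, beta=3.0)
```

Because every unfailed unit contributes its accumulated time, η always exceeds the longest individual test time, reflecting the evidence the censored data carry.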
10. Yarn Quality Prediction for Small Samples Based on AdaBoost Algorithm (cited 1 time)
Authors: Liu Zhiyu, Chen Nanliang, Wang Jun. Journal of Donghua University (English Edition) (CAS), 2023, Issue 3, pp. 261-266 (6 pages).
In order to solve the problems of weak prediction stability and generalization ability of neural network models in yarn quality prediction research with small samples, a prediction model based on the AdaBoost algorithm (AdaBoost model) was established. A prediction model based on a linear regression algorithm (LR model) and a prediction model based on a multi-layer perceptron neural network algorithm (MLP model) were established for comparison. Prediction experiments on yarn evenness and yarn strength were implemented. Determination coefficients and prediction errors were used to evaluate the prediction accuracy of these models, and K-fold cross-validation was used to evaluate their generalization ability. In the prediction experiments, the determination coefficient of the yarn evenness prediction result of the AdaBoost model is 76% and 87% higher than that of the LR model and the MLP model, respectively. The determination coefficient of the yarn strength prediction result of the AdaBoost model is slightly higher than that of the other two models. Considering that the yarn evenness dataset has a weaker linear relationship with the cotton dataset than the yarn strength dataset in this paper, the AdaBoost model has the best adaptability to nonlinear datasets among the three models. In addition, the AdaBoost model shows generally better results in the cross-validation experiments and in the series of prediction experiments at eight different training set sample sizes. It is proved that the AdaBoost model not only has good prediction accuracy but also good prediction stability and generalization ability for small samples.
Keywords: yarn quality prediction; AdaBoost algorithm; small sample; generalization ability
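The model comparison described here can be reproduced in outline with scikit-learn: fit an AdaBoost regressor and a linear regression on the same small sample and compare cross-validated determination coefficients. The dataset below is synthetic (a mildly nonlinear stand-in for the cotton-to-yarn mapping), not the paper's data.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical small sample: 3 cotton fiber properties -> yarn evenness (CV%),
# with a mildly nonlinear relationship plus measurement noise
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(40, 3))
y = 12 + 3 * X[:, 0] ** 2 - 2 * X[:, 1] + rng.normal(0, 0.2, 40)

ada = AdaBoostRegressor(n_estimators=50, random_state=0)
lr = LinearRegression()
# Mean determination coefficient (R^2) over 5-fold cross-validation
ada_r2 = cross_val_score(ada, X, y, cv=5, scoring="r2").mean()
lr_r2 = cross_val_score(lr, X, y, cv=5, scoring="r2").mean()
```

Cross-validated R² rather than training R² is what makes the comparison a test of generalization ability, which is the paper's stated concern for small samples.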
11. Metal Corrosion Rate Prediction of Small Samples Using an Ensemble Technique
Authors: Yang Yang, Pengfei Zheng, Fanru Zeng, Peng Xin, Guoxi He, Kexi Liao. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 1, pp. 267-291 (25 pages).
Accurate prediction of the internal corrosion rates of oil and gas pipelines could be an effective way to prevent pipeline leaks. In this study, a framework for predicting corrosion rates from a small sample of laboratory metal corrosion data was developed to provide a new perspective on solving the problem of pipeline corrosion under the condition of insufficient real samples. The approach employs the bagging algorithm to construct a strong learner by integrating several KNN learners. A total of 99 data points were collected and split into training and test sets at a 9:1 ratio. The training set was used to obtain the best hyperparameters by 10-fold cross-validation and grid search, and the test set was used to determine the performance of the model. The results showed that the Mean Absolute Error (MAE) of this framework is 28.06% of that of the traditional model, and that it outperforms other ensemble methods. Therefore, the proposed framework is suitable for metal corrosion prediction under small sample conditions.
Keywords: oil pipeline; bagging; KNN; ensemble learning; small sample size
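The core idea of the framework, bagging KNN learners to stabilize predictions on a small sample, can be sketched without any ML library: average KNN predictions over bootstrap resamples. The one-feature corrosion data below are hypothetical.

```python
import random

def knn_predict(train, query, k=3):
    """Plain k-nearest-neighbor regression on a single feature:
    average the targets of the k closest training points."""
    nearest = sorted(train, key=lambda p: abs(p[0] - query))[:k]
    return sum(p[1] for p in nearest) / k

def bagged_knn_predict(train, query, k=3, n_bags=25, seed=0):
    """Bagging: average KNN predictions over bootstrap resamples of the
    training set, which stabilises the estimate on small samples."""
    rng = random.Random(seed)
    n = len(train)
    preds = []
    for _ in range(n_bags):
        bag = [train[rng.randrange(n)] for _ in range(n)]
        preds.append(knn_predict(bag, query, k))
    return sum(preds) / n_bags

# Hypothetical lab data: temperature (feature) -> corrosion rate (mm/y)
data = [(20, 0.21), (25, 0.24), (30, 0.30), (35, 0.37),
        (40, 0.45), (45, 0.55), (50, 0.66), (55, 0.80)]
rate = bagged_knn_predict(data, query=42)
```

In the paper's setup the single KNN learner is replaced by this bootstrap-averaged ensemble, and k and the number of bags are the hyperparameters tuned by grid search with 10-fold cross-validation.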
12. Model-data-driven seismic inversion method based on small sample data
Authors: Liu Jinshui, Sun Yuhang, Liu Yang. Petroleum Exploration and Development (CSCD), 2022, Issue 5, pp. 1046-1055 (10 pages).
As sandstone layers in thin interbedded sections are difficult to identify, conventional model-driven seismic inversion and data-driven seismic prediction methods have low precision in predicting them. To solve this problem, a model-data-driven seismic AVO (amplitude variation with offset) inversion method based on a space-variant objective function has been worked out. In this method, the zero-delay cross-correlation function and the F norm are used to establish the objective function. Based on inverse distance weighting theory, the objective function is varied according to the location of the target CDP (common depth point), changing the constraint weights that the training samples, initial low-frequency models, and seismic data exert on the inversion. Hence, the proposed method can obtain high-resolution, high-accuracy velocity and density from the inversion of small sample data, and is suitable for identifying thin interbedded sand bodies. Tests with thin interbedded geological models show that the proposed method has high inversion accuracy and resolution for small sample data, and can identify sandstone and mudstone layers of about one thirtieth of the dominant wavelength in thickness. Tests on field data from the Lishui sag show that the inversion results have small relative error with respect to well-log data, and can identify thin interbedded sandstone layers of about one fifteenth of the dominant wavelength in thickness with small sample data.
Keywords: small sample data; space-variant objective function; model-data-driven; neural network; seismic AVO inversion; thin interbedded sandstone identification; Paleocene; Lishui sag
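The inverse-distance-weighting step that makes the objective function space-variant can be sketched directly: training samples closer to the target CDP receive larger constraint weights. The distances below are hypothetical.

```python
def idw_weights(distances, power=2.0, eps=1e-6):
    """Inverse-distance weights: nearer training CDPs get larger
    constraint weights; weights are normalised to sum to 1.
    eps guards against division by zero at zero distance."""
    raw = [1.0 / (d + eps) ** power for d in distances]
    total = sum(raw)
    return [w / total for w in raw]

# Hypothetical distances (in CDP units) from the target CDP to three
# training-sample locations
w = idw_weights([1.0, 2.0, 4.0])
```

Recomputing these weights at every target CDP is what makes the objective function vary in space while the rest of the inversion machinery stays fixed.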
13. LF-CNN: Deep Learning-Guided Small Sample Target Detection for Remote Sensing Classification
Authors: Chengfan Li, Lan Liu, Junjuan Zhao, Xuefeng Liu. Computer Modeling in Engineering & Sciences (SCIE, EI), 2022, Issue 4, pp. 429-444 (16 pages).
Target detection of small samples with a complex background is always difficult in the classification of remote sensing images. We propose a new small sample target detection method combining local features and a convolutional neural network (LF-CNN), with the aim of detecting small numbers of unevenly distributed ground-object targets in remote sensing images. The k-nearest neighbor method is used to construct the local neighborhood of each point, and the local neighborhoods of the features are extracted one by one from the convolution layer. All local features are aggregated by maximum pooling to obtain a global feature representation. The classification probability of each category is then calculated using the scaled exponential linear units (SELU) function and the fully connected layer. The experimental results show that the proposed LF-CNN method has high accuracy in the detection and classification of hyperspectral remote sensing data under small sample conditions. Despite drawbacks in both time and complexity, the proposed LF-CNN method can more effectively integrate the local features of ground-object samples and improve the accuracy of target identification and detection in small samples of remote sensing images than traditional target detection methods.
Keywords: small samples; local features; convolutional neural network (CNN); k-nearest neighbor (KNN); target detection
14. Reliability-Based Optimization: Small Sample Optimization Strategy
Authors: Drahomir Novak, Ondrej Slowik, Maosen Cao. Journal of Computer and Communications, 2014, Issue 11, pp. 31-37 (7 pages).
The aim of the paper is to present a newly developed approach for reliability-based design optimization. It is based on a double-loop framework in which the outer loop covers the optimization part of the reliability-based optimization process and the reliability constraints are calculated in the inner loop. The innovation of the suggested approach lies in the application of a newly developed optimization strategy based on multilevel simulation using an advanced Latin Hypercube Sampling technique. This method, called Aimed Multilevel Sampling, is designed for optimization problems in which only a limited number of simulations can be performed due to enormous computational demands.
Keywords: optimization; reliability assessment; Aimed Multilevel Sampling; Monte Carlo; Latin Hypercube Sampling; probability of failure; reliability-based design optimization; small sample analysis
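A plain Latin Hypercube Sampling routine, the building block of the Aimed Multilevel Sampling strategy, can be written in a few lines: stratify each dimension into equal intervals and sample each interval exactly once. This sketch covers only basic LHS on the unit cube, not the multilevel aiming logic.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin Hypercube Sampling on the unit cube: each dimension is cut
    into n_samples equal strata and each stratum is sampled exactly once,
    with the strata randomly permuted across samples."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            # Uniform draw inside the assigned stratum
            samples[i][d] = (strata[i] + rng.random()) / n_samples
    return samples

pts = latin_hypercube(n_samples=10, n_dims=2)
```

Compared with crude Monte Carlo, the one-point-per-stratum guarantee is what lets a handful of simulations cover the design space, which is the whole point when each simulation is expensive.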
15. Key indexes identifying approach of weapon equipment system-of-systems effectiveness integrating Bayes method and dynamic grey incidence analysis model
Authors: Zhang Jingru, Fang Zhigeng, Ye Feng, Chen Ding. Journal of Systems Engineering and Electronics (CSCD), 2024, Issue 6, pp. 1482-1490 (9 pages).
Aiming at the multi-stage and (extremely) small sample characteristics of the problem of identifying key effectiveness indexes of weapon equipment system-of-systems (WESoS), a Bayesian intelligent identification and inference model for system effectiveness assessment indexes based on dynamic grey incidence is proposed. The method uses multi-layer Bayesian techniques, makes full use of historical statistics and empirical information, and determines the Bayesian estimation of the incidence degree of indexes, which effectively addresses the small sample size of effectiveness indexes and the difficulty of obtaining incidence rules between indexes. Secondly, the method quantifies the incidence relationship between evaluation indexes and combat effectiveness based on Bayesian posterior grey incidence, and then identifies key system effectiveness evaluation indexes. Finally, the proposed method is applied to a case of screening key effectiveness indexes of a missile defense system; the analysis results show that the method can fuse multi-moment information and extract multi-stage key indexes, and has good data extraction capability in the case of small samples.
Keywords: weapon equipment system-of-systems (WESoS); effectiveness index; system effectiveness; key index; Bayes theorem; grey incidence analysis; (extremely) small samples
16. Calculation of Two-Tailed Exact Probability in the Wald-Wolfowitz One-Sample Runs Test
Authors: José Moral De La Rubia. Journal of Data Analysis and Information Processing, 2024, Issue 1, pp. 89-114 (26 pages).
The objectives of this paper are to demonstrate the algorithms employed by three statistical software programs (R, Real Statistics Using Excel, and SPSS) for calculating the exact two-tailed probability of the Wald-Wolfowitz one-sample runs test for randomness, to present a novel approach for computing this probability, and to compare the four procedures by generating samples of 10 and 11 data points, varying the parameters n0 (number of zeros) and n1 (number of ones) as well as the number of runs. Fifty-nine samples are created to replicate the behavior of the distribution of the number of runs with 10 and 11 data points. The exact two-tailed probabilities for the four procedures were compared using Friedman's test. Given the significant difference in central tendency, post-hoc comparisons were conducted using Conover's test with Benjamini-Yekutieli correction. It is concluded that the procedures of Real Statistics Using Excel and R exhibit some inadequacies in the calculation of the exact two-tailed probability, whereas the new proposal and the SPSS procedure are deemed more suitable. The proposed robust algorithm has a more transparent rationale than the SPSS one, albeit being somewhat more conservative. We recommend its implementation for this test and its application to others, such as the binomial and sign tests.
Keywords: randomness; nonparametric test; exact probability; small samples; quantiles
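The exact distribution of the number of runs that underlies this test has a closed form, so the exact two-tailed probability can be computed directly. The sketch below uses the sum-of-probabilities-no-larger-than-observed convention for the two-tailed value, which is one common choice; the paper compares several such conventions, and this is not necessarily its recommended one.

```python
from math import comb

def runs_pmf(n0, n1):
    """Exact distribution of the number of runs R in a random arrangement
    of n0 zeros and n1 ones (math.comb returns 0 for impossible terms)."""
    total = comb(n0 + n1, n0)
    pmf = {}
    for r in range(2, n0 + n1 + 1):
        if r % 2 == 0:
            k = r // 2
            num = 2 * comb(n0 - 1, k - 1) * comb(n1 - 1, k - 1)
        else:
            k = (r - 1) // 2
            num = (comb(n0 - 1, k - 1) * comb(n1 - 1, k)
                   + comb(n0 - 1, k) * comb(n1 - 1, k - 1))
        if num:
            pmf[r] = num / total
    return pmf

def exact_two_tailed_p(n0, n1, r_obs):
    """Two-tailed exact probability as the sum of all outcomes no more
    probable than the observed number of runs (one common convention)."""
    pmf = runs_pmf(n0, n1)
    p_obs = pmf.get(r_obs, 0.0)
    return sum(p for p in pmf.values() if p <= p_obs + 1e-12)

p = exact_two_tailed_p(n0=5, n1=5, r_obs=2)
```

For n0 = n1 = 5 and two observed runs, only R = 2 and R = 10 are as improbable as the observation, each with probability 2/252, so p = 4/252 ≈ 0.0159.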
17. Local Bagging and Its Application on Face Recognition (cited 1 time)
Authors: Zhu Yulian. Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2010, Issue 3, pp. 255-260 (6 pages).
Bagging is not quite suitable for stable classifiers such as nearest neighbor classifiers due to the lack of diversity, and it is difficult to apply directly to face recognition due to the small sample size (SSS) property of face recognition. To solve these two problems, local Bagging (L-Bagging) is proposed to simultaneously make Bagging applicable to both nearest neighbor classifiers and face recognition. The major difference between L-Bagging and Bagging is that L-Bagging performs bootstrap sampling on each local region partitioned from the original face image rather than on the whole face image. Since the dimensionality of each local region is usually far less than the number of samples, and the component classifiers are constructed in different local regions, L-Bagging deals with the SSS problem and generates more diverse component classifiers. Experimental results on four standard face image databases (AR, Yale, ORL and Yale B) indicate that the proposed L-Bagging method is effective and robust to illumination, occlusion and slight pose variation.
Keywords: face recognition; local Bagging (L-Bagging); small sample size (SSS); nearest neighbor classifiers
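The L-Bagging idea, bootstrap sampling within each local region rather than over the whole image, can be sketched with 1-NN component classifiers and majority voting. The four-dimensional "face features", region split, labels, and bag count below are toy stand-ins for real face images, not the paper's experimental setup.

```python
import random

def one_nn(train, x):
    """1-nearest-neighbor classifier on a feature subvector."""
    return min(train,
               key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))[1]

def l_bagging_predict(samples, x, regions, n_bags=5, seed=0):
    """L-Bagging sketch: bootstrap-sample within each local region of the
    feature vector, train a 1-NN component classifier per bag, and combine
    all component decisions by majority vote."""
    rng = random.Random(seed)
    votes = []
    for lo, hi in regions:
        local = [(s[0][lo:hi], s[1]) for s in samples]
        for _ in range(n_bags):
            bag = [local[rng.randrange(len(local))] for _ in range(len(local))]
            votes.append(one_nn(bag, x[lo:hi]))
    return max(set(votes), key=votes.count)

# Toy 4-dimensional "face features" split into two local regions
samples = [([0.1, 0.2, 0.9, 0.8], "A"), ([0.15, 0.25, 0.85, 0.9], "A"),
           ([0.8, 0.9, 0.1, 0.2], "B"), ([0.9, 0.85, 0.15, 0.1], "B")]
label = l_bagging_predict(samples, [0.12, 0.22, 0.88, 0.84],
                          regions=[(0, 2), (2, 4)])
```

Resampling within low-dimensional regions is what restores diversity: each bootstrap bag can differ meaningfully even though the full-image nearest-neighbor rule is stable.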
18. Machine learning strategies for small sample size in materials science
Authors: Qiuling Tao, JinXin Yu, Xiangyu Mu, Xue Jia, Rongpei Shi, Zhifu Yao, Cuiping Wang, Haijun Zhang, Xingjun Liu. Science China Materials, 2025, Issue 2, pp. 387-405 (19 pages).
Machine learning (ML) has been widely used to design and develop new materials owing to its low computational cost and powerful predictive capabilities. In recent years, the shortcomings of ML in materials science have gradually emerged, with a primary concern being the scarcity of data. It is challenging to build reliable and accurate ML models using limited data. Moreover, the small sample size problem will remain long-standing in materials science because of the slow accumulation of material data. Therefore, it is important to review and categorize strategies for small-sample learning for the development of ML in materials science. This review systematically sorts the research progress of small-sample learning strategies in materials science, including ensemble learning, unsupervised learning, active learning, and transfer learning. Directions for future research are proposed, including few-shot learning and virtual sample generation. More importantly, we emphasize the significance of embedding material domain knowledge into ML and elaborate on the basic idea for implementing this strategy.
Keywords: material design; machine learning; small sample size; few-shot learning; material domain knowledge
19. Hierarchical hybrid testability modeling and evaluation method based on information fusion (cited 4 times)
Authors: Xishan Zhang, Kaoli Huang, Pengcheng Yan, Guangyao Lian. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2015, Issue 3, pp. 523-532 (10 pages).
In order to meet the demand for testability analysis and evaluation of complex equipment under small sample tests over the equipment life cycle, the hierarchical hybrid testability modeling and evaluation method (HHTME), which combines the testability structure model (TSM) with the testability Bayesian networks model (TBNM), is presented. First, the testability network topology of complex equipment is built using the hierarchical hybrid testability modeling method. Second, the prior conditional probability distribution between network nodes is determined through expert experience. Then the Bayesian method is used to update the conditional probability distribution according to historical test information, virtual simulation information, and similar product information. Finally, the learned hierarchical hybrid testability model (HHTM) is used to estimate the testability of the equipment. Compared with the results of other modeling methods, the relative deviation of the HHTM is only 0.52%, and its evaluation result is the most accurate.
Keywords: small sample; complex equipment; hierarchical hybrid; information fusion; testability modeling and evaluation
20. A New Curve Fitting Method for Forming Limit Experimental Data (cited 4 times)
Authors: Jieshi Chen, Xianbin Zhou. Journal of Materials Science & Technology (SCIE, EI, CAS, CSCD), 2005, Issue 4, pp. 521-525 (5 pages).
The forming limit curve (FLC) can be obtained by curve fitting the limit strain points of different strain paths. The theory of percent regression analysis is applied to the curve fitting of forming limit experimental data. Forecast intervals of FLC percentiles can be calculated, so reliability and confidence level can be taken into account. A theoretical method to obtain the limits of the region over which the limit strain points are distributed is presented, and the FLC position can be adjusted according to practical requirements. A method for establishing an FLC with high reliability using small samples is presented at the same time. This method makes full use of both the current experimental data and previous data. Compared with the traditional method, which can only use current experimental data, fewer specimens are required to obtain the same precision, and the result is more accurate with the same number of specimens.
Keywords: forming limit curve; regression analysis; reliability analysis; small samples method
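The percent-regression idea, fitting the limit strain points and placing the FLC at a chosen percentile rather than through the mean, can be approximated in a few lines. The sketch below uses a straight-line fit and a normal-residual downward shift as a simplified stand-in for the paper's method; the strain data and the z value are hypothetical.

```python
import math

def percentile_flc(minor, major, z=1.645):
    """Fit a straight-line FLC segment (major strain vs. minor strain) by
    least squares, then shift it down by z residual standard deviations so
    that roughly 95% of limit points lie above the returned curve
    (normal-residual assumption; z = 1.645 is the ~5th percentile)."""
    n = len(minor)
    mx = sum(minor) / n
    my = sum(major) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(minor, major))
         / sum((x - mx) ** 2 for x in minor))
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(minor, major)]
    s = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual std dev
    return lambda x: a + b * x - z * s

# Hypothetical limit strain points (minor, major) on the right-hand side
# of the forming limit diagram, where the FLC rises roughly linearly
minor = [0.00, 0.05, 0.10, 0.15, 0.20]
major = [0.30, 0.33, 0.37, 0.42, 0.46]
flc_lower = percentile_flc(minor, major)
```

Placing the FLC below the central fit builds the reliability margin directly into the curve: a part whose strains stay under this lower-percentile curve fails with a controlled, small probability under the fitted model.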