Ore production is usually affected by multiple influencing inputs at open-pit mines. Nevertheless, the complex nonlinear relationships between these inputs and ore production remain unclear. This becomes even more challenging when training data (e.g. truck haulage information and weather conditions) are massive. Among machine learning (ML) algorithms, the deep neural network (DNN) is a superior method for processing nonlinear and massive data by adjusting the number of neurons and hidden layers. This study adopted DNN to forecast ore production using truck haulage information and weather conditions at open-pit mines as training data. Before the prediction models were built, principal component analysis (PCA) was employed to reduce the data dimensionality and eliminate the multicollinearity among highly correlated input variables. To verify the superiority of DNN, three ANNs containing only one hidden layer and six traditional ML models were established as benchmark models. The DNN model with multiple hidden layers performed better than the ANN models with a single hidden layer and outperformed the extensively applied benchmark models in predicting ore production. This provides engineers and researchers with an accurate method to forecast ore production, which supports sound budgetary decisions and mine planning at open-pit mines.
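As a sketch of the PCA step described here — reducing dimensionality and removing multicollinearity before a predictor sees the data — the following NumPy snippet builds synthetic data with three nearly collinear "haulage-style" inputs plus one independent input. The 95% variance threshold and all variable names are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def pca_reduce(X, var_ratio=0.95):
    """Project X onto the fewest principal components that retain
    `var_ratio` of the total variance (illustrative sketch)."""
    Xc = X - X.mean(axis=0)                     # center each input variable
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    cum = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(cum, var_ratio)) + 1
    return Xc @ eigvecs[:, :k], eigvecs[:, :k]

# three highly correlated haulage-style variables plus one independent one
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = np.hstack([base,
               base * 2 + rng.normal(scale=0.01, size=(200, 1)),
               base + rng.normal(scale=0.01, size=(200, 1)),
               rng.normal(size=(200, 1))])
Z, W = pca_reduce(X)
print(Z.shape[1])   # far fewer than 4 columns survive
```

The retained scores `Z` are mutually uncorrelated by construction, which is exactly what removes multicollinearity from the inputs of a downstream model.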
A popular descriptive multivariate statistical method currently employed is principal component analysis (PCA). PCA is used to develop linear combinations that successively maximize the total variance of a sample where there is no known group structure. This study aimed at demonstrating the performance evaluation of a pilot activated sludge treatment system inoculated with a strain of Pseudomonas capable of degrading malathion, which was isolated by an enrichment technique. An intensive analytical program was followed for evaluating the efficiency of the biosimulator by maintaining the dissolved oxygen (DO) concentration at 4.0 mg/L. Analyses by high-performance liquid chromatography revealed that 90% of malathion removal was achieved within 29 h of treatment, whereas COD was reduced considerably during the treatment process, with a mean removal efficiency of 78%. The mean pH values increased gradually during the treatment process, ranging from 7.36 to 8.54. Similarly, the mean ammonia-nitrogen (NH3-N) values fluctuated between 19.425 and 28.488 mg/L, mean nitrite-nitrogen (NO2-N) ranged between 1.301 and 2.940 mg/L, and mean nitrate-nitrogen (NO3-N) ranged between 0.0071 and 0.0711 mg/L. The study revealed that inoculation of bacterial culture under laboratory conditions could be used in bioremediation of environmental pollution caused by xenobiotics. The PCA showed that pH, COD, organic load, and total malathion concentration were highly correlated and emerged as the variables controlling the first component, whereas dissolved oxygen, NO3-N, and NH3-N governed the second component. The third component repeated the trend exhibited by the first two components.
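The loading pattern reported here — correlated process variables dominating the first component while DO behaves independently — can be illustrated with a small correlation-matrix PCA on synthetic series. The trends below are loosely modeled on the reported numbers; everything else (noise levels, time axis) is assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
t = np.linspace(0, 1, n)                 # treatment time (arbitrary units)
# hypothetical series mimicking the reported trends: pH rises and COD
# falls together as treatment progresses, while DO is held near 4.0 mg/L
pH  = 7.36 + 1.2 * t + rng.normal(scale=0.02, size=n)
COD = 1.0 - 0.78 * t + rng.normal(scale=0.02, size=n)   # fraction remaining
DO  = 4.0 + rng.normal(scale=0.1, size=n)

X = np.column_stack([pH, COD, DO])
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]     # loadings on the first component
print(np.round(np.abs(pc1), 2))          # pH and COD dominate; DO barely loads
```

Using the correlation matrix (rather than the covariance matrix) puts variables measured in different units on an equal footing, which is why loadings can be compared directly.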
The health condition of the milling tool has a very high impact on the machining quality of titanium components. Therefore, it is important to recognize the health condition of the tool and replace the damaged cutter at the right time. To recognize the health condition of the milling cutter, a method based on long short-term memory (LSTM) networks is proposed in this paper. The various signals collected in the tool wear experiments were analyzed by time-domain statistics, and the dimensionality of the extracted features was then reduced by the principal component analysis (PCA) method. The preprocessed data extracted by PCA are transmitted to the LSTM model for recognition. Compared with the back-propagation neural network (BPNN) and support vector machine (SVM), the proposed method can effectively exploit the time-domain regularity in the data to achieve higher recognition speed and accuracy.
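A minimal sketch of the described preprocessing — time-domain statistics per signal window, then PCA before the sequence model — assuming a synthetic vibration signal whose amplitude grows with wear. The feature choice and window size are illustrative, and the LSTM itself is omitted.

```python
import numpy as np

def time_domain_features(sig):
    """A few common time-domain statistics (illustrative choice)."""
    rms = np.sqrt(np.mean(sig ** 2))
    kurt = np.mean((sig - sig.mean()) ** 4) / np.var(sig) ** 2
    return np.array([sig.mean(), sig.std(), rms, np.ptp(sig), kurt])

rng = np.random.default_rng(2)
# 60 windows of a hypothetical vibration signal, amplitude growing with wear
windows = [rng.normal(scale=1 + 0.05 * i, size=1024) for i in range(60)]
F = np.array([time_domain_features(w) for w in windows])   # 60 x 5 features

Fc = F - F.mean(0)
_, _, Vt = np.linalg.svd(Fc, full_matrices=False)
scores = Fc @ Vt[:2].T          # keep two principal components for the LSTM
print(scores.shape)
```

Because the wear-driven features grow together, the first principal component tracks the wear trend, so the sequence model sees a compact, low-noise input.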
Matrix principal component analysis (MatPCA), as an effective feature extraction method, can deal with both the matrix pattern and the vector pattern. However, like PCA, MatPCA does not use the class information of samples. As a result, the extracted features cannot provide enough useful information for distinguishing patterns from one another, further resulting in degradation of classification performance. To fully use the class information of samples, a novel method, called fuzzy within-class MatPCA (F-WMatPCA), is proposed. F-WMatPCA utilizes the fuzzy K-nearest neighbor method (FKNN) to fuzzify the class membership degrees of a training sample and then performs fuzzy MatPCA within the patterns having the same class label. Because more class information is used in feature extraction, F-WMatPCA can intuitively improve classification performance. Experimental results on face databases and some benchmark datasets show that F-WMatPCA is more effective and competitive than MatPCA. The experimental analysis on face image databases indicates that F-WMatPCA improves the recognition accuracy and is more stable and robust in performing classification than the existing fuzzy-based F-Fisherfaces method.
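The FKNN fuzzification step that F-WMatPCA relies on can be sketched as follows — a Keller-style membership rule in which each sample's degree of belonging to a class is a distance-weighted fraction of its nearest neighbors in that class. The exact formula used in the paper may differ.

```python
import numpy as np

def fuzzy_memberships(X, y, k=3, m=2):
    """Fuzzy K-nearest-neighbor class memberships (Keller-style sketch)."""
    n = len(X)
    classes = np.unique(y)
    U = np.zeros((n, len(classes)))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # exclude the sample itself
        nn = np.argsort(d)[:k]
        w = 1.0 / (d[nn] ** (2 / (m - 1)) + 1e-12)   # closer -> heavier
        for j, c in enumerate(classes):
            U[i, j] = w[y[nn] == c].sum() / w.sum()
    return U

# two well-separated toy classes
X = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 5], [5.1, 5], [5.2, 5]])
y = np.array([0, 0, 0, 1, 1, 1])
U = fuzzy_memberships(X, y)
print(np.round(U, 2))   # memberships near 0 or 1 for clean samples
```

Each row of `U` sums to one; samples near a class boundary would receive intermediate memberships, which is the soft class information the fuzzy within-class PCA then exploits.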
To overcome the too fine-grained granularity problem of multivariate grey incidence analysis and to explore a comprehensive incidence analysis model, three multivariate grey incidence degree models based on principal component analysis (PCA) are proposed. Firstly, the PCA method is introduced to extract the feature sequences of a behavioral matrix. Then, the grey incidence analysis between two behavioral matrices is transformed into the similarity and nearness measure between their feature sequences. Based on classic grey incidence analysis theory, absolute and relative incidence degree models for feature sequences are constructed, and a comprehensive grey incidence model is proposed. Furthermore, the properties of the models are investigated. It is proved that the proposed models satisfy the properties of translation invariance and multiple transformation invariance, as well as the axioms of grey incidence analysis. Finally, a case is studied. The results illustrate that the model is more effective than other multivariate grey incidence analysis models.
Laser-induced breakdown spectroscopy (LIBS) is a versatile tool for both qualitative and quantitative analysis. In this paper, LIBS combined with principal component analysis (PCA) and support vector machine (SVM) is applied to rock analysis. Fourteen emission lines, including Fe, Mg, Ca, Al, Si, and Ti, are selected as analysis lines. A good accuracy (91.38% for the real rock) is achieved by using SVM to analyze the spectroscopic peak area data processed by PCA. Combining PCA and SVM not only reduces the noise and dimensionality, which improves the efficiency of the program, but also solves the problem of linear inseparability. By this method, the ability of LIBS to classify rock is validated.
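A hedged sketch of the PCA-plus-SVM pipeline on synthetic 14-line "peak area" vectors (scikit-learn assumed available). The two rock classes, their separation, and the 5-component choice are invented for illustration, not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# hypothetical peak-area vectors (14 emission lines) for two rock classes
class_a = rng.normal(loc=1.0, scale=0.1, size=(40, 14))
class_b = rng.normal(loc=1.5, scale=0.1, size=(40, 14))
X = np.vstack([class_a, class_b])
y = np.array([0] * 40 + [1] * 40)

# PCA compresses the 14 lines; the RBF kernel handles nonlinear boundaries
model = make_pipeline(PCA(n_components=5), SVC(kernel="rbf"))
model.fit(X, y)
print(model.score(X, y))
```

A pipeline keeps the PCA fitted only on training data, so the same projection is reused consistently when new spectra are classified.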
Principal component analysis (PCA) is used to analyze the high-dimensional chemistry data of laminar premixed/stratified flames under strain effects. The first few principal components (PCs) with larger contribution ratios are chosen as the tabulated scalars to build the look-up chemistry table. Prior tests show that the strained premixed flame structure can be well reconstructed. To highlight the physical meanings of the tabulated scalars in stratified flames, a modified PCA method is developed, in which the mixture fraction is used to replace the PC with the highest correlation coefficient. The other two tabulated scalars are then modified with Schmidt orthogonalization. The modified tabulated scalars not only have clear physical meanings, but also contain passive scalars. The PCA method has good generality and can be extended for building the thermo-chemistry table including strain rate effects when different fuels are used.
A new watermarking scheme using principal component analysis (PCA) is described. The proposed method inserts highly robust watermarks into still images without degrading their visual quality. Experimental results are presented, showing that the PCA-based watermarks can resist malicious attacks including lowpass filtering, rescaling, and compression coding.
This study examined public attitudes concerning the value of outdoor spaces which people use daily. Two successive analyses were performed based on data from common residents and college students in the city of Hangzhou, China. First, citizens registered various items constituting desirable values of residential outdoor spaces through a preliminary questionnaire. The result proposed three general attributes (functional, aesthetic and ecological) and ten specific qualities of residential outdoor spaces. An analytic hierarchy process (AHP) was applied to an interview survey in order to clarify the weights among these attributes and qualities. Second, principal factors were extracted from the ten specific qualities with principal component analysis (PCA) for both the common case and the campus case. In addition, the variations of respondents' groups were classified with cluster analysis (CA) using the results of the PCA. The results of the AHP application found that the public prefers the functional attribute rather than the aesthetic attribute. The latter is always viewed as the core value of open spaces in the eyes of architects and designers. Furthermore, comparisons of the ten specific qualities showed that the public prefers open spaces that can be utilized conveniently and easily for group activities, because such spaces maintain an active lifestyle of neighborhood communication, which is also seen to protect human-regarding residential environments. Moreover, different groups of respondents diverge largely in terms of gender, age, behavior and preference.
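The AHP weighting step can be sketched with the standard principal-eigenvector method: build a reciprocal pairwise-comparison matrix, take its principal eigenvector, and normalize. The judgments below are invented, not the survey's actual data.

```python
import numpy as np

def ahp_weights(P):
    """Priority weights from a pairwise-comparison matrix via the
    principal eigenvector (standard AHP eigenvalue method)."""
    eigvals, eigvecs = np.linalg.eig(P)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return w / w.sum()

# hypothetical judgements over (functional, aesthetic, ecological):
# functional is 3x as important as aesthetic, 2x as important as ecological
P = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 0.5],
              [0.5, 2.0, 1.0]])
w = ahp_weights(P)
print(np.round(w, 3))   # functional receives the largest weight
```

The matrix must be reciprocal (P[j][i] = 1/P[i][j]); a consistency ratio check would normally follow before the weights are trusted.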
Ensemble-based analyses are useful to compare equiprobable scenarios of reservoir models. However, they require a large suite of reservoir models to cover the high uncertainty in heterogeneous and complex reservoir models. For stable convergence in the ensemble Kalman filter (EnKF), increasing the ensemble size can be one solution, but it incurs high computational cost in large-scale reservoir systems. In this paper, we propose a preprocessing step of good initial model selection to reduce the ensemble size, after which EnKF is utilized to predict production performances stochastically. In the model selection scheme, representative models are chosen by using principal component analysis (PCA) and clustering analysis. The dimension of the initial models is reduced using PCA, and the reduced models are grouped by clustering. Then, we choose and simulate representative models from the cluster groups to compare errors of production predictions with historical observation data. The representative model with the minimum error is considered the best model, and we use the ensemble members near the best model in the cluster plane for applying EnKF. We demonstrate the proposed scheme on two 3D models, for which EnKF provides reliable assimilation results with much reduced computation time.
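A sketch of the selection scheme — PCA down to a low-dimensional "cluster plane", clustering, then one representative per cluster. Random vectors stand in for reservoir models, and the k-means is deliberately minimal (a library implementation would normally be used); the cluster count and dimensions are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means for illustration only."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers

rng = np.random.default_rng(4)
# 300 "reservoir models", each flattened to 50 grid properties
models = rng.normal(size=(300, 50))
Xc = models - models.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                     # 2-D cluster plane from PCA

labels, centers = kmeans(Z, k=5)
# one representative model per cluster: the member closest to its center
reps = [int(np.argmin(((Z - c) ** 2).sum(1) + (labels != j) * 1e9))
        for j, c in enumerate(centers)]
print(Z.shape, len(reps))
```

In the scheme described above, each representative would then be simulated, and the ensemble for EnKF drawn from the neighborhood of the best-matching one in this plane.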
Existing Web service selection approaches usually assume that preferences of users have been provided in a quantitative form by users. However, due to the subjectivity and vagueness of preferences, it may be impractical for users to specify quantitative and exact preferences. Moreover, because Quality of Service (QoS) attributes are often interrelated, existing Web service selection approaches which employ weighted summation of QoS attribute values to compute the overall QoS of Web services may produce inaccurate results, since they do not take correlations among QoS attributes into account. To resolve these problems, a Web service selection framework considering the user's preference priority is proposed, which incorporates a searching mechanism with QoS range setting to identify services satisfying the user's QoS constraints. With the identified service candidates, based on the idea of Principal Component Analysis (PCA), an algorithm of Web service selection named PCA-WSS (Web Service Selection based on PCA) is proposed, which can eliminate the correlations among QoS attributes and compute the overall QoS of Web services accurately. After computing the overall QoS for each service, the algorithm ranks the Web service candidates based on their overall QoS and recommends the services with top QoS values to users. Finally, the effectiveness and feasibility of our approach are validated by experiments: the Web service selected by our approach receives a higher average evaluation from users than others, and the time cost of the PCA-WSS algorithm is not acutely affected by the number of service candidates.
Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In the real world, each input datum may not be fully assigned to one class and may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension model, i.e., Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms have good performance.
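Plain (non-fuzzy) kernel PCA with an RBF kernel, which KFPCA extends, can be written directly from its textbook definition: build the kernel matrix, double-center it, eigendecompose, and project. The gamma value and ring-shaped test data are illustrative choices, not from the paper.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel (textbook sketch)."""
    sq = ((X[:, None] - X[None]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = len(X)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas

# two concentric rings: a classic nonlinear structure linear PCA cannot unfold
rng = np.random.default_rng(5)
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.r_[np.full(100, 1.0), np.full(100, 3.0)] + rng.normal(scale=0.05, size=200)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)
```

A fuzzy variant would additionally weight each sample's contribution to the (centered) kernel matrix by its class membership degree.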
Crashworthiness and lightweight optimization design of the crash box are studied in this paper. For the initial model, a physical test was performed to verify the model. Then, a parametric model using mesh morphing technology is used in optimization to decrease the maximum collision force (MCF) and increase the specific energy absorption (SEA) while ensuring that mass is not increased. Because MCF and SEA are two conflicting objectives, grey relational analysis (GRA) and principal component analysis (PCA) are employed for design optimization of the crash box. Furthermore, the multi-objective problem can be converted to a single objective using the grey relational grade (GRG); hence, the proposed method can obtain the optimal combination of design parameters for the crash box. The proposed method decreases the MCF and weight by 16.7% and 29.4% respectively, while increasing the SEA by 16.4%. Meanwhile, compared with the conventional NSGA-Ⅱ method, the proposed method reduces the time cost by 103%. Hence, the proposed method can be properly applied to the optimization of the crash box.
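The GRG computation that collapses the conflicting objectives into a single score can be sketched as follows. For simplicity this sketch uses equal criterion weights rather than PCA-derived ones, and the three candidate designs are invented.

```python
import numpy as np

def grey_relational_grade(X, ideal, weights=None, rho=0.5):
    """Grey relational grade of each alternative against an ideal
    sequence (standard GRA formula; rho is the resolving coefficient)."""
    # normalize each criterion column to [0, 1] first
    Xn = (X - X.min(0)) / (X.max(0) - X.min(0))
    i_n = (ideal - X.min(0)) / (X.max(0) - X.min(0))
    delta = np.abs(Xn - i_n)
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    w = np.full(X.shape[1], 1 / X.shape[1]) if weights is None else weights
    return coef @ w

# hypothetical designs: columns = [MCF (lower is better), SEA (higher is better)]
X = np.array([[180.0, 12.0],
              [160.0, 11.0],
              [170.0, 14.0]])
ideal = np.array([X[:, 0].min(), X[:, 1].max()])   # best value per criterion
grg = grey_relational_grade(X, ideal)
print(np.argmax(grg))   # index of the best-compromise design
```

In the study's setting, PCA would supply the weight vector `weights` so that correlated criteria do not get double-counted in the grade.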
In order to directly construct the mapping between multiple state parameters and the remaining useful life (RUL), and to reduce the interference of random error on prediction accuracy, a RUL prediction model of aeroengines based on principal component analysis (PCA) and a one-dimensional convolutional neural network (1D-CNN) is proposed in this paper. Firstly, multiple state parameters corresponding to massive cycles of aeroengines are collected and fed into PCA for dimensionality reduction, and principal components are extracted for further time series prediction. Secondly, the 1D-CNN model is constructed to directly learn the mapping between principal components and RUL. Multiple convolution and pooling operations are applied for deep feature extraction, and end-to-end RUL prediction of aeroengines can be realized. Experimental results show that the most effective principal components can be obtained from the multiple state parameters by PCA, and the long time series of multiple state parameters can be directly mapped to RUL by the 1D-CNN, improving the efficiency and accuracy of RUL prediction. Compared with other traditional models, the proposed method also has lower prediction error and better robustness.
On the basis of machine learning, suitable algorithms can perform advanced time series analysis. This paper proposes a complex k-nearest neighbor (KNN) model for predicting financial time series. The model uses a complex feature extraction process integrating a forward rolling empirical mode decomposition (EMD) for financial time series signal analysis and principal component analysis (PCA) for dimension reduction. The information-rich features are extracted and then input to a weighted KNN classifier where the features are weighted with PCA loadings. Finally, prediction is generated via regression on the selected nearest neighbors. The structure of the model as a whole is original. The test results on real historical data sets confirm the effectiveness of the model for predicting the Chinese stock index, an individual stock, and the EUR/USD exchange rate.
This article presents an anomaly detection system based on principal component analysis (PCA) and support vector machine (SVM). The system first creates a profile defining normal behavior by a frequency-based scheme, and then compares the similarity of a current behavior with the created profile to decide whether the input instance is normal or anomalous. In order to avoid overfitting and reduce the computational burden, principal features of normal behavior are extracted by the PCA method. SVM is used to distinguish normal or anomalous user behavior after the training procedure has been completed by learning. In the experiments for performance evaluation, the system achieved a correct detection rate of 92.2% and a false detection rate of 2.8%.
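One common PCA-based formulation of this idea scores a behavior by its squared prediction error (SPE) — its distance from the subspace spanned by normal behavior — as sketched below on synthetic frequency profiles. The paper's actual profile scheme and its SVM stage are not reproduced; the command-type vectors and the 99% threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
# "normal" behavior profiles: frequency vectors over 10 command types
normal = rng.dirichlet(np.full(10, 5.0), size=500)

mean = normal.mean(0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
P = Vt[:3].T                                   # retained principal subspace

def spe(x):
    """Squared prediction error: distance of x from the normal subspace."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return float(r @ r)

# threshold from the empirical distribution of SPE on normal data
threshold = np.quantile([spe(x) for x in normal], 0.99)
anomaly = np.zeros(10); anomaly[0] = 1.0       # all activity in one command
print(spe(anomaly) > threshold)                # flagged as anomalous
```

The SPE score (or the retained principal scores themselves) can then feed an SVM, as in the system described above, rather than a fixed quantile threshold.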
Screening similar historical fault-free candidate data greatly affects the effectiveness of fault detection results based on principal component analysis (PCA). In order to find the candidate data, this study compares unweighted and weighted similarity factors (SFs), which measure the similarity of the principal component subspaces corresponding to the first k main components of two datasets. The fault detection employs the principal component subspaces corresponding to the current measured data and the historical fault-free data. From the historical fault-free database, the load parameters are employed to locate the candidate data similar to the current operating data; the fault detection method for air-conditioning systems is then based on principal components. The results show that the weighted principal component SF can improve the effects of both fault-free detection and fault detection. Compared with the unweighted SF, the average fault-free detection rate of the weighted SF is 17.33% higher, and the average fault detection rate is 7.51% higher.
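The unweighted PCA similarity factor between two datasets can be computed as the mean squared cosine between their k-dimensional principal subspaces (Krzanowski's formulation); the weighted variant discussed in the study would additionally scale the terms, e.g. by singular values, which is omitted here. The datasets below are synthetic stand-ins.

```python
import numpy as np

def pc_subspace(X, k):
    Xc = X - X.mean(0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                      # columns: first k loading vectors

def similarity_factor(X1, X2, k):
    """Krzanowski-style PCA similarity factor between the k-dimensional
    principal subspaces of two datasets (1 = identical subspaces)."""
    L, M = pc_subspace(X1, k), pc_subspace(X2, k)
    return float(np.trace(L.T @ M @ M.T @ L)) / k

rng = np.random.default_rng(7)
scales = np.array([3.0, 2.0, 1.0, 0.5, 0.2])
base = rng.normal(size=(300, 5)) * scales             # variance mostly in x0, x1
similar = base + rng.normal(scale=0.05, size=base.shape)
different = rng.normal(size=(300, 5)) * scales[::-1]  # variance mostly in x4, x3

print(round(similarity_factor(base, similar, 2), 2),
      round(similarity_factor(base, different, 2), 2))
```

A candidate historical dataset with an SF near 1 occupies nearly the same principal subspace as the current data, which is the screening criterion the study builds on.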
To extract features of fabric defects effectively and reduce the dimension of the feature space, a feature extraction method for fabric defects based on the complex contourlet transform (CCT) and principal component analysis (PCA) is proposed. Firstly, training samples of fabric defect images are decomposed by the CCT. Secondly, PCA is applied to the obtained low-frequency component and part of the high-frequency components to get a lower-dimensional feature space. Finally, components of testing samples obtained by the CCT are projected onto the feature space, where different types of fabric defects are distinguished by the minimum Euclidean distance method. A large number of experimental results show that, compared with PCA, the method combining the wavelet low-frequency component with PCA (WLPCA), the method combining the contourlet transform with PCA (CPCA), and the method combining wavelet low-frequency and high-frequency components with PCA (WPCA), the proposed method can extract features of common fabric defect types effectively. The recognition rate is greatly improved while the dimension is reduced.
Spaceborne synthetic aperture radar (SAR) sparse flight 3-D imaging technology forms a cross-track equivalent aperture through multiple observations in the cross-track direction, and thereby achieves resolution in the third dimension. In this paper, combined with actual triple-star orbits, a sparse flight spaceborne SAR 3-D imaging method based on the sparse spectrum of interferometry and principal component analysis (PCA) is presented. Firstly, interferometric processing is utilized to reach an effective sparse representation of radar images in the frequency domain. Secondly, as a method with a simple principle and fast calculation, PCA is introduced to extract the main features of the image spectrum according to its principal characteristics. Finally, the 3-D image can be obtained by inverse transformation of the spectrum reconstructed by PCA. The simulation results for a 4.84 km equivalent cross-track aperture and the corresponding 1.78 m cross-track resolution verify that this method effectively suppresses both the high-frequency sidelobe noise introduced by sparse flight with a sparsity of 49% and the random noise introduced by the receiver. Meanwhile, owing to the influence of the orbit distribution of the actual triple-star orbits, the simulation results of a sparse flight with 7-bit Barker code orbits are given as a comparison and reference to illustrate the significance of the orbit distribution for the reconstruction results. This method has prospects for sparse flight 3-D imaging in high-latitude areas owing to its short revisit period.
Funding (ore production study): This work was supported by the Pilot Seed Grant (Grant No. RES0049944) and the Collaborative Research Project (Grant No. RES0043251) from the University of Alberta.
Funding (milling tool study): National Natural Science Foundation of China (No. 51805079), Shanghai Natural Science Foundation, China (No. 17ZR1400600), and Fundamental Research Funds for the Central Universities, China (No. 16D110309).
基金supported by the National Natural Science Foundation of China(71401052)the Key Project of National Social Science Fund of China(12AZD108)+2 种基金the Doctoral Fund of Ministry of Education(20120094120024)the Philosophy and Social Science Fund of Jiangsu Province Universities(2013SJD630073)the Central University Basic Service Project Fee of Hohai University(2011B09914)
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11075184), the Knowledge Innovation Program of the Chinese Academy of Sciences (CAS) (Grant No. Y03RC21124), and the CAS President's International Fellowship Initiative Foundation (Grant No. 2015VMA007)
Abstract: Laser-induced breakdown spectroscopy (LIBS) is a versatile tool for both qualitative and quantitative analysis. In this paper, LIBS combined with principal component analysis (PCA) and support vector machine (SVM) is applied to rock analysis. Fourteen emission lines, including Fe, Mg, Ca, Al, Si, and Ti, are selected as analysis lines. A good accuracy (91.38% for real rock) is achieved by using SVM to analyze the spectroscopic peak-area data processed by PCA. Combining PCA and SVM not only reduces noise and dimensionality, which improves the efficiency of the program, but also solves the problem of linear inseparability. By this method, the ability of LIBS to classify rock is validated.
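The PCA-then-SVM classification pipeline described above can be sketched with scikit-learn. This is a minimal illustration, not the paper's implementation: the 14-feature synthetic data stand in for the emission-line peak areas, and the class structure, scaling step, and kernel choice are assumptions.

```python
# PCA for noise/dimensionality reduction feeding an RBF-kernel SVM classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic "peak area" data: 60 rock samples x 14 emission lines, 3 classes.
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(20, 14))
               for m in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 20)

clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.score(X, y))
```

The kernelized SVM handles the linear inseparability that remains after PCA compresses the spectra.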
Funding: Project supported by the National Natural Science Foundation of China (Nos. 91441117 and 51576182) and the Natural Key Program of Chizhou University (No. 2016ZRZ007)
Abstract: The principal component analysis (PCA) is used to analyze the high-dimensional chemistry data of laminar premixed/stratified flames under strain effects. The first few principal components (PCs) with larger contribution ratios are chosen as the tabulated scalars to build the look-up chemistry table. Prior tests show that the strained premixed flame structure can be well reconstructed. To highlight the physical meanings of the tabulated scalars in stratified flames, a modified PCA method is developed, in which the mixture fraction replaces the PC most correlated with it. The other two tabulated scalars are then modified with Schmidt orthogonalization. The modified tabulated scalars not only have clear physical meanings but also contain passive scalars. The PCA method generalizes well and can be extended to build thermo-chemistry tables that include strain-rate effects when different fuels are used.
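The modified-PCA idea above can be sketched in numpy: compute PC score vectors, swap the score most correlated with the mixture fraction for the mixture fraction itself, then Schmidt-orthogonalize the remaining tabulated scalars against it. The data and variable names here are synthetic stand-ins, not flame chemistry.

```python
# Replace one PC score vector by a "mixture fraction" and re-orthogonalize.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))          # thermo-chemical state samples (toy)
z = X @ rng.normal(size=6)             # synthetic "mixture fraction"

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :3] * s[:3]              # first three PC score vectors

# Replace the PC with the highest |correlation| to z.
corr = [abs(np.corrcoef(z, scores[:, j])[0, 1]) for j in range(3)]
k = int(np.argmax(corr))
T = scores.copy()
T[:, k] = z - z.mean()

def gram_schmidt(cols):
    """Classical Gram-Schmidt on the columns of `cols`."""
    out = []
    for v in cols.T:
        for u in out:
            v = v - (v @ u) / (u @ u) * u
        out.append(v)
    return np.array(out).T

# Orthogonalize the other two scalars against the replaced one first.
order = [k] + [j for j in range(3) if j != k]
T_orth = gram_schmidt(T[:, order])
print(T_orth.shape)
```

After the orthogonalization the three tabulated scalars are mutually orthogonal, with the mixture fraction kept intact as the first one.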
Abstract: A new watermarking scheme using principal component analysis (PCA) is described. The proposed method inserts highly robust watermarks into still images without degrading their visual quality. Experimental results are presented, showing that the PCA-based watermarks can resist malicious attacks including lowpass filtering, rescaling, and compression coding.
Abstract: This study examined public attitudes concerning the value of outdoor spaces that people use daily. Two successive analyses were performed based on data from common residents and college students in the city of Hangzhou, China. First, citizens registered various items constituting desirable values of residential outdoor spaces through a preliminary questionnaire. The result proposed three general attributes (functional, aesthetic and ecological) and ten specific qualities of residential outdoor spaces. An analytic hierarchy process (AHP) was applied to an interview survey in order to clarify the weights among these attributes and qualities. Second, principal factors were extracted from the ten specific qualities with principal component analysis (PCA) for both the common case and the campus case. In addition, the variations among respondents' groups were classified with cluster analysis (CA) using the results of the PCA. The results of the AHP application found that the public prefers the functional attribute over the aesthetic attribute, which is always viewed as the core value of open spaces in the eyes of architects and designers. Furthermore, comparisons of the ten specific qualities showed that the public prefers open spaces that can be utilized conveniently and easily for group activities, because such spaces sustain an active lifestyle of neighborhood communication, which is also seen to support human-centered residential environments. Moreover, different groups of respondents diverge largely in terms of gender, age, behavior and preference.
基金supported by The Ministry of Trade,Industry,and Energy(20172510102090,20142520100440,20162010201980)Global PhD Fellowship Program through the National Research Foundation of Korea(NRF)funded by the Ministry of Education(2015H1A2A1030756)supported by the National Research Foundation of Korea(NRF)Grant(No.2018R1C1B5045260).
Abstract: Ensemble-based analyses are useful for comparing equiprobable scenarios of reservoir models. However, they require a large suite of reservoir models to cover the high uncertainty in heterogeneous and complex reservoir models. For stable convergence in the ensemble Kalman filter (EnKF), increasing the ensemble size can be one solution, but it incurs high computational cost in large-scale reservoir systems. In this paper, we propose a preprocessing step that selects good initial models to reduce the ensemble size; EnKF is then utilized to predict production performances stochastically. In the model selection scheme, representative models are chosen by using principal component analysis (PCA) and clustering analysis. The dimension of the initial models is reduced using PCA, and the reduced models are grouped by clustering. We then choose and simulate representative models from the cluster groups and compare the errors of their production predictions against historical observation data. The representative model with the minimum error is considered the best model, and we apply EnKF to the ensemble members near the best model in the cluster plane. We demonstrate the proposed scheme on two 3D models, showing that EnKF provides reliable assimilation results with much reduced computation time.
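The PCA-plus-clustering preselection step can be sketched as follows. This is a toy illustration under my own assumptions: random vectors stand in for reservoir models, k-means stands in for the unspecified clustering method, and the representative of each cluster is taken as the member closest to its centroid (the paper instead ranks representatives by prediction error against history data).

```python
# Reduce ensemble members with PCA, cluster them, pick one representative each.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
ensemble = rng.normal(size=(100, 500))      # 100 models x 500 grid properties

coords = PCA(n_components=2).fit_transform(ensemble)   # "cluster plane"
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(coords)

# Representative per cluster: the member closest to its cluster centroid.
reps = []
for c in range(5):
    idx = np.flatnonzero(labels == c)
    centroid = coords[idx].mean(axis=0)
    reps.append(idx[np.argmin(np.linalg.norm(coords[idx] - centroid, axis=1))])
print(sorted(int(r) for r in reps))
```

Only these representatives (and, later, the members near the best one in the cluster plane) would be simulated, which is what cuts the EnKF computation time.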
基金Supported by the National Natural Science Foundation of China(No.90818004and61100054)Program for New Century Excellent Talents in University(No.NCET-10-0140)+1 种基金Excellent Youth Foundation of Hunan Scientific Committee(No.11JJ1011)Scientific Research Fundof Hunan Educational Committee(No.09K085and11B048)
Abstract: Existing Web service selection approaches usually assume that users have provided their preferences in a quantitative form. However, due to the subjectivity and vagueness of preferences, it may be impractical for users to specify quantitative and exact preferences. Moreover, because Quality of Service (QoS) attributes are often interrelated, existing Web service selection approaches that employ a weighted summation of QoS attribute values to compute the overall QoS of Web services may produce inaccurate results, since they do not take correlations among QoS attributes into account. To resolve these problems, a Web service selection framework considering the user's preference priority is proposed, which incorporates a searching mechanism with QoS range setting to identify services satisfying the user's QoS constraints. With the identified service candidates, and based on the idea of Principal Component Analysis (PCA), an algorithm of Web service selection named PCA-WSS (Web Service Selection based on PCA) is proposed, which can eliminate the correlations among QoS attributes and compute the overall QoS of Web services accurately. After computing the overall QoS for each service, the algorithm ranks the Web service candidates by their overall QoS and recommends the services with top QoS values to users. Finally, the effectiveness and feasibility of the approach are validated by experiments: the Web services selected by the approach receive higher average user evaluations than others, and the time cost of the PCA-WSS algorithm is not acutely affected by the number of service candidates.
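The core scoring idea, decorrelating QoS attributes with PCA before aggregating them into an overall score, can be sketched in numpy. This is not the paper's exact PCA-WSS algorithm: weighting the uncorrelated PC scores by their explained-variance share is a common convention that I am assuming here, and the synthetic QoS matrix is already normalized so that larger is better.

```python
# Decorrelate QoS attributes via PCA, then rank services by an overall score.
import numpy as np

rng = np.random.default_rng(3)
# 30 candidate services x 4 correlated QoS attributes (larger = better).
base = rng.normal(size=(30, 2))
qos = np.hstack([base,
                 base @ rng.normal(size=(2, 2)) + 0.1 * rng.normal(size=(30, 2))])

Xc = qos - qos.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # uncorrelated PC scores
weights = s**2 / np.sum(s**2)           # explained-variance weights (assumed)

overall = scores @ weights              # one overall QoS value per service
ranking = np.argsort(overall)[::-1]     # best candidates first
print(ranking[:5])
```

Because the PC scores are mutually uncorrelated, double-counting of interrelated attributes in the weighted sum is avoided.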
Abstract: Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In the real world, an input sample may not be fully assigned to one class and may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension, Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms perform well.
基金Supported by the National Key Research and Development Project(2016YFB0101601)
Abstract: The crashworthiness and lightweight optimization design of a crash box are studied in this paper. A physical test was performed to verify the initial model. Then, a parametric model built with mesh-morphing technology is used to decrease the maximum collision force (MCF) and increase the specific energy absorption (SEA) while ensuring that the mass is not increased. Because MCF and SEA are two conflicting objectives, grey relational analysis (GRA) and principal component analysis (PCA) are employed for the design optimization of the crash box. The multi-objective problem is converted into a single objective using the grey relational grade (GRG); hence, the proposed method can obtain the optimal combination of design parameters for the crash box. The proposed method decreases the MCF and the weight by 16.7% and 29.4%, respectively, while increasing the SEA by 16.4%. Meanwhile, compared with the conventional NSGA-Ⅱ method, the proposed method reduces the time cost by 103%. Hence, the proposed method can be properly applied to the optimization of the crash box.
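The GRA aggregation that collapses the two conflicting objectives into one grade can be sketched as follows. This is a generic textbook-style GRA (normalization, grey relational coefficients with distinguishing coefficient ξ = 0.5, then a weighted mean); the PCA-derived weighting here simply uses each objective's share of coefficient variance, which is my simplifying assumption rather than the paper's exact scheme, and the five designs are hypothetical.

```python
# Grey relational grade: normalize, compute coefficients, aggregate with weights.
import numpy as np

def grey_relational_grade(resp, smaller_better, xi=0.5):
    """resp: runs x objectives; returns one grade in (0, 1] per run."""
    r = resp.astype(float).copy()
    # Normalize each objective to [0, 1] so that larger is better.
    for j, sb in enumerate(smaller_better):
        lo, hi = r[:, j].min(), r[:, j].max()
        r[:, j] = (hi - r[:, j]) / (hi - lo) if sb else (r[:, j] - lo) / (hi - lo)
    delta = 1.0 - r                          # deviation from the ideal (= 1)
    coeff = (delta.min() + xi * delta.max()) / (delta + xi * delta.max())
    lam = np.var(coeff, axis=0)              # PCA-style variance-share weights
    w = lam / lam.sum()                      # (assumed weighting scheme)
    return coeff @ w

# MCF (smaller better) and SEA (larger better) for five hypothetical designs.
resp = np.array([[120., 8.], [110., 9.], [130., 7.5], [100., 8.5], [115., 9.5]])
grade = grey_relational_grade(resp, smaller_better=[True, False])
print(np.argmax(grade))   # index of the best compromise design
```

Maximizing the single GRG then selects one compromise design instead of a Pareto front, which is what allows a single-objective search.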
基金supported by Jiangsu Social Science Foundation(No.20GLD008)Science,Technology Projects of Jiangsu Provincial Department of Communications(No.2020Y14)Joint Fund for Civil Aviation Research(No.U1933202)。
Abstract: In order to directly construct the mapping between multiple state parameters and remaining useful life (RUL), and to reduce the interference of random error on prediction accuracy, an RUL prediction model for aeroengines based on principal component analysis (PCA) and a one-dimensional convolutional neural network (1D-CNN) is proposed in this paper. First, multiple state parameters corresponding to massive cycles of the aeroengine are collected and fed into PCA for dimensionality reduction, and principal components are extracted for further time-series prediction. Second, the 1D-CNN model is constructed to directly learn the mapping between the principal components and RUL. Multiple convolution and pooling operations are applied for deep feature extraction, realizing end-to-end RUL prediction for the aeroengine. Experimental results show that the most effective principal components can be obtained from the multiple state parameters by PCA, and the long time series of multiple state parameters can be directly mapped to RUL by the 1D-CNN, improving the efficiency and accuracy of RUL prediction. Compared with other traditional models, the proposed method also has lower prediction error and better robustness.
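The feature-extraction path can be illustrated shape-by-shape in plain numpy: PCA compresses the multi-sensor series to a few principal-component channels, and a convolution-plus-pooling stage extracts deep features that a regression head would map to RUL. The filters here are randomly initialized and untrained, and the sensor history is synthetic; this shows the data flow only, not the paper's trained network.

```python
# PCA channels -> 1-D convolution (ReLU) -> max pooling: shapes of the pipeline.
import numpy as np

rng = np.random.default_rng(4)
cycles, sensors = 128, 14
X = rng.normal(size=(cycles, sensors))          # one engine's sensor history

# PCA: keep the 3 leading components as input channels.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:3].T                             # (128, 3)

def conv1d(x, kernels):
    """Valid multi-channel 1-D convolution followed by ReLU."""
    k = kernels.shape[1]
    windows = np.stack([x[i:i + k] for i in range(len(x) - k + 1)])
    return np.maximum(0, np.einsum("wkc,fkc->wf", windows, kernels))

def maxpool(x, size=2):
    return x[: len(x) // size * size].reshape(-1, size, x.shape[1]).max(axis=1)

kernels = rng.normal(size=(8, 5, 3)) * 0.1      # 8 filters, width 5, 3 channels
features = maxpool(conv1d(pcs, kernels))
print(features.shape)
```

Stacking more conv/pool layers and ending in a dense regression output would complete the end-to-end RUL mapping.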
基金supported by the Social Science Foundation of China under Grant No.17BGL231。
Abstract: On the basis of machine learning, suitable algorithms can enable advanced time-series analysis. This paper proposes a complex k-nearest neighbor (KNN) model for predicting financial time series. The model uses a complex feature extraction process integrating a forward-rolling empirical mode decomposition (EMD) for financial time-series signal analysis and principal component analysis (PCA) for dimension reduction. The information-rich features are extracted and then input to a weighted KNN classifier, where the features are weighted with the PCA loadings. Finally, the prediction is generated via regression on the selected nearest neighbors. The structure of the model as a whole is original. Test results on real historical data sets confirm the effectiveness of the model for predicting the Chinese stock index, an individual stock, and the EUR/USD exchange rate.
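The PCA-weighted KNN regression step can be sketched in numpy. The EMD preprocessing is omitted, and the specific weighting (explained-variance shares applied to PC-score features in the distance metric) is my assumption standing in for the paper's exact loading-based weighting; the data are synthetic.

```python
# KNN regression in PCA-score space with explained-variance feature weights.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 4))                  # extracted features (toy)
y = X @ np.array([2.0, 1.0, 0.5, 0.0])         # target series values (toy)

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
S = Xc @ Vt.T                                  # uncorrelated PC-score features
w = s**2 / np.sum(s**2)                        # PCA-derived feature weights

def weighted_knn_predict(S_train, y_train, s_query, weights, k=5):
    """Weighted-distance KNN; prediction = mean of the k neighbors' targets."""
    d = np.sqrt((((S_train - s_query) ** 2) * weights).sum(axis=1))
    nn = np.argsort(d)[:k]
    return y_train[nn].mean()

print(weighted_knn_predict(S, y, S[0], w, k=5))
```

Weighting the distance by each component's variance share lets the dominant extracted features drive neighbor selection.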
基金Supported by the Natural Science Foundation ofHubei Province (2005ABA256)
Abstract: This article presents an anomaly detection system based on principal component analysis (PCA) and support vector machine (SVM). The system first creates a profile defining normal behavior by a frequency-based scheme, and then compares the similarity of the current behavior with the created profile to decide whether the input instance is normal or anomalous. In order to avoid overfitting and reduce the computational burden, the principal features of normal behavior are extracted by the PCA method. SVM is used to distinguish normal from anomalous user behavior after the training procedure has been completed by learning. In performance-evaluation experiments, the system achieved a correct detection rate of 92.2% and a false detection rate of 2.8%.
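A minimal scikit-learn sketch of such a PCA-plus-SVM detector is shown below. The synthetic frequency vectors, class balance, and component count are all assumptions for illustration; the paper's frequency-based profiling is not reproduced.

```python
# PCA extracts principal features of behavior profiles; SVM labels them.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(6)
normal = rng.normal(0.0, 1.0, size=(200, 20))    # frequency-based profiles
anomaly = rng.normal(3.0, 1.0, size=(40, 20))
X = np.vstack([normal, anomaly])
y = np.array([0] * 200 + [1] * 40)               # 0 = normal, 1 = anomaly

detector = make_pipeline(PCA(n_components=5), SVC())
detector.fit(X, y)
print(detector.score(X, y))
```

Projecting onto a few principal components both regularizes the classifier and shrinks the per-instance cost of the SVM decision.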
Funding: Research Project of China Ship Development and Design Center.
Abstract: Screening similar historical fault-free candidate data greatly affects the effectiveness of fault detection results based on principal component analysis (PCA). In order to find such candidate data, this study compares unweighted and weighted similarity factors (SFs), which measure the similarity of the principal component subspaces corresponding to the first k principal components of two datasets. The fault detection employs the principal component subspaces corresponding to the current measured data and the historical fault-free data. From the historical fault-free database, the load parameters are employed to locate candidate data similar to the current operating data; the resulting fault detection method for air-conditioning systems is based on principal components. The results show that the weighted principal component SF improves both fault-free detection and fault detection: compared with the unweighted SF, the average fault-free detection rate of the weighted SF is 17.33% higher, and its average fault detection rate is 7.51% higher.
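The two subspace similarity factors compared above can be sketched in numpy. The unweighted form is the standard Krzanowski PCA similarity factor; the eigenvalue-weighted variant follows the common formulation (normalized so that identical datasets score 1), which may differ in detail from the paper's definition. The two datasets here are synthetic.

```python
# Unweighted and eigenvalue-weighted PCA similarity factors of two datasets.
import numpy as np

def pca_loadings(X, k):
    Xc = X - X.mean(axis=0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    lam = s**2 / (len(X) - 1)
    return Vt[:k].T, lam[:k]           # loadings (p x k) and eigenvalues

def sf_unweighted(L1, L2):
    """Krzanowski SF: mean squared cosine between the two k-dim subspaces."""
    k = L1.shape[1]
    return np.trace(L1.T @ L2 @ L2.T @ L1) / k

def sf_weighted(L1, lam1, L2, lam2):
    """Eigenvalue-weighted SF, normalized to 1 for identical datasets."""
    num = sum(lam1[i] * lam2[j] * (L1[:, i] @ L2[:, j]) ** 2
              for i in range(len(lam1)) for j in range(len(lam2)))
    return num / (lam1 @ lam2)

rng = np.random.default_rng(7)
A = rng.normal(size=(100, 6))                       # "current" data (toy)
B = A + 0.01 * rng.normal(size=(100, 6))            # near-identical candidate
L1, lam1 = pca_loadings(A, 3)
L2, lam2 = pca_loadings(B, 3)
print(sf_unweighted(L1, L2), sf_weighted(L1, lam1, L2, lam2))
```

Candidate datasets from the historical database would be ranked by these factors, with the weighted version emphasizing directions that carry more variance.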
基金National Natural Science Foundation of China(No.60872065)the Key Laboratory of Textile Science&Technology,Ministry of Education,China(No.P1111)+1 种基金the Key Laboratory of Advanced Textile Materials and Manufacturing Technology,Ministry of Education,China(No.2010001)the Priority Academic Program Development of Jiangsu Higher Education Institution,China
Abstract: To extract features of fabric defects effectively and reduce the dimension of the feature space, a feature extraction method for fabric defects based on the complex contourlet transform (CCT) and principal component analysis (PCA) is proposed. First, training samples of fabric defect images are decomposed by the CCT. Second, PCA is applied to the obtained low-frequency component and part of the high-frequency components to get a lower-dimensional feature space. Finally, the components of testing samples obtained by the CCT are projected onto the feature space, where different types of fabric defects are distinguished by the minimum-Euclidean-distance method. A large number of experimental results show that, compared with PCA, the method combining the wavelet low-frequency component with PCA (WLPCA), the method combining the contourlet transform with PCA (CPCA), and the method combining the wavelet low-frequency and high-frequency components with PCA (WPCA), the proposed method can extract features of common fabric defect types effectively. The recognition rate is greatly improved while the dimension is reduced.
Funding: This work was supported by the General Design Department, China Academy of Space Technology (10377).
Abstract: Spaceborne synthetic aperture radar (SAR) sparse-flight 3-D imaging forms an equivalent cross-track aperture through multiple observations in the cross-track direction, thereby achieving resolution in the third dimension. In this paper, combined with an actual triple-star orbit configuration, a sparse-flight spaceborne SAR 3-D imaging method based on the sparse spectrum of interferometry and principal component analysis (PCA) is presented. First, interferometric processing is utilized to reach an effective sparse representation of the radar images in the frequency domain. Second, the PCA, a method with a simple principle and fast calculation, is introduced to extract the main features of the image spectrum according to its principal characteristics. Finally, the 3-D image is obtained by inverse transformation of the spectrum reconstructed by the PCA. Simulation results for a 4.84 km equivalent cross-track aperture and the corresponding 1.78 m cross-track resolution verify that the method effectively suppresses both the high-frequency sidelobe noise introduced by sparse flight with a sparsity of 49% and the random noise introduced by the receiver. Meanwhile, owing to the influence of the actual triple-star orbit distribution, simulation results for a sparse flight with 7-bit Barker-code orbits are given as a comparison and reference to illustrate the significance of the orbit distribution for the reconstruction results. Owing to its short revisit period, this method is promising for sparse-flight 3-D imaging in high-latitude areas.