Ore production is usually affected by multiple influencing inputs at open-pit mines. Nevertheless, the complex nonlinear relationships between these inputs and ore production remain unclear. This becomes even more challenging when training data (e.g. truck haulage information and weather conditions) are massive. Among machine learning (ML) algorithms, the deep neural network (DNN) is a superior method for processing nonlinear and massive data by adjusting the number of neurons and hidden layers. This study adopted a DNN to forecast ore production using truck haulage information and weather conditions at open-pit mines as training data. Before the prediction models were built, principal component analysis (PCA) was employed to reduce the data dimensionality and eliminate the multicollinearity among highly correlated input variables. To verify the superiority of the DNN, three ANNs containing only one hidden layer and six traditional ML models were established as benchmark models. The DNN model with multiple hidden layers performed better than the ANN models with a single hidden layer, and it outperformed the extensively applied benchmark models in predicting ore production. This provides engineers and researchers with an accurate method to forecast ore production, which helps make sound budgetary decisions and mine plans at open-pit mines.
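The PCA-then-DNN pipeline described above can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' model: the feature dimensions, network sizes, and the use of MLPRegressor as a stand-in for the DNN are all assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for truck haulage / weather inputs (10 features)
X = rng.normal(size=(300, 10))
X[:, 5:] = X[:, :5] + 0.1 * rng.normal(size=(300, 5))  # induce multicollinearity
y = X[:, 0] ** 2 + X[:, 1] + 0.1 * rng.normal(size=300)  # nonlinear target

# Standardize, decorrelate with PCA, then fit a multi-hidden-layer network
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),  # keep components explaining 95% of the variance
    MLPRegressor(hidden_layer_sizes=(32, 16, 8), max_iter=2000, random_state=0),
)
model.fit(X, y)
r2 = model.score(X, y)  # training-set fit quality
```

Passing a float to `n_components` lets PCA choose the smallest number of components covering that variance fraction, which directly removes the redundant, collinear columns before training.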
The health condition of the milling tool has a very high impact on the machining quality of titanium components. Therefore, it is important to recognize the health condition of the tool and replace a damaged cutter at the right time. To this end, a method based on long short-term memory (LSTM) networks is proposed in this paper to recognize the tool health state. The various signals collected in tool wear experiments were analyzed by time-domain statistics, and the extracted features were then compressed by the principal component analysis (PCA) method. The preprocessed data extracted by PCA are fed to the LSTM model for recognition. Compared with the back-propagation neural network (BPNN) and the support vector machine (SVM), the proposed method can effectively exploit the time-domain regularity in the data to achieve higher recognition speed and accuracy.
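The feature-extraction front end (time-domain statistics followed by PCA compression) can be sketched as follows. The signal shapes, the particular statistics, and the number of retained components are illustrative assumptions; the LSTM stage itself is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Synthetic stand-in for milling signals: 60 samples, 3 channels, 1024 points each
signals = rng.normal(size=(60, 3, 1024))

def time_domain_features(x):
    # Per-channel statistics: mean, std, RMS, peak, skewness, kurtosis
    mean = x.mean(axis=-1)
    std = x.std(axis=-1)
    rms = np.sqrt((x ** 2).mean(axis=-1))
    peak = np.abs(x).max(axis=-1)
    z = (x - mean[..., None]) / std[..., None]
    skew = (z ** 3).mean(axis=-1)
    kurt = (z ** 4).mean(axis=-1)
    return np.concatenate([mean, std, rms, peak, skew, kurt], axis=-1)

feats = time_domain_features(signals)                 # shape (60, 18)
pca_feats = PCA(n_components=5).fit_transform(feats)  # compressed LSTM input
```

The compressed `pca_feats` rows would then be arranged into sequences and fed to the LSTM classifier.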
As the "engine" of continuous and repeated equipment operation, equipment maintenance support plays an increasingly prominent role in the confrontation of symmetrical combat systems. As the basis and guide for the planning and implementation of equipment maintenance tasks, equipment damage measurement is an important guarantee for the effective implementation of maintenance support. Firstly, starting from the basic problem of wartime equipment damage measurement, this article comprehensively analyses the factors influencing damage measurement, covering the enemy's attributes, our own attributes, and the battlefield environment. Secondly, it determines the key factors based on fuzzy comprehensive evaluation (FCE) and performs principal component analysis (PCA) on them. Finally, the principal components representing more than 85% of the data features are taken as the input and the equipment damage quantity as the output; the data are trained and tested with an artificial neural network (ANN) and a random forest (RF). In summary, FCE-PCA-RF can serve as a reference for research on wartime equipment damage estimation.
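The "retain principal components covering at least 85% of the variance, then regress with RF" step can be sketched directly. The data here are synthetic and the FCE screening stage is assumed to have already happened; this is not the article's dataset or model.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# Stand-in for the key damage factors (assumed already screened by FCE)
X = rng.normal(size=(200, 12))
# Hypothetical equipment damage quantity as the regression target
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

pca = PCA(n_components=0.85)  # keep PCs covering >= 85% of the variance
Z = pca.fit_transform(X)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(Z, y)
explained = pca.explained_variance_ratio_.sum()
```

The float `n_components=0.85` makes scikit-learn pick the smallest component count whose cumulative explained variance meets the 85% threshold named in the abstract.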
Principal component analysis (PCA) is used to analyze the high-dimensional chemistry data of laminar premixed/stratified flames under strain effects. The first few principal components (PCs) with larger contribution ratios are chosen as the tabulated scalars to build the look-up chemistry table. Prior tests show that the strained premixed flame structure can be well reconstructed. To highlight the physical meanings of the tabulated scalars in stratified flames, a modified PCA method is developed, in which the mixture fraction replaces the PC with the highest correlation coefficient. The other two tabulated scalars are then modified by Schmidt orthogonalization. The modified tabulated scalars not only have clear physical meanings but also include a passive scalar. The PCA method has good generality and can be extended to build thermo-chemistry tables including strain-rate effects when different fuels are used.
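The modified-PCA construction (replace the most-correlated PC with the mixture-fraction direction, then re-orthogonalize the remaining PCs by Gram-Schmidt) can be sketched in a few lines. The state matrix and the mixture-fraction direction below are placeholders, not flame data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in thermo-chemical state matrix: 500 samples x 8 species/temperature columns
X = rng.normal(size=(500, 8))
Xc = X - X.mean(axis=0)

# Standard PCA directions via SVD of the centred data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Vt[:3]                    # first three PC directions (unit rows)

# Hypothetical mixture-fraction direction z replaces the PC most correlated with it
z = np.zeros(8); z[0] = 1.0
corr = np.abs(pcs @ z)
keep = [i for i in range(3) if i != int(np.argmax(corr))]

# Gram-Schmidt: re-orthogonalize the two remaining PCs against z (and each other)
basis = [z]
for i in keep:
    v = pcs[i] - sum((pcs[i] @ b) * b for b in basis)
    basis.append(v / np.linalg.norm(v))
basis = np.array(basis)          # three orthonormal tabulated directions
```

The resulting rows form an orthonormal set in which one tabulated scalar is exactly the mixture fraction, mirroring the modification described in the abstract.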
Crashworthiness and lightweight optimization design of the crash box are studied in this paper. A physical test was performed to verify the initial model. Then, a parametric model built with mesh-morphing technology is used in the optimization to decrease the maximum collision force (MCF) and increase the specific energy absorption (SEA) while ensuring that mass is not increased. Because MCF and SEA are two conflicting objectives, grey relational analysis (GRA) and principal component analysis (PCA) are employed for the design optimization of the crash box. The multi-objective problem is converted to a single objective using the grey relational grade (GRG), so the proposed method can obtain the optimal combination of design parameters for the crash box. The proposed method decreases the MCF and the weight by 16.7% and 29.4% respectively, while increasing the SEA by 16.4%. Meanwhile, compared with the conventional NSGA-II method, the proposed method reduces the time cost by 103%. Hence, the proposed method can properly be applied to the optimization of the crash box.
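The GRA-PCA conversion of two conflicting objectives into one grey relational grade can be sketched numerically. The response values, the distinguishing coefficient rho = 0.5, and the use of the dominant-component loadings as weights are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical responses for 6 design candidates: [MCF (minimize), SEA (maximize)]
resp = np.array([
    [120.0, 15.0], [110.0, 14.0], [105.0, 16.5],
    [115.0, 17.0], [125.0, 13.5], [108.0, 15.8],
])

# Normalize: smaller-is-better for MCF, larger-is-better for SEA
norm = np.empty_like(resp)
norm[:, 0] = (resp[:, 0].max() - resp[:, 0]) / (resp[:, 0].max() - resp[:, 0].min())
norm[:, 1] = (resp[:, 1] - resp[:, 1].min()) / (resp[:, 1].max() - resp[:, 1].min())

# Grey relational coefficients against the ideal sequence (all ones), rho = 0.5
delta = np.abs(1.0 - norm)
rho = 0.5
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

# PCA on the coefficients supplies objective weights for the grade
cov = np.cov(xi, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
w = np.abs(eigvec[:, -1]); w /= w.sum()  # loadings of the dominant component
grg = xi @ w                             # grey relational grade per candidate
best = int(np.argmax(grg))               # single-objective winner
```

Ranking by `grg` collapses the MCF/SEA trade-off into one scalar, which is the step that lets a single-objective search replace NSGA-II.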
This paper studies the problem of tensor principal component analysis (PCA). Usually, tensor PCA is viewed as a low-rank matrix completion problem via matrix factorization techniques, and the nuclear norm is used as a convex approximation of the rank operator under mild conditions. However, most nuclear norm minimization approaches are based on SVD operations. Given an m×n matrix, the time complexity of an SVD operation is O(mn²), which brings prohibitive computational cost in large-scale problems. In this paper, an efficient and scalable algorithm for tensor principal component analysis is proposed, called the Linearized Alternating Direction Method with Vectorized technique for Tensor Principal Component Analysis (LADMVTPCA). Different from traditional matrix factorization methods, LADMVTPCA uses a vectorized representation to formulate the tensor as an outer product of vectors, which greatly improves computational efficiency compared with matrix factorization methods. In the experiments, synthetic tensor data of different orders are used to evaluate the proposed algorithm empirically. The results show that LADMVTPCA outperforms the matrix factorization based method.
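The core representational idea, writing a rank-1 tensor as an outer product of vectors and fitting the vectors directly instead of running SVDs on unfoldings, can be illustrated with a tiny alternating-least-squares sketch. This is not the LADMVTPCA ADMM scheme itself; the sizes, noise level, and iteration count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
# A rank-1 third-order tensor is an outer product of three vectors
a, b, c = rng.normal(size=5), rng.normal(size=6), rng.normal(size=7)
T = np.einsum("i,j,k->ijk", a, b, c) + 0.01 * rng.normal(size=(5, 6, 7))

# Minimal alternating-least-squares recovery of the factor vectors
u, v, w = (rng.normal(size=n) for n in (5, 6, 7))
for _ in range(30):
    u = np.einsum("ijk,j,k->i", T, v, w); u /= np.linalg.norm(u)
    v = np.einsum("ijk,i,k->j", T, u, w); v /= np.linalg.norm(v)
    w = np.einsum("ijk,i,j->k", T, u, v); w /= np.linalg.norm(w)
lam = np.einsum("ijk,i,j,k->", T, u, v, w)          # scale of the component
approx = lam * np.einsum("i,j,k->ijk", u, v, w)     # rank-1 reconstruction
rel_err = np.linalg.norm(T - approx) / np.linalg.norm(T)
```

Each update touches only vectors of length m, n, or p, which is the source of the scalability gain over repeated O(mn²) SVDs on unfolded matrices.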
To overcome the overly fine-grained granularity problem of multivariate grey incidence analysis and to explore a comprehensive incidence analysis model, three multivariate grey incidence degree models based on principal component analysis (PCA) are proposed. Firstly, the PCA method is introduced to extract the feature sequences of a behavioral matrix. Then, the grey incidence analysis between two behavioral matrices is transformed into similarity and nearness measures between their feature sequences. Based on classic grey incidence analysis theory, absolute and relative incidence degree models for feature sequences are constructed, and a comprehensive grey incidence model is proposed. Furthermore, the properties of the models are studied: it is proved that the proposed models satisfy translation invariance, multiple-transformation invariance, and the axioms of grey incidence analysis, respectively. Finally, a case study illustrates that the model is more effective than other multivariate grey incidence analysis models.
Laser-induced breakdown spectroscopy (LIBS) is a versatile tool for both qualitative and quantitative analysis. In this paper, LIBS combined with principal component analysis (PCA) and a support vector machine (SVM) is applied to rock analysis. Fourteen emission lines, including Fe, Mg, Ca, Al, Si, and Ti, are selected as analysis lines. A good accuracy (91.38% for real rock) is achieved by using the SVM to analyze the spectroscopic peak-area data processed by PCA. Combining PCA and SVM not only reduces the noise and dimensionality, which improves the efficiency of the program, but also solves the problem of linear inseparability. By this method, the ability of LIBS to classify rock is validated.
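The PCA-plus-SVM classification stage can be sketched on synthetic peak-area data. The two-class Gaussian data, the 14-feature width (matching the fourteen emission lines), and the retained component count are stand-ins, not the paper's spectra.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
# Synthetic stand-in for 14 emission-line peak areas from two rock classes
n = 100
X0 = rng.normal(loc=0.0, scale=1.0, size=(n, 14))
X1 = rng.normal(loc=1.5, scale=1.0, size=(n, 14))
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# Scale, denoise/decorrelate with PCA, then classify with an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
clf.fit(X, y)
acc = clf.score(X, y)
```

The RBF kernel handles classes that are not linearly separable in the reduced space, which is the role the SVM plays in the abstract's pipeline.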
A new watermarking scheme using principal component analysis (PCA) is described. The proposed method inserts highly robust watermarks into still images without degrading their visual quality. Experimental results are presented, showing that the PCA-based watermarks can resist malicious attacks including lowpass filtering, rescaling, and compression coding.
Ensemble-based analyses are useful for comparing equiprobable scenarios of reservoir models. However, they require a large suite of reservoir models to cover the high uncertainty in heterogeneous and complex reservoirs. For stable convergence of the ensemble Kalman filter (EnKF), increasing the ensemble size can be one solution, but it causes high computational cost in large-scale reservoir systems. In this paper, we propose a preprocessing step of good initial model selection to reduce the ensemble size; EnKF is then utilized to predict production performance stochastically. In the model selection scheme, representative models are chosen by using principal component analysis (PCA) and clustering analysis. The dimension of the initial models is reduced using PCA, and the reduced models are grouped by clustering. Then, we choose and simulate representative models from the cluster groups and compare the errors of the production predictions with historical observation data. The representative model with the minimum error is considered the best model, and we use the ensemble members near the best model in the cluster plane for applying EnKF. We demonstrate the proposed scheme on two 3D models, for which EnKF provides reliable assimilation results with much reduced computation time.
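The model-selection preprocessing (PCA projection, clustering, then one representative per cluster) can be sketched as below. The ensemble size, grid size, and cluster count are placeholder values, and the reservoir realizations are random stand-ins.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# Stand-in ensemble: 200 reservoir realizations, each flattened to 400 grid values
ensemble = rng.normal(size=(200, 400))

# Reduce to a 2D cluster plane, group the reduced models, and take the member
# nearest each cluster centre as that group's representative model
Z = PCA(n_components=2).fit_transform(ensemble)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(Z)
reps = [int(np.argmin(np.linalg.norm(Z - c, axis=1))) for c in km.cluster_centers_]
```

Only the `reps` members need full flow simulation; the EnKF ensemble is then drawn from the neighborhood of the best-matching representative in the plane `Z`.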
Existing Web service selection approaches usually assume that users have provided their preferences in a quantitative form. However, due to the subjectivity and vagueness of preferences, it may be impractical for users to specify quantitative and exact preferences. Moreover, because Quality of Service (QoS) attributes are often interrelated, existing Web service selection approaches that employ a weighted summation of QoS attribute values to compute the overall QoS of Web services may produce inaccurate results, since they do not take correlations among QoS attributes into account. To resolve these problems, a Web service selection framework considering the user's preference priority is proposed, which incorporates a searching mechanism with QoS range setting to identify services satisfying the user's QoS constraints. With the identified service candidates, an algorithm named PCA-WSS (Web Service Selection based on PCA), built on the idea of Principal Component Analysis (PCA), is proposed; it can eliminate the correlations among QoS attributes and compute the overall QoS of Web services accurately. After computing the overall QoS for each service, the algorithm ranks the Web service candidates by their overall QoS and recommends the services with the top QoS values to users. Finally, the effectiveness and feasibility of the approach are validated by experiments: the Web services selected by this approach receive higher average evaluations from users than others, and the time cost of the PCA-WSS algorithm is not acutely affected by the number of service candidates.
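One common PCA-weighting scheme for an overall QoS score, using explained-variance ratios as the component weights, can be sketched as follows. The QoS matrix, the attribute names, and the scoring rule are illustrative assumptions, not the published PCA-WSS algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(7)
# Hypothetical QoS matrix: 8 candidate services x 4 attributes, all scaled
# so that larger is better (e.g. inverted response time, throughput, ...)
qos = rng.uniform(size=(8, 4))
qos[:, 1] = 0.8 * qos[:, 0] + 0.2 * rng.uniform(size=8)  # induce correlation

scaled = MinMaxScaler().fit_transform(qos)
pca = PCA().fit(scaled)
scores = pca.transform(scaled)

# Overall QoS: component scores weighted by explained-variance ratio; the
# orthogonal components avoid double counting the correlated raw attributes
overall = scores @ pca.explained_variance_ratio_
ranking = np.argsort(-overall)  # best candidate first
```

Because the principal components are uncorrelated, a correlated pair of attributes no longer contributes twice to the score, which is the flaw of plain weighted summation that the abstract points out.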
In practical process industries, a variety of online and offline sensors and measuring instruments are used for process control and monitoring, which means that measurements coming from different sources are collected at different sampling rates. To build a complete process monitoring strategy, all these multi-rate measurements should be considered in data-based modeling and monitoring. In this paper, a novel kernel multi-rate probabilistic principal component analysis (K-MPPCA) model is proposed to extract the nonlinear correlations among measurements with different sampling rates. In the proposed model, the parameters are calibrated using the kernel trick and the expectation-maximization (EM) algorithm. The corresponding fault detection methods based on the nonlinear features are also developed. Finally, a simulated nonlinear case and an actual pre-decarburization unit in an ammonia synthesis process are tested to demonstrate the efficiency of the proposed method.
This study aimed to explore the application of surface-enhanced Raman scattering (SERS) to the rapid diagnosis of gastric cancer. The SERS spectra of 68 serum samples from gastric cancer patients and healthy volunteers were acquired. The characteristic ratio method (CRM) and principal component analysis (PCA) were used to differentiate gastric cancer serum from normal serum. Compared with healthy volunteers, the serum SERS intensity of gastric cancer patients was relatively high at 722 cm^(-1) and relatively low at 588, 644, 861, 1008, 1235, 1397, 1445 and 1586 cm^(-1). These results indicate that the relative content of nucleic acids in the serum of gastric cancer patients rises while the relative content of amino acids and carbohydrates decreases. With PCA, the sensitivity and specificity of discriminating gastric cancer were both 94.1%, with an accuracy of 94.1%. Based on the intensity ratios of the four characteristic peaks at 722, 861, 1008 and 1397 cm^(-1), CRM achieved a diagnostic sensitivity and specificity of 100% and 97.4%, respectively, with an accuracy of 98.5%. Therefore, the three peak intensity ratios I_(722)/I_(861), I_(722)/I_(1008) and I_(722)/I_(1397) can be considered biological fingerprint information for gastric cancer diagnosis and can rapidly and directly reflect the physiological and pathological changes associated with gastric cancer development. This study provides an important basis and standards for the early diagnosis of gastric cancer.
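Computing the three characteristic-ratio features from peak intensities is straightforward; a minimal sketch follows. The intensity values are invented placeholders, and any diagnostic threshold would have to be calibrated on real spectra.

```python
# Hypothetical SERS peak intensities (arbitrary units) at the four
# characteristic Raman shifts named in the study
peaks = {722: 1250.0, 861: 610.0, 1008: 540.0, 1397: 480.0}

# The three diagnostic intensity ratios
ratios = {
    "I722/I861": peaks[722] / peaks[861],
    "I722/I1008": peaks[722] / peaks[1008],
    "I722/I1397": peaks[722] / peaks[1397],
}

# Elevated ratios would reflect the higher relative nucleic-acid content
# reported for gastric cancer serum (thresholds are study-specific)
elevated = all(r > 1.0 for r in ratios.values())
```

In practice these three scalars per spectrum would feed the CRM decision rule or a downstream classifier.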
Funding: This work was supported by the Pilot Seed Grant (Grant No. RES0049944) and the Collaborative Research Project (Grant No. RES0043251) from the University of Alberta.
Funding: National Natural Science Foundation of China (No. 51805079); Shanghai Natural Science Foundation, China (No. 17ZR1400600); Fundamental Research Funds for the Central Universities, China (No. 16D110309)
Funding: Project supported by the National Natural Science Foundation of China (Nos. 91441117 and 51576182) and the Natural Key Program of Chizhou University (No. 2016ZRZ007)
Funding: Supported by the National Key Research and Development Project (2016YFB0101601)
Funding: Supported by the National Natural Science Foundation of China (71401052); the Key Project of the National Social Science Fund of China (12AZD108); the Doctoral Fund of the Ministry of Education (20120094120024); the Philosophy and Social Science Fund of Jiangsu Province Universities (2013SJD630073); the Central University Basic Service Project Fee of Hohai University (2011B09914)
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11075184); the Knowledge Innovation Program of the Chinese Academy of Sciences (CAS) (Grant No. Y03RC21124); the CAS President's International Fellowship Initiative Foundation (Grant No. 2015VMA007)
Funding: Supported by the Ministry of Trade, Industry, and Energy (20172510102090, 20142520100440, 20162010201980); the Global PhD Fellowship Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2015H1A2A1030756); the National Research Foundation of Korea (NRF) Grant (No. 2018R1C1B5045260)
Funding: Supported by the National Natural Science Foundation of China (Nos. 90818004 and 61100054); the Program for New Century Excellent Talents in University (No. NCET-10-0140); the Excellent Youth Foundation of the Hunan Scientific Committee (No. 11JJ1011); the Scientific Research Fund of the Hunan Educational Committee (Nos. 09K085 and 11B048)
Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LY19F030003); the Key Research and Development Project of Zhejiang Province (2021C04030); the National Natural Science Foundation of China (62003306); the Educational Commission Research Program of Zhejiang Province (Y202044842)
Funding: This work was supported by the Natural Science Foundation of Guangdong Province, China (2018A0303131000); the project of the Academician Workstation of Guangdong Province, China (2014B090905001); the Fundamental Research Funds for the Central Universities, China (21617406); and the Key Project of Scientific and Technological Projects of Guangzhou, China (201604040007, 201604020168)