We investigated the parametric optimization of incremental sheet forming of stainless steel using Grey Relational Analysis (GRA) coupled with Principal Component Analysis (PCA). AISI 316L stainless steel sheets were used to develop a double wall angle pyramid with the aid of a tungsten carbide tool. GRA coupled with PCA was used to plan the experimental conditions. The effects of control factors such as Tool Diameter (TD), Step Depth (SD), Bottom Wall Angle (BWA), Feed Rate (FR) and Spindle Speed (SS) on Top Wall Angle (TWA) and Top Wall Angle Surface Roughness (TWASR) were studied. Wall angle increases with increasing tool diameter owing to the larger contact area between tool and workpiece. As the step depth, feed rate and spindle speed increase, TWASR decreases with increasing tool diameter. As the step depth increases, the hydrostatic stress rises, causing severe cracks in the deformed surface. Hence it was concluded that the proposed hybrid method is suitable for optimizing the factors and responses.
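The grey-relational grading stage of such a GRA-PCA hybrid can be sketched in a few lines. The response table below is invented, not the paper's data, and using the squared first-principal-component loadings as response weights is one common weighting choice assumed here.

```python
import numpy as np

def grey_relational_grade(responses, larger_better):
    """Grey relational grade with PCA-derived response weights.

    responses: (n_runs, n_responses) measured responses.
    larger_better: bool per response (True -> maximize, e.g. TWA;
                   False -> minimize, e.g. TWASR).
    """
    X = np.asarray(responses, dtype=float)
    # 1) Grey relational normalization to [0, 1].
    lo, hi = X.min(axis=0), X.max(axis=0)
    norm = np.where(larger_better, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
    # 2) Grey relational coefficients (distinguishing coefficient 0.5).
    delta = 1.0 - norm                       # deviation from the ideal sequence
    zeta = 0.5
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    # 3) PCA on the coefficient matrix: squared PC1 loadings give the weights.
    eigval, eigvec = np.linalg.eigh(np.cov(grc, rowvar=False))
    w = eigvec[:, -1] ** 2                   # component with the largest eigenvalue
    w /= w.sum()
    return grc @ w                           # one grade per experimental run

# Hypothetical response table: columns = (TWA in degrees, TWASR in microns).
runs = np.array([[60.1, 1.9], [61.5, 2.4], [63.0, 1.2], [59.2, 2.8]])
grades = grey_relational_grade(runs, larger_better=np.array([True, False]))
best = int(np.argmax(grades))   # run 2: largest TWA and smallest TWASR
```

The run with the highest grade is the preferred parameter combination; run 2 dominates both responses here, so its grade is exactly 1.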
Ore production is usually affected by multiple influencing inputs at open-pit mines. Nevertheless, the complex nonlinear relationships between these inputs and ore production remain unclear. This becomes even more challenging when training data (e.g. truck haulage information and weather conditions) are massive. Among machine learning (ML) algorithms, the deep neural network (DNN) is a superior method for processing nonlinear and massive data by adjusting the number of neurons and hidden layers. This study adopted DNNs to forecast ore production using truck haulage information and weather conditions at open-pit mines as training data. Before the prediction models were built, principal component analysis (PCA) was employed to reduce the data dimensionality and eliminate the multicollinearity among highly correlated input variables. To verify the superiority of the DNN, three ANNs containing only one hidden layer and six traditional ML models were established as benchmark models. The DNN model with multiple hidden layers performed better than the ANN models with a single hidden layer, and it outperformed the extensively applied benchmark models in predicting ore production. This provides engineers and researchers with an accurate method to forecast ore production, which helps make sound budgetary decisions and mine planning at open-pit mines.
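The PCA preprocessing step described above can be sketched as follows. The haulage-style features are invented (trips per shift, tonnage that is nearly collinear with trips, rainfall), and the 99% explained-variance cutoff is an assumed choice, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features with built-in multicollinearity:
# trips per shift, tonnage (≈ trips x payload), and rainfall.
trips = rng.normal(100.0, 10.0, size=500)
tonnage = 35.0 * trips + rng.normal(0.0, 5.0, size=500)  # almost collinear with trips
rain = rng.normal(3.0, 1.0, size=500)
X = np.column_stack([trips, tonnage, rain])

# Standardize, then project onto the principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Keep enough components for 99% of the variance; the collinear pair
# collapses into one component, so 2 of the 3 suffice here.
k = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)
scores = Z @ Vt[:k].T        # decorrelated inputs for the downstream DNN
corr = np.corrcoef(scores, rowvar=False)
```

The score columns are uncorrelated by construction, which is exactly what removes the multicollinearity before the network is trained.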
The safety and integrity requirements of aerospace composite structures necessitate real-time health monitoring throughout their service life. To this end, distributed optical fiber sensors utilizing back Rayleigh scattering have been extensively deployed in structural health monitoring due to their advantages, such as light weight and ease of embedding. However, identifying the precise location of damage from the optical fiber signals remains a critical challenge. In this paper, a novel approach named Modified Sliding Window Principal Component Analysis (MSWPCA) is proposed to facilitate automatic damage identification and localization via distributed optical fiber sensors. The proposed method is able to extract signal characteristics obscured by measurement noise, improving the accuracy of damage detection. Specifically, we applied the MSWPCA method to monitor and analyze the debonding propagation process in honeycomb sandwich panel structures. Our findings demonstrate that the trained model exhibits high precision in detecting the location and size of honeycomb debonding, thereby facilitating reliable and efficient online assessment of the structural health state.
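MSWPCA's modifications are beyond the abstract, but the underlying sliding-window-PCA damage index can be sketched on a synthetic strain trace (all signal parameters below are assumptions, not the paper's data): windows are stacked as rows, the leading components model the healthy pattern, and the reconstruction residual of each window localizes the anomaly.

```python
import numpy as np

def window_residuals(signal, width, n_pc=2):
    """Sliding-window PCA damage index: reconstruction error of each
    window after projecting onto the n_pc leading components."""
    W = np.lib.stride_tricks.sliding_window_view(signal, width).astype(float)
    Wc = W - W.mean(axis=0)
    _, _, Vt = np.linalg.svd(Wc, full_matrices=False)
    recon = (Wc @ Vt[:n_pc].T) @ Vt[:n_pc]
    return np.linalg.norm(Wc - recon, axis=1)   # one residual per window

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 400)
strain = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.01, 400)  # healthy profile
strain[200:210] += 0.5                                       # debonding-like local shift

res = window_residuals(strain, width=20)
flagged = int(np.argmax(res))   # start index of the most anomalous window
```

Windows over the healthy sinusoid lie in a low-dimensional subspace and reconstruct almost perfectly; only the windows overlapping the local shift leave a large residual, which pins down the damage position.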
Principal Component Analysis (PCA) is a widely used technique for data analysis and dimensionality reduction, but its sensitivity to feature scale and outliers limits its applicability. Robust Principal Component Analysis (RPCA) addresses these limitations by decomposing data into a low-rank matrix capturing the underlying structure and a sparse matrix identifying outliers, enhancing robustness against noise and outliers. This paper introduces a novel RPCA variant, Robust PCA Integrating Sparse and Low-rank Priors (RPCA-SL). Each prior targets a specific aspect of the data's underlying structure, and their combination allows for a more nuanced and accurate separation of the main data components from outliers and noise. RPCA-SL is then solved by employing a proximal gradient algorithm for improved anomaly detection and data decomposition. Experimental results on simulated and real data demonstrate significant advancements.
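The RPCA-SL solver itself is not given in the abstract; as a didactic stand-in, a minimal low-rank-plus-sparse split in the same spirit alternates a truncated SVD step for L with the l1 proximal (soft-threshold) step for S. The rank, threshold, and test matrix below are all assumptions.

```python
import numpy as np

def soft(x, t):
    """Elementwise soft threshold: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def rpca_alternating(M, rank, lam, n_iter=100):
    """Split M into low-rank L plus sparse S by alternating a truncated
    SVD for L with soft thresholding for S. A sketch of the RPCA idea,
    not the RPCA-SL algorithm of the paper."""
    S = np.zeros_like(M)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        S = soft(M - L, lam)
    return L, S

rng = np.random.default_rng(0)
m = n = 40
L0 = np.outer(rng.normal(size=m), rng.normal(size=n))   # rank-1 ground truth
S0 = np.zeros((m, n))
idx = rng.choice(m * n, size=40, replace=False)         # 2.5% gross outliers
S0.flat[idx] = 8.0 * rng.choice([-1.0, 1.0], size=40)
M = L0 + S0

L, S = rpca_alternating(M, rank=1, lam=0.5)
rel_err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

With sparse, large-magnitude corruptions the low-rank part is recovered almost exactly, while plain PCA (a rank-1 SVD of M alone) would be pulled toward the outliers.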
In this research, the performance of regular rapeseed oil (RSO) and modified low-linolenic rapeseed oil (LLRO) during frying was assessed using a frying procedure commonly found in fast-food restaurants. Key physicochemical attributes of these oils were investigated. RSO and LLRO differed in initial linolenic acid content (12.21% vs. 2.59%) and linoleic acid content (19.15% vs. 24.73%). After a 6-day period of successive frying of French fries, the ratio of linoleic acid to palmitic acid dropped by 54.49% in RSO, more than in LLRO (51.54%). The increase in total oxidation value for LLRO (40.46 units) was significantly lower than that for RSO (42.58 units). The changes in carbonyl group value and iodine value throughout the frying trial were also lower in LLRO than in RSO. The formation rate of total polar compounds for LLRO was 1.08% per frying day, lower than that of RSO (1.31%). In addition, the formation of color components and the degradation of tocopherols were proportional to the frying time for both frying oils. A longer induction period was also observed in LLRO (8.87 h) compared to RSO (7.68 h) after the frying period. Overall, LLRO exhibited better frying stability, which was confirmed by principal component analysis (PCA).
The composition control of molten steel is one of the main functions in the ladle furnace (LF) refining process. In this study, a feasible model was established to predict the alloying element yield using principal component analysis (PCA) and a deep neural network (DNN). PCA was used to eliminate collinearity and reduce the dimension of the input variables, and the data processed by PCA were then used to establish the DNN model. The prediction hit ratios for the Si element yield in the error ranges of ±1%, ±3%, and ±5% are 54.0%, 93.8%, and 98.8%, respectively, whereas those of the Mn element yield in the error ranges of ±1%, ±2%, and ±3% are 77.0%, 96.3%, and 99.5%, respectively, in the PCA-DNN model. The results demonstrate that the PCA-DNN model performs better than known models such as the reference heat method, multiple linear regression, modified backpropagation, and a plain DNN model. Meanwhile, accurate prediction of the alloying element yield can greatly contribute to realizing "narrow window" control of the composition of molten steel. The prediction model for the element yield can also provide a reference for the development of an alloying control model in LF intelligent refining in the modern iron and steel industry.
The large blast furnace is essential equipment in the process of iron and steel manufacturing. Due to the complex operation process and frequent fluctuations of variables, conventional monitoring methods often raise false alarms. To address this problem, an ensemble of greedy dynamic principal component analysis-Gaussian mixture model (EGDPCA-GMM) is proposed in this paper. First, PCA-GMM is introduced to deal with the collinearity and the non-Gaussian distribution of blast furnace data. Second, in order to capture the dynamics of the data, a greedy algorithm is used to determine the extended variables and their corresponding time lags, so as to avoid introducing unnecessary noise. Then the bagging ensemble is adopted to cooperate with the greedy extension to eliminate the randomness introduced by the greedy algorithm and further reduce the false alarm rate (FAR) of the monitoring results. Finally, the algorithm is applied to the blast furnace of a large iron and steel group in South China to verify its performance. Compared with the basic algorithms, the proposed method achieves the lowest FAR while keeping the missed alarm rate (MAR) stable.
Total organic carbon (TOC) content is one of the most important parameters for characterizing the quality of source rocks and assessing the hydrocarbon-generating potential of shales. The Lucaogou Formation shale reservoirs in the Jimusaer Sag, Junggar Basin, NW China, are characterized by extremely complex lithology and a wide variety of mineral compositions, with source rocks mainly consisting of carbonaceous mudstone and dolomitic mudstone. The logging responses of organic matter in the shale reservoirs are quite different from those in conventional reservoirs. Analyses show that the traditional ΔlogR method is not suitable for evaluating the TOC content in the study area. Analysis of the sensitivity of TOC content to well logs reveals that the TOC content correlates well with the separation degree of porosity logs. After dimension reduction by principal component analysis, the principal components are determined through correlation analysis of the porosity logs. The results show that the TOC values obtained by the new method are in good agreement with those measured by core analysis. The average absolute error of the new method is only 0.555, much less than the 1.222 of the traditional ΔlogR method. The proposed method can be used to produce more accurate TOC estimates, thus providing a reliable basis for source rock mapping.
An evaluation model was established to estimate the number of houses collapsed during typhoon disasters in Zhejiang Province. The disaster-causing factors, the disaster-fostering environment and the exposure of buildings were processed by principal component analysis. The key factors were extracted to serve as the input of a support vector machine model and to build the evaluation model; the historical fitting results were in line with the facts. In a real evaluation of two typhoons that landed in Zhejiang Province in 2008 and 2009, the agreement between the evaluated results and the actual values proved the feasibility of this model.
Principal component analysis (PCA) was employed to determine the implications of geochemical and isotopic data from Cenozoic volcanic activities in the Southeast Asian region, including China (the South China Sea (SCS), Hainan Island, the Fujian-Zhejiang coast, and Taiwan Island) and parts of Vietnam and Thailand. We analyzed 15 trace element indicators and 5 isotopic indicators for 623 volcanic rock samples collected from the study region. Two principal components (PCs) were extracted by PCA based on the trace elements and Sr-Nd-Pb isotopic ratios, which probably indicate an enriched oceanic island basalt-type mantle plume and a depleted mid-ocean ridge basalt-type spreading ridge. The results show that the influence of the Hainan mantle plume on younger volcanic activities (<13 Ma) is stronger than that on older ones (>13 Ma) at the same location in the Southeast Asian region. PCA was employed to verify the mantle plume-ridge interaction model of volcanic activities beneath the expansion center of the SCS and to refute the hypothesis that the extension of the SCS was triggered by the Hainan plume. This study reveals the efficiency and applicability of PCA in discussing the mantle sources of volcanic activities; thus, PCA is a suitable research method for analyzing geochemical data.
In this research, an integrated classification method based on principal component analysis-simulated annealing genetic algorithm-fuzzy cluster means (PCA-SAGA-FCM) was proposed for the unsupervised classification of tight sandstone reservoirs that lack prior information and core experiments. A variety of evaluation parameters were selected, including lithology characteristic parameters, poro-permeability quality characteristic parameters, engineering quality characteristic parameters, and pore structure characteristic parameters. PCA was used to reduce the dimension of the evaluation parameters, and the low-dimensional data were used as input. The unsupervised classification of the tight sandstone reservoir was carried out by SAGA-FCM, and the characteristics of the reservoir in different categories were analyzed and compared with the lithological profiles. The analysis of numerical simulation and actual logging data shows that: 1) compared with the FCM algorithm, SAGA-FCM has stronger stability and higher accuracy; 2) the proposed method can cluster the reservoir flexibly and effectively according to the degree of membership; 3) the results of the integrated reservoir classification match well with the lithological profile, which demonstrates the reliability of the classification method.
This work utilizes a statistical approach, Principal Component Analysis (PCA), for the detection of Methane (CH4)-Carbon Monoxide (CO) poisoning occurring in coal mines, forest fires, drainage systems, etc., where CH4 and CO emissions are very high in closed buildings or confined spaces during oxidation processes. Both methane and carbon monoxide are highly toxic, colorless and odorless gases, and each has its own toxic level to be detected. But during their combined presence, the toxicity of either one may go unidentified because of its individually low level, which may lead to an explosion. Using PCA, the correlation of CO and CH4 data is computed, and by identifying the areas of high correlation (along the principal component axis) the explosion suppression action can be triggered earlier, thus avoiding the adverse effects of massive explosions. A Wireless Sensor Network is deployed and simulations are carried out with heterogeneous sensors (carbon monoxide and methane sensors) in the NS-2 Mannasim framework. A rise in the value of CO even when CH4 is below its toxic level may become hazardous to the people around. Thus our proposed methodology detects the combined presence of both gases (CH4 and CO) and provides an early warning in order to avoid any human losses or toxic effects.
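The core idea — flagging a jointly elevated reading along the first principal axis even when each gas is individually below its alarm level — can be sketched as follows. The ppm scales, the shared-driver simulation, and the example reading are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
base = rng.normal(0.0, 1.0, n)                      # shared oxidation/ventilation driver
ch4 = 20.0 + 4.0 * base + rng.normal(0.0, 1.0, n)   # hypothetical ppm scales
co = 15.0 + 3.0 * base + rng.normal(0.0, 1.0, n)
X = np.column_stack([ch4, co])

# Standardize, then take the first principal axis as the "combined rise" direction.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
v = eigvec[:, -1]
v = v if v[0] > 0 else -v            # orient toward jointly rising gas levels

# A reading where each gas is below a hypothetical individual alarm
# (say CH4 < 30, CO < 25) but the joint PC1 score is clearly elevated.
reading = np.array([27.0, 21.0])
z = (reading - mu) / sd
pc1_score = z @ v                    # trigger suppression when this is large
```

Because the two gases rise together, the first component loads positively on both, so a moderate simultaneous rise produces a large PC1 score even though neither univariate threshold fires.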
Machine learning algorithms (MLs) can potentially improve disease diagnostics, leading to early detection and treatment of these diseases. As a malignant tumor whose primary focus is located in the bronchial mucosal epithelium, lung cancer has the highest mortality and morbidity among cancer types, threatening the health and lives of patients suffering from the disease. Machine learning algorithms such as Random Forest (RF), Support Vector Machine (SVM), K-Nearest Neighbor (KNN) and Naïve Bayes (NB) have been used for lung cancer prediction. However, they still face challenges such as high dimensionality of the feature space, over-fitting, high computational complexity, noise and missing data, low accuracy, low precision and high error rates. Ensemble learning, which combines classifiers, may help boost prediction on new data. However, current ensemble ML techniques rarely consider comprehensive evaluation metrics when evaluating the performance of individual classifiers. The main purpose of this study was to develop an ensemble classifier that improves lung cancer prediction. An ensemble machine learning algorithm is developed based on RF, SVM, NB, and KNN. Feature selection is done based on Principal Component Analysis (PCA) and Analysis of Variance (ANOVA). This algorithm is then executed on lung cancer data and evaluated using execution time, true positives (TP), true negatives (TN), false positives (FP), false negatives (FN), false positive rate (FPR), recall (R), precision (P) and F-measure (FM). Experimental results show that the proposed ensemble classifier achieves the best classification accuracy of 0.9825, with the lowest error rate of 0.0193.
It is followed by SVM, with a classification accuracy of 0.9652 at an error rate of 0.0206. NB had the worst performance, with a classification accuracy of 0.8475 at an error rate of 0.0738.
Principal component analysis (PCA) was employed to examine the effects of the nutritional and bioactive compounds of legume milk chocolate, as well as its sensory attributes, to document the extent of the variations and their significance with respect to plant sources. PCA identified eight significant principal components, which reduce the set of variables and explain 73.5% of the total variability in the physicochemical analysis and 78.6% of the total variability in the sensory evaluation. The score plot indicates that the nutritional profiles of double bean milk chocolate incorporated with MOL and CML have high positive correlations. In the nutritional evaluation, carbohydrate and fat content show negative or minimal correlations, whereas no negative correlations were found in the sensory evaluation, which implies that every sensory variable had a high correlation with the others.
With recent advances in biotechnology, genome-wide association studies (GWAS) have been widely used to identify genetic variants that underlie human complex diseases and traits. In case-control GWAS, the typical statistical strategy is traditional logistic regression (LR) based on single-locus analysis. However, such single-locus analysis leads to the well-known multiplicity problem, with a risk of inflating the type I error and reducing power. Dimension reduction-based techniques, such as principal component-based logistic regression (PC-LR) and partial least squares-based logistic regression (PLS-LR), have recently gained much attention in the analysis of high-dimensional genomic data. However, the performance of these methods is still not clear, especially in GWAS. We conducted simulations and a real data application to compare the type I error and power of PC-LR, PLS-LR and LR applicable to GWAS within a defined single nucleotide polymorphism (SNP) set region. We found that PC-LR and PLS-LR can reasonably control the type I error under the null hypothesis. In contrast, LR corrected by the Bonferroni method was more conservative in all simulation settings. In particular, we found that PC-LR and PLS-LR had comparable power and both outperformed LR, especially when the causal SNP was in high linkage disequilibrium with genotyped ones and had a small effect size in simulation. Based on SNP set analysis, we applied all three methods to analyze non-small cell lung cancer GWAS data.
A combined model based on principal components analysis (PCA) and a generalized regression neural network (GRNN) was adopted to forecast electricity prices in a day-ahead electricity market. PCA was applied to mine the main influences on the day-ahead price, avoiding the strong correlation between the input factors that might influence the electricity price, such as the load of the forecasting hour, other historical loads and prices, weather and temperature; then the GRNN was employed to forecast the electricity price according to the main information extracted by PCA. To prove the efficiency of the combined model, a case from the PJM (Pennsylvania-New Jersey-Maryland) day-ahead electricity market was evaluated. Compared to a back-propagation (BP) neural network and a standard GRNN, the combined method reduces the mean absolute percentage error by about 3%.
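A GRNN has a closed form: it is a normalized Gaussian-kernel-weighted average of the training targets (the pattern and summation layers of the network). A minimal sketch on invented inputs — two PCA-extracted factors driving a synthetic price — looks like this; the smoothing parameter sigma is the network's only tuning knob.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN in closed form: Gaussian-kernel-weighted average of targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))     # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)     # summation / division layers

rng = np.random.default_rng(2)
# Hypothetical inputs: two PCA scores driving a day-ahead price signal.
Xtr = rng.uniform(-1.0, 1.0, size=(300, 2))
ytr = np.sin(np.pi * Xtr[:, 0]) + 0.5 * Xtr[:, 1]
Xq = np.array([[0.5, 0.0], [-0.5, 0.2]])
pred = grnn_predict(Xtr, ytr, Xq, sigma=0.15)
```

There is no iterative training: the whole "network" is the training set itself, which is why GRNNs pair naturally with a PCA front end that first shrinks and decorrelates the inputs.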
AIM: To investigate the metabolic profiles of xenograft pancreatic cancer before and after radiotherapy by high-resolution magic angle spinning proton magnetic resonance spectroscopy (HRMAS 1H NMR) combined with principal components analysis (PCA) and to evaluate the radiotherapeutic effect. METHODS: The nude mouse xenograft model of human pancreatic cancer was established by injecting human pancreatic cancer cells (SW1990) subcutaneously into nude mice. When the tumor volume reached 800 mm³, the mice received various radiation doses. Two weeks later, tumor tissue sections were prepared for the NMR measurements. 1H NMR and PCA were used to determine the changes in the metabolic profiles of tumor tissues after radiotherapy. Metabolic profiles of normal pancreas, pancreatic tumor tissues, and radiation-treated pancreatic tumor tissues were compared. RESULTS: Compared with the 1H NMR spectra of the normal nude mouse pancreas, the levels of choline, taurine, alanine, isoleucine, leucine, valine, lactate, and glutamic acid in the pancreatic cancer group were increased, whereas an opposite trend was observed for phosphocholine, glycerophosphocholine, and betaine. The ratios of phosphocholine to creatine and of glycerophosphocholine to creatine showed a noticeable decrease in the pancreatic cancer group. On further evaluation of the tissue metabolic profiles after treatment with three different radiation doses, no significant change in metabolites was observed in the 1H NMR spectra, while the inhibition of tumor growth was proportional to the radiation dose. However, the PCA results showed that the levels of choline and betaine decreased with increasing radiation dose and, conversely, the level of acetic acid dramatically increased.
CONCLUSION: The combined methods were demonstrated to have potential for the early diagnosis and assessment of the pancreatic cancer response to radiotherapy.
Water quality monitoring has one of the highest priorities in surface water protection policy. A wide variety of approaches are used to interpret and analyze the concealed variables that determine the variance of observed water quality at various source points. A considerable proportion of these approaches are mainly based on statistical methods, multivariate statistical techniques in particular. In the present study, multivariate techniques are used to reduce the large number of variables describing Nile River water quality upstream of the Cairo Drinking Water Plants (CDWPs) and to determine the relationships among them for easy and robust evaluation. By means of the multivariate statistics of principal components analysis (PCA), Fuzzy C-Means (FCM) and the K-means algorithm for clustering analysis, this study attempted to determine the major dominant factors responsible for the variation of Nile River water quality upstream of the CDWPs. Furthermore, cluster analysis classified 21 sampling stations into three clusters based on similarities of water quality features. The result of PCA shows that 6 principal components contain the key variables and account for 75.82% of the total variance of the study area's surface water quality, and the dominant water quality parameters were conductivity, iron, biological oxygen demand (BOD), total coliform (TC), ammonia (NH3), and pH. The results from both FCM clustering and the K-means algorithm, based on the concentrations of the dominant parameters, determined 3 cluster groups and produced cluster centers (prototypes). Based on the clustering classification, water quality deteriorated as the cluster number increased from 1 to 3.
The cluster grouping can also be used to identify the physical, chemical and biological processes creating the variations in the water quality parameters. This study revealed that multivariate analysis techniques, through the extracted dominant water quality parameters and clustered information, can be used to reduce the number of sampling parameters on the Nile River in a cost-effective and efficient way, instead of using a large set of parameters, without losing much information. These techniques can help decision makers obtain a global view of water quality in any surface water or other water body when analyzing large data sets, especially without a priori knowledge of the relationships between them.
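The PCA-then-clustering pipeline used above can be sketched as follows. The stations, indicators, and three-group structure are synthetic, and plain k-means (with k-means++-style seeding) stands in for both clustering algorithms.

```python
import numpy as np

def kmeans(X, k, rng, n_iter=50):
    """Lloyd's algorithm with k-means++-style seeding."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):   # seed each next center far from the existing ones
        d2 = ((X[:, None, :] - np.array(centers)[None]) ** 2).sum(-1).min(axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(5)
# Three synthetic "station" groups over six water-quality-style indicators.
protos = np.array([[0, 0, 0, 0, 0, 0],
                   [12, 0, 12, 0, 12, 0],
                   [0, 12, 0, 12, 0, 12]], dtype=float)
X = np.vstack([p + rng.normal(0.0, 0.3, size=(60, 6)) for p in protos])

# PCA down to two components, then cluster the scores.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T
labels, _ = kmeans(scores, k=3, rng=rng)
```

Because the between-group variation dominates, the first two components carry the cluster structure, and the clustering on the 2-D scores recovers the three groups exactly, which is the rationale for clustering the PCA scores rather than the full parameter set.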
Five critical quality characteristics must be controlled in the surface-mount and wire-bond processes in semiconductor packaging, and these characteristics are correlated with each other. Therefore, principal components analysis (PCA) is first used in the analysis of the sample data, and the process is then controlled with a Hotelling T² control chart for the first several principal components, which contain sufficient information. Furthermore, a software tool is developed for this kind of problem. With sample data from a surface-mount device (SMD) process, it is demonstrated that the T² control chart with PCA reaches the same conclusion as without PCA, but the problem is transformed from a high-dimensional one to a lower-dimensional one, i.e., from 5 dimensions to 2 in this demonstration.
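The monitoring scheme — fit PCA on in-control data, then chart the Hotelling T² of the retained scores — can be sketched as follows. The five correlated characteristics and their two-factor structure are invented stand-ins for the SMD data, and 9.21 is the approximate 99% chi-square limit with 2 degrees of freedom.

```python
import numpy as np

def pca_t2(X_ref, X_new, k):
    """Hotelling T^2 on the first k principal components: fit the PCA
    model on in-control reference data, then score new samples."""
    mu, sd = X_ref.mean(axis=0), X_ref.std(axis=0)
    Z = (X_ref - mu) / sd
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:k].T                          # retained loadings
    lam = s[:k] ** 2 / (len(X_ref) - 1)   # variance of each retained score
    T = ((X_new - mu) / sd) @ P
    return (T ** 2 / lam).sum(axis=1)     # one T^2 value per sample

rng = np.random.default_rng(3)
# Five correlated quality characteristics driven by two latent factors.
A = np.array([[1.0, 0.8, 0.5, 0.0, 0.3],
              [0.0, 0.3, 0.8, 1.0, 0.5]])
ref = rng.normal(size=(500, 2)) @ A + 0.1 * rng.normal(size=(500, 5))

in_ctrl = rng.normal(size=(200, 2)) @ A + 0.1 * rng.normal(size=(200, 5))
f_shift = rng.normal(size=(20, 2))
f_shift[:, 0] += 8.0                      # large shift in one latent factor
out_ctrl = f_shift @ A + 0.1 * rng.normal(size=(20, 5))

ucl = 9.21                                # approx. chi-square(2) 99% limit
t2_in = pca_t2(ref, in_ctrl, k=2)
t2_out = pca_t2(ref, out_ctrl, k=2)
```

In-control samples stay mostly below the limit while the shifted batch is flagged, yet the chart tracks only 2 statistics-bearing components instead of the original 5 variables, mirroring the dimensionality reduction described above.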
[Objective] This study was conducted to provide a theoretical reference for the comprehensive evaluation and breeding of new fresh waxy corn varieties. [Method] With 5 good fresh waxy corn varieties as experimental materials, correlation analysis and principal component analysis were performed on 13 agronomic traits, i.e., plant height, ear position, ear weight, ear diameter, axis diameter, ear length, bald tip length, ear row number, number of grains per row, 100-kernel weight, fresh ear yield, tassel length, and tassel branch number. [Result] The principal component analysis of the 13 agronomic traits showed that the first three principal components, i.e., the fresh ear yield factors, the tassel factors and the bald tip factors, had an accumulative contribution rate over 87.2767% and could basically represent the genetic information carried by the 13 traits. The first principal component is the main index for the selection and evaluation of good corn varieties, which should have a large ear, a large ear diameter but a small axis diameter (i.e., longer grains), a larger number of grains per ear, a higher 100-grain weight and a higher plant height. As to the second principal component, the plants of fresh corn varieties should have a longer tassel and not too many branches; under the premise of ensuring enough pollen for the female spike, varieties with fewer tassel branches should be selected as far as possible. From the point of view of the third principal component, bald tip length affects the marketing quality of fresh corn, and during variety evaluation and breeding, the bald tip length should be controlled at the lowest standard.
[Conclusion] The fresh ear yield of corn is closely and positively correlated with ear weight, 100-grain weight, ear diameter, number of grains per row and ear length, and plant height also affects fresh ear yield.
Funding: The ore production forecasting study was supported by the Pilot Seed Grant (Grant No. RES0049944) and the Collaborative Research Project (Grant No. RES0043251) from the University of Alberta.
Funding: Supported by the National Key Research and Development Program of China (No. 2018YFA0702800), the National Natural Science Foundation of China (No. 12072056), and the National Defense Fundamental Scientific Research Project (XXXX2018204BXXX).
Abstract: The safety and integrity requirements of aerospace composite structures necessitate real-time health monitoring throughout their service life. To this end, distributed optical fiber sensors based on Rayleigh backscattering have been extensively deployed in structural health monitoring owing to advantages such as light weight and ease of embedding. However, identifying the precise location of damage from the optical fiber signals remains a critical challenge. In this paper, a novel approach named Modified Sliding Window Principal Component Analysis (MSWPCA) is proposed to enable automatic damage identification and localization via distributed optical fiber sensors. The proposed method extracts signal characteristics obscured by measurement noise to improve the accuracy of damage detection. Specifically, we applied the MSWPCA method to monitor and analyze the debonding propagation process in honeycomb sandwich panel structures. Our findings demonstrate that the trained model detects the location and size of honeycomb debonding with high precision, thereby enabling reliable and efficient online assessment of the structural health state.
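As a rough illustration of the underlying idea, here is a plain sliding-window PCA (without the paper's modification): for each window position along the fiber, fit a PCA subspace on healthy baseline traces and flag windows whose out-of-subspace reconstruction error spikes. The traces, window size and component count are synthetic assumptions:

```python
import numpy as np

def swpca_errors(healthy, test, win=20, n_pc=2):
    """healthy: (n_traces, n_points) baseline fibre traces;
    test: (n_points,) trace to inspect. Returns one error per window."""
    n_pts = healthy.shape[1]
    errs = []
    for start in range(0, n_pts - win + 1, win):
        Xw = healthy[:, start:start + win]
        mu = Xw.mean(axis=0)
        # Principal directions of the healthy window (via SVD).
        _, _, vt = np.linalg.svd(Xw - mu, full_matrices=False)
        P = vt[:n_pc].T
        t = test[start:start + win] - mu
        resid = t - P @ (P.T @ t)          # out-of-subspace residual
        errs.append(float(resid @ resid))
    return np.array(errs)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
healthy = np.sin(2 * np.pi * x) + 0.02 * rng.normal(size=(30, 200))
test = np.sin(2 * np.pi * x) + 0.02 * rng.normal(size=200)
test[120:140] += 0.5                       # simulated debonding signature
err = swpca_errors(healthy, test)
damaged_window = int(np.argmax(err))       # window covering points 120-139
```

The window with the largest residual localizes the simulated damage; in a real deployment the window stride and subspace dimension would be tuned to the fiber's spatial resolution.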
Abstract: Principal Component Analysis (PCA) is a widely used technique for data analysis and dimensionality reduction, but its sensitivity to feature scale and to outliers limits its applicability. Robust Principal Component Analysis (RPCA) addresses these limitations by decomposing data into a low-rank matrix capturing the underlying structure and a sparse matrix identifying outliers, enhancing robustness against noise and outliers. This paper introduces a novel RPCA variant, Robust PCA Integrating Sparse and Low-rank Priors (RPCA-SL). Each prior targets a specific aspect of the data's underlying structure, and their combination allows a more nuanced and accurate separation of the main data components from outliers and noise. RPCA-SL is then solved with a proximal gradient algorithm for improved anomaly detection and data decomposition. Experimental results on simulated and real data demonstrate significant advancements.
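For reference, the classical RPCA decomposition the paper builds on (principal component pursuit) can be sketched with a simple augmented-Lagrangian loop; this is the standard baseline, not the paper's RPCA-SL proximal scheme, and the penalty parameters are the common textbook defaults:

```python
import numpy as np

def shrink(X, tau):
    """Entrywise soft thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(M, lam=None, mu=None, n_iter=200):
    """Decompose M ~ L (low rank) + S (sparse) by alternating
    singular-value thresholding and soft thresholding (ADMM)."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or m * n / (4.0 * np.abs(M).sum())
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(shrink(sig, 1.0 / mu)) @ Vt  # singular-value threshold
        S = shrink(M - L + Y / mu, lam / mu)         # sparse outlier update
        Y = Y + mu * (M - L - S)                     # dual variable update
    return L, S

rng = np.random.default_rng(2)
L0 = rng.normal(size=(40, 3)) @ rng.normal(size=(3, 40))  # rank-3 ground truth
S0 = np.zeros((40, 40))
S0.flat[rng.choice(1600, 40, replace=False)] = 10.0       # sparse outliers
L, S = rpca(L0 + S0)
rel_err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

Under the usual incoherence conditions the low-rank part is recovered almost exactly, which is the separation property RPCA-SL refines with its additional priors.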
Funding: This work was financially supported by the Science and Technology Research Project of Jiangxi Provincial Education Department (GJJ210322) and the National Natural Science Foundation of China (No. 32260635).
Abstract: In this research, the performance of regular rapeseed oil (RSO) and modified low-linolenic rapeseed oil (LLRO) during frying was assessed using a frying procedure commonly found in fast-food restaurants. Key physicochemical attributes of these oils were investigated. RSO and LLRO differed in initial linolenic acid (12.21% vs. 2.59%) and linoleic acid (19.15% vs. 24.73%) contents. After a 6-day period of successively frying French fries, the ratio of linoleic acid to palmitic acid dropped by 54.49% in RSO, more than in LLRO (51.54%). The increase in total oxidation value for LLRO (40.46 units) was significantly lower than that of RSO (42.58 units). The changes in carbonyl group value and iodine value throughout the frying trial were also lower in LLRO than in RSO. The formation rate of total polar compounds for LLRO was 1.08% per frying day, lower than that of RSO (1.31%). In addition, the formation of color components and the degradation of tocopherols were proportional to frying time for both oils. A longer induction period was also observed in LLRO (8.87 h) than in RSO (7.68 h) after the frying period. Overall, LLRO exhibited better frying stability, which was confirmed by principal component analysis (PCA).
Funding: Supported by the National Natural Science Foundation of China (No. 51974023) and the State Key Laboratory of Advanced Metallurgy, University of Science and Technology Beijing (No. 41621005).
Abstract: The composition control of molten steel is one of the main functions of the ladle furnace (LF) refining process. In this study, a feasible model was established to predict the alloying element yield using principal component analysis (PCA) and a deep neural network (DNN). PCA was used to eliminate collinearity and reduce the dimension of the input variables, and the data processed by PCA were then used to establish the DNN model. In the PCA-DNN model, the prediction hit ratios for the Si element yield within error ranges of ±1%, ±3%, and ±5% are 54.0%, 93.8%, and 98.8%, respectively, whereas those for the Mn element yield within error ranges of ±1%, ±2%, and ±3% are 77.0%, 96.3%, and 99.5%, respectively. The results demonstrate that the PCA-DNN model performs better than known models such as the reference heat method, multiple linear regression, modified backpropagation, and a plain DNN model. Accurate prediction of the alloying element yield can greatly contribute to realizing "narrow window" composition control of molten steel. The prediction model for the element yield can also serve as a reference for developing alloying control models for intelligent LF refining in the modern iron and steel industry.
Funding: Supported by the National Natural Science Foundation of China (61903326, 61933015).
Abstract: The large blast furnace is essential equipment in iron and steel manufacturing. Because of the complex operation process and frequent fluctuations of variables, conventional monitoring methods often raise false alarms. To address this problem, an ensemble of greedy dynamic principal component analysis-Gaussian mixture model (EGDPCA-GMM) is proposed in this paper. First, PCA-GMM is introduced to deal with the collinearity and the non-Gaussian distribution of blast furnace data. Second, to capture the dynamics of the data, a greedy algorithm is used to determine the extended variables and their corresponding time lags, avoiding the introduction of unnecessary noise. A bagging ensemble is then combined with the greedy extension to eliminate the randomness introduced by the greedy algorithm and further reduce the false alarm rate (FAR) of the monitoring results. Finally, the algorithm is applied to the blast furnace of a large iron and steel group in South China to verify its performance. Compared with the baseline algorithms, the proposed method achieves the lowest FAR while keeping the missed alarm rate (MAR) stable.
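The static PCA-GMM core of such a monitor (without the greedy dynamic extension or the bagging ensemble) can be sketched as follows: project the data onto principal components, fit a Gaussian mixture on normal operation, and alarm when the log-likelihood falls below a control limit. The two-mode data and the 1% limit are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Two normal operating modes -> non-Gaussian overall distribution,
# which is why a GMM is fitted on the PCA scores instead of T^2/SPE.
normal = np.vstack([rng.normal(0, 1, size=(200, 5)),
                    rng.normal(4, 1, size=(200, 5))])
pca = PCA(n_components=2).fit(normal)
gmm = GaussianMixture(n_components=2, random_state=0).fit(pca.transform(normal))

# Control limit: 1st percentile of the training log-likelihood.
limit = np.percentile(gmm.score_samples(pca.transform(normal)), 1)

fault = rng.normal(10, 1, size=(20, 5))        # shifted, faulty operation
alarms = gmm.score_samples(pca.transform(fault)) < limit
far = (gmm.score_samples(pca.transform(normal)) < limit).mean()
```

By construction the false alarm rate on normal data sits near the chosen percentile, while the shifted fault falls far outside both mixture components and triggers every alarm.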
Funding: This research was funded by the National Natural Science Foundation of China (Grant No. 41504103).
Abstract: Total organic carbon (TOC) content is one of the most important parameters for characterizing the quality of source rocks and assessing the hydrocarbon-generating potential of shales. The Lucaogou Formation shale reservoirs in the Jimusaer Sag, Junggar Basin, NW China, are characterized by extremely complex lithology and a wide variety of mineral compositions, with source rocks consisting mainly of carbonaceous mudstone and dolomitic mudstone. The logging responses of organic matter in these shale reservoirs differ considerably from those in conventional reservoirs. Analyses show that the traditional ΔlogR method is not suitable for evaluating the TOC content in the study area. Analysis of the sensitivity of TOC content to well logs reveals that the TOC content correlates well with the degree of separation of the porosity logs. After dimension reduction by principal component analysis, the principal components are determined through correlation analysis of the porosity logs. The results show that the TOC values obtained by the new method agree well with those measured by core analysis. The average absolute error of the new method is only 0.555, much less than the 1.222 of the traditional ΔlogR method. The proposed method produces more accurate TOC estimates, thus providing a reliable basis for source rock mapping.
Funding: Supported by the Scientific Research Project for Commonwealth (GYHY200806017) and the Innovation Project for Graduates of Jiangsu Province (CX09S-018Z).
Abstract: An evaluation model was established to estimate the number of houses collapsed during typhoon disasters in Zhejiang Province. The disaster-causing factors, the disaster-fostering environment and the exposure of buildings were processed by principal component analysis. The key factors were extracted as inputs to a support vector machine model to build the evaluation model; the historical fitting results agreed well with the observed facts. In a real evaluation of two typhoons that made landfall in Zhejiang Province in 2008 and 2009, the agreement between the evaluated and actual values proved the feasibility of the model.
Funding: Supported by the State Key Laboratory of Marine Environmental Science Visiting Fellowship (No. MELRS2233), the State Key Laboratory of Marine Geology, Tongji University (No. MGK202302), the Innovation Group Project of Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai) (No. 311021003), the Zhujiang Talent Project Foundation of Guangdong Province (No. 2017ZT07Z066), the Fundamental Research Funds for the Central Universities, Sun Yat-sen University (Nos. 22qntd2101, 2021qntd23), the Major Projects of the National Natural Science Foundation of China (Nos. 41790465, 41590863), and the National Natural Science Foundation of China (Nos. 42102333, 41806077, 41904045).
Abstract: Principal component analysis (PCA) was employed to interpret geochemical and isotopic data from Cenozoic volcanic activities in the Southeast Asian region, including China (the South China Sea (SCS), Hainan Island, the Fujian-Zhejiang coast, and Taiwan Island) and parts of Vietnam and Thailand. We analyzed 15 trace element indicators and 5 isotopic indicators for 623 volcanic rock samples collected from the study region. Two principal components (PCs) were extracted by PCA from the trace elements and Sr-Nd-Pb isotopic ratios; they probably represent an enriched oceanic island basalt-type mantle plume and a depleted mid-ocean ridge basalt-type spreading ridge. The results show that the influence of the Hainan mantle plume on younger volcanic activities (<13 Ma) is stronger than on older ones (>13 Ma) at the same locations in the Southeast Asian region. PCA was employed to verify the mantle plume-ridge interaction model of volcanic activities beneath the SCS spreading center and to refute the hypothesis that the extension of the SCS was triggered by the Hainan plume. This study demonstrates the efficiency and applicability of PCA for investigating the mantle sources of volcanic activities; PCA is thus a suitable method for analyzing geochemical data.
Funding: Funded by the National Natural Science Foundation of China (42174131) and the Strategic Cooperation Technology Projects of CNPC and CUPB (ZLZX2020-03).
Abstract: In this research, an integrated classification method based on principal component analysis-simulated annealing genetic algorithm-fuzzy cluster means (PCA-SAGA-FCM) was proposed for the unsupervised classification of tight sandstone reservoirs for which prior information and core experiments are lacking. A variety of evaluation parameters were selected, including lithology characteristic parameters, poro-permeability quality characteristic parameters, engineering quality characteristic parameters, and pore structure characteristic parameters. PCA was used to reduce the dimension of the evaluation parameters, and the low-dimensional data were used as input. Unsupervised classification of the tight sandstone reservoir was carried out by SAGA-FCM, and the characteristics of each reservoir category were analyzed and compared with the lithological profiles. The analysis of numerical simulations and actual logging data shows that: 1) compared with the FCM algorithm, SAGA-FCM has stronger stability and higher accuracy; 2) the proposed method can cluster the reservoir flexibly and effectively according to the degree of membership; 3) the results of the integrated reservoir classification match the lithologic profile well, which demonstrates the reliability of the classification method.
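The membership-based clustering at the heart of this method can be illustrated with a plain fuzzy c-means on PCA-reduced features (random initialisation; the simulated-annealing genetic-algorithm initialisation is not reproduced here, and the two "reservoir classes" are synthetic):

```python
import numpy as np

def fcm(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns membership matrix U and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # fuzzy memberships
    for _ in range(n_iter):
        W = U ** m                                 # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))         # inverse-distance update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

rng = np.random.default_rng(4)
# Two reservoir classes in a reduced 2-D principal-component space.
X = np.vstack([rng.normal(0, 0.3, size=(50, 2)),
               rng.normal(2, 0.3, size=(50, 2))])
U, centers = fcm(X, c=2)
labels = U.argmax(axis=1)
same_a = bool((labels[:50] == labels[0]).all())    # group 1 internally consistent
same_b = bool((labels[50:] == labels[50]).all())   # group 2 internally consistent
```

Unlike hard k-means, the rows of `U` give graded memberships, which is what lets the reservoir be clustered "flexibly according to the degree of membership".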
Abstract: This work applies a statistical approach based on Principal Component Analysis (PCA) to the detection of methane (CH4)-carbon monoxide (CO) poisoning in coal mines, forest fires, drainage systems, etc., where CH4 and CO emissions are very high in closed buildings or confined spaces during oxidation processes. Both methane and carbon monoxide are highly toxic, colorless and odorless gases, and each has its own toxic level to be detected. When both are present, however, the toxicity of either one can go unidentified because of its low individual level, which may lead to an explosion. Using PCA, the correlation of CO and CH4 data is computed, and by identifying the areas of high correlation (along the principal component axis) the explosion suppression action can be triggered earlier, avoiding the adverse effects of massive explosions. A wireless sensor network is deployed, and simulations are carried out with heterogeneous sensors (carbon monoxide and methane sensors) in the NS-2 Mannasim framework. A rise in the value of CO even when CH4 is below its toxic level may become hazardous to the people around. The proposed methodology therefore detects the combined presence of both gases (CH4 and CO) and provides an early warning in order to avoid human losses or toxic effects.
Abstract: Machine learning algorithms (MLs) can potentially improve disease diagnostics, leading to early detection and treatment. As a malignant tumor whose primary focus is located in the bronchial mucosal epithelium, lung cancer has the highest mortality and morbidity among cancer types, threatening the health and life of patients. Machine learning algorithms such as Random Forest (RF), Support Vector Machine (SVM), K-Nearest Neighbor (KNN) and Naïve Bayes (NB) have been used for lung cancer prediction, but they still face challenges such as high dimensionality of the feature space, over-fitting, high computational complexity, noise, missing data, low accuracy, low precision and high error rates. Ensemble learning, which combines classifiers, may help boost prediction on new data, yet current ensemble ML techniques rarely use comprehensive evaluation metrics to assess the performance of the individual classifiers. The main purpose of this study was to develop an ensemble classifier that improves lung cancer prediction. An ensemble machine learning algorithm is developed based on RF, SVM, NB, and KNN. Feature selection is done using Principal Component Analysis (PCA) and Analysis of Variance (ANOVA). The algorithm is then executed on lung cancer data and evaluated using execution time, true positives (TP), true negatives (TN), false positives (FP), false negatives (FN), false positive rate (FPR), recall (R), precision (P) and F-measure (FM). Experimental results show that the proposed ensemble classifier achieves the best classification accuracy of 0.9825 with the lowest error rate of 0.0193, followed by SVM with an accuracy of 0.9652 at an error rate of 0.0206. NB had the worst performance, with an accuracy of 0.8475 at an error rate of 0.0738.
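An ensemble of this shape is straightforward to express in scikit-learn; the sketch below combines RF, SVM, NB and KNN behind ANOVA-based feature selection on a synthetic binary task. The k = 10 feature budget, the soft-voting rule and the synthetic data are assumptions, not the study's setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic binary classification task standing in for the cancer data.
X, y = make_classification(n_samples=400, n_features=30, n_informative=6,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

ensemble = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),      # ANOVA F-test feature selection
    VotingClassifier([
        ("rf", RandomForestClassifier(random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
    ], voting="soft"),                 # average the predicted probabilities
)
acc = ensemble.fit(Xtr, ytr).score(Xte, yte)
```

Soft voting requires each base learner to expose probabilities (hence `probability=True` on the SVC); hard majority voting would drop that requirement at the cost of discarding confidence information.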
Abstract: Principal component analysis (PCA) was employed to examine the nutritional and bioactive compounds of legume milk chocolate, as well as its sensory attributes, to document the extent of the variations and their significance with respect to the plant sources. PCA identified eight significant principal components, which reduce the variables to one principal component explaining 73.5% of the total variability in the physicochemical analysis and 78.6% of the total variability in the sensory evaluation. The score plot indicates that double bean milk chocolate incorporated with MOL and CML has high positive correlations in the nutritional profile. In the nutritional evaluation, carbohydrate and fat contents show negative or minimal correlations, whereas no negative correlations were found in the sensory evaluation, which implies that every sensory variable was highly correlated with the others.
Funding: Funded by the National Natural Science Foundation of China (81202283, 81473070, 81373102 and 81202267), the Key Grant of Natural Science Foundation of the Jiangsu Higher Education Institutions of China (10KJA330034 and 11KJA330001), the Research Fund for the Doctoral Program of Higher Education of China (20113234110002), and the Priority Academic Program for the Development of Jiangsu Higher Education Institutions (Public Health and Preventive Medicine).
Abstract: With recent advances in biotechnology, genome-wide association studies (GWAS) have been widely used to identify genetic variants that underlie human complex diseases and traits. In case-control GWAS, the typical statistical strategy is traditional logistic regression (LR) based on single-locus analysis. However, such single-locus analysis leads to the well-known multiplicity problem, with a risk of inflating type I error and reducing power. Dimension reduction-based techniques, such as principal component-based logistic regression (PC-LR) and partial least squares-based logistic regression (PLS-LR), have recently gained much attention in the analysis of high-dimensional genomic data. However, the performance of these methods is still not clear, especially in GWAS. We conducted simulations and a real data application to compare the type I error and power of PC-LR, PLS-LR and LR in GWAS within a defined single nucleotide polymorphism (SNP) set region. We found that PC-LR and PLS-LR can reasonably control type I error under the null hypothesis. In contrast, LR corrected by the Bonferroni method was more conservative in all simulation settings. In particular, PC-LR and PLS-LR had comparable power and both outperformed LR, especially when the causal SNP was in high linkage disequilibrium with the genotyped ones and had a small effect size in the simulations. Based on SNP set analysis, we applied all three methods to non-small cell lung cancer GWAS data.
Funding: Project (70671039) supported by the National Natural Science Foundation of China.
Abstract: A combined model based on principal component analysis (PCA) and a generalized regression neural network (GRNN) was adopted to forecast electricity prices in the day-ahead electricity market. PCA was applied to extract the main influences on the day-ahead price, avoiding the strong correlations between input factors that might influence the electricity price, such as the load of the forecasting hour, other historical loads and prices, weather and temperature; GRNN was then employed to forecast the electricity price from the main information extracted by PCA. To prove the efficiency of the combined model, a case from the PJM (Pennsylvania-New Jersey-Maryland) day-ahead electricity market was evaluated. Compared with a back-propagation (BP) neural network and a standard GRNN, the combined method reduces the mean absolute percentage error by about 3%.
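A GRNN is, in essence, a Gaussian-kernel weighted average of the training targets, so a compact PCA-then-GRNN pipeline fits in a few lines. The smoothing parameter σ, the retained component count and the synthetic data are assumptions for illustration, not the study's configuration:

```python
import numpy as np

def grnn_predict(Xtr, ytr, Xte, sigma=0.25):
    """GRNN: Gaussian-kernel weighted average of the training targets."""
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return (K @ ytr) / K.sum(axis=1)

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(500, 6))
X[:, 3:] = X[:, :3] + 0.01 * rng.normal(size=(500, 3))   # collinear inputs
y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.normal(size=500)  # smooth "price"

# PCA step: project out the redundant, strongly correlated directions.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ vt[:3].T                                        # keep 3 components

pred = grnn_predict(Z[:400], y[:400], Z[400:])
mae = np.abs(pred - y[400:]).mean()
base_mae = np.abs(y[400:] - y[:400].mean()).mean()       # mean-predictor baseline
```

Because the GRNN has no iterative training (only σ to choose), removing redundant input directions with PCA directly sharpens the kernel distances the prediction depends on.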
Funding: Supported by the Medical Imageology Special Purpose Foundation of Cancer Hospital/Cancer Institute, Fudan University, No. YX200802.
Abstract: AIM: To investigate the metabolic profiles of xenograft pancreatic cancer before and after radiotherapy by high-resolution magic angle spinning proton magnetic resonance spectroscopy (HRMAS 1H NMR) combined with principal components analysis (PCA), and to evaluate the radiotherapeutic effect. METHODS: A nude mouse xenograft model of human pancreatic cancer was established by injecting the human pancreatic cancer cell line SW1990 subcutaneously into nude mice. When the tumor volume reached 800 mm3, the mice received various radiation doses. Two weeks later, tumor tissue sections were prepared for the NMR measurements. 1H NMR and PCA were used to determine the changes in the metabolic profiles of tumor tissues after radiotherapy. Metabolic profiles of normal pancreas, pancreatic tumor tissues, and radiation-treated pancreatic tumor tissues were compared. RESULTS: Compared with the 1H NMR spectra of the normal nude mouse pancreas, the levels of choline, taurine, alanine, isoleucine, leucine, valine, lactate, and glutamic acid in the pancreatic cancer group were increased, whereas an opposite trend was observed for phosphocholine, glycerophosphocholine, and betaine. The ratios of phosphocholine to creatine and of glycerophosphocholine to creatine showed a noticeable decrease in the pancreatic cancer group. On further evaluation of the tissue metabolic profiles after treatment with three different radiation doses, no significant change in metabolites was observed in the 1H NMR spectra, while the inhibition of tumor growth was proportional to the radiation dose. However, the PCA results showed that the levels of choline and betaine decreased with increasing radiation dose and, conversely, the level of acetic acid dramatically increased. CONCLUSION: The combined methods were demonstrated to have the potential to allow early diagnosis and assessment of the response of pancreatic cancer to radiotherapy.
Abstract: Water quality monitoring has one of the highest priorities in surface water protection policy. A wide variety of approaches are used to interpret and analyze the concealed variables that determine the variance of the observed water quality at various source points. A considerable proportion of these approaches are based mainly on statistical methods, multivariate statistical techniques in particular. In the present study, multivariate techniques are required to reduce the large number of variables describing Nile River water quality upstream of the Cairo Drinking Water Plants (CDWPs) and to determine the relationships among them for easy and robust evaluation. By means of principal components analysis (PCA), Fuzzy C-Means (FCM) and the K-means algorithm for cluster analysis, this study attempted to determine the major dominant factors responsible for the variation of Nile River water quality upstream of the CDWPs. Furthermore, cluster analysis classified 21 sampling stations into three clusters based on the similarity of water quality features. The PCA result shows that 6 principal components contain the key variables and account for 75.82% of the total variance of the study area's surface water quality, and the dominant water quality parameters were conductivity, iron, biological oxygen demand (BOD), total coliform (TC), ammonia (NH3), and pH. The results of both FCM clustering and the K-means algorithm, based on the dominant parameter concentrations, identified 3 cluster groups and produced cluster centers (prototypes). The clustering classification showed water quality deteriorating as the cluster number increased from 1 to 3; the cluster grouping can be used to identify the physical, chemical and biological processes creating the variations in the water quality parameters.
This study revealed that multivariate analysis techniques, through the extracted dominant water quality parameters and the clustered information, can be used to reduce the number of sampled parameters on the Nile River in a cost-effective and efficient way, instead of using a large set of parameters, without losing much information. These techniques can help decision makers obtain a global view of the water quality in any surface water or other water body when analyzing large data sets, especially without a priori knowledge of the relationships between them.
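The PCA-then-clustering workflow for monitoring stations can be sketched with k-means as below (21 synthetic stations around three pollution levels; the FCM branch would follow the same pipeline with a fuzzy clusterer, and the variable names are stand-ins for the dominant parameters):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# 21 stations x 6 variables (stand-ins for conductivity, Fe, BOD, TC,
# NH3 and pH), generated around three pollution levels of 7 stations each.
levels = np.repeat([[0.0], [2.0], [4.0]], 7, axis=0)
X = levels + 0.2 * rng.normal(size=(21, 6))

# Standardize, reduce the correlated variables, then cluster the scores.
Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
n_groups = len(set(labels))
# Each block of 7 same-level stations should land in a single cluster.
pure = all(len(set(labels[i:i + 7])) == 1 for i in range(0, 21, 7))
```

Clustering in the reduced PC space rather than on the raw variables is what makes the grouping robust to the strong correlations among the water quality parameters.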
Funding: This project was supported by the National Natural Science Foundation of China (No. 70372062) and the Hi-Tech Program of Tianjin, China (No. 04310881R).
Abstract: Five critical quality characteristics must be controlled in the surface mount and wire-bond processes in semiconductor packaging, and these characteristics are correlated with each other. Principal components analysis (PCA) is therefore first applied to the sample data, and the process is then controlled with a Hotelling T² control chart for the first several principal components, which contain sufficient information. Furthermore, a software tool was developed for this kind of problem. With sample data from a surface mounting device (SMD) process, it is demonstrated that the T² control chart with PCA reaches the same conclusion as without PCA, but the problem is transformed from a high-dimensional one to a lower-dimensional one, i.e., from 5 dimensions to 2 in this demonstration.
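A T² chart on the first k principal components can be sketched as follows, with 5 correlated characteristics reduced to 2 retained PCs as in the demonstration. The synthetic data and the common F-distribution control-limit formula are illustrative assumptions:

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(7)
n, p, k = 100, 5, 2                        # samples, characteristics, PCs kept
base = rng.normal(size=(n, 2))
X = base @ rng.normal(size=(2, p)) + 0.1 * rng.normal(size=(n, p))  # correlated

# PCA on standardised data; keep the k highest-variance score columns.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Xs, rowvar=False))
scores = Xs @ eigvec[:, ::-1][:, :k]       # PCs ordered by variance
lam = eigval[::-1][:k]                     # corresponding eigenvalues

# Hotelling T^2 on the retained components, with the usual F-based UCL.
t2 = (scores ** 2 / lam).sum(axis=1)
ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * f.ppf(0.99, k, n - k)
in_control = (t2 <= ucl).mean()            # fraction of points below the limit
```

Because the five characteristics are nearly rank-2, the two retained components carry almost all the variation, so charting `t2` against `ucl` monitors essentially the same information as the full 5-dimensional chart.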
Abstract: [Objective] This study was conducted to provide a theoretical reference for the comprehensive evaluation and breeding of new fresh waxy corn varieties. [Method] With 5 good fresh waxy corn varieties as experimental materials, correlation analysis and principal component analysis were performed on 13 agronomic traits, i.e., plant height, ear position, ear weight, ear diameter, axis diameter, ear length, bald tip length, ear row number, number of grains per row, 100-kernel weight, fresh ear yield, tassel length, and tassel branch number. [Result] The principal component analysis of the 13 agronomic traits showed that the first three principal components, i.e., the fresh ear yield factors, the tassel factors and the bald tip factors, had an accumulative contribution rate over 87.2767% and could basically represent the genetic information carried by the 13 traits. The first principal component is the main index for the selection and evaluation of good corn varieties, which should have a large ear, a large ear diameter but a small axis diameter (i.e., longer grains), a larger number of grains per ear, a higher 100-grain weight and greater plant height. As to the second principal component, the plants of fresh corn varieties should have a longer tassel and not too many branches; under the premise of ensuring enough pollen for the female spike, varieties with fewer tassel branches should be selected as far as possible. From the point of view of the third principal component, bald tip length affects the marketing quality of fresh corn, and during variety evaluation and breeding, the bald tip length should be controlled at the lowest standard. [Conclusion] The fresh ear yield of corn is in close positive correlation with ear weight, 100-grain weight, ear diameter, number of grains per row and ear length, and plant height also affects fresh ear yield.