Online demand prediction plays an important role in transport network services, from operations and control to management and information provision. However, online prediction models are affected by streaming data quality issues such as noisy measurements and missing data. To address these, we develop a robust method for online network-level demand prediction in public transport. It consists of a PCA method to extract eigen demand images and an optimization-based pattern recognition model that predicts the weights of the eigen demand images using the partially observed real-time data up to the prediction time in a day. The prediction model is robust to data quality issues because the eigen demand images are stable and their predicted weights are optimized using network-level data (which is less affected by local data quality issues). In the case study, we validate the accuracy and transferability of the model against benchmark models and evaluate its robustness in tolerating data quality issues. The experimental results demonstrate that the proposed Pattern Recognition Prediction based on PCA (PRP-PCA) consistently outperforms the benchmark models in accuracy and transferability. Moreover, the model is highly robust to data quality issues; for example, PRP-PCA tolerates up to 50% missing data regardless of the noise level. We also discuss the hidden patterns behind network-level demand. The visualization analysis shows that the eigen demand images are strongly connected to the network structure and to variability in station activity. Although demand changed dramatically before and after the pandemic, the eigen demand images remained consistent over time in Stockholm.
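The core of the approach above can be sketched in a few lines: extract "eigen demand images" from historical daily demand matrices with PCA, then fit their weights by least squares on only the entries observed so far in the day. This is a minimal illustration with synthetic data; the day/station/hour dimensions, the number of components, and the least-squares weight fit are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 60 historical days of demand at 20 stations x 24 hours,
# each day flattened to a vector of length 480.
n_days, n_feat = 60, 20 * 24
X = rng.poisson(10.0, size=(n_days, n_feat)).astype(float)

# Step 1: extract "eigen demand images" via PCA on historical days.
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 5
eigen_images = Vt[:k]                       # (k, n_feat) principal components

# Step 2: for a partially observed day (only the first 12 hours at each
# station), estimate component weights by least squares on the observed
# entries, then reconstruct the full day's demand.
observed = np.zeros(n_feat, dtype=bool)
observed.reshape(20, 24)[:, :12] = True
day = X[0]
w, *_ = np.linalg.lstsq(eigen_images[:, observed].T,
                        (day - mean)[observed], rcond=None)
prediction = mean + w @ eigen_images        # full-day network prediction
```

Because the weights are fitted against all observed network entries at once, a few noisy or missing local readings perturb them only mildly, which is the intuition behind the robustness claim.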
Block principal component analysis (BPCA) is a recently developed technique in computer vision and pattern classification. In this paper, we propose a robust and sparse BPCA with the Lp-norm, referred to as BPCALp-S, which inherits the robustness of BPCA-L1 through the use of an adjustable Lp-norm. To perform sparse modelling, the elastic net is integrated into the objective function. An iterative algorithm that greedily extracts feature vectors one by one is carefully designed, and the monotonicity of the proposed iterative procedure is theoretically guaranteed. Experiments on image classification and reconstruction on several benchmark sets show the effectiveness of the proposed approach.
Brain arteriovenous malformation (BAVM) is frequently described as a vascular malformation. Although computed tomography (CT), magnetic resonance imaging (MRI), and angiography can clearly detect lesions, no diagnostic biological markers of BAVM are available. Recent work has demonstrated that microRNAs (miRNAs) are feasible markers for vascular disease. To find key correlations between miRNAs and the onset of BAVM, we carried out chip analysis of serum miRNAs, identifying 18 potential markers of BAVM. We then constructed a principal component analysis and logistic regression (PCA-LR) model to analyze the 18 miRNAs collected from 77 patients, and 9 additional independent samples were used to test the resulting model. The results showed that miRNAs hsa-mir-126-3p and hsa-mir-140 are important protective factors, while hsa-mir-338 is a dominant risk factor; all three have stronger correlations with BAVM than the others. We also compared the test results of the PCA-LR model with those of a plain LR model; the comparison revealed that the PCA-LR model is better at predicting the disease.
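A PCA-LR pipeline of the kind described above is straightforward to express with scikit-learn: PCA decorrelates the 18 miRNA features, and a logistic regression is fitted on the leading components. The data below are a synthetic stand-in (not the paper's cohort), and the choice of 6 components is an assumption for illustration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Hypothetical stand-in data: 77 patients, 18 serum miRNA expression
# levels, binary BAVM status.
y = rng.integers(0, 2, size=77)
X = rng.normal(size=(77, 18))
X[:, :3] += 2.0 * y[:, None]          # three informative "miRNAs"

# PCA-LR: standardize, project onto leading principal components,
# then fit a logistic regression on the component scores.
model = make_pipeline(StandardScaler(), PCA(n_components=6),
                      LogisticRegression(max_iter=1000))
model.fit(X, y)
train_acc = model.score(X, y)
```

In practice the model would be fitted on the 77 training patients and evaluated on the 9 held-out samples, as the abstract describes.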
NonLocal Means (NLM), which takes full advantage of image redundancy, has proved very effective for noise removal; however, its high computational load limits wide application. Based on Principal Component Analysis (PCA), the Principal Neighborhood Dictionary (PND) method was proposed to reduce the computational load of NLM. Nevertheless, because the principal components in the PND method are computed directly from noisy image neighborhoods, they are prone to inaccuracy in the presence of noise. In this paper, an improved scheme for image denoising is proposed. The scheme is based on PND and uses preprocessing via a Gaussian filter to reduce the influence of noise; PCA is then used to project the filtered image neighborhood vectors onto a lower-dimensional space. With this preprocessing, the computed principal components are more accurate, resulting in improved denoising performance. A comparison with NLM-based and state-of-the-art denoising methods shows that the proposed method performs well in terms of Peak Signal to Noise Ratio (PSNR) as well as image visual fidelity, outperforming existing methods both subjectively and objectively.
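The preprocessing-plus-projection step can be sketched as follows: smooth the noisy image with a Gaussian filter, gather patch neighborhoods, and project them onto a few principal components to obtain low-dimensional patch coordinates for NLM similarity search. The image, patch size (7x7), filter sigma, and number of components are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + 0.3 * rng.normal(size=clean.shape)

# Preprocess with a Gaussian filter so the patch basis is learned from a
# less noisy image (sigma = 1.0 is an assumed parameter).
smoothed = gaussian_filter(noisy, sigma=1.0)

# Collect 7x7 neighborhoods and project them onto the leading principal
# components (the "principal neighborhood dictionary").
p = 7
patches = np.lib.stride_tricks.sliding_window_view(smoothed, (p, p))
patches = patches.reshape(-1, p * p)
patches = patches - patches.mean(axis=0)
_, _, Vt = np.linalg.svd(patches, full_matrices=False)
coords = patches @ Vt[:6].T   # 6-D patch coordinates for NLM weighting
```

The NLM weights would then be computed from distances between these 6-D coordinates instead of full 49-D patches, which is where the computational saving comes from.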
Chinese indigenous cattle are traditionally geographically widespread. The present study analyzed genome-wide variants to evaluate the genetic background of 157 individuals from four representative indigenous cattle breeds of Hubei Province of China, Yiling yellow cattle (YL), Bashan cattle (BS), Wuling cattle (WL), and Zaobei cattle (ZB), together with 21 individuals of Qinchuan cattle (QC) from the nearby Shanxi Province of China. Linkage disequilibrium (LD) analysis showed that the LD of YL was the lowest (≈0.32) when the distance between markers was approximately 2 kb. Principal component analysis (PCA) and a neighbor-joining (NJ) tree revealed a separation of Yiling yellow cattle from geographically nearby local cattle breeds. In the PCA plot, the YL and QC groups were segregated as expected; moreover, YL individuals clustered together more distinctly. In the NJ tree, the YL group formed an independent branch while the BS, WL, and ZB groups were mixed. We then used the FST statistic to reveal long-term selective sweeps in YL and the other four cattle breeds. From the selective sweeps, we identified pathways unique to YL that are associated with production traits. Based on these results, we propose that YL has unique genetic characteristics as an excellent resource and is an indispensable cattle breed in China.
Optical coherence tomography (OCT) provides the significant advantages of high-resolution (approaching the histopathology level), real-time imaging of tissues without the use of contrast agents. Based on these advantages, the microstructural features of tumors can be visualized and detected intra-operatively. However, OCT is still not clinically accepted for tumor margin delineation due to poor specificity and accuracy. In contrast, Raman spectroscopy (RS) can obtain tissue information at the molecular level, but does not provide real-time imaging capability. Combining OCT and RS could therefore provide synergy. To this end, we present a tissue analysis and classification method using both the slope of the OCT intensity signal versus depth and the principal components of the RS spectrum as indicators for tissue characterization. The goal of this study was to understand the prediction accuracy of OCT and of the combined OCT/RS method for classification of optically similar tissues and organs. Our pilot experiments were performed on mouse kidneys, livers, and small intestines (SIs). The prediction accuracy of the method was evaluated with five-fold cross validation using the support vector machine (SVM) method. The results demonstrate that tissue characterization based on the combined OCT/RS method was superior to using OCT structural information alone. The combined OCT/RS method is potentially useful as a noninvasive optical biopsy technique for rapid and automatic tissue characterization during surgery.
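The fused-feature evaluation described above, an OCT depth-slope feature concatenated with leading Raman principal components and scored by an SVM under five-fold cross-validation, can be sketched as below. The data are entirely synthetic stand-ins, and the feature scales, component count, and SVM settings are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
# Synthetic stand-in for 3 tissue types x 40 samples: one OCT
# depth-slope value plus a 50-channel Raman spectrum per sample.
y = np.repeat([0, 1, 2], 40)
slope = (0.5 * y + 0.1 * rng.normal(size=120)).reshape(-1, 1)
raman = rng.normal(size=(120, 50)) + 0.2 * y[:, None]

# Fuse the OCT slope with the leading Raman principal components and
# score an RBF SVM with five-fold cross-validation.
rs_pcs = PCA(n_components=5).fit_transform(raman)
X = np.hstack([slope, rs_pcs])
scores = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5)
mean_acc = scores.mean()
```

Dropping the `slope` column from `X` gives the Raman-only baseline, mirroring the paper's comparison between OCT alone and the combined OCT/RS features.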
In this study, multivariate analysis methods, including principal component analysis (PCA) and partial least squares (PLS) analysis, were applied to reveal the inner relationships of the key variables in the process of H_(2)O_(2)-assisted Na_(2)CO_(3) (HSC) pretreatment of corn stover. A total of 120 pretreatment experiments were performed at the lab scale under different conditions by varying the particle size of the corn stover and the process variables. The results showed that the Na_(2)CO_(3) dosage and pretreatment temperature had a strong influence on lignin removal, whereas PFI (pulp refining instrument) refining and Na_(2)CO_(3) dosage played positive roles in the final total sugar yield. Furthermore, the pretreatment conditions had a more significant impact on the effectiveness of pretreatment than the properties of the raw corn stover. In addition, a prediction of the effectiveness of HSC pretreatment of corn stover based on a PLS analysis was conducted for the first time, and tests of its predictability on additional pretreatment experiments showed that the developed PLS model achieved good predictive performance (particularly for the final total sugar yield), indicating that it can be used to predict the effectiveness of HSC pretreatment. Multivariate analysis can therefore potentially be used to monitor and control the pretreatment process in future large-scale biorefinery applications.
Background: Stem hardness is one of the major factors influencing plant architecture in upland cotton (Gossypium hirsutum L.). Evaluating hardness phenotypic traits is very important for the selection of elite lines with resistance to lodging in G. hirsutum. Cotton breeders are interested in using diverse genotypes to enhance fiber quality and yield, yet little research exists on stem hardness and its relationship with fiber quality and yield. This study was designed to determine the relationship of stem hardness traits with fiber quality and yield-contributing traits of upland cotton. Results: Experiments were carried out to measure the bending, acupuncture, and compression properties of the stem in a collection of 237 upland cotton accessions. The results showed that the genotypic difference in stem hardness was highly significant among the genotypes, and the stem hardness traits (BL, BU, AL, AU, CL, and CU) have a positive association with fiber quality traits and yield-related traits. Descriptive statistics showed that bending (BL, BU) had the maximum coefficient of variation, whereas fiber length and fiber strength had smaller coefficients of variation among the genotypes. Principal component analysis (PCA) reduced the quantitative characters to nine principal components (PCs); the first nine PCs, with eigenvalues > 1, explained 86% of the variation among the 237 cotton accessions. In both 2017 and 2018, the PCA results indicated that BL, BU, FL, FE, and LI contributed to the variability in PC1, while BU, AU, CU, FD, LP, and FWPB contributed to the variability in PC2. Conclusion: We describe a systematic study of the mechanisms involved in enhancing fiber quality and yield through the stem bending strength, acupuncture, and compression properties of G. hirsutum.
Serial Analysis of Gene Expression (SAGE) is a powerful tool for analyzing whole-genome expression profiles. SAGE data, characterized by large quantity and high dimensionality, require dimension reduction and feature extraction to improve accuracy and efficiency when used for pattern recognition and clustering analysis. A Poisson Model-based Kernel (PMK) is proposed based on the Poisson distribution of SAGE data. Kernel Principal Component Analysis (KPCA) with the PMK is then proposed and used for feature-extraction analysis of mouse retinal SAGE data. The computational results show that the algorithm can extract features effectively and reduce the dimensionality of SAGE data.
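Kernel PCA with a count-aware kernel can be sketched as follows. The kernel here is not the paper's exact PMK but one plausible Poisson-motivated choice: an RBF kernel applied after the square-root transform, which approximately stabilizes the variance of Poisson counts. The data dimensions and bandwidth are likewise assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(4)
# Stand-in SAGE-like data: 50 libraries x 200 tag counts (Poisson).
X = rng.poisson(5.0, size=(50, 200))

# Poisson-motivated kernel (an assumption, not the paper's exact PMK):
# RBF distance computed on square-root-transformed counts.
Z = np.sqrt(X)
sq_dists = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / Z.shape[1])

# Kernel PCA on the precomputed kernel matrix: the 50 libraries are
# embedded into a 3-dimensional feature space.
emb = KernelPCA(n_components=3, kernel="precomputed").fit_transform(K)
```

Because the kernel is built on a variance-stabilized distance, highly expressed tags do not dominate the embedding the way they would under plain linear PCA on raw counts.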
Today, securing devices connected to the internet is challenging, as security threats are generated through various sources, and the protection of cyber-physical systems from external attacks is a primary task. The presented method is designed with the prime motive of detecting cybersecurity attacks and their impacted parameters. The proposed architecture employs the LYSIS dataset and formulates Multi Variant Exploratory Data Analysis (MEDA) through Principal Component Analysis (PCA) and Singular Value Decomposition (SVD) for the extraction of unique parameters. The feature mappings are analyzed with a Recurrent 2 Convolutional Neural Network (R2CNN) and Gradient Boost Regression (GBR) to identify the maximum correlation. Late Fusion Aggregation enabled with Cyber-Net (LFAEC) is the novel derived algorithm. The quantitative analysis uses predicted threat points together with actual threat variables, from which mean and difference vectors are evaluated. The performance of the presented system is assessed with respect to accuracy, precision, recall, and F1 score; the proposed method achieved 98% to 100% in all quality measures, outperforming existing methods.
The goal of this paper is to propose a fast and secure multi-stage image compression-decompression system operating over a wireless network between two personal computers (PCs). Principal Component Analysis (PCA) is used for multi-stage image compression, and Inverse Principal Component Analysis (IPCA) for multi-stage image decompression. The first step of the proposed system is to select the input image; the second step applies PCA up to 9 times to the input image, performing multi-stage compression; in the third step the compressed image is transferred over a wireless ad hoc network (WANET) to the second computing device; and the fourth step performs the multi-stage decompression, again up to 9 times. The proposed system transfers different images over the wireless network using the Transmission Control Protocol/Internet Protocol (TCP/IP), programmed using the network role property of MATLAB. The proposed system processed 25 different images correctly (100%). The main contribution of this paper is that the compression process ends with a black image and the decompression process starts from a black image. In this work, the compressed and uncompressed images are compared with each other in size and transmission time. Such a system can be very useful in networks because it provides a high level of protection for the transmitted data: hackers cannot guess how much the image has been compressed or what kind of information the image represents.
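One PCA compression/decompression stage of the kind chained above can be sketched as follows (in Python rather than the paper's MATLAB). Keeping 32 of 128 components, the 128x128 image size, and the random test image are illustrative assumptions; the paper chains up to nine such stages.

```python
import numpy as np

rng = np.random.default_rng(5)
image = rng.uniform(size=(128, 128))

def pca_stage(img, k):
    """Compress the rows of img to k principal-component scores.
    Returns the scores plus the basis and mean needed to invert."""
    mean = img.mean(axis=0)
    centered = img - mean
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:k]
    return centered @ basis.T, basis, mean

def ipca_stage(scores, basis, mean):
    """IPCA: reconstruct the image from scores, basis, and mean."""
    return scores @ basis + mean

# One compression/decompression round trip keeping 32 of 128 components.
scores, basis, mean = pca_stage(image, 32)
restored = ipca_stage(scores, basis, mean)
err = np.abs(restored - image).mean()
```

A multi-stage variant would feed `scores` (reshaped) into the next `pca_stage` call and invert the stages in reverse order on the receiving PC.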
Liquid leakage from pipelines is a critical issue in large-scale process plants. Damage in pipelines affects the normal operation of the plant and increases maintenance costs; furthermore, it causes unsafe and hazardous situations for operators. Therefore, the detection and localization of leakages is a crucial task for maintenance and condition monitoring. Recently, the use of infrared (IR) cameras was found to be a promising approach for leakage detection in large-scale plants, as IR cameras can capture leaking liquid if it has a higher (or lower) temperature than its surroundings. In this paper, a method based on IR video data and machine vision techniques is proposed to detect and localize liquid leakages in a chemical process plant. Since the proposed method is vision-based and does not consider the physical properties of the leaking liquid, it is applicable to any type of liquid leakage (e.g., water or oil). In this method, subsequent frames are subtracted and divided into blocks, and principal component analysis is performed on each block to extract features. All subtracted frames within the blocks are individually transformed into feature vectors, which serve as the basis for classifying the blocks. The k-nearest neighbor algorithm is used to classify the blocks as normal (without leakage) or anomalous (with leakage), and finally the positions of the leakages are determined in each anomalous block. To evaluate the approach, two datasets in two different formats, consisting of video footage of a laboratory demonstrator plant captured by an IR camera, are considered. The results show that the proposed method is a promising approach for detecting and localizing leakages from pipelines using IR videos, with high accuracy and a reasonable detection time. The possibility of extending the proposed method to a real industrial plant, and the limitations of the method, are discussed at the end.
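The frame-subtraction, block-PCA, and kNN steps above can be sketched on synthetic frames. The frame size, block size, number of components, and single-frame setup are assumptions; the paper processes whole IR video sequences.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(6)

def block_features(prev, curr, block=8, n_comp=4):
    """Subtract subsequent frames, split the difference image into
    blocks, and reduce each block to a few PCA coordinates."""
    diff = curr - prev
    h, w = diff.shape
    blocks = diff.reshape(h // block, block, w // block, block)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(-1, block * block)
    centered = blocks - blocks.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_comp].T

# Synthetic IR frames: a warm "leak" appears in the top-left block.
prev = rng.normal(0.0, 0.01, size=(32, 32))
curr = prev.copy()
curr[:8, :8] += 1.0
feats = block_features(prev, curr)        # 16 blocks -> (16, 4) features

# Classify blocks with kNN; label 1 marks the known leak block.
labels = np.zeros(len(feats), dtype=int)
labels[0] = 1
knn = KNeighborsClassifier(n_neighbors=1).fit(feats, labels)
pred = knn.predict(feats)
```

Mapping an anomalous block index back to its (row, column) position in the frame is what localizes the leak.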
The Asian monsoon exhibits multiple forms of variation, including seasonal, intra-seasonal, and interannual variation. The interannual variations comprise not only year-to-year variations but also variations spanning several years. In general, the year-to-year variations are described by winter temperature and summer precipitation, while the multi-year variations are reflected in the circulation of ENSO events. In this study, we first analyze the relationship between land cover and interannual monsoon variations, represented by precipitation changes, using the Singular Value Decomposition method based on time-series precipitation data and 8 km NOAA AVHRR NDVI data covering 1982 to 1993 in east Asia. Furthermore, after confirming and reclassifying ENSO events, which are recognized as the strong signal of multi-year monsoon variation, we performed a Principal Component Analysis on the same 1982-1993 NDVI time series for east Asia, analyzed the correlation between the seventh component eigenvectors and the Southern Oscillation Index (SOI), which characterizes ENSO events, and summarized the temporal-spatial distribution features of the interannual variations of east Asian land cover driven by changes in ENSO events.
Effects of the heavy metal copper (Cu), the metalloid arsenic (As), and the antibiotic oxytetracycline (OTC) on bacterial community structure and diversity during cow and pig manure composting were investigated. Eight treatments were applied, four to each manure type, namely cow manure with: (1) no additives (control), (2) addition of heavy metal and metalloid, (3) addition of OTC, and (4) addition of OTC with heavy metal and metalloid; and pig manure with: (5) no additives (control), (6) addition of heavy metal and metalloid, (7) addition of OTC, and (8) addition of OTC with heavy metal and metalloid. After 35 days of composting, according to the alpha diversity indices, the combination treatment (OTC with heavy metal and metalloid) in pig manure was less harmful to microbial diversity than the control or heavy metal and metalloid treatments. In cow manure, the treatment with heavy metal and metalloid was the most harmful to the microbial community, followed by the combination and OTC treatments; the OTC and combination treatments had negative effects on the relative abundance of microbes in cow manure composts. The dominant phyla in both manure composts included Actinobacteria, Bacteroidetes, Firmicutes, and Proteobacteria, and the changes in microbial relative abundance depended on the composting time. Redundancy analysis (RDA) revealed that environmental parameters had the greatest influence on the bacterial communities. In conclusion, composting is a sustainable technology for reducing heavy metal and metalloid impacts and antibiotic contamination in cow and pig manure, and the variation of physicochemical properties in the manures had a significant effect on the microbial community during the composting process.
This study provides an improved understanding of bacterial community composition and its changes during the composting process.
Lumber moisture content (LMC) is an important parameter for judging the dryness of lumber and the quality of wooden products. Nevertheless, the data acquired are largely redundant and incomplete because of the complexity of the drying process, interference factors in the drying environment, and the physical characteristics of the lumber itself. To improve the accuracy and reliability of LMC measurement, an optimized support vector machine (SVM) algorithm is put forward for regression analysis of LMC. Environmental factors such as air temperature and relative humidity were considered, and their data were extracted with the principal component analysis method. The regression and prediction of the SVM were optimized using the grid search (GS) technique. Groups of data were sampled and analyzed, and a simulation comparison of forecasting performance shows that extracting the principal component data speeds up the convergence rate of the optimization algorithm. GS-SVM shows better performance in solving the LMC measurement and forecasting problem.
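The GS-SVM idea, a grid search over SVM hyperparameters with cross-validation before regression, can be sketched with scikit-learn. The drying data below are an assumed synthetic stand-in (two environmental inputs, one moisture-content target), as is the particular (C, gamma) grid.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(7)
# Stand-in drying data: [air temperature, relative humidity] (scaled
# to [0, 1]) -> lumber moisture content (%).
X = rng.uniform(size=(80, 2))
y = 30.0 - 20.0 * X[:, 0] + 10.0 * X[:, 1] + rng.normal(scale=0.5, size=80)

# The "GS" step: grid search over the SVM hyperparameters (C, gamma)
# with 5-fold cross-validation, then refit on all the data.
grid = GridSearchCV(SVR(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": [0.1, 1.0, 10.0]},
                    cv=5)
grid.fit(X, y)
pred = grid.predict(X)
```

`grid.best_params_` exposes the (C, gamma) pair the search settled on, and `grid.best_estimator_` is the refitted SVR used for forecasting.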
A data-driven temporal filtering technique is integrated into the time trajectory of the Teager energy operator (TEO) based feature parameter to improve the robustness of a speech recognition system against noise. Three kinds of data-driven temporal filters are investigated with the motivation of alleviating the harmful effects of environmental factors on speech: principal component analysis (PCA) based filters, linear discriminant analysis (LDA) based filters, and minimum classification error (MCE) based filters. A detailed comparative analysis of these temporal filtering approaches applied in the Teager energy domain is presented. It is shown that while all of them improve the recognition performance of the original TEO-based feature parameter in adverse environments, MCE-based temporal filtering provides the lowest error rate of all the algorithms as the SNR decreases.
This study aimed to provide relevant knowledge about the dynamics of the hydrological parameters in the river-estuary continuum of the Wouri-Nkam river estuary for a sustainable management program. The hydrological parameters were recorded at eleven stations distributed on the basis of population density and human activities. Water quality parameters (temperature, salinity, dissolved oxygen, pH, total dissolved solutes, redox potential, and conductivity) were collected in subsurface water using a multi-parameter probe. Surface currents and morphometric parameters (depth and width) were recorded using a drifter, sonar depth sounder, and GPS. The field measurements took place between 18/05/2019 and 08/09/2020 and were divided into six (06) cruises. The data were then subjected to an analysis of variance (ANOVA) and Principal Component Analysis using XLSTAT 2017 (version 2.7) software. The results revealed that the water quality parameters were spatially stable, with differences not significant (df = 9, p < 0.05), and that temperature was relatively low (25.5°C - 27°C) during the wet period.
The limit of the frontal zone extended to S5 (Bonalokan, 8.25 km from S1) during the snapshot of the dry period, spring phase, and flood tide conditions. Inversely, during the wet period, this extension reduced to S1 (Bridge) and increased slightly to S3 (Bonangang) during the neap phase and ebb tides of the season. This result revealed a change in the axial gradient of about eight (08) and four (04) kilometers at the seasonal and tidal scales (lunar and diurnal periods), respectively. These changes were also accompanied by changes in the water quality signatures, which may affect fishery distribution and composition. However, future studies to buttress the results of this investigation should focus on longer time-series sampling and model development.
Parkinson's disease (PD) is a neurodegenerative disease of the central nervous system. Recently, more research has been conducted on PD prediction, which is a genuinely challenging task. Due to the disorders in the central nervous system, syndromes such as disturbed sleep, speech disorders, olfactory and autonomic dysfunction, and sensory disorder symptoms occur. Early diagnosis of PD is very challenging for the medical community, but techniques are available to predict PD using symptom and disorder measurements, and early prediction can help save many lives. In this article, early diagnosis of PD using machine learning techniques with feature selection is carried out. In the first stage, data preprocessing is used to prepare the Parkinson's disease data. In the second stage, MFEA is used to extract features. In the third stage, feature selection is performed on the multiple-feature input with a principal component analysis (PCA) algorithm. Finally, a Darknet Convolutional Neural Network (DNetCNN) is used to classify the PD patients. The main advantage of using PCA-DNetCNN is that it provides the best classification on the image dataset using YOLO. In addition, the results of various existing methods are compared, and the proposed DNetCNN shows better accuracy and performance in detecting PD at the initial stages, achieving 97.5% accuracy in early detection. The other performance metrics are also compared in the evaluation, and the proposed model is shown to outperform all the other existing models.
Rice stem borer (Chilo agamemnon Bles.) is a primary insect pest of rice and a major limiting factor for rice production. Breeding insect-resistant crop varieties has been an economical component of integrated pest management (IPM), as it offers a viable and ecologically acceptable approach. This study aimed to evaluate rice genotypes for their resistance to rice stem borer. Seven parental genotypes and twenty-one F1 crosses were evaluated for genotypic variation in field experiments. Analysis of variance revealed significant differences in the studied traits in almost all crosses and parents. In addition, the mean squares of parents versus their crosses were significant for stem borer resistance and other associated traits. Moreover, both general combining ability (GCA) and specific combining ability (SCA) variances were highly significant for all characters studied in the F1 generation. Based on GCA, 4 genotypes (Sakha101, Gz6903-3-4-2-1, Gz9577-4-1-1, and Hassawi) exhibited highly significant negative values for stem borer resistance (–0.53, –1.06, –0.18, and –0.49, respectively), indicating they are the best combiners for stem borer resistance. Based on the SCA analysis, nine cross combinations showed highly significant negative effects for stem borer resistance. Similarly, the cross Giza178 × Hassawi was the best combination, with the significantly highest value for early maturity. In addition, seven crosses showed highly significant negative SCA for the plant height trait. On the other hand, for panicle length, number of primary branches per panicle, panicle weight, and 1000-grain weight, seven, four, eight, and six crosses, respectively, showed highly significant positive SCA. The results further revealed that the non-additive (dominance) genetic variance was higher than the additive variance for all evaluated traits, indicating that non-additive genetic variances play a role in their inheritance. The broad-sense heritability estimates were high for all the studied traits. Stem borer resistance was significantly correlated with panicle weight and 1000-grain weight, which also showed a highly significant correlation with grain yield per plant. Thus, these traits can be effectively employed in a breeding program to confer resistance against stem borer infestation in rice. This was further supported by biplot analysis, which clustered these potentially important traits into two quadrants, showing their importance in any future breeding program to control stem borer infestation. This study contributes valuable information for the evaluation of genetic diversity in the local rice germplasm and its utilization in future rice genetic improvement programs.
Abstract: Online demand prediction plays an important role in transport network services, from operations and control to management and information provision. However, online prediction models are affected by streaming data quality issues such as noisy measurements and missing data. To address these, we develop a robust method for online network-level demand prediction in public transport. It consists of a PCA method to extract eigen demand images and an optimization-based pattern recognition model that predicts the weights of the eigen demand images using the partially observed real-time data available up to the prediction time within a day. The prediction model is robust to data quality issues because the eigen demand images are stable and their weights are optimized using network-level data, which is less affected by local data quality issues. In the case study, we validate the accuracy and transferability of the model against benchmark models and evaluate its robustness to data quality issues. The experimental results demonstrate that the proposed Pattern Recognition Prediction based on PCA (PRP-PCA) consistently outperforms the benchmark models in accuracy and transferability. Moreover, the model is highly robust to data quality issues; for example, PRP-PCA tolerates up to 50% missing data regardless of the noise level. We also discuss the hidden patterns behind network-level demand. The visualization analysis shows that the eigen demand images are strongly connected to the network structure and to variability in station activity. Although demand changed dramatically before and after the pandemic, the eigen demand images remained consistent over time in Stockholm.
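The weight-fitting idea in this abstract can be sketched in a few lines of NumPy: extract eigen demand images from historical daily profiles, then estimate their weights by least squares on the part of the day observed so far. Everything below (dimensions, the sinusoidal demand shape, the number of retained components) is an illustrative assumption, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 48                     # time slots per day (assumed)
days = 200                 # historical days (assumed)
base = np.sin(np.linspace(0, np.pi, T))          # one-peak daily profile
scale = 0.5 + rng.random(days)                   # day-to-day demand level
hist = scale[:, None] * base + 0.1 * rng.standard_normal((days, T))

mean = hist.mean(axis=0)
# "Eigen demand images" = top right singular vectors of the centered data.
_, _, Vt = np.linalg.svd(hist - mean, full_matrices=False)
eigen_images = Vt[:5]                            # keep 5 components

# New day: only the first `obs` slots are observed at prediction time.
new_day = 1.3 * base + 0.1 * rng.standard_normal(T)
obs = 24
A = eigen_images[:, :obs].T                      # (obs, 5)
b = new_day[:obs] - mean[:obs]
w, *_ = np.linalg.lstsq(A, b, rcond=None)        # optimized weights

pred = mean + w @ eigen_images                   # full-day prediction
rmse = float(np.sqrt(np.mean((pred[obs:] - new_day[obs:]) ** 2)))
print(round(rmse, 3))
```

Because the weights are fitted across all observed slots at once, a few noisy or missing entries in `new_day[:obs]` perturb the least-squares solution only mildly, which is the robustness mechanism the abstract describes.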
Funding: the National Natural Science Foundation of China (No. 61572033); the Natural Science Foundation of the Education Department of Anhui Province of China (No. KJ2015ZD08); the Higher Education Promotion Plan of Anhui Province of China (No. TSKJ2015B14)
Abstract: Block principal component analysis (BPCA) is a recently developed technique in computer vision and pattern classification. In this paper, we propose a robust and sparse BPCA with Lp-norm, referred to as BPCALp-S, which inherits the robustness of BPCA-L1 through the use of an adjustable Lp-norm. To obtain a sparse model, the elastic net is integrated into the objective function. An iterative algorithm that greedily extracts feature vectors one by one is carefully designed, and the monotonicity of the iterative procedure is theoretically guaranteed. Experiments on image classification and reconstruction on several benchmark sets show the effectiveness of the proposed approach.
Abstract: Brain arteriovenous malformation (BAVM) is frequently described as a vascular malformation. Although computed tomography (CT), magnetic resonance imaging (MRI), and angiography can clearly detect lesions, no diagnostic biological markers of BAVM are available. Recent work has shown that microRNAs (miRNAs) are feasible markers for vascular disease. To find key correlations between miRNAs and the onset of BAVM, we carried out chip analysis of serum miRNAs and identified 18 potential markers of BAVM. We then constructed a principal component analysis and logistic regression (PCA-LR) model to analyze the 18 miRNAs collected from 77 patients; another 9 independent samples were used to test the resulting model. The results show that miRNAs hsa-mir-126-3p and hsa-mir-140 are important protective factors, while hsa-mir-338 is the dominant risk factor; all three correlate with BAVM more strongly than the other markers. We also compared the testing results of the PCA-LR model with those of a plain LR model, and the comparison revealed that the PCA-LR model predicts the disease better.
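A PCA-LR pipeline of the kind described can be sketched as follows: project high-dimensional features onto a few principal components, then fit a logistic regression on the scores. The data below are synthetic stand-ins for the 18 serum miRNA markers from 77 patients; the gradient-descent fit and all dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 77, 18, 2
lat = rng.standard_normal((n, 2))                 # hidden 2-D structure
load = rng.standard_normal((2, p))
X = lat @ load + 0.3 * rng.standard_normal((n, p))
y = (lat[:, 0] > 0).astype(float)                 # synthetic case/control

# PCA step: center and keep the top-k principal-component scores.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T

# Logistic-regression step, fitted by plain gradient descent.
w, b = np.zeros(k), 0.0
for _ in range(3000):
    p_hat = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
    grad = p_hat - y
    w -= 0.1 * Z.T @ grad / n
    b -= 0.1 * grad.mean()

acc = float(((p_hat > 0.5) == y).mean())          # in-sample accuracy
print(acc)
```

Fitting the regression on a few PC scores instead of all 18 correlated markers is what keeps the model stable with only 77 samples, which is the advantage the abstract attributes to PCA-LR over plain LR.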
Funding: Supported by the National Natural Science Foundation of China (Nos. 60776795, 60736043, 60902031, and 60805012); the Research Fund for the Doctoral Program of Higher Education of China (Nos. 200807010004 and 20070701023); the Fundamental Research Funds for the Central Universities of China (No. JY10000902028)
Abstract: NonLocal Means (NLM), which takes full advantage of image redundancy, has proved very effective for noise removal; however, its high computational load limits wide application. Based on principal component analysis (PCA), the Principal Neighborhood Dictionary (PND) method was proposed to reduce the computational load of NLM. Nevertheless, because the principal components in the PND method are computed directly from noisy image neighborhoods, they tend to be inaccurate in the presence of noise. In this paper, an improved scheme for image denoising is proposed. The scheme builds on PND and uses Gaussian-filter preprocessing to suppress the influence of noise; PCA is then used to project the filtered image neighborhood vectors onto a lower-dimensional space. With this preprocessing, the computed principal components are more accurate, resulting in improved denoising performance. A comparison with several NLM-based and state-of-the-art denoising methods shows that the proposed method performs well in terms of peak signal-to-noise ratio (PSNR) as well as visual fidelity. The experimental results demonstrate that our method outperforms existing methods both subjectively and objectively.
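The core idea here, computing the PCA basis from *prefiltered* neighborhoods so the principal components are less corrupted by noise, can be illustrated on a 1-D signal. The Gaussian kernel, patch width, and number of retained components below are assumptions for the sketch; the paper works on 2-D image neighborhoods.

```python
import numpy as np

rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 8 * np.pi, 512))
noisy = signal + 0.3 * rng.standard_normal(512)

# Simple Gaussian prefilter (discrete 5-tap binomial kernel).
k = np.array([1, 4, 6, 4, 1], float); k /= k.sum()
pre = np.convolve(noisy, k, mode="same")

def patches(x, w=7):
    # Sliding neighborhoods as rows of a matrix.
    return np.lib.stride_tricks.sliding_window_view(x, w)

P_noisy, P_pre = patches(noisy), patches(pre)
mean = P_pre.mean(axis=0)
# PCA basis learned from the *filtered* patches, not the noisy ones.
_, _, Vt = np.linalg.svd(P_pre - mean, full_matrices=False)
basis = Vt[:3]                                 # lower-dimensional subspace

# Project the noisy patches onto that basis and take the central pixel.
proj = (P_noisy - mean) @ basis.T @ basis + mean
denoised = proj[:, 3]

err_noisy = np.mean((noisy[3:-3] - signal[3:-3]) ** 2)
err_den = np.mean((denoised - signal[3:-3]) ** 2)
print(err_den < err_noisy)
```

The projection discards the patch-space directions that carry mostly noise; learning those directions from prefiltered patches is exactly the accuracy improvement the abstract claims over plain PND.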
Funding: funded in part by the National Natural Science Foundation of China (31402039, 31472079, 31372294); the Beijing Natural Science Foundation (6154032); the Species and Breed Resources Conservation of the Ministry of Agriculture of China (2017-2019); the Cattle Breeding Innovative Research Team of the Chinese Academy of Agricultural Sciences (cxgc-ias-03); the National Beef Cattle Industrial Technology System (CARS-37)
Abstract: Chinese indigenous cattle are traditionally geographically widespread. The present study used genome-wide variants to evaluate the genetic background of 157 individuals from four representative indigenous cattle breeds of Hubei Province of China, Yiling yellow cattle (YL), Bashan cattle (BS), Wuling cattle (WL), and Zaobei cattle (ZB), together with 21 individuals of Qinchuan cattle (QC) from nearby Shanxi Province of China. Linkage disequilibrium (LD) analysis showed that the LD of YL was the lowest (r²=0.32) when the distance between markers was approximately 2 kb. Principal component analysis (PCA) and a neighbor-joining (NJ) tree revealed a separation of Yiling yellow cattle from the other geographically nearby local cattle breeds. In the PCA plot, the YL and QC groups were segregated as expected; moreover, YL individuals clustered together more tightly. In the NJ tree, the YL group formed an independent branch while the BS, WL, and ZB groups were mixed. We then used the FST statistic to reveal long-term selective sweeps in YL and the other four cattle breeds. From the selective sweeps, we identified pathways unique to YL that are associated with production traits. Based on these results, we propose that YL has unique genetic characteristics as an excellent resource and is an indispensable cattle breed in China.
Funding: supported in part by grants to Kirill Larin from NIH (1R01EY022362, 1R01HL120140, U54HG006348) and DOD PRJ71T; supported by grants to Wei-Chuan Shih from the NSF CAREER Award (CBET-1151154); NASA Early Career Faculty Grant (NNX12AQ44G); Gulf of Mexico Research Initiative (GoMRI-030).
Abstract: Optical coherence tomography (OCT) provides the significant advantages of high-resolution (approaching the histopathology level) real-time imaging of tissues without the use of contrast agents. Based on these advantages, the microstructural features of tumors can be visualized and detected intraoperatively. However, OCT is still not clinically accepted for tumor margin delineation due to poor specificity and accuracy. In contrast, Raman spectroscopy (RS) can obtain tissue information at the molecular level, but does not provide real-time imaging capability. Therefore, combining OCT and RS could provide synergy. To this end, we present a tissue analysis and classification method that uses both the slope of the OCT intensity signal versus depth and the principal components of the RS spectrum as indicators for tissue characterization. The goal of this study was to understand the prediction accuracy of OCT alone and of the combined OCT/RS method for classification of optically similar tissues and organs. Our pilot experiments were performed on mouse kidneys, livers, and small intestines (SIs). The prediction accuracy of the method was evaluated with five-fold cross-validation using the support vector machine (SVM) method. The results demonstrate that tissue characterization based on the combined OCT/RS method was superior to using OCT structural information alone. The combined OCT/RS method is potentially useful as a noninvasive optical biopsy technique for rapid and automatic tissue characterization during surgery.
Funding: This work was financially supported by the National Natural Science Foundation of China (No. 31870568); the Shandong Provincial Natural Science Foundation for Distinguished Young Scholars (China) (No. ZR2019JQ10); the Major Program of the Shandong Province Natural Science Foundation (No. ZR2018ZB0208); the "Transformational Technologies for Clean Energy and Demonstration" Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDA21060201).
Abstract: In this study, multivariate analysis methods, including principal component analysis (PCA) and partial least squares (PLS) analysis, were applied to reveal the inner relationships among the key variables in H2O2-assisted Na2CO3 (HSC) pretreatment of corn stover. A total of 120 pretreatment experiments were carried out at the lab scale under different conditions by varying the particle size of the corn stover and the process variables. The results showed that the Na2CO3 dosage and pretreatment temperature had a strong influence on lignin removal, whereas PFI (pulp refining instrument) refining and Na2CO3 dosage played positive roles in the final total sugar yield. Furthermore, the pretreatment conditions had a more significant impact on pretreatment effectiveness than the properties of the raw corn stover. In addition, a prediction of the effectiveness of corn stover HSC pretreatment based on a PLS analysis was conducted for the first time, and tests of its predictability on additional pretreatment experiments proved that the developed PLS model achieved good predictive performance (particularly for the final total sugar yield), indicating that it can be used to predict the effectiveness of HSC pretreatment. Therefore, multivariate analysis can potentially be used to monitor and control the pretreatment process in future large-scale biorefinery applications.
Funding: National Key Technology R&D Program, Ministry of Science and Technology (2016YFD0100306, 2016YFD0100203); National Natural Science Foundation of China (Grant 31671746).
Abstract: Background: Stem hardness is one of the major factors influencing plant architecture in upland cotton (Gossypium hirsutum L.). Evaluating hardness phenotypic traits is very important for selecting elite lines for resistance to lodging in G. hirsutum. Cotton breeders are interested in using diverse genotypes to enhance fiber quality and yield, yet little research has examined stem hardness and its relationship with fiber quality and yield. This study was designed to determine the relationship of stem hardness traits with fiber quality and yield-contributing traits of upland cotton. Results: Experiments were carried out to measure the bending, acupuncture, and compression properties of the stem in a collection of 237 upland cotton accessions. The results showed that the genotypic difference in stem hardness was highly significant among the genotypes, and the stem hardness traits (BL, BU, AL, AU, CL, and CU) were positively associated with fiber quality and yield-related traits. Descriptive statistics showed that bending (BL, BU) had the largest coefficient of variance, whereas fiber length and fiber strength had smaller coefficients of variance among the genotypes. Principal component analysis (PCA) reduced the quantitative characters to nine principal components (PCs) with eigenvalues > 1 that together explained 86% of the variation among the 237 accessions. In both 2017 and 2018, the PCA results indicated that BL, BU, FL, FE, and LI contributed their variability to PC1, while BU, AU, CU, FD, LP, and FWPB showed their variability in PC2. Conclusion: We describe here a systematic study of how the stem bending, acupuncture, and compression properties of G. hirsutum relate to enhanced fiber quality and yield.
Funding: Supported by the National Natural Science Foundation of China (No. 50877004)
Abstract: Serial Analysis of Gene Expression (SAGE) is a powerful tool for analyzing whole-genome expression profiles. SAGE data, characterized by large quantity and high dimensionality, require dimensionality reduction and feature extraction to improve accuracy and efficiency when used for pattern recognition and clustering analysis. A Poisson Model-based Kernel (PMK) is proposed based on the Poisson distribution of SAGE data. Kernel Principal Component Analysis (KPCA) with the PMK was then applied to feature-extraction analysis of mouse retinal SAGE data. The computational results show that this algorithm can extract features effectively and reduce the dimensionality of SAGE data.
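A generic kernel-PCA skeleton of the kind this work builds on looks like the sketch below: form a kernel matrix, double-center it, and take the leading eigenvectors as projection coefficients. The paper's Poisson Model-based Kernel is replaced here by a standard RBF kernel purely for illustration; the PMK itself is not reproduced, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((60, 10))      # stand-in for SAGE count features

def rbf_kernel(A, B, gamma=0.1):
    # Placeholder kernel; the paper would use its Poisson Model-based
    # Kernel here instead.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)
n = K.shape[0]
# Double-center the kernel matrix (centering in feature space).
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

k = 5
# Scale eigenvectors so each projected coordinate has unit alpha-norm.
alphas = vecs[:, :k] / np.sqrt(np.maximum(vals[:k], 1e-12))
features = Kc @ alphas                 # k-dimensional KPCA features
print(features.shape)
```

Swapping `rbf_kernel` for a kernel derived from the Poisson likelihood of the counts is the only change the PMK variant would need; the centering and eigendecomposition steps are kernel-agnostic.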
Abstract: Today, securing devices connected to the internet is challenging, as security threats are generated through various sources. The protection of cyber-physical systems from external attacks is a primary task. The presented method is built around the prime motive of detecting cybersecurity attacks and the parameters they impact. The proposed architecture employs the LYSIS dataset and formulates Multi-Variant Exploratory Data Analysis (MEDA) through principal component analysis (PCA) and singular value decomposition (SVD) for the extraction of unique parameters. The feature mappings are analyzed with a Recurrent 2 Convolutional Neural Network (R2CNN) and Gradient Boost Regression (GBR) to identify the maximum correlation. The robust derived algorithm is Late Fusion Aggregation enabled with Cyber-Net (LFAEC). The quantitative analysis compares predicted threat points with actual threat variables, from which mean and difference vectors are evaluated. The performance of the presented system is assessed in terms of accuracy, precision, recall, and F1 score. The proposed method achieved 98% to 100% in all quality measures, outperforming existing methods.
Abstract: The goal of this paper is to propose a fast and secure multi-stage image compression-decompression system using a wireless network between two personal computers (PCs). Principal component analysis (PCA) is used for multi-stage image compression and inverse principal component analysis (IPCA) for multi-stage image decompression. The first step of the proposed system is to select the input image; the second step performs PCA on the input image up to 9 times; in the third step, the compressed result is transferred across a wireless ad hoc network (WANET) to the second computing device; and the fourth step runs the multi-stage decompression process, again up to 9 times. The proposed system transfers different images over the wireless network using the Transmission Control Protocol/Internet Protocol (TCP/IP), programmed using the network role property of MATLAB. The proposed system processed 25 different images correctly (100%). The main contribution of this paper is that the system ends the compression process, and begins the decompression process, with an effectively black image. In this work, the compressed and uncompressed images are compared with each other in size and transmission time. Such a system can be very useful in networks because it provides a high level of protection against hackers, who cannot guess how much an image has been compressed or what kind of information it represents.
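The compression and decompression stages can be sketched with a single PCA stage on a toy grayscale image. The multi-stage repetition, the wireless transfer, and the MATLAB specifics are omitted, and all sizes and the retained-component count below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
# Low-rank synthetic "image" plus a little noise.
img = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
img += 0.01 * rng.standard_normal((64, 64))

def pca_compress(A, k):
    # Keep only k principal-component scores per row, plus the decoder
    # state (basis and mean) needed for reconstruction.
    mean = A.mean(axis=0)
    _, _, Vt = np.linalg.svd(A - mean, full_matrices=False)
    basis = Vt[:k]
    return (A - mean) @ basis.T, basis, mean

def ipca_decompress(scores, basis, mean):
    # Inverse PCA: map the scores back to the original pixel space.
    return scores @ basis + mean

scores, basis, mean = pca_compress(img, k=8)
rec = ipca_decompress(scores, basis, mean)

ratio = img.size / (scores.size + basis.size + mean.size)
err = float(np.abs(rec - img).max())
print(round(ratio, 2), err < 0.2)
```

Repeating `pca_compress` on its own output would give the multi-stage scheme the paper describes; each stage shrinks the payload further at the cost of additional reconstruction error.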
Funding: funded by the German Federal Ministry for Economic Affairs and Energy (BMWi) (01MD15009F).
Abstract: Liquid leakage from pipelines is a critical issue in large-scale process plants. Damage to pipelines affects the normal operation of the plant and increases maintenance costs; furthermore, it creates unsafe and hazardous situations for operators. Therefore, the detection and localization of leakages is a crucial task for maintenance and condition monitoring. Recently, the use of infrared (IR) cameras was found to be a promising approach for leakage detection in large-scale plants, since IR cameras can capture leaking liquid if it has a higher (or lower) temperature than its surroundings. In this paper, a method based on IR video data and machine vision techniques is proposed to detect and localize liquid leakages in a chemical process plant. Since the proposed method is vision-based and does not consider the physical properties of the leaking liquid, it is applicable to any type of liquid leakage (i.e., water, oil, etc.). In this method, subsequent frames are subtracted and divided into blocks. Then, principal component analysis is performed in each block to extract features; all subtracted frames within the blocks are individually transformed into feature vectors, which serve as the basis for classifying the blocks. The k-nearest neighbor algorithm is used to classify the blocks as normal (without leakage) or anomalous (with leakage). Finally, the positions of the leakages are determined in each anomalous block. To evaluate the approach, two datasets in two different formats, consisting of video footage of a laboratory demonstrator plant captured by an IR camera, are considered. The results show that the proposed method is a promising approach to detect and localize pipeline leakages using IR videos, with high accuracy and a reasonable detection time. The possibility of extending the proposed method to a real industrial plant and the limitations of the method are discussed at the end.
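The frame-subtraction-plus-block-PCA front end can be sketched as below. Because the paper's k-NN step needs labeled training blocks, this toy version instead flags the block whose PCA feature vector deviates most from the mean; the frame size, block size, and the injected "leak" are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
H = W = 32; B = 8                        # frame and block size (assumed)
f0 = 0.05 * rng.standard_normal((H, W))
f1 = f0 + 0.05 * rng.standard_normal((H, W))
f1[8:16, 8:16] += 2.0                    # synthetic warm leak in one block
diff = f1 - f0                           # frame subtraction

def block_vectors(d, B):
    # Split the difference frame into B x B blocks, one row vector each.
    return np.array([d[i:i + B, j:j + B].ravel()
                     for i in range(0, d.shape[0], B)
                     for j in range(0, d.shape[1], B)])

F = block_vectors(diff, B)               # (16, 64) block vectors
mean = F.mean(axis=0)
_, _, Vt = np.linalg.svd(F - mean, full_matrices=False)
feats = (F - mean) @ Vt[:3].T            # 3 PCA features per block

# Stand-in for the paper's k-NN classification: flag the block whose
# feature vector is farthest from the mean.
scores = (feats ** 2).sum(axis=1)
leak_block = int(np.argmax(scores))
print(leak_block)
```

With labeled normal/anomalous blocks available, the final two lines would be replaced by a k-NN vote over the labeled feature vectors, as the abstract describes.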
Funding: Knowledge Innovation Project of CAS (No. KZCX2-308); Natural Science Foundation of Inner Mongolia Autonomous Region (No. 20010905-14)
Abstract: The Asian monsoon exhibits multiple forms of variation: seasonal, intra-seasonal, interannual, and so on. Its interannual variations comprise not only year-to-year changes but also variations spanning several years. In general, the year-to-year variations are described by winter temperature and summer precipitation, while the multi-year variations are reflected in the circulation of ENSO events. In this study, we first analyze the relationship between land cover and interannual monsoon variations, represented by precipitation changes, using the singular value decomposition method on time-series precipitation data and 8-km NOAA AVHRR NDVI data covering 1982 to 1993 in east Asia. Furthermore, after confirming and reclassifying ENSO events, which are recognized as the strong signal of multi-year monsoon variation, we perform a principal component analysis on the same 1982-1993 NDVI time series for east Asia, analyze the correlation between the 7th-component eigenvectors and the Southern Oscillation Index (SOI), which characterizes ENSO events, and summarize the temporal-spatial distribution of the interannual variations in east Asian land cover driven by changes in ENSO events.
Funding: the National Key Technology R&D Program of China (2018YFD0500206); the National Natural Science Foundation of China (31572209, 31772395 and 31972943); the Foundation for Safety of Agricultural Products by the Ministry of Agriculture and Rural Affairs, China (GJFP2019033).
Abstract: Effects of the heavy metal copper (Cu), the metalloid arsenic (As), and the antibiotic oxytetracycline (OTC) on bacterial community structure and diversity during cow and pig manure composting were investigated. Eight treatments were applied, four to each manure type, namely cow manure with: (1) no additives (control), (2) addition of heavy metal and metalloid, (3) addition of OTC and (4) addition of OTC with heavy metal and metalloid; and pig manure with: (5) no additives (control), (6) addition of heavy metal and metalloid, (7) addition of OTC and (8) addition of OTC with heavy metal and metalloid. After 35 days of composting, according to the alpha diversity indices, the combination treatment (OTC with heavy metal and metalloid) in pig manure was less harmful to microbial diversity than the control or heavy metal and metalloid treatments. In cow manure, the treatment with heavy metal and metalloid was the most harmful to the microbial community, followed by the combination and OTC treatments. The OTC and combination treatments had negative effects on the relative abundance of microbes in cow manure composts. The dominant phyla in both manure composts included Actinobacteria, Bacteroidetes, Firmicutes, and Proteobacteria. The transformation of microbial relative abundance was dependent on the composting time. Redundancy analysis (RDA) revealed that environmental parameters had the most influence on the bacterial communities. In conclusion, the composting process is the most sustainable technology for reducing heavy metal and metalloid impacts and antibiotic contamination in cow and pig manure. The physicochemical property variations in the manures had a significant effect on the microbial community during the composting process. This study provides an improved understanding of bacterial community composition and its changes during the composting process.
Funding: supported by the Natural Science Foundation of China (Grant Nos. 31470715 and 31470714); the Fundamental Research Funds for the Central Universities (2572016EBT1)
Abstract: Lumber moisture content (LMC) is an important parameter for judging the dryness of lumber and the quality of wooden products. Nevertheless, the acquired data are largely redundant and incomplete because of the complexity of the drying process, interference factors in the drying environment, and the physical characteristics of the lumber itself. To improve the accuracy and reliability of LMC measurement, an optimized support vector machine (SVM) algorithm was put forward for regression analysis of LMC. Environmental factors such as air temperature and relative humidity were considered, and their data were reduced with the principal component analysis method. The regression and prediction of the SVM were optimized using the grid search (GS) technique. Groups of data were sampled and analyzed, and a simulation comparison of forecasting performance shows that extracting the principal components speeds up the convergence of the optimization algorithm. The GS-SVM shows better performance in solving the LMC measurement and forecasting problem.
Funding: Sponsored by the Basic Research Foundation of Beijing Institute of Technology (BIT-UBF-200301F03) and the BIT & Ericsson Cooperation Project
Abstract: A data-driven temporal filtering technique is integrated into the time trajectory of the Teager energy operator (TEO) based feature parameter to improve the robustness of a speech recognition system against noise. Three kinds of data-driven temporal filters are investigated, motivated by alleviating the harmful effects that environmental factors have on speech: principal component analysis (PCA) based filters, linear discriminant analysis (LDA) based filters, and minimum classification error (MCE) based filters. A detailed comparative analysis of these temporal filtering approaches applied in the Teager energy domain is presented. It is shown that while all of them improve the recognition performance of the original TEO-based feature parameter in adverse environments, MCE-based temporal filtering provides the lowest error rate of all the algorithms as the SNR decreases.
Abstract: This study aimed to provide relevant knowledge about the dynamics of the hydrological parameters in the river-estuary continuum of the Wouri-Nkam river estuary for a sustainable management program. The hydrological parameters were recorded at eleven stations distributed on the basis of population density and human activities. Water quality parameters (temperature, salinity, dissolved oxygen, pH, total dissolved solutes, redox potential, and conductivity) were collected in subsurface water using a multi-parameter probe. Surface currents and morphometric parameters (depth and width) were recorded using a drifter, a sonar depth sounder, and GPS. The field measurements took place between 18/05/2019 and 08/09/2020 and were divided into six (06) cruises. The data were then subjected to an analysis of variance (ANOVA) and principal component analysis using XLSTAT 2017 (2.7 version) software. The results revealed that the water quality parameters were spatially stable, with no significant differences (df = 9, p < 0.05), and a relatively low temperature (25.5°C - 27°C) during the wet period. The limit of the frontal zone extended to S5 (Bonalokan, 8.25 km from S1) during the snapshot of the dry period under spring-phase and flood-tide conditions. Inversely, during the wet period this extension reduced to S1 (Bridge) and increased slightly to S3 (Bonangang) during the neap phase and ebb tides of that season. This result revealed a change in the axial gradient of about eight (08) and four (04) kilometers at the seasonal and tidal scales (lunar and diurnal periods), respectively. These changes were also accompanied by changes in the water quality signatures, which may affect fishery distribution and composition. Future studies to buttress the results of this investigation should, however, focus on longer time-series sampling methods and model development.
Abstract: Parkinson's disease (PD) is a neurodegenerative disease of the central nervous system. Recently, more research has been conducted on PD prediction, which is a genuinely challenging task. Disorders of the central nervous system produce symptoms such as disturbed sleep, speech disorders, olfactory and autonomic dysfunction, and sensory disorders. Early diagnosis of PD is very challenging, even within the medical community, but techniques are available to predict PD from symptoms and disorder measurements, and early prediction can help save millions of future lives. In this article, early diagnosis of PD using machine learning techniques with feature selection is carried out. In the first stage, data preprocessing is used to prepare the Parkinson's disease data. In the second stage, MFEA is used to extract features. In the third stage, feature selection is performed on the multiple-feature input with a principal component analysis (PCA) algorithm. Finally, a Darknet Convolutional Neural Network (DNetCNN) is used to classify the PD patients. The main advantage of PCA-DNetCNN is that it provides the best classification on the image dataset using YOLO. In addition, the results of various existing methods are compared, and the proposed DNetCNN proves more accurate at detecting PD in its initial stages, achieving 97.5% accuracy in early PD detection. The other performance metrics are also compared in the evaluation, and the proposed model outperforms all existing models.
Funding: the Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia, for funding this research work through Project No. IFT20004.
Abstract: Rice stem borer (Chilo agamemnon Bles.) is a primary insect pest of rice and a major limiting factor in rice production. Breeding insect-resistant crop varieties has been an economical form of integrated pest management (IPM), as it offers a viable and ecologically acceptable approach. This study aimed to evaluate rice genotypes for their resistance against rice stem borer. Seven parental genotypes and their twenty-one F1 crosses were evaluated for genotypic variation in field experiments. Analysis of variance revealed significant differences for the studied traits in almost all crosses and parents. In addition, the mean squares of parents versus their crosses were significant for stem borer resistance and other associated traits. Moreover, both general combining ability (GCA) and specific combining ability (SCA) variances were highly significant for all characters studied in the F1 generation. Based on GCA, four genotypes (Sakha101, Gz6903-3-4-2-1, Gz9577-4-1-1, and Hassawi) exhibited highly significant negative values for stem borer resistance (-0.53, -1.06, -0.18, and -0.49, respectively), indicating that they are the best combiners for stem borer resistance. Based on the SCA analysis, nine cross combinations showed highly significant negative effects for stem borer resistance. Similarly, the cross Giza178 × Hassawi was the best combination, with the significantly highest value for early maturity. In addition, seven crosses showed highly significant negative SCA for plant height. On the other hand, for panicle length, number of primary branches per panicle, panicle weight, and 1000-grain weight, seven, four, eight, and six crosses showed highly significant positive SCA, respectively. The results further revealed that the non-additive (dominance) genetic variance was higher than the additive variance for all evaluated traits, indicating that non-additive genetic variances play a role in their inheritance. The broad-sense heritability estimates were high for all the studied traits. Stem borer resistance was significantly correlated with panicle weight and 1000-grain weight, which in turn showed a highly significant correlation with grain yield per plant; thus, these traits can be effectively employed in a breeding program to confer resistance against stem borer infestation in rice. This was further supported by a biplot analysis, which clustered these potentially important traits into two quadrants, showing their importance for any future breeding program to control stem borer infestation. This study contributes valuable information for the evaluation of genetic diversity in the local rice germplasm and its utilization in future rice genetic improvement programs.
Funding: Supported by the National Natural Science Foundation of China (No. 61540069)
Abstract: The features extracted by principal component analysis (PCA) are the most descriptive, and the features extracted by linear discriminant analysis (LDA) are the most classifiable. In this paper, these two methods are combined, and a PC-LDA approach is used to extract the features of traffic signs. After obtaining binary images of the traffic signs through normalization and binarization, PC-LDA can extract the feature subspace of the traffic sign images with the best description and classification. The extracted features are recognized using a minimum distance classifier. The approach is verified on the MPEG7 CE Shape-1 Part-B computer shape library and on a traffic sign image library that includes both standard and natural traffic signs. The results show that when the traffic sign is in a natural scene, the PC-LDA approach applied to binary images from which shape features are extracted obtains better results.
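A minimal PC-LDA pipeline, PCA for a compact description followed by two-class Fisher LDA for separability and a minimum-distance rule on the discriminant scores, can be sketched as follows. The Gaussian toy data stand in for binarized traffic-sign images, and every dimension below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 100, 30
X0 = rng.standard_normal((n, p))          # class 0 "shape" vectors
X1 = rng.standard_normal((n, p)) + 1.0    # class 1, mean-shifted
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# PCA stage: keep the top components of the pooled data (description).
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
Z = (X - mean) @ Vt[:10].T

# LDA stage: two-class Fisher discriminant on the PCA scores
# (classification).
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)   # within-class scatter
w = np.linalg.solve(Sw, m1 - m0)
s = Z @ w                                        # 1-D discriminant score

# Minimum-distance classifier on the class-mean scores.
c0, c1 = s[y == 0].mean(), s[y == 1].mean()
pred = (np.abs(s - c1) < np.abs(s - c0)).astype(int)
acc = float((pred == y).mean())
print(acc)
```

Running PCA first keeps the within-class scatter matrix well-conditioned, which is the usual reason for the PC-LDA ordering rather than applying LDA directly to high-dimensional binary images.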