In order to attain good-quality transfer function estimates from magnetotelluric field data (i.e., smooth behavior and small uncertainties across all frequencies), we compare time series data processing with and without a multitaper approach for spectral estimation. There are several common ways to increase the reliability of Fourier spectral estimation from experimental (noisy) data; for example, to subdivide the experimental time series into segments, taper these segments (using a single taper), perform the Fourier transform of the individual segments, and average the resulting spectra.
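The segment-taper-average recipe described above, together with a basic DPSS multitaper alternative, can be sketched in a few lines. This is an illustrative sketch on a synthetic signal, not the authors' processing code; the sampling rate, taper parameters (NW = 4, K = 7), and test tone are assumptions.

```python
import numpy as np
from scipy.signal import welch
from scipy.signal.windows import dpss

# Synthetic "field data": a 12 Hz tone buried in noise, 1000 Hz sampling.
rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(8192) / fs
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.standard_normal(t.size)

# Segment-taper-average (Welch's method): split into segments,
# apply a single taper per segment, FFT, and average the spectra.
f_w, psd_welch = welch(x, fs=fs, window="hann", nperseg=1024)

# Simple multitaper estimate on the whole record: average the
# periodograms obtained with K orthogonal DPSS (Slepian) tapers.
NW, K = 4.0, 7                      # time-bandwidth product and taper count
tapers = dpss(x.size, NW, K)        # shape (K, N)
spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
psd_mt = spectra.mean(axis=0) / fs
f_mt = np.fft.rfftfreq(x.size, d=1 / fs)

# Both estimators should place the spectral peak at ~12 Hz.
peak_welch = f_w[np.argmax(psd_welch)]
peak_mt = f_mt[np.argmax(psd_mt)]
```

Welch's method trades frequency resolution for variance reduction by segmenting; the multitaper estimate instead averages over orthogonal tapers on the full record, retaining resolution while still reducing variance.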
In this paper, CiteSpace, a bibliometrics software package, was adopted to collect research papers published on the Web of Science that are relevant to biological models and effluent quality prediction in the activated sludge process in wastewater treatment. Through trend maps, keyword knowledge maps, and co-citation knowledge maps, the relevant authors, institutions, and regions were visualized and identified. Furthermore, the topics and hotspots of water quality prediction in the activated sludge process were determined through literature co-citation cluster analysis and citation burst analysis, which not only reflect the historical evolution of the field to a certain extent, but also provide direction and insight into the knowledge structure of water quality prediction and the activated sludge process for future research.
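As a minimal illustration of the counting that underlies co-citation cluster analysis (CiteSpace's internals are not shown here), the sketch below counts how often pairs of references appear together in the reference lists of citing papers; the reference IDs are made up.

```python
from collections import Counter
from itertools import combinations

# Hypothetical reference lists of five citing papers (IDs are made up).
papers = [
    ["R1", "R2", "R3"],
    ["R1", "R2"],
    ["R2", "R3", "R4"],
    ["R1", "R3"],
    ["R2", "R4"],
]

# Two references are co-cited when they appear in the same reference
# list; counting pairs over all papers yields the co-citation counts
# that co-citation clustering operates on.
cocitation = Counter()
for refs in papers:
    for a, b in combinations(sorted(set(refs)), 2):
        cocitation[(a, b)] += 1

strongest_pair, strength = cocitation.most_common(1)[0]
```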
In order to obtain better-quality cookies, food 3D printing technology was employed to prepare cookies. Taking the texture, color, deformation, moisture content, and temperature of the cookie as evaluation indicators, the influences of baking process parameters, such as baking time, surface heating temperature, and bottom heating temperature, on the quality of the cookie were studied to optimize the baking process parameters. The results showed that the baking process parameters had obvious effects on the texture, color, deformation, moisture content, and temperature of the cookie. The surface heating temperature, bottom heating temperature, and baking time all had positive influences on the hardness, crunchiness, crispiness, and total color difference (ΔE) of the cookie. When the heating temperatures of the surface and bottom increased, the diameter and thickness deformation rates of the cookie increased. However, with the extension of baking time, the diameter and thickness deformation rates first increased and then decreased. With a surface heating temperature of 180 ℃, a bottom heating temperature of 150 ℃, and a baking time of 15 min, the cookie was moderately crisp with moderate deformation and uniform color, and showed no burning, giving the desired quality. These results provide a theoretical basis for cookie manufacture based on food 3D printing technology.
Sentiment analysis, a crucial task in discerning emotional tones within text, plays a pivotal role in understanding public opinion and user sentiment across diverse languages. While numerous scholars conduct sentiment analysis in widely spoken languages such as English, Chinese, Arabic, and Roman Arabic, resource-poor languages such as Urdu remain a challenge. Urdu is a uniquely crafted language, characterized by a script that amalgamates elements from diverse languages, including Arabic, Parsi, Pashtu, Turkish, Punjabi, Saraiki, and more. Urdu literature, characterized by distinct character sets and linguistic features, presents an additional hurdle due to the lack of accessible datasets, rendering sentiment analysis a formidable undertaking. The limited availability of resources has fueled increased interest among researchers, prompting a deeper exploration into Urdu sentiment analysis. This research is dedicated to Urdu sentiment analysis, employing sophisticated deep learning models on an extensive dataset categorized into five labels: Positive, Negative, Neutral, Mixed, and Ambiguous. The primary objective is to discern sentiments and emotions within the Urdu language despite the absence of well-curated datasets. To tackle this challenge, the initial step involves the creation of a comprehensive Urdu dataset by aggregating data from various sources such as newspapers, articles, and social media comments. Following this data collection, thorough cleaning and preprocessing are implemented to ensure the quality of the data.
The study leverages two well-known deep learning models, namely Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN), for both training and evaluating sentiment analysis performance. Additionally, the study explores hyperparameter tuning to optimize the models' efficacy. Evaluation metrics such as precision, recall, and the F1-score are employed to assess the effectiveness of the models. The research findings reveal that RNN surpasses CNN in Urdu sentiment analysis, gaining a significantly higher accuracy rate of 91%. This result accentuates the exceptional performance of RNN, solidifying its status as a compelling option for conducting sentiment analysis tasks in the Urdu language.
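The evaluation metrics mentioned above are easy to make concrete. The sketch below computes per-class precision, recall, and F1 from hypothetical predictions over the five labels; the numbers are illustrative, not the paper's results.

```python
import numpy as np

labels = ["Positive", "Negative", "Neutral", "Mixed", "Ambiguous"]
# Hypothetical ground truth and predictions (class indices into labels).
y_true = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
y_pred = np.array([0, 1, 1, 1, 2, 0, 3, 3, 4, 3])

def per_class_scores(y_true, y_pred, n_classes):
    # Precision, recall and F1 per class from TP/FP/FN counts.
    p, r, f = [], [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        p.append(prec)
        r.append(rec)
        f.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return np.array(p), np.array(r), np.array(f)

precision, recall, f1 = per_class_scores(y_true, y_pred, len(labels))
accuracy = np.mean(y_true == y_pred)
```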
The genetic diversity of 18 processing apple varieties and two fresh varieties was evaluated using 12 simple sequence repeat (SSR) primer pairs previously identified in Malus domestica Borkh. A total of 87 alleles at 10 loci were detected using 10 polymorphic SSR markers, with 5-14 alleles per locus. All 20 varieties could be distinguished using two primer pairs, and they were divided into four groups by cluster analysis. The genetic similarity (GS) of the groups varied from 0.14 to 0.83. The high-acid variety Avrolles separated from the other varieties with GS less than 0.42. The second group contained Longfeng and Dolgo from Northeast China, which inherited genes from Chinese crab apple. The five cider varieties with high tannin contents, namely Dabinette, Frequin rouge, Kermerrien, M.Menard, and D.Coetligne, were clustered into the third group. The fourth group was mainly composed of 12 juice and fresh varieties. Principal coordinate analysis (PCO) also divided all the varieties into four groups. Juice and fresh apple varieties, as well as Longfeng and Dolgo, were clustered together in both analyses. Both analyses showed considerable differences between cider and juice varieties, between cider and fresh varieties, and between Chinese crab apple and western European crab apple, whereas juice and fresh varieties had a similar genetic background. Genetic diversity and differentiation could be sufficiently reflected by combining the two analytical methods.
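The clustering step can be sketched as follows: a binary allele-presence matrix is turned into pairwise distances and clustered with average linkage (UPGMA), then cut into four groups. The matrix here is random stand-in data, and Jaccard distance is used as a generic stand-in for the genetic similarity coefficient; neither reproduces the study's SSR scores.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical binary allele-presence matrix: rows are the 20 varieties,
# columns are the 87 alleles (1 = allele detected). Random stand-in data.
rng = np.random.default_rng(1)
profiles = rng.integers(0, 2, size=(20, 87)).astype(bool)

# Jaccard distance between allele profiles; similarity = 1 - distance.
dist = pdist(profiles, metric="jaccard")

# UPGMA (average-linkage) clustering, then cut the tree into 4 groups.
tree = linkage(dist, method="average")
groups = fcluster(tree, t=4, criterion="maxclust")
```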
Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required and only the multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. Then a neighborhood searching technique is used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures changes in data direction, while the distance similarity factor monitors shifts in the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
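The two similarity factors can be sketched with plain NumPy. The PCA similarity factor below follows the common Krzanowski form (mean squared cosine of the principal angles between the two dominant subspaces); the data windows are synthetic stand-ins, and the subspace dimension k = 2 is an assumption.

```python
import numpy as np

def pca_similarity(X1, X2, k=2):
    # Krzanowski's PCA similarity factor: mean squared cosine of the
    # principal angles between the two k-dimensional PCA subspaces
    # (1 means identical dominant directions).
    U1 = np.linalg.svd(X1 - X1.mean(0), full_matrices=False)[2][:k]
    U2 = np.linalg.svd(X2 - X2.mean(0), full_matrices=False)[2][:k]
    return np.linalg.norm(U1 @ U2.T) ** 2 / k

def distance_factor(X1, X2):
    # Shift of the data center position between the two windows.
    return float(np.linalg.norm(X1.mean(0) - X2.mean(0)))

rng = np.random.default_rng(0)
mix_a = np.array([[1.0, 0.8, 0.1, 0.0],
                  [0.0, 0.2, 0.9, 1.0]])
mix_b = np.array([[1.0, -1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, -1.0]])

ref = rng.standard_normal((300, 2)) @ mix_a + 0.05 * rng.standard_normal((300, 4))
snap_ok = rng.standard_normal((100, 2)) @ mix_a + 0.05 * rng.standard_normal((100, 4))
snap_dir = rng.standard_normal((100, 2)) @ mix_b          # direction change
snap_shift = ref[:100] + np.array([5.0, 0.0, 0.0, 0.0])   # center shift only

s_ok = pca_similarity(ref, snap_ok)     # high: same data directions
s_dir = pca_similarity(ref, snap_dir)   # low: directions changed
d_shift = distance_factor(ref, snap_shift)
```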
This paper, based on material processes and relational processes, aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories are presented first. I then analyze the extract from three aspects: the analysis of the objective plane of narration, the analysis of Mrs. Bennet's discourse, and the analysis of Mr. Bennet's discourse.
Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, existing single-factor uncertainty analysis methods are not well suited to logistic support systems with multiple decision-making factors. The multiple transfer parameters graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of logistic support process uncertainty. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the logistic support process uncertainty. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for carrier-based aircraft support decision making.
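A covariance-based first-order sensitivity index can be sketched on a toy support-delay model: the share of output variance attributable to each factor is estimated from its covariance with the output (a squared-correlation index, not the paper's exact MTP-GERT formulation). All distributions and coefficients below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
# Two uncertain factors of a hypothetical support activity network.
time_factor = rng.normal(10.0, 2.0, n)      # activity duration
resource_factor = rng.normal(5.0, 0.5, n)   # resource consumption

# Toy response: total support delay, deliberately more sensitive to duration.
y = 3.0 * time_factor + 1.0 * resource_factor + rng.normal(0.0, 1.0, n)

def covariance_sensitivity(x, y):
    # Covariance-based first-order index: squared correlation, i.e. the
    # fraction of output variance explained by the factor's covariance.
    return np.cov(x, y)[0, 1] ** 2 / (np.var(x, ddof=1) * np.var(y, ddof=1))

s_time = covariance_sensitivity(time_factor, y)
s_resource = covariance_sensitivity(resource_factor, y)
```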
Despite considerable effort spent on the development of manufacturing technology, manufacturing companies still experience resource waste and adverse ecological impacts during production. To reconcile energy saving with environmental conservation, a uniform way of reporting and classifying the relevant information was presented. Based on the establishment of the carbon footprint (CFP) of machine tool operation, carbon footprint per kilogram (CFK) was proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach was developed as a tracking and analyzing system for the machining process. A case study was used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, providing a reliable reference for optimization measures in low-carbon manufacturing.
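The normalization from a total carbon footprint (CFP) to carbon footprint per kilogram (CFK) is simple arithmetic. The emission factor and job figures below are illustrative assumptions, not values from the case study.

```python
# Illustrative numbers only: an assumed electricity emission factor and a
# hypothetical machining job, showing how a CFP total is normalized to CFK.
energy_kwh = 12.5           # electricity consumed by the machine tool
emission_factor = 0.58      # kg CO2e per kWh (assumed grid factor)
workpiece_mass_kg = 3.2     # mass of the machined part

cfp = energy_kwh * emission_factor   # total carbon footprint, kg CO2e
cfk = cfp / workpiece_mass_kg        # normalized index, kg CO2e per kg
```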
Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. In order to obtain a better visualization effect, a novel fault diagnosis method that combines the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA reduces the dimension of the data by maximizing the separability of the classes. After feature extraction by FDA, the SOM can distinguish the different states clearly on the output map, and it can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
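The FDA feature-extraction step can be sketched with NumPy: maximize between-class scatter relative to within-class scatter and project onto the leading discriminant directions. The two synthetic "operating states" and the small ridge term are assumptions; the SOM stage is omitted.

```python
import numpy as np

def fda_directions(X, y, n_comp=1):
    # Fisher discriminant directions via the generalized eigenproblem
    # on Sw^-1 Sb (between-class over within-class scatter).
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)
    # Small ridge keeps Sw invertible for ill-conditioned data.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(d), Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_comp]]

# Two synthetic "operating states" separated along the first variable.
rng = np.random.default_rng(0)
normal = rng.standard_normal((150, 3))
fault = rng.standard_normal((150, 3)) + np.array([4.0, 0.0, 0.0])
X = np.vstack([normal, fault])
y = np.array([0] * 150 + [1] * 150)

w = fda_directions(X, y)   # discriminant direction, shape (3, 1)
scores = X @ w             # low-dimensional features for visualization
```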
[Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1° 6 h interval reanalysis data and daily observation data, using synoptic analysis and diagnosis methods, and combining the spring cold wave forecast index for Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed from the aspects of circulation background, influencing weather systems, and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, 700-850 hPa low-level shear, and the ground cold front were the main systems that influenced this cold wave; there was a ridge from Lake Balkhash across Lake Baikal at 500 hPa. The early stage of the process was controlled by the high pressure ridge and the temperature increased obviously, so the daily mean temperature was high. The range of the cold high pressure was large and its central intensity was 1043.0 hPa; the cold air was strong and deep, which was in accordance with the strong surface temperature reduction center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high pressure center, north-south pressure and temperature differences, 850 hPa temperature changes, and the route and intensity of cold advection were considered as reference factors for forecasting cold wave intensity. [Conclusion] The study provides a theoretical basis for improving the forecasting of cold wave weather.
In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring compared to conventional methods.
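The Box-Cox step can be sketched with SciPy: estimate lambda by maximum likelihood, set a mean + 3-sigma limit in the transformed (near-Gaussian) space, and map it back with the inverse transform. The chi-square data below are a stand-in for a real monitoring statistic, not simulator output.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)
# A right-skewed, strictly positive monitoring statistic (chi-square
# stand-in for a non-Gaussian MICA statistic).
stat = rng.chisquare(df=3, size=2000)

# Box-Cox picks lambda by maximum likelihood; the transformed data are
# much closer to Gaussian, so a mean + 3*sigma limit is defensible there.
transformed, lam = stats.boxcox(stat)
limit_t = transformed.mean() + 3.0 * transformed.std()

# Map the limit back to the original scale for online use.
limit = inv_boxcox(limit_t, lam)
alarm_rate = np.mean(stat > limit)   # small upper-tail false alarm rate
```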
The rapidly increasing demand and complexity of manufacturing processes motivate the use of manufacturing data with the highest priority to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion in data volume has revolutionized how data are collected and analyzed. This paper overviews the advance of technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is one important leading trend for advanced data collection, due to the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis contains inevitable simplifications and sometimes ill-posed solutions due to its limited capacity to describe complex manufacturing processes. Machine learning, especially the deep learning approach, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded in using limited data to achieve similar or even better decisions. These trends can be demonstrated by analyzing some typical applications in manufacturing processes.
The damage smear method (DSM) is adopted to study the trans-scale progressive rock failure process, based on a statistical meso-damage model and a finite element solver. The statistical approach is utilized to reflect mesoscopic rock heterogeneity. The constitutive law of the representative volume element (RVE) is established according to continuum damage mechanics, in which a double-damage criterion is considered. The damage evolution and accumulation of RVEs are used to reveal the macroscopic rock failure characteristics. Each single RVE is represented by one unique element. The initiation, propagation, and coalescence of meso- to macro-cracks are captured by smearing failed elements. The above ideas are formulated into the framework of the DSM and programmed into the self-developed rock failure process analysis (RFPA) software. Two laboratory-scale examples are conducted, and the well-known engineering-scale tests, i.e., Atomic Energy of Canada Limited's (AECL's) Underground Research Laboratory (URL) tests, are used for verification. The simulation results match other experimental results and field observations.
The construction of basic wavelets was discussed and many basic analyzing wavelets were compared. A complex analyzing wavelet which is continuous, smooth, orthogonal, and exponentially decreasing was presented, and it was used to decompose two blasting seismic signals with the continuous wavelet transform (CWT). The result shows that wavelet analysis is a better method than Fourier analysis for determining the essential factors that create damage effects.
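A minimal continuous wavelet transform with a complex Morlet wavelet (a generic stand-in, not the specific orthogonal, exponentially decreasing wavelet constructed in the paper) shows how scale maps to frequency for a test tone; the sampling rate and scale grid are assumptions.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    # Minimal CWT with a complex Morlet wavelet; scale a corresponds
    # roughly to frequency f = w0 * fs / (2 * pi * a) in samples.
    n = x.size
    out = np.empty((len(scales), n), dtype=complex)
    for i, a in enumerate(scales):
        # Sampled, 1/sqrt(a)-normalized Morlet wavelet at this scale.
        m = int(min(10 * a, n))
        tt = np.arange(-m, m + 1) / a
        psi = np.pi ** -0.25 * np.exp(1j * w0 * tt) * np.exp(-tt ** 2 / 2)
        psi /= np.sqrt(a)
        # Correlate the signal with the wavelet at every shift.
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

# A pure 10 Hz tone at fs = 100 Hz: energy should peak near
# scale a = w0 * fs / (2 * pi * f) ≈ 9.5 samples.
fs, f = 100.0, 10.0
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * f * t)
scales = np.arange(4, 21, dtype=float)
coeffs = morlet_cwt(x, scales)
energy = (np.abs(coeffs) ** 2).mean(axis=1)
peak_scale = scales[np.argmax(energy)]
```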
Since there are not enough fault data in historical data sets, it is very difficult to diagnose faults in batch processes. In addition, a complete batch trajectory can be obtained only at the end of its operation. In order to avoid the need for estimated or filled-in future unmeasured values in online fault diagnosis, to sufficiently utilize the finite information of faults, and to enhance diagnostic performance, an improved multi-model Fisher discriminant analysis is presented. The distinguishing trait of the proposed method is that the training data sets are made up of the current measured information and the past major discriminant information, rather than only the current information or the whole batch data. An industrial multi-stage streptomycin fermentation process is used to test the fault diagnosis performance of the proposed method.
Data-driven tools such as principal component analysis (PCA) and independent component analysis (ICA) have been applied to different benchmarks as process monitoring methods. The difference between the two methods is that the components of PCA are still dependent, while ICA has no orthogonality constraint and its latent variables are independent. Process monitoring with PCA often supposes that the process data or principal components follow a Gaussian distribution. However, this constraint cannot be satisfied by several practical processes. To extend the use of PCA, a nonparametric method is added to PCA to overcome the difficulty, and kernel density estimation (KDE) is a good choice. Though ICA is based on non-Gaussian distribution information, KDE can also help in the close monitoring of the data. Methods such as PCA, ICA, PCA with KDE (KPCA), and ICA with KDE (KICA) are demonstrated and compared by applying them to a practical industrial Spheripol polypropylene catalyzer reactor instead of a laboratory emulator.
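The PCA-with-KDE idea can be sketched as follows: build a Hotelling T² statistic from a PCA model of non-Gaussian "normal" data, then set the control limit from a KDE estimate of the statistic's actual distribution instead of a Gaussian/chi-square assumption. The data and the 99% quantile are illustrative choices, not the reactor data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Non-Gaussian "normal operation" data: a squared latent factor mixes in.
z = rng.standard_normal((500, 2))
X = np.column_stack([z[:, 0], z[:, 1],
                     z[:, 0] ** 2 + 0.1 * rng.standard_normal(500)])

# PCA via SVD on centered data; keep 2 principal components.
Xc = X - X.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T
scores = Xc @ P
var = (S[:2] ** 2) / (len(X) - 1)

# Hotelling T^2 statistic for each training sample.
T2 = np.sum(scores ** 2 / var, axis=1)

# KDE-based limit: estimate the 99th percentile of the actual T^2
# density rather than assuming a chi-square distribution.
kde = gaussian_kde(T2)
grid = np.linspace(0.0, T2.max() * 2, 4000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]
limit_kde = grid[np.searchsorted(cdf, 0.99)]

false_alarm = np.mean(T2 > limit_kde)   # should sit near 1%
```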
This study examined public attitudes concerning the value of the outdoor spaces that people use daily. Two successive analyses were performed based on data from residents and college students in the city of Hangzhou, China. First, citizens registered various items constituting desirable values of residential outdoor spaces through a preliminary questionnaire. The result proposed three general attributes (functional, aesthetic, and ecological) and ten specific qualities of residential outdoor spaces. An analytic hierarchy process (AHP) was applied to an interview survey in order to clarify the weights among these attributes and qualities. Second, principal factors were extracted from the ten specific qualities with principal component analysis (PCA) for both the common case and the campus case. In addition, the variations among respondent groups were classified with cluster analysis (CA) using the results of the PCA. The AHP results found that the public prefers the functional attribute over the aesthetic attribute, although the latter is often viewed as the core value of open spaces in the eyes of architects and designers. Furthermore, comparisons of the ten specific qualities showed that the public prefers open spaces that can be used conveniently and easily for group activities, because such spaces sustain an active lifestyle of neighborhood communication, which is also seen to protect human-centered residential environments. Moreover, different groups of respondents diverge largely in terms of gender, age, behavior, and preference.
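The AHP weighting step can be sketched directly: priority weights come from the principal eigenvector of the pairwise comparison matrix, and the consistency ratio checks the judgments. The comparison values below (functional vs. aesthetic vs. ecological) are hypothetical, not the survey's.

```python
import numpy as np

def ahp_weights(A):
    # Priority weights from the principal eigenvector of the pairwise
    # comparison matrix, plus Saaty's consistency ratio (CR < 0.1 is ok).
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
    return w, ci / ri

# Hypothetical judgments: functional moderately preferred over aesthetic
# (3) and strongly over ecological (5); aesthetic vs ecological = 2.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
weights, cr = ahp_weights(A)
```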
Energy efficiency data from ethylene production equipment are high-dimensional, dynamic, and time-sequential, so their evaluation is affected by many factors. Abnormal data from ethylene production are eliminated through a consistency test, making the data consumption uniform to improve the comparability of the data. Due to the limits on the input and output data of a decision making unit in data envelopment analysis (DEA), the energy efficiency data from the same technology in a certain year are processed monthly using DEA. The DEA data on energy efficiency from the same technology are then weighted and fused using the analytic hierarchy process. The energy efficiency data from different technologies are evaluated by their relative effectiveness to find directions for energy saving and consumption reduction.
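The DEA step can be sketched as an input-oriented CCR envelopment LP solved per decision making unit (DMU) with scipy.optimize.linprog; efficiency 1 marks the frontier. The two-input, one-output monthly figures below are hypothetical, not actual ethylene plant data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    # Input-oriented CCR envelopment LP per DMU:
    #   minimize theta  s.t.  X@lam <= theta * x0,  Y@lam >= y0,  lam >= 0.
    # X is (n_inputs, n_dmu); Y is (n_outputs, n_dmu).
    m, n = X.shape
    s = Y.shape[0]
    eff = []
    for j in range(n):
        c = np.r_[1.0, np.zeros(n)]                 # minimize theta
        A_in = np.hstack([-X[:, [j]], X])           # X@lam - theta*x0 <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y@lam <= -y0
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, j]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
        eff.append(res.fun)
    return np.array(eff)

# Hypothetical data: 2 inputs (energy, feedstock) x 5 units, 1 equal output.
X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])
eff = dea_ccr_efficiency(X, Y)   # units 3-5 lie on the frontier
```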
In modern transportation, pavement is one of the most important civil infrastructures for the movement of vehicles and pedestrians. Pavement service quality and service life are of great importance for civil engineers, as they directly affect the regular service for users. Therefore, monitoring the health status of pavement before irreversible damage occurs is essential for timely maintenance, which in turn ensures public transportation safety. Many pavement damages can be detected and analyzed by monitoring structural dynamic responses and evaluating road surface conditions. Advanced technologies can be employed for the collection and analysis of such data, including various intrusive sensing techniques, image processing techniques, and machine learning methods. This review summarizes the state-of-the-art of these three technologies in pavement engineering in recent years and suggests possible developments for future pavement monitoring and analysis based on these approaches.
Funding: Supported by the National Natural Science Foundation of China (61273160, 61403418), the Natural Science Foundation of Shandong Province (ZR2011FM014), the Fundamental Research Funds for the Central Universities (10CX04046A), and the Doctoral Fund of Shandong Province (BS2012ZZ011).
Abstract: Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. To monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required; only normal operation data from the multiple modes are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving-window technique to obtain a current snapshot data window. A neighborhood-searching technique then acquires the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures changes in data direction, while the distance similarity factor monitors shifts in the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
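The two similarity factors can be sketched as follows. The PCA similarity factor follows Krzanowski's classical definition, comparing the subspaces spanned by the leading principal directions of two data windows; the exact form of the distance similarity factor is not given in the abstract, so the exponential mapping of the centre shift below is an assumption for illustration.

```python
import numpy as np

def pca_similarity(X1, X2, k=2):
    """Krzanowski-style PCA similarity factor between two data windows.

    Compares the subspaces spanned by the first k principal directions;
    returns 1.0 when the directions coincide, near 0 when orthogonal.
    """
    def top_dirs(X, k):
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:k].T                    # columns = loading vectors
    L1, L2 = top_dirs(X1, k), top_dirs(X2, k)
    return np.sum((L1.T @ L2) ** 2) / k    # trace(L1'L2 L2'L1) / k

def distance_similarity(X1, X2):
    """Centre-shift similarity mapped to (0, 1]; an assumed form,
    not necessarily the one used in the paper."""
    shift = np.linalg.norm(X1.mean(axis=0) - X2.mean(axis=0))
    return np.exp(-shift)

rng = np.random.default_rng(1)
W = rng.normal(size=(50, 4))               # a snapshot window
S = pca_similarity(W, W)                   # identical windows -> 1.0
D = distance_similarity(W, W)              # no centre shift   -> 1.0
```

A fault that rotates the data directions drives the first factor down, while a fault that only shifts the operating point is caught by the second, which is why the method combines both into its monitoring statistics.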
Abstract: Based on material processes and relational processes, this paper aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories come first in this paper. I then analyze the extract from three aspects: the objective plane of narration, Mrs. Bennet's discourse, and Mr. Bennet's discourse.
基金supported by the National Natural Science Foundation of China(71171008)
Abstract: Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, existing single-factor uncertainty analysis methods are not well suited to logistic support systems with multiple decision-making factors. The multiple-transfer-parameter graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of logistic support process uncertainty. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the logistic support process uncertainty. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for aircraft support decision making on carriers.
基金National Science &Technology Pillar Program during the Twelfth Five-year Plan Period(No.2012BAF01B02)National Science and Technology Major Project of China(No.2012ZX04005031)
Abstract: Despite spending considerable effort on the development of manufacturing technology, manufacturing companies experience resource waste and adverse ecological impacts during production. To reconcile energy saving with environmental conservation, a uniform way of reporting and classifying this information is presented. Based on the establishment of a carbon footprint (CFP) for machine tool operation, carbon footprint per kilogram (CFK) is proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach is developed as a tracking and analysis system for the machining process. A case study is used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, providing a reliable reference for optimization measures in low-carbon manufacturing.
Funding: Supported by the National Basic Research Program of China (2013CB733600), the National Natural Science Foundation of China (21176073), the Doctoral Fund of Ministry of Education of China (20090074110005), the Program for New Century Excellent Talents in University (NCET-09-0346), the Shu Guang Project (09SG29), and the Fundamental Research Funds for the Central Universities.
Abstract: Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. To obtain a better visualization effect, a novel fault diagnosis method combining the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA reduces the dimension of the data by maximizing the separability of the classes. After feature extraction by FDA, the SOM can clearly distinguish the different states on the output map and can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
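The FDA feature-extraction step can be sketched with scikit-learn's LinearDiscriminantAnalysis, which implements Fisher's criterion of maximizing between-class relative to within-class scatter. The synthetic three-class data below merely stand in for labelled TE-process measurements, and the SOM stage (which the paper trains on the reduced features) is omitted here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

# Synthetic stand-in for labelled process data: three operating/fault
# classes in 10 measured variables (hypothetical, not the real TE set).
X = np.vstack([rng.normal(loc=m, size=(100, 10)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 100)

# FDA step: project onto the directions that maximise class separability.
# With 3 classes there are at most 2 discriminant directions.
fda = LinearDiscriminantAnalysis(n_components=2)
Z = fda.fit_transform(X, y)

# In the paper a SOM is trained on Z for visualization; here we only
# check that the reduced features already separate the classes well.
acc = fda.score(X, y)
```

Reducing to two discriminant directions is exactly what makes the subsequent SOM output map readable: each state occupies a distinct region, so abnormal excursions are visible at a glance.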
Abstract: [Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1°, 6 h interval reanalysis data and daily observation data, using synoptic analysis and diagnosis methods, and combining the spring cold wave forecast index for Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed from the aspects of circulation background, influencing weather systems, and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, the 700-850 hPa low-layer shear, and the ground cold front were the main systems influencing this cold wave; there was a ridge from Lake Balkhash across Lake Baikal at 500 hPa. The early stage of the process was controlled by the high pressure ridge, the temperature rose obviously, and the daily mean temperature was high. The range of the cold high pressure was large and its central intensity was 1043.0 hPa; the cold air was strong and deep, in accordance with the strong surface temperature reduction center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high pressure center, north-south pressure and temperature differences, 850 hPa temperature changes, and the route and intensity of cold advection were considered reference factors for forecasting cold wave intensity. [Conclusion] The study provided a theoretical basis for improving the forecasting of cold wave weather.
Abstract: In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring compared to conventional methods.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51805260), the National Natural Science Foundation for Distinguished Young Scholars of China (Grant No. 51925505), and the National Natural Science Foundation of China (Grant No. 51775278).
Abstract: The rapidly increasing demand on and complexity of manufacturing processes motivate the use of manufacturing data, with the highest priority, to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion in data volume has revolutionized how data are collected and analyzed. This paper overviews the advance of technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is one important leading trend for advanced data collection, owing to the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis involves inevitable simplifications and sometimes ill-posed solutions due to its limited capacity to describe complex manufacturing processes. Machine learning, especially deep learning, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded by using limited data to achieve similar or even better decisions. These trends are demonstrated by analyzing some typical applications in the manufacturing process.
基金supported in part by the National Natural Science Foundation of China (Grant Nos.51679028 and 51879034)Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK1804)the Fundamental Research Funds for the Central Universities (Grant No.DUT18JC10)
Abstract: The damage smear method (DSM) is adopted to study the trans-scale progressive rock failure process, based on a statistical meso-damage model and a finite element solver. The statistical approach is utilized to reflect mesoscopic rock heterogeneity. The constitutive law of the representative volume element (RVE) is established according to continuum damage mechanics, in which a double-damage criterion is considered. The damage evolution and accumulation of RVEs are used to reveal macroscopic rock failure characteristics. Each single RVE is represented by one unique element. The initiation, propagation, and coalescence of meso- to macro-cracks are captured by smearing failed elements. These ideas are formulated into the framework of the DSM and programmed into the self-developed rock failure process analysis (RFPA) software. Two laboratory-scale examples are conducted, and the well-known engineering-scale tests, i.e., Atomic Energy of Canada Limited's (AECL's) Underground Research Laboratory (URL) tests, are used for verification. The simulation results match other experimental results and field observations.
Abstract: The construction of basic wavelets is discussed and many basic analyzing wavelets are compared. A complex analyzing wavelet that is continuous, smooth, orthogonal, and exponentially decreasing is presented, and it is used to decompose two blasting seismic signals with the continuous wavelet transform (CWT). The result shows that wavelet analysis is a better method than Fourier analysis for determining the essential factors that create damage effects.
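A CWT can be sketched by direct convolution of the signal with a scaled wavelet at each scale. The real-valued Ricker (Mexican-hat) wavelet below is a simple stand-in for the paper's complex, exponentially decreasing wavelet, whose exact formula is not given in the abstract; the synthetic sinusoid stands in for a blasting seismogram.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet, a common real analyzing wavelet."""
    t = np.arange(points) - (points - 1) / 2.0
    A = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, scales, width=10):
    """Continuous wavelet transform by direct convolution at each scale."""
    out = np.empty((len(scales), len(signal)))
    for i, a in enumerate(scales):
        n = min(width * int(a), len(signal))
        out[i] = np.convolve(signal, ricker(n, a), mode="same")
    return out

t = np.linspace(0, 1, 400)
sig = np.sin(2 * np.pi * 8 * t)          # stand-in for a blast seismogram
coeffs = cwt(sig, scales=[2, 4, 8, 16])  # one coefficient row per scale
```

The scale-by-time coefficient map is what lets wavelet analysis localize transient damage-related energy in both time and frequency, which a global Fourier spectrum cannot do.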
Funding: Supported by the National Natural Science Foundation of China (No. 60421002).
Abstract: Since there are not enough fault data in historical data sets, it is very difficult to diagnose faults in batch processes. In addition, a complete batch trajectory can only be obtained at the end of its operation. In order to avoid the need for estimating or filling in future unmeasured values in online fault diagnosis, to utilize the finite fault information sufficiently, and to enhance diagnostic performance, an improved multi-model Fisher discriminant analysis is presented. The distinguishing trait of the proposed method is that the training data sets consist of the current measured information and the past major discriminant information, rather than only the current information or the whole batch data. A typical industrial multi-stage streptomycin fermentation process is used to test the fault diagnosis performance of the proposed method.
Funding: Supported by the National Natural Science Foundation of China (No. 60574047) and the Doctorate Foundation of the State Education Ministry of China (No. 20050335018).
Abstract: Data-driven tools, such as principal component analysis (PCA) and independent component analysis (ICA), have been applied to different benchmarks as process monitoring methods. The difference between the two methods is that the components of PCA are still dependent, while ICA has no orthogonality constraint and its latent variables are independent. Process monitoring with PCA often supposes that the process data or principal components follow a Gaussian distribution. However, this constraint cannot be satisfied by several practical processes. To extend the use of PCA, a nonparametric method is added to PCA to overcome this difficulty, and kernel density estimation (KDE) is a good choice. Though ICA is based on non-Gaussian distribution information, KDE can help in the close monitoring of the data. Methods such as PCA, ICA, PCA with KDE (KPCA), and ICA with KDE (KICA) are demonstrated and compared by applying them to a practical industrial Spheripol-craft polypropylene catalyzer reactor instead of a laboratory emulator.
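The KDE add-on can be sketched as follows: fit a kernel density estimate to a non-Gaussian monitoring statistic collected under normal operation, and read the control limit off the estimated distribution instead of assuming a Gaussian quantile. The chi-square mixture and the 99% confidence level below are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Non-Gaussian monitoring statistic from normal operation (synthetic:
# a chi-square mixture standing in for a T^2- or SPE-like statistic).
stat = np.concatenate([rng.chisquare(3, 800), rng.chisquare(8, 200)])

# Fit a KDE to the normal-operation statistic and take the point where
# the estimated CDF reaches 99% as the control limit.
kde = gaussian_kde(stat)
grid = np.linspace(stat.min(), stat.max() + 3 * stat.std(), 2000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]                        # normalise to a proper CDF
limit = grid[np.searchsorted(cdf, 0.99)]
```

Because the limit follows the actual (skewed, multimodal) density, the in-control false-alarm rate stays near the nominal 1% even when the Gaussian assumption behind the usual PCA limits fails.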
Abstract: This study examined public attitudes concerning the value of the outdoor spaces that people use daily. Two successive analyses were performed based on data from common residents and college students in the city of Hangzhou, China. First, citizens registered various items constituting desirable values of residential outdoor spaces through a preliminary questionnaire. The result proposed three general attributes (functional, aesthetic, and ecological) and ten specific qualities of residential outdoor spaces. An analytic hierarchy process (AHP) was applied to an interview survey in order to clarify the weights among these attributes and qualities. Second, principal factors were extracted from the ten specific qualities with principal component analysis (PCA) for both the common case and the campus case. In addition, groups of respondents were classified with cluster analysis (CA) using the results of the PCA. The AHP application found that the public prefers the functional attribute over the aesthetic attribute, which is usually viewed as the core value of open spaces in the eyes of architects and designers. Furthermore, comparisons of the ten specific qualities showed that the public prefers open spaces that can be used conveniently and easily for group activities, because such spaces sustain an active lifestyle of neighborhood communication, which is also seen to protect human-regarding residential environments. Moreover, the groups of respondents diverge largely in terms of gender, age, behavior, and preference.
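The AHP weighting step can be sketched with the standard principal-eigenvector method: weights come from the dominant eigenvector of a pairwise comparison matrix, and the consistency index flags incoherent judgments. The 3x3 matrix below, comparing the three general attributes on Saaty's 1-9 scale, is a hypothetical example rather than the survey's actual judgments.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via the
    principal-eigenvector method, plus Saaty's consistency index."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)              # dominant eigenvalue
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                          # normalise weights to sum to 1
    n = A.shape[0]
    ci = (vals.real[i] - n) / (n - 1)     # consistency index
    return w, ci

# Hypothetical comparison of the three general attributes
# (functional vs. aesthetic vs. ecological) on Saaty's 1-9 scale.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w, ci = ahp_weights(A)
```

A CI well below Saaty's usual 0.1 threshold (after dividing by the random index) indicates the pairwise judgments are coherent enough for the weights to be meaningful, which mirrors the consistency screening an AHP interview survey would apply.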
Funding: Supported by the National Natural Science Foundation of China (61374166), the Doctoral Fund of Ministry of Education of China (20120010110010), and the Fundamental Research Funds for the Central Universities (YS1404).
Abstract: Energy efficiency data from ethylene production equipment are high-dimensional, dynamic, and time-sequential, so their evaluation is affected by many factors. Abnormal data from ethylene production are eliminated through a consistency test, making the consumption data uniform to improve their comparability. Due to the limits on the input and output data of a decision-making unit in data envelopment analysis (DEA), the energy efficiency data from the same technology in a given year are processed monthly using DEA. The DEA results for energy efficiency from the same technology are then weighted and fused using the analytic hierarchy process. The energy efficiency data from different technologies are evaluated by their relative effectiveness to find directions for energy saving and consumption reduction.
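The monthly DEA step can be sketched with the input-oriented CCR model in multiplier form, solved as a linear program: maximize the weighted output of the unit under evaluation, subject to its weighted input equalling one and no unit exceeding efficiency one. The four decision-making units and their input/output data below are hypothetical stand-ins for monthly energy-consumption records.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (multiplier form).

    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    max u.y0  s.t.  v.x0 = 1,  u.Yj - v.Xj <= 0 for all j,  u, v >= 0.
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j0], np.zeros(m)])         # maximise u.y0
    A_ub = np.hstack([Y, -X])                         # u.Yj - v.Xj <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None]  # v.x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])
    return float(-res.fun)

# Hypothetical monthly data: 4 units, 2 inputs (e.g. fuel, power),
# one normalised output.
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 4.0], [6.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
effs = [ccr_efficiency(X, Y, j) for j in range(4)]
```

Units on the efficient frontier score 1; the sub-unity scores of the others quantify how much their inputs could shrink at the same output, pointing directly at the energy-saving potential the paper evaluates.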
基金supported by the National Key R&D Program of China(2017YFF0205600)the International Research Cooperation Seed Fund of Beijing University of Technology(2018A08)+1 种基金Science and Technology Project of Beijing Municipal Commission of Transport(2018-kjc-01-213)the Construction of Service Capability of Scientific and Technological Innovation-Municipal Level of Fundamental Research Funds(Scientific Research Categories)of Beijing City(PXM2019_014204_500032).
Abstract: In modern transportation, pavement is one of the most important civil infrastructures for the movement of vehicles and pedestrians. Pavement service quality and service life are of great importance to civil engineers, as they directly affect regular service for users. Therefore, monitoring the health status of pavement before irreversible damage occurs is essential for timely maintenance, which in turn ensures public transportation safety. Many pavement damages can be detected and analyzed by monitoring structural dynamic responses and evaluating road surface conditions. Advanced technologies can be employed for the collection and analysis of such data, including various intrusive sensing techniques, image processing techniques, and machine learning methods. This review summarizes the state-of-the-art of these three technologies in pavement engineering in recent years and suggests possible developments for future pavement monitoring and analysis based on these approaches.