Funding: Supported by the High Technology Research and Development Program of China (863 Program, No. 2006AA100301).
Abstract: The performance of six statistical approaches that can be used to select the best model for describing the growth of individual fish was analyzed using simulated and real length-at-age data. The six approaches are the coefficient of determination (R2), the adjusted coefficient of determination (adj.-R2), the root mean squared error (RMSE), Akaike's information criterion (AIC), the bias-corrected AIC (AICc), and the Bayesian information criterion (BIC). The simulated data were generated by five growth models with different numbers of parameters; four sets of real data were taken from the literature. The parameters of each growth model were estimated by maximum likelihood under the assumption of an additive error structure. The model best supported by the data was then identified with each of the six approaches. The results show that R2 and RMSE have the same properties and perform worst. Sample size affects the performance of adj.-R2, AIC, AICc, and BIC: adj.-R2 does better in small samples than in large ones, AIC is unsuitable for small samples and tends to select a more complex model as the sample size grows, and AICc and BIC perform best in the small- and large-sample cases, respectively. AICc or BIC is therefore recommended for selecting a fish growth model, depending on the size of the length-at-age data.
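To make the comparison concrete, here is a minimal sketch that fits one candidate growth curve by maximum likelihood with additive Gaussian errors and reports all six criteria for that fit. The von Bertalanffy form, the toy length-at-age values, and the starting values are assumptions for illustration; the paper's five candidate models and datasets are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# Toy length-at-age data (hypothetical; the paper uses simulated and published datasets).
age = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
length = np.array([12.1, 19.5, 25.0, 29.3, 32.4, 34.9, 36.7, 38.0, 39.1, 39.8])

def von_bertalanffy(t, Linf, K, t0):
    """One candidate growth model (assumed here for illustration)."""
    return Linf * (1.0 - np.exp(-K * (t - t0)))

def neg_log_likelihood(params):
    Linf, K, t0, sigma = params
    sigma = abs(sigma) + 1e-9   # guard against non-positive sigma during the search
    resid = length - von_bertalanffy(age, Linf, K, t0)
    # Additive Gaussian error structure, as assumed in the study.
    return 0.5 * len(age) * np.log(2 * np.pi * sigma**2) + np.sum(resid**2) / (2 * sigma**2)

fit = minimize(neg_log_likelihood, x0=[40.0, 0.3, 0.0, 1.0], method="Nelder-Mead")
Linf, K, t0, sigma = fit.x
n, k = len(age), len(fit.x)          # k counts sigma as an estimated parameter
resid = length - von_bertalanffy(age, Linf, K, t0)

ss_res = np.sum(resid**2)
ss_tot = np.sum((length - length.mean())**2)
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - (k - 1) - 1)   # k-1 model parameters, sigma excluded
rmse = np.sqrt(ss_res / n)
nll = fit.fun
aic = 2 * k + 2 * nll
aicc = aic + 2 * k * (k + 1) / (n - k - 1)
bic = k * np.log(n) + 2 * nll

print(f"R2={r2:.3f} adj.R2={adj_r2:.3f} RMSE={rmse:.3f} "
      f"AIC={aic:.2f} AICc={aicc:.2f} BIC={bic:.2f}")
```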
Funding: Supported by the Science and Technology Development Project of Jilin Province, China (No. 20020503-2).
Abstract: A sequential statistical approach was applied to optimize the fermentation medium for epothilone (Epos) production by a mutant obtained by treating Polyangium cellulosum ATCC 15384 with nitrite and ultraviolet irradiation. The effects of different carbon and nitrogen sources on the fermentation medium were tested and suitable ones were selected. A uniform design was then used to lay out the experiments. A linear model was developed to identify the significant components of the fermentation medium, while a third-degree polynomial model was used to describe the relationship between the component concentrations and the yield of Epos (YEPs). A pattern search method was used to locate the optimum fermentation medium within the test space, which was as follows (g/L): potassium nitrate 8.00, soybean peptone 17.60, potassium hydrogen phosphate 1.00, beef extract 6.46, yeast extract 1.00, calcium chloride 0.25, sodium chloride 1.00, and ferric chloride 0.02. The optimum medium was predicted to give an Epos yield (YEPs) of 2.48 mg/L. Validation experiments with the optimum medium were performed in triplicate, and the average Epos yield was 2.45 mg/L, 7.78 times higher than that obtained without optimization.
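The modeling-plus-search step can be sketched as follows: a third-degree polynomial response model is fitted by least squares to hypothetical uniform-design data for two medium components, and the fitted surface is then maximized inside the tested range. The data, the choice of two components, and the use of Nelder-Mead as a stand-in for the paper's pattern search are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical uniform-design results: concentrations (g/L) of two medium
# components and the measured Epos yield (mg/L). The real design used more
# components and the published data are not reproduced here.
x1 = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0])   # e.g. soybean peptone
x2 = np.array([1.0, 7.0, 3.0, 9.0, 5.0, 11.0, 2.0, 6.0])      # e.g. potassium nitrate
y  = np.array([0.6, 1.3, 1.1, 2.1, 1.6, 2.0, 1.2, 1.8])       # yield

# Third-degree polynomial response model in each factor (no cross terms, for brevity).
def design(a, b):
    return np.column_stack([np.ones_like(a),
                            a, a**2, a**3,
                            b, b**2, b**3])

coef, *_ = np.linalg.lstsq(design(x1, x2), y, rcond=None)

def predicted_yield(v):
    a, b = v
    return (design(np.array([a]), np.array([b])) @ coef)[0]

# Direct search for the optimum inside the tested space. Nelder-Mead is used
# here as a stand-in for the pattern search method cited in the paper.
lo, hi = np.array([2.0, 1.0]), np.array([16.0, 11.0])
res = minimize(lambda v: -predicted_yield(np.clip(v, lo, hi)),
               x0=[8.0, 6.0], method="Nelder-Mead")
best = np.clip(res.x, lo, hi)
print("optimum (g/L):", best, "predicted yield (mg/L):", predicted_yield(best))
```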
Abstract: A body frame composed of thin sheet metal is a crucial structure that determines the safety performance of a vehicle. Designing an automotive body with appropriate weight and high performance is an emerging engineering problem. To improve the performance of the automotive frame, we attempt to reconstruct its design criteria using statistical and mechanical approaches. First, a fundamental study of frame strength is conducted, and a cross-sectional shape optimization problem is formulated for designing the cross-section of an automobile frame with very high mass efficiency for strength. Shape optimization is carried out using the nonlinear finite element method and a meta-modeling-based genetic algorithm. The resulting set of optimal designs is analyzed to identify the dominant design variables using smoothing spline analysis of variance, principal component analysis, and the self-organizing map technique. The relationship between the cross-sectional shape and the objective function is also analyzed by hierarchical clustering. A design guideline is obtained from the results of these statistical analyses. The statistically obtained design guideline is compared with the conventional one based on the designers' experience, using a mechanical interpretation of the optimal cross-sectional frame. Finally, a mechanically reasonable, general-purpose design guideline is proposed for the cross-sectional shape of the automotive frame.
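The data-analysis stage pairs dimensionality reduction with clustering of the optimal designs. The sketch below applies PCA (via SVD) and Ward hierarchical clustering to random stand-in design-variable data; the actual optimization results, the smoothing spline ANOVA, and the self-organizing map step are not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Stand-in data: each row is one optimal design, each column a cross-sectional
# design variable (the real optimization results are not reproduced here).
rng = np.random.default_rng(0)
designs = rng.normal(size=(60, 8))

# Principal component analysis via SVD on the centered data.
centered = designs - designs.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
scores = centered @ Vt.T            # coordinates of each design in PC space

# Hierarchical clustering of the designs in the first two PCs, as a simple
# analogue of the grouping of cross-sectional shapes described in the paper.
Z = linkage(scores[:, :2], method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")

print("variance explained by PC1, PC2:", np.round(explained[:2], 3))
print("cluster sizes:", np.bincount(labels)[1:])
```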
Funding: Supported by the T.C. Ministry of Science, Industry and Technology within the scope of the SAN-TEZ Project (No. 0477.STZ.2013-2), with the partners Yildirim Beyazit University and Akgün Software.
Abstract: INTRODUCTION: The front portion of the eye consists of a transparent layer called the cornea. The cornea is an important optical component of vision and plays a role in the refraction of the eye. The cornea is normally convex, but the amount of protrusion increases progressively in patients with keratoconus; in other words, the cornea bulges forward. Keratoconus is a bilateral, typically asymmetric, non-inflammatory degeneration of the cornea characterized by corneal protrusion resulting from progressive thinning of the corneal stroma. Corneal thinning generally occurs in the inferior, inferotemporal, or central regions of the cornea.
Funding: Supported by Northern Border University, Arar, KSA, through project number "NBU-FFR-2024-2248-02".
Abstract: This paper presents a statistical method for assessing the performance of salient Mobile Ad Hoc Network (MANET) routing protocols: Destination Sequenced Distance Vector (DSDV), Ad hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR), and Zone Routing Protocol (ZRP). The evaluation is carried out with a full set of statistical tests, including the Kruskal-Wallis, Mann-Whitney, and Friedman tests, to examine systematically how the performance of these protocols varies with the number of nodes and the mobility pattern. The study is based on the Quality of Service (QoS) metrics of throughput, packet delivery ratio, and end-to-end delay, which characterize the operational efficiency of each protocol under different network scenarios. The findings reveal significant differences in performance among the routing protocols, so decisions about protocol selection and optimization can be made according to the requirements of a given network. The paper advances the general understanding of routing dynamics in MANETs and contributes to the strategic deployment of robust and efficient network infrastructures.
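The three named tests are available in scipy.stats; the sketch below applies them to hypothetical throughput samples for the four protocols. The sample values and sample sizes are stand-ins for simulator output.

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu, friedmanchisquare

rng = np.random.default_rng(1)
# Hypothetical throughput samples (kbps) for four protocols over repeated
# simulation runs; real results would come from network-simulator traces.
throughput = {
    "DSDV": rng.normal(310, 25, size=12),
    "AODV": rng.normal(355, 25, size=12),
    "DSR":  rng.normal(340, 25, size=12),
    "ZRP":  rng.normal(330, 25, size=12),
}

# Kruskal-Wallis: do the four protocols differ overall?
H, p_kw = kruskal(*throughput.values())
print(f"Kruskal-Wallis H={H:.2f}, p={p_kw:.4f}")

# Mann-Whitney U: pairwise follow-up between two protocols.
U, p_mw = mannwhitneyu(throughput["AODV"], throughput["DSDV"], alternative="two-sided")
print(f"Mann-Whitney AODV vs DSDV U={U:.1f}, p={p_mw:.4f}")

# Friedman: same protocols measured across matched scenarios (runs as blocks).
chi2, p_fr = friedmanchisquare(*throughput.values())
print(f"Friedman chi2={chi2:.2f}, p={p_fr:.4f}")
```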
Abstract: Texture analysis methods have been used in a variety of applications, for instance in remote sensing. Although widely used in electrical engineering, texture analysis is still little applied in the atmospheric sciences. This paper reviews some concepts of digital texture and the statistical texture approach, applying them to a set of specific maps to analyze the correlation between the texture measurements used in most papers. An improvement of the method is also proposed, by letting the distance parameter vary freely and by introducing a new texture measurement based on the Kullback-Leibler divergence. Eight statistical measurements were used: mean, contrast, standard deviation, cluster shade, cluster prominence, angular second moment, local homogeneity, and Shannon entropy. These measurements were applied to simple maps and to a set of rainfall fields measured with weather radar. The results indicate some high correlations, e.g. between the mean and the contrast, or among the angular second moment, local homogeneity, and Shannon entropy, as well as the potential of the method to discriminate maps.
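A minimal numpy sketch of the statistical texture approach: build a gray-level co-occurrence matrix for a chosen displacement and compute a few of the listed measurements (contrast, angular second moment, local homogeneity, Shannon entropy). The quantized random field stands in for a radar rainfall map, and the displacement and number of gray levels are arbitrary choices.

```python
import numpy as np

def glcm(field, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for displacement (dx, dy)."""
    P = np.zeros((levels, levels))
    rows, cols = field.shape
    for i in range(rows):
        for j in range(cols):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < rows and 0 <= j2 < cols:
                P[field[i, j], field[i2, j2]] += 1
    return P / P.sum()

# Stand-in "map": a quantized random rainfall-like field (8 gray levels).
rng = np.random.default_rng(2)
field = np.digitize(rng.gamma(2.0, 2.0, size=(64, 64)),
                    bins=np.linspace(0, 16, 8)[1:])   # values 0..7

P = glcm(field, dx=1, dy=0, levels=8)
i, j = np.indices(P.shape)

contrast    = np.sum(P * (i - j) ** 2)
asm         = np.sum(P ** 2)                         # angular second moment
homogeneity = np.sum(P / (1.0 + (i - j) ** 2))       # local homogeneity
entropy     = -np.sum(P[P > 0] * np.log2(P[P > 0]))  # Shannon entropy

print(f"contrast={contrast:.3f} ASM={asm:.4f} "
      f"homogeneity={homogeneity:.3f} entropy={entropy:.3f}")
```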
Funding: Supported by the Special Scientific Research for the Selection and Cultivation of Excellent Young Teachers in Shanghai Universities (YYY11076).
Abstract: In this paper, an improved nonlinear process fault detection method is proposed based on modified kernel partial least squares (KPLS). By integrating the statistical local approach (SLA) into the KPLS framework, two new statistics are established to monitor changes in the underlying model. The new modeling strategy avoids the Gaussian distribution assumption of KPLS. A further advantage of the proposed method is that the kernel latent variables can be obtained directly through eigenvalue decomposition instead of iterative calculation, which improves the computing speed. The new method is applied to fault detection on the Tennessee Eastman process simulation benchmark. The simulation results show superior detection sensitivity and accuracy in comparison with KPLS monitoring.
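The non-iterative extraction of kernel latent variables can be sketched as follows, assuming the common KPLS formulation in which the first score vector is the dominant eigenvector of the centered kernel matrix multiplied by YYᵀ; the paper's SLA-based monitoring statistics are not reproduced. The sketch also runs an iterative NIPALS-style update for comparison.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))                   # process variables (stand-in data)
Y = (X[:, :2] ** 2).sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(100, 1))

def rbf_kernel(X, gamma=0.2):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

n = X.shape[0]
K = rbf_kernel(X)
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H                                   # centered kernel matrix

# Direct route: first KPLS score as the dominant eigenvector of Kc @ Y @ Y.T
# (assumed formulation, used here to illustrate "eigendecomposition instead
# of iterative calculation").
vals, vecs = np.linalg.eig(Kc @ Y @ Y.T)
t_eig = np.real(vecs[:, np.argmax(np.real(vals))])
t_eig /= np.linalg.norm(t_eig)

# Iterative NIPALS-style route, for comparison.
t = rng.normal(size=n)
for _ in range(200):
    t_new = Kc @ Y @ (Y.T @ t)
    t_new /= np.linalg.norm(t_new)
    if np.linalg.norm(t_new - np.sign(t_new @ t) * t) < 1e-10:
        t = t_new
        break
    t = t_new

print("agreement |cos angle|:", abs(t_eig @ t))   # should be close to 1.0
```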
Funding: Supported by the National Natural Science Foundation of China (Nos. 40801212 and 49971064), the Foundation for the China Geological Survey (No. 200316000035), the Natural Science Foundation of Jiangsu Higher Education Institutions of China (No. 06KJB170063), and the Opening Fund of the State Key Laboratory of Geohazard Prevention and Geoenvironment Protection, Chengdu University of Technology, China (No. GZ2007-11).
Abstract: A detailed landslide-susceptibility map was produced using a data-driven, objective bivariate analysis method with datasets developed for a geographic information system (GIS). Known as one of the most landslide-prone areas in China, the Zhongxian-Shizhu segment in the Three Gorges Reservoir region was selected as a suitable case because of the frequency and distribution of landslides there. The site covers an area of 260.93 km², of which 5.32 km² is affected by landslides. Four data domains were used in this study: remote sensing products, thematic maps, geological maps, and topographical maps, all with 25 m × 25 m pixels. Statistical relationships for landslide susceptibility were developed from the landslide and landslide-causative-factor databases. All continuous variables were converted to categorical variables according to the percentile divisions of seed cells, and the corresponding class weight values were calculated and summed to create the susceptibility map. According to the map, 3.6% of the study area is of high susceptibility; extremely low-, very low-, low-, and medium-susceptibility zones cover 19.66%, 31.69%, 27.95%, and 17.1% of the area, respectively. The high- and medium-susceptibility zones lie along both sides of the Yangtze River, in agreement with the actual distribution of landslides.
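The seed-cell weighting step can be illustrated with a single causative factor: continuous values are classified by percentile divisions of the seed (landslide) cells, and each class receives a weight. A simple landslide-density ratio is used below as a stand-in for the paper's exact weighting formula; in the full method the class weights of all causative factors are summed cell by cell.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells = 20000
slope = rng.uniform(0, 60, size=n_cells)             # continuous causative factor (degrees)
# Stand-in landslide inventory: steeper cells are more likely to be seed cells.
landslide = rng.random(n_cells) < (0.01 + 0.04 * slope / 60)

# Class boundaries from percentile divisions of the seed cells (landslide cells).
edges = np.percentile(slope[landslide], [0, 25, 50, 75, 100])
classes = np.clip(np.digitize(slope, edges[1:-1]), 0, 3)

# A simple density-ratio class weight (stand-in for the paper's weighting scheme):
# landslide density in the class relative to the density over the whole area.
overall_density = landslide.mean()
weights = np.array([landslide[classes == c].mean() / overall_density
                    for c in range(4)])

# Susceptibility contribution of this single factor; the full method sums the
# class weights of all causative factors per cell.
susceptibility = weights[classes]
print("class weights:", np.round(weights, 2))
print("mean susceptibility in landslide cells    :", round(susceptibility[landslide].mean(), 2))
print("mean susceptibility in non-landslide cells:", round(susceptibility[~landslide].mean(), 2))
```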
Abstract: A landslide susceptibility map delineates the potential zones of landslide occurrence. This paper presents a statistical approach based on spatial data analysis in GIS for landslide susceptibility mapping in parts of the Sikkim Himalaya. Six important causative factors for landslide occurrence were selected, and corresponding thematic data layers were prepared in GIS. Topographic maps, satellite imagery, field data, and published maps constitute the input data for thematic layer preparation. Numerical weights for the different categories of these factors were determined with a statistical approach, and the weighted thematic layers were integrated in the GIS environment to generate the landslide susceptibility map of the area. The map classifies the area into five landslide-susceptibility zones: very high, high, moderate, low, and very low. The map was validated using the existing landslide distribution in the area.
Funding: Supported by the NIH under Grant No. 1R01HG004908-01, the NSF of the USA under Grant No. DBI-0845685 (YY), and the Gordon and Betty Moore Foundation for the Community Cyberinfrastructure for Marine Microbial Ecological Research and Analysis (CAMERA) Project (JW).
Abstract: Metagenomics is the study of microbial communities sampled directly from their natural environment, without prior culturing. By enabling the analysis of populations that include many so-far unculturable and often unknown microbes, metagenomics is revolutionizing the field of microbiology and has excited researchers in many disciplines that could benefit from the study of environmental microbes, including ecology, environmental sciences, and biomedicine. Specific computational and statistical tools have been developed for metagenomic data analysis and comparison. New studies, however, have revealed various kinds of artifacts in metagenomic data, caused by limitations in the experimental protocols and/or inadequate data analysis procedures, which often lead to incorrect conclusions about a microbial community. Here, we review some of these artifacts, such as overestimation of species diversity and incorrect estimation of gene family frequencies, and discuss emerging computational approaches to address them. We also review potential challenges that metagenomics may encounter with the extensive application of next-generation sequencing (NGS) techniques.
Funding: Supported by the National Key Research and Development Program of China (No. 2017YFA0605003) and the National Natural Science Foundation of China (No. 41521003).
Abstract: Nutrient criteria provide a scientific foundation for the comprehensive evaluation, prevention, control, and management of water eutrophication. In this review, the literature was examined to systematically evaluate the benefits, drawbacks, and applications of statistical analysis, paleolimnological reconstruction, stressor-response models, and model inference approaches for determining nutrient criteria. The developments and challenges in the determination of nutrient criteria for lakes and reservoirs are presented. Reference lakes can reflect the original states of lakes, but reference sites are often unavailable. With the paleolimnological reconstruction method, it is often difficult to reconstruct the historical nutrient conditions of shallow lakes whose sediments are easily disturbed. The model inference approach requires sufficient data to identify appropriate equations and characterize a waterbody or group of waterbodies, which increases the difficulty of establishing nutrient criteria. The stressor-response model is a promising direction for nutrient criteria determination, and the mechanisms underlying stressor-response models should be studied further. Based on studies of the relationships among water ecological criteria, eutrophication, nutrient criteria, and plankton, methods for determining nutrient criteria should be closely integrated with water management requirements.
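As an example of the stressor-response approach, the sketch below regresses log chlorophyll-a on log total phosphorus across hypothetical lakes and inverts the fitted relationship at an assumed chlorophyll-a target to obtain a candidate total-phosphorus criterion; the data and the target value are illustrative only.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(5)
# Hypothetical lake survey: total phosphorus (stressor, ug/L) and
# chlorophyll-a (response, ug/L); real criteria work would use monitoring data.
tp = rng.uniform(5, 200, size=80)
chla = 0.4 * tp**0.9 * np.exp(rng.normal(0, 0.3, size=80))

# Log-log stressor-response regression: log(chl-a) = a + b * log(TP).
fit = linregress(np.log10(tp), np.log10(chla))

# Invert the fitted relationship at a chosen response threshold (a chlorophyll-a
# target of 10 ug/L, assumed for illustration) to get a candidate TP criterion.
chla_target = 10.0
log_tp_criterion = (np.log10(chla_target) - fit.intercept) / fit.slope
print(f"slope={fit.slope:.2f}, r^2={fit.rvalue**2:.2f}, "
      f"candidate TP criterion ~ {10**log_tp_criterion:.1f} ug/L")
```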
Funding: Supported by the National Natural Science Foundation of China (No. 21290191).
Abstract: We introduce here a work package for a National Natural Science Foundation of China Major Project. We propose to develop computational methodology, starting from the theory of electronic excitation processes, for predicting the opto-electronic properties of organic materials, in close collaboration with experiments. By developing methods for electron dynamics that account for superexchange electronic couplings, spin-orbit coupling elements between excited states, electron-phonon relaxation, and intermolecular Coulomb and exchange terms, and by combining them with statistical physics approaches including dynamic Monte Carlo, the Boltzmann transport equation, and Boltzmann statistics, we aim to predict macroscopic properties of opto-electronic materials such as light-emitting efficiency, charge mobility, and exciton diffusion length. Experimental synthesis and characterization of D-A type ambipolar transport materials, as well as novel carbon-based materials, will provide a test ground for verification of the theory.
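As a toy illustration of the dynamic Monte Carlo ingredient, the sketch below runs a one-dimensional kinetic Monte Carlo hopping simulation with Marcus-type rates and estimates a charge mobility from the drift velocity. The Marcus rate expression and all parameter values are assumptions for illustration and are not taken from the project described above.

```python
import numpy as np

# A minimal 1D kinetic Monte Carlo sketch of hopping charge transport,
# assuming Marcus-type rates between neighboring sites; parameter values
# are illustrative only.
kB_T = 0.025            # eV, room temperature
lam = 0.2               # eV, reorganization energy (assumed)
J = 0.01                # eV, electronic coupling (assumed)
a = 1.0e-7              # cm, intersite distance
F = 1.0e5               # V/cm, applied electric field
hbar = 6.582e-16        # eV*s

def marcus_rate(dE):
    """Marcus hopping rate for a site-energy difference dE (eV)."""
    return (2 * np.pi / hbar) * J**2 / np.sqrt(4 * np.pi * lam * kB_T) \
           * np.exp(-(dE + lam)**2 / (4 * lam * kB_T))

dE_forward = -F * a     # the field lowers the energy of the next site downfield
dE_backward = +F * a
k_f, k_b = marcus_rate(dE_forward), marcus_rate(dE_backward)

rng = np.random.default_rng(6)
x, t = 0.0, 0.0
for _ in range(200000):
    k_tot = k_f + k_b
    t += -np.log(rng.random()) / k_tot           # waiting time until the next hop
    x += a if rng.random() < k_f / k_tot else -a

mobility = (x / t) / F                            # drift velocity over field
print(f"estimated hopping mobility ~ {mobility:.2e} cm^2/(V s)")
```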
Abstract: A design-of-experiments methodology is used to develop a statistical model for predicting the hydrodynamics of a liquid-solid circulating fluidized bed. To illustrate the multilevel factorial design approach, a step-by-step methodology is followed to study the effects of interactions among the independent factors on the performance variables. A multilevel full factorial design with three levels for two of the factors and five levels for the third factor was studied. Various statistical models, including linear, two-factor interaction, quadratic, and cubic models, were tested. The model was developed to predict the responses, viz., average solids holdup and solids circulation rate. The validity of the developed regression model was verified using analysis of variance, and the model was compared with an experimental dataset to assess its adequacy and reliability. The detailed statistical design methodology for the nonlinear systems considered here provides an important tool for cost-effective design and optimization.
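A compact sketch of the multilevel factorial workflow: generate a 3 × 3 × 5 full factorial design, fit a quadratic regression model by least squares to a stand-in response, and check the regression with an F-test (analysis of variance). The factor names, the response, and the noise level are assumptions for illustration.

```python
import numpy as np
from itertools import product
from scipy.stats import f as f_dist

# Full factorial design: 3 x 3 x 5 levels of three hypothetical coded factors
# (e.g. liquid velocity, auxiliary flow, solids inventory; not the paper's values).
levels = [np.linspace(-1, 1, 3), np.linspace(-1, 1, 3), np.linspace(-1, 1, 5)]
X = np.array(list(product(*levels)))

rng = np.random.default_rng(7)
# Stand-in response (e.g. average solids holdup) with a nonlinear dependence.
y = 0.12 + 0.05*X[:, 0] + 0.03*X[:, 1] + 0.04*X[:, 2] \
    - 0.02*X[:, 0]*X[:, 2] + 0.03*X[:, 2]**2 + rng.normal(0, 0.005, len(X))

def quadratic_terms(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

A = quadratic_terms(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Analysis of variance for the regression: F = (SSR/p) / (SSE/(n-p-1)).
n, p = len(y), A.shape[1] - 1
ssr = np.sum((y_hat - y.mean())**2)
sse = np.sum((y - y_hat)**2)
F = (ssr / p) / (sse / (n - p - 1))
p_value = f_dist.sf(F, p, n - p - 1)
print(f"quadratic model: R2={1 - sse/np.sum((y - y.mean())**2):.3f}, "
      f"F={F:.1f}, p={p_value:.2e}")
```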