This paper discusses the use of latent variable modeling in occupational health and safety research in the mining industry. Latent variable modeling, a statistical approach that relates observable and latent variables, can help researchers understand the underlying constructs, or hypothetical factors, that constitute a complex system, together with the magnitude of their effects. This enhanced understanding, in turn, can help identify the factors most important for improving mine safety. The most commonly used techniques include exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and structural equation modeling with latent variables (SEM). A critical comparison of the three techniques with respect to mine safety is provided, and possible applications of latent variable modeling in mining engineering are explored. The relevant research papers reviewed here suggest that such methods could prove useful in mine accident and safety research. The application of latent variable analysis within cognitive work analysis is proposed to improve the understanding of human-work relationships in mining operations.
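As a minimal illustration of the factor-analytic side of this toolbox, the sketch below runs an exploratory factor analysis on synthetic data: the correlation matrix is eigendecomposed and factors with eigenvalue above 1 are retained (Kaiser criterion). The two latent factors, the item loadings, and the "safety-climate" interpretation are all hypothetical; a real EFA would typically add factor rotation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey data: 200 responses on 6 safety-climate items,
# driven by 2 latent factors ("management commitment", "worker fatigue").
F = rng.normal(size=(200, 2))                  # latent factor scores
load = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                 [0.1, 0.8], [0.0, 0.7], [0.2, 0.9]])
X = F @ load.T + 0.3 * rng.normal(size=(200, 6))

# EFA sketch: eigendecompose the item correlation matrix and retain
# factors with eigenvalue > 1 (Kaiser criterion).
R = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
n_factors = int(np.sum(eigval > 1.0))
loadings = eigvec[:, :n_factors] * np.sqrt(eigval[:n_factors])
print(n_factors)          # the 2 planted latent factors are recovered
```

The estimated loadings can then be inspected to interpret which observed items load on which latent construct, which is the step that supports the substantive safety interpretation.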
Structured modeling is the most commonly used modeling method, but it does not adapt well to significant changes in environmental conditions. Therefore, Decision Variables Analysis (DVA), a new modeling method, is proposed to handle linear programming modeling in changing environments. In variant linear programming, the most complicated relationships are those among decision variables. DVA classifies the decision variables into different levels using different index sets and divides a model into separate elements, so that any change affects only part of the whole model. Because DVA accounts for the complicated relationships among decision variables at different levels, it can successfully solve modeling problems in dramatically changing environments.
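The idea of partitioning decision variables into levels with separate index sets can be sketched with a small linear program; the production/transport grouping and all coefficients below are invented for illustration, so that an environmental change (say, a new transport limit) touches only the constraint rows of one group.

```python
from scipy.optimize import linprog

# DVA-style setup sketch: decision variables grouped by index sets
# (level 1: production x1, x2; level 2: transport x3), so a change in the
# environment only touches the constraint rows of one group.
c = [-3.0, -5.0, 2.0]                 # maximize 3*x1 + 5*x2 - 2*x3
A_ub = [[1.0, 2.0, 0.0],              # level-1 capacity constraint
        [0.0, 0.0, 1.0]]              # level-2 transport limit
b_ub = [14.0, 6.0]
A_eq = [[1.0, 1.0, -1.0]]             # everything produced is shipped
b_eq = [0.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
print(res.x)   # optimum x = (0, 6, 6)
```

If the environment changes, only b_ub[1] (the level-2 row) needs to be edited and the model re-solved; the level-1 rows are untouched, which is the modularity DVA aims for.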
Large-scale Language Models (LLMs) have achieved significant breakthroughs in Natural Language Processing (NLP), driven by the pre-training and fine-tuning paradigm. While this approach allows models to specialize in specific tasks with reduced training costs, the substantial memory requirements during fine-tuning present a barrier to broader deployment. Parameter-Efficient Fine-Tuning (PEFT) techniques, such as Low-Rank Adaptation (LoRA), and parameter quantization methods have emerged as solutions that optimize memory usage and computational efficiency. Among these, QLoRA, which combines PEFT and quantization, has demonstrated notable success in reducing memory footprints during fine-tuning, prompting the development of various QLoRA variants. Despite these advancements, the quantitative impact of key variables on the fine-tuning performance of quantized LLMs remains underexplored. This study presents a comprehensive analysis of these key variables, focusing on their influence across different layer types and depths within LLM architectures. Our investigation uncovers several critical findings: (1) larger layers, such as MLP layers, can maintain performance despite reductions in adapter rank, while smaller layers, like self-attention layers, are more sensitive to such changes; (2) the effectiveness of balancing factors depends more on their specific values than on layer type or depth; (3) in quantization-aware fine-tuning, larger layers can effectively utilize smaller adapters, whereas smaller layers struggle to do so. These insights suggest that layer type is a more significant determinant of fine-tuning success than layer depth when optimizing quantized LLMs. Moreover, for the same reduction in trainable parameters, reducing the trainable parameters in a larger layer preserves fine-tuning accuracy better than doing so in a smaller one. This study provides valuable guidance for more efficient fine-tuning strategies and opens avenues for further research into optimizing LLM fine-tuning in resource-constrained environments.
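The memory argument for LoRA-style adapters can be made concrete with a few lines of NumPy: a frozen weight W is adapted by a rank-r update B @ A, so only r*(d_in + d_out) parameters are trained instead of d_in*d_out. The dimensions and scaling factor below are illustrative, not the configurations studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# LoRA sketch: frozen weight W plus a trainable rank-r update (alpha/r) * B @ A.
d_out, d_in, r, alpha = 4096, 4096, 8, 16
W = rng.normal(size=(d_out, d_in))           # frozen pre-trained weight
A = rng.normal(size=(r, d_in)) * 0.01        # trainable, small random init
B = np.zeros((d_out, r))                     # trainable, zero init => W_eff == W at start

def forward(x):
    # Effective weight W + (alpha/r) * B @ A, applied without materializing it.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = d_out * d_in
lora_params = r * (d_in + d_out)
print(lora_params / full_params)   # 1/256 = 0.00390625, ~0.4% of full
```

The same accounting explains the paper's "same discount of trainable parameters" comparison: shrinking r in a large (MLP-sized) layer removes many more parameters per unit of rank than in a small attention projection.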
An integrated framework is presented to represent and classify process data for on-line identification of abnormal operating conditions. It is based on pattern recognition principles and consists of a feature extraction step, in which wavelet transforms and principal component analysis are used to capture the inherent characteristics of process measurements, followed by a similarity assessment step using a hidden Markov model (HMM) for pattern comparison. In most previous work, a fixed-length moving window was employed to track dynamic data; it often failed to capture enough information for each fault and sometimes even degraded diagnostic performance. A variable moving window, whose length is modified over time, is introduced in this paper, and case studies on the Tennessee Eastman process illustrate the potential of the proposed method.
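A variable moving window combined with PCA-based feature extraction might look like the following sketch; the data are random placeholders and the window-growth schedule is invented. The point is that later windows see more samples and hence more evidence about the fault.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 10))          # hypothetical process measurements

def window_features(data, t, length, n_pc=3):
    """PCA scores of the last `length` samples ending at time t."""
    win = data[max(0, t - length):t]
    centered = win - win.mean(axis=0)
    # principal components from the SVD of the centered window
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_pc].T          # projection on first n_pc PCs

# Variable moving window: start short after a suspected fault and let the
# window grow so later similarity assessments see more evidence.
for t, length in [(100, 20), (110, 30), (120, 40)]:
    feats = window_features(data, t, length)
    print(t, feats.shape)
```

In the full framework these score sequences would be fed to an HMM for likelihood-based pattern comparison, which is omitted here.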
Inferential models are widely used in the chemical industry to infer key process variables, which are challenging or expensive to measure, from other more easily measured variables. The aim of this paper is three-fold: to present a theoretical review of some well-known linear inferential modeling techniques, to enhance the predictive ability of the regularized canonical correlation analysis (RCCA) method, and to compare the performances of these techniques while highlighting some of the practical issues that can affect their predictive abilities. The inferential modeling techniques considered in this study include full-rank modeling techniques, such as ordinary least squares (OLS) regression and ridge regression (RR), and latent variable regression (LVR) techniques, such as principal component regression (PCR), partial least squares (PLS) regression, and regularized canonical correlation analysis (RCCA). The theoretical analysis shows that the loading vectors used in LVR modeling can be computed by solving eigenvalue problems. For the RCCA method, we also show that optimizing the regularization parameter yields an improvement in prediction accuracy over the other modeling techniques. To illustrate the performances of all inferential modeling techniques, a comparative analysis was performed on two simulated examples, one using synthetic data and the other using simulated distillation column data. All techniques are optimized and compared by computing the cross-validation mean square error on unseen testing data. The results of this comparative analysis show that scaling the data helps improve the performance of all modeling techniques, and that the LVR techniques outperform the full-rank ones. One reason for this advantage is that the LVR techniques improve the conditioning of the model by discarding the latent variables (or principal components) with small eigenvalues, which also reduces the effect of noise on the model prediction. The results also show that PCR and PLS have comparable performances, and that RCCA can provide an advantage by optimizing its regularization parameter.
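The conditioning argument for LVR can be demonstrated with a toy principal component regression that drops small-eigenvalue components; the collinear data below are synthetic and the eigenvalue threshold is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Collinear inputs: x3 is almost a copy of x1, so OLS is ill-conditioned.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + 1e-3 * rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 - x2 + 0.1 * rng.normal(size=n)

# PCR sketch: keep only the principal components with non-negligible
# eigenvalues, then regress y on those latent-variable scores.
Xc, yc = X - X.mean(axis=0), y - y.mean()
eigval, eigvec = np.linalg.eigh(Xc.T @ Xc / n)
keep = eigval > 1e-4 * eigval.max()        # discard small-eigenvalue PCs
T = Xc @ eigvec[:, keep]                   # latent-variable scores
beta = np.linalg.lstsq(T, yc, rcond=None)[0]
print(keep.sum(), "components kept of", X.shape[1])
```

The near-duplicate direction (x1, x3) produces a tiny eigenvalue that would amplify noise under OLS; discarding it leaves a well-conditioned two-component regression.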
Feasibility analysis of soft constraints on input and output variables is critical for model predictive control (MPC). When an infeasible situation is encountered, the constraints must be adjusted in some way to guarantee that an optimal control law exists. For MPC integrated with a soft sensor, additionally considering soft constraints on critical variables makes feasibility analysis and constraint adjustment more complicated and difficult. The main contributions of this paper are a linear programming approach for feasibility analysis, together with the corresponding constraint adjustment method and procedure. The feasibility analysis considers the manipulated, secondary, and critical variables, as well as the increments of the manipulated variables. The feasibility analysis and constraint adjustment are conducted throughout the entire control process and guarantee the existence of an optimal control law. Finally, a simulation case confirms the contributions of this paper.
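A linear-programming feasibility check of soft constraints can be sketched as a slack-minimization problem; the one-input, one-output plant below is a deliberately trivial stand-in for an MPC problem, and is not the formulation of the paper.

```python
from scipy.optimize import linprog

# Feasibility sketch: output y = 2*u must satisfy 8 <= y <= 10 while the
# input is limited to 0 <= u <= 3 -> infeasible (max y = 6).  Introduce
# slacks s_lo, s_hi on the output bounds and minimize their sum.
# Variables: [u, s_lo, s_hi]
c = [0.0, 1.0, 1.0]                   # minimize total constraint relaxation
A_ub = [[-2.0, -1.0, 0.0],            # 8 - s_lo <= 2*u  ->  -2u - s_lo <= -8
        [ 2.0,  0.0, -1.0]]           # 2*u <= 10 + s_hi ->  2u - s_hi <= 10
b_ub = [-8.0, 10.0]
bounds = [(0.0, 3.0), (0.0, None), (0.0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)   # u = 3, s_lo = 2: the lower output bound must be relaxed by 2
```

A strictly positive optimal slack signals that the original soft constraints are infeasible, and its value tells the constraint-adjustment step exactly how far each bound must be relaxed.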
The use of a crop model like STICS for appropriate management decision support requires good knowledge of all the parameters of the model. Among them, the soil parameters are difficult to know at each point of interest, and costly techniques may be needed to measure them. It is therefore important to know which soil parameters need to be determined: those which significantly affect the output variables deserve an accurate determination, while those which only slightly affect them do not. This paper demonstrates how a global sensitivity analysis method based on variance decomposition can be applied to soil parameters in order to divide them into these two categories. The Extended FAST method, applied to the crop model STICS and a set of 13 soil parameters, first allows calculation of the part of the variance explained by each soil parameter (giving the global sensitivity indices of the soil parameters) and the coefficient of variation of the output variables (measuring the effect of parameter uncertainty on each variable). These metrics are then used to decide on the importance of measuring each parameter value. Some output variables (Leaf Area Index and chlorophyll content) are evaluated at different stages of interest, while others (crop yield, grain protein content, soil mineral nitrogen) are evaluated at harvest. The analysis is applied to two different annual crops (wheat and sugar beet), two contrasting weather datasets, and two soil depths. When the uncertainty of the output generated by the soil parameters is large (coefficient of variation > 1/3), only the parameters having a significant global sensitivity index (higher than 10%) are retained. The results show that the number of soil parameters which deserve an accurate determination can be significantly reduced by this method for appropriate management decision support.
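A crude variance-based sensitivity estimate (a simplified Monte Carlo stand-in for Extended FAST) can be computed by binning samples and comparing the variance of conditional means to the total variance; the two "soil parameters" and the linear crop response below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# First-order sensitivity sketch: S_i = Var(E[Y|X_i]) / Var(Y),
# estimated by binning X_i and averaging Y within each bin.
def first_order_index(x, y, n_bins=20):
    bins = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(x, bins[1:-1]), 0, n_bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    return cond_means.var() / y.var()

n = 20000
depth = rng.uniform(0.5, 2.0, n)       # hypothetical soil depth (m)
stone = rng.uniform(0.0, 0.4, n)       # hypothetical stone content
yield_ = 4.0 * depth + 0.2 * stone + rng.normal(0, 0.1, n)

s_depth = first_order_index(depth, yield_)
s_stone = first_order_index(stone, yield_)
print(round(s_depth, 2), round(s_stone, 2))   # depth dominates
```

Under a 10% retention threshold as in the paper, only the depth-like parameter would warrant accurate field measurement in this toy setup.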
Massive rural-to-urban migration in China is consequential for political trust: rural-to-urban migrants have been found to hold lower levels of trust in local government than their rural peers who choose to stay in the countryside (means of 4.92 and 6.34 out of 10, respectively, p < 0.001). This article explores why migrants hold the level of political trust they do in their county-level government. Using data on rural-to-urban migrants from the China Family Panel Survey, this study applies hierarchical linear modeling (HLM) to unpack the multi-level explanatory factors of rural-to-urban migrants' political trust. Findings show that individual-level socio-economic characteristics and perceptions of government performance (Level 1), neighborhood-level characteristics, namely the physical and social status and environment of neighborhoods (Level 2), and the objective macroeconomic performance of county-level government (Level 3), work together to explain migrants' trust levels. These results suggest that the effects of neighborhood-level factors on rural-to-urban migrants' political trust merit policy and public management attention in rapidly urbanizing countries.
The solid and finite element models of a metal-pushing type continuously variable transmission are established at speed ratios of i = 0.5 and i = 2.0. To cope with the complicated structure, a node-rod discrete finite element model is put forward and the whole system is simplified accordingly. The natural frequencies and mode shapes of the system are solved by the iterative Lanczos reduction method for sensitivity analysis in the finite element model. The new method and its results can be used to improve the smoothness of the variable transmission system and provide a theoretical basis for reducing noise during operation.
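The Lanczos reduction at the heart of the eigenvalue solution can be sketched as follows; the random symmetric matrix stands in for the assembled system matrices, and full reorthogonalization is used for robustness on this small example.

```python
import numpy as np

rng = np.random.default_rng(0)

def lanczos(A, k):
    """Reduce symmetric A to a k x k tridiagonal matrix T (with full
    reorthogonalization for numerical robustness on this small example)."""
    n = A.shape[0]
    Q = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    q = rng.normal(size=n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # reorthogonalize
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Stand-in for the reduced stiffness/mass problem: eigenvalues of the
# k-step tridiagonal matrix approximate the extreme natural frequencies.
M = rng.normal(size=(8, 8))
A = M + M.T                      # symmetric test matrix
T = lanczos(A, 8)                # a full run reproduces the whole spectrum
print(np.allclose(np.sort(np.linalg.eigvalsh(T)),
                  np.sort(np.linalg.eigvalsh(A))))
```

In practice k is much smaller than n, and only the lowest few Ritz values (the natural frequencies of interest for smoothness and noise) are extracted from T.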
Habitat models have been widely used to manage marine species and analyze the relationships between species distributions and environmental factors. The predictive skill of a habitat model depends on whether it includes appropriate explanatory variables. Due to limited habitat range, low density, and low detection rates, the number of zero catches can be very large even in favorable habitats. Excessive zeroes increase the bias and uncertainty of habitat estimation. Therefore, appropriate explanatory variables need to be chosen first to prevent under- or overestimation of species abundance in habitat models. In addition, biotic variables such as prey data and the spatial autocovariate (SAC) of the target species are often ignored in species distribution models. We therefore evaluated the effects of input variables on the performance of generalized additive models (GAMs) under excessive zero catch (>70%). Five types of input variables were selected: (1) abiotic variables; (2) abiotic and biotic variables; (3) abiotic variables and SAC; (4) abiotic, biotic variables, and SAC; and (5) principal component analysis (PCA)-based abiotic and biotic variables and SAC. Belanger's croaker Johnius belangerii, one of the dominant demersal fish in Haizhou Bay with a large number of zero catches, was used for the case study. Results show that the PCA-based GAM incorporating abiotic and biotic variables and SAC was the most appropriate model to quantify the spatial distribution of the croaker. Biotic variables and SAC are important and should be incorporated as drivers when predicting species distributions. Our study suggests that the processing of input variables is critical to habitat modeling, as it can improve the performance of habitat models and enhance our understanding of the habitat suitability of target species.
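Constructing a spatial autocovariate (SAC) as an inverse-distance-weighted mean of neighbouring catches might look like this sketch; the station positions and catches are simulated, and the choice of k = 5 neighbours is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of building a spatial autocovariate: for each station, the
# inverse-distance-weighted mean of catches at its k nearest neighbours.
coords = rng.uniform(0, 50, size=(100, 2))        # hypothetical station positions (km)
catch = rng.poisson(0.5, size=100).astype(float)  # zero-inflated catches

def sac(coords, values, k=5):
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    np.fill_diagonal(d, np.inf)                # exclude the station itself
    out = np.empty(len(values))
    for i in range(len(values)):
        nb = np.argsort(d[i])[:k]              # k nearest neighbours
        w = 1.0 / d[i, nb]
        out[i] = np.sum(w * values[nb]) / np.sum(w)
    return out

sac_vals = sac(coords, catch)
print(sac_vals.shape)   # one autocovariate per station, usable as a GAM input
```

The resulting vector enters the GAM as one more smooth term, letting the model absorb residual spatial autocorrelation that abiotic predictors alone miss.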
Long-term, ground-based daily global solar radiation (DGSR) at Zhongshan Station in Antarctica can quantitatively reveal the basic characteristics of Earth's surface radiation balance and validate satellite data for the Antarctic region. The station was established in 1989, but conventional radiation observations started much later, in 2008. In this study, a random forest (RF) model for estimating DGSR is developed using ground meteorological observation data, and a high-precision, long-term DGSR dataset is constructed. The trend of DGSR from 1990 to 2019 at Zhongshan Station is then analyzed. The RF model, which performs better than the other models tested, shows desirable performance for DGSR hindcast estimation, with an R^2 of 0.984, a root-mean-square error of 1.377 MJ m^(-2), and a mean absolute error of 0.828 MJ m^(-2). The annual DGSR anomalies increase during 1990-2004 and then decrease after 2004. The maximum annual anomaly occurs around 2004/05 and is mainly related to the number of days with precipitation (especially those related to good weather during the polar day period) at this station. In addition to clouds and water vapor, bad weather conditions (such as snowfall, which results in low visibility and hence decreased sunshine duration and solar radiation) are the other major factors affecting solar radiation at this station. The high-precision, long-term estimated DGSR dataset enables further study and understanding of the role of Antarctica in global climate change and the interactions between snow, ice, and atmosphere.
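A random forest hindcast of DGSR from routine meteorological variables can be sketched with scikit-learn; the four predictors and the linear data-generating relationship below are invented stand-ins for the station's observations, not the real dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the station's predictors: sunshine duration,
# cloud cover, water-vapor pressure, visibility (all scaled to [0, 1]).
n = 2000
Xmet = rng.uniform(0, 1, size=(n, 4))
dgsr = (20 * Xmet[:, 0] - 8 * Xmet[:, 1] - 3 * Xmet[:, 2]
        + 2 * Xmet[:, 3] + rng.normal(0, 0.5, n))

X_tr, X_te, y_tr, y_te = train_test_split(Xmet, dgsr, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(round(model.score(X_te, y_te), 3))   # held-out R^2
```

Trained on the overlap period where both meteorology and radiation are observed, such a model can then hindcast DGSR back to years before the radiometer was installed.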
In this study, a three-dimensional (3D) finite element modelling (FEM) analysis is carried out to investigate the effects of soil spatial variability on the response of retaining walls and an adjacent box culvert due to a braced excavation. The spatial variability of soil stiffness is modelled using a variogram and calibrated against high-quality experimental data. Multiple random field samples (RFSs) of soil stiffness are generated using geostatistical analysis and mapped onto a finite element mesh for stochastic analysis of excavation-induced structural responses by Monte Carlo simulation. It is found that the spatial variability of soil stiffness can be described by an exponential variogram, with an associated vertical correlation length varying from 1.3 m to 1.6 m. The results also reveal that the spatial variability of soil stiffness has a significant effect on the variation of retaining wall deflections and box culvert settlements. Ignoring spatial variability in 3D FEM can result in an underestimation of lateral wall deflections and culvert settlements. Thus, the stochastic structural responses obtained from the 3D analysis could serve as an effective aid for probabilistic design and analysis of excavations.
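Generating one random field sample of soil stiffness from an exponential covariance model can be sketched via a Cholesky factorization; the 1-D mesh, variance, and mean stiffness below are assumptions, with the correlation length chosen near the 1.3-1.6 m range reported above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: 1-D random field of log-stiffness along depth with exponential
# covariance C(h) = sigma^2 * exp(-h / ell).
depths = np.arange(0.0, 20.0, 0.5)         # 0.5 m mesh down to 20 m
sigma, ell = 0.3, 1.5                      # std of ln(E), correlation length (m)
h = np.abs(depths[:, None] - depths[None, :])
C = sigma**2 * np.exp(-h / ell)

L = np.linalg.cholesky(C + 1e-10 * np.eye(len(depths)))  # tiny jitter for stability
ln_E = np.log(30e3) + L @ rng.normal(size=len(depths))   # assumed mean stiffness
E_field = np.exp(ln_E)                     # one random field sample (RFS)
print(E_field.shape)
```

Repeating the draw gives the ensemble of RFSs that, mapped onto the FEM mesh, drives the Monte Carlo statistics of wall deflection and culvert settlement.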
This paper develops a data-driven estimator of via depth in the deep reactive ion etching (DRIE) process based on statistical identification of key variables. Several feature extraction algorithms are presented to reduce the high-dimensional data and support the subsequent virtual metrology (VM) model building process. With the on-line VM model available, a model-based controller is readily applicable to improve via depth quality. Real operational data taken from an industrial manufacturing process are used to verify the effectiveness of the proposed method. The results demonstrate that the proposed method can decrease the MSE from 2.2×10^(-2) to 9×10^(-4) and has great potential for improving the existing DRIE process.
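Statistical identification of key variables followed by a linear virtual metrology fit can be sketched as below; the sensor channels and the two "true" drivers are simulated, not the process variables used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: rank hypothetical sensor channels by absolute correlation with
# the measured via depth, then fit a linear VM model on the top channels.
n, p = 300, 12
sensors = rng.normal(size=(n, p))          # e.g. RF power, pressure, gas flows
depth = 3.0 * sensors[:, 2] - 1.5 * sensors[:, 7] + 0.1 * rng.normal(size=n)

corr = np.abs([np.corrcoef(sensors[:, j], depth)[0, 1] for j in range(p)])
key = np.argsort(corr)[::-1][:2]           # statistically identified key variables
coef = np.linalg.lstsq(sensors[:, key], depth, rcond=None)[0]
mse = np.mean((sensors[:, key] @ coef - depth) ** 2)
print(sorted(key), round(mse, 4))
```

Once such a VM model predicts via depth on-line, a run-to-run controller can adjust recipe inputs between physical metrology samples.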
During high-speed train operation, the train speed changes frequently, so the bearing motion changes as a function of time. A dynamic model of a double-row tapered roller bearing system of a high-speed train under variable speed conditions is developed. The model takes into consideration the structural characteristics of the train bearing: one outer ring and two inner rings. The angle iteration method is used to determine the rotation angle of each roller within any time period, solving the difficult problem of locating the rollers. Outer ring and inner ring faults are captured by the model, and the model response is obtained under variable speed conditions. Experiments are carried out under the two fault conditions to validate the model. The simulation results are in good agreement with the analytical results, and the errors between the simulation and experimental results for outer and inner ring faults are 5.97% and 2.59%, respectively, demonstrating the effectiveness of the model. The influence of outer ring and inner ring faults on system stability is analyzed quantitatively using the Lempel-Ziv complexity. The results show that for low train acceleration the inner ring fault has a more significant effect on system stability, while for high acceleration the outer ring fault has a more significant effect; when the train acceleration changes, the outer ring fault has the greater influence. In practice, train acceleration is usually small and does not change frequently within one operation cycle, so the inner ring fault deserves more attention.
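The Lempel-Ziv complexity used in the stability analysis can be computed with the classic Kaspar-Schuster parsing of a binarized signal; the "vibration" strings below are illustrative, and in practice the signal would be binarized around its median first.

```python
def lz76_complexity(s):
    """Number of distinct phrases in the Lempel-Ziv (1976) parsing of s."""
    n = len(s)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:                 # no earlier match: new phrase starts
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

# A healthy (near-periodic) signal parses into few phrases; a faulty,
# impact-laden signal is less compressible and scores higher.
smooth = "01" * 32                       # binarized near-periodic vibration
faulty = "0110100110010110" * 4          # more irregular pattern
print(lz76_complexity(smooth), lz76_complexity(faulty))
```

Comparing these counts across fault types and acceleration levels is what supports the paper's quantitative ranking of inner-ring versus outer-ring effects on stability.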
When using the Data Envelopment Analysis (DEA) method to analyze the efficiency of different firms, it is common to measure all similar decision-making units (DMUs) together in order to obtain the efficiency value of each DMU. However, such an analysis easily neglects the source properties of individual DMUs: differences among DMUs may derive from different environmental backgrounds, e.g. environment variables such as economic conditions, laws and regulations, and political backgrounds. Applying the Metafrontier model can overcome the barriers resulting from these environment variables; it can analyze and measure the differences among DMUs with different source properties, and it can also measure the difference between each group of DMUs and the set of all DMUs. Therefore, this study adopts the DEA method, assuming variable returns to scale, to evaluate and comparatively analyze the business performance of the life insurance industries in Taiwan and China's Mainland based on BCC input orientation. When evaluating business performance, the operating management echelon is affected by uncontrollable external environment variables. Thus, this study applies Four-Stage Data Envelopment Analysis to examine the impact of environment variables on business performance and re-measures the business efficiency of the life insurance industries in Taiwan and China's Mainland after adjusting the input variables. The study period is from 2003 to 2005, and the research subjects comprise 43 companies in Taiwan and China's Mainland (19 in Taiwan and 24 in China's Mainland), giving 129 sets of sample data. The discount rate is assumed to be 3% in this paper, and the change in technical efficiency of each life insurance company over the accumulated years from 2003 to 2005 is calculated.
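An input-oriented BCC (variable-returns-to-scale) efficiency score can be computed per DMU by linear programming; the four "insurers" and their single input/output figures below are invented, and a real study would use multiple inputs and outputs.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented BCC efficiency sketch.
# Hypothetical data: 4 insurers, 1 input (operating expense), 1 output (premiums).
X = np.array([[2.0], [4.0], [6.0], [5.0]])   # inputs, shape (n_dmu, n_in)
Y = np.array([[1.0], [3.0], [4.0], [2.0]])   # outputs, shape (n_dmu, n_out)
n = len(X)

def bcc_efficiency(j0):
    # variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.vstack([
        np.c_[-X[j0], X.T],                       # sum_j lam_j*x_j <= theta*x_j0
        np.c_[np.zeros((Y.shape[1], 1)), -Y.T],   # sum_j lam_j*y_j >= y_j0
    ])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[j0]]
    A_eq = np.r_[0.0, np.ones(n)][None, :]        # VRS: lambdas sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

effs = [round(bcc_efficiency(j), 3) for j in range(n)]
print(effs)   # DMUs on the VRS frontier score 1.0
```

In a four-stage design, the inputs X would then be adjusted for environment-variable effects (e.g. via a regression on the first-stage slacks) before re-running these efficiency LPs.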
Funding (latent variable modeling in mine safety): Natural Sciences and Engineering Research Council of Canada (NSERC), ID: 236482.
Funding (quantized LLM fine-tuning): National Key R&D Program of China (No. 2021YFB0301200) and National Natural Science Foundation of China (No. 62025208).
Funding (abnormal operating condition identification): National High-Tech Program of China (No. 2001AA413110).
文摘Inferential models are widely used in the chemical industry to infer key process variables, which are challenging or expensive to measure, from other more easily measured variables. The aim of this paper is three-fold: to present a theoretical review of some of the well known linear inferential modeling techniques, to enhance the predictive ability of the regularized canonical correlation analysis (RCCA) method, and finally to compare the performances of these techniques and highlight some of the practical issues that can affect their predictive abilities. The inferential modeling techniques considered in this study include full rank modeling techniques, such as ordinary least square (OLS) regression and ridge regression (RR), and latent variable regression (LVR) techniques, such as principal component regression (PCR), partial least squares (PLS) regression, and regularized canonical correlation analysis (RCCA). The theoretical analysis shows that the loading vectors used in LVR modeling can be computed by solving eigenvalue problems. Also, for the RCCA method, we show that by optimizing the regularization parameter, an improvement in prediction accuracy can be achieved over other modeling techniques. To illustrate the performances of all inferential modeling techniques, a comparative analysis was performed through two simulated examples, one using synthetic data and the other using simulated distillation column data. All techniques are optimized and compared by computing the cross validation mean square error using unseen testing data. The results of this comparative analysis show that scaling the data helps improve the performances of all modeling techniques, and that the LVR techniques outperform the full rank ones. One reason for this advantage is that the LVR techniques improve the conditioning of the model by discarding the latent variables (or principal components) with small eigenvalues, which also reduce the effect of the noise on the model prediction. 
The results also show that PCR and PLS have comparable performances, and that RCCA can provide an advantage by optimizing its regularization parameter.
文摘Feasibility analysis of soft constraints for input and output variables is critical for model predictive control(MPC).When encountering the infeasible situation, some way should be found to adjust the constraints to guarantee that the optimal control law exists. For MPC integrated with soft sensor, considering the soft constraints for critical variables additionally makes it more complicated and difficult for feasibility analysis and constraint adjustment. Therefore, the main contributions are that a linear programming approach is proposed for feasibility analysis, and the corresponding constraint adjustment method and procedure are given as well. The feasibility analysis gives considerations to the manipulated, secondary and critical variables, and the increment of manipulated variables as well. The feasibility analysis and the constraint adjustment are conducted in the entire control process and guarantee the existence of optimal control. In final, a simulation case confirms the contributions in this paper.
Abstract: The use of a crop model like STICS for appropriate management decision support requires good knowledge of all the model parameters. Among them, the soil parameters are difficult to determine at each point of interest, and costly techniques may be needed to measure them. It is therefore important to know which soil parameters need to be determined: those that significantly affect the output variables deserve an accurate determination, while those that only slightly affect the model outputs do not. This paper demonstrates how a global sensitivity analysis method based on variance decomposition can be applied to soil parameters in order to divide them into these two categories. The Extended FAST method, applied to the crop model STICS and a set of 13 soil parameters, first allows the calculation of the part of the variance explained by each soil parameter (giving the global sensitivity indices of the soil parameters) and the coefficient of variation of the output variables (measuring the effect of parameter uncertainty on each variable). These metrics are then used to decide on the importance of measuring each parameter value. Some output variables (leaf area index and chlorophyll content) are evaluated at different stages of interest, while others (crop yield, grain protein content, soil mineral nitrogen) are evaluated at harvest. The analysis is applied to two different annual crops (wheat and sugar beet), two contrasting weather series, and two soil depths. When the uncertainty of the output generated by the soil parameters is large (coefficient of variation > 1/3), only the parameters having a significant global sensitivity index (higher than 10%) are retained. The results show that the number of soil parameters that deserve an accurate determination can be significantly reduced by the use of this method for appropriate management decision support.
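The variance-decomposition quantity behind this screening, the first-order sensitivity index S_i = Var(E[Y|X_i]) / Var(Y), can be illustrated with a crude binning estimator. This is not the Extended FAST algorithm, and the toy three-parameter "model" below is entirely hypothetical; the sketch only shows what the index measures and how a dominant parameter is separated from negligible ones:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a crop model: output depends strongly on p1, weakly on p2,
# and almost not at all on p3.
def model(p1, p2, p3):
    return 10.0 * p1 + np.sin(p2) + 0.1 * p3

n = 100_000
p = rng.uniform(0.0, 1.0, size=(3, n))       # uniform parameter uncertainty
y = model(*p)

def first_order_index(x, y, bins=50):
    """Estimate S_i = Var(E[Y | X_i]) / Var(Y) by binning X_i."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

S = [first_order_index(p[i], y) for i in range(3)]
# p1 explains nearly all of the output variance; p3 is negligible and would
# not need an accurate field measurement.
print([round(s, 3) for s in S])
```

FAST computes the same indices far more efficiently by sampling along a space-filling frequency curve rather than by brute-force binning.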
Abstract: Massive rural-to-urban migration in China is consequential for political trust: rural-to-urban migrants have been found to hold lower levels of trust in local government than their rural peers who choose to stay in the countryside (means 4.92 and 6.34 out of 10, respectively, p < 0.001). This article explores why migrants hold a given level of political trust in their county-level government. Using data on rural-to-urban migrants from the China Family Panel Survey, this study performs hierarchical linear modeling (HLM) to unpack the multi-level explanatory factors of rural-to-urban migrants' political trust. Findings show that individual-level socio-economic characteristics and perceptions of government performance (Level 1), neighborhood-level characteristics, namely the physical and social status and environment of neighborhoods (Level 2), and the objective macroeconomic performance of county-level government (Level 3) work together to explain migrants' trust levels. These results suggest that the effects of neighborhood-level factors on rural-to-urban migrants' political trust merit policy and public management attention in rapidly urbanizing countries.
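The core intuition of the hierarchical (multi-level) approach, partial pooling of group-level estimates, can be sketched without a dedicated HLM package. The data below are simulated and the shrinkage estimator is a simplified empirical-Bayes stand-in for the random intercepts that HLM fits by maximum likelihood, not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical trust scores (0-10 scale) for migrants nested in neighborhoods.
n_groups, n_per = 30, 20
true_group_means = rng.normal(5.5, 1.0, size=n_groups)       # Level-2 variation
scores = true_group_means[:, None] + rng.normal(0, 2.0, size=(n_groups, n_per))

# Random-intercept estimate via partial pooling: each neighborhood mean is
# shrunk toward the grand mean, with the shrinkage weight set by the ratio of
# between-group to total sampling variance.
grand = scores.mean()
group_means = scores.mean(axis=1)
sigma2_within = scores.var(axis=1, ddof=1).mean()            # Level-1 variance
sigma2_between = max(group_means.var(ddof=1) - sigma2_within / n_per, 1e-9)
w = sigma2_between / (sigma2_between + sigma2_within / n_per)
shrunk = grand + w * (group_means - grand)

print(f"pooling weight w = {w:.2f}")
```

A weight near 1 means neighborhoods genuinely differ (little pooling needed); a weight near 0 means apparent neighborhood differences are mostly sampling noise.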
Abstract: Solid and finite element models of a metal pushing type continuously variable transmission are established at speed ratios of i = 0.5 and i = 2.0. To deal with the structural complexity, a node-rod discrete finite element model is put forward, and the whole system is simplified and modeled accordingly. The natural frequencies and mode shapes of the system are solved by an iterative Lanczos reduction method for sensitivity analysis in the finite element model. The new method and its results can be used to improve the smoothness of the variable transmission system and provide a theoretical basis for reducing operational noise.
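The Lanczos reduction at the heart of such eigenvalue extraction can be sketched on a small symmetric test matrix. The matrix below is a synthetic stand-in for a reduced stiffness problem (its spectrum is chosen to be known exactly); the routine is a generic Lanczos iteration with full reorthogonalization, not the paper's specific implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Symmetric test matrix with known eigenvalues 1^2, 2^2, ..., 100^2, standing
# in for the reduced finite element eigenproblem.
n = 100
lams = np.arange(1.0, n + 1.0) ** 2
Q0, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = Q0 @ np.diag(lams) @ Q0.T

def lanczos(A, m, rng):
    """m steps of the Lanczos iteration with full reorthogonalization.
    Returns the tridiagonal matrix T whose eigenvalues (Ritz values)
    approximate the extreme eigenvalues of the symmetric matrix A."""
    n = A.shape[0]
    q = rng.normal(size=n)
    q /= np.linalg.norm(q)
    Qm = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    for j in range(m):
        Qm[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= Qm[:, :j + 1] @ (Qm[:, :j + 1].T @ w)   # reorthogonalize
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

ritz = np.linalg.eigvalsh(lanczos(A, 30, rng))
print(f"largest eigenvalue: exact {lams[-1]:.1f}, Lanczos {ritz[-1]:.1f}")
```

With only 30 iterations on a 100-degree-of-freedom system, the extreme natural frequencies are already recovered accurately, which is why Lanczos reduction is attractive for large FE models.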
Funding: Supported by the National Key R&D Program of China (No. 2017YFE0104400), the National Natural Science Foundation of China (Nos. 31772852, 31802301), and the Marine S&T Fund of Shandong Province for Pilot National Laboratory for Marine Science and Technology (Qingdao) (No. 2018SDKJ0501-2).
Abstract: Habitat models are widely used to manage marine species and to analyze the relationship between species distribution and environmental factors. The predictive skill of a habitat model depends on whether it includes appropriate explanatory variables. Due to limited habitat range, low density, and low detection rate, the number of zero catches can be very large even in favorable habitats, and excessive zeroes increase the bias and uncertainty of habitat estimation. Appropriate explanatory variables therefore need to be chosen first, to prevent under- or overestimation of species abundance in habitat models. In addition, biotic variables such as prey data and the spatial autocovariate (SAC) of the target species are often ignored in species distribution models. We therefore evaluated the effects of input variables on the performance of generalized additive models (GAMs) under excessive zero catch (>70%). Five sets of input variables were considered: (1) abiotic variables; (2) abiotic and biotic variables; (3) abiotic variables and SAC; (4) abiotic, biotic variables, and SAC; and (5) principal component analysis (PCA)-based abiotic and biotic variables and SAC. Belanger's croaker Johnius belangerii, one of the dominant demersal fishes in Haizhou Bay with a large number of zero catches, was used for the case study. Results show that the PCA-based GAM incorporating abiotic and biotic variables and SAC was the most appropriate model to quantify the spatial distribution of the croaker. Biotic variables and SAC were important and should be incorporated as drivers when predicting species distribution. Our study suggests that the selection of input variables is critical to habitat modelling, and can improve the performance of habitat models and enhance our understanding of the habitat suitability of target species.
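The PCA preprocessing step used in input-variable set (5) can be sketched as follows. The environmental predictors below are simulated placeholders (temperature, salinity, depth, prey density are hypothetical, not the Haizhou Bay data); the point is only that standardized, correlated predictors are rotated into uncorrelated component scores before entering the GAM:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical correlated environmental predictors at 500 survey stations.
n = 500
temp = rng.normal(15, 2, n)
sal = 0.8 * temp + rng.normal(0, 0.5, n)      # correlated with temperature
depth = rng.normal(30, 8, n)
prey = rng.lognormal(0, 0.5, n)
X = np.column_stack([temp, sal, depth, prey])

# Standardize, then PCA via SVD; the leading component scores can replace the
# raw collinear predictors as smooth terms in the habitat model.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()
scores = Z @ Vt.T                             # principal-component scores

print("explained variance ratio:", np.round(explained, 3))
```

Because the temperature-salinity pair is strongly correlated, the first component absorbs most of their shared variance, and the resulting scores are mutually uncorrelated by construction.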
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41941010, 41771064, and 41776195), the National Basic Research Program of China (Grant No. 2016YFC1400303), and the Basic Fund of the Chinese Academy of Meteorological Sciences (Grant No. 2018Z001).
Abstract: Long-term, ground-based daily global solar radiation (DGSR) at Zhongshan Station in Antarctica can quantitatively reveal the basic characteristics of Earth's surface radiation balance and validate satellite data for the Antarctic region. The station was established in 1989, but conventional radiation observations started much later, in 2008. In this study, a random forest (RF) model for estimating DGSR is developed using ground meteorological observation data, and a high-precision, long-term DGSR dataset is constructed. The trend of DGSR from 1990 to 2019 at Zhongshan Station is then analyzed. The RF model, which performs better than the other models tested, shows a desirable hindcast performance, with an R^2 of 0.984, a root-mean-square error of 1.377 MJ m^(-2), and a mean absolute error of 0.828 MJ m^(-2). The annual DGSR anomalies increase during 1990-2004 and then begin to decrease after 2004. Notably, the maximum annual anomaly occurs around 2004/05 and is mainly related to the number of days with precipitation (especially the good weather during the polar day period) at this station. In addition to clouds and water vapor, bad weather conditions (such as snowfall, which results in low visibility and hence decreased sunshine duration and solar radiation) are the other major factors affecting solar radiation at this station. The high-precision, long-term estimated DGSR dataset enables further study of the role of Antarctica in global climate change and of the interactions between snow, ice, and atmosphere.
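The random forest idea, bootstrap-aggregated regression trees, can be illustrated with a deliberately tiny version: bagged depth-1 trees (stumps) on simulated data. This is a pedagogical sketch, not the paper's RF model; the "sunshine duration" and "cloud cover" predictors and the target function are invented, and a real random forest additionally grows deep trees and subsamples features at each split:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy stand-in for the DGSR regression: radiation as a nonlinear function of
# sunshine duration (x0) and cloud cover (x1), plus noise.
n = 800
X = rng.uniform(0, 1, size=(n, 2))
y = 20 * X[:, 0] - 8 * X[:, 1] ** 2 + rng.normal(0, 1.0, n)

def fit_stump(X, y):
    """Best single-split regression tree (depth 1) over all features."""
    best = (np.inf, 0, 0.0, y.mean(), y.mean())
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], np.linspace(0.1, 0.9, 17)):
            m = X[:, f] <= t
            if m.all() or not m.any():
                continue
            lm, rm = y[m].mean(), y[~m].mean()
            sse = ((y[m] - lm) ** 2).sum() + ((y[~m] - rm) ** 2).sum()
            if sse < best[0]:
                best = (sse, f, t, lm, rm)
    return best[1:]

def bagged_predict(stumps, X):
    preds = [np.where(X[:, f] <= t, lm, rm) for f, t, lm, rm in stumps]
    return np.mean(preds, axis=0)

# Bootstrap-aggregate 100 stumps, each fit on a resampled copy of the data.
stumps = [fit_stump(X[idx], y[idx])
          for idx in (rng.integers(0, n, n) for _ in range(100))]

r = np.corrcoef(bagged_predict(stumps, X), y)[0, 1]
print(f"in-sample correlation of ensemble prediction: {r:.3f}")
```

Averaging many weak, decorrelated learners smooths the piecewise-constant predictions, which is the mechanism that gives the full RF model its accuracy on the DGSR hindcast.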
Funding: The authors acknowledge the financial support provided by the National Natural Science Foundation of China (Grant No. 41977240) and the Fundamental Research Funds for the Central Universities (Grant No. B200202090).
Abstract: In this study, a three-dimensional (3D) finite element modelling (FEM) analysis is carried out to investigate the effects of soil spatial variability on the response of retaining walls and an adjacent box culvert during a braced excavation. The spatial variability of soil stiffness is modelled using a variogram calibrated with high-quality experimental data. Multiple random field samples (RFSs) of soil stiffness are generated using geostatistical analysis and mapped onto a finite element mesh for stochastic analysis of excavation-induced structural responses by Monte Carlo simulation. It is found that the spatial variability of soil stiffness can be described by an exponential variogram, with a vertical correlation length varying from 1.3 m to 1.6 m. The spatial variability of soil stiffness has a significant effect on the variation of retaining wall deflections and box culvert settlements, and ignoring it in 3D FEM can result in an underestimation of lateral wall deflections and culvert settlements. The stochastic structural responses obtained from the 3D analysis could therefore serve as an effective aid for probabilistic design and analysis of excavations.
Funding: Supported by the National Natural Science Foundation of China (No. 60904053), the Natural Science Foundation of Jiangsu (No. SBK201123307), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD).
Abstract: This paper develops a data-driven estimator of via depth in the deep reactive ion etching (DRIE) process based on statistical identification of key variables. Several feature extraction algorithms are presented to reduce the high-dimensional data and effectively support the subsequent virtual metrology (VM) model building process. With the on-line VM model available, a model-based controller is readily applicable to improve via depth quality. Real operational data taken from an industrial manufacturing process are used to verify the effectiveness of the proposed method. The results demonstrate that the proposed method decreases the MSE from 2.2×10^(-2) to 9×10^(-4) and has great potential for improving the existing DRIE process.
Funding: The present work was supported by the National Natural Science Foundation of China (Nos. 11790282, 12032017, 12002221, and 11872256), the National Key R&D Program (2020YFB2007700), the S&T Program of Hebei (20310803D), and the Natural Science Foundation of Hebei Province (No. A2020210028).
Abstract: During high-speed train operation, the train speed changes frequently, so the motion changes as a function of time. A dynamic model of the double-row tapered roller bearing system of a high-speed train under variable speed conditions is developed. The model takes into consideration the structural characteristics of the train bearing, which has one outer ring and two inner rings. An angle iteration method is used to determine the rotation angle of the roller within any time period, solving the difficult problem of locating the roller. Outer ring and inner ring faults are captured by the model, and the model response is obtained under variable speed conditions. Experiments are carried out under two fault conditions to validate the model. The simulation results agree well with the analytical results, and the errors between the simulation and experimental results for outer and inner ring faults are 5.97% and 2.59%, respectively, which demonstrates the effectiveness of the model. The influence of outer ring and inner ring faults on system stability is analyzed quantitatively using the Lempel-Ziv complexity. The results show that for low train acceleration the inner ring fault has a more significant effect on system stability, while for high acceleration the outer ring fault has a more significant effect; when the train acceleration changes, the outer ring fault again has the greater influence. In practice, train acceleration is usually small and does not change frequently within one operation cycle, so the inner ring fault of the bearing deserves more attention.
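The step-by-step angle update under variable speed can be sketched for the simplest case of a linear speed ramp. This is a schematic of the idea, not the paper's angle iteration algorithm (which tracks each roller in the raceway); the speed, acceleration, and step size are arbitrary:

```python
# Rotation angle under linearly varying speed omega(t) = omega0 + a*t, so the
# closed form is theta(t) = omega0*t + 0.5*a*t**2. Iterating the angle step by
# step keeps the roller position known at every instant even though the speed
# never stays constant.
omega0, a, dt, steps = 10.0, 2.0, 1e-3, 5000   # rad/s, rad/s^2, s, steps

theta, t = 0.0, 0.0
for _ in range(steps):
    omega = omega0 + a * t                     # speed at the start of the step
    theta += omega * dt + 0.5 * a * dt**2      # exact update for a linear ramp
    t += dt

t_end = steps * dt
theta_exact = omega0 * t_end + 0.5 * a * t_end**2
print(f"iterated {theta:.6f} rad vs closed form {theta_exact:.6f} rad")
```

For a piecewise-linear speed profile this per-step update is exact, which is what makes the iterative formulation attractive when acceleration varies during a run.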
Abstract: When the Data Envelopment Analysis (DEA) method is used to analyze the efficiency of different firms, it is common to pool all similar DMUs together for measurement in order to obtain the efficiency values of the various DMUs. However, such an analysis may easily neglect the source properties of individual DMUs, i.e., that the differences among DMUs derive from different environmental backgrounds, such as environment variables including economic civilization, laws and regulations, and political background. Applying the metafrontier model can overcome the barriers arising from the environment variables; it can analyze and measure the differences among DMUs that have different source properties, and can also be used to measure the difference between each group of DMUs and the set of all DMUs. This study therefore adopts the DEA method, assuming variable returns to scale, to evaluate and comparatively analyze the business performance of the life insurance industries in Taiwan and China's Mainland based on the input-oriented BCC model. When business performance is evaluated, the operating management echelon is affected by uncontrollable external environment variables. Thus, this study applies Four-Stage Data Envelopment Analysis to discuss the impact of environment variables on business performance, and re-measures the business efficiency of the life insurance industries in Taiwan and China's Mainland after adjusting the input variables. The study period is from 2003 to 2005, and the research subjects comprise 43 companies in Taiwan and China's Mainland (19 in Taiwan and 24 in China's Mainland), giving 129 sets of sample data. The discount rate is set at 3% in this paper, and the inter-period cumulative change in technical efficiency of each life insurance company from 2003 to 2005 is computed.
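The input-oriented BCC (variable returns to scale) efficiency score is computed by solving one linear program per DMU. The sketch below uses four hypothetical DMUs with made-up inputs and a single output, not the insurance data of the study, and shows the standard BCC envelopment form:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical inputs (labor, capital) and one output for 4 DMUs.
X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0], [5.0, 6.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs

def bcc_input_efficiency(k):
    """Input-oriented BCC efficiency of DMU k:
    min theta  s.t.  X' lam <= theta * x_k,  Y' lam >= y_k,  sum(lam) = 1."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                   # variables: [theta, lam]
    A_in = np.c_[-X[k][:, None], X.T]             # X' lam - theta * x_k <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]         # -Y' lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    A_eq = np.r_[0.0, np.ones(n)][None, :]        # VRS convexity: sum(lam) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

eff = [bcc_input_efficiency(k) for k in range(len(X))]
print([round(e, 3) for e in eff])
```

DMUs 1 and 2 are non-dominated and score 1.0; the last two are projected onto the convex frontier spanned by them. The four-stage procedure of the study then adjusts the inputs for environment variables and re-solves these LPs.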