By analyzing the existing methods for bridge bearing capacity assessment, an analytic hierarchy process estimation model with variable weights and fuzzy description is proposed based on nondestructive information. Considering the actual strength, the bearing capacity is first calculated from the design state and then modified based on the detection information. The modification covers section reduction and structure deterioration. Section reduction involves reduction of the concrete section and the steel cross-section. Structure deterioration is determined by six factors, i.e., the concrete surface damage, the actual concrete strength, the steel corrosion electric potential, the chloride ion content, the carbonization depth, and the protective layer depth. The initial weight of each factor is calculated from the expert judgment matrix using the analytic hierarchy process, together with consistency approximation and error transfer theory. A variable weight is then introduced to expand the influence of factors in a worse state. Finally, an actual bridge is taken as an example to verify the proposed method. The estimated capacity agrees well with that of the load test, indicating that the method is objective and credible.
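The two weighting steps described above — initial weights from an expert pairwise judgment matrix via the analytic hierarchy process, then a variable-weight adjustment that amplifies factors in a worse state — can be sketched roughly as follows. This is a generic illustration, not the paper's implementation; the penalty exponent `alpha` and the example judgment matrix are assumptions.

```python
import numpy as np

def ahp_weights(judgment):
    """Priority weights from a pairwise judgment matrix via the principal
    eigenvector (Saaty's method), plus the consistency ratio CR = CI / RI."""
    A = np.asarray(judgment, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}   # Saaty's random index table
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / ri[n] if ri[n] > 0 else 0.0
    return w, cr

def variable_weights(w0, states, alpha=0.5):
    """Penalty-type variable weights: factors with a worse state score
    (in (0, 1]) get amplified influence; alpha < 1 controls the penalty."""
    w = np.asarray(w0) * np.asarray(states, dtype=float) ** (alpha - 1.0)
    return w / w.sum()

# Hypothetical judgment matrix for three deterioration factors
w0, cr = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
wv = variable_weights(w0, states=[0.3, 0.9, 0.9])  # factor 1 is in a bad state
```

With the variable-weight transform, the weight share of the factor in the bad state (score 0.3) rises above its initial AHP weight, matching the "expand the influence of worse-state factors" idea.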
The anthropogenic aluminum cycle in China was analyzed via an aluminum flow diagram based on the life cycle of aluminum products. The whole anthropogenic aluminum cycle consists of four stages: alumina and aluminum production, fabrication and manufacture, use, and reclamation. Based on an investigation of the 2003-2007 aluminum cycles in China, a number of changes can be identified. For instance, the resources self-support ratio (RSR) in alumina production dropped from 95.42% to 55.50%, while the RSR in aluminum production increased from 52.45% to 79.25%. However, the RSR in the Chinese aluminum industry as a whole leveled off at 50% in the period 2003-2007. The respective use ratios of domestic and imported aluminum scrap in the aluminum industry in 2007 were 5.38% and 9.40%. In addition, both the net imported Al-containing resources and the quantity of Al-containing materials lost in the aluminum cycle increased during the same period, as did the net increase of Al-containing materials in social stock and recycled Al scrap. Proposals for promoting the aluminum cycle are put forward, and the import/export policy and measures for reducing the loss of Al-containing materials in the Chinese aluminum industry are discussed.
A multi-level evaluation model for the superstructure of a damaged prestressed concrete girder or beam bridge is established, and the evaluation indices of the model as well as the rating standards are defined. A normal relative function of the evaluation indices is developed to calculate the relative degree of each element that has no sub-level elements. When evaluating the elements in the sub-item level or the index level of the model, the weights of the elements belonging to one parent element take into account their degrees of deterioration. Since relative degrees and structural evaluation scales of the damage conditions are used to characterize the superstructure of damaged prestressed concrete girder bridges, the method can evaluate the prestressed structure in detail, and the evaluation results agree with the Code for Maintenance of Highway Bridges and Culverts (JTG H11-2004). Finally, a bridge in Jilin province is taken as an example and its damage conditions are evaluated with the proposed method, providing an effective approach for bridge engineering practice.
First, the analytic hierarchy process (AHP), representing the subjective weighting methods, and the entropy method, representing the objective weighting methods, are chosen to calculate the index weights of the contract risks of third-party logistics (TPL). The combination weights are then determined by the combination weighting method. Second, using the combination weights, the contract risks of TPL are evaluated through the fuzzy comprehensive evaluation method. According to the combination weights, the most important risk factor of the contract risks of TPL is the choice of sub-contractors. The results are basically consistent with the facts and show that the weights determined by the combination weighting method can, on the one hand, avoid the man-made deviations of the subjective weighting method and, on the other hand, prevent results contrary to reality produced by the objective weighting method. Meanwhile, the fuzzy comprehensive evaluation shows that the contract risks of TPL are at a high risk level. This roughly matches the real situation and indicates that the combination weighting method can generate a more scientific and reasonable comprehensive assessment.
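The objective (entropy) weighting and the linear combination step can be sketched as follows. The combination coefficient `theta` and the sample score matrix are illustrative assumptions, not values from the paper.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-method weights: rows = evaluated objects, columns = risk
    indices; more discriminating (lower-entropy) indices get larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)
    n = X.shape[0]
    logP = np.log(P, where=P > 0, out=np.zeros_like(P))
    E = -(P * logP).sum(axis=0) / np.log(n)   # normalized entropy per index
    d = 1.0 - E                               # degree of diversification
    return d / d.sum()

def combine_weights(w_subjective, w_objective, theta=0.5):
    """Linear combination of subjective (e.g. AHP) and objective (entropy)
    weights; theta is the balance coefficient."""
    w = theta * np.asarray(w_subjective) + (1 - theta) * np.asarray(w_objective)
    return w / w.sum()

# Illustrative score matrix: 3 contracts scored on 2 risk indices
X = [[0.2, 0.4], [0.3, 0.4], [0.5, 0.2]]
w_obj = entropy_weights(X)
w_comb = combine_weights([0.7, 0.3], w_obj)
```

The first index has more spread across the three objects, so the entropy method assigns it the larger objective weight.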
A min-max optimization method is proposed as a new approach to the weight determination problem in the context of the analytic hierarchy process. The priority is obtained by minimizing the maximal absolute difference between the weight vectors obtained from each column and the ideal weight vector. By transformation, the constrained min-max optimization problem is converted into a linear programming problem, which can be solved using either the simplex method or the interior-point method. The Karush-Kuhn-Tucker condition is also analytically provided. The resulting control thresholds provide a straightforward indication of the inconsistency of the pairwise comparison matrix. Numerical computations for several case studies are conducted to compare the performance of the proposed method with three existing methods. The results illustrate that the min-max method controls the maximum deviation and gives more weight to non-dominant factors.
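One plausible reading of this LP transformation treats each normalized column of the comparison matrix as a candidate weight vector and bounds all absolute deviations by an auxiliary variable t. The sketch below uses SciPy's `linprog` as one possible solver; the paper's exact formulation and KKT analysis may differ.

```python
import numpy as np
from scipy.optimize import linprog

def minmax_priority(A):
    """Min-max AHP priority: find w minimizing the largest absolute deviation
    between w and the normalized columns of the comparison matrix (an LP)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    V = A / A.sum(axis=0)            # each column: a candidate weight vector
    # variables x = (w_1 .. w_n, t); objective: minimize t
    c = np.zeros(n + 1)
    c[-1] = 1.0
    A_ub, b_ub = [], []
    for j in range(n):
        for i in range(n):
            row = np.zeros(n + 1); row[i] = 1.0; row[-1] = -1.0
            A_ub.append(row);  b_ub.append(V[i, j])      #  w_i - t <= v_ij
            row2 = np.zeros(n + 1); row2[i] = -1.0; row2[-1] = -1.0
            A_ub.append(row2); b_ub.append(-V[i, j])     # -w_i - t <= -v_ij
    A_eq = [np.append(np.ones(n), 0.0)]                  # weights sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[:n], res.x[-1]

# Perfectly consistent matrix: every column normalizes to (4/7, 2/7, 1/7)
w, t = minmax_priority([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
```

For a perfectly consistent matrix the optimal t is zero, so t itself serves as the inconsistency indicator the abstract alludes to.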
An S-N curve fitting approach is proposed based on the weighted least squares method, with weights inversely proportional to the length of the mean confidence intervals of the experimental data sets. This assumption coincides with the physical characteristics of fatigue life scatter. Two examples demonstrate the method. It is shown that the method has better accuracy and reasonableness than the ordinary least squares method.
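A minimal sketch of such a weighted fit, assuming the common Basquin form log10(N) = a + b·log10(S); the confidence-interval lengths here are illustrative inputs, not experimental values from the paper.

```python
import numpy as np

def fit_sn_wls(S, N, ci_len):
    """Weighted least-squares fit of log10(N) = a + b*log10(S); weights are
    inversely proportional to each data set's mean confidence interval length."""
    x, y = np.log10(S), np.log10(N)
    w = 1.0 / np.asarray(ci_len, dtype=float)
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    # Weighted normal equations: (X^T W X) beta = X^T W y
    a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return a, b

# Synthetic data lying exactly on log10(N) = 10 - 3*log10(S)
S = np.array([100.0, 200.0, 400.0])
N = 10.0 ** (10.0 - 3.0 * np.log10(S))
a, b = fit_sn_wls(S, N, ci_len=[1.0, 2.0, 0.5])
```

Data sets with tighter confidence intervals (less life scatter) pull the fitted line harder, which is exactly the weighting rationale stated above.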
Aim: To analyze the traditional hierarchical Kalman filtering fusion algorithm theoretically, to point out that it is complex, cannot improve the tracking precision well and may even be impractical, and to propose the weighted average fusion algorithm. Methods: Theoretical analysis and Monte Carlo simulation were used to compare the traditional fusion algorithm with the new one, including a comparison of the root mean square error statistics of the two algorithms. Results: The hierarchical fusion algorithm is not better than the weighted average fusion and feedback weighted average algorithms. The weighted filtering fusion algorithm is simple in principle, requires less data, is faster in processing and has better fault tolerance. Conclusion: The weighted average fusion algorithm is suitable for defective sensors. Feeding the fusion result back to a single sensor can enhance that sensor's precision, especially when one sensor has great deviation and low accuracy, or has some deviation of sample period and is asynchronous to the other sensors.
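A weighted average fusion of independent sensor estimates is commonly implemented with inverse-variance weights; the minimal sketch below illustrates that idea and is not the paper's exact algorithm.

```python
import numpy as np

def weighted_average_fusion(estimates, variances):
    """Inverse-variance weighted average of independent sensor estimates;
    the fused variance is never worse than the best single sensor's."""
    z = np.asarray(estimates, dtype=float)
    r = np.asarray(variances, dtype=float)
    w = (1.0 / r) / (1.0 / r).sum()          # normalized inverse-variance weights
    fused = float((w * z).sum())
    fused_var = float(1.0 / (1.0 / r).sum())
    return fused, fused_var

# Two equally reliable sensors: the fusion halves the variance
fused, fused_var = weighted_average_fusion([1.0, 3.0], [1.0, 1.0])
```

When one sensor is much noisier, its weight shrinks and the fused estimate tracks the accurate sensor, which is why this scheme tolerates a defective sensor well.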
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information comes from different sources, and while the form of the prior distribution may be known, its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions resulting from the information of the multiple sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method; from the combined mass function, representative points of the prior distribution are obtained. These points are fitted to the given distribution form to determine the parameters of the prior distribution, yielding the fused prior distribution on which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to obtain the representative points of the fused prior distribution are the central questions addressed in this paper. A simulation example shows that the proposed method is effective.
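The D-S combination step at the core of the method can be illustrated with a generic implementation of Dempster's rule over discrete focal elements; the conversion of continuous prior distributions into mass functions, which the paper addresses, is not shown here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; mass on conflicting (disjoint) pairs is
    redistributed by the 1 - K normalization."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb            # K: total conflicting mass
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two highly conflicting sources over hypotheses {'x', 'y', 'z'}
m1 = {frozenset('x'): 0.9, frozenset('y'): 0.1}
m2 = {frozenset('y'): 0.1, frozenset('z'): 0.9}
fused = dempster_combine(m1, m2)   # all mass ends up on {'y'}
```

The example is Zadeh's classic high-conflict case: after renormalization the only surviving focal element is {'y'}, a reminder that the rule should be applied with care when sources disagree strongly.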
Constitutionally protected rights remove political issues from the control of the democratically elected legislature. Since such rights therefore limit the power of the majority, recent work in rights theory argues that the constitutional protection of rights is inconsistent with the fundamental democratic idea of government by the people. According to this view, democracies should assign the power to resolve questions regarding the nature and extent of individual rights to the majority. Constitutional attempts to remove such questions from the public agenda, it is argued, are disrespectful to citizens who disagree with the views embodied in the constitutionalized rights. I argue that this critique (1) is insufficiently attentive to the question of when legislation by the majority constitutes a legitimate exercise of political power, and (2) underestimates the importance of securing the constitutive conditions of democratic self-government.
A necessary and sufficient condition for the existence of the simultaneous (M,N) singular value decomposition of matrices is given. Some properties of the weighted partial ordering are discussed with the help of the decomposition.
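For context, one standard construction of an (M,N)-weighted SVD (for symmetric positive definite weight matrices M and N) reduces it to an ordinary SVD via Cholesky factors. This is a sketch of the weighted decomposition only, not necessarily the simultaneous variant analyzed in the paper.

```python
import numpy as np

def weighted_svd(A, M, N):
    """(M,N)-weighted SVD sketch: A = U @ diag(s) @ V.T with U^T M U = I and
    V^T N V = I, built from the ordinary SVD via Cholesky factors of the
    positive-definite weight matrices."""
    L = np.linalg.cholesky(M)          # M = L L^T
    K = np.linalg.cholesky(N)          # N = K K^T
    U0, s, V0t = np.linalg.svd(L.T @ A @ K, full_matrices=False)
    U = np.linalg.solve(L.T, U0)       # U = L^{-T} U0
    V = np.linalg.solve(K.T, V0t.T)    # V = K^{-T} V0
    return U, s, V

# Random example with well-conditioned SPD weights
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
G = rng.normal(size=(3, 3)); M = G @ G.T + 3 * np.eye(3)
H = rng.normal(size=(3, 3)); N = H @ H.T + 3 * np.eye(3)
U, s, V = weighted_svd(A, M, N)
```

By construction U and V are orthogonal in the M- and N-inner products respectively, which is the sense in which the factors are "weighted".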
While fundamental individual rights are unquestionably taken as subjective rights, the same does not happen with fundamental social rights. If they are subjective rights, they are justiciable. The main argument in favor of this understanding is based on liberty; the main argument against is the so-called formal argument. Regarding the pro argument, liberty can be either juridical or factual. Juridical liberty has no value without factual liberty, because the right to liberty is only put into practice if one has the factual preconditions for its exercise. The argument against is that the justiciability of social rights displaces the competence for elaborating public policies from the Legislative and Executive to the Judiciary, which violates the principles of separation of powers and democracy. Nevertheless, they are indeed subjective rights, but special ones: they are prima facie subjective rights. There is only one subjective right that is a priori considered definitive: the right to the Existenzminimum. Its content is not settled, but it is quite unequivocal that the rights to simple housing, fundamental education and a minimum level of medical assistance are part of it. The Existenzminimum is thus related to the minimum necessary for factual liberty. Against the justiciability of fundamental social rights there are also arguments related to the juridification of politics, administrative discretion and the "reserve of the possible" clause. The counter-arguments refer to original and exceptional competence, the necessity of objective proof of the state's economic incapability, the prohibition of the State's arbitrary will, the principles of legality and of non-obviation of Judiciary jurisdiction, and the Existenzminimum guarantee.
Services provided over the Internet need guaranteed network performance, and efficient packet queuing and scheduling schemes play a key role in achieving this. The Internet Engineering Task Force (IETF) has proposed the Differentiated Services (DiffServ) architecture for IP networks, which is based on classifying packets into different service classes and scheduling them accordingly. The scheduling schemes of today's wireless broadband networks work on service differentiation. In this paper, we present a novel packet queue scheduling algorithm called dynamically weighted low-complexity fair queuing (DWLC-FQ), which is an improvement over weighted fair queuing (WFQ) and worst-case fair weighted fair queuing+ (WF2Q+). The proposed algorithm incorporates a dynamic weight adjustment mechanism to cope with the dynamics of data traffic such as bursts and overload. It also reduces the complexity associated with the virtual time update, which makes it suitable for high-speed networks. Simulation results for the proposed packet scheduling scheme demonstrate improved delay and drop-rate performance for constant-bit-rate and video applications, with very little or negligible impact on fairness.
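The virtual-finish-time idea underlying the WFQ family can be illustrated with a toy model in which all flows are continuously backlogged, so each packet's finish tag is simply the flow's previous tag plus size/weight. Real WFQ and WF2Q+ additionally maintain a global virtual time to handle idle flows, which this sketch omits.

```python
def wfq_schedule(packets, weights):
    """Toy weighted fair queuing: packets = list of (flow, size) in arrival
    order, all flows backlogged. Each packet gets virtual finish tag
    F = prev_F(flow) + size / weight; service order is by increasing tag
    (ties broken by arrival order)."""
    finish = {flow: 0.0 for flow in weights}
    tags = []
    for seq, (flow, size) in enumerate(packets):
        finish[flow] += size / weights[flow]
        tags.append((finish[flow], seq))
    return [seq for _, seq in sorted(tags)]

# Flow 'b' has twice the weight of 'a', so its packets accrue finish
# tags half as fast and jump ahead of flow 'a'
order = wfq_schedule([('a', 1), ('b', 1), ('b', 1)], {'a': 1, 'b': 2})
```

The cost WFQ pays for this fairness is maintaining and sorting the tags; reducing the virtual-time bookkeeping is exactly what DWLC-FQ targets.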
Free energy calculations may provide vital information for studying various chemical and biological processes. Quantum mechanical methods are required to accurately describe interaction energies, but their computations are often too demanding for conformational sampling. As a remedy, level correction schemes have been developed that allow calculating high-level free energies based on conformations from lower-level simulations. Here, we present a variation of a Monte Carlo (MC) resampling approach in relation to the weighted histogram analysis method (WHAM). We show that our scheme can generate free energy surfaces that can practically converge to the exact one with sufficient sampling, and that it treats cases with insufficient sampling in a more stable manner than the conventional WHAM-based level correction scheme. It can also provide a guide for checking the uncertainty of the level-corrected surface and a well-defined criterion for deciding the extent of smoothing of the free energy surface for its visual improvement. We demonstrate these aspects by obtaining the free energy maps associated with the alanine dipeptide and the proton transfer network of the KillerRed protein in explicit water, and show that the MC-resampled WHAM scheme can be a practical tool for producing free energy surfaces of realistic systems.
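For reference, the standard WHAM self-consistent iteration that the resampling scheme builds on can be sketched as follows (with kT = 1 and pre-histogrammed data); the MC resampling and level-correction layers of the paper are not included.

```python
import numpy as np

def wham(hists, biases, n_samples, tol=1e-10, max_iter=10000):
    """Self-consistent WHAM iteration (kT = 1). hists[i, b]: counts of window i
    in bin b; biases[i, b]: bias potential of window i evaluated at bin b.
    Returns the unbiased bin probabilities and the window free energies f_i."""
    hists = np.asarray(hists, dtype=float)
    biases = np.asarray(biases, dtype=float)
    n = np.asarray(n_samples, dtype=float)
    f = np.zeros(hists.shape[0])
    num = hists.sum(axis=0)                      # total counts per bin
    for _ in range(max_iter):
        denom = (n[:, None] * np.exp(f[:, None] - biases)).sum(axis=0)
        p = num / denom
        p /= p.sum()
        f_new = -np.log((p[None, :] * np.exp(-biases)).sum(axis=1))
        f_new -= f_new[0]                        # fix the additive gauge
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    return p, f

# Sanity check: with zero bias, WHAM must reproduce the pooled histogram
hists = np.array([[3.0, 1.0], [1.0, 3.0]])
p, f = wham(hists, biases=np.zeros((2, 2)), n_samples=np.array([4.0, 4.0]))
```

The free energy surface then follows as F(bin) = -ln p(bin) up to an additive constant.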
Conventional multivariate statistical methods for process monitoring may not be suitable for dynamic processes, since they usually rely on assumptions such as time invariance or the absence of correlation. We are therefore motivated to propose a new monitoring method that compensates principal component analysis with a weighting approach. The proposed monitor consists of two tiers. The first tier uses principal component analysis to extract the cross-correlation structure among the process data, expressed as independent components. The second tier estimates the auto-correlation structure among the extracted components as auto-regressive models. The method is therefore named dynamic weighted principal component analysis with hybrid correlation structure. Its essence is to incorporate a weighting approach into principal component analysis to construct two new subspaces, namely the important component subspace and the residual subspace, and two new statistics are defined to monitor them respectively. By computing the weight values for each new observation, the proposed method increases the weights along directions of components that have large estimation errors, while reducing the influence of the other directions. The rationale comes from the observation that fault information is associated with the online estimation errors of the auto-regressive models. The proposed monitoring method is exemplified with the Tennessee Eastman process. The monitoring results show that the proposed method outperforms conventional principal component analysis, dynamic principal component analysis and dynamic latent variable methods.
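The conventional PCA baseline that this method extends monitors two statistics: Hotelling's T² in the retained-component subspace and the squared prediction error (SPE, or Q) in the residual subspace. A minimal sketch of that baseline follows; the dynamic weighting and auto-regressive modeling of the paper are not included, and the synthetic data are illustrative.

```python
import numpy as np

def pca_monitor(X_train, X_new, n_pc=2):
    """Plain PCA monitoring: fit on normal operating data, then score new
    observations with Hotelling's T^2 (principal subspace) and SPE/Q
    (residual subspace)."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sd
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_pc].T                                # loading matrix
    lam = (s[:n_pc] ** 2) / (len(Z) - 1)           # retained component variances
    Zn = (np.atleast_2d(X_new) - mu) / sd
    T = Zn @ P                                     # scores of new observations
    t2 = ((T ** 2) / lam).sum(axis=1)              # Hotelling T^2
    resid = Zn - T @ P.T
    spe = (resid ** 2).sum(axis=1)                 # squared prediction error
    return t2, spe

# Train on synthetic normal data, score one normal point and one fault
rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 4))
X_new = np.vstack([X_train[:1], X_train[:1] + 10.0])   # second row: gross fault
t2, spe = pca_monitor(X_train, X_new)
```

A faulty observation inflates T², SPE, or both, depending on whether the deviation lies inside or outside the retained subspace; the paper's weighting scheme sharpens this by emphasizing the directions with large AR estimation errors.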
A privileged user is needed to manage commercial transactions, but a super-administrator may monopolize power and cause serious security problems. Relying on trusted computing technology, a privilege separation method is proposed to satisfy the security management requirements of information systems. It distributes the system privileges among three different managers, none of whom can be interfered with by the others. The process algebra Communicating Sequential Processes (CSP) is used to model the three-powers mechanism, and its safety effects are analyzed and compared.
Funding: Supported by the Jiangsu Province Communication Scientific Research Project (06Y21) and the Zhejiang Province Road Scientific Research Project (2007-013-11L).
Funding: Projects (71003018, 71373003) supported by the National Natural Science Foundation of China; Projects (N110402003, N120302004) supported by the Fundamental Research Funds for the Central Universities, China; Project (13YJCZH172) supported by the Ministry of Education of China (Humanities and Social Sciences).
Funding: The National Key Technology R&D Program of China during the 11th Five-Year Plan Period (No. 2006BAH02A06).
Funding: The US National Science Foundation (No. CMMI-0408390, CMMI-0644552, BCS-0527508); the National Natural Science Foundation of China (No. 51010044, U1134206); the Fok Ying-Tong Education Foundation (No. 114024); the Natural Science Foundation of Jiangsu Province (No. BK2009015); the Postdoctoral Science Foundation of Jiangsu Province (No. 0901005C).
Funding: The Guangxi Science Foundation (0575032, 06400161) and the support program for 100 Young and Middle-aged Disciplinary Leaders in Guangxi Higher Education Institutions.
文摘A necessary and sufficient condition for the existence of simultaneous (M,N)singular value decomposition of matrices is given.Some properties about the weighted partial ordering are discussed with the help of the decomposition.
文摘While fundamental individual rights are unquestionably taken as subjective rights, the same does not happen with fundamental social rights. If they are subjective rights, they are justiciable. The main argument in favor of this understanding is based on liberty. The main argument against is the so called formal argument. In relation to the pro argument, liberty can be either juridical or factual. Juridical liberty has no value without factual liberty, because the right to liberty is only put into practice if one has the factual preconditions for its exercise. The argument against is that their justiciability displaces the competence of the elaboration of public politics from Legislative and Executive to Judiciary Power, what violates the principles of separation of powers and democracy. Nevertheless they are subjective rights indeed, but special ones: they are primafacie subjective rights. There is only one subjective right that is a priori considered definitive: the right to Existenzminimum) Its content is not settled, but it is quite unequivocal that the rights to simple housing, fundamental education and minimum level of medical assistance are part of it. Existenzminimum is then related to the minimum necessary for factual liberty. Against the justiciability of fundamental social rights, there are also arguments related to juridification of politics, administrative discretion and the possible reserve clause. The counter-arguments refer to original and exceptional competence, necessary objective proof of state's economical incapability, prohibition of State's will, principles of legality and of non-obviation o f Judiciary jurisdiction, Existenzminimun guarantee.
Abstract: Services provided over the Internet need guaranteed network performance, and efficient packet queuing and scheduling schemes play a key role in achieving this. The Internet Engineering Task Force (IETF) has proposed the Differentiated Services (DiffServ) architecture for IP networks, which is based on classifying packets into different service classes and scheduling them. Scheduling schemes in today's wireless broadband networks work on service differentiation. In this paper, we present a novel packet queue scheduling algorithm called dynamically weighted low complexity fair queuing (DWLC-FQ), which improves on weighted fair queuing (WFQ) and worst-case fair weighted fair queuing+ (WF2Q+). The proposed algorithm incorporates a dynamic weight adjustment mechanism to cope with the dynamics of data traffic, such as bursts and overload. It also reduces the complexity associated with virtual time updates and is hence suitable for high-speed networks. Simulation results demonstrate improved delay and drop-rate performance for constant-bit-rate and video applications, with very little or negligible impact on fairness.
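DWLC-FQ itself is not reproduced here; as background, the virtual-finish-time stamping shared by the WFQ family of schedulers can be sketched as follows (a deliberately simplified model: real WFQ tracks a GPS-based system virtual time rather than advancing it on dequeue):

```python
import heapq
from collections import defaultdict

class ToyWFQ:
    """Toy WFQ: stamp each packet with a virtual finish time
    F = max(V, F_prev_of_flow) + length / weight, serve smallest F first."""
    def __init__(self):
        self.V = 0.0                          # simplified system virtual time
        self.last_finish = defaultdict(float) # per-flow last finish time
        self.heap = []
        self.seq = 0                          # tie-breaker for the heap
    def enqueue(self, flow, length, weight):
        start = max(self.V, self.last_finish[flow])
        finish = start + length / weight
        self.last_finish[flow] = finish
        heapq.heappush(self.heap, (finish, self.seq, flow))
        self.seq += 1
    def dequeue(self):
        finish, _, flow = heapq.heappop(self.heap)
        self.V = finish   # crude virtual-time advance, adequate for the demo
        return flow

sched = ToyWFQ()
sched.enqueue("video", length=1000, weight=4)  # higher weight -> earlier finish
sched.enqueue("bulk",  length=1000, weight=1)
order = [sched.dequeue(), sched.dequeue()]     # video (F=250) before bulk (F=1000)
```

The dynamic weight adjustment in DWLC-FQ would, on this sketch, amount to recomputing `weight` per flow as traffic conditions change, which is what complicates the virtual time bookkeeping the paper aims to simplify.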
Fund: Supported by the Mid-career Researcher Program (No. 2017R1A2B3004946) through the National Research Foundation, funded by the Ministry of Science and ICT of Korea.
Abstract: Free energy calculations may provide vital information for studying various chemical and biological processes. Quantum mechanical methods are required to describe interaction energies accurately, but their computations are often too demanding for conformational sampling. As a remedy, level correction schemes have been developed that allow high-level free energies to be calculated from conformations sampled at a lower level. Here, we present a variation of a Monte Carlo (MC) resampling approach in relation to the weighted histogram analysis method (WHAM). We show that our scheme can generate free energy surfaces that practically converge to the exact one with sufficient sampling, and that it treats cases with insufficient sampling more stably than the conventional WHAM-based level correction scheme. It also provides a guide for checking the uncertainty of the level-corrected surface and a well-defined criterion for deciding the extent of smoothing applied to the free energy surface for its visual improvement. We demonstrate these aspects by obtaining the free energy maps associated with the alanine dipeptide and the proton transfer network of the KillerRed protein in explicit water, and show that the MC-resampled WHAM scheme can be a practical tool for producing free energy surfaces of realistic systems.
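The MC resampling variant is specific to the paper; the standard WHAM self-consistency iteration it builds on can be sketched as follows (hypothetical toy data: two harmonic umbrella windows over a 1D histogram):

```python
import numpy as np

def wham(counts, bias, beta=1.0, iters=5000, tol=1e-10):
    """Standard 1D WHAM: iterate the coupled equations
        p(b)  ∝  Σ_k n_k(b) / Σ_k N_k exp(β(f_k − u_k(b)))
        f_k   =  −(1/β) ln Σ_b p(b) exp(−β u_k(b))
    counts[k, b]: histogram of window k; bias[k, b]: bias energy u_k at bin b."""
    n_samples = counts.sum(axis=1)            # N_k per window
    total = counts.sum(axis=0)                # Σ_k n_k(b) per bin
    f = np.zeros(counts.shape[0])             # window free energies
    for _ in range(iters):
        denom = (n_samples[:, None] * np.exp(beta * (f[:, None] - bias))).sum(axis=0)
        p = total / denom
        p /= p.sum()                          # unbiased distribution estimate
        f_new = -np.log((p[None, :] * np.exp(-beta * bias)).sum(axis=1)) / beta
        f_new -= f_new[0]                     # fix the gauge (f_0 = 0)
        if np.max(np.abs(f_new - f)) < tol:
            return p, f_new
        f = f_new
    return p, f

x = np.arange(5.0)
centers = np.array([1.0, 3.0])
bias = 0.5 * (x[None, :] - centers[:, None]) ** 2   # harmonic window biases
counts = np.array([[10, 30, 10, 2, 1],
                   [1, 2, 10, 30, 10]], dtype=float)
p, f = wham(counts, bias)
```

The level correction idea, on top of this, replaces low-level energies with high-level ones when reweighting; the paper's MC resampling addresses the instability that arises when the low-level ensemble covers the high-level surface poorly.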
Fund: Supported by the National Natural Science Foundation of China (61174114), the Research Fund for the Doctoral Program of Higher Education in China (20120101130016), the Natural Science Foundation of Zhejiang Province (LQ15F030006), and the Science and Technology Program Project of Zhejiang Province (2015C33033).
Abstract: Conventional multivariate statistical methods for process monitoring may not be suitable for dynamic processes, since they usually rely on assumptions such as time invariance or serial uncorrelation. We are therefore motivated to propose a new monitoring method that compensates principal component analysis with a weight approach. The proposed monitor consists of two tiers. The first tier uses principal component analysis to extract the cross-correlation structure among process data, expressed as independent components. The second tier estimates the auto-correlation structure among the extracted components as auto-regressive models. The method is therefore named dynamic weighted principal component analysis with a hybrid correlation structure. Its essence is to incorporate a weight approach into principal component analysis to construct two new subspaces, namely the important component subspace and the residual subspace, and two new statistics are defined to monitor them respectively. By computing the weight values for each new observation, the proposed method increases the weights along directions of components that have large estimation errors, while reducing the influence of other directions. The rationale is the observation that fault information is associated with the online estimation errors of the auto-regressive models. The proposed monitoring method is exemplified on the Tennessee Eastman process. The monitoring results show that it outperforms conventional principal component analysis, dynamic principal component analysis, and dynamic latent variable methods.
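The variable-weight mechanism is the paper's contribution; the base PCA monitor it extends, Hotelling's T² on the retained components plus SPE (Q) on the residuals, can be sketched as follows (simulated data, not the Tennessee Eastman benchmark):

```python
import numpy as np

def fit_pca_monitor(X_train, n_comp=2):
    """Fit a PCA monitor on normal operating data."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sd
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                             # loading matrix
    lam = s[:n_comp] ** 2 / (len(Z) - 1)          # retained component variances
    return mu, sd, P, lam

def monitor(X, mu, sd, P, lam):
    """Return Hotelling T^2 and SPE statistics for new samples."""
    Z = (X - mu) / sd
    T = Z @ P                                     # scores in the component subspace
    t2 = np.sum(T ** 2 / lam, axis=1)             # Hotelling T^2
    spe = np.sum((Z - T @ P.T) ** 2, axis=1)      # squared prediction error (Q)
    return t2, spe

rng = np.random.default_rng(1)
t = rng.standard_normal((500, 2))                 # two latent driving factors
X_train = t @ rng.standard_normal((2, 5)) + 0.05 * rng.standard_normal((500, 5))
mu, sd, P, lam = fit_pca_monitor(X_train)
X_test = X_train[:10].copy()
X_test[5, 0] += 8 * sd[0]                         # inject a fault in one variable
t2, spe = monitor(X_test, mu, sd, P, lam)         # sample 5 spikes in SPE
```

The paper's weighting would, in this sketch, rescale the directions entering `t2` and `spe` according to the online prediction errors of per-component auto-regressive models, so that faulty directions dominate the statistics.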
Abstract: A privileged user is needed to manage commercial transactions, but a super-administrator may monopolize power and cause serious security problems. Relying on trusted computing technology, a privilege separation method is proposed to satisfy the security management requirements of information systems. It distributes the system privilege among three different managers, none of whom can be interfered with by the others. The process algebra Communicating Sequential Processes is used to model the three-powers mechanism, and its security effect is analyzed and compared.
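The paper's CSP model is not reproduced here; purely as an illustration of the separation invariant (role names are hypothetical, not the paper's), a check that no single account accumulates more than one of the three powers might look like:

```python
# Three separated powers; these role names are illustrative only.
ROLES = {"system_admin", "security_admin", "auditor"}

def grant(assignments, user, role):
    """Assign a role, rejecting any user who would hold more than one power."""
    if role not in ROLES:
        raise ValueError(f"unknown role: {role}")
    held = {r for u, r in assignments if u == user}
    if held - {role}:
        raise PermissionError(f"{user} already holds {held}; separation violated")
    assignments.add((user, role))

assignments = set()
grant(assignments, "alice", "system_admin")
grant(assignments, "bob", "security_admin")
try:
    grant(assignments, "alice", "auditor")   # one user, two powers: must fail
    violated = False
except PermissionError:
    violated = True
```

A CSP model would express the same invariant as processes whose synchronized events forbid any trace in which one identity performs actions of two different managers.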