Funding: Projects (71003018, 71373003) supported by the National Natural Science Foundation of China; Projects (N110402003, N120302004) supported by the Fundamental Research Funds for the Central Universities, China; Project (13YJCZH172) supported by the Humanities and Social Sciences Foundation of the Ministry of Education of China.
Abstract: The anthropogenic aluminum cycle in China was analyzed with an aluminum flow diagram based on the life cycle of aluminum products. The whole anthropogenic aluminum cycle consists of four stages: alumina and aluminum production, fabrication and manufacture, use, and reclamation. An investigation of the 2003-2007 aluminum cycles in China reveals a number of changes. For instance, the resources self-support ratio (RSR) in alumina production dropped from 95.42% to 55.50%, while the RSR in aluminum production increased from 52.45% to 79.25%. However, the RSR of the Chinese aluminum industry as a whole leveled off at about 50% over 2003-2007. In 2007, the use ratios of domestic and imported aluminum scrap in the aluminum industry were 5.38% and 9.40%, respectively. Meanwhile, both the net imports of Al-containing resources and the losses of Al-containing materials in the aluminum cycle increased during the same period, as did the net additions of Al-containing materials to the social stock and the amount of recycled Al scrap. Proposals for promoting the aluminum cycle are put forward, and import/export policy and ways of reducing the losses of Al-containing materials in China's aluminum industry are discussed.
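The resources self-support ratio used above is, in essence, the share of domestically supplied Al-containing feedstock in the total feedstock consumed at a given stage. A minimal sketch of that ratio, with hypothetical tonnages rather than data from the study:

```python
def resources_self_support_ratio(domestic_input_t: float, imported_input_t: float) -> float:
    """Share of domestic Al-containing feedstock in a stage's total input."""
    total = domestic_input_t + imported_input_t
    if total == 0:
        raise ValueError("stage has no input")
    return domestic_input_t / total

# Hypothetical stage inputs in tonnes (illustrative only, not figures from the paper).
print(f"RSR = {resources_self_support_ratio(4.0e6, 6.0e6):.2%}")
```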
Abstract: Aim: To analyze the traditional hierarchical Kalman filtering fusion algorithm theoretically, to point out that it is complex, does not improve tracking precision well, and can even be impractical, and to propose a weighted average fusion algorithm. Methods: Theoretical analysis and Monte Carlo simulation were used to compare the traditional fusion algorithm with the new one, and the root mean square error statistics of the two algorithms were compared. Results: The hierarchical fusion algorithm is not better than the weighted average fusion and feedback weighted average algorithms. The weighted average fusion algorithm is simple in principle, requires less data, is faster in processing, and has better fault tolerance. Conclusion: The weighted fusion algorithm is suitable for defective sensors. Feeding the fusion result back to a single sensor can enhance that sensor's precision, especially when one sensor has a large deviation and low accuracy, or has a sampling-period deviation and is asynchronous with the other sensors.
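A common form of weighted average fusion, shown here as a minimal sketch rather than the paper's exact formulation, combines local sensor estimates with weights inversely proportional to their error variances:

```python
# Minimal inverse-variance weighted average fusion of local sensor estimates.
# Illustrative only; the paper's weighting scheme may differ.

def fuse(estimates, variances):
    """Combine scalar estimates x_i with weights w_i proportional to 1/var_i."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_var = 1.0 / total          # variance of the fused estimate
    return fused, fused_var

# Two sensors tracking the same target position (hypothetical numbers).
x_fused, var_fused = fuse(estimates=[10.2, 9.7], variances=[0.5, 2.0])
print(f"fused = {x_fused:.3f}, variance = {var_fused:.3f}")
```

The fused variance is smaller than either sensor's own variance, which is why feeding the fused result back to a low-accuracy sensor can improve that sensor's precision.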
Funding: Financially supported by the National Natural Science Foundation of China (Grant Nos. 51178402, 10902112), the Department of Transportation Technology Projects (Grant No. 2011318740240), and the Fundamental Research Funds for the Central Universities (Grant No. 2682014CX074).
Abstract: Unlike the limit equilibrium method (LEM), with which only the global safety factor of a landslide can be calculated, a local safety factor (LSF) method is proposed in this paper to evaluate the stability of different sections of a landslide. Based on three-dimensional (3D) numerical simulation results, the local safety factor is defined as the ratio of the shear strength of the soil at an element on the slip zone to the shear stress parallel to the sliding direction at that element. The global safety factor of the landslide is defined as the weighted average of all local safety factors, weighted by the area of the slip surface. Example analyses show that the results computed by the LSF method agree well with those calculated by the General Limit Equilibrium (GLE) method in two-dimensional (2D) models, and that the distribution of the LSF in the 3D slip zone is consistent with the observed deformation pattern of an actual landslide in China.
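In the spirit of the definitions above, a minimal sketch of the two quantities follows; the element data are hypothetical and this is not the paper's numerical implementation:

```python
# Local and global safety factors over slip-surface elements.
# Each element carries its area, shear strength, and driving shear stress
# parallel to the sliding direction (all values hypothetical).
elements = [
    {"area": 12.0, "shear_strength": 85.0, "shear_stress": 60.0},
    {"area": 15.0, "shear_strength": 70.0, "shear_stress": 75.0},
    {"area":  9.0, "shear_strength": 95.0, "shear_stress": 50.0},
]

def local_safety_factor(e):
    return e["shear_strength"] / e["shear_stress"]

def global_safety_factor(elems):
    # Area-weighted average of the local safety factors.
    total_area = sum(e["area"] for e in elems)
    return sum(local_safety_factor(e) * e["area"] for e in elems) / total_area

for i, e in enumerate(elements):
    print(f"element {i}: LSF = {local_safety_factor(e):.2f}")
print(f"global safety factor = {global_safety_factor(elements):.2f}")
```

An element with LSF below 1 is locally overstressed even when the area-weighted global factor stays above 1, which is the kind of sectional information the LEM alone cannot provide.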
Abstract: Services provided over the Internet require guaranteed network performance, and efficient packet queuing and scheduling schemes play a key role in achieving this. The Internet Engineering Task Force (IETF) has proposed the Differentiated Services (DiffServ) architecture for IP networks, which is based on classifying packets into different service classes and scheduling them accordingly. Scheduling schemes in today's wireless broadband networks also work on service differentiation. In this paper, we present a novel packet queue scheduling algorithm called dynamically weighted low complexity fair queuing (DWLC-FQ), which is an improvement over weighted fair queuing (WFQ) and worst-case fair weighted fair queuing+ (WF2Q+). The proposed algorithm incorporates a dynamic weight adjustment mechanism to cope with data traffic dynamics such as bursts and overload. It also reduces the complexity associated with the virtual time update and hence is suitable for high-speed networks. Simulation results for the proposed packet scheduling scheme demonstrate improvements in delay and drop-rate performance for constant bit rate and video applications, with very little or negligible impact on fairness.
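For context, weighted fair queuing assigns each arriving packet a virtual finish time and serves packets in increasing order of that time; DWLC-FQ builds on this idea with dynamic weights and a cheaper virtual-time update. Below is a minimal WFQ-style sketch with deliberately simplified virtual-time bookkeeping and hypothetical flows, not the DWLC-FQ algorithm itself:

```python
import heapq

# Simplified WFQ: each flow has a weight; a packet's virtual finish time is
# start + length / weight, and packets are served in increasing finish time.

class WfqScheduler:
    def __init__(self, weights):
        self.weights = weights                      # flow_id -> weight
        self.last_finish = {f: 0.0 for f in weights}
        self.queue = []                             # heap of (finish, seq, flow, length)
        self.virtual_time = 0.0
        self.seq = 0

    def enqueue(self, flow, length):
        start = max(self.virtual_time, self.last_finish[flow])
        finish = start + length / self.weights[flow]
        self.last_finish[flow] = finish
        heapq.heappush(self.queue, (finish, self.seq, flow, length))
        self.seq += 1

    def dequeue(self):
        finish, _, flow, length = heapq.heappop(self.queue)
        self.virtual_time = finish                  # crude virtual-time update
        return flow, length

sched = WfqScheduler({"video": 4.0, "best_effort": 1.0})
for flow, size in [("best_effort", 1500), ("video", 1500), ("video", 1500)]:
    sched.enqueue(flow, size)
while sched.queue:
    print(sched.dequeue())   # the higher-weight video flow is served first
```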
Funding: The National Natural Science Foundation of China (No. 61502422), the Natural Science Foundation of Zhejiang Province (Nos. LY18F020028, LQ15F020006), and the Natural Science Foundation of Zhejiang University of Technology (No. 2014XY007).
Abstract: By analyzing circuit structure, a novel approach to signal probability estimation for very large-scale integration (VLSI) circuits based on the improved weighted averaging algorithm (IWAA) is proposed. Considering the failure probability of each gate, first, the first reconvergent fan-ins corresponding to the reconvergent fan-outs are identified to locate the important signal-correlation nodes, based on the principle of homologous signal convergence. Second, the reconvergent fan-in nodes of multiple reconverging structures in the circuit are identified via sensitization paths to determine the sources of interference in the signal probability calculation. Then, the weighted signal probability is calculated with a weighted average approach to correct the signal probability. Finally, the reconvergent fan-out is quantified by a mixed-calculation strategy for signal probability to reduce the impact of multiple reconvergent fan-outs on accuracy. Simulation results on the ISCAS85 benchmark circuits show that the proposed method has approximately linear time and space consumption as the number of gates increases, and its accuracy is 4.2% higher than that of the IWAA.
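As background, basic signal probability propagation assigns each primary input a probability of being logic 1 and pushes probabilities through the gates under an independence assumption; the correlation introduced by reconvergent fan-out is exactly what the weighted-averaging corrections target. A minimal independence-based sketch (not the IWAA correction itself; the gate list and probabilities are hypothetical):

```python
# Basic signal-probability propagation assuming independent gate inputs.
# Reconvergent fan-out violates this assumption, which is what the
# weighted-averaging corrections in the paper aim to fix.

def propagate(input_probs, gates):
    """gates: list of (output, kind, [input names]) in topological order."""
    p = dict(input_probs)
    for out, kind, ins in gates:
        if kind == "NOT":
            p[out] = 1.0 - p[ins[0]]
        elif kind == "AND":
            prob = 1.0
            for name in ins:
                prob *= p[name]
            p[out] = prob
        elif kind == "OR":
            prob = 1.0
            for name in ins:
                prob *= 1.0 - p[name]
            p[out] = 1.0 - prob
        else:
            raise ValueError(f"unsupported gate {kind}")
    return p

# Tiny reconvergent structure: input a fans out to both gates feeding g3.
probs = propagate({"a": 0.5, "b": 0.5},
                  [("g1", "AND", ["a", "b"]),
                   ("g2", "NOT", ["a"]),
                   ("g3", "OR", ["g1", "g2"])])
print(probs)  # estimate for g3 is 0.625, while the exact value is 0.75
```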
Funding: Supported by the Korea Science and Engineering Foundation (KOSEF) grant funded by the Korea Government (MEST) (No. 2011-0000148), and by the Ministry of Knowledge Economy, Korea, under the Information Technology Research Center support program supervised by the National IT Industry Promotion Agency (NIPA-2011-C1090-1121-0010).
Abstract: This paper proposes a spatial denoising algorithm using filtering-based noise estimation for images corrupted by Gaussian noise. The proposed algorithm consists of two stages: estimation and elimination of the noise density. To adapt to varying amounts of noise, the noisy input image is first filtered by a lowpass filter, and the standard deviation of the noise is computed from the difference image between the noisy input and its filtered version. In addition, a modified Gaussian noise removal filter based on local statistics, such as the local weighted mean, local weighted activity, and local maximum, is used to control the degree of noise suppression. Experiments show the effectiveness of the proposed algorithm.
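A minimal sketch of the estimation stage described above, using a simple box lowpass filter and the standard deviation of the difference image; the paper's actual filter and any bias-correction factor may differ:

```python
import numpy as np

# Estimate the Gaussian noise level from the difference between a noisy
# image and a lowpass-filtered copy (illustrative sketch only).

def box_lowpass(img, k=3):
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def estimate_noise_std(noisy):
    diff = noisy.astype(float) - box_lowpass(noisy)
    return diff.std()

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 128), (128, 1))   # smooth synthetic image
noisy = clean + rng.normal(0, 15, clean.shape)        # add sigma = 15 noise
print(f"estimated sigma ~ {estimate_noise_std(noisy):.1f}")
```

The estimate is slightly biased low because the lowpass output still contains some of the noise, which is why practical estimators apply a correction factor before tuning the removal filter.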
Funding: Funded by the Basic Scientific Research and Business Item of Central Public-interest Scientific Institutions, China (ZDJ2012-12).
Abstract: Two methods for determining scenario earthquakes are presented in this paper, namely the weighted average method and the maximum probability method. The paper briefly introduces the two methods and then, taking a high-rise building in the Yantai area as a case study, uses both methods to carry out a seismic hazard analysis, determine the earthquake magnitude, the epicenter and its specific spatial position, and derive the response spectrum for each method. By comparing the differences between the two response spectra, we find that the weighted average method is more suitable for long-period structures when long-period safety is considered, while the maximum probability method is more suitable for short-period structures. It is therefore reasonable to choose the corresponding method according to the natural vibration period of the structure.
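Both methods operate on a deaggregated seismic hazard in which each magnitude-distance (or magnitude-location) bin carries a probability contribution. The following is a minimal sketch of the two selection rules over hypothetical deaggregation bins, not the case-study data:

```python
# Choose a scenario earthquake from hypothetical hazard-deaggregation bins.
# Each bin: (magnitude, epicentral distance in km, probability contribution).
bins = [
    (5.5,  20, 0.15),
    (6.0,  35, 0.40),
    (6.8,  60, 0.30),
    (7.2, 120, 0.15),
]

def weighted_average_scenario(bins):
    # Probability-weighted mean magnitude and distance.
    total = sum(p for _, _, p in bins)
    mag = sum(m * p for m, _, p in bins) / total
    dist = sum(d * p for _, d, p in bins) / total
    return mag, dist

def maximum_probability_scenario(bins):
    # The single bin that contributes most to the hazard.
    m, d, _ = max(bins, key=lambda b: b[2])
    return m, d

print("weighted average scenario:   ", weighted_average_scenario(bins))
print("maximum probability scenario:", maximum_probability_scenario(bins))
```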
Abstract: First, we use graph theory to further clarify the information on nodes and topics. Next, we analyze the factors that affect a node's probability of being a conspirator. According to requirement 1, each node is given an initial probability of being a conspirator on the basis of the acquired information. We then iterate the update equation produced by factor analysis to obtain the priority list of the 83 given nodes. In addition, according to requirement 2, we modify some of the node information before solving the iterative model above; compared with the former result, some changes in priority and in the probability of being a conspirator emerge. Finally, based on requirement 3, we extract information from certain topics by semantic and text analysis. A new group of indexes is obtained with TOPSIS to complete the information-gathering stage. The final indicator, which combines node and topic information, is a weighted average of the indexes obtained above and those obtained in requirement 1, with weights determined by the coefficient-of-variation method.
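The iterative scoring step can be pictured as repeatedly blending each node's own prior with the average suspicion of its neighbors in the message graph. A minimal sketch with a hypothetical adjacency list and priors; the mixing factor alpha and the convergence test are assumptions, not the paper's factor-analysis equation:

```python
# Iteratively update each node's probability of being a conspirator by
# mixing its prior with its neighbors' current scores (illustrative only).

def iterate_scores(adjacency, prior, alpha=0.5, tol=1e-6, max_iter=200):
    score = dict(prior)
    for _ in range(max_iter):
        new = {}
        for node, neighbors in adjacency.items():
            neigh_avg = (sum(score[n] for n in neighbors) / len(neighbors)
                         if neighbors else 0.0)
            new[node] = (1 - alpha) * prior[node] + alpha * neigh_avg
        if max(abs(new[n] - score[n]) for n in adjacency) < tol:
            return new
        score = new
    return score

adjacency = {"A": ["B", "C"], "B": ["A"], "C": ["A", "D"], "D": ["C"]}
prior = {"A": 0.9, "B": 0.2, "C": 0.4, "D": 0.1}
final = iterate_scores(adjacency, prior)
for node, p in sorted(final.items(), key=lambda kv: -kv[1]):
    print(node, round(p, 3))      # priority list in decreasing suspicion
```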
Abstract: Portfolio management is a typical decision-making problem under incomplete, sometimes unknown, information. This paper considers portfolio selection problems under a general setting of uncertain states without probability. The investor's preference is based on his degree of optimism about nature, and his attitude can be described by an Ordered Weighted Averaging (OWA) aggregation function. We construct the OWA portfolio selection model, which is a nonlinear programming problem; the problem can be equivalently transformed into a mixed integer linear program. A numerical example is given, and the solutions imply that the investor's strategies depend not only on his degree of optimism but also on his preference weight vector. The general game-theoretical portfolio selection method, the max-min method, and the competitive ratio method are all special cases of this model.
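An OWA operator sorts its arguments in descending order before applying the weights, so the weight vector encodes an attitude ranging from max-like optimism to min-like pessimism. A minimal sketch; the weight vectors are illustrative, not the paper's calibration:

```python
# Ordered Weighted Averaging (OWA): weights are applied to the *sorted*
# arguments, so they express an attitude rather than importance of sources.

def owa(values, weights):
    assert abs(sum(weights) - 1.0) < 1e-9 and len(values) == len(weights)
    ordered = sorted(values, reverse=True)          # descending order
    return sum(w * v for w, v in zip(weights, ordered))

# Possible returns of one portfolio under four uncertain states (hypothetical).
returns = [0.12, -0.03, 0.05, 0.01]

print(owa(returns, [1.0, 0.0, 0.0, 0.0]))      # pure optimism: the maximum
print(owa(returns, [0.0, 0.0, 0.0, 1.0]))      # pure pessimism: the minimum (max-min style)
print(owa(returns, [0.25, 0.25, 0.25, 0.25]))  # neutral attitude: the plain average
```

The last two calls illustrate why the max-min criterion is a special case of the OWA setting: it corresponds to putting all the weight on the worst-ordered outcome.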
Funding: This research is supported by the National Natural Science Foundation of China under Project 79970093 and the Ph.D. Dissertation Foundation of Southeast University-NARI-Relays Electric Co., Ltd.
Abstract: In this paper, an approach to improving the consistency of a judgement matrix in the Analytic Hierarchy Process (AHP) is presented, which utilizes the eigenvector to revise one pair of entries of the judgement matrix at a time. Using this method, any judgement matrix with a large consistency ratio (C.R.) can be modified into a matrix that both satisfies the consistency requirement and retains most of the information contained in the original matrix. An algorithm for deriving a judgement matrix with acceptable consistency (i.e., C.R. < 0.1) and two criteria for evaluating the effectiveness of the modification are also given.
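For reference, the consistency ratio compares the consistency index CI = (lambda_max - n) / (n - 1) of a judgement matrix with a random index RI for the same order. A minimal sketch with a hypothetical 3x3 matrix; the entry-revision step proposed in the paper is not shown:

```python
import numpy as np

# Consistency ratio of an AHP judgement matrix:
#   CI = (lambda_max - n) / (n - 1),  CR = CI / RI(n).
# The pairwise matrix below is hypothetical; RI values are the standard
# Saaty random indices for orders 3..9.

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def consistency_ratio(A):
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    lambda_max = eigvals[k].real
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                 # priority vector from the principal eigenvector
    ci = (lambda_max - n) / (n - 1)
    return ci / RI[n], weights

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
cr, w = consistency_ratio(A)
print(f"C.R. = {cr:.3f}", "(acceptable)" if cr < 0.1 else "(needs revision)")
print("priorities:", np.round(w, 3))
```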
Funding: Supported by the Natural Science Foundation under Grant Nos. 71273139 and 60804047, and by the Social Science Foundation of the Chinese Ministry of Education under Grant No. 12YJC630271.
Abstract: Customers are of great importance to e-commerce under intense competition. It is known that twenty percent of customers produce eighty percent of profits; thus, how to find these customers is critical. Customer lifetime value (CLV) is presented to evaluate customers in terms of recency, frequency, and monetary (RFM) variables. A novel model is proposed to analyze customers' purchase data and RFM variables based on ordered weighted averaging (OWA) and the K-means clustering algorithm. OWA is employed to determine the weights of the RFM variables in evaluating customer lifetime value or loyalty, and the K-means algorithm is used to cluster customers according to their RFM values. Churned customers can be identified by comparing the RFM values of each cluster with the average RFM. A questionnaire is conducted to investigate the reasons for customer dissatisfaction, and these reasons are ranked to help e-commerce businesses improve services. The experimental results demonstrate that the model is effective and reasonable.
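A minimal sketch of the scoring-and-clustering pipeline described above, using a toy RFM table, illustrative OWA weights, and scikit-learn's K-means; the paper's normalization and weight derivation may differ:

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy RFM table: recency (days, lower is better), frequency, monetary.
# Data, OWA weights, and the normalization are illustrative assumptions.
rfm = np.array([
    [10, 12,  900.0],
    [90,  2,   60.0],
    [ 5, 20, 1500.0],
    [60,  4,  200.0],
    [30,  8,  450.0],
])

# Normalize each column to [0, 1]; invert recency so larger = better.
norm = (rfm - rfm.min(axis=0)) / (rfm.max(axis=0) - rfm.min(axis=0))
norm[:, 0] = 1.0 - norm[:, 0]

# OWA-style scoring: weights applied to each customer's sorted RFM components.
owa_weights = np.array([0.5, 0.3, 0.2])          # hypothetical attitude weights
clv_scores = np.sort(norm, axis=1)[:, ::-1] @ owa_weights

# Cluster customers on normalized RFM and flag clusters below the overall mean.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(norm)
overall_mean = norm.mean(axis=0)
for c in range(2):
    cluster_mean = norm[labels == c].mean(axis=0)
    churn_risk = bool((cluster_mean < overall_mean).all())
    print(f"cluster {c}: mean RFM {np.round(cluster_mean, 2)}, "
          f"below average on all variables: {churn_risk}")
print("CLV scores:", np.round(clv_scores, 3))
```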
Funding: TCM project supported by the National Program of Science and Technology Development (2006BA121B01).
Abstract: Objective: To study which items need the formulation of national standards and which standards should be formulated first. Methods: A questionnaire survey was used to collect data, and the statistical analysis adopted frequency counts and the weighted average method. Results: The acupuncture and moxibustion items that urgently need the formulation of standards, and the sequence in which the standard items should be formulated, are proposed. Conclusion: The study provides an important basis for the follow-up reporting of national standard items and helps avoid the unplanned, ad hoc formulation of standards.