Funding: the National Natural Science Foundation of China and the National Key Basic Research Program of China. The authors would like to thank all project partners for their valuable contributions and feedback.
Abstract: The layered decoding algorithm has been widely used in the implementation of Low Density Parity Check (LDPC) decoders, due to its high convergence speed. However, the pipeline operation of the layered decoder may introduce memory access conflicts, which heavily deteriorate the decoder throughput. To essentially deal with the issue of memory access conflicts,
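The layered schedule this abstract refers to processes the parity-check matrix row by row, so updated posterior values are reused within the same iteration, which is where the faster convergence comes from. Below is a minimal sketch of layered min-sum decoding; the specific matrix, message bookkeeping, and min-sum update rule are generic textbook choices, not the paper's decoder:

```python
import numpy as np

def layered_min_sum(H, llr, iters=10):
    """Layered (row-by-row) min-sum LDPC decoding sketch.

    H   : binary parity-check matrix (m x n)
    llr : channel log-likelihood ratios, length n
    Each row is one "layer"; its check-to-variable messages immediately
    refresh the running posterior, unlike flooding schedules.
    """
    m, n = H.shape
    post = llr.astype(float).copy()      # running posterior LLRs
    msg = np.zeros((m, n))               # stored check-to-variable messages
    for _ in range(iters):
        for row in range(m):             # process one layer
            cols = np.flatnonzero(H[row])
            ext = post[cols] - msg[row, cols]          # extrinsic inputs
            for i, c in enumerate(cols):
                others = np.delete(ext, i)
                new = np.prod(np.sign(others)) * np.min(np.abs(others))
                post[c] = ext[i] + new   # posterior updated within the layer
                msg[row, c] = new
        hard = (post < 0).astype(int)
        if not np.any((H @ hard) % 2):   # all parity checks satisfied
            break
    return (post < 0).astype(int)
```

A single corrupted bit of the all-zero codeword is recovered after the first sweep, since later layers already see the corrected posterior.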
Funding: Project (60872005) supported by the National Natural Science Foundation of China
Abstract: To adjust the variance of the source rate in linear broadcast networks, global encoding kernels should have corresponding dimensions to instruct the decoding process. The algorithm for constructing such global encoding kernels adapts a heterogeneous network to possible link failures. Linear algebra, graph theory and group theory are applied to construct one series of global encoding kernels that is applicable to all source rates. The effectiveness and existence of such global encoding kernels are proved. Based on 2 information flow, the construction algorithm is explicitly given within polynomial time O(|E|·|T|·ω_max^2), and the memory complexity of the algorithm is O(|E|). Both the time and memory complexity of the proposed algorithm can be O(ω_max) less than those of algorithms in related works.
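The way global encoding kernels "instruct the decoding process" can be sketched concretely: the kernels on a sink's incoming edges form a coefficient matrix, and the sink recovers the source symbols by solving that linear system over the base field. The GF(2) kernels and symbols below are made-up illustrative values, not the paper's construction:

```python
def gf2_solve(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination (A square, invertible)."""
    n = len(A)
    M = [row[:] + [bit] for row, bit in zip(A, b)]   # augmented matrix
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col])
        M[col], M[piv] = M[piv], M[col]              # bring pivot into place
        for r in range(n):
            if r != col and M[r][col]:               # eliminate the column
                M[r] = [a ^ p for a, p in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

def sink_decode(kernels, received):
    """kernels : global encoding kernels on the sink's incoming edges
    received   : coded symbols carried by those edges"""
    return gf2_solve(kernels, received)
```

If the kernels span the full source dimension, the system is invertible and the sink decodes exactly; a lower source rate simply uses fewer kernel dimensions.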
Funding: Project (71371193) supported by the National Natural Science Foundation of China; Projects (2005K1001, 2007K1005) supported by Guangzhou-Shenzhen Railway Company Limited, China
Abstract: The optimization of high-density and concentrated-weight freight loading requires an even distribution of the freight's weight and unconcentrated loading on the floor of the car. Based on the characteristics of concentrated-weight freights, an improved method is put forward to build freight towers, and a greedy-construction algorithm based on heuristic information is utilized for the initial layout. Then a feasibility analysis is performed to judge whether the balanced and unconcentrated loading constraints are met. By introducing optimization or adjustment methods, an overall optimal solution can be obtained. Experiments conducted on data generated from real cases show the effectiveness of our approach: a volume utility ratio of 90.4% and a load capacity utility ratio of 86.7%, which is comparable to the packing of general freights.
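The greedy-construction step for a balanced initial layout can be sketched in its simplest form: assign each freight item, heaviest first, to whichever half of the car currently carries less weight. This is a toy one-dimensional analogue of the balanced-loading constraint, not the paper's tower-building method:

```python
def greedy_balance(weights):
    """Greedy sketch of balanced loading: heaviest items first, each placed
    into the car half (front/rear) that currently carries less weight,
    keeping the load evenly distributed over the floor."""
    front, rear = [], []
    f_load = r_load = 0.0
    for w in sorted(weights, reverse=True):
        if f_load <= r_load:
            front.append(w)
            f_load += w
        else:
            rear.append(w)
            r_load += w
    return front, rear, abs(f_load - r_load)
```

The final imbalance of this greedy rule never exceeds the heaviest single item, which is why it serves well as an initial layout before the adjustment phase.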
Funding: supported by the Natural Science Foundation of Guangdong Province, China (No. S2013040016594), the Natural Science Foundation of Liaoning Province, China (No. 201102164), and the Fundamental Research Funds for the Central Universities, China (No. 2013ZM0124)
Abstract: We propose a new constructive algorithm, called HAPE3D, which is a heuristic algorithm based on the principle of minimum total potential energy for the 3D irregular packing problem, involving packing a set of irregularly shaped polyhedrons into a box-shaped container with fixed width and length but unconstrained height. The objective is to allocate all the polyhedrons in the container, and thus minimize the waste or maximize profit. HAPE3D can deal with arbitrarily shaped polyhedrons, which can be rotated around each coordinate axis at different angles. Its most outstanding merit is that HAPE3D does not need to calculate the no-fit polyhedron (NFP), which is a huge obstacle for the 3D packing problem. HAPE3D can also be hybridized with a meta-heuristic algorithm such as simulated annealing. Two groups of computational experiments demonstrate the good performance of HAPE3D and prove that it hybridizes quite well with a meta-heuristic algorithm to further improve packing quality.
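The minimum-total-potential-energy principle can be illustrated in a 2D analogue: among all collision-free positions, place each item where it rests lowest. The grid-based rectangle packer below is only a sketch of that idea under simplifying assumptions (axis-aligned rectangles, integer grid); HAPE3D itself handles arbitrary polyhedrons and rotations:

```python
def pack_min_energy(rects, bin_width):
    """Place each (w, h) rectangle at the feasible position with the lowest
    resting height in a bin of fixed width and open height -- a 2D analogue
    of minimizing total potential energy."""
    placed = []  # placements as (x, y, w, h)

    def collides(x, y, w, h):
        return any(x < px + pw and px < x + w and y < py + ph and py < y + h
                   for px, py, pw, ph in placed)

    for w, h in rects:
        best = None
        for x in range(bin_width - w + 1):   # scan candidate x positions
            y = 0
            while collides(x, y, w, h):      # drop to lowest free height
                y += 1
            if best is None or y < best[1]:  # keep the lowest-energy spot
                best = (x, y)
        placed.append((best[0], best[1], w, h))
    return placed
```

Because each item settles as low as possible, the packing grows a compact "floor" first, which is the intuition behind avoiding NFP computations entirely.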
Funding: supported by the National Natural Science Foundation of China (No. 61175046), the Provincial Natural Science Research Program of Higher Education Institutions of Anhui Province (No. KJ2013A016), the Outstanding Young Talents in Higher Education Institutions of Anhui Province (No. 2011SQRL146), and the Recruitment Project of Anhui University for Academic and Technology Leaders
Abstract: Multiple-Instance Learning (MIL) is used to predict the labels of unlabeled bags by learning from labeled positive and negative training bags. Each bag is made up of several unlabeled instances. A bag is labeled positive if at least one of its instances is positive, and negative otherwise. Existing multiple-instance learning methods with instance selection ignore the representative degree of the selected instances. For example, if an instance has many similar instances with the same label around it, that instance should be more representative than others. Based on this idea, a multiple-instance learning method with instance selection via the constructive covering algorithm (MilCa) is proposed in this paper. In MilCa, we first use the maximal Hausdorff distance to select some initial positive instances from positive bags, then use a Constructive Covering Algorithm (CCA) to restructure the original instances of negative bags. An inverse testing process is then employed to exclude the false positive instances from positive bags and to select instances of high representative degree, ordered by the number of covered instances, from the training bags. Finally, a similarity measure function is used to convert each training bag into a single sample, and CCA is used again to classify the converted samples. Experimental results on synthetic data and standard benchmark datasets demonstrate that MilCa can decrease the number of selected instances and is competitive with state-of-the-art MIL algorithms.
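The maximal Hausdorff distance used in MilCa's first step is a bag-level distance: each bag is as far from another bag as its worst-matched instance. A sketch with scalar instances (real bags hold feature vectors, so the 1-D distance here is a simplification):

```python
def hausdorff(bag_a, bag_b):
    """Maximal (two-sided) Hausdorff distance between two bags of 1-D
    instances: the largest distance from any instance in one bag to its
    nearest instance in the other bag."""
    d = lambda x, y: abs(x - y)
    h = lambda A, B: max(min(d(a, b) for b in B) for a in A)  # directed
    return max(h(bag_a, bag_b), h(bag_b, bag_a))              # symmetrized
```

Identical bags have distance zero; a bag whose instances all sit far from another bag's instances scores large, which is what makes the metric useful for picking initial positive instances.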
Funding: the National Natural Science Foundation of China (Nos. 61073117 and 61175046), the Provincial Natural Science Research Program of Higher Education Institutions of Anhui Province (No. KJ2013A016), the Academic Innovative Research Projects of Anhui University Graduate Students (No. 10117700183), and the 211 Project of Anhui University
Abstract: Mining from ambiguous data is very important in data mining. This paper discusses one of the tasks of mining from ambiguous data, known as the multi-instance problem. In the multi-instance problem, each pattern is a labeled bag that consists of a number of unlabeled instances. A bag is negative if all instances in it are negative, and positive if it has at least one positive instance. Because the instances in a positive bag are not individually labeled, each positive bag is ambiguous. The mining aim is to classify unseen bags. The main idea of existing multi-instance algorithms is to find the true positive instances in positive bags, convert the multi-instance problem to a supervised problem, and obtain the labels of test bags by predicting the labels of unknown instances. In this paper, we mine multi-instance data from another point of view, i.e., excluding the false positive instances in positive bags and predicting the label of an entire unknown bag. We propose an algorithm called Multi-Instance Covering kNN (MICkNN) for mining multi-instance data. Briefly, the constructive covering algorithm is first utilized to reorganize the structure of the original multi-instance data. Then, the kNN algorithm is applied to discriminate the false positive instances. In the test stage, we label the tested bag directly according to the similarity between the unseen bag and the sphere neighbors obtained from the previous two steps. Experimental results demonstrate that the proposed algorithm is competitive with most state-of-the-art multi-instance methods in both classification accuracy and running time.
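One covering step of the constructive covering algorithm that MICkNN builds on can be sketched as: grow a sphere around a labeled instance, stopping short of the nearest differently labeled point. The midpoint radius rule and the toy data below are common CCA conventions used for illustration, not necessarily the paper's exact variant:

```python
import numpy as np

def build_cover(center_idx, X, y):
    """One constructive-covering step: a sphere around X[center_idx] whose
    radius is the midpoint between the farthest same-label point closer
    than the nearest enemy, and that nearest enemy (different label)."""
    c, lab = X[center_idx], y[center_idx]
    dists = np.linalg.norm(X - c, axis=1)
    enemy = dists[y != lab].min()                 # nearest opposite-label point
    friends = dists[(y == lab) & (dists < enemy)] # same-label points inside
    radius = (friends.max() + enemy) / 2          # midpoint rule
    covered = np.flatnonzero((dists <= radius) & (y == lab))
    return c, radius, covered
```

Repeating this on uncovered instances partitions each class into sphere neighbors, which is the structure the test stage compares unseen bags against.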
Funding: Supported by the National Natural Science Foundation of China (70771043, 60873225, 60773191)
Abstract: The quorum system is a preferable model for constructing distributed access control architectures, but not every quorum system can satisfy their requirements. Aiming at the dependability problem of authorization servers in distributed systems, and combining the requirements of access control, a set of criteria to select and evaluate quorum systems is presented. The scheme and algorithm for constructing an authorization server system based on the Paths quorum system are designed, and the integrated system performance when some servers are attacked is fully analyzed. Role-based access control on the Web implemented by this scheme is introduced. Analysis shows that, with a certain node failure probability, the scheme not only has high dependability but can also satisfy the special requirements of distributed access control such as real-time operation, parallelism, and consistency of security policy.
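The defining property that makes a candidate set system a quorum system — every two quorums intersect, so any two authorization decisions share at least one server and stay consistent — can be checked directly. This is a generic property check, not the paper's Paths-specific selection criteria:

```python
def is_quorum_system(quorums):
    """A collection of server sets is a quorum system iff every pair of
    quorums has a non-empty intersection, guaranteeing a consistent view
    of the security policy across decisions."""
    qs = [set(q) for q in quorums]
    return all(a & b for a in qs for b in qs)
```

Majority sets over three servers form a quorum system; two disjoint singletons do not, which is why not every set system qualifies for access control duty.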
Funding: the National Special Fund for Agro-scientific Research in the Public Interest (No. 201003024)
Abstract: Based on an improved particle swarm optimization (PSO) algorithm, an optimization approach for cargo oil tank design (COTD) is presented in this paper. The purpose is to design an optimal overall dimension of the cargo oil tank (COT) under various kinds of constraints in the preliminary design stage. A non-linear programming model is built to simulate the optimization design, in which the requirements and rules for COTD are used as the constraints. Considering the distance between the inner shell and the hull, a fuzzy constraint is used to express the feasibility degree of the double-hull configuration. In view of the characteristics of COTD, the PSO algorithm is improved to solve this problem. A bivariate extremum strategy is presented to deal with the fuzzy constraint, by which the maximum and minimum cargo capacities are obtained simultaneously. Finally, simulation demonstrates the feasibility and effectiveness of the proposed approach.
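The velocity-and-position update at the heart of PSO, which the COTD approach improves upon, looks like this in a minimal 1-D form. The inertia weight 0.7 and acceleration coefficients 1.5 are common textbook defaults, not the paper's tuned values, and the quadratic objective is purely illustrative:

```python
import random

def pso(f, lo, hi, n=20, iters=60, seed=1):
    """Minimal 1-D particle swarm optimization: each particle is pulled
    toward its personal best and the swarm's global best."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]   # positions
    vs = [0.0] * n                                 # velocities
    pbest = xs[:]                                  # personal bests
    gbest = min(xs, key=f)                         # global best
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest
```

A real COTD run would replace `f` with the non-linear programming objective and handle the fuzzy double-hull constraint, e.g. via the bivariate extremum strategy the abstract mentions.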
Funding: This work is supported by the National Natural Science Foundation of China.
Abstract: This paper presents a new and simple scheme to describe the convex hull in R^d, which uses only three kinds of faces of the convex hull, i.e., the (d-1)-faces, (d-2)-faces and 0-faces. Thus, we develop an efficient new algorithm for incrementally constructing the convex hull of a finite set of points. This algorithm employs much less storage and time than previously existing approaches. The running time and storage of the new algorithm are also analyzed theoretically. The algorithm is optimal in the worst case for even d.
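In 2D the face bookkeeping is at its simplest: the (d-1)-faces are edges and the 0-faces are vertices. Andrew's monotone chain below is a standard 2-D convex hull routine offered as a stand-in for intuition, not the paper's R^d algorithm:

```python
def convex_hull(points):
    """2-D convex hull by Andrew's monotone chain; returns the hull
    vertices in counter-clockwise order, interior points removed."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def half(seq):
        h = []
        for p in seq:
            # pop while the last turn is clockwise or collinear
            while len(h) >= 2 and \
                  (h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) - \
                  (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0]) <= 0:
                h.pop()
            h.append(p)
        return h

    lower = half(pts)
    upper = half(reversed(pts))
    return lower[:-1] + upper[:-1]   # endpoints shared, drop duplicates
```

Each insertion maintains only the boundary vertices and edges, which is the flavor of storage saving the abstract claims for its higher-dimensional face scheme.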
Abstract: It is a common practice to simulate historical or test systems to validate the efficiency of new methods or concepts. However, there are only a small number of existing power system test cases, and validation and evaluation results obtained using such a limited number of test cases may not be deemed sufficient or convincing. In order to provide more available test cases, a new random graph generation algorithm, named the "dual-stage constructed random graph" algorithm, is proposed to effectively model the power grid topology. The algorithm generates a spanning tree to guarantee the connectivity of the random graphs and is capable of controlling the number of lines precisely. Regardless of the average degree, whether sparse or not, random graphs can be quickly formed to satisfy the requirements. An approach is also developed to generate random graphs with a prescribed number of connected components, in order to simulate the power grid topology under fault conditions. Our experimental study on several realistic power grid topologies proves that the proposed algorithm can quickly generate a large number of random graphs with the topology characteristics of real-world power grids.
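The dual-stage construction described above can be sketched directly: stage one wires a random spanning tree, which guarantees connectivity, and stage two adds random extra edges until exactly the prescribed line count is reached. The code is a generic sketch of that two-stage idea, not the authors' implementation:

```python
import random

def random_connected_graph(n, m, seed=0):
    """Generate a connected random graph on n nodes with exactly m edges:
    stage 1 builds a random spanning tree, stage 2 fills in extra edges."""
    assert n - 1 <= m <= n * (n - 1) // 2
    rng = random.Random(seed)
    nodes = list(range(n))
    rng.shuffle(nodes)
    edges = set()
    for i in range(1, n):                    # stage 1: spanning tree
        j = rng.randrange(i)                 # attach to an earlier node
        edges.add(frozenset((nodes[i], nodes[j])))
    while len(edges) < m:                    # stage 2: top up to m lines
        u, v = rng.sample(range(n), 2)
        edges.add(frozenset((u, v)))         # set dedups repeated picks
    return [tuple(sorted(e)) for e in edges]
```

Because the tree already connects all n nodes, every graph produced is connected, and the loop in stage two gives precise control over the number of lines, matching the two properties the abstract highlights.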