Funding: the Japan Society for the Promotion of Science, KAKENHI Grant Nos. 20H04199 and 23H00475.
Abstract: In this study, we propose an algorithm selection method based on coupling strength for the partitioned analysis of structure-piezoelectric-circuit coupling, which involves two types of coupling: inverse and direct piezoelectric coupling, and direct piezoelectric and circuit coupling. In the proposed method, implicit and explicit formulations are used for strong and weak coupling, respectively. Three feasible partitioned algorithms are generated, namely (1) a strongly coupled algorithm that uses a fully implicit formulation for both types of coupling, (2) a weakly coupled algorithm that uses a fully explicit formulation for both types of coupling, and (3) a partially strongly coupled and partially weakly coupled algorithm that uses an implicit formulation for one type of coupling and an explicit formulation for the other. Numerical examples using a piezoelectric energy harvester, a typical structure-piezoelectric-circuit coupling problem, demonstrate that the proposed method selects the most cost-effective algorithm.
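The implicit/explicit distinction above can be seen on a toy problem. This is only an illustrative sketch, not the paper's solver: two linear subsystems x' = -x + c*y and y' = -y + c*x, where c plays the role of the coupling strength. "Explicit" coupling lags the partner field by one step; "implicit" coupling sub-iterates within each step until the exchanged values converge.

```python
def step_explicit(x, y, c, dt):
    # Backward Euler in each subsystem; coupling term taken from the old step.
    xn = (x + dt * c * y) / (1.0 + dt)
    yn = (y + dt * c * x) / (1.0 + dt)
    return xn, yn

def step_implicit(x, y, c, dt, iters=50, tol=1e-12):
    # Block Gauss-Seidel fixed-point iteration on the fully coupled step.
    xn, yn = x, y
    for _ in range(iters):
        x_new = (x + dt * c * yn) / (1.0 + dt)
        y_new = (y + dt * c * x_new) / (1.0 + dt)
        done = abs(x_new - xn) + abs(y_new - yn) < tol
        xn, yn = x_new, y_new
        if done:
            break
    return xn, yn

def integrate(step, c, dt=0.1, n=50):
    x, y = 1.0, 0.0
    for _ in range(n):
        x, y = step(x, y, c, dt)
    return x, y

# For weak coupling the two algorithms nearly agree, so the cheaper explicit
# scheme suffices; strong coupling widens the gap between them.
for c in (0.05, 0.9):
    xe, _ = integrate(step_explicit, c)
    xi, _ = integrate(step_implicit, c)
    print(f"c={c}: explicit={xe:.6f} implicit={xi:.6f} gap={abs(xe - xi):.2e}")
```

The gap between the two answers grows with the coupling strength, which is the kind of signal a coupling-strength-based selector can exploit.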
Funding: supported by the Shandong Provincial Key Research and Development Program of China (2021CXGC010107, 2020CXGC010107), the Shandong Provincial Natural Science Foundation of China (ZR2020KF035), and the New 20 Project of Higher Education of Jinan, China (202228017).
Abstract: Blockchain technology, with its attributes of decentralization, immutability, and traceability, has emerged as a powerful catalyst for enhancing traditional industries by optimizing business processes. However, transaction performance and scalability have become the main challenges hindering the widespread adoption of blockchain. Because it cannot meet the demands of high-frequency trading, blockchain cannot be adopted in many scenarios. To improve transaction capacity, researchers have proposed several scaling technologies, including lightning networks, directed acyclic graph technology, state channels, and sharding mechanisms, among which sharding emerges as a promising scaling technology. Nevertheless, excessive cross-shard transactions and uneven shard workloads prevent the sharding mechanism from achieving its expected aim. This paper proposes a graph-based sharding scheme for public blockchains that efficiently balances the transaction distribution. By mitigating cross-shard transactions and evening out workloads among shards, the scheme reduces transaction confirmation latency and enhances the transaction capacity of the blockchain, thereby supporting high-frequency transactions as well as better scalability. Experimental results show that the scheme effectively reduces the cross-shard transaction ratio to a range of 35%-56% and significantly decreases the transaction confirmation latency to 6 s in a blockchain with no more than 25 shards.
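The core idea of graph-based sharding can be sketched with a toy greedy heuristic (not the paper's exact scheme): build a transaction graph over accounts, then place each account in the shard holding most of its already-placed neighbours, subject to a balance cap, and compare the cross-shard ratio with naive hash-based assignment.

```python
from collections import defaultdict

def cross_shard_ratio(txs, shard_of):
    # Fraction of transactions whose two endpoints land in different shards.
    cross = sum(1 for a, b in txs if shard_of[a] != shard_of[b])
    return cross / len(txs)

def graph_based_assign(txs, accounts, n_shards):
    neigh = defaultdict(list)
    for a, b in txs:
        neigh[a].append(b)
        neigh[b].append(a)
    cap = -(-len(accounts) // n_shards)          # ceil: balance constraint
    load = [0] * n_shards
    shard_of = {}
    # Highest-degree accounts first: they anchor their neighbourhoods.
    for acct in sorted(accounts, key=lambda a: -len(neigh[a])):
        score = [0] * n_shards
        for nb in neigh[acct]:
            if nb in shard_of:
                score[shard_of[nb]] += 1
        best = max((s for s in range(n_shards) if load[s] < cap),
                   key=lambda s: (score[s], -load[s]))
        shard_of[acct] = best
        load[best] += 1
    return shard_of

# Toy workload: 4 tight communities of 5 accounts plus a few stray edges.
accounts = list(range(20))
txs = [(c * 5 + i, c * 5 + j) for c in range(4)
       for i in range(5) for j in range(i + 1, 5)]
txs += [(0, 7), (5, 12), (10, 19)]
hash_assign = {a: a % 4 for a in accounts}
print("hash :", cross_shard_ratio(txs, hash_assign))
print("graph:", cross_shard_ratio(txs, graph_based_assign(txs, accounts, 4)))
```

On community-structured workloads the graph-aware placement keeps far more transactions inside a single shard than hashing does, which is exactly the effect the abstract's latency and cross-shard-ratio numbers reflect.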
Abstract: Information about the relative importance of each criterion, or the weights of criteria, can have a significant influence on the ultimate ranking of alternatives. Accordingly, assessing the weights of criteria is a very important task in solving multi-criteria decision-making problems. Three kinds of methods are commonly used for assessing the weights of criteria: objective, subjective, and integrated methods. In this study, an objective approach called the SPC method (Symmetry Point of Criterion) is proposed to assess the weights of criteria. This point enriches the criterion so that it is balanced and easy to implement when evaluating its influence on decision-making. The SPC methodology is systematically presented and supported by detailed calculations on an artificial example. To validate the developed method, we used our numerical example and calculated the weights of criteria with the CRITIC, Entropy, Standard Deviation, and MEREC methods. Comparative analysis between these methods and the SPC method reveals that the developed method is a very reliable objective way to determine the weights of criteria. Additionally, in this study, we propose applying the SPC method to evaluate the efficiency of a multi-criteria partitioning algorithm. The main idea of the evaluation is based on the following fact: the greater the uniformity of the weights of criteria, the higher the efficiency of the partitioning algorithm. The research demonstrates that the SPC method can be applied to solving different multi-criteria problems.
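The SPC formula itself is not given in the abstract, so as a point of reference here is the classical Entropy method, one of the objective weighting methods the authors compare against: criteria whose values vary more across alternatives carry more information and receive larger weights.

```python
import math

def entropy_weights(matrix):
    """matrix[i][j]: value of criterion j for alternative i (all positive)."""
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Normalized Shannon entropy of the column; 1 - e is its divergence.
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        weights.append(1.0 - e)
    s = sum(weights)
    return [w / s for w in weights]

# Criterion 2 varies strongly across alternatives, criterion 1 barely at
# all, so criterion 2 should receive by far the larger weight.
decision_matrix = [
    [10.0, 1.0],
    [10.1, 5.0],
    [ 9.9, 9.0],
]
w = entropy_weights(decision_matrix)
print([round(x, 4) for x in w])
```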
Abstract: The simulation of multi-domain, multi-physics mathematical models with uncertain parameters can be quite demanding in terms of algorithm design and computational cost. Our main objective in this paper is to examine a physical interface coupling between two random dissipative systems with uncertain parameters. Because of the complexity and uncertainty inherent in such interface-coupled problems, uncertain diffusion coefficients or friction parameters often arise, leading us to consider random systems. We employ Monte Carlo methods to produce independent and identically distributed deterministic heat-heat model samples for the random systems, and integrate the ensemble idea to accelerate the computation of these samples. To achieve unconditional stability, we introduce the scalar auxiliary variable (SAV) method to overcome the time-step constraints of the ensemble implicit-explicit algorithm. Furthermore, for a more accurate and stable scheme, an ensemble data-passing algorithm is proposed that is unconditionally stable and convergent without any auxiliary variables. These algorithms employ the same coefficient matrix for multiple linear systems and enable easy parallelization, which can significantly reduce the computational cost. Finally, numerical experiments are conducted to support the theoretical results and showcase the unique features of the proposed algorithms.
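A minimal sketch of the shared-coefficient-matrix ensemble idea (not the SAV or data-passing algorithms themselves): J Monte Carlo samples of a 1D heat equation u_t = nu_j u_xx share one tridiagonal system. The mean diffusion coefficient nu_bar is treated implicitly and the per-sample fluctuation (nu_j - nu_bar) explicitly, so a single matrix serves every sample at every step.

```python
import random

def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def lap(u, dx):
    # Discrete Laplacian with homogeneous Dirichlet boundary values.
    n = len(u)
    return [((u[i - 1] if i else 0.0) - 2 * u[i] +
             (u[i + 1] if i < n - 1 else 0.0)) / dx ** 2 for i in range(n)]

random.seed(0)
J, n, dx, dt, steps = 8, 20, 1.0 / 21, 1e-4, 200
nus = [random.uniform(0.8, 1.2) for _ in range(J)]   # uncertain samples
nu_bar = sum(nus) / J
sols = [[1.0] * n for _ in range(J)]                 # identical initial data

off = -nu_bar * dt / dx ** 2
a = [off] * n
b = [1 - 2 * off] * n
c = [off] * n                                        # ONE shared matrix
for _ in range(steps):
    for j in range(J):
        rhs = [sols[j][i] + dt * (nus[j] - nu_bar) * l
               for i, l in enumerate(lap(sols[j], dx))]
        sols[j] = thomas(a, b, c, rhs)

mean_mid = sum(s[n // 2] for s in sols) / J          # MC mean at midpoint
print(round(mean_mid, 4))
```

In a production code the shared matrix would be factored once and the factorization reused, which is where the cost saving claimed in the abstract comes from.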
Funding: supported by the National Science Foundation for Distinguished Young Scholars of China (Grant Nos. 61003082 and 60903059), the National Natural Science Foundation of China (Grant No. 60873014), and the Foundation for Innovative Research Groups of the National Natural Science Foundation of China (Grant No. 60921062).
Abstract: Many real-world networks are found to be scale-free. However, graph partitioning, a technology that enables parallel computing, performs poorly when scale-free graphs are provided. The reason is that traditional partitioning algorithms are designed for random and regular networks rather than for scale-free networks. Multilevel graph-partitioning algorithms are currently considered the state of the art and are used extensively. In this paper, we analyse the reasons why traditional multilevel graph-partitioning algorithms perform poorly and present a new multilevel graph-partitioning paradigm, top-down partitioning, which derives its name from the comparison with traditional bottom-up partitioning. A new multilevel partitioning algorithm, the betweenness-based partitioning algorithm, is also presented as an implementation of the top-down paradigm. An experimental evaluation on seven different real-world scale-free networks shows that the betweenness-based partitioning algorithm significantly outperforms the existing state-of-the-art approaches.
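The paper's algorithm is multilevel and far more elaborate, but the core ingredient its name refers to can be sketched in a few lines: edges with high shortest-path betweenness tend to lie between clusters, so repeatedly removing the highest-betweenness edge (in the spirit of Girvan-Newman) splits the graph.

```python
from collections import defaultdict, deque

def edge_betweenness(adj):
    bc = defaultdict(float)
    for s in adj:                        # Brandes accumulation per source
        dist, sigma, preds = {s: 0}, defaultdict(float), defaultdict(list)
        sigma[s] = 1.0
        order, q = [], deque([s])
        while q:                         # BFS shortest-path counting
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = defaultdict(float)
        for w in reversed(order):        # back-propagate dependencies
            for v in preds[w]:
                share = sigma[v] / sigma[w] * (1.0 + delta[w])
                bc[frozenset((v, w))] += share
                delta[v] += share
    return bc

def components(adj):
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = set(), deque([s])
        while q:
            v = q.popleft()
            if v in comp:
                continue
            comp.add(v)
            q.extend(adj[v])
        seen |= comp
        comps.append(comp)
    return comps

def bisect(adj):
    adj = {v: set(ws) for v, ws in adj.items()}
    while len(components(adj)) == 1:
        bc = edge_betweenness(adj)
        u, v = max(bc, key=bc.get)       # cut the most "between" edge
        adj[u].discard(v)
        adj[v].discard(u)
    return components(adj)

# Two triangles joined by a bridge (2-3): the bridge has the highest
# betweenness and is removed first, yielding the two natural parts.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
parts = bisect(adj)
print(sorted(sorted(p) for p in parts))
```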
Funding: funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No. 740132, iHEART - An Integrated Heart Model for the simulation of the cardiac function, P.I. Prof. A. Quarteroni).
Abstract: We review and compare different fluid-structure interaction (FSI) numerical methods in the context of heart modeling, aiming to assess their computational efficiency for cardiac numerical simulations and to select the most appropriate method for heart FSI. Blood dynamics within the human heart is characterized by active muscular action during both the contraction and relaxation phases of the heartbeat. The efficient solution of the FSI problem in this context is challenging, due to the added-mass effect (caused by the comparable densities of fluid and solid, typical of biomechanics) and to the complexity, nonlinearity, and anisotropy of cardiac constitutive laws. In this work, we review existing numerical coupling schemes for FSI in the two classes of strongly coupled partitioned and monolithic schemes. The schemes are compared on numerical tests that mimic the flow regime characterizing the heartbeat in a human ventricle, during both systole and diastole. Active mechanics is treated in both the active stress and active strain frameworks. Computational costs suggest the use of a monolithic method. We employ it to simulate a full heartbeat of a human ventricle, showing how physiologically meaningful results can be obtained efficiently.
Abstract: When the Modified Configuration Interaction (MCI) method is used to calculate the correlation energy of two-electron systems, obtaining higher precision always requires a huge amount of computation. Because this would cost much CPU time and memory on a single computer, we adopt a parallel multisection recurrence algorithm, which allows several CPUs to compute the ground-state energy of a helium atom at the same time.
Abstract: The mathematical and statistical modeling of the problem of poverty is a major challenge given Burundi's economic development. Innovative economic optimization systems are widely needed to confront the dynamics of poverty in Burundi. The Burundian economy shows an inflation rate of -1.5% in 2018 against a real Gross Domestic Product growth rate of 2.8% in 2016. The aim of this research is to find a model that contributes to solving the problem of poverty in Burundi; the results fill a knowledge gap in the modeling and optimization of the Burundian economic system. The model solves an optimization problem combining the variables of production, consumption, budget, human resources, and available raw materials. Scientific modeling and optimal solving of the poverty problem provide tools for measuring the poverty rate and determining various countries' poverty levels in light of advanced knowledge. In addition, investigating the aspects of poverty will properly orient development aid to developing countries and thus help achieve their objectives of growth and the fight against poverty. This paper provides a new and innovative framework for global scientific research on the multiple facets of this problem. An estimate of the poverty rate advances the theory and optimization methods used in measuring the poverty rate and achieving sustainable development goals. Comparing annual food production with the required annual consumption reveals an imbalance between different types of food. The proteins, minerals, and vitamins produced in Burundi are sufficient for the consumption required by the entire Burundian population. This positive contribution comes from the fact that some cows, goats, fishes, ···, slaughtered in Burundi come from neighboring countries; real domestic production remains in deficit. The lipids, acids, calcium, fibers, and carbohydrates produced in Burundi are insufficient for consumption. This negative contribution proves a Burundian food deficit and is a decision-making indicator for the design and updating of agricultural policy, implementation programs, and projects. Investment and economic growth are only possible when food security is mastered. The capital allocated to food investment must be revised upwards. Demographic control is also a relevant indicator for pushing Burundi toward the ranks of emerging countries by 2040. Meanwhile, a better understanding of the determinants of poverty that takes cultural and organizational aspects into account guides managers of poverty-reduction projects and programs.
Funding: Project supported by the National Natural Science Foundation of China (No. 61573114).
Abstract: Gait recognition has significant potential for remote human identification, but it is easily influenced by identity-unrelated factors such as clothing, carrying conditions, and view angles. Many gait templates have been presented that can effectively represent gait features; each template has its advantages and can represent different prominent information. In this paper, gait template fusion is proposed to improve the classical representative gait template (such as the gait energy image), which represents incomplete information and is sensitive to changes in contour. We also present a partition method to reflect the different gait habits of different body parts of each pedestrian. The fused template is cropped into three parts (head, trunk, and leg regions) according to the human body, and the three parts are then fed into a convolutional neural network to learn merged features. We present an extensive empirical evaluation on the CASIA-B dataset and compare the proposed method with existing ones. The results show good accuracy and robustness of the proposed method for gait recognition.
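The gait energy image (GEI) named above is simply the pixel-wise mean of the aligned binary silhouettes of one gait cycle, and the partition step crops the template into head, trunk, and leg bands. A minimal sketch follows; the band fractions are illustrative guesses, not the paper's exact crop points.

```python
def gait_energy_image(silhouettes):
    """silhouettes: list of H x W binary frames (lists of lists of 0/1)."""
    n = len(silhouettes)
    h, w = len(silhouettes[0]), len(silhouettes[0][0])
    return [[sum(f[r][c] for f in silhouettes) / n for c in range(w)]
            for r in range(h)]

def partition(gei, head_frac=0.2, trunk_frac=0.5):
    # Crop the template into head / trunk / leg bands by row fractions.
    h = len(gei)
    a, b = int(h * head_frac), int(h * (head_frac + trunk_frac))
    return gei[:a], gei[a:b], gei[b:]

# Two toy 10x4 "frames": static upper body, moving legs.  The leg band of
# the GEI holds intermediate values, showing where motion energy sits.
frame1 = [[1] * 4] * 6 + [[1, 0, 0, 1]] * 4
frame2 = [[1] * 4] * 6 + [[0, 1, 1, 0]] * 4
gei = gait_energy_image([frame1, frame2])
head, trunk, leg = partition(gei)
print(len(head), len(trunk), len(leg))   # band heights
print(leg[0])                            # an averaged leg-region row
```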
Funding: Supported by the National Natural Science Foundation of China (60970016, 61173032).
Abstract: Hardware/software (HW/SW) partitioning is one of the key processes in embedded system design. It determines which system components are assigned to hardware and which are processed by software. In contrast with previous research that focuses on developing efficient heuristics, this paper focuses on pre-processing the task graph before HW/SW partitioning, that is, enumerating all the sub-graphs that meet the requirements. Experimental results showed that the original graph can be reduced to 67% of its size in the worst-case scenario and 58% in the best-case scenario. In conclusion, the reduced task graph saves hardware area while improving partitioning speed and accuracy.
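The abstract does not state the concrete requirement the sub-graphs must meet, so this sketch uses a common one: enumerate every connected sub-graph of at most k tasks, since only connected task clusters are candidates for mapping to a single hardware block.

```python
def connected_subgraphs(adj, k):
    """All connected vertex sets of size <= k (adj: dict node -> neighbours)."""
    found = set()

    def grow(nodes):
        fs = frozenset(nodes)
        if fs in found:
            return
        found.add(fs)
        if len(nodes) == k:
            return
        # Any connected set can be built by repeatedly adding a neighbour.
        frontier = {w for v in nodes for w in adj[v]} - nodes
        for w in frontier:
            grow(nodes | {w})

    for v in adj:
        grow({v})
    return found

# Diamond-shaped task graph: edges 0-1, 0-2, 1-3, 2-3.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
subs = connected_subgraphs(adj, 3)
print(len(subs))          # number of candidate task clusters
```

Note that the non-adjacent pair {0, 3} is never produced: disconnected vertex sets are filtered out for free by the growth process.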
Abstract: Numerous previous studies have attempted to apply machine learning techniques to analyze relationships among energy variables in energy consumption. However, most machine learning methods are used primarily for prediction through complicated learning processes at the expense of interpretability. Such methods have difficulty evaluating the effect of energy variables on energy consumption and, especially, capturing their heterogeneous relationships. Therefore, to identify heterogeneous relationships in the energy consumption of actual buildings, this study applies the MOdel-Based recursive partitioning (MOB) algorithm to the 2012 CBECS survey data, which offers representative information about actual commercial building characteristics and energy consumption. With the resultant tree-structured subgroups, the MOB tree reveals the heterogeneous effects of energy variables and their mutual influences on building energy consumption. The results of this study provide insights for architects and engineers to develop energy-conservative design and retrofits for U.S. office buildings.
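A one-level sketch of the model-based recursive partitioning idea (toy data standing in for the CBECS survey): instead of predicting y directly, fit a small regression model in each subgroup and pick the split of a partitioning variable z that most reduces the combined squared error, exposing heterogeneous slopes.

```python
def fit_line(xs, ys):
    # Ordinary least-squares fit of y = a + b*x; returns (a, b, SSE).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def mob_split(data):
    """data: list of (x, z, y).  Best threshold on z plus the two local fits."""
    best = None
    zs = sorted({z for _, z, _ in data})
    for t in zs[:-1]:
        left = [(x, y) for x, z, y in data if z <= t]
        right = [(x, y) for x, z, y in data if z > t]
        if len(left) < 2 or len(right) < 2:
            continue
        fl = fit_line(*zip(*left))
        fr = fit_line(*zip(*right))
        total = fl[2] + fr[2]
        if best is None or total < best[0]:
            best = (total, t, fl, fr)
    return best

# Two regimes: y = 2x when z < 5, y = -x when z >= 5.  One global line fits
# badly; the split recovers the two slopes exactly.
data = [(x, z, (2 * x if z < 5 else -x)) for z in range(10) for x in range(4)]
_, thresh, (al, bl, _), (ar, br, _) = mob_split(data)
print(thresh, round(bl, 3), round(br, 3))
```

The real MOB algorithm uses parameter-instability tests rather than exhaustive SSE search and recurses on the subgroups, but the heterogeneous-slope picture is the same.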
Funding: the National Natural Science Foundation of China (No. 60273081).
Abstract: In high-level synthesis of integrated circuits, hard-to-test structures caused by irrational scheduling and allocation reduce circuit testability. To improve testability, this paper proposes a weighted compatibility graph (WCG), which provides a weighting formula for the compatibility graph used in register allocation for testability, and applies an improved weighted compatibility clique partition algorithm to this WCG. As a result, four rules for testability are considered simultaneously during register allocation, achieving the objective of design for testability. Tested on many benchmark experiments and compared with many other models, the register allocation algorithm proposed in this paper greatly improves circuit testability with little overhead on the final circuit area.
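The paper's weighting formula is not given in the abstract, so as a plain illustration of the underlying step, here is an unweighted greedy clique partition of a compatibility graph: variables whose lifetimes do not overlap are compatible, and each clique of mutually compatible variables can share one register.

```python
def greedy_clique_partition(compat):
    """compat: dict node -> set of compatible nodes (symmetric relation)."""
    unassigned = set(compat)
    cliques = []
    while unassigned:
        seed = min(unassigned)                # deterministic starting node
        clique = {seed}
        for v in sorted(unassigned - {seed}):
            # v may join only if compatible with every member so far.
            if all(v in compat[u] for u in clique):
                clique.add(v)
        cliques.append(clique)
        unassigned -= clique
    return cliques

# Lifetimes: a[0,2], b[3,5], c[1,4], d[6,7]; non-overlap => compatible.
life = {'a': (0, 2), 'b': (3, 5), 'c': (1, 4), 'd': (6, 7)}
compat = {u: {v for v in life if v != u and
              (life[u][1] < life[v][0] or life[v][1] < life[u][0])}
          for u in life}
regs = greedy_clique_partition(compat)
print(len(regs), sorted(sorted(c) for c in regs))
```

Here a, b, and d can share one register while c needs its own, so four variables fit in two registers; the paper's weighted variant additionally steers this merging toward testable structures.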