Abstract: Search-based software engineering has mainly dealt with automated test data generation by metaheuristic search techniques. In the same spirit, we use such techniques to generate test data (i.e., problem instances) that expose the worst-case behavior of algorithms. In this paper, from the viewpoint of non-functional testing, we redefine the worst case for several algorithms. Using genetic algorithms (GAs), we illustrate the search strategies corresponding to each type of instance. Three problems serve as examples: the sorting problem, the 0/1 knapsack problem (0/1KP), and the travelling salesperson problem (TSP). For several algorithms solving these problems, we successfully found worst-case instances; success is assessed statistically and by comparison with the results of random testing. These examples provide practical guidelines for using genetic algorithms to generate worst-case instances, where the worst case is defined in terms of algorithm performance.
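To make the approach concrete, here is a minimal sketch of the technique for the sorting case: a simple genetic algorithm that evolves input permutations to maximize the comparison count of a naive quicksort (first element as pivot). The GA operators, population size, and fitness function are illustrative assumptions rather than the authors' actual setup.

```python
import random

def quicksort_comparisons(arr):
    """Count comparisons made by a naive quicksort (first element as pivot)."""
    if len(arr) <= 1:
        return 0
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

def mutate(perm, rate=0.1):
    """Swap random pairs of positions with the given per-position probability."""
    perm = perm[:]
    for i in range(len(perm)):
        if random.random() < rate:
            j = random.randrange(len(perm))
            perm[i], perm[j] = perm[j], perm[i]
    return perm

def crossover(p1, p2):
    """Order crossover: keep a prefix of p1, fill the rest in p2's order."""
    cut = random.randrange(1, len(p1))
    head = p1[:cut]
    tail = [x for x in p2 if x not in head]
    return head + tail

def evolve_worst_case(n=32, pop_size=50, generations=200):
    """Evolve permutations of 0..n-1 that maximize quicksort comparisons."""
    population = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=quicksort_comparisons, reverse=True)
        parents = scored[:pop_size // 2]                  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    best = max(population, key=quicksort_comparisons)
    return best, quicksort_comparisons(best)

if __name__ == "__main__":
    instance, cost = evolve_worst_case()
    print("worst-case-like instance found with", cost, "comparisons")
    # For reference: a sorted input of length 32 forces 31+30+...+1 = 496 comparisons.
```

Random permutations average roughly n log n comparisons, so a search that pushes inputs toward the quadratic regime is exactly the kind of worst-case instance generation the abstract describes, and its output can be compared against random testing as a baseline.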
Abstract: Problems of online pricing with offline data, like other online decision-making problems with offline data, aim at designing and evaluating online pricing policies in the presence of a certain amount of existing offline data. To evaluate pricing policies when offline data are available, the decision maker can position herself either at the time point when the offline data have already been observed and are viewed as deterministic, or at the time point when the offline data have not yet been generated and are viewed as stochastic. We develop a framework to discuss how and why these two positions are relevant to online policy evaluation, from a worst-case perspective and from a Bayesian perspective. We then use a simple online pricing setting with offline data to illustrate the construction of optimal policies under the two approaches and to discuss their differences, in particular whether the search for an optimal policy can be decomposed into independent subproblems optimized separately, and whether a deterministic optimal policy exists.
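The difference between the two evaluation positions can be illustrated with a toy computation. The sketch below evaluates a simple plug-in pricing policy against a few candidate demand models, once conditioning on a single observed offline dataset and once averaging over offline datasets that are still to be generated; the demand models, prices, and sample sizes are hypothetical, and the setting is far simpler than the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate demand models: purchase probability at price p is sigmoid(a - b*p).
MODELS = [(4.0, 1.0), (5.0, 1.5), (3.0, 0.8)]
PRICES = np.linspace(1.0, 5.0, 9)

def buy_prob(a, b, p):
    return 1.0 / (1.0 + np.exp(-(a - b * p)))

def expected_revenue(a, b, p):
    return p * buy_prob(a, b, p)

def policy_from_offline(offline_prices, offline_sales):
    """Plug-in policy: pick the most likely candidate model on the offline data
    and charge its revenue-maximizing price."""
    def log_lik(a, b):
        q = buy_prob(a, b, offline_prices)
        return np.sum(offline_sales * np.log(q) + (1 - offline_sales) * np.log(1 - q))
    a, b = max(MODELS, key=lambda m: log_lik(*m))
    return PRICES[np.argmax([expected_revenue(a, b, p) for p in PRICES])]

def simulate_offline(a, b, n=50):
    prices = rng.choice(PRICES, size=n)
    sales = rng.random(n) < buy_prob(a, b, prices)
    return prices, sales.astype(float)

# Position 1: condition on one concrete offline dataset (here generated by MODELS[0]),
# then take the worst case over candidate models for the resulting price.
obs = simulate_offline(*MODELS[0])
p_star = policy_from_offline(*obs)
cond_worst = min(expected_revenue(a, b, p_star) for a, b in MODELS)

# Position 2: the offline data are still random; average the achieved revenue over
# datasets drawn from each candidate model, then take the worst case over models.
pre_worst = min(
    np.mean([expected_revenue(a, b, policy_from_offline(*simulate_offline(a, b)))
             for _ in range(200)])
    for a, b in MODELS
)

print(f"worst-case revenue conditioning on one offline dataset: {cond_worst:.3f}")
print(f"worst-case expected revenue before the offline data exist: {pre_worst:.3f}")
```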
Abstract: The worst-case radiation effect in deep-submicron SRAM (static random access memory) circuits is studied through theoretical analysis and experimental validation. A detailed analysis of the radiation effect in different parts of the circuitry is presented. For SRAM cells and for a sense amplifier containing flip-flop structures, the failure level against ionizing radiation depends on the storage state during irradiation: they tend to store or read the same state as the one held during irradiation. A worst-case test scheme for an SRAM circuit is therefore presented, consisting of a write operation that flips the storage states to the opposite ones after irradiation, followed by a read operation with the opposite storage states. An irradiation experiment was designed for a 0.25 μm SRAM circuit with a capacity of 1 k × 8 bits. The failure level against ionizing radiation obtained with this test scheme (150 krad(Si)) is much lower than the one obtained with the simplest test scheme (1 Mrad(Si)). Clearly, the failure level will be overestimated if the simplest test scheme is chosen as the test standard for SRAM circuits against ionizing radiation.
Funding: Supported by the National Natural Science Foundation of China (No. 11175271).
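The proposed worst-case test sequence can be written down as a short procedure: store a background pattern, irradiate, overwrite every cell with the complementary pattern, and read back expecting the complement. In the sketch below, write_word, read_word, and irradiate are placeholder hooks standing in for whatever test-equipment API is actually used; the pattern and dose are illustrative.

```python
# Minimal sketch of the worst-case SRAM total-ionizing-dose test scheme described
# above, assuming an 8-bit-wide memory of DEPTH words and placeholder hardware hooks.

DEPTH = 1024          # 1 k x 8 bit device
WIDTH_MASK = 0xFF

def write_word(addr, value):
    """Placeholder: drive a write cycle on the device under test."""
    raise NotImplementedError

def read_word(addr):
    """Placeholder: drive a read cycle on the device under test."""
    raise NotImplementedError

def irradiate(dose_krad_si):
    """Placeholder: expose the device to the given total ionizing dose."""
    raise NotImplementedError

def worst_case_tid_test(background=0x55, dose_krad_si=150):
    """Write a background pattern, irradiate, then write and read the complement.
    Cells tend to retain the state held during irradiation, so reading the
    complement is the stressing (worst-case) direction."""
    for addr in range(DEPTH):
        write_word(addr, background)

    irradiate(dose_krad_si)

    complement = (~background) & WIDTH_MASK
    for addr in range(DEPTH):
        write_word(addr, complement)          # flip every cell after irradiation
    failures = [addr for addr in range(DEPTH)
                if read_word(addr) != complement]   # read with opposite storage state
    return failures
```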
Abstract: Passive bistatic radar detects targets by exploiting available local broadcast and communication transmissions as illuminators, which are not designed for radar. Such signals usually have a time-varying structure, which may result in high-level range-ambiguity sidelobes. The mismatched filter is effective in suppressing sidelobes and can therefore be used in passive bistatic radar; however, because of the low signal-to-noise ratio of the reference signal, its sidelobe-suppression performance degrades seriously in a passive bistatic radar system. To solve this problem, a novel mismatched filtering algorithm is developed using worst-case performance optimization. The algorithm takes the low energy level of the reference signal into account and builds a new cost function based on worst-case performance optimization; the mismatched filter weights are then obtained by minimizing the total energy of the range-ambiguity sidelobes. Quantitative evaluations and simulation results demonstrate that the proposed algorithm achieves sidelobe suppression with a low-energy reference signal, and its effectiveness is verified on real data.
Funding: Supported by the National Natural Science Foundation of China (No. 61401526), the 111 Project, China (No. B18039), and the National Key Laboratory Foundation of Science and Technology on Space Microwave, China (No. 614241103030617).
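As background on the filter being robustified, the sketch below designs a plain least-squares mismatched filter that drives the cross-correlation with a known reference code toward a single mainlobe spike, thereby minimizing total sidelobe energy. This is the textbook baseline, not the paper's worst-case-optimized variant for noisy reference signals; the Barker-13 code and the filter length are arbitrary choices.

```python
import numpy as np

def convolution_matrix(s, m):
    """(len(s)+m-1) x m matrix S such that S @ w == np.convolve(s, w)."""
    n = len(s)
    S = np.zeros((n + m - 1, m))
    for j in range(m):
        S[j:j + n, j] = s
    return S

def ls_mismatched_filter(s, m=None):
    """Least-squares mismatched filter: push the filter output toward a single
    mainlobe spike, minimizing the energy in all other (sidelobe) taps."""
    s = np.asarray(s, dtype=float)
    m = m or 3 * len(s)                   # longer filters give more sidelobe control
    S = convolution_matrix(s, m)
    d = np.zeros(S.shape[0])
    d[(S.shape[0] - 1) // 2] = 1.0        # desired output: one mainlobe spike
    w, *_ = np.linalg.lstsq(S, d, rcond=None)
    return w

def peak_sidelobe_db(s, w):
    y = np.abs(np.convolve(s, w))
    main = np.argmax(y)
    side = np.delete(y, main)
    return 20 * np.log10(side.max() / y[main])

if __name__ == "__main__":
    # 13-element Barker code as a stand-in for a reference waveform.
    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
    w_mm = ls_mismatched_filter(barker13)
    w_mf = barker13[::-1] / np.dot(barker13, barker13)   # matched filter for comparison
    print("matched filter PSL   :", round(peak_sidelobe_db(barker13, w_mf), 1), "dB")
    print("mismatched filter PSL:", round(peak_sidelobe_db(barker13, w_mm), 1), "dB")
```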
Abstract: G-VaR, a type of worst-case value-at-risk (VaR), is defined to measure risk while incorporating model uncertainty. Compared with most existing notions of worst-case VaR, G-VaR can be computed using an explicit formula and can be applied to large portfolios of several hundred dimensions at low computational cost. We also apply G-VaR to robust portfolio optimization, thereby providing a tractable means of obtaining optimal allocations under market ambiguity.
Funding: Supported by the Natural Science Foundation of China and of Jiangsu Province (Nos. 11871050, 11971342, 11401414, BK20140299, 14KJB110022).
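The abstract does not state the G-VaR formula itself, so as background the sketch below implements the classical moment-based worst-case VaR of El Ghaoui, Oks, and Oustry, which for portfolio weights w, mean mu, covariance Sigma, and level epsilon is given by the explicit expression -w'mu + sqrt((1 - epsilon)/epsilon) * sqrt(w'Sigma w). It is meant only to show what an explicit worst-case VaR formula looks like; it is not G-VaR, and the asset data are hypothetical.

```python
import numpy as np

def worst_case_var(weights, mu, sigma, epsilon=0.05):
    """Moment-based worst-case VaR at level epsilon for the portfolio loss -w'r,
    maximized over all return distributions with mean mu and covariance sigma:
        WVaR = -w'mu + sqrt((1 - eps) / eps) * sqrt(w' Sigma w).
    """
    w = np.asarray(weights, dtype=float)
    kappa = np.sqrt((1.0 - epsilon) / epsilon)
    return -w @ mu + kappa * np.sqrt(w @ sigma @ w)

if __name__ == "__main__":
    # Hypothetical three-asset example: annualized means and covariance matrix.
    mu = np.array([0.06, 0.04, 0.02])
    sigma = np.array([[0.040, 0.006, 0.002],
                      [0.006, 0.020, 0.001],
                      [0.002, 0.001, 0.005]])
    w = np.array([0.5, 0.3, 0.2])
    print(f"95% worst-case VaR: {worst_case_var(w, mu, sigma, 0.05):.4f}")
```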
Abstract: This paper describes a heuristic for the F3/bi = b/Cmax scheduling problem. The algorithm first applies Johnson's algorithm to F2||Cmax and then presents a revised algorithm for F3/bi = b/Cmax. Finally, an O(n log n)-time heuristic is obtained that generates a schedule whose length is at most 3/2 times that of an optimal schedule for even n ≥ 4, and at most 3/2 + 1/(2n) times that of an optimal schedule for odd n ≥ 4. These bounds are tight.
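Since the heuristic starts from Johnson's rule for the two-machine flow shop F2||Cmax, that building block is sketched below; the three-machine revision for F3/bi = b/Cmax is not reproduced, and the job data are illustrative.

```python
def johnson_two_machine(jobs):
    """Johnson's rule for F2||Cmax.
    jobs: list of (p1, p2) processing times on machines 1 and 2.
    Returns (order of job indices, makespan)."""
    first = sorted((i for i, (p1, p2) in enumerate(jobs) if p1 < p2),
                   key=lambda i: jobs[i][0])                 # ascending p1
    last = sorted((i for i, (p1, p2) in enumerate(jobs) if p1 >= p2),
                  key=lambda i: jobs[i][1], reverse=True)    # descending p2
    order = first + last

    t1 = t2 = 0
    for i in order:
        p1, p2 = jobs[i]
        t1 += p1                  # machine 1 processes jobs back to back
        t2 = max(t2, t1) + p2     # machine 2 waits for machine 1 if necessary
    return order, t2

if __name__ == "__main__":
    jobs = [(3, 6), (5, 2), (1, 2), (6, 6), (7, 5)]   # example processing times
    order, cmax = johnson_two_machine(jobs)
    print("optimal order:", order, "makespan:", cmax)
```

Johnson's rule is provably optimal for the two-machine case, which is what makes it a natural starting point for heuristics on the three-machine problem with equal buffers.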
Abstract: Tasks in hard real-time systems are required to meet preset deadlines even in the presence of transient faults, so the analysis of worst-case finish time (WCFT) must account for the extra time incurred by re-executing tasks that were faulty. Existing solutions can only estimate WCFT and often result in significant under- or over-estimation. In this work, we show that a sufficient and necessary condition for a task set to experience its WCFT is that its critical task incurs all expected transient faults. A method is presented that identifies the critical task and the WCFT in O(|V| + |E|) time, where |V| and |E| are the numbers of tasks and of dependencies between tasks, respectively. The method is applicable to testing the feasibility of directed acyclic graph (DAG) based task sets scheduled on a wide variety of fault-prone multiprocessor systems, whether the processors are homogeneous or heterogeneous, DVS-capable or DVS-incapable, etc. Common practices, which require the same time complexity as the proposed critical-task method, can underestimate the worst case by up to 25% or overestimate it by 13%. Based on the critical-task method, a simulated-annealing scheduling algorithm is developed to find an energy-efficient fault-tolerant schedule for a given DAG task set. Experimental results show that the proposed critical-task method outperforms a common practice by up to 40% in terms of energy saving.
Funding: Partially supported by the National High Technology Research and Development 863 Program of China (Grant Nos. 2015AA015304, 2013AA013202), the National Natural Science Foundation of China (Grant No. 61472052), and the Chongqing Research Program (Grant No. cstc2014yykfB40007).
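To make the critical-task idea concrete, the sketch below computes the finish time of a DAG task set under a simplified dependency-only model (unlimited processors) and brute-forces which single task, when charged with all expected fault re-executions, maximizes the finish time. The brute force costs O(|V|(|V| + |E|)) and only illustrates the definition; identifying the critical task in O(|V| + |E|), as the paper does, is the nontrivial part. Task costs, edges, and fault counts are hypothetical.

```python
from collections import defaultdict

def finish_time(costs, edges, extra=None):
    """Earliest finish time of a DAG task set assuming unlimited processors:
    each task starts when all its predecessors are done. `extra` optionally maps
    one task to additional time spent on fault re-executions."""
    extra = extra or {}
    preds, succs = defaultdict(list), defaultdict(list)
    indeg = {v: 0 for v in costs}
    for u, v in edges:
        preds[v].append(u)
        succs[u].append(v)
        indeg[v] += 1

    order, ready = [], [v for v in costs if indeg[v] == 0]   # Kahn topological sort
    while ready:
        v = ready.pop()
        order.append(v)
        for w in succs[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                ready.append(w)

    finish = {}
    for v in order:
        start = max((finish[u] for u in preds[v]), default=0)
        finish[v] = start + costs[v] + extra.get(v, 0)
    return max(finish.values())

def critical_task_wcft(costs, edges, faults=1, recovery=None):
    """Brute-force WCFT: charge all `faults` re-executions to each task in turn
    and keep the worst resulting finish time. recovery[v] defaults to costs[v]."""
    recovery = recovery or costs
    best_task, wcft = None, -1
    for v in costs:
        t = finish_time(costs, edges, extra={v: faults * recovery[v]})
        if t > wcft:
            best_task, wcft = v, t
    return best_task, wcft

if __name__ == "__main__":
    costs = {"A": 4, "B": 3, "C": 6, "D": 2, "E": 5}       # hypothetical WCETs
    edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")]
    print(critical_task_wcft(costs, edges, faults=1))      # -> ('C', 23)
```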
Abstract: To overcome the disadvantages of traditional worst-case execution time (WCET) analysis approaches, we propose a new WCET analysis approach for ARM programs based on independent paths. Using the results of program flow analysis, it reduces and partitions the control flow graph of the program to obtain a directed graph. From linear combinations of the independent paths of the directed graph, a set of feasible paths can be generated that gives complete coverage of the program paths considered. Their timing measurements and the execution counts of program segments are derived from a limited number of measurements of an instrumented version of the program. Once the timing measurements of the feasible paths are expressed linearly in terms of the execution times of the program segments, a system of equations is obtained as a constraint problem, from which the execution times of the program segments can be determined. By assigning these execution times as edge weights in the directed graph, the WCET estimate can be calculated using graph-theoretical techniques. Compared with the WCET obtained by exhaustive measurement, the maximum error ratio of our estimate is only 8.2593%. The results show that the proposed approach is an effective way to obtain a safe and tight WCET estimate for ARM programs.
Funding: Supported by the National High Technology Research and Development Program of China (863 Program, 2009AA011705) and the National Natural Science Foundation of China (60903033).
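The numerical core of the approach, recovering per-segment execution times from end-to-end path measurements and then taking a longest path in the weighted graph, can be sketched as follows. The count matrix, measured path times, and toy control-flow graph are invented for illustration and do not come from the paper's ARM benchmarks.

```python
import numpy as np

# Each row is one measured feasible path; each column is a program segment.
# Entry (i, j) = how many times path i executes segment j (hypothetical counts
# from an instrumented run, not data from the paper).
COUNTS = np.array([
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 2, 1, 1],
    [2, 1, 1, 1],
], dtype=float)
# End-to-end measured times of the four feasible paths (hypothetical, in microseconds).
PATH_TIMES = np.array([12.0, 15.0, 23.0, 24.0])

# Step 1: solve COUNTS @ t = PATH_TIMES for per-segment times t. Least squares
# also copes with over-determined or slightly noisy measurement sets.
seg_times, *_ = np.linalg.lstsq(COUNTS, PATH_TIMES, rcond=None)

# Step 2: weight a (toy) reduced control-flow DAG over the same segments with the
# recovered times and take the longest path, which is the WCET estimate.
EDGES = {0: [1, 2], 1: [3], 2: [3], 3: []}      # segment 0 -> {1 | 2} -> 3

longest = {}
for node in [3, 1, 2, 0]:                       # reverse topological order
    tails = [longest[nxt] for nxt in EDGES[node]]
    longest[node] = seg_times[node] + (max(tails) if tails else 0.0)

print("recovered segment times:", np.round(seg_times, 2))
print("WCET estimate:", round(longest[0], 2))
```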
Abstract: This paper considers a worst-case investment optimization problem with delay for a fund manager in a crash-threatened financial market. Motivated by the existence of capital inflow/outflow tied to historical performance, we investigate the optimal investment strategies under the worst-case scenario within a stochastic control framework with delay. The financial market is assumed to be either in a normal (crash-free) state or in a crash state. In the normal state the prices of risky assets follow geometric Brownian motion, and in the crash state the prices of risky assets suddenly drop by a certain relative amount, which induces a drop in total wealth relative to the crash-free state. We derive the ordinary differential equations satisfied by the optimal investment strategies and the optimal value functions under power and exponential utilities, respectively. Finally, a numerical simulation illustrates the sensitivity of the optimal strategies with respect to the model parameters.
Funding: Supported by the National Natural Science Foundation of China (71501050), the Startup Foundation for Doctors of ZhaoQing University (611-612282), and the National Science Foundation of Guangdong Province of China (2017A030310660).
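The worst-case criterion in crash models of this type compares the utility a strategy earns if no crash occurs with the utility it earns if the market crashes at the least favorable moment. The sketch below evaluates constant investment fractions under power utility with a single crash of relative size K; the dynamics are plain geometric Brownian motion without the paper's delay feature, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical market and preference parameters (not taken from the paper).
MU, SIGMA, R = 0.08, 0.25, 0.02     # drift, volatility, risk-free rate
T, GAMMA, K = 5.0, 0.5, 0.3         # horizon, power-utility exponent, crash size
N_PATHS = 20_000

def power_utility(w, gamma=GAMMA):
    return w ** gamma / gamma

def expected_utility(pi, crash):
    """Monte Carlo expected utility of terminal wealth for a constant fraction
    `pi` in the risky asset; if `crash`, the asset drops by K once (for a
    constant fraction the timing of the crash does not matter)."""
    z = rng.standard_normal(N_PATHS)
    log_growth = (R + pi * (MU - R) - 0.5 * (pi * SIGMA) ** 2) * T \
                 + pi * SIGMA * np.sqrt(T) * z
    wealth = np.exp(log_growth)
    if crash:
        wealth *= (1.0 - pi * K)    # wealth loses the fraction pi * K at the crash
    return power_utility(wealth).mean()

# Worst-case objective: the smaller of the crash and no-crash expected utilities.
fractions = np.linspace(0.0, 1.0, 101)
worst = [min(expected_utility(p, crash=False), expected_utility(p, crash=True))
         for p in fractions]
best = fractions[int(np.argmax(worst))]
print(f"worst-case optimal constant fraction: {best:.2f}")
print(f"(Merton fraction without crash risk: {(MU - R) / ((1 - GAMMA) * SIGMA ** 2):.2f})")
```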
Abstract: Today, designers are forced to reduce performance and increase power requirements in order to reserve the larger margins required by the greater variability introduced by smaller feature sizes and manufacturing variations in modern IC designs. Better-than-worst-case design can both address the variability problem and achieve higher performance and energy efficiency than worst-case design. This paper surveys the progress to date, provides a snapshot of the most representative methods in the field, and discusses future research directions for better-than-worst-case design.
Funding: Partially supported by the US National Science Foundation under Grant No. CCF-0903541.
Abstract: Caught up in the whirlwind of the global financial crisis since last October, governments worldwide have rushed to adapt to the drastic economic changes that have occurred both in their own countries and internationally. Anticipating that this year may witness heavier storms, Wang Jian, Secretary General of the China Society of Macroeconomics under the National Development and Reform Commission, warned about some trends in China's macro-control efforts in 2009. Edited excerpts of his interview with China Securities Journal follow.
Abstract: Timing analysis of real-time systems is crucial for estimating the worst-case execution time (WCET) of a program. To achieve this, static or dynamic analysis methods are used, along with targeted modeling of the actual hardware system. This literature review focuses on calculating WCET for multi-core processors, providing a survey of traditional methods used for static and dynamic analysis and highlighting the major challenges that arise from different program execution scenarios on multi-core platforms. The paper outlines the strengths and weaknesses of current methodologies and offers insights into prospective areas of research on multi-core analysis. By presenting a comprehensive analysis of the current state of research on multi-core processor analysis for WCET estimation, this review aims to serve as a valuable resource for researchers and practitioners in the field.
Funding: Supported by ZTE Industry-University-Institute Cooperation Funds under Grant No. 2022ZTE09.