With the spread of COVID-19 across the world, a large amount of data on reported cases has become available. We study here a potential bias induced by the daily number of tests, which may be insufficient or vary over time. Tests are hard to produce at the early stage of an epidemic and can therefore be a limiting factor in the detection of cases. Such a limitation may have a strong impact on the reported case data, since some cases may be missing from the official count because the number of tests was insufficient on a given day. In this work, we propose a new differential equation epidemic model which uses the daily number of tests as an input. We obtain good agreement between the model simulations and the reported case data from the state of New York. We also explore the relationship between the dynamics of the number of tests and the dynamics of the cases, and obtain a good match between the data and the outcome of the model. Finally, by multiplying the number of tests by 2, 5, 10, and 100, we explore the consequences for the number of reported cases.
Testing is an integral part of software development. Current fast-paced system development has rendered traditional testing techniques obsolete; automated testing techniques are therefore needed to keep up with such development speeds. Model-based testing (MBT) is a technique that uses system models to generate and execute test cases automatically. It was identified that test data generation (TDG) in many existing model-based test case generation (MB-TCG) approaches is still manual. An automatic and effective TDG can further reduce testing cost while detecting more faults. This study proposes an automated TDG approach for MB-TCG using the extended finite state machine (EFSM) model. The proposed approach integrates MBT with combinatorial testing. The information available in an EFSM model and the boundary value analysis strategy are used to automate the domain input classifications that were done manually in the existing approach. The results showed that the proposed approach detected 6.62 percent more faults than conventional MB-TCG, but at the same time generated 43 more tests. The proposed approach detects faults effectively, but further treatment of the generated tests, such as test case prioritization, should be done to increase the effectiveness and efficiency of testing.
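The boundary value analysis strategy mentioned in this abstract can be illustrated with a minimal sketch. The helper below is hypothetical (the paper's actual domain classification procedure is not given here); it only shows the classic idea of testing each boundary, a value just inside it, and a nominal value for a numeric input domain.

```python
def boundary_values(lo, hi):
    """Classic boundary value analysis candidates for an integer input
    domain [lo, hi]: each boundary, a value just inside it, and a
    nominal mid-domain value."""
    return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]

# A hypothetical EFSM guard such as 1 <= x <= 100 would then yield:
print(boundary_values(1, 100))  # [1, 2, 50, 99, 100]
```

In an MB-TCG pipeline, values like these would be combined across input parameters by the combinatorial testing stage.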
In order to realize visualization of a three-dimensional data field (TDDF) in an instrument, two methods of TDDF visualization and the usual manner of fast graphic and image processing are analyzed, and it is described how to use the OpenGL technique and the characteristics of the analyzed data to construct a TDDF, along with the methods of realistic and interactive processing. Then the medium geometric element and a related realistic model are constructed by means of the first algorithm. Models for attaching the third dimension in a three-dimensional data field are presented. An example of TDDF realization for machine measuring is provided. Analysis of the resulting graphics indicates that the three-dimensional graphics built by the developed method feature good realism, fast processing, and strong interactivity.
Many search-based algorithms have been successfully applied in several software engineering activities. Genetic algorithms (GAs) are the most used by scholars in scientific domains to solve software testing problems; they imitate the theory of natural selection and evolution. The harmony search algorithm (HSA) is one of the most recent search algorithms; it imitates the behavior of a musician finding the best harmony. Scholars have estimated the similarities and differences between genetic algorithms and the harmony search algorithm in diverse research domains. The test data generation process represents a critical task in software validation. Unfortunately, there is no work comparing the performance of genetic algorithms and the harmony search algorithm in the test data generation process. This paper studies the similarities and differences between genetic algorithms and the harmony search algorithm based on the ability and speed of finding the required test data. The current research performs an empirical comparison of the HSA and the GAs, and then the significance of the results is estimated using the t-test. The study investigates the efficiency of the harmony search algorithm and the genetic algorithms according to (1) the time performance, (2) the significance of the generated test data, and (3) the adequacy of the generated test data to satisfy a given testing criterion. The results showed that the harmony search algorithm is significantly faster than the genetic algorithms, because the t-test showed that the p-value of the time values is 0.026 < α (where α is the significance level, 0.05 at a 95% confidence level). In contrast, there is no significant difference between the two algorithms in generating adequate test data, because the t-test showed that the p-value of the fitness values is 0.25 > α.
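The significance test used in this comparison can be sketched as a two-sample t statistic. The sketch below uses Welch's form (the abstract does not say which variant the authors used) with purely illustrative run-time samples; only the sign and magnitude of t matter for the illustration, not the numbers themselves.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample Welch t statistic (unequal variances, n-1 denominators)."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / ((variance(a) / na + variance(b) / nb) ** 0.5)

# Hypothetical generation-time samples (seconds) for the two algorithms:
hsa_times = [1.2, 1.1, 1.3, 1.0, 1.2]
ga_times = [1.8, 1.7, 2.0, 1.9, 1.6]
t = welch_t(hsa_times, ga_times)
print(round(t, 2))  # ≈ -7.34: strongly negative, i.e. HSA faster here
```

The p-value would then come from the t distribution with the appropriate degrees of freedom, which is where thresholds such as 0.026 < α = 0.05 arise.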
Test data compression and test resource partitioning (TRP) are essential to reduce the amount of test data in system-on-chip testing. A novel variable-to-variable-length compression code is designed as advanced frequency-directed run-length (AFDR) codes. Different from frequency-directed run-length (FDR) codes, AFDR encodes both 0-runs and 1-runs and assigns the same codes to equal-length runs. It also modifies the codes for 00 and 11 to improve compression performance. Experimental results for the ISCAS 89 benchmark circuits show that AFDR codes achieve a higher compression ratio than FDR and other compression codes.
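The run-extraction step underlying such codes can be sketched as follows. This is not the AFDR codeword assignment itself (the paper's code tables are not reproduced here); it only shows the preprocessing AFDR-style codes rely on, namely splitting a test-data stream into both 0-runs and 1-runs.

```python
def runs(bits):
    """Split a test-data bit string into (symbol, run_length) pairs,
    covering both 0-runs and 1-runs as AFDR-style codes require
    (plain FDR codes only target 0-runs)."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append((bits[i], j - i))
        i = j
    return out

print(runs("0001101111"))  # [('0', 3), ('1', 2), ('0', 1), ('1', 4)]
```

Each (symbol, length) pair would then be mapped to a variable-length codeword, with shorter codewords assigned to the more frequent run lengths.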
This paper presents a new test data compression/decompression method for SoC testing, called hybrid run-length codes. The method makes a full analysis of the factors which influence test parameters: compression ratio, test application time, and area overhead. To improve the compression ratio, the new method is based on variable-to-variable run-length codes, and a novel algorithm is proposed to reorder the test vectors and fill the unspecified bits in the pre-processing step. With a novel on-chip decoder, low test application time and low area overhead are obtained by the hybrid run-length codes. Finally, an experimental comparison on the ISCAS 89 benchmark circuits validates the proposed method.
A new structural damage identification method using limited static test displacements, based on grey system theory, is proposed in this paper. The grey relation coefficient of displacement curvature is defined and used to locate damage in the structure, and an iterative estimation scheme for solving nonlinear optimization programming problems based on the quadratic programming technique is used to identify the damage magnitude. A numerical example of a cantilever beam with single or multiple damages is used to examine the capability of the proposed grey-theory-based method to localize and identify damage. The factors of measurement noise and incomplete test data are also discussed. The numerical results showed that damage in the structure can be localized correctly using the grey relation coefficient of displacement curvature, and the damage magnitude can be identified with a high degree of accuracy, regardless of the number of measured displacement nodes. The proposed method only requires limited static test data, which is easily available in practice, and has wide applications in structural damage detection.
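The grey relation coefficient central to this method can be sketched with the standard grey relational formula. The paper's exact definition over displacement curvature is not given here, so the function below is a generic illustration with the customary distinguishing coefficient ζ = 0.5 and made-up curvature values.

```python
def grey_relation_coeffs(ref, cmp_seq, zeta=0.5):
    """Grey relational coefficients between a reference sequence (e.g. the
    undamaged displacement-curvature pattern) and a comparison sequence;
    zeta is the distinguishing coefficient, customarily 0.5."""
    deltas = [abs(r - c) for r, c in zip(ref, cmp_seq)]
    dmin, dmax = min(deltas), max(deltas)
    return [(dmin + zeta * dmax) / (d + zeta * dmax) for d in deltas]

# Hypothetical curvatures: the largest deviation (smallest coefficient)
# flags the candidate damage location, here node index 2.
coeffs = grey_relation_coeffs([1.0, 2.0, 3.0, 2.0], [1.1, 2.0, 3.6, 2.1])
print([round(c, 3) for c in coeffs])  # [0.75, 1.0, 0.333, 0.75]
```

A second stage (here, the paper's quadratic-programming iteration) would then quantify the damage magnitude at the flagged location.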
We developed an inversion technique to determine in situ stresses for elliptical boreholes of arbitrary trajectory. In this approach, borehole geometry, drilling-induced fracture information, and other available leak-off test data were used to construct a mathematical model, which was in turn applied to finding the inverse of an overdetermined system of equations. The method has been demonstrated by a case study in the Appalachian Basin, USA. The calculated horizontal stresses are in reasonable agreement with the reported regional stress study of the area, although there are no field measurement data for the studied well for direct calibration. The results also indicate that a 2% difference in the axes of the elliptical borehole geometry can cause a 5% difference in the minimum horizontal stress calculation and a 10% difference in the maximum horizontal stress calculation.
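Inverting an overdetermined system, as described above, is typically done in the least-squares sense. The sketch below is purely illustrative: the matrix, the "true" stresses, and the noise are invented, and the paper's actual model relating observations to stress components is not reproduced here.

```python
import numpy as np

# Hypothetical overdetermined system A x = b: more fracture / leak-off
# observations (rows) than unknown horizontal stress components (columns).
A = np.array([[1.0, 0.5],
              [0.8, 1.2],
              [1.5, 0.3],
              [0.2, 1.0]])
x_true = np.array([30.0, 20.0])                     # illustrative stresses, MPa
b = A @ x_true + np.array([0.1, -0.2, 0.05, 0.1])   # noisy observations

# Least-squares inverse of the overdetermined system:
x_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x_est, 1))
```

The recovered components stay close to the assumed values despite the measurement noise, which is the behavior such an inversion relies on.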
By analyzing some existing test data generation methods, a new automated test data generation approach is presented. The linear predicate functions on a given path are used directly to construct a linear constraint system for the input variables. Only when a predicate function is nonlinear does its linear arithmetic representation need to be computed. If all the predicate functions on the given path are linear, either the desired test data or a guarantee that the path is infeasible can be obtained from the solution of the constraint system. Otherwise, iterative refinement of the input is required to obtain the desired test data. Theoretical analysis and test results show that the approach is simple and effective, and requires less computation. The scheme can also be used to generate path-based test data for programs with arrays and loops.
The automatic generation of test data is a key step in realizing automated testing. Most automated testing tools for unit testing only provide test case execution drivers and cannot generate test data that meets coverage requirements. This paper presents an improved whale genetic algorithm for generating the test data required for MC/DC coverage in unit testing. The proposed algorithm introduces an elite retention strategy to prevent the genetic algorithm from falling into iterative degradation. At the same time, the mutation threshold of the whale algorithm is introduced to balance the global exploration and local search capabilities of the genetic algorithm. The threshold is dynamically adjusted according to the diversity and evolution stage of the current population, which positively guides the evolution of the population. Finally, an improved crossover strategy is proposed to accelerate the convergence of the algorithm. The improved whale genetic algorithm is compared with the genetic algorithm, the whale algorithm, and the particle swarm algorithm on two benchmark programs. The results show that the proposed algorithm generates test data faster than the comparison methods, provides better coverage with fewer evaluations, and has great advantages in test data generation.
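The elite retention strategy mentioned above can be sketched in a plain genetic algorithm. This is not the paper's whale genetic algorithm (the whale mutation threshold and improved crossover are omitted); it only demonstrates how copying the fittest individuals unchanged prevents iterative degradation of the best solution. The onemax fitness is a stand-in for a real coverage-based fitness.

```python
import random

def next_generation(pop, fitness, elite_k=2, mut_rate=0.1):
    """One GA step with elite retention: the elite_k fittest individuals
    survive unchanged, so the best fitness can never degrade."""
    ranked = sorted(pop, key=fitness, reverse=True)
    new_pop = ranked[:elite_k]                        # elites copied verbatim
    while len(new_pop) < len(pop):
        a, b = random.sample(ranked[: len(pop) // 2], 2)  # select parents
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]                     # one-point crossover
        if random.random() < mut_rate:                # single-bit mutation
            i = random.randrange(len(child))
            child = child[:i] + str(1 - int(child[i])) + child[i + 1:]
        new_pop.append(child)
    return new_pop

random.seed(0)
pop = ["".join(random.choice("01") for _ in range(8)) for _ in range(10)]
fit = lambda s: s.count("1")       # toy fitness: number of 1 bits
for _ in range(20):
    pop = next_generation(pop, fit)
best = max(pop, key=fit)
print(fit(best))
```

In an actual test data generator, each bit string would decode to concrete inputs and the fitness would reward satisfying uncovered MC/DC conditions.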
A separation method is proposed to design and improve shock absorbers according to the characteristics of each force, and the method is validated by rig test. The force measured during a rig test is the resultant of the damping force, the rebound force produced by the compressed air, and the friction force. The different characteristics of the damping force, air rebound force, and friction force can be applied to separate each force from the others. A mass-produced air-filled shock absorber is adopted for the validation. A static test is used to obtain the displacement-force curves, and the data are used as the input of the separation calculation. The tests are then carried out again to obtain the force data without the air rebound force. The force without air rebound is compared with the data derived from the former tests by the separation method. The result shows that this method can separate the damping force and the air elastic force.
On the basis of software testing tools we developed for programming languages, we first present a new control flow graph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relation between these criteria. Then, we define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.
Many multi-story or high-rise buildings consisting of a number of identical stories are usually considered as periodic spring-mass systems. General expressions for the natural frequencies, mode shapes, and the slopes and curvatures of the mode shapes of a periodic spring-mass system are derived in this paper by utilizing periodic structure theory. The sensitivities of these modal parameters with respect to structural damage, which do not depend on the physical parameters of the original structure, are obtained. Based on the sensitivity analysis of these modal parameters, a two-stage method is proposed to localize and quantify damage in multi-story or high-rise buildings. The slopes and curvatures of the mode shapes, which are highly sensitive to local damage, are used to localize the damage. Subsequently, the limited measured natural frequencies, which have better accuracy than the other modal parameters, are used to quantify the extent of damage within the potential damage locations. The experimental results of a 3-story experimental building demonstrate that single or multiple damages to buildings, whether slight or severe, can be correctly localized using only the slope or curvature of the mode shape in one of the lower modes, in which the change of natural frequency is the largest, and can be accurately quantified by the limited measured natural frequencies, even with noise pollution.
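Mode shape curvature, the damage indicator used above, is commonly approximated from measured mode shape ordinates by central differences. The sketch below is a generic illustration (not the paper's closed-form periodic-structure expressions), using an artificial quadratic mode shape so the curvature is constant and easy to check.

```python
def mode_shape_curvature(phi, h=1.0):
    """Approximate the curvature of a mode shape sampled at equally spaced
    floors (spacing h) by central differences; a local spike in this
    curvature indicates a candidate damage location."""
    return [(phi[i - 1] - 2 * phi[i] + phi[i + 1]) / h ** 2
            for i in range(1, len(phi) - 1)]

# Artificial mode shape phi(x) = x^2 sampled at 5 floors: curvature is 2
# everywhere, so an undamaged (smooth) shape produces no spike.
phi = [float(x) ** 2 for x in range(5)]
print(mode_shape_curvature(phi))  # [2.0, 2.0, 2.0]
```

On a damaged structure, the measured shape deviates locally from the smooth pattern and the corresponding curvature entry stands out.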
This paper introduces the background, aim, experimental design, configuration, and data processing for an airborne test flight of the HY-2 microwave scatterometer (HSCAT). The aim was to evaluate HSCAT performance and a developed data processing algorithm for the HSCAT before launch. There were three test flights of the scatterometer, on January 15, 18, and 22, 2010, over the South China Sea near Lingshui, Hainan. The test flights successfully generated simultaneous datasets of airborne scatterometer normalized radar cross section (NRCS), ASCAT wind, and ship-borne-measured wind, which were used to analyze HSCAT performance. The azimuthal dependence of the NRCS relative to the wind direction was nearly cos(2φ), with NRCS minima at crosswind directions and maxima near upwind and downwind. The NRCS also showed a small difference between the upwind and downwind directions, with upwind cross sections generally larger than those downwind. The dependence of airborne scatterometer NRCS on wind direction and speed showed favorable consistency with the NASA scatterometer geophysical model function (NSCAT GMF), indicating satisfactory HSCAT performance.
It is now recognized that many geomaterials have nonlinear failure envelopes. This nonlinearity is most marked at lower stress levels, the failure envelope being of quasi-parabolic shape. It is not easy to calibrate these nonlinear failure envelopes from triaxial test data. Currently, only the power-type failure envelope has been studied with an established formal procedure for its determination from triaxial test data. In this paper, a simplified procedure is developed for four different types of nonlinear envelopes. These are of invaluable assistance in the evaluation of true factors of safety in slope stability problems and in the correct computation of lateral earth pressure and bearing capacity. The use of Mohr-Coulomb failure envelopes leads to an overestimation of the factors of safety and other geotechnical quantities.
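Calibrating a power-type envelope, the one case the abstract says already has a formal procedure, can be sketched as a log-log least-squares fit. This is a generic illustration on synthetic points, not the paper's simplified procedure, and it assumes all stresses are strictly positive.

```python
from math import log, exp

def fit_power_envelope(sigma, tau):
    """Least-squares fit of a power-type failure envelope tau = A * sigma**b
    by linear regression in log-log space."""
    xs = [log(s) for s in sigma]
    ys = [log(t) for t in tau]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    A = exp(ybar - b * xbar)
    return A, b

# Synthetic triaxial-derived strength points lying on tau = 2 * sigma**0.7:
sigma = [50.0, 100.0, 200.0, 400.0]
tau = [2 * s ** 0.7 for s in sigma]
A, b = fit_power_envelope(sigma, tau)
print(round(A, 3), round(b, 3))  # 2.0 0.7
```

Real triaxial data would scatter around the fitted curve, and the quasi-parabolic steepening at low stress is exactly what the exponent b < 1 captures.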
Systems-on-a-chip with intellectual property cores need a large volume of data for testing. The large volume of test data requires a long testing time and a large test data memory, so new techniques are needed to optimize the test data volume, decrease the testing time, and overcome the ATE memory limitation for SoC designs. This paper presents a new test data compression method for intellectual property core-based systems-on-chip. The proposed method is based on new split-data variable length (SDV) codes that are designed using split-options along with identification bits in a string of test data. This paper analyses the reduction of test data volume, testing time, run time, and the size of memory required in the ATE, as well as the improvement of the compression ratio. Experimental results for the ISCAS 85 and ISCAS 89 benchmark circuits show that SDV codes outperform other compression methods, with the best compression ratio for test data compression. The decompression architecture for SDV codes is also presented for decoding the compressed bits. The proposed scheme shows that SDV codes can accommodate any variation in the input test data stream.
In this paper, a new organization for unit testing embedding a pair-wise mode is proposed, with the core idea of cooperation between programmer and tester through "cross-testing". Unit testing in the new organizing mode has three typical aspects: self-checking, cross-testing, and independent testing. Cross-testing, executed in the pair-wise mode, mainly tackles data testing, function testing, and state testing, where function testing must be done in detail and state testing must be considered for completeness. With the specialization of independent testing, it should be treated as more rigid testing without arbitrariness. Consequently, strategies and measures are addressed for data testing focusing on boundary testing and for function/state testing. The organizing procedure and the key points of tackling unit testing are also investigated for the new organizing mode. To assess the validity of our study and approach, a series of actual examples are demonstrated for GUI software. The results indicate that the execution of unit testing in the new organizing mode is effective and applicable.
Quantitatively correcting the unconfined compressive strength for sample disturbance is an important research topic in the practice of ocean engineering and geotechnical engineering. In this study, specimens of undisturbed natural marine clay obtained from the same depth at the same site were deliberately disturbed to different levels. Then, the specimens with different extents of sample disturbance were trimmed for both oedometer tests and unconfined compression tests. The degree of sample disturbance SD is obtained from the oedometer test data. The relationship between the unconfined compressive strength q_u and SD is studied to investigate the effect of sample disturbance on q_u. It is found that the value of q_u decreases linearly with increasing SD. A simple method of correcting q_u for sample disturbance is then proposed, and its validity is verified through analysis of existing published data.
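Because q_u is found to decrease linearly with SD, a correction back to the undisturbed state (SD = 0) amounts to extrapolating along the fitted line. The sketch below assumes that linear form; the slope and the sample values are purely illustrative, since the site-specific slope must come from paired oedometer and unconfined compression tests.

```python
def correct_qu(qu_measured, sd, slope):
    """Correct a measured unconfined compressive strength back to the
    undisturbed state (SD = 0), assuming the linear trend
    qu_measured = qu_undisturbed - slope * SD."""
    return qu_measured + slope * sd

# Hypothetical numbers: measured q_u = 38 kPa at SD = 0.2 with a fitted
# slope of 60 kPa per unit SD gives an undisturbed estimate of 50 kPa.
print(correct_qu(38.0, 0.2, 60.0))  # 50.0
```

The verification against published data described in the abstract would amount to checking that corrected strengths from differently disturbed specimens converge to a common undisturbed value.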
Software testing has been attracting a lot of attention for effective software development. In the model-driven approach, the Unified Modelling Language (UML) is a conceptual modelling approach for obligations and other features of the system. Specialized tools interpret these models into other software artifacts such as code, test data, and documentation. The generation of test cases permits the appropriate test data to be determined that have the aptitude to ascertain the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams by using a Basic Genetic Algorithm (BGA). For generating the test cases, both diagrams were converted into their corresponding intermediate graphical forms, namely the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). The two graphs were then joined to create a single graph known as the Activity State Chart Diagram Graph (ASCDG). Next, the ASCDG is optimized using the BGA to generate the test data. A case study involving a withdrawal from the automated teller machine (ATM) of a bank was employed to demonstrate the approach. The approach successfully identified defects in various ATM functions such as messaging and operation.
This paper introduces the high-speed electrical multiple unit (EMU) life cycle, including the design, manufacturing, testing, and maintenance stages. It also presents the train control and monitoring system (TCMS) software development platform, the TCMS testing and verification bench, the EMU driving simulation platform, and the EMU remote data transmission and maintenance platform. All these platforms and benches together make up the EMU life cycle cost (LCC) system. Each platform facilitates EMU LCC management and is an important part of the system.
Funding: Q.G. and P.M. acknowledge the support of ANR Flash COVID-19 MPCUII.
Funding: The research was funded by Universiti Teknologi Malaysia (UTM) and the Malaysian Ministry of Higher Education (MOHE) under the Industry-International Incentive Grant Scheme (IIIGS) (Vote Numbers: Q.J130000.3651.02M67 and Q.J130000.3051.01M86), and the Academic Fellowship Scheme (SLAM).
Funding: This project is supported by the National Natural Science Foundation of China (No. 50405009).
Funding: Supported by the National Natural Science Foundation of China (61076019, 61106018), the Aeronautical Science Foundation of China (20115552031), the China Postdoctoral Science Foundation (20100481134), the Jiangsu Province Key Technology R&D Program (BE2010003), the Nanjing University of Aeronautics and Astronautics Research Funding (NS2010115), and the Nanjing University of Aeronautics and Astronautics Initial Funding for Talented Faculty (1004-YAH10027).
Funding: Project supported by the Natural Science Foundation of China (No. 50378041) and the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20030487016), China.
Abstract: A new structural damage identification method using limited static test displacements, based on grey system theory, is proposed in this paper. The grey relation coefficient of displacement curvature is defined and used to locate damage in the structure, and an iterative estimation scheme for solving nonlinear optimization problems based on the quadratic programming technique is used to identify the damage magnitude. A numerical example of a cantilever beam with single or multiple damages is used to examine the capability of the proposed grey-theory-based method to localize and identify damage. The factors of measurement noise and incomplete test data are also discussed. The numerical results showed that damage in the structure can be localized correctly using the grey relation coefficient of displacement curvature, and the damage magnitude can be identified with a high degree of accuracy, regardless of the number of measured displacement nodes. The proposed method requires only limited static test data, which are easily available in practice, and has wide applications in structural damage detection.
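The two ingredients of the localization stage, displacement curvature and the grey relational coefficient, can be sketched in a few lines. The distinguishing coefficient rho = 0.5 is the customary choice and an assumption here, not taken from the paper.

```python
def curvature(disp, h=1.0):
    """Central-difference curvature of a displacement line (interior nodes),
    with uniform node spacing h."""
    return [(disp[i - 1] - 2 * disp[i] + disp[i + 1]) / h ** 2
            for i in range(1, len(disp) - 1)]

def grey_coeff(ref, test, rho=0.5):
    """Grey relational coefficients between a reference (undamaged) series
    and a measured series; assumes the series differ at least at one node.
    A low coefficient flags a candidate damage location."""
    deltas = [abs(r - t) for r, t in zip(ref, test)]
    dmin, dmax = min(deltas), max(deltas)
    return [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
```

In use, the node with the smallest coefficient between undamaged and measured curvature lines is taken as the damage location; the magnitude is then found by the paper's separate optimization step.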
Funding: Support of the United States Department of Energy (DE-FE0026825, UCFER: University Coalition for Fossil Energy Research).
Abstract: We developed an inversion technique to determine in situ stresses for elliptical boreholes of arbitrary trajectory. In this approach, borehole geometry, drilling-induced fracture information, and other available leak-off test data were used to construct a mathematical model, which was in turn applied to solving an overdetermined system of equations. The method has been demonstrated by a case study in the Appalachian Basin, USA. The calculated horizontal stresses are in reasonable agreement with the reported regional stress study of the area, although there are no field measurements of the studied well for direct calibration. The results also indicate that a 2% axis difference in the elliptical borehole geometry can cause a 5% difference in the minimum horizontal stress calculation and a 10% difference in the maximum horizontal stress calculation.
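The inversion reduces to the least-squares solution of an overdetermined linear system. A minimal two-unknown sketch via the normal equations follows; it is purely illustrative (e.g. for two horizontal principal stresses), not the authors' solver.

```python
def lstsq_2(A, b):
    """Least-squares solution of an overdetermined system A x = b with two
    unknowns, via the normal equations A^T A x = A^T b. A real inversion
    code would use a numerically robust factorisation instead."""
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * y for r, y in zip(A, b))
    b2 = sum(r[1] * y for r, y in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# three equations, two unknowns: consistent system recovers (1, 2) exactly
x = lstsq_2([[1, 0], [0, 1], [1, 1]], [1, 2, 3])
```

With more equations than unknowns, measurement noise is averaged out rather than fitted exactly, which is why extra leak-off and fracture observations improve the stress estimate.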
Abstract: By analyzing some existing test data generation methods, a new automated test data generation approach is presented. The linear predicate functions on a given path are used directly to construct a linear constraint system for the input variables. Only when a predicate function is nonlinear does its linear arithmetic representation need to be computed. If all predicate functions on the given path are linear, either the desired test data or a guarantee that the path is infeasible can be obtained from the solution of the constraint system. Otherwise, iterative refinement of the input is required to obtain the desired test data. Theoretical analysis and test results show that the approach is simple, effective, and computationally cheap. The scheme can also generate path-based test data for programs with arrays and loops.
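For paths whose predicates are all linear, the test data (or a proof of infeasibility) comes from solving the constraint system. A toy exhaustive solver over a small integer box illustrates the idea; a real implementation would use an LP/ILP solver rather than enumeration.

```python
from itertools import product

def solve_linear_path(constraints, bounds):
    """Find an integer input satisfying every linear predicate on a path,
    or report the path infeasible within the searched box. Constraints are
    (coeffs, rhs) pairs meaning sum(c * x) <= rhs. Exhaustive search stands
    in for a proper constraint solver and is only viable on tiny domains."""
    for point in product(*(range(lo, hi + 1) for lo, hi in bounds)):
        if all(sum(c * x for c, x in zip(coeffs, point)) <= rhs
               for coeffs, rhs in constraints):
            return point
    return None  # no satisfying input: path infeasible over the box

# path predicates: x + y <= 5 and x >= 3 (written as -x <= -3)
feasible = solve_linear_path([((1, 1), 5), ((-1, 0), -3)], [(0, 5), (0, 5)])
# contradictory predicates: x <= 0 and x >= 1
infeasible = solve_linear_path([((1,), 0), ((-1,), -1)], [(0, 5)])
```

Returning `None` corresponds to the paper's infeasibility guarantee: no input in the domain can drive execution down that path.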
Abstract: The automatic generation of test data is a key step in realizing automated testing. Most automated testing tools for unit testing only provide test case execution drivers and cannot generate test data that meets coverage requirements. This paper presents an improved whale genetic algorithm for generating the test data required for MC/DC coverage in unit testing. The proposed algorithm introduces an elite retention strategy to prevent the genetic algorithm from falling into iterative degradation. At the same time, the mutation threshold of the whale algorithm is introduced to balance the global exploration and local search capabilities of the genetic algorithm. The threshold is adjusted dynamically according to the diversity and evolution stage of the current population, which positively guides the evolution of the population. Finally, an improved crossover strategy is proposed to accelerate the convergence of the algorithm. The improved whale genetic algorithm is compared with the genetic algorithm, the whale algorithm, and particle swarm optimization on two benchmark programs. The results show that the proposed algorithm generates test data faster than the comparison methods, provides better coverage with fewer evaluations, and has great advantages in test data generation.
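The elite-retention and diversity-adapted mutation ideas can be sketched with a plain GA. The adaptation formula and all parameter values below are assumptions in the spirit of, not identical to, the proposed whale genetic algorithm.

```python
import random

def diversity(pop):
    """Mean pairwise normalised Hamming distance of a bitstring population;
    used here to adapt the mutation rate."""
    n = len(pop)
    d = sum(sum(a != b for a, b in zip(p, q))
            for i, p in enumerate(pop) for q in pop[i + 1:])
    return 2 * d / (n * (n - 1) * len(pop[0]))

def evolve(fitness, length=16, size=20, gens=100):
    """GA sketch minimising `fitness` over bitstrings, with elite retention
    and a mutation rate that rises as population diversity drops."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[0][:]                         # elite retention
        pm = 0.05 + 0.25 * (1 - diversity(pop))   # mutate more when converged
        nxt = [elite]
        while len(nxt) < size:
            a, b = random.sample(pop[:size // 2], 2)   # parents from best half
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]                  # one-point crossover
            nxt.append([g ^ (random.random() < pm) for g in child])
        pop = nxt
    return min(pop, key=fitness)

random.seed(7)                 # reproducibility for the toy run
best_ind = evolve(sum)         # toy objective: minimise the number of 1-bits
```

Keeping the elite guarantees the best fitness never degrades between generations, which is exactly the "iterative degradation" the paper's elite retention strategy prevents.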
Abstract: A separation method is proposed to design and improve shock absorbers according to the characteristics of each force. The method is validated by rig tests. The force measured during a rig test is the resultant of the damping force, the rebound force produced by compressed air, and the friction force. The different characteristics of the damping force, air rebound force, and friction force can be used to separate each force from the others. A mass-produced air-filled shock absorber is adopted for the validation. A static test is used to obtain the displacement-force curves, and the data are used as the input of the separation calculation. The tests are then carried out again to obtain the force data without the air rebound force. The force without air rebound is compared with the data derived from the former tests using the separation method. The result shows that this method can separate the damping force and the air elastic force.
Abstract: On the basis of software testing tools we developed for programming languages, we first present a new control flow graph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relation between these criteria. Then, we define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.
Funding: Project supported by the National Natural Science Foundation of China (No. 50378041) and the Specialized Research Fund for Doctoral Programs of Higher Education (No. 20030487016).
Abstract: Many multi-story or high-rise buildings consisting of a number of identical stories are usually considered as periodic spring-mass systems. General expressions for the natural frequencies, mode shapes, and the slopes and curvatures of mode shapes of the periodic spring-mass system are derived in this paper using periodic structure theory. The sensitivities of these modal parameters with respect to structural damage, which do not depend on the physical parameters of the original structure, are obtained. Based on the sensitivity analysis of these modal parameters, a two-stage method is proposed to localize and quantify damage in multi-story or high-rise buildings. The slopes and curvatures of mode shapes, which are highly sensitive to local damage, are used to localize the damage. Subsequently, the limited measured natural frequencies, which have better accuracy than the other modal parameters, are used to quantify the extent of the damage at the potential damage locations. The experimental results of a 3-story experimental building demonstrate that single or multiple damages, whether slight or severe, can be correctly localized using only the slope or curvature of the mode shape in one of the lower modes, in which the change of natural frequency is the largest, and can be accurately quantified by the limited measured natural frequencies even with noise pollution.
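For the chain of N identical masses and springs that idealises an N-story building, periodic structure theory gives the natural frequencies in closed form. A minimal sketch, assuming fixed-free boundary conditions (base fixed, roof free):

```python
from math import sin, pi, sqrt

def chain_frequencies(N, k, m):
    """Natural frequencies (rad/s) of a fixed-free chain of N identical
    masses m joined by springs of stiffness k: the classic periodic
    spring-mass idealisation of an N-story shear building."""
    return [2 * sqrt(k / m) * sin((2 * j - 1) * pi / (4 * N + 2))
            for j in range(1, N + 1)]
```

For N = 1 this collapses to the familiar single-degree-of-freedom result sqrt(k/m); damage (a reduced story stiffness) perturbs these frequencies, which is what the paper's quantification stage inverts.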
Funding: Supported by the National Natural Science Foundation of China (No. 41106152), the National Science and Technology Support Program of China (No. 2013BAD13B01), the National High Technology Research and Development Program of China (863 Program) (No. 2013AA09A505), the International Science & Technology Cooperation Program of China (No. 2011DFA22260), the National High Technology Industrialization Project (No. [2012]2083), and the Marine Public Projects of China (Nos. 201105032, 201305032, 201105002-07).
Abstract: This paper introduces the background, aim, experimental design, configuration, and data processing for an airborne test flight of the HY-2 microwave scatterometer (HSCAT). The aim was to evaluate HSCAT performance and the developed data processing algorithm before launch. There were three test flights of the scatterometer, on January 15, 18, and 22, 2010, over the South China Sea near Lingshui, Hainan. The test flights successfully generated simultaneous airborne scatterometer normalized radar cross-section (NRCS), ASCAT wind, and ship-measured wind datasets, which were used to analyze HSCAT performance. The azimuthal dependence of the NRCS relative to the wind direction was nearly cos(2φ), with NRCS minima at crosswind directions and maxima near upwind and downwind. The NRCS also showed a small difference between the upwind and downwind directions, with upwind cross-sections generally larger than those downwind. The dependence of the airborne scatterometer NRCS on wind direction and speed showed favorable consistency with the NASA scatterometer geophysical model function (NSCAT GMF), indicating satisfactory HSCAT performance.
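The reported azimuthal behaviour corresponds to a truncated Fourier model of the NRCS. With uniformly spaced azimuth samples the harmonic amplitudes follow from direct projection; this is a sketch of the model form, not the HSCAT processing chain.

```python
from math import cos, pi

def fourier_fit(nrcs):
    """Project NRCS samples at uniformly spaced azimuths onto the truncated
    series A0 + A1*cos(phi) + A2*cos(2*phi) used in scatterometer GMFs.
    A1 captures the up/downwind asymmetry, A2 the dominant cos(2*phi) term."""
    n = len(nrcs)
    phis = [2 * pi * i / n for i in range(n)]
    A0 = sum(nrcs) / n
    A1 = 2 * sum(v * cos(p) for v, p in zip(nrcs, phis)) / n
    A2 = 2 * sum(v * cos(2 * p) for v, p in zip(nrcs, phis)) / n
    return A0, A1, A2

# synthetic check: samples built from known amplitudes are recovered exactly
n = 8
phis = [2 * pi * i / n for i in range(n)]
samples = [1.0 + 0.2 * cos(p) + 0.5 * cos(2 * p) for p in phis]
A0, A1, A2 = fourier_fit(samples)
```

A positive A1 reproduces the observation that upwind cross-sections exceed downwind ones, while A2 produces the crosswind minima.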
Abstract: It is now recognized that many geomaterials have nonlinear failure envelopes. This nonlinearity is most marked at lower stress levels, where the failure envelope has a quasi-parabolic shape. It is not easy to calibrate these nonlinear failure envelopes from triaxial test data. Currently, only the power-type failure envelope has an established formal procedure for its determination from triaxial test data. In this paper, a simplified procedure is developed for fitting four different types of nonlinear envelope. These are of invaluable assistance in the evaluation of true factors of safety in slope stability problems and in the correct computation of lateral earth pressure and bearing capacity. The use of Mohr-Coulomb failure envelopes leads to an overestimation of the factors of safety and other geotechnical quantities.
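For the power-type envelope, calibration reduces to linear regression in log-log space. A minimal sketch with hypothetical data, as a simplified stand-in for the formal procedure:

```python
from math import log, exp

def fit_power_envelope(sigma, tau):
    """Calibrate a power-type envelope tau = A * sigma**b from (sigma, tau)
    strength pairs derived from triaxial tests, by linear regression of
    ln(tau) on ln(sigma). Assumes positive stresses."""
    xs = [log(s) for s in sigma]
    ys = [log(t) for t in tau]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return exp(my - b * mx), b

# hypothetical strength data lying exactly on tau = 3 * sigma**0.5
A, b = fit_power_envelope([1.0, 2.0, 4.0], [3.0, 3.0 * 2 ** 0.5, 6.0])
```

An exponent b < 1 gives the quasi-parabolic shape the abstract describes, with the envelope flattening relative to Mohr-Coulomb at higher stresses.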
Abstract: Systems-on-chip with intellectual property cores need a large volume of data for testing. The large volume of test data requires a long testing time and large test data memory. Therefore, new techniques are needed to optimize the test data volume, decrease the testing time, and overcome the ATE memory limitation for SoC designs. This paper presents a new test data compression method for intellectual-property-core-based systems-on-chip. The proposed method is based on new split-data variable-length (SDV) codes that are designed using split options along with identification bits in a test data string. This paper analyses the reduction of test data volume, testing time, run time, and the size of memory required in the ATE, as well as the improvement of the compression ratio. Experimental results for the ISCAS 85 and ISCAS 89 benchmark circuits show that SDV codes outperform other compression methods, with the best compression ratio for test data compression. The decompression architecture for decoding the compressed bits is also presented. The proposed scheme shows that SDV codes are robust to variations in the input test data stream.
Abstract: In this paper, a new organization for unit testing embedding a pair-wise mode is proposed, with the core idea focused on the cooperation of programmer and tester through "cross-testing". Unit testing under the new organizing mode has three aspects: self-checking, cross-testing, and independent-testing. Cross-testing, executed in "pair-wise" mode, mainly tackles data testing, function testing, and state testing, where function testing must be done in detail and state testing must be considered for completeness. With the specialization of independent-testing, it should be treated as more rigid testing without arbitrariness. Consequently, strategies and measures are addressed for data testing focusing on boundary testing, and for function/state testing. The organizing procedure and the key points of unit testing under the new mode are also investigated. To assess the validity of our study and approach, a series of actual examples is demonstrated for GUI software. The results indicate that unit testing under the new organizing mode is effective and applicable.
Abstract: Quantitatively correcting the unconfined compressive strength for sample disturbance is an important research topic in ocean engineering and geotechnical engineering practice. In this study, specimens of undisturbed natural marine clay obtained from the same depth at the same site were deliberately disturbed to different levels. The specimens, with different extents of sample disturbance, were then trimmed for both oedometer tests and unconfined compression tests. The degree of sample disturbance SD is obtained from the oedometer test data. The relationship between the unconfined compressive strength qu and SD is studied to investigate the effect of sample disturbance on qu. It is found that the value of qu decreases linearly with increasing SD. A simple method of correcting qu for sample disturbance is then proposed, and its validity is verified through analysis of existing published data.
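Given the linear qu-SD trend, the correction amounts to fitting a line through the measured pairs and extrapolating to SD = 0. A sketch with hypothetical numbers:

```python
def correct_qu(sd, qu):
    """Fit qu = intercept + slope * SD by least squares and return
    (slope, intercept); the intercept is the strength extrapolated to zero
    sample disturbance, i.e. the corrected undisturbed value."""
    n = len(sd)
    mx, my = sum(sd) / n, sum(qu) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(sd, qu))
             / sum((x - mx) ** 2 for x in sd))
    return slope, my - slope * mx

# hypothetical data: strength (kPa) falling linearly with disturbance
slope, qu0 = correct_qu([0.1, 0.2, 0.3], [90.0, 80.0, 70.0])
```

Here the negative slope reflects the reported strength loss with disturbance, and `qu0` is the corrected strength a perfectly undisturbed specimen would show.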
Funding: Support from the Deanship of Scientific Research, University of Hail, Saudi Arabia, through the project Ref. (RG-191315).
Abstract: Software testing has been attracting a lot of attention for effective software development. In the model-driven approach, the Unified Modelling Language (UML) is a conceptual modelling approach for capturing obligations and other features of the system. Specialized tools interpret these models into other software artifacts such as code, test data, and documentation. The generation of test cases permits appropriate test data to be determined that have the ability to ascertain the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams by using a Basic Genetic Algorithm (BGA). For generating the test cases, both diagrams were converted into their corresponding intermediate graphical forms, namely the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). Both graphs were then joined to create a single graph known as the Activity State Chart Diagram Graph (ASCDG). Next, the ASCDG was optimized using the BGA to generate the test data. A case study involving a withdrawal from a bank's automated teller machine (ATM) was employed to demonstrate the approach. The approach successfully identified defects in various ATM functions such as messaging and operation.
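Each test scenario corresponds to a path through the combined graph. A depth-first enumeration over a toy ASCDG (node labels hypothetical) shows the candidate scenarios whose input data the BGA would then optimise:

```python
def paths(graph, start, end):
    """Enumerate all simple paths from start to end in a diagram graph
    given as an adjacency dict; each path is one candidate test scenario."""
    stack = [(start, [start])]
    out = []
    while stack:
        node, path = stack.pop()
        if node == end:
            out.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:           # keep paths simple (no revisits)
                stack.append((nxt, path + [nxt]))
    return out

# toy ASCDG fragment for an ATM withdrawal (hypothetical node names)
atm = {"insert_card": ["enter_pin"],
       "enter_pin": ["withdraw", "eject"],
       "withdraw": ["eject"]}
scenarios = paths(atm, "insert_card", "eject")
```

The two resulting paths, with and without the withdrawal step, each become a test case for which concrete input data (PIN, amount) must still be generated.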
Abstract: This paper introduces the high-speed electrical multiple unit (EMU) life cycle, including the design, manufacturing, testing, and maintenance stages. It also presents the train control and monitoring system (TCMS) software development platform, the TCMS testing and verification bench, the EMU driving simulation platform, and the EMU remote data transmittal and maintenance platform. Together, these platforms and benches make up the EMU life cycle cost (LCC) system. Each platform facilitates EMU LCC management and is an important part of the system.