Recent advancements in satellite technologies and the declining cost of access to space have led to the emergence of large satellite constellations in Low Earth Orbit (LEO). However, these constellations often rely on a bent-pipe architecture, resulting in high communication costs. Existing onboard inference architectures suffer from low accuracy and inflexibility in the deployment and management of in-orbit applications. To address these challenges, we propose a cloud-native satellite design specifically tailored for Earth Observation tasks, enabling diverse computing paradigms. In this work, we present a case study of a satellite-ground collaborative inference system deployed in the Tiansuan constellation, demonstrating a remarkable 50% accuracy improvement and a substantial 90% data reduction. Our work also sheds light on in-orbit energy consumption: in-orbit computing accounts for 17% of the total onboard energy budget. Our approach represents a significant advancement in cloud-native satellites, enhancing the accuracy of in-orbit computing while simultaneously reducing communication costs.
Safety-related accidents involving buildings and civil engineering structures have been reported all over the world. With the increasing importance of securing the safety of social infrastructure and maintaining optimum performance levels to prevent these accidents, much attention has been devoted to monitoring performance degradation due to structural defects and deterioration. In this study, an algorithm was developed to evaluate the safety of structures by analyzing signals in the time domain and frequency domain, and the developed algorithm was verified through a forced vibration test. From the time-domain and frequency-domain data analyses, both methods derived highly accurate damage detection results for each sensor location.
To deal with the limitations of register-transfer-level verification, a new functional verification method based on random testing at the system level of a system-on-chip is proposed, and its validity is proven theoretically. Test cases are generated according to several randomization approaches, and the testbench for system-level verification is designed using an advanced modeling language. Because the testbench generates test cases quickly, hardware/software co-simulation and co-verification can be implemented and hardware/software partitioning plans can be evaluated easily. A comparison method is used to evaluate testing validity. The evaluation indicates that, although random testing is generally more efficient than partition testing, partition testing becomes more efficient when one or more subdomains coincide with error-prone areas. Experimental results indicate that the method performs well in functional coverage and testing cost and can discover functional errors early.
The optocoupler is a weak link in the inertial navigation platform of a kind of guided munition. Accelerated storage tests are necessary to verify the storage life of long-storage products, and for small-sample products in particular, obtaining prior information is very important for the design and implementation of accelerated degradation tests. In this paper, an optocoupler failure-mechanism verification test is designed, the experimental results are analyzed, and the prior information is obtained. The results show that optocouplers have two failure modes: sudden failure and degradation failure. The maximum temperature stress of the optocoupler cannot exceed 140 °C. The increase of leakage current is caused by movable ions contaminating the LED chip, and the surface leakage current is proportional to the adsorption amount. The increased leakage current induces p-n junction tunneling, which leads to the failure of the optocoupler. The lifetime distribution model of the optocoupler is determined from the physics of failure: the lifetime follows a lognormal distribution. The degradation trajectory of the optocoupler leakage current is described by a power-law model; initial estimates of the trajectory parameters are calculated, and the parameters of the life distribution function are deduced. This information lays a good foundation for the optimal design and data processing of the accelerated degradation experiment.
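The power-law degradation path and pseudo-failure extrapolation described above can be sketched numerically. This is a minimal illustration with synthetic leakage-current data, not the authors' test data; the helper names and the 40 µA failure threshold are assumptions.

```python
import math

def fit_power_law(times, values):
    """Fit D(t) = a * t**b by ordinary least squares in log-log space."""
    xs = [math.log(t) for t in times]
    ys = [math.log(v) for v in values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def pseudo_failure_time(a, b, threshold):
    """Time at which the fitted path crosses the failure threshold."""
    return (threshold / a) ** (1.0 / b)

# Illustrative leakage readings (t in hours); synthetic data on an exact power law
times = [100, 200, 400, 800]
values = [1.0 * t ** 0.5 for t in times]
a, b = fit_power_law(times, values)
print(round(a, 3), round(b, 3))                   # recovers a = 1.0, b = 0.5
print(round(pseudo_failure_time(a, b, 40.0), 1))  # crosses 40 at t = 1600.0 h
```

Pseudo-failure times extrapolated this way per unit are the usual inputs to fitting a lognormal life distribution.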
To generate a test set for a given circuit (combinational or sequential), the choice among the many existing test generation algorithms is bound to vary from circuit to circuit. In this paper, genetic algorithms are used to construct models of existing test generation algorithms, making such a choice easier. We can therefore forecast the testability parameters of a circuit before running the real test generation algorithm, and the results can also be used to evaluate the efficiency of the existing test generation algorithms. Experimental results are given to demonstrate the validity and usefulness of this approach.
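A generic genetic-algorithm skeleton of the kind the abstract relies on can be sketched as follows. The bit-string encoding, parameters, and the OneMax stand-in fitness are illustrative assumptions; the abstract does not specify how the authors encode test generation algorithms or circuits.

```python
import random

def genetic_search(fitness, n_bits=16, pop_size=30, generations=60,
                   p_cross=0.9, p_mut=0.02, seed=1):
    """Generic GA over fixed-length bit strings: tournament selection,
    one-point crossover, bit-flip mutation, elitist best-so-far tracking."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            c1, c2 = p1[:], p2[:]
            if rng.random() < p_cross:
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for c in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        c[i] ^= 1
                nxt.append(c)
        pop = nxt[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# Stand-in fitness: count of set bits ("OneMax"); a real model would score
# predicted testability parameters against measured ones.
best = genetic_search(sum)
print(sum(best))  # typically reaches the all-ones optimum (16)
```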
To address the diagnosis of Wire-OR (W-O) interconnect faults on PCBs (Printed Circuit Boards), five modified boundary-scan adaptive algorithms for interconnect testing are put forward. These algorithms replace the equal-weight algorithm of the primary test with a global-diagnosis sequence algorithm, shortening the test time without changing the fault diagnostic capability. The five modified adaptive test algorithms are described, and a capability comparison between the modified and original algorithms is made to prove their validity.
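Adaptive boundary-scan interconnect tests build on the textbook counting-sequence construction, sketched below; this is the classic scheme, not the paper's five modified algorithms, and the helper name is ours.

```python
import math

def counting_sequence_vectors(n_nets):
    """Classic counting-sequence interconnect test set: net i is assigned
    the binary code of i+1 over w = ceil(log2(n_nets + 2)) bits, skipping
    the all-0 and all-1 codes so every net toggles at least once.
    Parallel test vector t is the t-th bit column across all nets."""
    w = max(1, math.ceil(math.log2(n_nets + 2)))
    codes = [format(i + 1, "0{}b".format(w)) for i in range(n_nets)]
    # transpose: one parallel test vector per bit position
    vectors = ["".join(code[t] for code in codes) for t in range(w)]
    return codes, vectors

codes, vectors = counting_sequence_vectors(6)
print(codes)    # ['001', '010', '011', '100', '101', '110']
print(vectors)  # ['000111', '011001', '101010'] - 3 vectors vs 6 walking-ones
```

Because all codes are distinct, any short between two nets produces a response that differs from the expected code on at least one vector, which is what adaptive schemes then refine to pinpoint Wire-OR faults.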
Many motors are in operation or on standby in nuclear power plants, and group motor startup has a great impact on the voltage of the emergency bus. At present there is no dedicated or inexpensive software to solve this problem, and engineering experience is not accurate enough. This paper therefore develops a method and system for group-motor startup calculation in nuclear power plants and proposes an automatic generation method for plant circuit topology. Each component in the topology is given a unique number, and a component class can be constructed according to its type and its upstream and downstream connections. The program can quickly generate the subordination and topology relationships of switches, buses, and motors from the component classes, and a simplified direct power flow algorithm uses these relationships to calculate the power flow during group-motor startup. It can then be judged whether the bus voltage is in the safe range and whether the voltage exceeds its limit during startup. A practical example verifies the effectiveness of the method; compared with other professional software, it offers high efficiency and low cost.
Technical debt (TD) arises when project teams make technical decisions in favor of short-term goals in their projects, whether deliberately or unknowingly. TD must be properly managed to guarantee that its negative implications do not outweigh its advantages, and much research has shown that it has evolved into a common problem with a considerable financial burden. Test technical debt (or test debt) is the technical-debt aspect of testing, a relatively new concept that has piqued the curiosity of the software industry in recent years. In this article, we assume that the organization selects the testing artifacts at the start of every sprint; implementing the latest features in consideration of expected business value and repaying technical debt are among the candidate tasks of the testing process (test-case increments). To gain the maximum benefit for the organization in terms of software testing optimization, the artifacts (i.e., test cases) with maximum feature coverage must be selected within the available resources. Managing testing optimization for large projects is complicated and can be treated as a multi-objective problem entailing a trade-off between the agile software's short-term and long-term value. In this article, we implement a multi-objective indicator-based evolutionary algorithm (IBEA) for such optimization problems. The capability of the algorithm is evidenced by applying it to a real case study of a university registration process.
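The additive-epsilon indicator at the heart of IBEA (Zitzler and Künzli's indicator-based fitness) can be sketched as follows; the two-objective values are illustrative stand-ins for (testing cost, uncovered features), and only the fitness assignment is shown, not the full evolutionary loop.

```python
import math

def eps_indicator(a, b):
    """Additive epsilon indicator for minimization: the smallest eps by
    which objective vector a must be shifted to weakly dominate b."""
    return max(fa - fb for fa, fb in zip(a, b))

def ibea_fitness(population, kappa=0.05):
    """IBEA fitness: each individual is penalised by the indicator values
    of all other individuals measured against it; lower is worse."""
    return [
        sum(-math.exp(-eps_indicator(other, x) / kappa)
            for other in population if other is not x)
        for x in population
    ]

# Illustrative 2-objective values, both to be minimized:
pop = [(0.2, 0.8), (0.5, 0.5), (0.9, 0.1), (0.6, 0.7)]
fit = ibea_fitness(pop)
worst = min(range(len(pop)), key=lambda i: fit[i])
print(pop[worst])  # (0.6, 0.7): dominated by (0.5, 0.5), so it scores worst
```

Environmental selection in IBEA repeatedly removes the worst-fitness individual and recomputes, which is how the trade-off front between short-term and long-term value is maintained.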
Two new regularization algorithms for solving the first-kind Volterra integral equation that describes the pressure-rate deconvolution problem in well-test data interpretation are developed in this paper. The main features of the problem are the strongly nonuniform scale of the solution and large errors (up to 15%) in the input data. In both algorithms, the solution is represented as a decomposition on special basis functions that satisfy the given a priori information on the solution; this idea allows us to significantly improve the quality of the approximate solution and to simplify the minimization problem. The theoretical details of the algorithms, as well as numerical experiments demonstrating their robustness, are presented.
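The problem the abstract describes has a standard formulation, sketched here from the general theory of ill-posed problems rather than from the authors' specific algorithms: the measured pressure drop is a first-kind Volterra convolution of the flow rate with the unknown unit-rate response, and regularization restricts the solution to a span of basis functions carrying the a priori information.

```latex
\Delta p(t) = \int_0^t q(\tau)\, g(t-\tau)\, d\tau =: (Ag)(t),
\qquad
g(\tau) = \sum_{k=1}^{m} c_k\,\varphi_k(\tau),
\qquad
M^{\alpha}[c] = \lVert Ag - \Delta p \rVert^2 + \alpha\,\Omega[g] \;\to\; \min_{c}
```

Choosing basis functions $\varphi_k$ that already satisfy the a priori constraints both stabilizes the inversion against the 15% input errors and reduces the minimization to a low-dimensional problem in the coefficients $c_k$.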
The methods and strategies used to screen for syphilis and to confirm initially reactive results can vary significantly across clinical laboratories. While the performance characteristics of these different approaches have been evaluated by multiple studies, there is not, as of yet, a single, universally recommended algorithm for syphilis testing. To clarify the currently available options for syphilis testing, this update will summarize the clinical challenges to diagnosis, review the specific performance characteristics of treponemal and non-treponemal tests, and finally summarize select studies published over the past decade which have evaluated these approaches. Specifically, this review will discuss the traditional and reverse sequence syphilis screening algorithms commonly used in the United States, alongside a discussion of the European Centre for Disease Prevention and Control syphilis algorithm. Ultimately, in the United States, the decision of which algorithm to use is largely dependent on laboratory resources, the local incidence of syphilis, and patient demographics. Key words: Syphilis; Treponemal infection; Immunoassay; Reverse sequence screening; Rapid plasma reagin; Treponema pallidum particle agglutination test; Automation; Algorithm; Primary infection; Late latent infection
This paper presents techniques of verification and Test Generation (TG) for sequential machines (Finite State Machines, FSMs) based on state traversal of the State Transition Graph (STG). The problems of traversal, redundancy, and the transition fault model are identified. To achieve high fault coverage, collapsing testing is proposed. Further, heuristic knowledge for speeding up verification and TG is described.
An algorithm for evaluating the fiber orientation distribution function (ODF) by the laser scattering method, based on a 2-dimensional model of fiber arrangement, and a method of determining diffuse scattering intensity are presented. The fiber ODFs of nonwoven samples measured by the computer-program-controlled laser scattering intensity testing system are compared with those obtained by the microprojector method. The results show that the algorithm is feasible for assessing the fiber ODFs of nonwoven fabrics manufactured by different processing methods.
[Objectives] This paper aimed to select drugs rationally for the treatment of rex rabbit colibacillosis and to isolate Escherichia coli and study its pathogenicity. [Methods] Pathogen isolation, a drug sensitivity test, and a pathogen regression test were performed on rex rabbits killed by E. coli in the clinic. [Results] The isolate was E. coli O-23, susceptible to amikacin and cefotaxime sodium; when the challenge dose was 1.0 mL/rabbit (about one billion E. coli), the test animals discharged mucous feces. [Conclusions] The results provide model support for clinical drug selection against rex rabbit colibacillosis.
Test Case Prioritization (TCP) techniques perform better than other regression test optimization techniques, including Test Suite Reduction (TSR) and Test Case Selection (TCS). Many TCP techniques are available, and their performance is usually measured with the metric Average Percentage of Fault Detection (APFD). This metric is value-neutral: it only works well when all test cases have the same cost and all faults have the same severity, so using APFD to evaluate test-case orders where test-case cost or fault severity varies is prone to produce false results. Choosing the right metric for performance evaluation of TCP techniques is therefore very important for obtaining reliable and correct results. In this paper, two value-based TCP techniques using a Genetic Algorithm (GA) are introduced: Value-Cognizant Fault Detection-Based TCP (VCFDB-TCP) and Value-Cognizant Requirements Coverage-Based TCP (VCRCB-TCP). Two novel value-based performance evaluation metrics are also introduced: Average Percentage of Fault Detection per value (APFDv) and Average Percentage of Requirements Coverage per value (APRCv). Two case studies validate the proposed techniques and metrics. The proposed GA-based techniques outperformed existing state-of-the-art TCP techniques including Original Order (OO), Reverse Order (REV-O), Random Order (RO), and a greedy algorithm.
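The baseline APFD metric that the abstract's APFDv generalizes is easy to compute. The sketch below implements the standard formula only (the value-based APFDv is defined in the paper itself, so it is not reproduced here); the fault matrix is illustrative.

```python
def apfd(order, faults, n_faults):
    """Average Percentage of Fault Detection for one test-case order.

    order    - test case ids in execution order
    faults   - mapping test id -> set of fault ids it detects
    n_faults - total number of faults

    APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2n), where TFi is the
    1-based position of the first test revealing fault i, n the number of
    tests, and m the number of faults."""
    n = len(order)
    first_pos = {}
    for pos, tid in enumerate(order, start=1):
        for f in faults.get(tid, set()):
            first_pos.setdefault(f, pos)
    return 1.0 - sum(first_pos.values()) / (n * n_faults) + 1.0 / (2 * n)

faults = {"t1": {1}, "t2": {1, 2, 3}, "t3": set(), "t4": {4}}
print(apfd(["t2", "t4", "t1", "t3"], faults, 4))  # 0.8125: faults found early
print(apfd(["t3", "t1", "t4", "t2"], faults, 4))  # 0.3125: faults found late
```

The value-neutrality criticism in the abstract is visible here: the formula counts only positions, so a cheap test and an expensive one, or a cosmetic and a critical fault, contribute identically.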
Many search-based algorithms have been successfully applied in several software engineering activities. Genetic algorithms (GAs), which imitate the theory of natural selection and evolution, are the most used by scholars across scientific domains to solve software testing problems. The harmony search algorithm (HSA), one of the most recent search algorithms, imitates the behavior of a musician finding the best harmony. Scholars have estimated the similarities and differences between genetic algorithms and the harmony search algorithm in diverse research domains, but no work has compared their performance in the test data generation process, a critical task in software validation. This paper studies the similarities and differences between genetic algorithms and the harmony search algorithm based on their ability and speed in finding the required test data. The research performs an empirical comparison of the HSA and the GAs and estimates the significance of the results using the t-test. The study investigates the efficiency of the two algorithms according to (1) time performance, (2) the significance of the generated test data, and (3) the adequacy of the generated test data to satisfy a given testing criterion. The results showed that the harmony search algorithm is significantly faster than the genetic algorithms, because the t-test showed that the p-value of the time values is 0.026 < α (where α is the significance level of 0.05 at the 95% confidence level). In contrast, there is no significant difference between the two algorithms in generating adequate test data, because the t-test showed that the p-value of the fitness values is 0.25 > α.
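The significance comparison the abstract performs can be illustrated with a distribution-free permutation test on the mean runtime difference (shown here instead of the parametric t-test to stay dependency-free; all runtimes are synthetic stand-ins).

```python
import random
from statistics import mean

def perm_test_pvalue(sample_a, sample_b, n_perm=20000, seed=0):
    """Two-sided permutation test on the difference of means: the p-value
    is the fraction of random relabelings whose mean difference is at
    least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean(sample_a) - mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    na = len(sample_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(mean(pooled[:na]) - mean(pooled[na:])) >= observed:
            hits += 1
    return hits / n_perm

# Illustrative runtimes (seconds) for generating one adequate test suite:
hsa_times = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 1.0]
ga_times = [1.9, 2.4, 2.1, 1.8, 2.6, 2.0, 2.2, 2.3]
p = perm_test_pvalue(hsa_times, ga_times)
print(p < 0.05)  # True: the runtime difference is significant at the 5% level
```

The permutation test reaches the same kind of conclusion as the paper's t-test (reject or fail to reject at α = 0.05) without assuming normally distributed runtimes.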
We have recently developed a systematic method for studying the inheritance of resistance to sheath blight. The key to the system is an innovative method of inoculation and investigation along with the employment of a permanent population. This paper reports the procedure of the system and the result of its verification.
As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to location-verification techniques in wireless networks, with a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on formal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.
Software testing has been attracting much attention for effective software development. In model-driven approaches, the Unified Modelling Language (UML) is a conceptual modelling approach for obligations and other features of the system; specialized tools interpret these models into other software artifacts such as code, test data, and documentation. Generating test cases permits the appropriate test data to be determined that can ascertain the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams by using a Basic Genetic Algorithm (BGA). To generate the test cases, both diagrams are converted into their corresponding intermediate graphical forms, the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). The two graphs are then joined to create a single graph known as the Activity State Chart Diagram Graph (ASCDG), which is optimized using the BGA to generate the test data. A case study involving a withdrawal from the automated teller machine (ATM) of a bank demonstrates the approach, which successfully identified defects in various ATM functions such as messaging and operation.
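The graph-merging step (ADG plus SCDG into an ASCDG) and the enumeration of candidate test sequences can be sketched as follows. All node names and the bridge edges linking activities to states are illustrative, and the BGA optimization step is omitted.

```python
def merge_graphs(adg, scdg, bridges):
    """Merge an Activity Diagram Graph and a State Chart Diagram Graph
    into one ASCDG; `bridges` are edges linking activities to the states
    they trigger. All node names used below are illustrative."""
    ascdg = {}
    for g in (adg, scdg):
        for node, succs in g.items():
            ascdg.setdefault(node, set()).update(succs)
    for src, dst in bridges:
        ascdg.setdefault(src, set()).add(dst)
    return ascdg

def all_paths(graph, start, goal, path=None):
    """Enumerate simple start-to-goal paths; each is a candidate test sequence."""
    path = (path or []) + [start]
    if start == goal:
        yield path
        return
    for nxt in sorted(graph.get(start, ())):
        if nxt not in path:  # keep paths simple (no node revisits)
            yield from all_paths(graph, nxt, goal, path)

adg = {"insert_card": {"enter_pin"}, "enter_pin": {"select_amount"},
       "select_amount": {"dispense"}}
scdg = {"idle": {"authenticating"}, "authenticating": {"ready"},
        "ready": {"dispensing"}, "dispensing": {"idle"}}
ascdg = merge_graphs(adg, scdg, bridges=[("enter_pin", "authenticating")])
for p in all_paths(ascdg, "insert_card", "dispense"):
    print(" -> ".join(p))
```

In the paper's approach, a GA would then search over such path sets for test data maximizing a coverage-based fitness; here the enumeration alone shows how the merged graph yields end-to-end test sequences.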
Funding (cloud-native satellite paper): supported by the National Natural Science Foundation of China (62032003).
Funding (system-level SoC verification paper): supported by the National High Technology Research and Development Program of China (863 Program) (2002AA1Z1490), the Specialized Research Fund for the Doctoral Program of Higher Education (20040486049), and the University Cooperative Research Fund of Huawei Technology Co., Ltd.
Funding (optocoupler accelerated-storage-test paper): supported by the National Natural Science Foundation of China (No. 61471385).
Funding (test-generation algorithm modeling paper): supported by the National Natural Science Foundation of China (NSFC) under grant No. 69873030.
Funding (nuclear power plant group-motor startup paper): Key Project of the National Natural Science Foundation of China (52237008); Beijing Municipal Education Commission Research Program Funding Project (KM202111232022).
Funding (test technical debt paper): the authors thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by Grant Code: (22UQUyouracademicnumberDSRxx).
Abstract: The methods and strategies used to screen for syphilis and to confirm initially reactive results vary significantly across clinical laboratories. While the performance characteristics of these different approaches have been evaluated in multiple studies, there is not, as of yet, a single, universally recommended algorithm for syphilis testing. To clarify the currently available options, this update summarizes the clinical challenges to diagnosis, reviews the specific performance characteristics of treponemal and non-treponemal tests, and finally summarizes selected studies published over the past decade that have evaluated these approaches. Specifically, this review discusses the traditional and reverse sequence syphilis screening algorithms commonly used in the United States, alongside the European Centre for Disease Prevention and Control syphilis algorithm. Ultimately, in the United States, the choice of algorithm depends largely on laboratory resources, the local incidence of syphilis, and patient demographics. Key words: Syphilis; Treponemal infection; Immunoassay; Reverse sequence screening; Rapid plasma reagin; Treponema pallidum particle agglutination test; Automation; Algorithm; Primary infection; Late latent infection
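The reverse sequence screening algorithm discussed above is, at its core, a reflex decision procedure: screen with a treponemal immunoassay, reflex reactive specimens to a non-treponemal test (e.g., RPR), and adjudicate discordant results with a second treponemal test (e.g., TP-PA). A simplified sketch of that decision logic follows; it is illustrative only, not clinical guidance, and the outcome labels are this sketch's own wording.

```python
# Simplified reverse sequence screening decision logic (illustrative only).

def reverse_sequence(treponemal_ia, rpr, second_treponemal=None):
    """Each argument is True (reactive) or False (non-reactive)."""
    if not treponemal_ia:
        return "negative screen"
    if rpr:
        return "consistent with syphilis (current or recent)"
    # Discordant: immunoassay reactive, RPR non-reactive -> adjudicate.
    if second_treponemal is None:
        return "discordant: perform second treponemal test"
    return ("past or latent syphilis possible" if second_treponemal
            else "likely false-positive immunoassay")
```

The traditional algorithm simply reverses the first two steps, screening with the non-treponemal test and confirming with a treponemal one.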
Funding: Supported by the National Natural Science Foundation of China (No. 69576038)
Abstract: This paper presents techniques for verification and Test Generation (TG) for sequential machines (Finite State Machines, FSMs) based on state traversal of the State Transition Graph (STG). The problems of traversal, redundancy, and the transition fault model are identified. In order to achieve high fault coverage, collapsing testing is proposed. Further, heuristic knowledge for speeding up verification and TG is described.
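The state-traversal step this abstract relies on can be sketched as a breadth-first search over the STG that yields a shortest input sequence reaching each state, the basic building block for driving an FSM into a state under test. The example machine below is hypothetical.

```python
# BFS over a State Transition Graph: shortest input sequence to each state.
from collections import deque

def reaching_sequences(stg, start):
    """stg: dict state -> {input_symbol: next_state}."""
    seqs = {start: []}           # start state needs no inputs
    q = deque([start])
    while q:
        s = q.popleft()
        for sym, nxt in stg[s].items():
            if nxt not in seqs:  # first (shortest) path to nxt
                seqs[nxt] = seqs[s] + [sym]
                q.append(nxt)
    return seqs

stg = {"S0": {"a": "S1", "b": "S0"},
       "S1": {"a": "S2", "b": "S0"},
       "S2": {"a": "S2", "b": "S0"}}
seqs = reaching_sequences(stg, "S0")
```

A transition-fault test then extends each reaching sequence with the transition's input symbol and a distinguishing sequence for the expected next state.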
Funding: This project is supported by the Key Subject Foundation of the Shanghai Educational Committee.
Abstract: An algorithm is presented for evaluating the fiber orientation distribution function (ODF) by the laser scattering method, based on a two-dimensional model of fiber arrangement and a method for determining diffuse scattering intensity. The fiber ODFs of nonwoven samples measured by a computer-program-controlled laser scattering intensity testing system are compared with data obtained by the microprojector method. The results show that the algorithm is feasible for assessing the fiber ODFs of nonwoven fabrics manufactured by different processing methods.
Funding: Supported by the Natural Science Foundation of Shandong Province (ZR2014CQ012)
Abstract: [Objectives] This study aimed to guide rational drug selection for the treatment of rex rabbit colibacillosis and to isolate pathogenic Escherichia coli. [Methods] Pathogen isolation, a drug sensitivity test, and a pathogen regression test were performed on rex rabbits killed by E. coli in clinical practice. [Results] The isolate was E. coli O-23, susceptible to amikacin and cefotaxime sodium; at a challenge dose of 1.0 mL/rabbit (about one billion E. coli), the test animals discharged mucous feces. [Conclusions] The results provide model support for clinical drug selection against rex rabbit colibacillosis.
Abstract: Test Case Prioritization (TCP) techniques perform better than other regression-test optimization techniques, including Test Suite Reduction (TSR) and Test Case Selection (TCS). Many TCP techniques are available, and their performance is usually measured with the Average Percentage of Fault Detection (APFD) metric. This metric is value-neutral: it works well only when all test cases have the same cost and all faults have the same severity. Using APFD to evaluate test-case orders in which test-case costs or fault severities vary is prone to produce misleading results, so choosing the right metric is essential for reliable evaluation of TCP techniques. In this paper, two value-based TCP techniques using a Genetic Algorithm (GA) are introduced: Value-Cognizant Fault Detection-Based TCP (VCFDB-TCP) and Value-Cognizant Requirements Coverage-Based TCP (VCRCB-TCP). Two novel value-based performance evaluation metrics are also introduced: Average Percentage of Fault Detection per value (APFDv) and Average Percentage of Requirements Coverage per value (APRCv). Two case studies validate the proposed techniques and metrics. The proposed GA-based techniques outperformed existing state-of-the-art TCP techniques, including Original Order (OO), Reverse Order (REV-O), Random Order (RO), and a greedy algorithm.
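The standard APFD metric the abstract critiques is well defined for an order of n test cases detecting m faults, where TFᵢ is the 1-based position of the first test exposing fault i: APFD = 1 − (ΣTFᵢ)/(n·m) + 1/(2n). A minimal computation follows; the fault matrix is a made-up example, and the paper's value-based variants (APFDv, APRCv) are not reproduced here.

```python
# Standard (value-neutral) APFD computation on a hypothetical fault matrix.

def apfd(order, fault_matrix):
    """order: list of test ids; fault_matrix: fault -> set of detecting tests."""
    n, m = len(order), len(fault_matrix)
    pos = {t: i + 1 for i, t in enumerate(order)}   # 1-based positions
    tf = [min(pos[t] for t in tests if t in pos)    # first detecting position
          for tests in fault_matrix.values()]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

faults = {"f1": {"t3"}, "f2": {"t1", "t2"}}
score = apfd(["t1", "t2", "t3"], faults)   # TF = 3 and 1: 1 - 4/6 + 1/6 = 0.5
```

Because every TFᵢ counts positions rather than accumulated cost, the metric is blind to test-case cost and fault severity, which is exactly the gap the value-based metrics address.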
Abstract: Many search-based algorithms have been successfully applied in several software engineering activities. Genetic algorithms (GAs), which imitate the theory of natural selection and evolution, are the most widely used by scholars to solve software testing problems. The harmony search algorithm (HSA), one of the most recent search algorithms, imitates the behavior of a musician seeking the best harmony. Scholars have examined the similarities and differences between genetic algorithms and the harmony search algorithm in diverse research domains, but no prior work compares their performance in the test data generation process, a critical task in software validation. This paper studies the similarities and differences between genetic algorithms and the harmony search algorithm based on their ability and speed in finding the required test data. The study performs an empirical comparison of the HSA and the GAs and estimates the significance of the results using the t-test. It investigates the efficiency of the two algorithms with respect to (1) time performance, (2) the significance of the generated test data, and (3) the adequacy of the generated test data to satisfy a given testing criterion. The results showed that the harmony search algorithm is significantly faster than the genetic algorithms: the t-test gave a p-value of 0.026 < α for the time values (α is the significance level, 0.05, at the 95% confidence level). In contrast, there is no significant difference between the two algorithms in generating adequate test data, as the t-test gave a p-value of 0.25 > α for the fitness values.
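The harmony search metaphor compared here can be sketched in a few lines: keep a memory of candidate solutions, improvise a new one by mixing memory values (harmony memory considering rate, HMCR), pitch-adjusting them (PAR), or sampling at random, and replace the worst harmony when the new one is fitter. The parameters and objective below are illustrative, not the paper's experimental setup.

```python
# Minimal harmony search sketch minimizing a toy objective (illustrative).
import random

def harmony_search(fitness, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.1, iters=2000, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    # Harmony memory: hms random candidate solutions.
    mem = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:
                x = rng.choice(mem)[d]          # take value from memory
                if rng.random() < par:          # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                               # random consideration
                x = rng.uniform(*bounds[d])
            lo, hi = bounds[d]
            new.append(min(max(x, lo), hi))
        worst = max(mem, key=fitness)
        if fitness(new) < fitness(worst):       # replace worst if improved
            mem[mem.index(worst)] = new
    return min(mem, key=fitness)

best = harmony_search(lambda v: sum(x * x for x in v), [(-5, 5)] * 2)
```

In test data generation, the fitness function would instead score how close an input comes to satisfying a target branch or path, which is where the comparison with GA crossover and mutation becomes meaningful.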
Abstract: We have recently developed a systematic method for studying the inheritance of resistance to sheath blight. The key to the system is an innovative method of inoculation and investigation, together with the use of a permanent population. This paper reports the procedure of the system and the results of its verification.
基金supported by the University of New South Wales and the Australian Research Council under grant No.DP120102607
Abstract: As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to location-verification techniques in wireless networks, with a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on formal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.
基金support from the Deanship of Scientific Research,University of Hail,Saudi Arabia through the project Ref.(RG-191315).
Abstract: Software testing has been attracting a lot of attention as a means of effective software development. In the model-driven approach, the Unified Modelling Language (UML) is a conceptual modelling notation for capturing obligations and other features of a system. Specialized tools translate these models into other software artifacts such as code, test data, and documentation. Generating test cases allows appropriate test data to be determined that can verify the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams using a Basic Genetic Algorithm (BGA). To generate the test cases, both diagrams are first converted into their corresponding intermediate graphical forms, the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). The two graphs are then combined into a single graph, the Activity State Chart Diagram Graph (ASCDG), which is optimized using the BGA to generate the test data. A case study involving a withdrawal from a bank's automated teller machine (ATM) demonstrates the approach, which successfully identified defects in various ATM functions such as messaging and operation.
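The graph-combination step this abstract describes can be sketched as a union of the two intermediate edge sets plus a bridging edge, after which each simple path through the combined ASCDG becomes a candidate test sequence for the GA to cover. The node names and the bridge below are hypothetical, not the paper's ATM model.

```python
# Illustrative merge of ADG and SCDG into an ASCDG, plus path enumeration.

def merge_graphs(adg, scdg, bridge):
    """Union of two edge dicts plus one bridge edge linking the graphs."""
    ascdg = {n: set(v) for n, v in adg.items()}
    for n, v in scdg.items():
        ascdg.setdefault(n, set()).update(v)
    src, dst = bridge
    ascdg.setdefault(src, set()).add(dst)
    return ascdg

def simple_paths(g, start, end, path=None):
    """All cycle-free paths from start to end (candidate test sequences)."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    out = []
    for nxt in sorted(g.get(start, ())):
        if nxt not in path:
            out += simple_paths(g, nxt, end, path)
    return out

adg = {"insert_card": {"enter_pin"}, "enter_pin": {"withdraw"}}
scdg = {"idle": {"card_in"}, "card_in": {"insert_card"}}
ascdg = merge_graphs(adg, scdg, ("withdraw", "dispense"))
paths = simple_paths(ascdg, "idle", "dispense")
```

The BGA's fitness function would then reward chromosomes (test inputs) in proportion to how many of these paths, or nodes along them, they exercise.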