Abstract: Formal specification languages lack the dependence-based representations that are useful for carrying out software engineering operations such as slicing and the computation of program metrics, which makes such specifications hard to analyze and understand. A Z specification dependence graph is presented in this letter. It draws on the strengths of a range of earlier works and adapts them, where necessary, to the Z language.
Funding: Supported by the National Science Foundation for Distinguished Young Scholars (60425206), the National Natural Science Foundation of China (90412003, 60373066, 60403016, 60503033), and the National Basic Research Program of China (973 Program, 2002CB312000).
Abstract: This paper proposes an extended system dependence graph called AspectSDG to represent control and data dependences for AspectC++ programs, and presents an approach for the construction of AspectSDG. This approach decomposes aspect-oriented programs into three parts: component code, aspect code, and weaving code. It constructs program dependence graphs (PDGs) for each part, and then connects the PDGs at call sites to form the complete AspectSDG. The AspectSDG can handle advice precedence correctly and represent the additional dependences caused by aspect code. Based on this model, we describe how to compute a static slice of an AspectC++ program.
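As a rough illustration of the construction step described above (per-part PDGs connected at call sites), the following Python sketch stitches three toy dependence-edge sets into one combined graph. The statement labels, edge kinds, and call sites are invented for the example and are not the paper's actual data structures.

```python
# Stitch three toy per-part dependence graphs together at (invented) call sites.
def build_pdg(statements):
    """Build a toy PDG: each statement is control-dependent on its predecessor."""
    edges = set()
    for i in range(1, len(statements)):
        edges.add((statements[i - 1], statements[i], "control"))
    return edges

def connect_at_call_sites(pdgs, call_sites):
    """Merge the per-part PDGs and add a call edge for every call site."""
    combined = set()
    for pdg in pdgs:
        combined |= pdg
    for caller_stmt, callee_entry in call_sites:
        combined.add((caller_stmt, callee_entry, "call"))
    return combined

# Hypothetical component, aspect, and weaving code fragments.
component = build_pdg(["c1:init", "c2:callWoven", "c3:useResult"])
aspect    = build_pdg(["a1:adviceEntry", "a2:log"])
weaving   = build_pdg(["w1:weaveEntry", "w2:invokeAdvice"])

aspect_sdg = connect_at_call_sites(
    [component, aspect, weaving],
    [("c2:callWoven", "w1:weaveEntry"), ("w2:invokeAdvice", "a1:adviceEntry")],
)
print(sorted(aspect_sdg))
```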
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (2009AA01Z147) and the National Natural Science Foundation of China (90818027, 60633010).
Abstract: Existing slicing algorithms do not consider parameterized types, so they are not suitable for generic programs. To solve this problem, this paper presents a generic system dependence graph for Java generic programs, based on the traditional system dependence graph, that expresses dependences for parameterized type information. A novel slicing criterion and slicing algorithm for generic programs are proposed. The slices computed by the algorithm help to understand the relations between concepts and types in generic programs and express the features of generic programs better.
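The following is only a sketch of what a slicing criterion enriched with parameterized-type information might look like; the class name, fields, and matching rule are illustrative assumptions, not the paper's definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GenericSlicingCriterion:
    statement: str              # program point of interest
    variable: str               # variable observed at that point
    type_bindings: tuple = ()   # e.g. (("T", "Integer"),)

def node_matches(node_bindings, criterion):
    """Keep a graph node only if its type-parameter instantiation is
    compatible with the bindings requested by the slicing criterion."""
    wanted = dict(criterion.type_bindings)
    return all(wanted.get(param, actual) == actual
               for param, actual in node_bindings.items())

crit = GenericSlicingCriterion("s12", "box", (("T", "Integer"),))
print(node_matches({"T": "Integer"}, crit))   # True
print(node_matches({"T": "String"}, crit))    # False
```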
Abstract: Dynamic program slicing is an effective technique for narrowing errors down to the relevant parts of a program when debugging. Given a slicing criterion, the dynamic slice contains only those statements that actually affect the variables in the slicing criterion. This paper proposes a dynamic slicing method based on static dependence analysis. It uses the program dependence graph and other static information to reduce the amount of information that needs to be traced during program execution. Thus, efficiency is dramatically improved while precision is not reduced. The slicing criterion is modified to suit debugging: it consists of the file name and the line number of the statement.
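A minimal sketch of the general idea (static dependence edges intersected with one execution's trace); the statements, dependences, and criterion format below are invented, and real implementations track considerably more run-time information.

```python
# Combine a static dependence graph with an execution trace: only statements
# that actually executed and (transitively) affect the criterion are kept.
static_deps = {          # statement -> statements it statically depends on
    "s5": {"s3", "s4"},
    "s4": {"s2"},
    "s3": {"s1"},
    "s2": set(),
    "s1": set(),
}
executed = {"s1", "s3", "s5"}       # trace for one concrete input
criterion = ("example.c", "s5")     # (file name, line/statement of interest)

def dynamic_slice(criterion_stmt):
    slice_, work = set(), [criterion_stmt]
    while work:
        s = work.pop()
        if s in executed and s not in slice_:
            slice_.add(s)
            work.extend(static_deps.get(s, ()))
    return slice_

print(sorted(dynamic_slice(criterion[1])))  # ['s1', 's3', 's5']
```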
Funding: Supported by the Henan Province Science and Technology Department Foundation (No. 202102310237, 192102210133, 202102310295), the Doctoral Research Fund of Zhengzhou University of Light Industry (No. 2018BSJJ039), and the Internet Medical and Health Service Henan Collaborative Innovation Center Open Project Fund (No. IH2019006).
Abstract: Knowledge graph (KG) conflict resolution solves the problem of conflicting knowledge during KG construction. To address this problem, a KG conflict resolution algorithm named NGDcrm is proposed, which is a numeric graph dependency-based conflict resolution method. NGDcrm uses the dependency graph to perform arithmetic calculations and predicate comparisons over numerical entity knowledge in the KG. NGDcrm first uses a parallel segmentation method to segment the KG; then it extracts features of the KG via KG embedding; finally, it uses numeric graph dependencies to detect and correct wrong facts in the KG based on the extracted features. Experimental results on real data show that NGDcrm outperforms the state-of-the-art knowledge conflict resolution methods; in particular, the AUC value of NGDcrm on the DBpedia dataset is 15.4% higher than that of the state-of-the-art method.
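To illustrate what a numeric dependency over graph data can look like (a generic example, not NGDcrm's rule language), the sketch below checks one arithmetic relation between numeric attributes and flags a stated value that contradicts it.

```python
# Hypothetical numeric facts about one entity: (entity, attribute) -> value.
kg = {
    ("cityA", "population"): 5_200_000,
    ("cityA", "area_km2"): 800,
    ("cityA", "density"): 9_000,        # conflicting: should be roughly 6500
}

def check_density_dependency(kg, entity, tolerance=0.05):
    """Numeric dependency: density ~= population / area, within a tolerance."""
    pop = kg[(entity, "population")]
    area = kg[(entity, "area_km2")]
    stated = kg[(entity, "density")]
    derived = pop / area
    ok = abs(stated - derived) <= tolerance * derived
    return ok, derived

ok, derived = check_density_dependency(kg, "cityA")
print(ok, round(derived))   # False 6500 -> the stated density is a wrong fact
```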
Funding: Supported by the National Natural Science Foundation of China (No. U1736218) and the National Key R&D Program of China (No. 2018YFB0804704); partially supported by CNCERT/CC.
Abstract: To combat increasingly sophisticated cyber attacks, the security community has proposed and deployed a large body of threat detection approaches to discover malicious behaviors on host systems and attack payloads in network traffic. Several studies have begun to focus on threat detection methods based on provenance data from host-level event tracing. On the other hand, with the significant development of big data and artificial intelligence technologies, large-scale graph computing has become widely used. Accordingly, several lines of research try to bridge the gap between threat detection based on host log provenance data and graph algorithms, and propose threat detection algorithms based on system provenance graphs. These approaches usually generate the system provenance graph via tagging and tracking of system events, and then leverage the characteristics of the graph to conduct threat detection and attack investigation. To deeply understand the correctness, effectiveness, and efficiency of different graph-based threat detection algorithms, we focus on mainstream threat detection methods based on provenance graphs. We select and implement 5 state-of-the-art threat detection approaches from a large number of studies as evaluation objects for further analysis. To this end, we collect about 40 GB of host-level raw log data in a real-world IT environment and simulate 6 types of cyber attack scenarios in an isolated environment to obtain malicious provenance data for our evaluation datasets. The crosswise comparison and longitudinal assessment explain in detail which attack scenarios each detection approach can detect well, and why. Our empirical evaluation provides a solid foundation for improving threat detection approaches.
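As a toy illustration of the "tag and track system events into a provenance graph" step (the events, fields, and detection rule below are fabricated for the example), one can build an edge list from audit records and search it for a simple suspicious pattern.

```python
# Hypothetical (subject, action, object) audit records from a host.
events = [
    ("browser.exe", "fork", "dropper.exe"),
    ("dropper.exe", "write", "/etc/cron.d/job"),
    ("editor.exe", "write", "/home/user/notes.txt"),
]

graph = {}  # node -> set of (action, target) provenance edges
for subj, action, obj in events:
    graph.setdefault(subj, set()).add((action, obj))

def reachable_writes(root):
    """Follow fork edges from `root` and collect every file written downstream."""
    seen, stack, writes = set(), [root], []
    while stack:
        node = stack.pop()
        for action, child in graph.get(node, ()):
            if action == "fork" and child not in seen:
                seen.add(child)
                stack.append(child)
            elif action == "write":
                writes.append((node, child))
    return writes

suspicious = [w for w in reachable_writes("browser.exe") if w[1].startswith("/etc/")]
print(suspicious)  # [('dropper.exe', '/etc/cron.d/job')]
```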
Funding: Supported by the Key R&D Programs of Zhejiang Province under Grant No. 2022C01018 and the Natural Science Foundation of Zhejiang Province under Grant No. LQ20F020019.
Abstract: Smart contracts running on public blockchains are permissionless and decentralized, attracting both developers and malicious participants. Ethereum, the world's largest decentralized application platform, on which more than 40 million smart contracts are running, is frequently challenged by smart contract vulnerabilities. Worse, owing to the homogeneity of a wide range of smart contracts and the increase in inter-contract dependencies, a vulnerability in a certain smart contract can affect a large number of other contracts in Ethereum. However, little is known about how vulnerable contracts affect other on-chain contracts and which contracts can be affected. Thus, we first present the contract dependency graph (CDG) to perform vulnerability analysis for Ethereum smart contracts, where the CDG characterizes inter-contract dependencies formed by DELEGATECALL-type internal transactions in Ethereum. Then, three generic definitions of security violations against the CDG are given for finding the potential victim contracts affected by different types of vulnerable contracts. Further, we construct the CDG from 195,247 smart contracts active in the latest blocks of Ethereum and verify the above security violations against the CDG by detecting three representative known vulnerabilities. Compared to previous large-scale vulnerability analyses, our analysis scheme marks the potential victim contracts that can be affected by different types of vulnerable contracts and identifies their possible risks based on the type of security violation actually occurring. The analysis results show that the proportion of potential victim contracts reaches 14.7%, far more than that of the corresponding vulnerable contracts (less than 0.02%) in the CDG.
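A simplified sketch of the reachability intuition behind marking potential victims; the addresses, the vulnerable set, and the single rule below are invented, and the paper's three security-violation definitions are richer than plain reachability.

```python
# Hypothetical (caller, callee) DELEGATECALL-style internal transactions.
delegatecalls = [
    ("0xWalletProxy", "0xWalletLib"),
    ("0xDappA", "0xWalletProxy"),
    ("0xDappB", "0xSafeLib"),
]
vulnerable = {"0xWalletLib"}

# CDG: edge caller -> callee means the caller executes the callee's code.
cdg = {}
for caller, callee in delegatecalls:
    cdg.setdefault(caller, set()).add(callee)

def potential_victims(cdg, vulnerable):
    """A contract is a potential victim if some delegatecall path reaches a
    vulnerable contract, since the vulnerable code runs in the caller's context."""
    def reaches(node, seen):
        return any(c in vulnerable or (c not in seen and reaches(c, seen | {c}))
                   for c in cdg.get(node, ()))
    return {contract for contract in cdg if reaches(contract, {contract})}

print(sorted(potential_victims(cdg, vulnerable)))  # ['0xDappA', '0xWalletProxy']
```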
Funding: Supported by the National Natural Science Foundation of China (Nos. 62177024, 62007014), the Humanities and Social Sciences Youth Fund of the Ministry of Education (No. 20YJC880024), the China Postdoctoral Science Foundation (No. 2019M652678), and the Fundamental Research Funds for the Central Universities (No. CCNU20ZT019).
Abstract: A qualia role-based entity-dependency graph (EDG) is proposed to represent and extract quantity relations for solving algebra story problems stated in Chinese. Traditional neural solvers use end-to-end models to translate problem texts into math expressions, which lack quantity relation acquisition in sophisticated scenarios. To address this problem, the proposed method leverages the EDG to represent quantity relations hidden in the qualia roles of math objects. Algorithms were designed for EDG generation and quantity relation extraction for solving algebra story problems. Experimental results show that the proposed method achieved an average accuracy of 82.2% on quantity relation extraction, compared to 74.5% for the baseline method. A further prompt-learning result shows a 5% increase in problem solving obtained by injecting the extracted quantity relations into the baseline neural solvers.
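A toy example of how a quantity relation might be read off an entity-dependency graph; the role labels, triple format, and rule below are illustrative assumptions, not the paper's EDG definition or extraction algorithm.

```python
# Hypothetical EDG triples for a "unit price x count = total cost" problem.
edg = [  # (head entity, role label, dependent entity)
    ("pencil", "unit_price", "2 yuan"),
    ("purchase", "count", "5 pencil"),
    ("purchase", "total_cost", "?"),
]

def extract_quantity_relation(edg):
    """If an item has a unit price and the purchase counts that item,
    emit the relation total_cost = unit_price * count."""
    price = {h: float(d.split()[0]) for h, r, d in edg if r == "unit_price"}
    for h, r, d in edg:
        if r == "count":
            n, item = d.split()
            if item in price:
                return f"total_cost = {price[item]} * {n}", price[item] * float(n)
    return None

print(extract_quantity_relation(edg))  # ('total_cost = 2.0 * 5', 10.0)
```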
Abstract: Program slicing has many applications such as program debugging, testing, maintenance, and complexity measurement. A static slice consists of all statements in program P that may affect the value of variable v at some point p, whereas a dynamic slice consists only of the statements that influence the value of a variable occurrence for specific program inputs. In this paper, we address the problem of dynamic slicing of object-oriented programs which, to our knowledge, has not been addressed in the literature. To solve this problem, we present the dynamic object-oriented dependence graph (DODG), an arc-classified digraph that explicitly represents the various dynamic dependences between statement instances for a particular execution of an object-oriented program. Based on the DODG, we present a two-phase backward algorithm for computing a dynamic slice of an object-oriented program.
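A minimal sketch of a backward traversal over statement instances; the "sN@k" instance labels and dependences are invented, and the real DODG is an arc-classified digraph with far more detail than this flat edge map.

```python
# Statement *instances* from one execution and their dynamic dependences.
dodg = {  # instance -> instances it dynamically depends on (data/control)
    "s7@1": {"s4@1", "s6@2"},
    "s6@2": {"s2@1"},
    "s4@1": {"s2@1"},
    "s2@1": set(),
}

def backward_slice(criterion_instance):
    """Phase 1: collect reachable instances; phase 2: map them to statements."""
    slice_instances, work = set(), [criterion_instance]
    while work:
        inst = work.pop()
        if inst not in slice_instances:
            slice_instances.add(inst)
            work.extend(dodg.get(inst, ()))
    return sorted({inst.split("@")[0] for inst in slice_instances})

print(backward_slice("s7@1"))  # ['s2', 's4', 's6', 's7']
```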
Abstract: In a very large digital library that supports computer-aided collaborative design, an indexing process is crucial whenever the retrieval process has to select among many possible designs. In this paper, we address the problem of retrieving important design and engineering information by structural indexing. A design is represented by a model dependency graph; therefore, the indexing problem is to determine whether a graph is present or absent in a database of model dependency graphs. We present a novel graph indexing method based on a polynomial characterization of the model dependency graph and on hashing. Such an approach makes it possible to build a highly efficient 3D solid digital library for retrieving and extracting solid geometric models and engineering information.
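A sketch of one plausible reading of "polynomial characterization plus hashing" (an assumption, not necessarily the paper's exact scheme): hash the characteristic-polynomial coefficients of a graph's adjacency matrix as the index key. Symmetric toy matrices are used so the coefficients are integers.

```python
import hashlib
import numpy as np

def index_key(adjacency):
    """Hash the characteristic-polynomial coefficients of the adjacency matrix."""
    coeffs = np.poly(np.asarray(adjacency, dtype=float))    # leading coeff first
    canon = ",".join(str(int(round(c))) for c in coeffs)    # 0/1 matrices -> ints
    return hashlib.sha256(canon.encode()).hexdigest()[:16]

# Two toy model dependency graphs given as (symmetric) adjacency matrices.
design_a = [[0, 1, 0],
            [1, 0, 1],
            [0, 1, 0]]          # a 3-node chain
design_b = [[0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]]          # a 3-node cycle

database = {index_key(design_a): "design_a"}
print(index_key(design_a) in database)   # True  -> structurally matching design
print(index_key(design_b) in database)   # False -> different characterization
```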
Funding: This work was supported by the Universities Natural Science Research Project of Jiangsu Province under Grants 20KJB520026 and 20KJA520002, the Foundation for Young Teachers of Nanjing Auditing University under Grant 19QNPY018, and the National Natural Science Foundation of China under Grants 71972102 and 61902189.
Abstract: With the continuous expansion of software applications, people's requirements for software quality are increasing. Software defect prediction is an important technology for improving software quality. It typically encodes the software into several features and applies machine learning methods to build defect prediction classifiers, which estimate whether a software area is clean or buggy. However, current encoding methods are mainly based on traditional manual features or the AST of the source code. Traditional manual features can hardly reflect the deep semantics of programs, and there is a lot of noise in the AST, which affects the expression of semantic features. To overcome these deficiencies, we combine Convolutional Neural Networks (CNN) with a novel compiler Intermediate Representation (IR) based program encoding method for software defect prediction (CIR-CNN). Specifically, our program encoding method is based on the compiler IR, which eliminates a large amount of noise in the syntactic structure of the source code and facilitates the acquisition of more accurate semantic information. Secondly, with the help of data flow analysis, a Data Dependency Graph (DDG) is constructed on the compiler IR, which helps to capture the deeper semantic information of the program. Finally, we use the widely used CNN model to build a software defect prediction model, which increases the adaptability of the method. To evaluate the performance of CIR-CNN, we use seven projects from the PROMISE datasets to set up comparative experiments. The experimental results show that, in WPDP, with our CIR-CNN method the prediction accuracy was improved by 12% over the AST-encoded CNN-based model and by 20.9% over the traditional features-based LR model, respectively. In CPDP, the improvements were 9.1% over the AST-encoded DBN-based model and 19.2% over the traditional features-based TCA+ model, respectively.
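A generic sketch of the final modelling step only (a small 1-D CNN over embedded IR-token sequences), using PyTorch with made-up dimensions and data; it is not the CIR-CNN architecture nor its IR/DDG encoding pipeline.

```python
import torch
import torch.nn as nn

class DefectCNN(nn.Module):
    """Toy 1-D CNN: embedded token ids -> convolution -> pooling -> probability."""
    def __init__(self, vocab_size=500, embed_dim=32, num_filters=16, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size=kernel)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(num_filters, 1)

    def forward(self, token_ids):                    # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)    # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))
        x = self.pool(x).squeeze(-1)                 # (batch, num_filters)
        return torch.sigmoid(self.fc(x))             # defect probability

# One fake "program" encoded as a sequence of 20 IR-token ids.
fake_ir_tokens = torch.randint(0, 500, (1, 20))
print(DefectCNN()(fake_ir_tokens))
```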
Abstract: In this paper, a new method for DO-loop parallelization based on the new concept of allocation-dependence and on an equivalence classification of the iteration space is proposed. This method has many advantages. It is a general, unified method for DO-loop parallelization. It can be used for coarse-grain parallel partitioning on MIMD and SPMD. While partitioning the iteration space, it also performs the data partition and the computation partition so that these partitions are independent of each other. It can extract the potential parallelism of a program accurately. Combined with task-level parallelization, vectorization, and pipelining, it can extract parallelism thoroughly.
Abstract: Design extraction and reduction have been used extensively in the modern VLSI design process. The extracted and reduced design can be efficiently processed by various applications, such as formal verification, simulation, automatic test pattern generation (ATPG), etc. This paper presents a new circuit extraction method using the program slicing technique, and develops an elegant theoretical basis, founded on program slicing, for circuit extraction from Verilog descriptions. The technique obtains a chaining slice for given signals of interest. Compared with related research, the main advantages of the method are that it is fine-grained, it imposes no hardware description language (HDL) coding style limitations, and it is precise and capable of dealing with various Verilog constructs. The technique has been integrated with a commercial simulation environment and incorporated into a design process. Results on practical designs show the significant benefits of the approach.
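A drastically simplified sketch of a chaining slice: given which signals each assignment reads, keep every assignment that can feed the signals of interest. The signal names and the flattened "assignment" model are assumptions for illustration; real Verilog slicing must handle processes, hierarchy, and timing.

```python
# Driven signal -> signals read by its right-hand side / enabling condition.
assignments = {
    "q":     {"d", "clk", "rst"},
    "d":     {"a", "b"},
    "ready": {"state"},
    "state": {"clk", "start"},
}

def chaining_slice(signals_of_interest):
    keep, work = set(), list(signals_of_interest)
    while work:
        sig = work.pop()
        if sig in assignments and sig not in keep:
            keep.add(sig)                    # keep the assignment driving sig
            work.extend(assignments[sig])    # and chase the signals it reads
    return keep

print(sorted(chaining_slice({"q"})))  # ['d', 'q'] -> only logic feeding q is kept
```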
Funding: Supported by grants from the Department of Science and Technology, Government of India, under an SERC Project.
Abstract: Even after thorough testing, a few bugs still remain in a program of moderate complexity. These residual bugs are randomly distributed throughout the code. We have noticed that bugs in some parts of a program cause more frequent and severe failures than those in other parts. It is therefore necessary to decide what to test more and what to test less within the testing budget. It is possible to prioritize the methods and classes of an object-oriented program according to their potential to cause failures. For this, we propose a program metric called the influence metric to find the influence of a program element on the source code. First, we represent the source code as an intermediate graph called the extended system dependence graph. Then, forward slicing is applied to a node of the graph to get the influence of that node. The influence metric for a method m in a program is the number of statements of the program that directly or indirectly use the result produced by method m. We compute the influence metric for a class c based on the influence metrics of all its methods. As the influence metric is computed statically, it does not capture the expected behavior of a class at run time. It is already known that faults in highly executed parts tend to cause more failures. Therefore, we use the operational profile to find the average execution time of a class in a system. Classes in the source code are then prioritized based on the influence metric and average execution time. The priority of an element indicates its potential to cause failures. Once all program elements have been prioritized, the testing effort can be apportioned so that the elements causing frequent failures are tested thoroughly. We have conducted experiments for two well-known case studies -- a Library Management System and a Trading Automation System -- and successfully identified critical elements in the source code of each case study. We have also conducted experiments to compare our scheme with a related scheme. The experimental studies show that our approach is more accurate than existing ones in exposing critical elements at the implementation level.
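A toy sketch of the two ingredients just described: forward reachability for the influence count, weighted by an assumed average execution time. The dependence edges, method names, and profile numbers are invented.

```python
# Forward dependences: node -> nodes that directly use its result.
uses = {
    "m.compute": {"s10", "s11"},
    "s10": {"s20"},
    "s11": set(),
    "s20": set(),
    "m.log": set(),
}

def influence(node):
    """Number of statements that directly or indirectly use `node`'s result."""
    seen, work = set(), list(uses.get(node, ()))
    while work:
        n = work.pop()
        if n not in seen:
            seen.add(n)
            work.extend(uses.get(n, ()))
    return len(seen)

avg_exec_time = {"m.compute": 0.8, "m.log": 0.1}   # assumed operational profile
priority = {m: influence(m) * avg_exec_time[m] for m in avg_exec_time}
print(sorted(priority.items(), key=lambda kv: -kv[1]))
# m.compute ranks first -> apportion more testing effort to it
```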
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 61202092 and 61173021), the Research Fund for the Doctoral Program of Higher Education of China (20112302120052), the Research Fund for the Innovative Scholars of Harbin (RC2013QN010001), and the Young College Academic Backbone Project of Heilongjiang.
Abstract: Traditional similar code detection approaches are limited in detecting semantically similar code, which impedes their application in practice. In this paper, we have improved both the traditional metrics-based approach and the graph-based approach, and present a combined metrics-based and graph-based approach. First, source code is represented as augmented system dependence graphs. Then, metrics-based candidate similar code extraction is performed to filter out most of the dissimilar code pairs and thereby lower the computational complexity. After that, code normalization is performed on the candidate similar code to remove code variations, so that similar code can be detected at the semantic level. Finally, program matching is performed on the normalized control dependence trees to output semantically similar code. Experimental results show that our approach can detect similar code with code variations, and that it can be applied to large software.
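A schematic sketch of the two-stage filtering idea, with invented metrics and a toy tree normalization rather than the paper's augmented system dependence graphs or matching algorithm: cheap metric vectors discard obviously dissimilar pairs, and only the survivors get the structural comparison.

```python
def metric_vector(code):
    """Cheap, order-insensitive metrics used only for candidate filtering."""
    return (code.count("\n") + 1, code.count("if"), code.count("for"))

def metrics_close(a, b, tol=2):
    return all(abs(x - y) <= tol for x, y in zip(metric_vector(a), metric_vector(b)))

def normalize_tree(tree):
    """Toy normalization: sort children so sibling order does not matter."""
    name, children = tree
    return (name, tuple(sorted(normalize_tree(c) for c in children)))

f1 = "for i in xs:\n    if i > 0:\n        total += i"
f2 = "for v in values:\n    if v > 0:\n        s += v"
t1 = ("entry", [("for", [("if", [])])])   # toy control dependence trees
t2 = ("entry", [("for", [("if", [])])])

if metrics_close(f1, f2) and normalize_tree(t1) == normalize_tree(t2):
    print("semantically similar candidates")   # printed for this toy pair
```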
Funding: Supported by the West Communication Science and Technology Project of the Ministry of Communications (No. 200431822315).
Abstract: The capacity of computers to solve more complex design problems has gradually increased. Bridge design needs a breakthrough beyond its current limitations so that it can become more intelligent and integrated. This paper proposes new parametric and feature-based computer-aided design (CAD) models that can represent families of bridge objects, including knowledge representation and three-dimensional geometric topology relationships. A family member is realized by solving first the geometric constraints and then the topological constraints. From the geometric solution, constraint equations are constructed. The topology solution is developed from a feature dependency graph between the bridge objects. Finally, feature parameters are proposed to drive the bridge design. Results from our implementation show that the method can help facilitate bridge design.
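A small sketch of how a feature dependency graph can drive the order in which parametric features are re-evaluated; the bridge features and dependencies are invented, and Python's standard graphlib (3.9+) supplies the topological sort.

```python
from graphlib import TopologicalSorter

# Feature -> features it depends on (invented bridge objects).
feature_deps = {
    "deck":    {"girder"},
    "girder":  {"pier"},
    "railing": {"deck"},
    "pier":    set(),
}

# Topological order guarantees prerequisites are updated before dependents,
# so a change to the pier parameters flows through girder and deck to railing.
order = list(TopologicalSorter(feature_deps).static_order())
print(order)   # ['pier', 'girder', 'deck', 'railing']

for feature in order:
    print(f"update geometry of {feature}")
```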