Funding: Supported by the Natural Science Foundation of Hunan Province (2018JJ2282).
Abstract: Reliability enhancement testing (RET) is an accelerated testing method that hastens the performance degradation process in order to surface inherent defects of design and manufacture. An important hypothesis is that the degradation mechanism under RET is the same as that under normal stress conditions. To check the consistency of the two mechanisms, we conduct two enhancement tests on a missile servo system and preprocess the two sets of test data to establish accelerated degradation models with respect to the temperature change rate, which is assumed to be the main stress applied to the servo system during natural storage. Based on the accelerated degradation models and the natural storage profile of the servo system, we provide and demonstrate a procedure for checking the consistency of the two mechanisms by examining the correlation and difference between the two sets of degradation data. The results indicate that the two degradation mechanisms are significantly consistent with each other.
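The abstract does not give the paper's exact statistical procedure; as a minimal sketch, a correlation-and-difference check between two degradation series (all data values below are hypothetical) might look like:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mean_abs_difference(xs, ys):
    """Average absolute gap between paired degradation measurements."""
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

# Hypothetical degradation measurements from the two enhancement tests
test_a = [0.00, 0.12, 0.25, 0.39, 0.51, 0.66]
test_b = [0.00, 0.10, 0.27, 0.37, 0.54, 0.63]

r = pearson_r(test_a, test_b)
d = mean_abs_difference(test_a, test_b)
# Mechanisms would be judged consistent if r is high and d is small
```

A high correlation together with a small systematic difference is the kind of evidence the procedure described above relies on.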
Funding: Supported by the research fund of Chungnam National University.
Abstract: Blockchain is a technology that provides security features usable for more than just cryptocurrencies. Blockchain achieves security by saving the information of one block in the next block; changing the information of one block therefore requires changing all subsequent blocks for the change to take effect, which makes such an attack infeasible. However, the structure of the blockchain leaves the last block vulnerable to attack, since its information is not yet saved in any other block. This allows a malicious node to change the information of the last block, generate a new block, and broadcast it to the network. Because nodes always follow the longest-chain rule, the malicious node wins, as it holds the longest chain in the network. This paper proposes a solution to this issue by having nodes send consistency-check messages before broadcasting a block. If the nodes successfully verify that the node that generated a new block has not tampered with the blockchain, the block is broadcast. Simulation results show that the suggested protocol provides better security than the regular blockchain.
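The paper's message protocol is not detailed in the abstract; as an illustrative sketch of the underlying check, verifying hash links over a toy chain before accepting a broadcast block could look like this (structures and names are assumptions, not the paper's implementation):

```python
import hashlib

def block_hash(index, prev_hash, data):
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(f"{index}|{prev_hash}|{data}".encode()).hexdigest()

def make_chain(payloads):
    """Build a toy chain where each block stores the previous block's hash."""
    chain, prev = [], "0" * 64
    for i, data in enumerate(payloads):
        h = block_hash(i, prev, data)
        chain.append({"index": i, "prev_hash": prev, "data": data, "hash": h})
        prev = h
    return chain

def consistency_check(chain):
    """Verify every block's stored hash and its link to its predecessor."""
    for i, blk in enumerate(chain):
        if blk["hash"] != block_hash(blk["index"], blk["prev_hash"], blk["data"]):
            return False
        if i > 0 and blk["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = make_chain(["tx1", "tx2", "tx3"])
ok_before = consistency_check(chain)   # honest chain passes
chain[-1]["data"] = "tampered tx3"     # malicious edit of the last block
ok_after = consistency_check(chain)    # stored hash no longer matches
```

Note that the tampered last block is exactly the case the paper targets: its hash is not yet embedded in any successor, so only an explicit check like this catches the modification before broadcast.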
Funding: Supported by the National Natural Science Foundation of China (60673115), the National Basic Research Program of China (973 Program) (2002CB312001), and the Open Foundation of the State Key Laboratory of Software Engineering (SKLSE05-13).
Abstract: A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model-checking-based test generation for a Web application is how to construct a set of trap properties that are intended to cause violations when model checking the behavior model, so that the resulting counterexamples can be used to construct test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.
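The derivation algorithm itself is not given in the abstract, but the standard trap-property idea it builds on is easy to illustrate: for node coverage, each trap property falsely claims a page is never reached, so the model checker's counterexample is a navigation path (test sequence) that visits it. A sketch over a hypothetical navigation graph:

```python
# Hypothetical navigation graph of a small Web application:
# page -> pages reachable via links or form submissions
nav = {
    "home":    ["login", "catalog"],
    "login":   ["home"],
    "catalog": ["home", "item"],
    "item":    ["catalog"],
}

def node_trap_properties(graph):
    """One CTL trap property per page: claim the page is never reached.
    A counterexample to it is a path that visits the page."""
    return [f"AG !(page = {n})" for n in graph]

def edge_trap_properties(graph):
    """One trap property per link: claim the transition never fires."""
    return [f"AG !(page = {src} & EX page = {dst})"
            for src, dsts in graph.items() for dst in dsts]

node_traps = node_trap_properties(nav)
edge_traps = edge_trap_properties(nav)
```

Feeding each trap property to a model checker such as NuSMV and collecting the counterexamples yields one test sequence per node or edge, which is the coverage-driven scheme described above.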
Funding and Acknowledgements: This project was supported by the National Basic Research Program of China (No. 2010CB951603) and the Major Program of the National Social Science Foundation of China (No. 13&ZD161). We thank Prof. Jietai Mao of the Department of Atmospheric & Oceanic Sciences, Peking University, China, for providing expert advice and assistance. We also thank the WDCGG for providing the CO2 data, NASA for providing the AIRS CO2 data, and NOAA for providing the IASI CO2 data.
Abstract: This article describes a global consistency check of CO2 satellite retrieval products from the Atmospheric Infrared Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) using statistical analysis and data from the World Data Centre for Greenhouse Gases (WDCGG). We use the correlation coefficient (r), relative difference (RD), root mean square error (RMSE), and mean bias error (MBE) as evaluation indicators for this study. Statistical results show a linear positive correlation between AIRS/IASI and WDCGG data for most regions around the world. Temporal and spatial variations of these statistical quantities reflect obvious differences between satellite-derived and ground-based data depending on geographic position, especially for stations near areas of intense human activity in the Northern Hemisphere. It is noteworthy that there appears to be a very weak correlation between AIRS/IASI data and ten ground-based observation stations in Europe, Asia, and North America. These results indicate that retrieval products from the two satellite-based instruments studied should be used with great caution.
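The four evaluation indicators named above have standard definitions; a minimal sketch of computing them for a pair of satellite and ground-based series follows (the sample values are hypothetical, and the paper's exact formula conventions, e.g. for RD, may differ):

```python
import math

def evaluation_indicators(sat, ground):
    """r, RD (percent), RMSE and MBE between satellite and ground CO2 series."""
    n = len(sat)
    ms, mg = sum(sat) / n, sum(ground) / n
    cov = sum((s - ms) * (g - mg) for s, g in zip(sat, ground))
    sd_s = math.sqrt(sum((s - ms) ** 2 for s in sat))
    sd_g = math.sqrt(sum((g - mg) ** 2 for g in ground))
    r = cov / (sd_s * sd_g)                                        # correlation
    rd = sum((s - g) / g for s, g in zip(sat, ground)) / n * 100   # relative diff
    rmse = math.sqrt(sum((s - g) ** 2 for s, g in zip(sat, ground)) / n)
    mbe = sum(s - g for s, g in zip(sat, ground)) / n              # mean bias
    return r, rd, rmse, mbe

# Hypothetical monthly mean CO2 mole fractions (ppm)
airs = [392.1, 393.0, 393.8, 394.5]
wdcgg = [391.8, 392.9, 394.1, 394.3]
r, rd, rmse, mbe = evaluation_indicators(airs, wdcgg)
```

A high r with a small RMSE and near-zero MBE is what a consistent station-instrument pair looks like under this scheme; the weak-correlation stations mentioned above fail on the r indicator.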
Abstract: Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of (some of) the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This problem arose originally from geneticists' need to filter erroneous information from their input data, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even with focus on a single gene and in the presence of three alleles. Several other results on the computational complexity of problems from genetics related to consistency checking are also offered. In particular, it is shown that checking the consistency of pedigrees over two alleles, and of pedigrees without loops, can be done in polynomial time.
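The hardness arises from missing genotypes; when every individual is fully typed, the check reduces to testing each parent-child trio against Mendel's laws, as in this sketch (representation and names are illustrative, not from the paper):

```python
from itertools import product

def child_consistent(mother, father, child):
    """Mendelian check for one trio: the child receives one allele from
    each parent. Genotypes are unordered allele pairs, e.g. ("A", "a")."""
    possible = {tuple(sorted(p)) for p in product(mother, father)}
    return tuple(sorted(child)) in possible

def pedigree_consistent(trios):
    """With all genotypes known, the pedigree is consistent iff every
    (mother, father, child) trio is individually consistent."""
    return all(child_consistent(m, f, c) for m, f, c in trios)

ok = pedigree_consistent([
    (("A", "a"), ("A", "a"), ("a", "a")),  # "a" inherited from each parent
])
bad = child_consistent(("A", "A"), ("A", "A"), ("A", "a"))  # impossible trio
```

Once some genotypes are unknown, consistent assignments must be found globally across the pedigree, which is where the NP-completeness result for three alleles, and the polynomial cases for two alleles or loop-free pedigrees, come in.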
Abstract: A conceptual framework for the specification and verification of constraints on the content and narrative structure of documents is proposed. As a specification formalism, CTLDL is defined, an extension of the temporal logic CTL with description logic concepts. In contrast to existing solutions, this approach allows the integration of ontologies to achieve interoperability and abstraction from the implementation aspects of documents. This makes CTLDL particularly suitable for the integration of heterogeneous and distributed information resources in the semantic web.
Funding: Supported by the CCF-Tencent Open Fund under Grant No. RAGR20200124, and the National Natural Science Foundation of China under Grant Nos. 61702253 and 61772258.
Abstract: MongoDB is one of the first commercial distributed databases to support causal consistency. Its implementation of causal consistency combines several research ideas for achieving scalability, fault tolerance, and security. Given its inherent complexity, a natural question arises: "Has MongoDB correctly implemented causal consistency as it claims?" To address this concern, the Jepsen team conducted black-box testing of MongoDB. However, that Jepsen testing has several drawbacks in terms of specification, test case generation, implementation of causal consistency checking algorithms, and testing scenarios, which undermine the credibility of its reports. In this work, we propose a more thorough design of Jepsen testing of causal consistency in MongoDB. Specifically, we fully implement the causal consistency checking algorithms proposed by Bouajjani et al. and test MongoDB against three well-known variants of causal consistency, namely CC, CCv, and CM, under various scenarios including node failures, data movement, and network partitions. In addition, we develop formal specifications of causal consistency and their checking algorithms in TLA+, and verify them using the TLC model checker. We also explain how the TLA+ specifications can be related to Jepsen testing.
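The checking algorithms of Bouajjani et al. are far richer than the abstract can show; as a toy illustration of history-based checking, here is a check of one small fragment of causal consistency, read-your-writes within a session (the history format and function are assumptions for illustration only):

```python
def read_your_writes_ok(history):
    """Within each session, a read of a key must return that session's
    latest prior write to the key, if one exists. `history` is a list of
    (session, op, key, value) tuples in per-session program order. This
    is much weaker than full CC/CCv/CM checking, but shows the flavor."""
    last_write = {}  # (session, key) -> most recent value written
    for session, op, key, value in history:
        if op == "w":
            last_write[(session, key)] = value
        elif op == "r" and (session, key) in last_write:
            if value != last_write[(session, key)]:
                return False  # stale read violates session causality
    return True

ok = read_your_writes_ok([
    ("s1", "w", "x", 1),
    ("s1", "r", "x", 1),   # session observes its own write
    ("s2", "r", "x", 1),
])
violation = read_your_writes_ok([
    ("s1", "w", "x", 1),
    ("s1", "r", "x", 0),   # session fails to observe its own write
])
```

A full checker additionally reconstructs cross-session happens-before relations and searches for acyclic explanations of the history, which is what the implemented algorithms and the TLA+ specifications formalize.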
Abstract: When the vectors of a judgement matrix in AHP are ranked according to their weights, uniform consistency is always required, and the check criterion is the uniformity check. In this paper, a stronger uniform consistency check is introduced, which obtains an exact and practical result by adjusting any matrix that fails the uniformity check.
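The paper's stronger criterion is not given in the abstract; for context, the standard AHP consistency check it strengthens computes the consistency ratio CR = CI/RI, as in this self-contained sketch:

```python
def principal_eigen(matrix, iters=100):
    """Approximate the principal eigenvalue of a positive reciprocal
    matrix by power iteration, normalizing by the max component."""
    n = len(matrix)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

def consistency_ratio(matrix):
    """CR = CI / RI, with CI = (lambda_max - n) / (n - 1) and RI taken
    from Saaty's random index table. CR < 0.1 is usually acceptable."""
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}
    n = len(matrix)
    ci = (principal_eigen(matrix) - n) / (n - 1)
    return ci / ri[n] if ri[n] else 0.0

# A perfectly consistent 3x3 pairwise comparison matrix (weights 4:2:1)
A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
cr = consistency_ratio(A)  # near zero for a consistent matrix
```

A matrix failing such a check is the "non-satisfying" case the paper addresses; its contribution is a stronger criterion plus an adjustment procedure for those matrices.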