Stochastic process algebras have been proposed as compositional specification formalisms for performance models. A formal analysis method for survivable networks was proposed based on stochastic process algebra, which incorporates formal modeling into performance analysis, so that various performance parameters of a survivable network can be obtained simultaneously after formal modeling. The survivable network system was formally described with process expressions, based on a brief introduction to the syntax and operational semantics of stochastic process algebra. The PEPA Workbench tool was then used to obtain the probabilities of the system's steady-state and transient-state availability. Simulation experiments show the effectiveness and feasibility of the developed method.
IEEE 802.16 is the standard for broadband wireless access. The security sublayer is provided within the IEEE 802.16 MAC layer for privacy and access control, in which the privacy and key management (PKM) protocols are specified. In IEEE 802.16e, the SA-TEK 3-way handshake is added to the PKM protocols, aiming to facilitate re-authentication and key distribution. This paper analyzes the SA-TEK 3-way handshake protocol and proposes an optimized version. We also use CasperFDR, a popular formal analysis tool, to verify our analysis. Moreover, we model various simplified versions to find the functions of those elements in the protocol, and correct some misunderstandings in related works using other formal analysis tools.
Fair exchange protocols play a critical role in enabling two distrustful entities to conduct electronic data exchanges in a fair and secure manner. These protocols are widely used in electronic payment systems and electronic contract signing, ensuring the reliability and security of network transactions. In order to address the limitations of current research methods and enhance the analytical capabilities for fair exchange protocols, this paper proposes a formal model for analyzing such protocols. The proposed model begins with a thorough analysis of fair exchange protocols, followed by the formal definition of fairness. This definition accurately captures the inherent requirements of fair exchange protocols. Building upon event logic, the model incorporates the time factor into predicates and introduces knowledge set axioms. This enhancement empowers the improved logic to effectively describe the state and knowledge of protocol participants at different time points, facilitating reasoning about their acquired knowledge. To maximize the intruder's capabilities, channel errors are translated into the behaviors of the intruder. The participants are further categorized into honest participants and malicious participants, enabling a comprehensive evaluation of the intruder's potential impact. By employing a typical fair exchange protocol as an illustrative example, this paper demonstrates the detailed steps of utilizing the proposed model for protocol analysis. The entire process of protocol execution under attack scenarios is presented, shedding light on the underlying reasons for the attacks and proposing corresponding countermeasures. The developed model enhances the ability to reason about and evaluate the security properties of fair exchange protocols, thereby contributing to the advancement of secure network transactions.
Non-fungible token (NFT) is a digital asset whose ownership can be validated and controlled via blockchain technology. The NFT market is a rapidly growing field, and the rarity of an NFT is an essential factor that affects its price, as scarcity leads to higher demand. This study focuses on the BAYC NFT collection, a successful and representative collection of profile-picture NFTs, and analyzes how rarity affects NFT prices. This paper investigates the relationship between the rarity and price of the BAYC NFT collection using the formal concept analysis (FCA) method. The results show that rarity is a major factor influencing the price of an NFT, and the effect is more apparent in the medium rarity range. When rarity is very high or very low, other factors become significant determinants, such as the uniqueness and appeal of the NFT, and even the naturalness of the NFT images. This research highlights the importance of considering rarity when assessing NFTs and underscores the need for a comprehensive evaluation of NFT rarity. This study also provides valuable insights into the NFT market and can be useful for NFT investors, creators, and collectors. Furthermore, the usefulness of FCA as a tool for quantifying NFT rarity and evaluating NFT prices was demonstrated.
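The FCA machinery this abstract relies on can be sketched with a toy formal context of objects (NFTs) and attributes (traits). The object and trait names below are hypothetical, not taken from the actual BAYC data; this is a minimal sketch of the two derivation operators and of what a formal concept is.

```python
# Toy formal context: hypothetical NFTs (objects) mapped to their traits (attributes).
context = {
    "ape1": {"gold_fur", "laser_eyes"},
    "ape2": {"gold_fur"},
    "ape3": {"laser_eyes", "crown"},
}

def intent(objects):
    """Attributes shared by every object in the given set (derivation operator)."""
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else set.union(*context.values())

def extent(attributes):
    """Objects possessing every attribute in the given set (derivation operator)."""
    return {o for o, traits in context.items() if attributes <= traits}

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
A = extent({"gold_fur"})   # {'ape1', 'ape2'}
B = intent(A)              # {'gold_fur'}
print(A == extent(B))      # True: (A, B) is a formal concept
```

In a rarity analysis along these lines, a trait shared by a small extent marks its carriers as rare; the concept lattice over such a context organizes those groupings.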
Identifying business components is the basis of component-based software engineering. Many approaches, including cluster analysis and concept analysis, have been proposed to identify components from business models. These approaches classify business elements into a set of components by analyzing their properties. However, most of them do not consider the difference in the properties of the business elements, which may decrease the accuracy of the identification results. Furthermore, component identification by partitioning business elements cannot reflect which features are responsible for the generation of certain results. This paper deals with a new approach for component identification from business models using fuzzy formal concept analysis. First, the membership between business elements and their properties is quantified and transformed into a fuzzy formal context, from which the concept lattice is built using a refined incremental algorithm. Then the components are selected from the concepts according to the concept dispersion and distance. Finally, the effectiveness and efficiency are validated by applying our approach to real-life cases and experiments.
An effective, automatically constructed domain ontology is proposed in this paper. The main idea is to use Formal Concept Analysis to automatically establish the domain ontology. The ontology then serves as the basis for a Naive Bayes classifier, in order to demonstrate the effectiveness of the domain ontology for document classification. 1752 documents divided into 10 categories are used to assess the effectiveness of the ontology, of which 1252 and 500 documents are the training and testing documents, respectively. The F1-measure is used as the assessment criterion, and the following three results are obtained. The average recall of the Naive Bayes classifier is 0.94; therefore, in recall, the performance of the Naive Bayes classifier based on the automatically constructed ontology is excellent. The average precision of the Naive Bayes classifier is 0.81; therefore, in precision, its performance based on the automatically constructed ontology is good. The average F1-measure over the 10 categories for the Naive Bayes classifier is 0.86; therefore, in terms of F1-measure, its performance based on the automatically constructed ontology is effective. Thus, the automatically constructed domain ontology can indeed serve as the document categories and achieve effective document classification.
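As a sanity check on the figures above, the F1-measure is the harmonic mean of precision and recall; a minimal sketch:

```python
def f1(precision, recall):
    # F1 is the harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

# Harmonic mean of the reported average precision (0.81) and recall (0.94):
print(round(f1(0.81, 0.94), 2))  # 0.87
```

The result (about 0.87) differs slightly from the reported average F1 of 0.86; this is expected if the 0.86 is a macro-average of per-category F1 scores, which in general does not equal the F1 of the averaged precision and recall.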
This paper serves to introduce decision context and decision implication into Formal Concept Analysis (FCA). Since extracting decision implications directly from a decision context takes time, we present an inference rule to reduce the number of decision implications. Moreover, based on the inference rule, we introduce the notion of an α-maximal decision implication and prove that the set of all α-maximal decision implications is α-complete and α-non-redundant. Finally, we propose a method to generate this set.
Purpose - The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool that can help users make intelligent decisions in the presence of multidimensional data, when different and often contradictory criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting (not dominated in the sense of Pareto) objects from a set of data. Skyline computation methods often lead to a set with a large size, which is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and propose a solution to deal with it by applying an appropriate refining process.
Design/methodology/approach - The problem of skyline refinement is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, one can reduce the size of the computed skyline.
Findings - An appropriate and rational solution is discussed for the problem of interest. Then, a tool named SkyRef is developed. Rich experiments are done using this tool on both synthetic and real datasets.
Research limitations/implications - The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution-time criteria.
Practical implications - The developed tool, SkyRef, can have many application domains that require decision-making or personalized recommendation, and where the size of the skyline has to be reduced. In particular, SkyRef can be used in several real-world applications such as economics, security, medicine, and services.
Social implications - This work can be applied in all domains that require decision-making, such as hotel finders, restaurant recommenders, recruitment of candidates, etc.
Originality/value - This study mixes two research fields: artificial intelligence (i.e., formal concept analysis) and databases (i.e., skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes it clearer and more rational, semantically speaking. On the other hand, this study opens the door to using formal concept analysis and its extensions in solving other issues related to skyline queries, such as relaxation.
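The Pareto-dominance test at the heart of skyline computation can be sketched as follows. The hotel data is hypothetical, and the naive O(n²) algorithm merely illustrates the semantics; it is not the refinement method proposed in the paper.

```python
def dominates(a, b):
    """a Pareto-dominates b: at least as good on every criterion
    (here lower is better) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Naive O(n^2) skyline: keep the points dominated by no other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical hotels as (price, distance_to_beach), lower is better on both.
hotels = [(50, 9), (60, 10), (45, 8), (80, 2), (70, 3)]
print(skyline(hotels))  # [(45, 8), (80, 2), (70, 3)]
```

Here (50, 9) and (60, 10) are dominated by (45, 8) and drop out; the three surviving points are the mutually incomparable trade-offs, which is exactly the set whose size the refinement process aims to reduce.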
Explainable recommendation systems deal with the problem of 'why': besides providing the user with a recommendation, the system also explains why such an object is being recommended. This helps to improve trustworthiness, effectiveness, efficiency, persuasiveness, and user satisfaction towards the system. Recommending relevant information to the user together with an explanation is therefore required. Existing systems provide top-k recommendation options to the user based on ratings and reviews about the required object, but are unable to explain the matched-attribute-based recommendation to the user. A framework is proposed to fetch the most specific information that matches the user requirements, based on Formal Concept Analysis (FCA). The ranking quality of the recommendation list for the proposed system is evaluated quantitatively with Normalized Discounted Cumulative Gain (NDCG)@k, and is better than that of existing systems. Explanation is evaluated qualitatively by considering the trustworthiness criterion among the seven explainability evaluation criteria, and its metric validates the results of the proposed method. This framework can be enhanced for greater effectiveness and trustworthiness.
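NDCG@k, the metric used for the quantitative evaluation above, can be sketched in a few lines; the relevance grades below are hypothetical, not taken from the paper's dataset.

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain of the top-k results (log2 position discount)."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """NDCG@k: DCG of the given ranking, normalized by the ideal ranking's DCG."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# Graded relevance of a hypothetical 5-item recommendation list, in ranked order.
print(round(ndcg_at_k([3, 2, 3, 0, 1], 5), 3))  # 0.972
```

A perfectly ordered list scores 1.0; swapping highly relevant items toward the bottom lowers the score, which is why NDCG@k is a natural fit for judging the ranking quality of a top-k recommendation list.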
More and more cryptographic protocols have been used to achieve various security requirements of distributed systems in the open network environment. However, cryptographic protocols are very difficult to design and analyze due to the complexity of cryptographic protocol execution, and a large number of problems remain unsolved, ranging from the theoretical framework to concrete analysis techniques. In this paper, we build a new algebra called cryptographic protocol algebra (CPA) for describing message operations with many cryptographic primitives, and propose a new algebraic model for cryptographic protocols based on the CPA. In the model, the expanding processes of the participants' knowledge during protocol runs are characterized with algebraic notions such as subalgebra, free generator, and polynomial algebra, and attack processes are modeled with a new notion similar to that of the exact sequence used in homological algebra. We then develop a mathematical approach to cryptographic protocol security analysis. Using algebraic techniques, we have shown that for cryptographic protocols with certain symmetric properties, the execution space generated by an arbitrary number of participants may boil down to a smaller space generated by several honest participants and attackers. Furthermore, we discuss the composability problem of cryptographic protocols and give a sufficient condition under which the protocol composed of two correct cryptographic protocols is still correct, and we finally offer a counterexample to show that the statement may not hold when the condition is not met.
Researchers have proposed several security protocols to protect electronic commerce security in recent years; however, not all of them are secure enough. This article extends the model checking method with Casper/FDR2 to model and analyze a new electronic commerce protocol. Attacks are found in the protocol and their mechanisms are discussed. A variety of solutions are given to the different security flaws. The improved protocol is proven to be robust and secure.
XCD is a design-by-contract based architecture description language that supports modular specifications in terms of components and connectors (i.e., interaction protocols). XCD is supported by a translator that produces formal models in SPIN's ProMeLa formal verification language, which can then be formally analysed using SPIN's model checker. XCD is extended with a visual notation set called VXCD. VXCD extends UML's component diagram and adapts it to XCD's structure, contractual behaviour, and interaction protocol specifications. Visual VXCD specifications can be translated into textual XCD specifications for formal analysis. To illustrate VXCD, the well-known gas station system is used. The gas station system is specified contractually using VXCD's visual notation set and then formally analysed using SPIN's model checker for a number of properties, including deadlock and race conditions.
An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach has encountered the problem of high computational complexity, because protocol participants are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis processes are greatly reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was detected by this tool.
A new semantic model in the Abstract State Model (ASM) for authentication protocols is presented. It highlights the Woo-Lam ideas for authentication, which correspond to the strongest notion in Lowe's definition hierarchy for entity authentication. Apart from the flexible and natural features in forming and analyzing protocols inherited from ASM, the model defines both authentication and secrecy properties explicitly in first-order sentences as invariants. The process of proving security properties with respect to an authentication protocol blends the correctness and secrecy properties together, avoiding the potential flaws that may arise when they are treated separately. The security of the revised Helsinki protocol is shown as a case study. The new model is different from previous ones in ASMs.
With increased cyber attacks over the years, information system security assessment becomes more and more important. This paper provides an ontology-based attack model, and then utilizes it to assess information system security from the attack angle. We categorize attacks into a taxonomy suitable for security assessment. The proposed taxonomy consists of five dimensions: attack impact, attack vector, attack target, vulnerability, and defense. Afterwards we build an ontology according to the taxonomy. In the ontology, attack-related concepts included in the five dimensions and the relationships between them are formalized and analyzed in detail. We also populate our attack ontology with information about vulnerabilities from the national vulnerability database (NVD), such as common vulnerabilities and exposures (CVE), common weakness enumeration (CWE), common vulnerability scoring system (CVSS), and common platform enumeration (CPE). Finally we propose an ontology-based framework for the security assessment of network and computer systems, and describe the utilization of the ontology in security assessment and the method for evaluating the attack effect on a system when it is under attack.
In order to address the study of complex systems, the detection of patterns in their dynamics could play a key role in understanding their evolution. In particular, global patterns are required to detect emergent concepts and trends, some of them of a qualitative nature. Formal concept analysis (FCA) is a theory whose goal is to discover and extract knowledge from qualitative data (organized in concept lattices). In complex environments, such as sport competitions, the large amount of information currently available turns concept lattices into complex networks. The authors analyze how to apply FCA reasoning in order to increase confidence in sports predictions by detecting regularities in data through the management of intuitive and natural attributes extracted from publicly available information. The complexity of concept lattices, considered as networks with complex topological structure, is analyzed. This is applied to building a knowledge-based system for confidence-based reasoning, which simulates how humans tend to avoid the complexity of concept networks by means of bounded reasoning skills.
Because of the completeness of concept lattices, the time complexity of constructing concept lattices has become the main factor affecting the application of formal concept analysis (FCA). The key problems in concept lattice research are how to improve generation efficiency and how to reduce the space and time complexity of lattice construction. So far, reviews of lattice construction algorithms have not been comprehensive. In view of this situation, we give a detailed review of two categories of construction algorithms: batch methods and incremental methods. In the first category, the formal context cannot be updated once the concept lattice has been constructed; in the second, the formal context can be updated after a new object is added to it. We briefly introduce classical and improved construction methods, illustrate the deficiencies of some algorithms, and point out the improvements made by follow-up algorithms. Furthermore, we compare and discuss several key algorithms, and also pay attention to the application of concept lattices. Finally, two further research directions for concept lattices are proposed: parallel construction methods for concept lattices and research on concept lattices for heterogeneous data.
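The cost the survey is concerned with can be made concrete with the simplest possible batch method: enumerate every concept of a small, hypothetical formal context by closing object subsets. Its exponential running time is precisely what the surveyed batch and incremental algorithms improve on; this sketch is not any of the reviewed algorithms.

```python
from itertools import combinations

# Hypothetical formal context: objects 1-3 and their attribute sets.
context = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"a", "b", "c"}}
objects = set(context)
attributes = set().union(*context.values())

def intent(A):
    """Attributes common to all objects in A (all attributes if A is empty)."""
    return set.intersection(*(context[g] for g in A)) if A else set(attributes)

def extent(B):
    """Objects possessing all attributes in B."""
    return {g for g in objects if B <= context[g]}

def all_concepts():
    """Naive batch construction: close every object subset (exponential in |G|)."""
    concepts = set()
    for r in range(len(objects) + 1):
        for combo in combinations(sorted(objects), r):
            A = extent(intent(set(combo)))  # closure of the object subset
            concepts.add((frozenset(A), frozenset(intent(A))))
    return concepts

print(len(all_concepts()))  # 4 concepts for this context
```

Even this 3x3 context requires closing 8 object subsets; on a context with n objects the loop visits 2^n subsets, which is why efficient batch methods (and incremental ones, which update the lattice per added object) are the focus of the review.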
Topic recognition with a dynamic number of topics can realize the dynamic update of hyperparameters and obtain the probability distribution of dynamic topics in the time dimension, which helps in understanding and tracking streaming text data. However, current topic recognition models tend to be based on a fixed number of topics K and lack multi-granularity analysis of topic knowledge. Therefore, it is impossible to deeply perceive the dynamic change of topics in the time series. By introducing a novel approach based on the Infinite Latent Dirichlet Allocation model, a topic feature lattice under a dynamic topic number is constructed. In the model, documents, topics, and vocabularies are jointly modeled to generate two probability distribution matrices: documents-topics and topics-feature words. Afterwards, the association intensity between each topic and its feature vocabulary is computed to establish the topic formal context matrix. Finally, the topic features are induced according to formal concept analysis (FCA) theory. The topic feature lattice under dynamic topic number (TFL-DTN) model is validated on a real dataset by comparison with mainstream methods. Experiments show that this model is more in line with actual needs and achieves better results in the semi-automatic modeling of topic visualization analysis.
The majority of existing information systems deal with crisp data through crisp database systems. Traditional Database Management Systems (DBMS) have not taken imprecision into account, so one can say there is a certain lack of flexibility. The reason is that queries retrieve only elements which precisely match the given Boolean query: an element belongs to the result if the query is true for that element; otherwise, no answer is returned to the user. The aim of this paper is to present a cooperative approach to handling empty answers of fuzzy conjunctive queries by referring to Formal Concept Analysis (FCA) theory and fuzzy logic. We present an architecture which combines FCA and databases. The processing of fuzzy queries allows detecting the minimal reasons for empty answers. We also use the concept lattice in order to provide the user with the nearest answers in the case of a query failure.
An effective way to improve the efficiency of applications based on formal concept analysis (FCA) is to construct only the part of the concept lattice needed by the application. Inspired by this idea, an approach that constructs a lower concept semi-lattice, called the non-frequent concept semi-lattice in this paper, is introduced; the method is based on subposition assembly. First, we illustrate the theoretical framework of subposition assembly for the non-frequent concept semi-lattice. Second, an algorithm called Nocose based on this framework is proposed. Experiments show both the theoretical correctness and the practicability of the Nocose algorithm.
Funding: Specialized Research Fund for the Doctoral Program of Higher Education (No. 20050217007).
Funding: Startup Research Fund for the Doctoral Faculty of Shenyang University of Chemical Technology (No. 51045084).
Funding: National Natural Science Foundation of China (Nos. 61562026, 61962020); Academic and Technical Leaders of Major Disciplines in Jiangxi Province (No. 20172BCB22015); Special Fund Project for Postgraduate Innovation in Jiangxi Province (No. YC2020-B1141); Jiangxi Provincial Natural Science Foundation (No. 20224ACB202006).
Funding: supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2023S1A5A2A03083440).
Funding: Supported by the Fundamental Research Funds for the Central Universities, China.
Abstract: Identifying business components is the basis of component-based software engineering. Many approaches, including cluster analysis and concept analysis, have been proposed to identify components from business models. These approaches classify business elements into a set of components by analyzing their properties. However, most of them do not consider the differences in the properties of the business elements, which may decrease the accuracy of the identification results. Furthermore, component identification by partitioning business elements cannot reflect which features are responsible for the generation of certain results. This paper presents a new approach for component identification from business models using fuzzy formal concept analysis. First, the membership between business elements and their properties is quantified and transformed into a fuzzy formal context, from which the concept lattice is built using a refined incremental algorithm. Then the components are selected from the concepts according to concept dispersion and distance. Finally, the effectiveness and efficiency are validated by applying our approach to real-life cases and experiments.
Abstract: An effective, automatically constructed domain ontology is proposed in this paper. The main idea is to use formal concept analysis to automatically establish the domain ontology. The ontology then serves as the basis for a Naive Bayes classifier in order to demonstrate the effectiveness of the domain ontology for document classification. A set of 1752 documents divided into 10 categories is used to assess the effectiveness of the ontology, where 1252 and 500 documents are the training and testing documents, respectively. The F1-measure is used as the assessment criterion, and the following three results are obtained. The average recall of the Naive Bayes classifier is 0.94; therefore, in terms of recall, the performance of the classifier based on the automatically constructed ontology is excellent. The average precision of the Naive Bayes classifier is 0.81; therefore, in terms of precision, the performance is good. The average F1-measure over the 10 categories is 0.86; therefore, in terms of the F1-measure, the classifier based on the automatically constructed ontology is effective. Thus, the automatically constructed domain ontology can indeed serve as the basis for document categories and achieve effective document classification.
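The F1-measure referenced above is the harmonic mean of precision and recall. A minimal sketch follows; note that the paper's reported 0.86 is an average of per-category F1 scores, which need not equal the F1 computed from the averaged precision and recall.

```python
def f1(precision, recall):
    """F1-measure: the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Harmonic mean of the reported averages (precision 0.81, recall 0.94)
# is about 0.87; macro-averaging per-category F1 scores, as the paper
# does, generally yields a slightly different value (here, 0.86).
print(round(f1(0.81, 0.94), 2))  # → 0.87
```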
Abstract: This paper introduces decision contexts and decision implications into formal concept analysis (FCA). Since extracting decision implications directly from a decision context is time-consuming, we present an inference rule to reduce the number of decision implications. Moreover, based on the inference rule, we introduce the notion of an α-maximal decision implication and prove that the set of all α-maximal decision implications is α-complete and α-non-redundant. Finally, we propose a method to generate this set.
Funding: The authors would like to express their gratitude to the Directorate General for Scientific Research and Technological Development (DGRSDT) for the support of this work under subvention number C0662300 and grant number 167/PNE.
Abstract: Purpose - The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool that can help users make intelligent decisions in the presence of multidimensional data, when different, and often contradictory, criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting (not dominated in the sense of Pareto) objects from a set of data. Skyline computation methods often lead to a result set whose large size is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and propose a solution to deal with it by applying an appropriate refining process. Design/methodology/approach - The problem of skyline refinement is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, one can reduce the size of the computed skyline. Findings - An appropriate and rational solution is discussed for the problem of interest. Then, a tool named SkyRef is developed. Rich experiments are done using this tool on both synthetic and real datasets. Research limitations/implications - The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution-time criteria. Practical implications - The developed tool SkyRef can have applications in many domains that require decision-making or personalized recommendation and where the size of the skyline has to be reduced; in particular, SkyRef can be used in several real-world applications such as economics, security, medicine, and services. Social implications - This work can be applied in all domains that require decision-making, such as hotel finders, restaurant recommenders, recruitment of candidates, etc. Originality/value - This study mixes two research fields: artificial intelligence (formal concept analysis) and databases (skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes it clearer and more rational, semantically speaking. On the other hand, this study opens the door to using formal concept analysis and its extensions in solving other issues related to skyline queries, such as relaxation.
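The Pareto-dominance test underlying the skyline process can be sketched as follows. This is a naive block-nested-loop skyline, assuming lower values are preferred on every dimension; it illustrates the skyline computation itself, not the SkyRef refinement.

```python
def dominates(p, q):
    """p Pareto-dominates q if p is no worse than q in every dimension
    and strictly better in at least one (here, smaller is better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive block-nested-loop skyline: keep the points that no other
    point dominates. Quadratic, but fine for illustration."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical hotels as (price, distance) pairs; both are to be minimized.
hotels = [(50, 8), (60, 2), (40, 10), (70, 1), (55, 9)]
print(skyline(hotels))  # → [(50, 8), (60, 2), (40, 10), (70, 1)]
```

Note that four of the five points survive: mutually incomparable points all stay, which is precisely the large-size skyline problem the paper addresses.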
Abstract: Explainable recommendation systems deal with the problem of "why": besides providing the user with a recommendation, the system also explains why such an object is being recommended. This helps to improve trustworthiness, effectiveness, efficiency, persuasiveness, and user satisfaction with the system. Recommending relevant information to the user, together with an explanation, is therefore required. Existing systems provide top-k recommendation options to the user based on ratings and reviews about the required object, but they are unable to explain the matched-attribute-based recommendation to the user. A framework is proposed to fetch the most specific information that matches the user requirements based on formal concept analysis (FCA). The ranking quality of the recommendation list for the proposed system is evaluated quantitatively with Normalized Discounted Cumulative Gain (NDCG)@k, which is better than that of existing systems. The explanation is evaluated qualitatively by considering the trustworthiness criterion, one of the seven explainability evaluation criteria, and its metric confirms the results of the proposed method. This framework can be enhanced for greater effectiveness and trustworthiness.
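NDCG@k, the ranking-quality metric used above, discounts each relevance score by the logarithm of its rank position and normalizes by the ideal (descending) ordering; a minimal sketch:

```python
import math

def dcg(rels, k):
    """Discounted cumulative gain over the top-k relevance scores:
    each score at rank i (0-based) is divided by log2(i + 2)."""
    return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))

def ndcg(rels, k):
    """NDCG@k: DCG of the given ranking divided by DCG of the ideal one."""
    ideal = dcg(sorted(rels, reverse=True), k)
    return dcg(rels, k) / ideal if ideal > 0 else 0.0

# A perfectly ordered list scores 1.0; misordering lowers the score.
print(ndcg([3, 2, 1, 0], k=4))        # → 1.0
print(ndcg([1, 2, 3, 0], k=4) < 1.0)  # → True
```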
Abstract: More and more cryptographic protocols have been used to achieve various security requirements of distributed systems in the open network environment. However, cryptographic protocols are very difficult to design and analyze due to the complexity of cryptographic protocol execution, and a large number of problems remain unsolved, ranging from the theoretical framework to concrete analysis techniques. In this paper, we build a new algebra, called cryptographic protocol algebra (CPA), for describing message operations with many cryptographic primitives, and propose a new algebraic model for cryptographic protocols based on the CPA. In the model, the expanding processes of the participants' knowledge during protocol runs are characterized with algebraic notions such as subalgebra, free generator, and polynomial algebra, and attack processes are modeled with a new notion similar to that of the exact sequence used in homological algebra. We then develop a mathematical approach to cryptographic protocol security analysis. Using algebraic techniques, we show that for cryptographic protocols with certain symmetric properties, the execution space generated by an arbitrary number of participants may boil down to a smaller space generated by several honest participants and attackers. Furthermore, we discuss the composability problem of cryptographic protocols, give a sufficient condition under which the protocol composed of two correct cryptographic protocols is still correct, and finally offer a counterexample to show that the statement may not be true when the condition is not met.
Abstract: Researchers have proposed several security protocols to protect electronic commerce security in recent years; however, not all of them are secure enough. This article extends the model checking method with Casper/FDR2 to model and analyze a new electronic protocol. Attacks are found in the protocol and their mechanisms are discussed. A variety of solutions are given for the different security flaws. The improved protocol is proven to be robust and secure.
Abstract: XCD is a design-by-contract based architecture description language that supports modular specifications in terms of components and connectors (i.e., interaction protocols). XCD is supported by a translator that produces formal models in SPIN's ProMeLa formal verification language, which can then be formally analysed using SPIN's model checker. XCD is extended with a visual notation set called VXCD. VXCD extends UML's component diagram and adapts it to XCD's structure, contractual behaviour, and interaction protocol specifications. Visual VXCD specifications can be translated into textual XCD specifications for formal analysis. To illustrate VXCD, the well-known gas station system is used. The gas station system is specified contractually using VXCD's visual notation set and then formally analysed using SPIN's model checker for a number of properties, including deadlock and race conditions.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 90412011), the State Key Basic Research Program (973) (Grant No. 2005CB321803), and the State "863" High-tech Research and Development Project (Grant No. 2003AA144150).
Abstract: An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, this approach has encountered the problem of high computational complexity, because the participants of protocols are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis process are much reduced by introducing a new algebraic technique called Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not previously reported was found using this tool.
Funding: Supported by the National Natural Science Foundation of China, the National High-tech Research and Development Program of China (863 Program), the National Key Basic Research Program of China (973 Program), and the Foundation for Extraordinary Young Researchers under
Abstract: A new semantic model in Abstract State Model (ASM) for authentication protocols is presented. It highlights Woo-Lam's ideas for authentication, which is the strongest notion in Lowe's definition hierarchy for entity authentication. Apart from the flexible and natural features in forming and analyzing protocols inherited from ASM, the model defines both authentication and secrecy properties explicitly in first-order sentences as invariants. The process of proving security properties with respect to an authentication protocol blends the correctness and secrecy properties together to avoid the potential flaws that may occur when they are treated separately. The security of the revised Helsinki protocol is shown as a case study. The new model differs from previous ASM-based models.
Funding: Supported by the National Basic Research Program (973) of China (No. 2010CB731403), the Information Network Security Key Laboratory Open Project of the Ministry of Public Security of China (No. C09603), and the Shanghai Key Scientific and Technological Project (No. 11511504302).
Abstract: With increased cyber attacks over the years, information system security assessment has become more and more important. This paper provides an ontology-based attack model and then utilizes it to assess information system security from the attack angle. We categorize attacks into a taxonomy suitable for security assessment. The proposed taxonomy consists of five dimensions: attack impact, attack vector, attack target, vulnerability, and defense. Afterwards, we build an ontology according to the taxonomy. In the ontology, attack-related concepts included in the five dimensions and the relationships between them are formalized and analyzed in detail. We also populate our attack ontology with information about vulnerabilities from the national vulnerability database (NVD), such as common vulnerabilities and exposures (CVE), common weakness enumeration (CWE), the common vulnerability scoring system (CVSS), and common platform enumeration (CPE). Finally, we propose an ontology-based framework for security assessment of network and computer systems, and describe the utilization of the ontology in security assessment and the method for evaluating the attack effect on a system when it is under attack.
Funding: Supported by the TIN2009-09492 project of the Spanish Ministry of Science and Innovation and the Excellence project TIC-6064 of the Junta de Andalucía, co-financed by FEDER funds.
Abstract: In order to address the study of complex systems, the detection of patterns in their dynamics could play a key role in understanding their evolution. In particular, global patterns are required to detect emergent concepts and trends, some of them of a qualitative nature. Formal concept analysis (FCA) is a theory whose goal is to discover and extract knowledge from qualitative data (organized in concept lattices). In complex environments, such as sports competitions, the large amount of information currently available turns concept lattices into complex networks. The authors analyze how to apply FCA reasoning in order to increase confidence in sports predictions by detecting regularities in data through the management of intuitive and natural attributes extracted from publicly available information. The complexity of concept lattices, considered as networks with complex topological structure, is analyzed. This is applied to building a knowledge-based system for confidence-based reasoning, which simulates how humans tend to avoid the complexity of concept networks by means of bounded reasoning skills.
Funding: Supported by the National Natural Science Foundation of China (No. 61273328) and the National Key Foundation for Exploring Scientific Instrument of China (No. 51227803).
Abstract: Because of the completeness of concept lattices, the time complexity of constructing them has become the main factor limiting the application of formal concept analysis (FCA). The key problems in concept lattice research are how to improve generation efficiency and how to reduce the space and time complexity of lattice construction. So far, reviews of lattice construction algorithms have not been comprehensive. In view of this situation, we give a detailed review of two categories of construction algorithms: batch methods and incremental methods. The first category assumes a formal context that cannot be updated once the concept lattice has been constructed; the second assumes a formal context that can be updated after a new object is added. We briefly introduce classical and improved construction methods, illustrate the deficiencies of some algorithms, and point out the improvements made by follow-up algorithms. Furthermore, we compare and discuss several key algorithms, and also pay attention to the application of concept lattices. Finally, two further research directions for concept lattices are proposed: parallel construction methods and the study of concept lattices over heterogeneous data.
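As a toy illustration of a batch construction method, the sketch below enumerates all formal concepts of a small binary context by closing every attribute subset with the two derivation operators. It is exponential in the number of attributes, which is exactly the complexity problem the surveyed algorithms (e.g., refined batch and incremental methods) try to mitigate; the context here is invented for the example.

```python
from itertools import combinations

def concepts(context):
    """Enumerate all formal concepts (extent, intent) of a binary context
    given as {object: set_of_attributes}, by closing every attribute subset.
    Naive batch method: O(2^|attributes|) closures."""
    objects = list(context)
    attributes = sorted({a for attrs in context.values() for a in attrs})
    found = {}
    for r in range(len(attributes) + 1):
        for subset in combinations(attributes, r):
            # Derivation: extent = objects possessing every attribute in subset.
            extent = frozenset(g for g in objects if set(subset) <= context[g])
            # Derivation again: intent = attributes shared by the whole extent.
            if extent:
                intent = frozenset(set.intersection(*(context[g] for g in extent)))
            else:
                intent = frozenset(attributes)
            found[intent] = extent  # closure: distinct intents = distinct concepts
    return sorted((sorted(e), sorted(i)) for i, e in found.items())

# A 3-object, 3-attribute context; it yields exactly 4 formal concepts.
ctx = {"g1": {"a", "b"}, "g2": {"b", "c"}, "g3": {"a", "b", "c"}}
for extent, intent in concepts(ctx):
    print(extent, intent)
```

An incremental method would instead update the lattice concept by concept as each new object arrives, avoiding the full re-enumeration this batch sketch performs.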
Funding: Supported by the Key Projects of Social Sciences of the Anhui Provincial Department of Education (SK2018A1064, SK2018A1072), the Natural Science Project of the Anhui Provincial Department of Education (KJ2019A0371), and the Innovation Team of Health Information Management and Application Research (BYKC201913), BBMC.
Abstract: Topic recognition with a dynamic number of topics can realize the dynamic update of hyperparameters and obtain the probability distribution of dynamic topics in the time dimension, which helps in understanding and tracking streaming text data. However, current topic recognition models tend to be based on a fixed number of topics K and lack multi-granularity analysis of topic knowledge; therefore, they cannot deeply perceive the dynamic change of topics in a time series. By introducing a novel approach based on the infinite Latent Dirichlet Allocation model, a topic feature lattice under a dynamic topic number is constructed. In the model, documents, topics, and vocabularies are jointly modeled to generate two probability distribution matrices: documents-topics and topics-feature words. Afterwards, the association intensity between each topic and its feature vocabulary is computed to establish the topic formal context matrix. Finally, the topic features are induced according to formal concept analysis (FCA) theory. The topic feature lattice under dynamic topic number (TFL-DTN) model is validated on a real dataset by comparison with mainstream methods. Experiments show that this model is more in line with actual needs and achieves better results in semi-automatic modeling for topic visualization analysis.
Abstract: The majority of existing information systems deal with crisp data through crisp database systems. Traditional database management systems (DBMS) have not taken imprecision into account, so they suffer from a certain lack of flexibility: queries retrieve only elements which precisely match the given Boolean query. That is, an element belongs to the result if the query is true for this element; otherwise, no answer is returned to the user. The aim of this paper is to present a cooperative approach to handling empty answers of fuzzy conjunctive queries by drawing on formal concept analysis (FCA) theory and fuzzy logic. We present an architecture which combines FCA and databases. The processing of fuzzy queries allows detecting the minimal reasons for empty answers. We also use the concept lattice in order to provide the user with the nearest answers in the case of a query failure.
Funding: Supported by the National Natural Science Foundation of China (60970018) and the Fundamental Research Funds for the Central Universities.
Abstract: An efficient way to improve the performance of applications based on formal concept analysis (FCA) is to construct only the part of the concept lattice that the application actually needs. Inspired by this idea, an approach that constructs a lower concept semi-lattice, called a non-frequent concept semi-lattice in this paper, is introduced; the method is based on subposition assembly. First, we illustrate the theoretical framework of subposition assembly for the non-frequent concept semi-lattice. Second, an algorithm called Nocose based on this framework is proposed. Experiments show both the theoretical correctness and the practicability of the Nocose algorithm.