The Geometrical Optics (GO) approach and the FAST Emissivity Model (FASTEM) are widely used to estimate the surface radiative components in atmospheric radiative transfer simulations, but their applications are limited to specific conditions. In this study, a two-scale reflectivity model (TSRM) and a two-scale emissivity model (TSEM) are developed from two-scale roughness theory. Unlike GO, which computes only six non-zero elements of the reflectivity matrix, the TSRM includes all 16 elements of the Stokes reflectivity matrix, which are important for improving radiative transfer simulation accuracy in a scattering atmosphere. It covers the frequency range from L- to W-band. The dependences of all TSRM elements on zenith angle, wind speed, and frequency are derived and analyzed in detail. For a set of downwelling radiances at microwave frequencies, the reflected upwelling brightness temperatures (BTs) are calculated from both the TSRM and GO and compared to analyze their discrepancies. The TSRM not only includes the effects of GO but also accounts for the small-scale Bragg scattering effect, which amounts to several kelvins in brightness temperature. Also, the third and fourth components of the Stokes vector can only be produced by the TSRM. For the emitted radiation, BT differences in vertical polarization between the TSEM and FASTEM are generally less than 5 K when the satellite zenith angle is less than 40°, whereas those for the horizontal component can be quite significant, greater than 20 K.
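The role of the full Stokes reflectivity matrix can be illustrated with simple linear algebra: the reflected brightness-temperature Stokes vector is the matrix product of the 4x4 reflectivity matrix and the downwelling Stokes vector. This is only a sketch of that bookkeeping; the matrix values below are invented placeholders, not TSRM or GO output.

```python
import numpy as np

def reflected_stokes(R, T_down):
    """Reflected brightness-temperature Stokes vector: T_r = R @ T_d."""
    R = np.asarray(R, dtype=float)
    T_down = np.asarray(T_down, dtype=float)
    assert R.shape == (4, 4) and T_down.shape == (4,)
    return R @ T_down

# A GO-style matrix has only six non-zero elements (no coupling from
# (v, h) into the 3rd/4th Stokes components), so with an unpolarized
# downwelling field the reflected U and V stay zero; a full 16-element
# matrix (as in the TSRM) can produce non-zero U and V.
R_go = np.array([[0.30, 0.00, 0.0, 0.0],
                 [0.00, 0.40, 0.0, 0.0],
                 [0.00, 0.00, 0.2, 0.1],
                 [0.00, 0.00, -0.1, 0.2]])
T_d = np.array([100.0, 120.0, 0.0, 0.0])  # downwelling (Tv, Th, U, V) in K
print(reflected_stokes(R_go, T_d))        # U and V components remain 0
```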
In order to reduce redundant features in air combat information and to meet the requirements of real-time decision-making in combat, rough set theory is introduced into tactical decision analysis for cooperative team air combat. An algorithm of attribute reduction for extracting key combat information and generating tactical rules from given air combat databases is presented. Then, considering the practical requirements of team combat, a method for the reduction of attribute values under a single decision attribute is extended to reduction under multiple decision attributes. Finally, the algorithm is verified with an example of tactical choices in team air combat. The results show that the redundant attributes in air combat information can be reduced and that the main combat attributes, i.e., the information about radar command and medium-range guided missiles, can be obtained with the algorithm; moreover, the minimal reduced strategy for tactical decision-making can be generated without losing the key information classification. The decision rules extracted agree with real situations in team air combat.
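Classical rough-set attribute reduction of the kind described above keeps the smallest condition-attribute subset that preserves the positive region of the decision. A minimal sketch follows; the toy table (radar, missile, altitude) is invented for illustration, not taken from any combat database.

```python
from itertools import combinations

def ind_classes(rows, attrs):
    """Partition row indices into equivalence classes of IND(attrs)."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return list(classes.values())

def positive_region(rows, attrs, decision):
    """Indices whose IND(attrs)-class is consistent on the decision."""
    pos = set()
    for cls in ind_classes(rows, attrs):
        if len({rows[i][decision] for i in cls}) == 1:
            pos.update(cls)
    return pos

def reduct(rows, conds, decision):
    """Smallest condition subset preserving the full positive region."""
    full = positive_region(rows, conds, decision)
    for k in range(1, len(conds) + 1):
        for subset in combinations(conds, k):
            if positive_region(rows, list(subset), decision) == full:
                return list(subset)
    return conds

# columns: radar command, missile status, altitude band, decision
table = [
    {"radar": 1, "missile": 1, "alt": 0, "d": "attack"},
    {"radar": 1, "missile": 0, "alt": 1, "d": "evade"},
    {"radar": 0, "missile": 1, "alt": 0, "d": "evade"},
    {"radar": 0, "missile": 0, "alt": 1, "d": "evade"},
]
print(reduct(table, ["radar", "missile", "alt"], "d"))
```

On this toy table the altitude attribute is redundant: radar and missile alone already classify every row consistently, echoing the abstract's finding that radar command and missile information are the main combat attributes.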
How to extract knowledge from a decision table based on rough set theory is being widely studied. A newer problem is how to discretize a decision table having continuous attributes. In order to obtain more reasonable discretization results, a discretization algorithm is proposed which arranges half-global discretization based on the correlation coefficient of each continuous attribute while considering the uniqueness of rough set theory. When choosing heuristic information, stability is combined with rough entropy. In terms of stability, the possibility of classifying objects belonging to a certain sub-interval of a given attribute into neighboring sub-intervals is minimized; by doing this, rational discrete intervals can be determined. Rough entropy is employed to decide the optimal cut-points while guaranteeing the consistency of the decision table after discretization. The idea of the algorithm is elaborated through the Iris data, and experiments comparing the outcomes of four discretized datasets, computed by the proposed algorithm and four other typical discretization algorithms respectively, are also given. After that, classification rules are deduced and summarized through rough-set-based classifiers. Results show that the proposed discretization algorithm is able to generate optimal classification accuracy while minimizing the number of discrete intervals. It displays superiority especially when dealing with a decision table having a large number of attributes.
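The core of any such discretizer is scoring candidate cut-points. A minimal entropy-guided sketch is shown below on Iris-like toy data; the actual algorithm additionally weighs stability and rough entropy, which this simplified version omits.

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum((c / n) * log2(c / n)
                for c in {l: labels.count(l) for l in set(labels)}.values())

def best_cut(values, labels):
    """Midpoint cut that minimizes the weighted class entropy."""
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= cut]
        right = [l for v, l in pairs if v > cut]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        best = min(best, (w, cut))
    return best[1]

# Iris-like toy data: petal lengths with species labels
vals = [1.4, 1.3, 1.5, 4.7, 4.5, 5.0]
labs = ["setosa"] * 3 + ["versicolor"] * 3
print(best_cut(vals, labs))   # midpoint between 1.5 and 4.5
```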
The basic principles of IF/THEN rules in rough set theory are analyzed first, and then the automatic process of knowledge acquisition is given. The numerical data are qualitatively processed through the classification of membership functions and membership degrees to obtain the normative decision table. The regularization method for relations and the reduction algorithm for attributes are studied. The reduced relations are presented by the multi-representative-value method, and its algorithm is offered. The whole knowledge acquisition process has a high degree of automation, and the extracted knowledge is true and reliable.
This paper proposes a clustering technique that minimizes the need for subjective human intervention and is based on elements of rough set theory (RST). The proposed algorithm is unified in its approach to clustering and makes use of both local and global data properties to obtain clustering solutions. It handles single-type and mixed-attribute data sets with ease. Results from three data sets of single and mixed attribute types are used to illustrate the technique and establish its efficiency.
Seismic vulnerability assessment of urban buildings is among the most crucial procedures for post-disaster response and recovery of infrastructure systems. The present study estimates the seismic vulnerability of urban buildings and proposes a new framework built on two objectives. First, a comprehensive interpretation of the effective parameters of this phenomenon, including physical and human factors, is carried out. Second, rough set theory is used to reduce the integration uncertainties, as there are numerous quantitative and qualitative data. Both objectives were pursued for seven distinct earthquake scenarios with different intensities based on distance from the fault line and the epicenter. The proposed method was implemented by measuring seismic vulnerability for the seven specified seismic scenarios. The final results indicated that, among all the studied buildings, 71.5% were highly vulnerable under the highest earthquake scenario (intensity = 7 MM, with acceleration calculated based on the epicenter), while in the lowest earthquake scenario (intensity = 5 MM) the percentage of vulnerable buildings decreased to approximately 57%. The findings also proved that the distance from the fault line, rather than from the earthquake center (epicenter), has a significant effect on the seismic vulnerability of urban buildings. The model was evaluated by comparing the results with the weighted linear combination (WLC) method, and its accuracy was substantiated according to the evaluation reports. Vulnerability assessment based on the distance from the epicenter, compared with assessment based on the distance from the fault, yields significantly reliable results.
In this paper, we propose a novel Intrusion Detection System (IDS) architecture utilizing both evidence theory and Rough Set Theory (RST). Evidence theory is an effective tool for dealing with uncertainty, but it relies on expert knowledge to provide evidences and needs the evidences to be independent, which makes it difficult to apply. To solve this problem, a hybrid system of rough sets and evidence theory is proposed. Firstly, simplification is performed based on Variable Precision Rough Set (VPRS) conditional entropy, so that the Basic Belief Assignment (BBA) for all evidences can be calculated. Secondly, Dempster's rule of combination is applied and a decision is made. In the proposed approach, the difficulties in acquiring the BBAs are resolved, the correlation among the evidences is reduced, and the subjectivity of the evidences is weakened. An illustrative example in intrusion detection shows that the combination of the two theories is feasible and effective.
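Dempster's rule of combination, the second step above, can be sketched compactly over a two-element frame {normal, attack}; the mass values here are invented for illustration, not derived from any VPRS computation.

```python
def dempster(m1, m2):
    """Combine two basic belief assignments (dicts of frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:                      # focal elements intersect
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:                          # total conflict mass K
                conflict += ma * mb
    k = 1.0 - conflict                     # normalization factor
    return {s: v / k for s, v in combined.items()}

N, A = frozenset({"normal"}), frozenset({"attack"})
theta = N | A                              # the whole frame (ignorance)
m1 = {A: 0.6, theta: 0.4}                  # evidence from one source
m2 = {A: 0.7, N: 0.1, theta: 0.2}          # evidence from another
print(dempster(m1, m2))
```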
In order to avoid discretization in classical rough set theory, a generalized rough set theory is proposed. First, the degrees of general importance of an attribute and of attribute subsets are presented. Then, depending on the degree of general importance of the attributes, the spatial distance can be measured with a weighted method. Finally, a generalized rough set theory based on the general near-neighborhood relation is proposed. The proposed theory partitions the universe into tolerant modules and forms the lower and upper approximations of a set under the general near-neighborhood relation, which avoids the discretization required in Pawlak's rough set theory.
A new image recognition method based on fuzzy rough set theory is proposed, and its implementation is discussed. The performance of this method as applied to ferrography image recognition is evaluated. It is shown that the new method gives better results than fuzzy or rough set methods used alone.
An improved method is presented for knowledge reduction in rough set (R-S) theory, where R-S is used to model the information from oil and vibration diagnosis. Accordingly, typical fault simulation tests on rolling bearings were carried out, and the application of R-S is also analyzed in this paper. A diagnosis model for the cage (holding rack) fault in rolling bearings is presented based on the improved reduction method. The approach suits information fusion when oil analysis and vibration analysis are combined for fault diagnosis.
To investigate the problem of judging the optimal dividing matrix among several fuzzy dividing matrices in a fuzzy dividing space, which is determined by the various choices of cluster samples in the total sample space, two algorithms are proposed on the basis of the data analysis method of rough set theory: an information system discretization algorithm (algorithm 1) and a sample representativeness judging algorithm (algorithm 2). On the principle of the farthest distance, algorithm 1 transforms continuous data into a discrete form that can be handled by rough set theory. Taking the approximation precision as a criterion, algorithm 2 chooses the sample space with good representativeness. Hence, the clustering sample set for inducing and computing the optimal dividing matrix can be obtained. Several theorems are proposed to provide a strict theoretical foundation for the execution of the algorithm model. An applied example based on the new algorithm model is given, whose result verifies the feasibility of the model.
In this paper, we propose two intrusion detection methods that combine rough set theory and Fuzzy C-Means for network intrusion detection. The first step consists of feature selection based on rough set theory; the next phase is clustering using Fuzzy C-Means. Rough set theory is an efficient tool for further reducing redundancy, while Fuzzy C-Means allows objects to belong to several clusters simultaneously, with different degrees of membership. To evaluate the performance of the introduced approaches, we apply them to the international Knowledge Discovery and Data Mining intrusion detection dataset. In the experiments, we compare the performance of the two rough-set-based hybrid methods for network intrusion detection. Experimental results illustrate that our algorithms are accurate models for handling complex attack patterns in large networks, and that the two methods can increase efficiency and reduce the dataset by looking for overlapping categories.
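The Fuzzy C-Means step can be sketched in a few lines: alternate between computing fuzzy cluster centers and updating the membership matrix. This is a generic FCM implementation with fuzzifier m = 2 on invented 2-D data, not the paper's pipeline, which would first apply rough-set feature selection to KDD records.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=50, seed=0):
    """Minimal Fuzzy C-Means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                          # columns sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        # distances of every point to every center, shape (c, n)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))
    return centers, U

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
centers, U = fuzzy_c_means(X, c=2)
print(np.round(centers, 2))   # one center near each tight point group
```

Note how each object keeps a membership in both clusters (the columns of U), which is exactly the "belong to several clusters simultaneously" property the abstract relies on.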
The assessment model of CPA audit independence risk based on rough set theory is a risk assessment method that also uses the Analytic Hierarchy Process; its aim is to support bid management in the process of Financial Statement Insurance. Firstly, according to the general circumstances of the accounting office, experts grade the risk elements to establish the decision-making table. Secondly, the judgment matrix is constructed using the attribute dependence degree of the Variable Precision Rough Set to obtain the relative importance, and further the overall importance, of all risk elements. Finally, the general assessment model of audit independence risk is established.
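Once a judgment matrix is in hand, AHP-style weights can be extracted with the standard geometric-mean (row) method. A sketch follows; the 3x3 pairwise matrix is invented, whereas the paper derives its matrix entries from VPRS attribute dependence degrees.

```python
import numpy as np

def ahp_weights(J):
    """Priority weights of a pairwise judgment matrix (geometric-mean method)."""
    g = np.prod(J, axis=1) ** (1.0 / J.shape[0])   # row geometric means
    return g / g.sum()                             # normalize to sum to 1

# Hypothetical judgments among three risk elements: J[i, j] says how much
# more important element i is than element j (reciprocal matrix).
J = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])
w = ahp_weights(J)
print(np.round(w, 3))
```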
Rough set philosophy hinges on the granularity of data, which is used to build all its basic concepts, such as approximations, dependencies, and reduction. Genetic Algorithms provide a general framework for optimizing solutions to problems in complex systems without depending on the problem domain, and they are robust to many kinds of problems. This paper combines Genetic Algorithms and rough set theory to compute granules of knowledge through an example of an information table. The combination enables us to compute knowledge granules effectively; it is also useful for automatic computation and information processing.
Rough set theory, proposed by Pawlak in 1982, is a tool for dealing with the uncertainty and vagueness aspects of knowledge models. The main idea of rough sets corresponds to the lower and upper approximations based on equivalence relations. This paper studies the rough set and its extensions. We present a linear algebra approach to the rough set and its extension, give an equivalent definition of the lower and upper approximations of a rough set based on the characteristic functions of sets, and then explain the lower and upper approximations as the colinear map and linear map of sets, respectively. Finally, we define rough sets over fuzzy lattices, which cover both the rough set and the fuzzy rough set, and construct independent axiomatic systems that characterize the lower and upper approximations of rough sets over fuzzy lattices, respectively, based on inner and outer products. The axiomatic systems unify the axiomatization of Pawlak's rough sets and fuzzy rough sets.
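The classical lower and upper approximations that all of these extensions build on can be stated in a few lines: given a partition of the universe into equivalence classes, the lower approximation collects the classes fully inside the target set, the upper approximation those that merely intersect it. The toy partition below is invented for illustration.

```python
def pawlak_approximations(classes, X):
    """Pawlak lower/upper approximations of X given equivalence classes."""
    lower = set().union(*([c for c in classes if c <= X] or [set()]))
    upper = set().union(*([c for c in classes if c & X] or [set()]))
    return lower, upper

classes = [{1, 2}, {3, 4}, {5}]   # partition of the universe {1..5}
X = {1, 2, 3}                     # target concept
lo, up = pawlak_approximations(classes, X)
print(lo, up)                     # X is "rough": lower != upper
```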
This article focuses on the relationship between mathematical morphology operations and rough sets, mainly in the context of image retrieval and the basic image correspondence problem. Mathematical morphological procedures and set approximations in rough set theory have some clear parallels, and numerous initiatives have been made to connect the two fields, with many significant publications written in this area. Some attempt to show a direct connection between mathematical morphology and rough sets through relations, a pair of dual operations, and neighborhood systems. Rough sets are used to suggest a strategy for approximating mathematical morphology within the general paradigm of soft computing. A single framework is defined using a different technique that incorporates the key ideas of both rough sets and mathematical morphology. This paper examines rough set theory from the viewpoint of mathematical morphology to derive rough forms of the morphological structures of dilation, erosion, opening, and closing. These newly defined structures are applied to develop an algorithm for the differential analysis of chest X-ray images from a COVID-19 patient with acute pneumonia and a healthy subject. The algorithm and the rough morphological operations show promise for the delineation of lung occlusion in COVID-19 patients from chest X-rays. The foundations of mathematical morphology are covered in this article; rough set theory ideas are then taken into account and their connections examined. Finally, a suggested image retrieval application of the concepts from these two fields is provided.
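The parallel the abstract describes can be made concrete on a binary image: with a 3x3 neighborhood as the granule, erosion behaves like a lower approximation (every neighbor inside the set) and dilation like an upper approximation (some neighbor inside the set). A pure-NumPy sketch of the standard operators, not the paper's rough variants:

```python
import numpy as np

def shift_stack(img):
    """Stack the 9 one-pixel shifts of img (zero-padded at the border)."""
    padded = np.pad(img, 1)
    h, w = img.shape
    return np.stack([padded[1 + di:1 + di + h, 1 + dj:1 + dj + w]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)])

def erode(img):    # lower-approximation analogue: all neighbors in the set
    return shift_stack(img).all(axis=0).astype(int)

def dilate(img):   # upper-approximation analogue: some neighbor in the set
    return shift_stack(img).any(axis=0).astype(int)

img = np.zeros((5, 5), dtype=int)
img[1:4, 1:4] = 1                  # a 3x3 square of foreground
print(erode(img))                  # shrinks to the centre pixel
print(dilate(img))                 # grows to the full 5x5 block
```

Opening and closing then follow as `dilate(erode(img))` and `erode(dilate(img))`, mirroring the dual pairs discussed above.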
The presence of numerous uncertainties in hybrid decision information systems (HDISs) renders attribute reduction a formidable task. Currently available attribute reduction algorithms, including those based on Pawlak attribute importance, the Skowron discernibility matrix, and information entropy, struggle to effectively manage multiple uncertainties simultaneously in HDISs, such as the precise measurement of disparities between nominal attribute values, and attributes with fuzzy boundaries and abnormal values. To address these issues, this paper studies attribute reduction within HDISs. First, a novel metric based on the decision attribute, called the supervised distance, is introduced to solve the problem of accurately measuring the differences between nominal attribute values. Then, based on this metric, a novel fuzzy relationship is defined from the perspective of "feedback on parity of attribute values to attribute sets"; this fuzzy relationship serves as a valuable tool for addressing the challenges posed by abnormal attribute values. Furthermore, leveraging the new fuzzy relationship, the fuzzy conditional information entropy is defined as a solution to the challenges posed by fuzzy attributes: it effectively quantifies the uncertainty associated with fuzzy attribute values, thereby providing a robust framework for handling fuzzy information in hybrid information systems. Finally, an algorithm for attribute reduction utilizing the fuzzy conditional information entropy is presented. Experimental results on 12 datasets show that the average reduction rate of the algorithm reaches 84.04%, and that classification accuracy is improved by 3.91% compared to the original dataset and by an average of 11.25% compared to nine other state-of-the-art reduction algorithms. A comprehensive analysis of these results indicates that the algorithm is highly effective in managing the intricate uncertainties inherent in hybrid data.
Rough set theory places great importance on approximation accuracy, which is used to gauge how well a rough set model describes a target concept. However, traditional approximation accuracy has limitations, since it varies with changes in the target concept and cannot evaluate the overall descriptive ability of a rough set model. To overcome this, two types of average approximation accuracy that objectively assess a rough set model's ability to approximate all information granules are proposed. The first is the relative average approximation accuracy, which is based on all sets in the universe and has several basic properties. The second is the absolute average approximation accuracy, which is based on undefinable sets and yields significant conclusions. We also explore the relationship between these two types of average approximation accuracy. Finally, the average approximation accuracy has practical applications in addressing missing attribute values in incomplete information tables.
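The classical quantity being averaged is Pawlak's accuracy |lower(X)| / |upper(X)|, which indeed changes with the target concept. The sketch below computes it for several toy concepts and takes a naive mean; the paper's relative and absolute average accuracies are defined differently (over all sets in the universe, and over undefinable sets), so this is only an illustration of the underlying idea.

```python
def approx(classes, X):
    """Pawlak lower/upper approximations from a partition into classes."""
    lower = set().union(*([c for c in classes if c <= X] or [set()]))
    upper = set().union(*([c for c in classes if c & X] or [set()]))
    return lower, upper

def accuracy(classes, X):
    """Classical approximation accuracy |lower(X)| / |upper(X)|."""
    lower, upper = approx(classes, X)
    return len(lower) / len(upper) if upper else 1.0

classes = [{1, 2}, {3, 4}, {5}]
targets = [{1, 2, 3}, {5}, {1, 3}]      # definable, exact, and fully rough
accs = [accuracy(classes, t) for t in targets]
print(accs, sum(accs) / len(accs))      # per-concept accuracies and mean
```

Note how the per-concept accuracy swings from 1.0 down to 0.0 across targets, which is exactly the instability that motivates an averaged measure.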
Due to their high resolution and rich texture information, visible light images are widely used for maritime ship detection. However, these images are susceptible to sea fog and to ships of different sizes, which can result in missed detections and false alarms and ultimately lower detection accuracy. To address these issues, a novel multi-granularity feature enhancement network, MFENet, which includes a three-way dehazing module (3WDM) and a multi-granularity feature enhancement module (MFEM), is proposed. The 3WDM eliminates sea fog interference by using an image clarity automatic classification algorithm based on three-way decisions and FFA-Net to obtain clear image samples. Additionally, the MFEM improves the accuracy of detecting ships of different sizes by utilising an improved super-resolution reconstruction convolutional neural network to enhance the resolution and semantic representation capability of the feature maps from YOLOv7. Experimental results demonstrate that MFENet surpasses 15 competing models in terms of mean Average Precision on two benchmark datasets, achieving 96.28% on the McShips dataset and 97.71% on the SeaShips dataset.
Funding: the National Key Research and Development Program (Grant No. 2022YFC3004200); the National Key Research and Development Program of China (Grant No. 2021YFB3900400); the Hunan Provincial Natural Science Foundation of China (Grant No. 2021JC0009); the National Natural Science Foundation of China (Grant No. U2142212).
Funding: preliminary research foundation of national defense.
Funding: the National Natural Science Foundation of China (Grant No. 50275113).
Funding: supported by the National Natural Science Foundation of China (Grant No. 60774029).
Funding: Natural Science Foundation of Jiangsu Province of China (No. BK2006176); High-Tech Key Laboratory of Jiangsu, China (No. BM2007201)
Abstract: In order to avoid the discretization required by classical rough set theory, a generalized rough set theory is proposed. First, the degree of general importance of an attribute and of attribute subsets is presented. Then, depending on the degree of general importance of attributes, the spatial distance is measured with a weighted method. Finally, a generalized rough set theory based on the general near neighborhood relation is proposed. The proposed theory partitions the universe into tolerant modules and forms the lower and upper approximations of a set under the general near neighborhood relationship, which avoids the discretization of Pawlak's rough set theory.
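The idea can be sketched as follows: objects are grouped by a tolerance (near neighborhood) relation on weighted distance rather than by exact attribute equality, and approximations are built from those tolerance classes. The data, weights, and threshold are illustrative assumptions.

```python
# Hedged sketch of a near-neighborhood rough set: tolerance classes come
# from a weighted Euclidean distance threshold, so continuous attributes
# need no discretization. Universe, weights, and eps are toy values.

def tolerance_class(universe, x, weights, eps):
    """All objects within weighted Euclidean distance eps of x."""
    def dist(a, b):
        return sum(w * (u - v) ** 2 for w, u, v in zip(weights, a, b)) ** 0.5
    return {y for y in universe if dist(universe[x], universe[y]) <= eps}

def approximations(universe, target, weights, eps):
    """Lower/upper approximations of target under the tolerance relation."""
    lower, upper = set(), set()
    for x in universe:
        tc = tolerance_class(universe, x, weights, eps)
        if tc <= target:
            lower.add(x)
        if tc & target:
            upper.add(x)
    return lower, upper

U = {0: (0.1, 0.2), 1: (0.15, 0.25), 2: (0.9, 0.8),
     3: (0.85, 0.75), 4: (0.5, 0.5)}
low, up = approximations(U, target={0, 4}, weights=(1.0, 1.0), eps=0.2)
```

Here the target {0, 4} is rough: object 4 is isolated (lower approximation), while objects 0 and 1 are indistinguishable at this threshold, so both fall only in the upper approximation.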
Abstract: A new image recognition method based on fuzzy rough set theory is proposed, and its implementation is discussed. The performance of this method as applied to ferrography image recognition is evaluated. It is shown that the new method gives better results than fuzzy sets or rough sets used alone.
Abstract: An improved method is presented for knowledge reduction in rough set (R-S) theory, where R-S is used to model the information obtained from oil and vibration diagnosis. Typical fault simulation tests of rolling bearings were carried out, and the application of R-S is analysed in this paper. A diagnosis model for the cage (holding rack) fault of a rolling bearing is presented based on the improved reduction method. The method is suited to information fusion when oil analysis and vibration analysis are combined for fault diagnosis.
Abstract: To investigate the problem of judging the optimal dividing matrix among several fuzzy dividing matrices in a fuzzy dividing space, which is in turn determined by the various choices of cluster samples in the totality sample space, two algorithms are proposed on the basis of the data analysis methods of rough set theory: an information system discretization algorithm (algorithm 1) and a sample representativeness judging algorithm (algorithm 2). On the principle of the farthest distance, algorithm 1 transforms continuous data into a discrete form that can be processed by rough set theory. Taking the approximation precision as a criterion, algorithm 2 chooses the sample space with good representativeness. Hence, the clustering sample set for inducing and computing the optimal dividing matrix can be obtained. Several theorems are proposed to provide a strict theoretical foundation for the execution of the algorithm model. An applied example based on the new algorithm model is given, whose result verifies the feasibility of the model.
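A farthest-distance discretization step in the spirit of algorithm 1 can be sketched by cutting a continuous attribute at its widest value gaps. This is one plausible reading of the principle; the bin count and data are illustrative assumptions.

```python
# Hedged sketch of farthest-distance discretization: sort the distinct
# values, place cut points at the (n_bins - 1) widest gaps, and map each
# value to its interval index. Bin count and data are toy choices.

def discretize(values, n_bins):
    """Cut sorted distinct values at the widest gaps; return a bin index per value."""
    order = sorted(set(values))
    gaps = sorted(range(len(order) - 1),
                  key=lambda i: order[i + 1] - order[i],
                  reverse=True)[: n_bins - 1]
    cuts = sorted((order[i] + order[i + 1]) / 2 for i in gaps)
    return [sum(v > c for c in cuts) for v in values]

labels = discretize([0.1, 0.2, 0.15, 5.0, 5.1, 9.9], n_bins=3)
```

The two widest gaps (0.2 to 5.0 and 5.1 to 9.9) become the cut points, so the six values fall into three intervals that a rough set method can then treat as discrete attribute values.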
Funding: Sponsored by the National Social Science Fund (Grant No. 13CFX049), the Shanghai University Young Teacher Training Program (Grant No. hdzf10008), and the Research Fund for East China University of Political Science and Law (Grant No. 11H2K034)
Abstract: In this paper, we propose two intrusion detection methods that combine rough set theory and Fuzzy C-Means for network intrusion detection. The first step is feature selection based on rough set theory; the next phase is clustering with Fuzzy C-Means. Rough set theory is an efficient tool for reducing redundancy, while Fuzzy C-Means allows objects to belong to several clusters simultaneously, with different degrees of membership. To evaluate the performance of the introduced approaches, we apply them to the international Knowledge Discovery and Data mining intrusion detection dataset. In the experiments, we compare the performance of the two rough set theory based hybrid methods for network intrusion detection. The results illustrate that our algorithms are accurate models for handling complex attack patterns in large networks, and that the two methods can increase efficiency and reduce the dataset by looking for overlapping categories.
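The clustering phase can be sketched with a minimal Fuzzy C-Means loop: memberships are updated from inverse distances, so an object can belong to several clusters at once. The fuzzifier m, iteration count, and data are illustrative assumptions, and the rough set feature selection step is omitted here.

```python
# Hedged sketch of Fuzzy C-Means: alternate the membership update
# u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)) with the weighted center
# update. Fuzzifier, iterations, and points are toy choices.

def fcm(points, centers, m=2.0, iters=20):
    """Alternate membership and center updates; return (memberships, centers)."""
    for _ in range(iters):
        u = []
        for p in points:
            d = [max(1e-12, sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5)
                 for c in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1))
                                for j in range(len(centers)))
                      for i in range(len(centers))])
        centers = [
            tuple(sum(u[k][i] ** m * points[k][dim] for k in range(len(points)))
                  / sum(u[k][i] ** m for k in range(len(points)))
                  for dim in range(len(points[0])))
            for i in range(len(centers))
        ]
    return u, centers

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
u, c = fcm(pts, centers=[(0.5, 0.5), (4.5, 4.5)])
```

Each row of u sums to 1; the soft memberships are what let the intrusion detection methods look for overlapping categories rather than forcing every connection record into a single cluster.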
Abstract: The assessment model of CPA audit independence risk based on rough set theory is a risk assessment method that uses the Analytic Hierarchy Process, and it aims to solve the bid management problem in the process of Financial Statement Insurance. First, according to the general situation of the accounting firm, experts grade the risk elements to establish a decision-making table. Second, a judgment matrix is constructed using the attribute dependence degree of the Variable Precision Rough Set to obtain the relative importance, and further the overall importance, of all risk elements. Finally, the overall assessment model of audit independence risk is established.
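The step from a judgment matrix to element weights can be sketched with the geometric-mean approximation of the AHP principal eigenvector. The 3x3 pairwise comparison values are illustrative assumptions, not the paper's VPRS-derived dependence degrees.

```python
# Hedged sketch: derive risk-element weights from an AHP-style reciprocal
# judgment matrix via the geometric-mean method. The matrix entries are
# invented for illustration.

from math import prod

def ahp_weights(judgment):
    """Geometric-mean weights of a reciprocal pairwise comparison matrix."""
    n = len(judgment)
    gm = [prod(row) ** (1.0 / n) for row in judgment]
    total = sum(gm)
    return [g / total for g in gm]

# entry [i][j] states how much more important element i is than element j
J = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(J)
```

The resulting weights rank the risk elements consistently with the pairwise judgments, which is what the overall assessment model aggregates.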
Abstract: The philosophy of rough sets hinges on the granularity of data, which is used to build all its basic concepts, such as approximations, dependencies, and reduction. Genetic Algorithms provide a general framework for optimizing solutions to complex systems without depending on the problem domain, and they are robust to many kinds of problems. This paper combines Genetic Algorithms and rough set theory to compute the granularity of knowledge, illustrated through an example information table. The combination enables the granularity of knowledge to be computed effectively and is also useful for automatic computing and information processing.
Abstract: Rough set theory, proposed by Pawlak in 1982, is a tool for dealing with the uncertainty and vagueness aspects of knowledge models. The main idea of rough sets corresponds to the lower and upper approximations based on equivalence relations. This paper studies the rough set and its extensions. We present a linear algebra approach to the rough set and its extension, give an equivalent definition of the lower and upper approximations of a rough set based on the characteristic functions of sets, and then explain the lower and upper approximations as the colinear map and linear map of sets, respectively. Finally, we define rough sets over fuzzy lattices, which cover both the rough set and the fuzzy rough set, and construct independent axiomatic systems to characterize the lower and upper approximations of rough sets over fuzzy lattices, respectively, based on inner and outer products. The axiomatic systems unify the axiomatization of Pawlak's rough sets and fuzzy rough sets.
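The classical construction the paper generalizes can be sketched directly: the lower approximation collects equivalence classes contained in the target (the characteristic function attains its minimum 1 on the whole class), the upper approximation collects classes that merely meet it (the maximum is 1). The partition and target set are toy data.

```python
# Hedged sketch of Pawlak's approximations via the characteristic-function
# view: block <= target means min of chi over the block is 1 (lower),
# block & target means max of chi over the block is 1 (upper).

def pawlak_approximations(partition, target):
    """Return (lower, upper) approximations of target w.r.t. a partition."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:      # entire equivalence class inside target
            lower |= block
        if block & target:       # equivalence class meets target
            upper |= block
    return lower, upper

blocks = [{1, 2}, {3, 4}, {5}]
low, up = pawlak_approximations(blocks, target={1, 2, 3})
```

The boundary region up - low = {3, 4} is exactly the set of objects the equivalence relation cannot classify, which is what the vagueness of the target concept amounts to in this framework.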
Abstract: This article focuses on the relationship between mathematical morphology operations and rough sets, mainly in the context of image retrieval and the basic image correspondence problem. Mathematical morphological procedures and set approximations in rough set theory have some clear parallels, and numerous initiatives have been made to connect the two fields, with many significant publications written in this area. Some attempt to show a direct connection between mathematical morphology and rough sets through relations, a pair of dual operations, and neighborhood systems; others use rough sets to suggest a strategy for approximating mathematical morphology within the general paradigm of soft computing. Here, a single framework incorporating the key ideas of both rough sets and mathematical morphology is defined using a different technique. This paper examines rough set theory from the viewpoint of mathematical morphology to derive rough forms of the morphological structures of dilation, erosion, opening, and closing. These newly defined structures are applied to develop an algorithm for the differential analysis of chest X-ray images from a COVID-19 patient with acute pneumonia and a healthy subject. The algorithm and the rough morphological operations show promise for delineating lung occlusion in COVID-19 patients from chest X-rays. The article first covers the foundations of mathematical morphology, then considers the ideas of rough set theory and examines their connections, and finally provides a suggested image retrieval application of the concepts from the two fields.
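The parallel the article builds on can be sketched on a binary image represented as a set of pixel coordinates: erosion behaves like a lower approximation (keep only pixels whose whole neighborhood fits inside the foreground) and dilation like an upper approximation. The image and structuring element here are toy data, not the paper's rough operators.

```python
# Hedged sketch of the morphology / rough set parallel on a set-valued
# binary image: erosion ~ lower approximation, dilation ~ upper
# approximation. The 3x3 square and structuring element are toy data.

def dilate(image, se):
    """Minkowski addition: union of the image translated by each SE offset."""
    return {(x + dx, y + dy) for (x, y) in image for (dx, dy) in se}

def erode(image, se):
    """Pixels whose whole SE-translated neighborhood lies inside the image."""
    return {(x, y) for (x, y) in image
            if all((x + dx, y + dy) in image for (dx, dy) in se)}

se = {(0, 0), (1, 0), (0, 1)}                       # small structuring element
img = {(x, y) for x in range(3) for y in range(3)}  # 3x3 foreground square
opened = dilate(erode(img, se), se)                 # morphological opening
```

As with rough approximations, erosion is contained in the image and the image in its dilation, and opening is anti-extensive: the corner pixel (2, 2) that the structuring element cannot cover from inside is removed.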
Funding: Anhui Province Natural Science Research Project of Colleges and Universities (2023AH040321); Excellent Scientific Research and Innovation Team of Anhui Colleges (2022AH010098)
Abstract: The presence of numerous uncertainties in hybrid decision information systems (HDISs) renders attribute reduction a formidable task. Currently available attribute reduction algorithms, including those based on Pawlak attribute importance, the Skowron discernibility matrix, and information entropy, struggle to manage multiple uncertainties in HDISs simultaneously, such as the precise measurement of disparities between nominal attribute values, and attributes with fuzzy boundaries and abnormal values. To address these issues, this paper studies attribute reduction within HDISs. First, a novel metric based on the decision attribute, christened the supervised distance, is introduced to measure the differences between nominal attribute values accurately. Then, based on the new metric, a novel fuzzy relationship is defined from the perspective of "feedback on parity of attribute values to attribute sets"; this relationship serves as a valuable tool for addressing the challenges posed by abnormal attribute values. Furthermore, leveraging the new fuzzy relationship, the fuzzy conditional information entropy is defined to handle fuzzy attributes: it quantifies the uncertainty associated with fuzzy attribute values, thereby providing a robust framework for handling fuzzy information in hybrid information systems. Finally, an algorithm for attribute reduction utilizing the fuzzy conditional information entropy is presented. Experimental results on 12 datasets show that the average reduction rate of our algorithm reaches 84.04%, and that classification accuracy improves by 3.91% compared with the original dataset and by an average of 11.25% compared with 9 other state-of-the-art reduction algorithms. The comprehensive analysis of these results clearly indicates that our algorithm is highly effective in managing the intricate uncertainties inherent in hybrid data.
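The core criterion behind entropy-based reduction can be sketched in a heavily simplified crisp form: a subset B of attributes is a candidate reduct when the conditional entropy H(D | B) equals that of the full attribute set. This sketch uses crisp equivalence classes in place of the paper's fuzzy relationship and supervised distance; the table is toy data.

```python
# Hedged, simplified sketch: crisp conditional entropy H(D | B) stands in
# for the paper's fuzzy conditional information entropy. A subset whose
# H(D | B) matches the full set's preserves the classification. Toy data.

from collections import Counter
from math import log2

def cond_entropy(rows, attrs, decision):
    """H(decision | attrs) over a list of dict rows."""
    n = len(rows)
    groups = Counter(tuple(r[a] for a in attrs) for r in rows)
    h = 0.0
    for key, size in groups.items():
        dist = Counter(r[decision] for r in rows
                       if tuple(r[a] for a in attrs) == key)
        h -= (size / n) * sum((c / size) * log2(c / size)
                              for c in dist.values())
    return h

rows = [
    {"a": 0, "b": 0, "c": 1, "d": "no"},
    {"a": 0, "b": 1, "c": 1, "d": "yes"},
    {"a": 1, "b": 0, "c": 0, "d": "no"},
    {"a": 1, "b": 1, "c": 0, "d": "yes"},
]
full = cond_entropy(rows, ["a", "b", "c"], "d")
reduct = cond_entropy(rows, ["b"], "d")
```

Here the single attribute b already determines d, so H(d | {b}) equals H(d | {a, b, c}) and {a, c} are redundant; the paper's contribution is making this criterion robust to fuzzy, nominal, and abnormal values.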
Funding: National Natural Science Foundation of China, Grant/Award Number: 61976254; Natural Science Foundation of Fujian Province, Grant/Award Numbers: 2020J01707, 2020J01710
Abstract: Rough set theory places great importance on approximation accuracy, which gauges how well a rough set model describes a target concept. However, traditional approximation accuracy has limitations: it varies with changes in the target concept and cannot evaluate the overall descriptive ability of a rough set model. To overcome this, two types of average approximation accuracy that objectively assess a rough set model's ability to approximate all information granules are proposed. The first is the relative average approximation accuracy, which is based on all sets in the universe and has several basic properties. The second is the absolute average approximation accuracy, which is based on undefinable sets and has yielded significant conclusions. We also explore the relationship between these two types of average approximation accuracy. Finally, the average approximation accuracy is applied to the practical problem of handling missing attribute values in incomplete information tables.
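The relative variant can be sketched on a tiny universe: Pawlak's accuracy |lower| / |upper| is averaged over every non-empty subset, so the score characterizes the partition as a whole rather than one target concept. The universe and partitions are toy data, and this brute-force enumeration is only feasible for small universes.

```python
# Hedged sketch of a relative average approximation accuracy: average
# |lower| / |upper| over all non-empty subsets of a tiny universe.
# Universe and partitions are toy data; enumeration is exponential.

from itertools import chain, combinations

def accuracy(partition, target):
    """Pawlak accuracy |lower| / |upper| of one target set."""
    lower = set().union(*(b for b in partition if b <= target))
    upper = set().union(*(b for b in partition if b & target))
    return len(lower) / len(upper)

def average_accuracy(universe, partition):
    """Mean accuracy over every non-empty subset of the universe."""
    subsets = chain.from_iterable(combinations(universe, r)
                                  for r in range(1, len(universe) + 1))
    accs = [accuracy(partition, set(s)) for s in subsets]
    return sum(accs) / len(accs)

U = {1, 2, 3, 4}
finer = [{1}, {2}, {3}, {4}]      # every subset is definable
coarser = [{1, 2}, {3, 4}]        # many subsets are rough
```

The finest partition scores 1.0 because every concept is definable, while the coarser one scores strictly less: exactly the model-level comparison that a single-concept accuracy cannot make.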
Funding: National Key Research and Development Program of China, Grant/Award Number: 2022YFB3104700; National Natural Science Foundation of China, Grant/Award Numbers: 62376198, 61906137, 62076040, 62076182, 62163016, 62006172; the China National Scientific Sea-floor Observatory; the Natural Science Foundation of Shanghai, Grant/Award Number: 22ZR1466700; the Jiangxi Provincial Natural Science Fund, Grant/Award Number: 20212ACB202001
Abstract: Due to their high resolution and rich texture information, visible light images are widely used for maritime ship detection. However, these images are susceptible to sea fog and to ships of different sizes, which can cause missed detections and false alarms and ultimately lower detection accuracy. To address these issues, a novel multi-granularity feature enhancement network, MFENet, is proposed, comprising a three-way dehazing module (3WDM) and a multi-granularity feature enhancement module (MFEM). The 3WDM eliminates sea fog interference by using an image clarity automatic classification algorithm based on three-way decisions and FFA-Net to obtain clear image samples. The MFEM improves the accuracy of detecting ships of different sizes by utilising an improved super-resolution reconstruction convolutional neural network to enhance the resolution and semantic representation capability of the feature maps from YOLOv7. Experimental results demonstrate that MFENet surpasses 15 competing models in terms of mean Average Precision on two benchmark datasets, achieving 96.28% on the McShips dataset and 97.71% on the SeaShips dataset.
Funding: Supported by the National Natural Science Foundation of China (61202354, 51507084) and the Nanjing University of Posts and Telecommunications Science Foundation (NUPTSF) (NT214203)