Funding: This work was supported by the Science and Technology Innovation 2030 Key Project of "New Generation Artificial Intelligence" of China under Grant 2018AAA0102303, the Natural Science Foundation for Distinguished Young Scholars of Jiangsu Province (No. BK20190030), and the National Natural Science Foundation of China (Nos. 61631020, 61871398, 61931011, and U20B2038).
Abstract: This paper investigates the problem of data scarcity in spectrum prediction. A cognitive radio device may frequently switch its target frequency as the electromagnetic environment changes. A previously trained prediction model often cannot maintain good performance when only a small amount of historical data is available for the new target frequency. Moreover, cognitive radio equipment usually implements dynamic spectrum access in real time, which means the time available to recollect data for the new frequency band and retrain the model is very limited. To address these issues, we develop a cross-band data augmentation framework for spectrum prediction that leverages recent advances in generative adversarial networks (GANs) and deep transfer learning. First, through similarity measurement, we pre-train a GAN model using historical data from the frequency band most similar to the target band. Then, by feeding the small amount of target data into the pre-trained GAN for data augmentation, a temporal-spectral residual network is further trained using deep transfer learning together with the highly similar data generated by the GAN. Finally, experimental results demonstrate the effectiveness of the proposed framework.
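The abstract does not specify which similarity measurement is used to select the source band; a minimal sketch of the selection step, assuming a simple Pearson-correlation criterion over the overlapping history (band names and data here are illustrative, not from the paper):

```python
import numpy as np

def most_similar_band(target_history, candidate_histories):
    """Return the name of the candidate band whose historical sequence
    correlates best with the (short) target history."""
    best_name, best_score = None, -np.inf
    for name, series in candidate_histories.items():
        # Compare only over the span covered by the scarce target data.
        n = min(len(target_history), len(series))
        score = np.corrcoef(target_history[:n], series[:n])[0, 1]
        if score > best_score:
            best_name, best_score = name, score
    return best_name

rng = np.random.default_rng(0)
t = np.linspace(0, 6, 200)
bands = {"band_A": np.sin(t), "band_B": np.cos(t)}
# Scarce target data resembling band_A plus measurement noise.
target = np.sin(t[:50]) + 0.05 * rng.standard_normal(50)
print(most_similar_band(target, bands))  # -> band_A
```

The selected band's full history would then pre-train the GAN, whose synthetic samples augment the target data before fine-tuning the temporal-spectral residual network.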
Funding: Supported by the National Natural Science Foundation of China (60173058, 70372024).
Abstract: With the explosive growth of available data, there is an urgent need for continuous data mining that markedly reduces manual interaction. A novel model for data mining in an evolving environment is proposed. First, valid mining task schedules are generated; then autonomous and local mining are executed periodically; finally, previous results are merged and refined. The framework based on this model creates a communication mechanism to incorporate domain knowledge into the continuous process through an ontology service. The ontology makes the local and merge mining transparent to the end user and to heterogeneous data sources. Experiments suggest that the framework is useful in guiding the continuous mining process.
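The periodic local-mining and merge-and-refine steps can be sketched as follows; this is a toy illustration using frequent item pairs (the actual mining tasks and merge logic in the paper are not specified):

```python
from collections import Counter
from itertools import combinations

def local_mining(transactions, min_support=2):
    """One periodic local pass: count co-occurring item pairs and keep
    those meeting the minimum support threshold."""
    counts = Counter()
    for tx in transactions:
        for pair in combinations(sorted(set(tx)), 2):
            counts[pair] += 1
    return Counter({p: c for p, c in counts.items() if c >= min_support})

def merge_results(previous, new):
    """Merge-and-refine step: combine counts from successive periods."""
    return previous + new

period1 = local_mining([["a", "b"], ["a", "b", "c"], ["b", "c"]])
period2 = local_mining([["a", "b"], ["a", "c"], ["a", "b", "c"]])
merged = merge_results(period1, period2)
print(merged[("a", "b")])  # -> 4
```

Each periodic run stays local and cheap; only the merge step touches accumulated results, which is what keeps the process continuous rather than a one-off batch job.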
Abstract: The tremendous growth of cloud computing environments requires new architectures for security services. Cloud computing is the utilization of many servers/data centers or cloud data storages (CDSs) housed in many different locations and interconnected by high-speed networks. CDS, like any other emerging technology, is experiencing growing pains: it is immature, it is fragmented, and it lacks standardization. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and security mechanisms are needed to ensure its secure adoption. In this paper, a comprehensive security framework based on a Multi-Agent System (MAS) architecture for CDS is proposed to facilitate confidentiality, correctness assurance, availability, and integrity of users' data in the cloud. The security framework consists of two main layers: an agent layer and a CDS layer. The proposed MAS architecture includes five main types of agents: Cloud Service Provider Agent (CSPA), Cloud Data Confidentiality Agent (CDConA), Cloud Data Correctness Agent (CDCorA), Cloud Data Availability Agent (CDAA), and Cloud Data Integrity Agent (CDIA). To verify the proposed security framework, a pilot study is conducted using a questionnaire survey, and the Rasch methodology is used to analyze the pilot data. Item reliability is found to be poor, and a few respondents and items are identified as misfits with distorted measurements. As a result, some problematic questions are revised and some predictably easy questions are excluded from the questionnaire. A prototype of the system is implemented in Java. To simulate the agents, Oracle database packages and triggers are used to implement agent functions, and Oracle jobs are utilized to create agents.
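As a hedged illustration of what one of the listed agents might do, here is a minimal sketch of the integrity-checking role attributed to the CDIA; the class name mirrors the abstract, but the digest-based mechanism is an assumption, not the paper's actual implementation (which uses Oracle packages and Java):

```python
import hashlib

class CloudDataIntegrityAgent:
    """Hypothetical sketch of the CDIA role: record a digest when data
    is stored, and verify stored data against it on retrieval."""

    def __init__(self):
        self.digests = {}

    def record(self, block_id, data):
        # Remember the SHA-256 digest of the data as stored.
        self.digests[block_id] = hashlib.sha256(data).hexdigest()

    def verify(self, block_id, data):
        # Integrity holds iff the current digest matches the recorded one.
        return self.digests.get(block_id) == hashlib.sha256(data).hexdigest()

cdia = CloudDataIntegrityAgent()
cdia.record("blk1", b"user record v1")
print(cdia.verify("blk1", b"user record v1"))  # -> True
print(cdia.verify("blk1", b"tampered record"))  # -> False
```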
Funding: Supported by the Philosophy and Social Science Research Project of Daqing City (Grant No. DSGB2017112) and the Postgraduate Innovation Research Project of Heilongjiang Bayi Agricultural University (Grant No. YJSCX2017Y79).
Abstract: Cloud accounting builds on traditional financial work processes in the context of big data and represents an inevitable trend in the future development of corporate accounting. Its emergence and rapid development will have a fundamental impact on corporate environmental information disclosure. In the big data era of information sharing, companies will gain a new understanding of the emergence, balancing, and final consideration of social responsibility, and will make new changes to their overall decision-making and information disclosure methods. "Knowing" and "doing" will be combined on the basis of rational judgment, so that corporate environmental information disclosure better meets the requirements of overall social development. Against the background of big data, this article starts from the influencing factors, footholds, and path choices of disclosure, describes the evolution of corporate environmental information disclosure, and provides reference suggestions for enterprises to disclose environmental information truthfully and fulfill their social responsibilities.
Funding: Taif University Researchers Supporting Project Number (TURSP-2020/98), Taif University, Taif, Saudi Arabia.
Abstract: The Internet of Medical Things (IoMT) comprises online devices that sense and transmit medical data from users to physicians within a given time interval. In recent years, IoMT has grown rapidly in the medical field to provide healthcare services without requiring physical presence. Using sensors, IoMT applications support healthcare management. In such applications, one of the most important factors is data security, given that transmission over the network is exposed to intrusion. For data security in IoMT systems, blockchain is used because its chained blocks provide secure data storage. In this study, a Blockchain-assisted Secure Data Management Framework (BSDMF) and a Proof of Activity (PoA) protocol with a malicious code detection algorithm are used in the proposed data security scheme for the healthcare system. The main aim is to enhance data security over the network. Compared with approaches in the literature, the PoA protocol provides higher data security: by replacing malicious nodes in the block, it can strongly protect medical data in the blockchain. Comparison with existing systems shows that the proposed simulation with the BSDMF malicious code detection algorithm achieves a higher accuracy ratio, precision ratio, security, and efficiency, and a lower response time for blockchain-enabled healthcare systems.
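The "replace the malicious node from the block" step can be sketched as follows. This is a toy illustration: the signature set, payload format, and quarantine behavior are all assumptions for demonstration, not details taken from the BSDMF paper:

```python
# Toy malicious-code signatures (illustrative assumption).
SIGNATURES = ("eval(", "exec(", "rm -rf")

def is_malicious(payload):
    """Flag a payload containing any known signature."""
    return any(sig in payload for sig in SIGNATURES)

def replace_malicious_nodes(chain):
    """Sketch of the BSDMF idea: block entries whose payload fails the
    malicious-code check are replaced (quarantined) before acceptance."""
    clean = []
    for block in chain:
        if is_malicious(block["payload"]):
            block = {**block, "payload": "<quarantined>", "flagged": True}
        clean.append(block)
    return clean

chain = [{"id": 1, "payload": "temp=37.2"},
         {"id": 2, "payload": "eval(download())"}]
cleaned = replace_malicious_nodes(chain)
print(cleaned[1]["payload"])  # -> <quarantined>
```

In a real deployment this check would sit inside the PoA consensus path, so a flagged node never contributes a block that reaches the ledger.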
Abstract: Point of Care (PoC) devices and systems can be categorized into three broad classes (CAT 1, CAT 2, and CAT 3) based on the context of operation and usage. In this paper, the categories are defined to address certain usage models of PoC devices. PoC devices used for testing and diagnostic applications are defined as CAT 1 devices; PoC devices used for patient monitoring are defined as CAT 2 devices (PoCM); and PoC devices used for interfacing with other devices are defined as CAT 3 devices (PoCI). PoCI devices provide an interface gateway for collecting and aggregating data from other medical devices. In all categories, data security is an important aspect. This paper presents a security framework concept applicable to all classes of PoC operation. It outlines the concepts and security framework for preventing unauthorized access to data, unintended data flow, and data tampering during communication between system entities, the user, and the PoC system. The security framework includes secure layering of the basic PoC system architecture and protection of PoC devices in the context of application and network. The security framework is developed taking into account a threat model of the PoC system. A proposal for a low-level protocol is discussed; this protocol is independent of communication technologies and is elaborated with respect to providing security. An algorithm that can be used to overcome the threat challenges is shown using the elements of the protocol. The paper further discusses the vulnerability scanning process for the PoC system's interconnected network. The paper also presents a four-step authentication and authorization framework for securing the PoC system. Finally, the paper concludes with the machine-to-machine (M2M) security viewpoint and discusses the key stakeholders within an actual deployment of the PoC system and its security challenges.
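The four-step authentication and authorization flow suggested by the outline could look like the following sketch; the step breakdown (identification, credential check, permission check, session grant) and all names here are illustrative assumptions, since the abstract does not spell out the steps:

```python
def authenticate_and_authorize(request, registry, acl):
    """Hypothetical four-step gate for a PoC device request."""
    device = registry.get(request["device_id"])  # step 1: identification
    if device is None:
        return None
    if device["secret"] != request["secret"]:    # step 2: authentication
        return None
    # step 3: authorization against the device's role
    if request["action"] not in acl.get(device["role"], ()):
        return None
    # step 4: grant a scoped session token
    return {"token": "sess-" + request["device_id"],
            "scope": request["action"]}

registry = {"poc-01": {"secret": "s3cret", "role": "monitor"}}
acl = {"monitor": {"read_vitals"}}
grant = authenticate_and_authorize(
    {"device_id": "poc-01", "secret": "s3cret", "action": "read_vitals"},
    registry, acl)
print(grant["scope"])  # -> read_vitals
```

Failing any step returns nothing rather than a partial grant, which matches the defense-in-depth intent of layering the PoC architecture.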
Abstract: Digital educational content is gaining importance as an incubator of pedagogical methodologies in formal and informal online educational settings. Its educational efficiency depends directly on its quality; however, educational content is more than information and data. This paper presents a new data quality framework for assessing digital educational content used for teaching in distance learning environments. The model relies on the ISO/IEC 25000 series quality standard and, besides providing the mechanisms for multi-faceted quality assessment, it also supports organizations that design, create, manage, and use educational content with quality tools (expressed as quality metrics and measurement methods) for a more efficient distance education experience. The model describes the quality characteristics of the educational material using data and software quality characteristics.
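A quality framework of this kind typically aggregates per-characteristic metric values into an overall score. As a minimal sketch, assuming a weighted average over characteristics scored in [0, 1] (the characteristic names and weights below are illustrative, not taken from the standard or the paper):

```python
def quality_score(metrics, weights):
    """Aggregate per-characteristic metric values (each in [0, 1])
    into a single weighted quality score."""
    total = sum(weights.values())
    return sum(metrics[k] * w for k, w in weights.items()) / total

# Illustrative characteristics for a piece of educational content.
metrics = {"accuracy": 0.9, "completeness": 0.8, "currentness": 0.6}
weights = {"accuracy": 0.5, "completeness": 0.3, "currentness": 0.2}
print(round(quality_score(metrics, weights), 2))  # -> 0.81
```

Separating the measurement methods (how each metric value is obtained) from the aggregation (how values combine) is what lets the same framework serve both content creators and content users.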
Abstract: Antarctic data management is a research focus that international Antarctic organizations, e.g. the Antarctic Treaty Consultative Meeting (ATCM), the Scientific Committee on Antarctic Research (SCAR), and the Council of Managers of National Antarctic Programmes (COMNAP), have been paying close attention to and promoting actively. Through the joint effort of international Antarctic organizations and the member countries concerned in recent years, the Antarctic Data Directory System (ADDS) has been established as the most important basic programme for the development of the international Antarctic data management system. At present, the Joint Committee on Antarctic Data Management (JCADM) is responsible for organizing and coordinating international Antarctic data management and implementing the ADDS project. This paper introduces the chronological background of Antarctic data management and the structure of the international framework, and argues that developing the ADDS is the first priority. The ADDS mainly consists of two principal parts: the National Antarctic Data Centers (NADCs) of all party members and the Antarctic Main Directory (AMD). The best available technology for creating the ADDS is to make full use of the International Directory Network (IDN) and adopt its Directory Interchange Format (DIF). In light of these requirements, combined with China's specific situation, the contents and the technical and administrative methods of Chinese Antarctic data management are discussed to promote the related work.
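To make the directory-exchange idea concrete, here is a sketch of a DIF-like metadata record as an NADC might submit to the Antarctic Main Directory. The field names loosely follow the general shape of the Directory Interchange Format, but both the field set and the serialization below are illustrative assumptions, not the actual DIF specification:

```python
def to_dif_text(record):
    """Serialize a metadata record as simple 'Key = value' lines
    (a simplification of real DIF syntax, for illustration only)."""
    return "\n".join("{} = {}".format(k, v) for k, v in record.items())

# Hypothetical dataset entry for exchange with the Antarctic Main Directory.
dif_record = {
    "Entry_ID": "EXAMPLE_SEAICE_001",
    "Entry_Title": "Example sea-ice observation dataset",
    "Data_Center": "Example National Antarctic Data Center",
    "Summary": "Illustrative record in a DIF-like layout.",
}
print(to_dif_text(dif_record).splitlines()[0])  # -> Entry_ID = EXAMPLE_SEAICE_001
```

Standardizing on one interchange format is what lets independent national data centers feed a single searchable main directory.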
Funding: This project was supported by the National High-Tech Research and Development Plan (863-804-3).
Abstract: An idea is presented for the development of a data processing and analysis system for ICF experiments, based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework based on the ROOT system have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.