In the classical block Hill cipher, the ciphertext is obtained by multiplying the blocks of the plaintext with the key matrix. To strengthen the key matrix, a double guard Hill cipher was proposed with two key matrices, a private key matrix and its modified key matrix, along with permutation. In this paper a novel modification is performed to the double guard Hill cipher in order to reduce the number of calculations needed to obtain the ciphertext by using non-square matrices. This modified double guard Hill cipher uses a non-square matrix of order (p × q) as its private key matrix.
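The core block operation can be sketched in a few lines of Python. This is an illustrative toy only (the function name, the 'X' padding choice, and the example keys are assumptions, not from the paper): each q-letter block is multiplied by a p × q key matrix modulo 26, so a non-square key changes the block length between plaintext and ciphertext.

```python
def hill_encrypt(plaintext, key):
    """Hill-style block encryption. `key` is a list of p rows of q
    integers mod 26; each q-letter block yields p ciphertext letters."""
    p, q = len(key), len(key[0])
    nums = [ord(c) - ord('A') for c in plaintext.upper()]
    while len(nums) % q:
        nums.append(23)  # pad a short final block with 'X' (an assumption)
    cipher = []
    for i in range(0, len(nums), q):
        block = nums[i:i + q]
        for row in key:
            cipher.append(sum(k * b for k, b in zip(row, block)) % 26)
    return ''.join(chr(n + ord('A')) for n in cipher)
```

With the classic 3 × 3 key [[6, 24, 1], [13, 16, 10], [20, 17, 15]], the block 'ACT' encrypts to 'POH'; with a non-square key the same loop applies, but decryption then needs extra structure (e.g. a pseudo-inverse), which is the part the paper's construction must supply.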
This paper was motivated by the existing problems of cloud data storage at Imo State University, Nigeria, such as outsourced data causing the loss of data and the misuse of customer information by unauthorized users or hackers, thereby leaving customer/client data visible and unprotected. This also exposed clients/customers to enormous risk due to defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper, therefore, is to analyze a secure model using Unicode Transformation Format (UTF) Base64 algorithms for storing data securely in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance the security of the information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. This paper also discusses the following concepts: the development of cloud computing, characteristics of cloud computing, cloud deployment models, cloud service models, etc. The results showed that the proposed enhanced security model for corporate-platform information systems handles the multiple-authorization and authentication menace: only one login page directs all login requests of the different modules to a Single Sign-On Server (SSOS), which in turn redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical location validation. The newly developed system resolves the shortcomings of the existing systems and reduces the time and resources incurred while using them.
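The UTF-8/Base64 transform at the heart of the storage path can be sketched as below. Note that Base64 is an encoding, not encryption on its own, so in the described model it would sit behind the RBAC and MFA layers; the function names here are illustrative, not from the paper.

```python
import base64

def encode_for_storage(text: str) -> str:
    """UTF-8 encode the text, then Base64-wrap it for cloud storage."""
    return base64.b64encode(text.encode('utf-8')).decode('ascii')

def decode_from_storage(blob: str) -> str:
    """Reverse the Base64 wrapping on retrieval."""
    return base64.b64decode(blob).decode('utf-8')
```

The transform is fully reversible by anyone who holds the blob, which is exactly why the access-control and authentication layers described above remain essential.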
To address the problems of single encryption algorithms, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. Firstly, in order to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms, and use the ZooKeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data, and adopt a homomorphic encryption algorithm to encrypt data that needs to be calculated. To accelerate the encryption, we adopt a dual-thread encryption mode. Finally, the HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrates the dual-channel storage mode, and the encryption storage efficiency improves by 27.6% on average.
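The dual-thread encryption mode can be sketched as follows. This is a minimal illustration only: a toy XOR cipher stands in for the paper's improved ECC and homomorphic ciphers, and the split-the-buffer-in-half strategy is an assumption.

```python
import threading

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for the real ECC / homomorphic ciphers.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def dual_thread_encrypt(data: bytes, key: bytes) -> bytes:
    """Split the buffer in two and encrypt both halves concurrently."""
    mid = len(data) // 2
    halves = [data[:mid], data[mid:]]
    out = [b'', b'']

    def worker(i: int) -> None:
        out[i] = xor_cipher(halves[i], key)

    threads = [threading.Thread(target=worker, args=(i,)) for i in (0, 1)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return out[0] + out[1]
```

Because XOR is its own inverse, applying the function twice with the same key restores the plaintext, which makes the sketch easy to sanity-check even though a real cipher would replace it.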
Big data resources are characterized by large scale, wide sources, and strong dynamics. Existing access control mechanisms based on manual policy formulation by security experts suffer from drawbacks such as low policy management efficiency and difficulty in accurately describing the access control policy. To overcome these problems, this paper proposes a big data access control mechanism based on a two-layer permission decision structure. This mechanism extends the attribute-based access control (ABAC) model. Business attributes are introduced in the ABAC model as business constraints between entities. The proposed mechanism implements a two-layer permission decision structure composed of the inherent attributes of access control entities and the business attributes, which constitute the general permission decision algorithm based on logical calculation and the business permission decision algorithm based on a bi-directional long short-term memory (BiLSTM) neural network, respectively. The general permission decision algorithm is used to implement accurate policy decisions, while the business permission decision algorithm implements fuzzy decisions based on the business constraints. The BiLSTM neural network is used to calculate the similarity of the business attributes to realize intelligent, adaptive, and efficient access control permission decisions. Through the two-layer permission decision structure, the complex and diverse big data access control management requirements can be satisfied by considering the security and availability of resources. Experimental results show that the proposed mechanism is effective and reliable. In summary, it can efficiently support the secure sharing of big data resources.
The use of the Internet of Things (IoT) is expanding at an unprecedented scale in many critical applications due to the ability to interconnect and utilize a wide range of devices. In critical infrastructure domains like oil and gas supply, intelligent transportation, power grids, and autonomous agriculture, it is essential to guarantee the confidentiality, integrity, and authenticity of data collected and exchanged. However, the limited resources coupled with the heterogeneity of IoT devices make it inefficient or sometimes infeasible to achieve secure data transmission using traditional cryptographic techniques. Consequently, designing a lightweight secure data transmission scheme is becoming essential. In this article, we propose a lightweight secure data transmission (LSDT) scheme for IoT environments. LSDT consists of three phases and utilizes an effective combination of symmetric keys and the Elliptic Curve Menezes-Qu-Vanstone asymmetric key agreement protocol. We design the simulation environment and experiments to evaluate the performance of the LSDT scheme in terms of communication and computation costs. Security and performance analysis indicates that the LSDT scheme is secure, suitable for IoT applications, and performs better in comparison to other related security schemes.
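The agree-then-encrypt pattern behind such hybrid schemes can be sketched with a classic Diffie-Hellman stand-in (ECMQV itself requires elliptic-curve point arithmetic and combines static and ephemeral keys, which this toy omits): both sides derive the same symmetric key, which then protects the bulk data. The parameters and function names below are illustrative assumptions.

```python
import hashlib

# Small demo modulus/generator -- NOT secure, illustration only.
P, G = 0xFFFFFFFB, 5

def keypair(secret: int):
    """Return (private, public) for a toy discrete-log key agreement."""
    return secret, pow(G, secret, P)

def shared_key(my_secret: int, their_public: int) -> bytes:
    """Both parties compute the same value, hashed into a symmetric key."""
    s = pow(their_public, my_secret, P)
    return hashlib.sha256(str(s).encode()).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Lightweight symmetric stand-in for the bulk-data phase.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
```

The point of the pattern is that the expensive asymmetric step runs once per session, after which cheap symmetric operations carry the sensor traffic, which is what makes the combination attractive for constrained devices.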
With recent technological developments, massive vehicular ad hoc networks (VANETs) have been established, enabling numerous vehicles and their respective Road Side Unit (RSU) components to communicate with one another. The best way to enhance traffic flow for vehicles and traffic management departments is to share the data they receive. However, VANET systems need better protection. An effective and safe method of outsourcing is suggested, which reduces computation costs by achieving data security using a homomorphic mapping based on the conjugate operation of matrices. This research proposes a VANET-based data outsourcing system to fix these issues. To keep data outsourcing secure, the suggested model takes cryptographic models into account. Fog nodes keep the generated keys for the purpose of vehicle authentication. For controlling and overseeing the outsourced data while preserving privacy, the suggested approach considers a Trusted Certified Auditor (TCA). Using the secret key, the TCA can identify the genuine identity of vehicles when harmful messages are detected. The proposed model develops a TCA-based unique static vehicle labeling system using cryptography (TCA-USVLC) for secure data outsourcing and privacy preservation in VANETs. The proposed model calculates the trust of vehicles in 16 ms for an average of 180 vehicles and achieves 98.6% accuracy for data encryption to provide security. It achieved 98.5% accuracy in data outsourcing and 98.6% accuracy in privacy preservation in fog-enabled VANETs. Elliptic curve cryptography models can be applied in the future for better encryption and decryption rates with lightweight cryptographic operations.
Medical institution data compliance is an exogenous product of the digital society, serving as a crucial means to maintain and balance the relationship between data protection and data sharing, as well as individual interests and public interests. Its practical significance greatly benefits the implementation of the Healthy China Initiative. In practice, data from medical institutions takes varied forms, including personally identifiable data collected before diagnosis and treatment, clinical medical data generated during diagnosis and treatment, medical data collected in public health management, and potential medical data generated in daily life. In the new journey of comprehensively promoting the Chinese path to modernization, it is necessary to clarify the shift from an individual-oriented to a societal-oriented value system, highlighting the reinforcing role of the trust concept. Guided by the principle of minimizing data utilization, the focus is on the new developments and changes in medical institution data in the post-pandemic era. This involves a series of measures such as fulfilling the obligation of notification and consent, specifying the scope of data collection and usage, strengthening the standardized use of relevant technical measures, and establishing a sound legal responsibility system for data compliance. Through these measures, a flexible and efficient medical institution data compliance system can be constructed.
As more medical data become digitalized, machine learning is regarded as a promising tool for constructing medical decision support systems. Even with vast medical data volumes, machine learning is still not fully exploiting its potential because the data usually sit in data silos, and privacy and security regulations restrict their access and use. To address these issues, we built a secured and explainable machine learning framework, called explainable federated XGBoost (EXPERTS), which can share valuable information among different medical institutions to improve the learning results without sharing the patients' data. It also reveals how the machine makes a decision through eigenvalues to offer a more insightful answer to medical professionals. To study the performance, we evaluate our approach on real-world datasets, and our approach outperforms the benchmark algorithms under both federated learning and non-federated learning frameworks.
The security-related problems during data exchange are not considered in the SyncML protocol. To solve this problem, SyncML is enhanced with a secure data synchronization exchange service application program interface (SDSXS-API) to ensure reliability, integrity, and security in data transmission and exchange. The design and the implementation of the SDSXS-API are also given. The proposed APIs can be conveniently used as a uniform exchange interface for any related application programs, and their effectiveness is verified in a prototype mobile database system.
With the emergence of cloud technologies, the services of healthcare systems have grown. Simultaneously, machine learning systems have become important tools for developing matured and decision-making computer applications. Both cloud computing and machine learning technologies have contributed significantly to the success of healthcare services. However, in some areas, these technologies are needed to provide and decide the next course of action for patients suffering from diabetic kidney disease (DKD) while ensuring privacy preservation of the medical data. To address the cloud data privacy problem, we proposed a DKD prediction module in a framework using cloud computing services and a data control scheme. This framework can provide improved and early treatment before end-stage renal failure. For prediction purposes, we implemented the following machine learning algorithms: support vector machine (SVM), random forest (RF), decision tree (DT), naïve Bayes (NB), deep learning (DL), and k-nearest neighbor (KNN). These classification techniques combined with the cloud computing services significantly improved the decision making in the progress of DKD patients. We applied these classifiers to the UCI Machine Learning Repository dataset for chronic kidney disease using various clinical features, which are categorized as single features, combinations of selected features, and all features. During single clinical feature experiments, the machine learning classifiers SVM, RF, and KNN outperformed the remaining classification techniques, whereas in combined clinical feature experiments, the maximum accuracy was achieved for the combination of DL and RF. All-feature experiments presented increased accuracy and increased F-measure metrics for SVM, DL, and RF.
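As one example of the listed classifiers, a plain k-nearest-neighbor vote can be written from scratch in a few lines. The toy feature vectors and labels below are illustrative only, not drawn from the chronic kidney disease dataset.

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify `x` by majority vote among its k nearest training points
    (squared Euclidean distance)."""
    nearest = sorted(range(len(train_X)),
                     key=lambda i: sum((a - b) ** 2
                                       for a, b in zip(train_X[i], x)))
    votes = Counter(train_y[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]
```

In practice a library implementation (with feature scaling and cross-validated choice of k) would be used, but the voting logic is exactly this.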
In order to provide a practicable solution to data confidentiality in cloud storage services, a data assured deletion scheme, which achieves fine-grained access control, resistance to hopping and sniffing attacks, data dynamics, and deduplication, is proposed. In our scheme, data blocks are encrypted by a two-level encryption approach, in which the control keys are generated from a key derivation tree, encrypted by an all-or-nothing algorithm, and then distributed into a DHT network after being partitioned by secret sharing. This guarantees that only authorized users can recover the control keys and then decrypt the outsourced data within an owner-specified data lifetime. Besides confidentiality, data dynamics and deduplication are also achieved, separately, by adjustment of the key derivation tree and by convergent encryption. The analysis and experimental results show that our scheme can satisfy its security goal and perform the assured deletion at low cost.
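The key derivation tree idea can be sketched as follows: each node key is derived from its parent with a one-way hash, so holding the root lets the owner reconstruct any block's control key on demand, while discarding a subtree's root key renders all descendant keys unrecoverable. The SHA-256 construction here is an illustrative assumption, not the paper's exact derivation.

```python
import hashlib

def derive_child(parent_key: bytes, index: int) -> bytes:
    """One-way derivation of a child key from its parent and child index."""
    return hashlib.sha256(parent_key + index.to_bytes(4, 'big')).digest()

def block_key(root: bytes, path) -> bytes:
    """Walk the tree from the root along `path` to a leaf's block key."""
    key = root
    for idx in path:
        key = derive_child(key, idx)
    return key
```

Because derivation only runs downward, deleting the keys along one branch (and any cached leaves) is enough to make the corresponding ciphertext blocks permanently undecryptable, which is the mechanism assured deletion relies on.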
Advanced cloud computing technology provides cost savings and flexibility of services for users. With the explosion of multimedia data, more and more data owners would outsource their personal multimedia data to the cloud. In the meantime, some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information because the data owners lose control of their data. Recently, this thought has aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption and reversible data hiding in the encrypted domain. In the former, the additional bits are extracted after decryption; in the latter, they are extracted before decryption. Meanwhile, a combined scheme is also designed. This paper proposes a privacy-preserving outsourcing scheme of reversible data hiding over encrypted image data in cloud computing, which not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be operated over encrypted images at the different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computation cost of the proposed scheme is acceptable and adjusts to different security levels.
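The embedding and extraction halves of data hiding can be illustrated with plain least-significant-bit substitution. This is a simplification in two ways: true reversible schemes also restore the cover image exactly, and the paper's schemes operate on encrypted pixels rather than the cleartext values used here.

```python
def embed_bits(pixels, bits):
    """Hide one payload bit per pixel in the least-significant bit."""
    stego = [(p & ~1) | b for p, b in zip(pixels, bits)]
    return stego + pixels[len(bits):]  # remaining pixels untouched

def extract_bits(pixels, n):
    """Read back the first n hidden bits."""
    return [p & 1 for p in pixels[:n]]
```

Reversible schemes add bookkeeping (e.g. compressing the original LSBs into the payload) so the cover can be restored bit-for-bit after extraction; that bookkeeping is omitted here.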
Privacy-preserving data publishing (PPDP) is one of the hot issues in the field of network security. Existing PPDP techniques cannot deal with generality attacks, which explicitly contain the sensitivity attack and the similarity attack. This paper proposes a novel model, (w, γ, k)-anonymity, to avoid generality attacks on both numeric and categorical attributes. We show that the optimal (w, γ, k)-anonymity problem is NP-hard and present the Top-Down Local recoding (TDL) algorithm to implement the model. Our experiments on real data validate the improvement of our model.
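A sketch of the underlying anonymity check: the simpler k-anonymity condition below, which the proposed (w, γ, k)-anonymity builds on with further constraints not shown here, requires every quasi-identifier combination in the published table to occur at least k times.

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every quasi-identifier combination occurs in >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())
```

Recoding algorithms such as TDL generalize quasi-identifier values (e.g. '47712' to '477**') until a check of this kind passes, trading precision for privacy.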
Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, by evaluating the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. This framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
These days, data is regarded as a valuable asset in the era of the data economy, which demands a trading platform for buying and selling data. However, online data trading poses challenges in terms of security and fairness because the seller and the buyer may not fully trust each other. Therefore, in this paper, a blockchain-based secure and fair data trading system is proposed by taking advantage of the smart contract and matchmaking encryption. The proposed system enables bilateral authorization, where data trading between a seller and a buyer is accomplished only if the policies required by each party are satisfied simultaneously. This can be achieved by exploiting the security features of matchmaking encryption. To guarantee non-repudiation and fairness between trading parties, the proposed system leverages a smart contract to ensure that the parties honestly carry out the data trading protocol. However, the smart contract in the proposed system does not include complex cryptographic operations, for the efficiency of on-chain processes. Instead, these operations are carried out by off-chain parties and their results are used as input for the on-chain procedure. The system also uses an arbitration protocol to resolve disputes based on the trading proof recorded on the blockchain. The performance of the protocol is evaluated in terms of off-chain computation overhead and on-chain gas consumption. The results of the experiments demonstrate that the proposed protocols enable the implementation of a cost-effective data trading system.
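The bilateral-authorization condition can be sketched as a plain two-way policy check. This is logic only: the real system enforces the condition cryptographically via matchmaking encryption rather than by comparing cleartext attributes as this toy does, and the attribute names are illustrative.

```python
def bilateral_match(seller_policy, buyer_attrs, buyer_policy, seller_attrs):
    """Trade proceeds only if each party's attributes satisfy the
    other party's policy simultaneously."""
    seller_ok = all(buyer_attrs.get(k) == v for k, v in seller_policy.items())
    buyer_ok = all(seller_attrs.get(k) == v for k, v in buyer_policy.items())
    return seller_ok and buyer_ok
```

Matchmaking encryption makes the same AND-condition hold implicitly: decryption succeeds only when both policies are met, so neither side learns anything from a failed match.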
Big data has been adopted as a Chinese national strategy in order to satisfy social and economic development requirements and the development of new information technology. The prosperity of big data brings not only convenience to people's daily life and more opportunities to enterprises, but also more information security challenges. This paper investigates the new types and features of information security issues in the age of big data, and puts forward solutions for these issues: build a big data security management platform, establish an information security system, and implement relevant laws and regulations.
Data aggregation technology reduces the traffic overhead of wireless sensor networks and extends the effective working time of the network, yet continued operation of wireless sensor networks increases the probability of aggregation nodes being captured and of aggregated data being tampered with, which seriously affects the security performance of the network. To address these network security issues, a stateful public key based secure data aggregation model (SDAM) is proposed for wireless sensor networks (WSNs), which employs a new stateful public key encryption to provide efficient end-to-end security. Moreover, the secure aggregation model does not impose any bound on the aggregation function property, so as to realize low cost and a high security level at the same time.
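The aggregate-without-exposing-readings goal can be illustrated with additive masking (a stand-in only: the paper's scheme uses stateful public-key encryption, not the shared-seed masks sketched here): each sensor adds a pseudo-random mask to its reading, and the sink removes the combined mask from the sum without ever seeing individual values.

```python
import random

MOD = 2 ** 31  # working modulus for the masked arithmetic

def mask_reading(value: int, seed: int) -> int:
    """Sensor side: blind one reading with a seed-derived mask."""
    mask = random.Random(seed).randrange(MOD)
    return (value + mask) % MOD

def unmask_sum(masked_sum: int, seeds) -> int:
    """Sink side: strip the combined mask from the aggregated sum."""
    total_mask = sum(random.Random(s).randrange(MOD) for s in seeds) % MOD
    return (masked_sum - total_mask) % MOD
```

An aggregation node that forwards only masked values and their running sum learns nothing about individual readings, which is the property that limits the damage from a captured aggregator.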
Cloud computing is very useful for big data owners who do not want to manage IT infrastructure and big data technique details. However, it is hard for a big data owner to trust a multi-layer outsourced big data system in the cloud environment and to verify which outsourced service leads to a problem. Similarly, the cloud service provider cannot simply trust the data computation applications. Finally, the verification data itself may also leak sensitive information from the cloud service provider and the data owner. We propose a new three-level definition of the verification, a threat model, and corresponding trusted policies based on different roles for outsourced big data systems in the cloud. We also provide two policy enforcement methods for building a trusted data computation environment by measuring both the MapReduce application and its behaviors, based on trusted computing and aspect-oriented programming. To prevent sensitive information leakage from the verification process, we provide a privacy-preserved verification method. Finally, we implement TPTVer, a Trusted third Party based Trusted Verifier, as a proof-of-concept system. Our evaluation and analysis show that TPTVer can provide trusted verification for multi-layered outsourced big data systems in the cloud with low overhead.
With the development of information technology, the Internet of Things (IoT) has gradually become the third wave of the worldwide information industry revolution after the computer and the Internet. The application of the IoT has brought great convenience to people's production and life. However, the potential information security problems in various IoT applications are gradually being exposed and people are paying more attention to them. The traditional centralized data storage and management model of the IoT is prone to transmission delay, single point of failure, privacy disclosure, and other problems, and eventually leads to unpredictable behavior of the system. Blockchain technology can effectively improve the operation and data security status of the IoT. Referring to the storage model of the Fabric blockchain project, this paper designs a data security storage model suitable for IoT systems. The simulation results show that the model is not only effective and extensible, but can also better protect the data security of the Internet of Things.
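The tamper-evidence that blockchain-style storage adds can be sketched with a minimal hash chain. This toy omits everything Fabric-specific (peers, the ordering service, world state); it only shows why altering a stored record invalidates the chain.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash covers its data and its predecessor's hash."""
    block = {'data': data, 'prev': prev_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block['hash'] = hashlib.sha256(serialized).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash and check each block links to its predecessor."""
    for i, b in enumerate(chain):
        body = {'data': b['data'], 'prev': b['prev']}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if b['hash'] != expected:
            return False
        if i and b['prev'] != chain[i - 1]['hash']:
            return False
    return True
```

Changing any stored record changes its hash, which breaks the link recorded in the next block, so tampering is detectable by anyone who replays the chain.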
Geo-data is a foundation for the prediction and assessment of ore resources, so managing and making full use of those data, including the geography, geology, mineral deposits, aeromagnetics, gravity, geochemistry, and remote sensing databases, is very significant. We developed the national important mining zone database (NIMZDB) to manage 14 national important mining zone databases to support a new round of ore deposit prediction. We found that attention should be paid to the following issues: ① data accuracy: integrity, logic consistency, and attribute, spatial, and time accuracy; ② management of both attribute and spatial data in the same system; ③ transforming data between MapGIS and ArcGIS; ④ data sharing and security; ⑤ data searches that can query both attribute and spatial data. The accuracy of input data is guaranteed, and the search, analysis, and translation of data between MapGIS and ArcGIS have been made convenient via the development of a data checking module and a data managing module based on MapGIS and ArcGIS. Using ArcSDE, we based data sharing on a client/server system, and attribute and spatial data are also managed in the same system.
Funding: Key Research and Development and Promotion Program of Henan Province (No. 222102210069); Zhongyuan Science and Technology Innovation Leading Talent Project (No. 224200510003); National Natural Science Foundation of China (No. 62102449).
Abstract: Big data resources are characterized by large scale, wide sources, and strong dynamics. Existing access control mechanisms based on manual policy formulation by security experts suffer from drawbacks such as low policy-management efficiency and difficulty in accurately describing access control policies. To overcome these problems, this paper proposes a big data access control mechanism based on a two-layer permission decision structure. The mechanism extends the attribute-based access control (ABAC) model by introducing business attributes as business constraints between entities. The two-layer permission decision structure is composed of the inherent attributes of access control entities and the business attributes, which respectively constitute a general permission decision algorithm based on logical calculation and a business permission decision algorithm based on a bi-directional long short-term memory (BiLSTM) neural network. The general permission decision algorithm implements accurate policy decisions, while the business permission decision algorithm implements fuzzy decisions based on the business constraints. The BiLSTM neural network computes the similarity of business attributes to realize intelligent, adaptive, and efficient access control permission decisions. Through the two-layer permission decision structure, the complex and diverse access control management requirements of big data can be satisfied while considering both the security and the availability of resources. Experimental results show that the proposed mechanism is effective and reliable and can efficiently support the secure sharing of big data resources.
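The two-layer decision structure can be sketched as follows. Layer 1 is an exact logical match over inherent attributes; for layer 2, a Jaccard similarity over business-attribute sets stands in for the paper's BiLSTM similarity score. The attribute names, policy shape, and threshold are all assumptions for illustration.

```python
def general_decision(subject: dict, resource: dict, policy: dict) -> bool:
    # Layer 1: exact logical match of inherent attributes against the policy.
    return (all(subject.get(k) == v for k, v in policy["subject"].items())
            and all(resource.get(k) == v for k, v in policy["resource"].items()))

def business_decision(subject_biz: set, resource_biz: set,
                      threshold: float = 0.5) -> bool:
    # Layer 2: fuzzy decision from business-attribute similarity.
    # Jaccard similarity stands in for the paper's BiLSTM score.
    union = subject_biz | resource_biz
    if not union:
        return True
    return len(subject_biz & resource_biz) / len(union) >= threshold

def permit(subject: dict, resource: dict, policy: dict) -> bool:
    # Access is granted only when both decision layers agree.
    return (general_decision(subject, resource, policy)
            and business_decision(set(subject.get("biz", [])),
                                  set(resource.get("biz", []))))
```

The point of the structure is that a request must pass the precise layer before the fuzzy layer is consulted, so business constraints refine rather than override the base policy.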
Funding: Supported by the Interdisciplinary Research Center for Intelligent Secure Systems (IRC-ISS) Internal Fund Grant #INSS2202.
Abstract: The use of the Internet of Things (IoT) is expanding at an unprecedented scale in many critical applications due to its ability to interconnect and utilize a wide range of devices. In critical infrastructure domains such as oil and gas supply, intelligent transportation, power grids, and autonomous agriculture, it is essential to guarantee the confidentiality, integrity, and authenticity of the data collected and exchanged. However, the limited resources and heterogeneity of IoT devices make it inefficient, or sometimes infeasible, to achieve secure data transmission using traditional cryptographic techniques. Consequently, designing a lightweight secure data transmission scheme is becoming essential. In this article, we propose a lightweight secure data transmission (LSDT) scheme for IoT environments. LSDT consists of three phases and utilizes an effective combination of symmetric keys and the Elliptic Curve Menezes-Qu-Vanstone asymmetric key agreement protocol. We design a simulation environment and experiments to evaluate the performance of the LSDT scheme in terms of communication and computation costs. Security and performance analysis indicates that the LSDT scheme is secure, suitable for IoT applications, and performs better than other related security schemes.
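The general pattern of agreeing on a symmetric key via an asymmetric protocol can be sketched with toy finite-field Diffie-Hellman; the paper's LSDT scheme uses Elliptic Curve Menezes-Qu-Vanstone (ECMQV) instead, and the small prime below offers no real-world security. It only shows how two devices derive the same symmetric key without transmitting it.

```python
import hashlib
import secrets

# Illustration-only parameters: a Mersenne prime field, not an elliptic curve.
P = 2**127 - 1
G = 3

def keypair():
    # Each device picks a private scalar and publishes G^priv mod P.
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(own_priv: int, peer_pub: int) -> bytes:
    # Both sides compute the same field element, then hash it down to a
    # 256-bit symmetric key for subsequent lightweight encryption.
    secret = pow(peer_pub, own_priv, P)
    return hashlib.sha256(secret.to_bytes(16, "big")).digest()
```

A resource-constrained device pays one modular exponentiation per agreement, after which all bulk traffic can use cheap symmetric operations, which is the cost structure the abstract's combination targets.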
Abstract: With recent technological developments, massive vehicular ad hoc networks (VANETs) have been established, enabling numerous vehicles and their respective Road Side Unit (RSU) components to communicate with one another. The best way to improve traffic flow for vehicles and traffic management departments is to share the data they receive, but VANET systems need stronger protection. An effective and safe method of outsourcing is suggested that reduces computation costs and achieves data security using a homomorphic mapping based on the conjugate operation of matrices. This research proposes a VANET-based data outsourcing system to address these issues. To keep the outsourced data secure, the suggested model relies on cryptographic primitives: fog nodes keep the generated keys for vehicle authentication, and a Trusted Certified Auditor (TCA) controls and oversees the outsourced data while preserving privacy. Using the secret key, the TCA can recover the genuine identity of a vehicle when harmful messages are detected. The proposed model develops a TCA-based unique static vehicle labeling system using cryptography (TCA-USVLC) for secure data outsourcing and privacy preservation in VANETs. The model computes vehicle trust in 16 ms for an average of 180 vehicles and achieves 98.6% accuracy for data encryption, 98.5% accuracy in data outsourcing, and 98.6% accuracy in privacy preservation in fog-enabled VANETs. Elliptic curve cryptography could be applied in the future for better encryption and decryption rates with lightweight cryptographic operations.
Abstract: Medical institution data compliance is an exogenous product of the digital society, serving as a crucial means of maintaining and balancing the relationship between data protection and data sharing, and between individual and public interests. The implementation of the Healthy China Initiative greatly benefits from its practical significance. In practice, data from medical institutions takes varied forms, including personally identifiable data collected before diagnosis and treatment, clinical medical data generated during diagnosis and treatment, medical data collected in public health management, and potential medical data generated in daily life. On the new journey of comprehensively advancing the Chinese path to modernization, it is necessary to clarify the shift from an individual-oriented to a society-oriented value system, highlighting the reinforcing role of the concept of trust. Guided by the principle of data minimization, the focus is on new developments and changes in medical institution data in the post-pandemic era. This involves measures such as fulfilling the obligation of notification and consent, specifying the scope of data collection and usage, strengthening the standardized use of relevant technical measures, and establishing a sound legal responsibility system for data compliance. Through these measures, a flexible and efficient medical institution data compliance system can be constructed.
Abstract: As more medical data become digitalized, machine learning is regarded as a promising tool for constructing medical decision support systems. Even with vast volumes of medical data, machine learning is still not fully exploiting its potential, because the data usually sit in data silos and privacy and security regulations restrict their access and use. To address these issues, we built a secure and explainable machine learning framework, called explainable federated XGBoost (EXPERTS), which can share valuable information among different medical institutions to improve learning results without sharing patients' data. It also reveals how the machine makes a decision through eigenvalues, offering more insightful answers to medical professionals. To study its performance, we evaluate our approach on real-world datasets; it outperforms the benchmark algorithms under both federated and non-federated learning frameworks.
Abstract: The SyncML protocol does not address security during data exchange. To solve this problem, SyncML is enhanced with a secure data synchronization exchange service application program interface (SDSXS-API) to ensure reliability, integrity, and security in data transmission and exchange. The design and implementation of the SDSXS-API are given. The proposed APIs can be conveniently used as a uniform exchange interface by any related application program, and their effectiveness is verified in a prototype mobile database system.
Abstract: With the emergence of cloud technologies, the services of healthcare systems have grown. Simultaneously, machine learning systems have become important tools for developing mature, decision-making computer applications. Both cloud computing and machine learning technologies have contributed significantly to the success of healthcare services. However, in some areas these technologies are needed to decide the next course of action for patients suffering from diabetic kidney disease (DKD) while ensuring privacy preservation of the medical data. To address the cloud data privacy problem, we propose a DKD prediction module in a framework using cloud computing services and a data control scheme. This framework can provide improved and earlier treatment before end-stage renal failure. For prediction, we implemented the following machine learning algorithms: support vector machine (SVM), random forest (RF), decision tree (DT), naïve Bayes (NB), deep learning (DL), and k-nearest neighbors (KNN). These classification techniques, combined with cloud computing services, significantly improved decision making for DKD patients. We applied these classifiers to the UCI Machine Learning Repository dataset for chronic kidney disease using various clinical features, categorized as single features, combinations of selected features, and all features. In the single-clinical-feature experiments, the SVM, RF, and KNN classifiers outperformed the remaining techniques, whereas in the combined-clinical-feature experiments, the maximum accuracy was achieved with the combination of DL and RF. The all-feature experiments showed increased accuracy and F-measure metrics for SVM, DL, and RF.
Funding: Supported by the National Key Basic Research Program of China (973 Program) under Grant No. 2012CB315901.
Abstract: To provide a practicable solution to data confidentiality in cloud storage services, a data assured deletion scheme is proposed that achieves fine-grained access control, resistance to hopping and sniffing attacks, data dynamics, and deduplication. In our scheme, data blocks are encrypted by a two-level encryption approach in which the control keys are generated from a key derivation tree, encrypted by an All-Or-Nothing algorithm, and then distributed into a DHT network after being partitioned by secret sharing. This guarantees that only authorized users can recover the control keys and decrypt the outsourced data within an owner-specified data lifetime. Besides confidentiality, data dynamics and deduplication are achieved by adjustment of the key derivation tree and by convergent encryption, respectively. Analysis and experimental results show that our scheme satisfies its security goal and performs assured deletion at low cost.
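The "partition by secret sharing, then distribute" step can be illustrated with a simple n-of-n XOR sharing scheme: every share is needed to rebuild the control key, so deleting shares from the DHT makes the key, and hence the data, unrecoverable after the lifetime expires. This is a stand-in for the threshold secret sharing the paper actually uses; the function names are illustrative.

```python
import secrets

def split_key(key: bytes, n: int) -> list:
    # n-of-n XOR secret sharing: the key equals the XOR of all n shares,
    # so losing (or deliberately deleting) any one share destroys the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def recover_key(shares: list) -> bytes:
    # XOR every share back together to reconstruct the control key.
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key
```

A threshold scheme (k-of-n) would tolerate some share loss while still enabling assured deletion once fewer than k shares survive; the n-of-n variant above is just the simplest form of the idea.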
Funding: This work was supported by the National Natural Science Foundation of China (No. 61702276), the Startup Foundation for Introducing Talent of Nanjing University of Information Science and Technology under Grant 2016r055, and the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions. The authors are grateful to the anonymous reviewers for their constructive comments and improvements.
Abstract: Advanced cloud computing technology provides cost savings and service flexibility for users. With the explosion of multimedia data, more and more data owners outsource their personal multimedia data to the cloud, and some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information, because the data owner loses control of the data. Recently, this concern has aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption, in which the additional bits are extracted after decryption, and reversible data hiding in the encrypted domain, in which they are extracted before decryption. A combined scheme is also designed. The proposed privacy-preserving outsourcing scheme for reversible data hiding over encrypted image data not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be performed on encrypted images at different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computation cost of the scheme is acceptable and adjusts to different security levels.
Funding: Supported in part by the Research Fund for the Doctoral Program of Higher Education of China (No. 20120009110007), the Program for Innovative Research Team in University of the Ministry of Education of China (No. IRT201206), the Program for New Century Excellent Talents in University (NCET-110565), the Fundamental Research Funds for the Central Universities (No. 2012JBZ010), the Open Project Program of the Beijing Key Laboratory of Trusted Computing at Beijing University of Technology, and the Beijing Higher Education Young Elite Teacher Project (No. YETP0542).
Abstract: Privacy-preserving data publishing (PPDP) is one of the hot issues in the field of network security. Existing PPDP techniques cannot deal with generality attacks, which explicitly include the sensitivity attack and the similarity attack. This paper proposes a novel model, (w, γ, k)-anonymity, to avoid generality attacks on both numeric and categorical attributes. We show that the optimal (w, γ, k)-anonymity problem is NP-hard and present the Top-Down Local recoding (TDL) algorithm to implement the model. Our experiments on real data validate the improvement offered by our model.
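As background for the model, the basic k-anonymity property that (w, γ, k)-anonymity extends can be checked in a few lines; this sketch covers only the plain k requirement, not the w and γ components or the TDL recoding algorithm.

```python
from collections import Counter

def is_k_anonymous(records: list, quasi_ids: tuple, k: int) -> bool:
    # A table is k-anonymous when every combination of quasi-identifier
    # values appears in at least k records, so no individual can be
    # singled out by those attributes alone.
    groups = Counter(tuple(r[a] for a in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())
```

Generality attacks succeed even against k-anonymous tables when the sensitive values inside a group are all similar, which is the gap the (w, γ, k) model is designed to close.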
Abstract: Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, assessing the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2022R1I1A3063257), and by an Electronics and Telecommunications Research Institute (ETRI) grant funded by the Korean Government [22ZR1300, Research on Intelligent Cyber Security and Trust Infra].
Abstract: Data is regarded as a valuable asset in the era of the data economy, which demands platforms for buying and selling data. However, online data trading poses challenges in terms of security and fairness, because the seller and the buyer may not fully trust each other. Therefore, in this paper, a blockchain-based secure and fair data trading system is proposed that takes advantage of smart contracts and matchmaking encryption. The proposed system enables bilateral authorization, where data trading between a seller and a buyer is accomplished only if the policies they require of each other are satisfied simultaneously. This is achieved by exploiting the security features of matchmaking encryption. To guarantee non-repudiation and fairness between trading parties, the system leverages a smart contract to ensure that the parties honestly carry out the data trading protocol. However, for the efficiency of on-chain processes, the smart contract does not include complex cryptographic operations; instead, these are carried out by off-chain parties, and their results are used as input to the on-chain procedure. The system also uses an arbitration protocol to resolve disputes based on the trading proof recorded on the blockchain. The performance of the protocol is evaluated in terms of off-chain computation overhead and on-chain gas consumption. The experimental results demonstrate that the proposed protocols enable a cost-effective data trading system.
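The bilateral-authorization decision can be sketched as two policy checks that must both pass. Matchmaking encryption enforces this condition cryptographically (a ciphertext decrypts only when both parties' policies match); the sketch below runs the check in the clear, purely to show the decision logic, and its data shapes are assumptions.

```python
def bilateral_authorize(seller: dict, buyer: dict) -> bool:
    # Trade proceeds only if the seller's policy accepts the buyer's
    # attributes AND the buyer's policy accepts the seller's attributes.
    # In the paper this mutual condition is enforced by matchmaking
    # encryption rather than by a plaintext check like this one.
    return seller["policy"](buyer["attrs"]) and buyer["policy"](seller["attrs"])
```

Either side failing the other's policy blocks the trade, which is exactly the "satisfied simultaneously" condition the abstract describes.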
Funding: Supported by the National Key Technology Support Program (No. 2013BAD17B06) and the Major Program of the National Social Science Fund (No. 15ZDB154).
Abstract: Big data has been adopted as a Chinese national strategy in order to meet social and economic requirements and the development of new information technology. The prosperity of big data brings not only convenience to people's daily lives and more opportunities to enterprises, but also more information security challenges. This paper examines the new types and features of information security issues in the age of big data and puts forward solutions: build a big data security management platform, establish an information security system, and implement relevant laws and regulations.
Funding: Supported by the National High Technology Research and Development Program of China (No. 2012AA120802), the National Natural Science Foundation of China (No. 61302074), the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20122301120004), and the Natural Science Foundation of Heilongjiang Province (No. QC2013C061).
Abstract: Data aggregation technology reduces the traffic overhead of wireless sensor networks and extends the network's effective working time, yet continued operation of a wireless sensor network increases the probability that aggregation nodes are captured and that aggregated data are tampered with, seriously affecting the security performance of the network. To address these security issues, a stateful public key based secure data aggregation model (SDAM) is proposed for wireless sensor networks (WSNs), which employs a new stateful public key encryption scheme to provide efficient end-to-end security. Moreover, the model imposes no bound on the properties of the aggregation function, realizing low cost and a high security level at the same time.
Funding: Partially supported by grants from the China 863 High-Tech Program (Grant No. 2015AA016002), the Specialized Research Fund for the Doctoral Program of Higher Education (Grant No. 20131103120001), the National Key Research and Development Program of China (Grant No. 2016YFB0800204), the National Science Foundation of China (No. 61502017), and the Scientific Research Common Program of the Beijing Municipal Commission of Education (KM201710005024).
Abstract: Cloud computing is very useful for big data owners who do not want to manage IT infrastructure and the technical details of big data systems. However, it is hard for a big data owner to trust a multi-layer outsourced big data system in a cloud environment and to verify which outsourced service causes a problem. Similarly, the cloud service provider cannot simply trust the data computation applications. Finally, the verification data itself may leak sensitive information of the cloud service provider and the data owner. We propose a new three-level definition of verification, a threat model, and corresponding trusted policies based on the different roles in an outsourced big data system in the cloud. We also provide two policy enforcement methods for building a trusted data computation environment by measuring both the MapReduce application and its behavior, based on trusted computing and aspect-oriented programming. To prevent sensitive information leakage during the verification process, we provide a privacy-preserving verification method. Finally, we implement TPTVer, a Trusted third Party based Trusted Verifier, as a proof-of-concept system. Our evaluation and analysis show that TPTVer provides trusted verification for multi-layered outsourced big data systems in the cloud with low overhead.
Funding: Supported by the National Social Science Foundation Project of China under Grant 16BTQ085.
Abstract: With the development of information technology, the Internet of Things (IoT) has gradually become the third wave of the worldwide information industry revolution, after the computer and the Internet. IoT applications have brought great convenience to people's production and life, but the potential information security problems of various IoT applications are gradually being exposed and attracting more attention. The traditional centralized data storage and management model of the IoT is prone to transmission delays, single points of failure, privacy disclosure, and other problems, eventually leading to unpredictable system behavior. Blockchain technology can effectively improve the operation and data security of the IoT. Referring to the storage model of the Fabric blockchain project, this paper designs a data security storage model suitable for IoT systems. The simulation results show that the model is not only effective and extensible, but can also better protect the data security of the Internet of Things.
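The tamper-evidence that blockchain storage brings to IoT data can be shown with a minimal hash-chained ledger: each block commits to its predecessor's hash, so altering any stored reading invalidates every later block. This is a sketch of the general idea only, not the Fabric storage model the paper adapts.

```python
import hashlib
import json

def make_block(data: dict, prev_hash: str) -> dict:
    # A block stores its payload, the previous block's hash, and its own
    # hash over both, forming the chain.
    body = {"data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"data": data, "prev_hash": prev_hash, "hash": digest}

def verify_chain(chain: list) -> bool:
    # Recompute every hash and check each link back to its predecessor.
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```

Changing a single sensor reading after the fact breaks verification, which is the property that makes decentralized IoT storage auditable without a trusted central server.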
Funding: This paper is financially supported by the National Important Mining Zone Database project (No. 200210000004) and by Prediction and Assessment of Mineral Resources and Social Service (No. 1212010331402).
Abstract: Geo-data is a foundation for the prediction and assessment of ore resources, so managing and making full use of those data, including geography, geology, mineral deposit, aeromagnetic, gravity, geochemistry, and remote sensing databases, is very significant. We developed the national important mining zone database (NIMZDB) to manage 14 national important mining zone databases to support a new round of ore deposit prediction. We found that attention should be paid to the following issues: ① data accuracy: integrity, logical consistency, and attribute, spatial, and temporal accuracy; ② management of both attribute and spatial data in the same system; ③ transforming data between MapGIS and ArcGIS; ④ data sharing and security; ⑤ data searches that can query both attribute and spatial data. The accuracy of input data is guaranteed, and the search, analysis, and translation of data between MapGIS and ArcGIS has been made convenient, through the development of a data-checking module and a data-management module based on MapGIS and ArcGIS. Using ArcSDE, we based data sharing on a client/server system, and attribute and spatial data are also managed in the same system.