With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The encryption method proposed in this paper addresses the inefficiencies of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. By utilizing this information, the method accurately identifies the modified portions and applies algorithms to selectively re-encrypt only those sections. This approach significantly enhances the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate this approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that this method provides robust protection against both passive and active attacks.
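The bundle idea can be illustrated with a short sketch: each bundle is encrypted independently, so an update touches only the bundle it modifies. The fixed bundle size, the AES-GCM cipher, and the helper names below are illustrative assumptions, not the paper's exact construction.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

BUNDLE_SIZE = 4096  # assumed fixed bundle size

def split_into_bundles(data: bytes) -> list:
    """Split the plaintext into fixed-size bundles."""
    return [data[i:i + BUNDLE_SIZE] for i in range(0, len(data), BUNDLE_SIZE)]

def encrypt_bundles(key: bytes, bundles: list) -> list:
    """Encrypt every bundle independently with a fresh nonce."""
    aead = AESGCM(key)
    return [(nonce, aead.encrypt(nonce, b, None))
            for b, nonce in ((b, os.urandom(12)) for b in bundles)]

def update_bundle(key: bytes, ciphertexts: list, index: int, new_plaintext: bytes) -> None:
    """Re-encrypt only the modified bundle; all other ciphertexts stay untouched."""
    nonce = os.urandom(12)
    ciphertexts[index] = (nonce, AESGCM(key).encrypt(nonce, new_plaintext, None))

key = AESGCM.generate_key(bit_length=256)
ciphertexts = encrypt_bundles(key, split_into_bundles(b"A" * 20000))
update_bundle(key, ciphertexts, 2, b"edited third bundle")  # other bundles untouched
```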
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data. Dedicated servers and a data center are needed to store and manage patient data. However, self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative. The advantage of storing data in a cloud is that it can be retrieved anywhere and anytime using any device connected to the Internet. Therefore, doctors can easily access the medical history of a patient and diagnose diseases according to the context. It also helps prescribe the correct medicine to a patient in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns. These servers are often semi-trusted. Thus, sensitive medical information must be protected. Open access to records, and modifications performed on the information in those records, may even cause patient fatalities. Patient-centric health-record security is a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method that combines the Advanced Encryption Standard and the elliptic curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic-curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
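A minimal sketch of the AES plus elliptic curve Diffie-Hellman combination described above, assuming AES-GCM as the cipher and an HKDF step to turn the ECDH shared secret into an AES key; the curve choice and record format are placeholders rather than the paper's exact parameters.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party holds an EC key pair; exchanging public keys yields a shared secret.
doctor_priv = ec.generate_private_key(ec.SECP256R1())
patient_priv = ec.generate_private_key(ec.SECP256R1())
shared = patient_priv.exchange(ec.ECDH(), doctor_priv.public_key())

# Derive a 256-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"ehr-encryption").derive(shared)

# Encrypt the health record before outsourcing it to the cloud.
nonce = os.urandom(12)
record = b'{"patient_id": "P-001", "diagnosis": "..."}'  # illustrative record
ciphertext = AESGCM(aes_key).encrypt(nonce, record, None)
```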
Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism to preserve the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach utilizes a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption systems are employed to encrypt user data before outsourcing it. An extensive performance analysis is conducted to illustrate the efficiency of the proposed mechanism. Additionally, a rigorous assessment is conducted to ensure that the developed smart contract is free from vulnerabilities and to measure its associated running costs. The security analysis of the proposed system confirms that our approach can securely maintain the confidentiality and integrity of cloud storage, even in the presence of malicious entities. The proposed mechanism contributes to enhancing data security in cloud computing environments and can be used as a foundation for developing more secure cloud storage systems.
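The client-side flow can be sketched as follows: encrypt before outsourcing, register a digest of the ciphertext with the auditor, and later re-hash whatever the cloud returns. The real auditor in the paper is a Solidity smart contract; the in-memory digest below merely stands in for that on-chain state.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Encrypt the data before it ever leaves the user's machine.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"user file contents", None)

# Digest registered with the auditor (a stand-in for the smart-contract call).
registered_digest = hashlib.sha256(ciphertext).hexdigest()

def audit(stored_ciphertext: bytes) -> bool:
    """Re-hash what the cloud returns and compare it with the registered digest."""
    return hashlib.sha256(stored_ciphertext).hexdigest() == registered_digest

assert audit(ciphertext)                 # untampered data passes
assert not audit(ciphertext + b"x")      # any modification is detected
```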
For the first time, this article introduces a LiDAR Point Clouds Dataset of Ships composed of both collected and simulated data to address the scarcity of LiDAR data in maritime applications. The collected data are acquired using specialized maritime LiDAR sensors in both inland waterways and wide-open ocean environments. The simulated data are generated by placing a ship in the LiDAR coordinate system and scanning it with a redeveloped Blensor that emulates the operation of a LiDAR sensor equipped with various laser beams. Furthermore, we also render point clouds for foggy and rainy weather conditions. To describe a realistic shipping environment, a dynamic tail wave is modeled by iterating the wave elevation of each point in a time series. Finally, networks serving small objects are migrated to ship applications by feeding them our dataset. The positive effect of simulated data is described in object detection experiments, and the negative impact of tail waves as noise is verified in single-object tracking experiments. The dataset is available at https://github.com/zqy411470859/ship_dataset.
This paper was motivated by the existing problems of cloud data storage at Imo State University, Nigeria, such as outsourced data causing the loss of data and the misuse of customer information by unauthorized users or hackers, thereby leaving customer/client data visible and unprotected. This also exposed clients/customers to enormous risk from defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper, therefore, is to analyze a secure model using Unicode Transformation Format (UTF) Base64 algorithms for storing data securely in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) were integrated to enhance the security of the information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. This paper also discusses concepts such as the development of cloud computing, the characteristics of cloud computing, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for the information systems of a corporate platform handles multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which in turn redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical-location validation. The newly developed system will solve the shortcomings of the existing systems and reduce the time and resources incurred while using them.
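A minimal sketch of the UTF-8/Base64 transformation step, with illustrative field contents; in the described model this encoding sits alongside the RBAC and MFA controls rather than replacing them.

```python
import base64

def encode_for_storage(text: str) -> str:
    """UTF-8 encode, then Base64 encode for safe storage in the cloud."""
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

def decode_from_storage(stored: str) -> str:
    """Reverse the transformation when the record is retrieved."""
    return base64.b64decode(stored).decode("utf-8")

record = encode_for_storage("student_id=IMSU/2024/001;grade=A")  # illustrative record
assert decode_from_storage(record) == "student_id=IMSU/2024/001;grade=A"
```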
As cloud computing usage grows, cloud data centers play an increasingly important role. To maximize resource utilization, ensure service quality, and enhance system performance, it is crucial to allocate tasks and manage performance effectively. The purpose of this study is to provide an extensive analysis of task allocation and performance management techniques employed in cloud data centers. The aim is to systematically categorize and organize previous research by identifying the cloud computing methodologies, categories, and gaps. A literature review was conducted, which included the analysis of 463 task allocation and 480 performance management papers. The review revealed three task allocation research topics and seven performance management methods. Task allocation research areas are resource allocation, load balancing, and scheduling. Performance management includes monitoring and control, power and energy management, resource utilization optimization, quality of service management, fault management, virtual machine management, and network management. The study proposes new techniques to enhance cloud computing work allocation and performance management. Shortcomings in each approach can guide future research. The research's findings on cloud data center task allocation and performance management can assist academics, practitioners, and cloud service providers in optimizing their systems for dependability, cost-effectiveness, and scalability. Innovative methodologies can steer future research to fill gaps in the literature.
A new era of data access and management has begun with the use of cloud computing in the healthcare industry. Despite the efficiency and scalability that the cloud provides, the security of private patient data is still a major concern. Encryption, network security, and adherence to data protection laws are key to ensuring the confidentiality and integrity of healthcare data in the cloud. The computational overhead of encryption technologies could lead to delays in data access and processing rates. To address these challenges, we introduced the Enhanced Parallel Multi-Key Encryption Algorithm (EPM-KEA), aiming to bolster healthcare data security and facilitate the secure storage of critical patient records in the cloud. The data were gathered from two categories: Authorization for Hospital Admission (AIH) and Authorization for High Complexity Operations. We use Z-score normalization for preprocessing. The primary goal of implementing encryption techniques is to secure and store massive amounts of data on the cloud. It is feasible that cloud storage alternatives for protecting healthcare data will become more widely available if security issues can be successfully fixed. As a result of our analysis using specific parameters including execution time (42%), encryption time (45%), decryption time (40%), security level (97%), and energy consumption (53%), the system demonstrated favorable performance when compared to the traditional method. This suggests that by addressing these security concerns, there is the potential for broader accessibility to cloud storage solutions for safeguarding healthcare data.
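The Z-score preprocessing step mentioned above can be sketched as follows; the admission fields are invented for illustration.

```python
import numpy as np

def z_score_normalize(x: np.ndarray) -> np.ndarray:
    """Standardize each column to zero mean and unit variance."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Illustrative admission records: e.g. patient age and length of stay (days).
admissions = np.array([[54.0, 3.2],
                       [61.0, 4.0],
                       [47.0, 2.5]])
print(z_score_normalize(admissions))
```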
An Operating System (OS) is a critical piece of software that manages a computer's hardware and resources, acting as the intermediary between the computer and the user. Existing OSs are not designed for Big Data and Cloud Computing, resulting in data processing and management inefficiency. This paper proposes a simplified and improved kernel on an x86 system designed for Big Data and Cloud Computing purposes. The proposed algorithm utilizes the benefits of improved Input/Output (I/O) performance. The performance engineering applies data-oriented design to traditional data management to improve data processing speed by reducing memory access overheads. The OS incorporates a data-oriented design to "modernize" various Data Science and management aspects. The resulting OS contains a basic input/output system (BIOS) bootloader that boots into Intel 32-bit protected mode, a text display terminal, 4 GB paging memory, a heap block size of 4096, a Hard Disk Drive (HDD) I/O Advanced Technology Attachment (ATA) driver, and more. There are also I/O scheduling algorithm prototypes that demonstrate how a simple Sweeping algorithm is superior to more conventionally known I/O scheduling algorithms. A MapReduce prototype is implemented using the Message Passing Interface (MPI) for big data purposes. An attempt was made to optimize binary search using modern performance engineering and data-oriented design.
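A sweeping (elevator-style) I/O scheduler of the kind the abstract compares favorably can be sketched as follows: pending requests are serviced in order of increasing block address from the current head position, then the remaining requests are served on the way back. The request values and tie-breaking below are assumptions, not the kernel's actual implementation.

```python
def sweep_schedule(requests: list, head: int, ascending: bool = True) -> list:
    """Order I/O requests with a single sweep from the current head position."""
    above = sorted(r for r in requests if r >= head)            # serviced on the way up
    below = sorted((r for r in requests if r < head), reverse=True)  # on the way back
    return above + below if ascending else below + above

print(sweep_schedule([98, 183, 37, 122, 14, 124, 65, 67], head=53))
# -> [65, 67, 98, 122, 124, 183, 37, 14]
```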
The current education field is experiencing an innovation driven by big data and cloud technologies, and these advanced technologies play a central role in the construction of smart campuses. Big data technology has a wide range of applications in student learning behavior analysis, teaching resource management, campus safety monitoring, and decision support, which improves the quality of education and management efficiency. Cloud computing technology supports the integration, distribution, and optimal use of educational resources through cloud resource sharing, virtual classrooms, intelligent campus management systems, and Infrastructure-as-a-Service (IaaS) models, which reduce costs and increase flexibility. This paper comprehensively discusses the practical application of big data and cloud computing technologies in smart campuses, showing how these technologies can contribute to the development of smart campuses and laying the foundation for the future innovation of education models.
Model reconstruction from points scanned on existing physical objects is important in a variety of situations, such as reverse engineering for mechanical products, computer vision, and the recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, point clouds that contain more details of the object can be obtained conveniently. On the other hand, the large quantity of sampled points creates difficulties for model reconstruction methods. This paper first presents an algorithm to automatically reduce the number of points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set by the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes. An approach is proposed to create new triangles with optimized shapes to cover the unexpected holes in triangle meshes. After hole filling, a watertight triangle mesh can be directly output in STL format, which is widely used in rapid prototype manufacturing. Practical examples are included to demonstrate the method.
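The first step, tolerance-driven point reduction, can be sketched by keeping one representative point per grid cell whose edge length equals the tolerance; the paper's actual reduction criterion may differ.

```python
import numpy as np

def reduce_points(points: np.ndarray, tolerance: float) -> np.ndarray:
    """Keep one point per voxel of side `tolerance` (a simple decimation sketch)."""
    cells = np.floor(points / tolerance).astype(int)      # voxel index of each point
    _, keep = np.unique(cells, axis=0, return_index=True)  # first point in each voxel
    return points[np.sort(keep)]

cloud = np.random.rand(100_000, 3)      # stand-in for scanned data
print(reduce_points(cloud, 0.05).shape)
```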
In order to reduce the amount of data storage and improve the processing capacity of the system, this paper proposes a new classification method for data sources by combining a phase synchronization model for network clustering with the cloud model. Firstly, the data source is treated as a complex network; after the topology of the network is obtained, the cloud model of each node's data is determined by the fuzzy analytic hierarchy process (AHP). Secondly, by calculating the expectation, entropy, and hyper-entropy of the cloud model, the comprehensive coupling strength is obtained and regarded as the edge weight of the topology. Finally, the distribution curve is obtained by iterating the phase of each node by means of the phase synchronization model. Thus, the classification of the data source is completed. This method can not only provide convenience for the storage, cleaning, and compression of data, but also improve the efficiency of data analysis.
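A hedged sketch of the final step: node phases are iterated under coupling weights (here, the comprehensive coupling strengths) until their distribution separates into groups. The Kuramoto-style update rule below is an assumption about the phase synchronization model, not taken from the paper.

```python
import numpy as np

def iterate_phases(weights: np.ndarray, steps: int = 200, dt: float = 0.05) -> np.ndarray:
    """Iterate node phases coupled through the weighted network."""
    n = weights.shape[0]
    theta = np.random.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        diff = np.sin(theta[None, :] - theta[:, None])   # pairwise phase differences
        theta = theta + dt * (weights * diff).sum(axis=1)
    return np.mod(theta, 2 * np.pi)

# Toy coupling strengths: nodes 0 and 1 are strongly coupled, node 2 weakly.
W = np.array([[0.0, 0.9, 0.1],
              [0.9, 0.0, 0.1],
              [0.1, 0.1, 0.0]])
print(iterate_phases(W))   # strongly coupled nodes end up with similar phases
```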
Assimilation configurations have significant impacts on analysis results and subsequent forecasts. A squall line system that occurred on 23 April 2007 over southern China was used to investigate the impacts of the data assimilation frequency of radar data on analyses and forecasts. A three-dimensional variational system was used to assimilate radial velocity data, and a cloud analysis system was used for reflectivity assimilation with a 2-h assimilation window covering the initial stage of the squall line. Two operators of radar reflectivity for cloud analyses, corresponding to single- and double-moment schemes, were used. In this study, we examined the sensitivity of assimilation frequency using 10-, 20-, 30-, and 60-min assimilation intervals. The results showed that analysis fields were not consistent with model dynamics and microphysics in general; thus, model states, including dynamic and microphysical variables, required approximately 20 min to reach a new balance after data assimilation in all experiments. Moreover, a 20-min data assimilation interval generally produced better forecasts for both single- and double-moment schemes in terms of equitable threat and bias scores. We conclude that a higher data assimilation frequency can produce a more intense cold pool and rear inflow jets but does not necessarily lead to a better forecast.
With the development of Internet technology and human computing, the computing environment has changed dramatically over the last three decades. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With virtualization technology, cloud computing offers diverse services (such as virtual computing, virtual storage, virtual bandwidth, etc.) to the public by means of a multi-tenancy mode. Although users are enjoying the capabilities of super-computing and mass storage supplied by cloud computing, cloud security remains a hot-spot problem, which is in essence the problem of trust management between data owners and storage service providers. In this paper, we propose a data coloring method based on cloud watermarking to recognize and ensure mutual reputations. The experimental results show that the robustness of the reverse cloud generator can guarantee users' embedded social reputation identifications. Hence, our work provides a reference solution to the critical problem of cloud security.
Advanced cloud computing technology provides cost savings and flexibility of services for users. With the explosion of multimedia data, more and more data owners outsource their personal multimedia data to the cloud. In the meantime, some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information, because the data owners lose control of their data. Recently, this concern has aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption and reversible data hiding in the encrypted domain. In the former, additional bits are extracted after decryption; in the latter, they are extracted before decryption. Meanwhile, a combined scheme is also designed. This paper proposes a privacy-preserving outsourcing scheme for reversible data hiding over encrypted image data in cloud computing, which not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be operated over encrypted images at the different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computation cost of the proposed scheme is acceptable and adjusts to different security levels.
In order to provide a practicable solution to data confidentiality in cloud storage services, a data assured deletion scheme is proposed that achieves fine-grained access control, resistance to hopping and sniffing attacks, data dynamics, and deduplication. In our scheme, data blocks are encrypted by a two-level encryption approach, in which the control keys are generated from a key derivation tree, encrypted by an All-Or-Nothing algorithm, and then distributed into a DHT network after being partitioned by secret sharing. This guarantees that only authorized users can recover the control keys and then decrypt the outsourced data within an owner-specified data lifetime. Besides confidentiality, data dynamics and deduplication are also achieved, separately, by adjustment of the key derivation tree and by convergent encryption. The analysis and experimental results show that our scheme can satisfy its security goal and perform assured deletion at low cost.
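The key-distribution idea can be sketched with a simple n-of-n XOR split of a control key into shares, which become useless once any share expires from the DHT; the paper's scheme uses threshold secret sharing and a key derivation tree, which this sketch does not reproduce.

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list:
    """n-of-n split: all n shares are required to recover the key."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, key))   # last share makes the XOR equal the key
    return shares

def recover_key(shares: list) -> bytes:
    """XOR all shares back together; losing any one share makes this impossible."""
    return reduce(xor_bytes, shares)

control_key = os.urandom(32)
shares = split_key(control_key, 5)     # e.g. pushed to five DHT nodes
assert recover_key(shares) == control_key
```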
The Cloud is increasingly being used to store and process big data for its tenants, and classical security mechanisms using encryption are neither sufficiently efficient nor suited to the task of protecting big data in the Cloud. In this paper, we present an alternative approach which divides big data into sequenced parts and stores them among multiple Cloud storage service providers. Instead of protecting the big data itself, the proposed scheme protects the mapping of the various data elements to each provider using a trapdoor function. Analysis, comparison, and simulation prove that the proposed scheme is efficient and secure for the big data of Cloud tenants.
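A sketch of the placement idea: the data is cut into sequenced parts, each part is stored under an opaque name at one of several providers, and the index-to-provider mapping is derivable only with a tenant secret. The HMAC used below stands in for the paper's trapdoor function, and the provider names are invented.

```python
import hashlib
import hmac

PROVIDERS = ["cloudA", "cloudB", "cloudC"]   # illustrative provider names

def place_parts(data: bytes, part_size: int, key: bytes) -> dict:
    """Split data into sequenced parts and derive a keyed part-to-provider mapping."""
    parts = [data[i:i + part_size] for i in range(0, len(data), part_size)]
    placement = {}
    for idx, part in enumerate(parts):
        tag = hmac.new(key, idx.to_bytes(8, "big"), hashlib.sha256).digest()
        provider = PROVIDERS[tag[0] % len(PROVIDERS)]
        placement[idx] = (provider, tag.hex())   # part would be uploaded under the opaque tag
    return placement                             # kept secret by the tenant

print(place_parts(b"x" * 10_000, 4096, key=b"tenant-secret"))
```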
How to effectively reduce the energy consumption of large-scale data centers is a key issue in cloud computing. This paper presents a novel low-power task scheduling algorithm (L3SA) for large-scale cloud data centers. The winner tree is introduced so that the data nodes become the leaf nodes of the tree, and the final winner is selected with the purpose of reducing energy consumption. The complexity of large-scale cloud data centers is fully considered, and a task comparison coefficient is defined to make the task scheduling strategy more reasonable. Experiments and performance analysis show that the proposed algorithm can effectively improve node utilization and reduce the overall power consumption of the cloud data center.
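The winner-tree selection can be sketched as a tournament over the data nodes, where pairs compete on an energy-related score and the overall winner receives the task; the scoring function below is a placeholder rather than the paper's task comparison coefficient.

```python
def winner(nodes: list, score) -> dict:
    """Tournament reduction over leaf nodes; the lowest-scoring node wins the task."""
    layer = list(nodes)
    while len(layer) > 1:
        nxt = []
        for i in range(0, len(layer), 2):
            nxt.append(min(layer[i:i + 2], key=score))   # each pair's winner advances
        layer = nxt
    return layer[0]

nodes = [{"id": 1, "util": 0.7}, {"id": 2, "util": 0.3},
         {"id": 3, "util": 0.5}, {"id": 4, "util": 0.9}]
print(winner(nodes, score=lambda n: n["util"]))   # node 2 receives the task
```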
Cloud storage is one of the main applications of cloud computing. With data services in the cloud, users are able to outsource their data to the cloud, and to access and share their outsourced data from the cloud server anywhere and anytime. However, this new paradigm of data outsourcing services also introduces new security challenges, among which is how to ensure the integrity of the outsourced data. Although cloud storage providers commit to a reliable and secure environment, the integrity of data can still be damaged owing to human carelessness, hardware/software failures, or attacks from external adversaries. Therefore, it is of great importance for users to audit the integrity of the data they outsource to the cloud. In this paper, we first design an auditing framework for cloud storage and propose an algebraic-signature-based remote data possession checking protocol, which allows a third party to audit the integrity of the outsourced data on behalf of the users and supports an unlimited number of verifications. We then extend our auditing protocol to support data dynamic operations, including data update, data insertion, and data deletion. The analysis and experiment results demonstrate that our proposed schemes are secure and efficient.
To achieve the Sustainable Development Goals (SDGs), high-quality data are needed to inform the formulation of policies and investment decisions, to monitor progress towards the SDGs, and to evaluate the impacts of policies. However, the data landscape is changing. With emerging big data and cloud-based services, there are new opportunities for data collection, influencing both official data collection processes and the operation of the programmes they monitor. This paper uses cases and examples to explore the potential of crowdsourcing and public earth observation (EO) data products for monitoring and tracking the SDGs. This paper suggests that cloud-based services that integrate crowdsourcing and public EO data products provide cost-effective solutions for monitoring and tracking the SDGs, particularly for low-income countries. The paper also discusses the challenges of using cloud services and big data for SDG monitoring. Validation and quality control of public EO data are very important; otherwise, the user will be unable to assess the quality of the data or use it with confidence.
Chinese FengYun-2C (FY-2C) satellite data were combined into the Local Analysis and Prediction System (LAPS) model to obtain three-dimensional cloud parameters and rain content. These parameters analyzed by LAPS were used to initialize the Global/Regional Assimilation and Prediction System (GRAPES) model in China to predict precipitation in a rainstorm case in the country. Three prediction experiments were conducted and used to investigate the impacts of FY-2C satellite data on the cloud analysis of LAPS and on short-range precipitation forecasts. In the first experiment, the initial cloud fields were set to zero. In the second, the initial cloud fields were the cloud liquid water, cloud ice, and rain content derived from LAPS without combining the satellite data. In the third experiment, the initial cloud fields were the cloud liquid water, cloud ice, and rain content derived from LAPS including the satellite data. The results indicated that combining the FY-2C satellite data in LAPS can show more realistic cloud distributions, and the model simulation of precipitation in 1–6 h had certain improvements over that obtained when satellite data and complex cloud analysis were not applied.