An objective and accurate classification model or method for cloud images is a prerequisite for accurate weather monitoring and forecasting, which in turn helps guarantee the safety of aircraft takeoff, landing, and flight. Thresholding is a simple and effective method of cloud classification that can realize automated ground-based cloud detection and cloudage observation. The existing segmentation methods based on a fixed threshold or a single threshold cannot achieve good segmentation results, so it is difficult to obtain accurate cloud detection and cloudage observation. In view of these problems, multi-thresholding methods for ground-based cloud images based on exponential entropy/exponential gray entropy and uniform searching particle swarm optimization (UPSO) are proposed. Exponential entropy and exponential gray entropy make up for the defects of undefined and zero values in Shannon entropy. In addition, exponential gray entropy reflects the relative uniformity of gray levels within the cloud cluster and the background cluster. Cloud regions and background regions of different gray-level ranges can be distinguished more precisely using the multi-thresholding strategy. To reduce the computational complexity of the original exhaustive algorithm for multi-threshold selection, the UPSO algorithm is adopted; it finds the optimal thresholds quickly and accurately, so real-time segmentation of ground-based cloud images can be realized. The experimental results show that, in comparison with existing ground-based cloud image segmentation methods and the multi-thresholding method based on maximum Shannon entropy, the proposed methods extract the boundary shape, texture, and detail features of clouds more clearly. Therefore, the accuracies of cloudage detection and morphology classification for ground-based clouds are both improved.
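The entropy criterion can be sketched as follows. This is a minimal two-threshold version assuming the Pal-Pal form of exponential entropy, H = sum(p_i * exp(1 - p_i)); the exhaustive scan shown here is precisely the step the paper replaces with UPSO, and the class labels are illustrative.

```python
import numpy as np

def exponential_entropy(p):
    # Pal-Pal exponential entropy H = sum(p_i * exp(1 - p_i)) of a
    # normalized distribution; every term is defined and finite even
    # when p_i = 0, unlike Shannon's -p*log(p) term.
    p = p[p > 0]
    if p.size == 0:
        return 0.0
    p = p / p.sum()
    return float(np.sum(p * np.exp(1.0 - p)))

def best_two_thresholds(hist):
    # Exhaustive O(L^2) scan over threshold pairs (t1 < t2), maximizing
    # the summed exponential entropy of the three gray-level classes.
    # This brute-force search is what the paper speeds up with UPSO.
    hist = np.asarray(hist, dtype=float)
    hist = hist / hist.sum()
    L = len(hist)
    best_h, best_t = -np.inf, (1, 2)
    for t1 in range(1, L - 1):
        for t2 in range(t1 + 1, L):
            h = (exponential_entropy(hist[:t1])
                 + exponential_entropy(hist[t1:t2])
                 + exponential_entropy(hist[t2:]))
            if h > best_h:
                best_h, best_t = h, (t1, t2)
    return best_t
```

Pixels below t1, between t1 and t2, and above t2 would then be labeled as, for instance, sky, thin cloud, and thick cloud.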
The degree of spatial similarity plays an important role in map generalization, yet there has been no quantitative research into it. To fill this gap, this study first defines map scale change and spatial similarity degree/relation in multi-scale map spaces and then proposes a model for calculating the degree of spatial similarity between a point cloud at one scale and its generalized counterpart at another scale. After validation, the new model features 16 points with map scale change as the x coordinate and the degree of spatial similarity as the y coordinate. Finally, using an application for curve fitting, the model achieves an empirical formula that can calculate the degree of spatial similarity using map scale change as the sole independent variable, and vice versa. This formula can be used to automate algorithms for point feature generalization and to determine when to terminate them during generalization.
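As an illustration of the curve-fitting step, the sketch below fits a logarithmic relation between map scale change and similarity degree and then inverts it; the functional form and any sample points are assumptions, not the paper's 16 validated points or its actual empirical formula.

```python
import numpy as np

def fit_similarity_curve(scale_change, similarity):
    # Least-squares fit of S = a + b*ln(c), where c is the map scale
    # change and S the degree of spatial similarity. The logarithmic
    # form is an illustrative choice, not necessarily the paper's.
    b, a = np.polyfit(np.log(scale_change), similarity, 1)
    return a, b

def similarity_of(a, b, c):
    # Forward formula: similarity degree from scale change.
    return a + b * np.log(c)

def scale_change_of(a, b, s):
    # Inverse formula ("and vice versa"): solve s = a + b*ln(c) for c.
    return np.exp((s - a) / b)
```

Once a and b are fitted, either quantity can be computed from the other, which is what lets a generalization algorithm decide when to stop.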
This paper presents an automated method for discontinuity trace mapping using three-dimensional point clouds of rock mass surfaces. Specifically, the method consists of five steps: (1) detection of trace feature points by normal tensor voting theory, (2) contraction of trace feature points, (3) connection of trace feature points, (4) linearization of trace segments, and (5) connection of trace segments. A sensitivity analysis was then conducted to identify the optimal parameters of the proposed method. Three field cases, a natural rock mass outcrop and two excavated rock tunnel surfaces, were analyzed using the proposed method to evaluate its validity and efficiency. The results show that the proposed method is more efficient and accurate than the traditional trace mapping method, and the efficiency enhancement becomes more pronounced as the number of feature points increases.
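Step (1) can be illustrated with a simplified stand-in: instead of normal tensor voting, the sketch below flags points whose local PCA "surface variation" (the smallest eigenvalue's share of the neighborhood covariance) is high, capturing the same intuition that trace feature points sit on sharp surface features. The neighborhood size and threshold are assumptions.

```python
import numpy as np

def detect_trace_feature_points(points, k=8, curv_thresh=0.1):
    # Flag points whose k-nearest-neighbor patch is far from planar.
    # Surface variation = lambda_min / (lambda_0 + lambda_1 + lambda_2)
    # of the neighborhood covariance: ~0 on flat faces, large on edges.
    # A cheap proxy for the paper's normal tensor voting, not the method.
    n = len(points)
    flags = np.zeros(n, dtype=bool)
    for i in range(n):
        d = np.linalg.norm(points - points[i], axis=1)
        nbrs = points[np.argsort(d)[:k]]        # includes the point itself
        evals = np.sort(np.linalg.eigvalsh(np.cov(nbrs.T)))
        total = evals.sum()
        if total > 0 and evals[0] / total > curv_thresh:
            flags[i] = True
    return flags
```

On a planar patch the smallest eigenvalue is zero, so nothing is flagged; along a crease or edge the neighborhood is genuinely three-dimensional and the point is kept as a trace feature candidate.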
The Western Yunnan Earthquake Prediction Test Site, set up jointly by the China Earthquake Administration, the National Science Foundation of the United States, and the United States Geological Survey, has played an important role in the development of early earthquake research in China. Due to various objective reasons, most of the prediction targets of the test site have not been achieved, and its development has been hindered. In recent years, the site has been reconsidered and renamed the "Earthquake Science Experimental Site". Combined with the current development of seismology and the practical needs of disaster prevention and mitigation, we propose adding the "Underground Cloud Map" as a new direction for the experimental site. Using highly repeatable, environmentally friendly, and safe airgun sources, we can emit constant seismic signals, realizing continuous monitoring of subsurface velocity changes. Utilizing the high-resolution 3-D crustal structure from ambient noise tomography, we can obtain 4-D (3-D space + 1-D time) images of subsurface structures, which we term the "Underground Cloud Map". The "Underground Cloud Map" can reflect underground velocity and stress changes, providing new means for nationwide earthquake monitoring and forecasting, and promoting the conversion of experience-based earthquake prediction to physics-based prediction.
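The repeated-source monitoring idea can be illustrated with the standard waveform-stretching estimate of a velocity change between a reference airgun record and a later repeat; this is a generic sketch of the technique, not the experimental site's actual processing chain, and all signal parameters are invented.

```python
import numpy as np

def stretching_dvv(ref, cur, t, eps_grid=None):
    # Stretching method: a small homogeneous velocity change dv/v shows
    # up as a stretch of the time axis of the repeated-source waveform
    # (dv/v = -dt/t). Scan candidate stretch factors eps and keep the
    # one that best correlates the stretched repeat with the reference.
    if eps_grid is None:
        eps_grid = np.linspace(-0.05, 0.05, 201)
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        cur_i = np.interp(t * (1.0 + eps), t, cur)
        # ignore the tail, where t*(1+eps) can run past the record end
        cc = np.corrcoef(ref[:-50], cur_i[:-50])[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps
```

Because the airgun source is highly repeatable, the waveform change between surveys can be attributed to the medium, which is what makes continuous 4-D monitoring feasible.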
The long-awaited cloud computing concept is now a reality due to the transformation of computer generations. However, security challenges have become the biggest obstacle to the advancement of this emerging technology. A well-established policy framework is defined in this paper to generate security policies that are compliant with requirements and capabilities. Moreover, a federated policy management schema is introduced, based on the policy definition framework and a multi-level policy application, to create and manage virtual clusters with identical or common security levels. The proposed model consists of a well-established ontology of security mechanisms, a procedure that classifies nodes with common policies into virtual clusters, a policy engine that enhances the process of mapping requests to a specific node and its associated cluster, and a matchmaker engine that eliminates inessential mapping processes. The suggested model has been evaluated in terms of performance and security parameters to prove the efficiency and reliability of this multilayered engine in cloud computing environments during policy definition, application, and mapping procedures.
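The clustering and matchmaking steps can be sketched as follows, with all node and policy names invented for illustration: nodes sharing an identical policy set form one virtual cluster, and a request is then matched against clusters rather than individual nodes.

```python
from collections import defaultdict

def build_virtual_clusters(nodes):
    # Group nodes whose security policy sets are identical into virtual
    # clusters, so a request is matched once per cluster instead of
    # once per node. `nodes` maps node name -> set of policy names.
    clusters = defaultdict(list)
    for name, policies in nodes.items():
        clusters[frozenset(policies)].append(name)
    return dict(clusters)

def match_request(clusters, required):
    # Matchmaker: return the member lists of clusters whose policy set
    # covers the request, skipping per-node checks in all other clusters.
    req = set(required)
    return [members for policies, members in clusters.items()
            if req <= policies]
```

Eliminating per-node checks inside non-matching clusters is the "inessential mapping" the matchmaker engine is meant to avoid.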
Cloud computing has created a paradigm shift that affects the way business applications are developed. Many business organizations use cloud infrastructures as platforms on which to deploy business applications. Increasing numbers of vendors are supplying the cloud marketplace with a wide range of cloud products, offered in different formats, and the cost structures for consuming these products can be complex. Finding a suitable set of cloud products that meets an application's requirements and budget can therefore be a challenging task. In this paper, an ontology-based resource mapping mechanism is proposed. Domain-specific ontologies are used to specify high-level application requirements. These are then translated into high-level infrastructure ontologies, which in turn can be mapped onto low-level descriptions of cloud resources. Cost ontologies are proposed for cloud resources. An exemplar media transcoding and delivery service is studied to illustrate how high-level requirements can be modeled and mapped onto cloud resources within a budget constraint. The proposed ontologies provide an application-centric mechanism for specifying cloud requirements, which can then be used to search for suitable resources in a multi-provider cloud environment.
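A drastically simplified view of the final mapping step: given candidate offerings annotated with capacities and prices (all names and figures invented for illustration, not a real provider catalog), keep those that satisfy the translated low-level requirements within budget and return the cheapest.

```python
def map_requirements(requirements, offerings, budget):
    # Keep offerings that meet every numeric requirement (e.g. vcpus,
    # ram_gb) and fit within the budget, then return the cheapest one.
    # A stand-in for the paper's ontology-driven mapping and cost model.
    feasible = [
        o for o in offerings
        if o["price"] <= budget
        and all(o.get(k, 0) >= v for k, v in requirements.items())
    ]
    return min(feasible, key=lambda o: o["price"]) if feasible else None
```

In the full mechanism the requirement keys would come from the infrastructure ontology and the prices from the cost ontology, rather than being hard-coded fields.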
Star sensors are an important means of autonomous navigation and access to space information for satellites, and they have been widely deployed in the aerospace field; research on the data they produce therefore has important application value. To satisfy the requirements for high resolution, timeliness, and confidentiality of star images, we propose an edge computing algorithm based on the star sensor cloud, in which multiple sensors cooperate to form a sensor cloud that extends the performance of a single sensor. First, a star point extraction model is proposed based on a fuzzy set model, derived from an analysis of star image composition, which reduces the amount of data to be computed. Then, a mapping model between content and space is constructed to achieve low-rank image representation and efficient computation. Finally, the data collected by the wireless sensors are delivered to the edge server, where a different method is used to achieve privacy protection. Only a small amount of core data is stored on edge and local servers; the rest is transmitted to the cloud. Experiments show that the proposed algorithm effectively reduces communication and storage costs and provides strong privacy protection.
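The star point extraction step can be sketched with a simple fuzzy membership over pixel intensity; the ramp shape and cutoff values below are assumptions for illustration, not the paper's fuzzy set model.

```python
import numpy as np

def star_membership(img, low, high):
    # Fuzzy membership of "pixel belongs to a star", ramping linearly
    # from 0 at intensity `low` to 1 at `high`. The ramp shape and the
    # two cutoffs are assumed, not taken from the paper.
    return np.clip((img.astype(float) - low) / (high - low), 0.0, 1.0)

def extract_star_points(img, low=50, high=200, cut=0.5):
    # Keep only coordinates whose membership exceeds `cut`. Since star
    # images are mostly dark background, this is the data-reduction step
    # that lowers the computation delivered to the edge server.
    mu = star_membership(img, low, high)
    ys, xs = np.nonzero(mu > cut)
    return list(zip(ys.tolist(), xs.tolist()))
```

Only the retained coordinates (and their intensities) would need to travel beyond the sensor, which is how the extraction model cuts communication and storage cost.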
Funding: Supported by the National Natural Science Foundation of China (60872065), the Open Foundation of the Key Laboratory of Meteorological Disaster of the Ministry of Education at Nanjing University of Information Science & Technology (KLME1108), and the Priority Academic Program Development of Jiangsu Higher Education Institutions.
Funding: Funded by the Natural Science Foundation Committee, China (41364001, 41371435).
Funding: Supported by the Special Fund for Basic Research on Scientific Instruments of the National Natural Science Foundation of China (Grant No. 4182780021), the Emeishan-Hanyuan Highway Program, and the Taihang Mountain Highway Program.
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 41790463 and 41674058).
Funding: Supported by the Science and Technology Rising Star of Shaanxi Youth program (No. 2021KJXX-61) and the Open Project Program of the State Key Lab of CAD&CG, Zhejiang University (No. A2206).