Funding: Supported by the Scientific Research Projects Coordination Unit of Istanbul Teknik Üniversitesi [Grant No. MYL-2018-41706].
Abstract: Cloud computing enables computations, analysis tasks, and service sharing to be performed in web-based computing centres instead of on local desktop systems. Geographic information systems (GIS) applications are among the most common uses of cloud computing. Although desktop GIS products are still widely used in the community, Web GIS and Cloud GIS applications have drawn attention and become more efficient for users. In this study, a serverless Cloud GIS framework is implemented for a land valuation platform. To store, analyse, and share geospatial data, an Aurora Serverless PostgreSQL database is created on Amazon Web Services (AWS). While adopting Aurora Serverless PostgreSQL as the database management system, a simple point-in-polygon analysis was conducted to compare its performance with an Amazon Relational Database Service (RDS) instance. Results showed that the serverless database responded to the query faster and scaled up during high workload to decrease latency. Accordingly, parcel vector data, which carries ownership information and land value attributes, is shared directly from the PostGIS database as vector tiles. In addition, S3 and AWS Lambda services are used for storing and disseminating raster-based land value map tiles. To visualize all shared data and maps in a web browser, the open-source web mapping library Mapbox GL JS is used.
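The benchmark above runs point-in-polygon queries against PostGIS. As an illustration only, the underlying geometric test can be sketched in plain Python with the standard ray-casting (even-odd) rule; the function name and sample coordinates here are hypothetical and not taken from the study, which performs this test inside the database:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: cast a ray to the right from (x, y) and
    count crossings with polygon edges; an odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Consider only edges that straddle the horizontal line through y
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical parcel boundary (a unit square) and two query points
parcel = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_in_polygon(0.5, 0.5, parcel))  # True: point inside the parcel
print(point_in_polygon(1.5, 0.5, parcel))  # False: point outside the parcel
```

In PostGIS the same containment check is expressed declaratively (e.g. with spatial predicates over geometry columns), and the comparison in the study measures how fast each database configuration answers such queries rather than the algorithm itself.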
Funding: Research reported is supported by NSF (CSR-1117300 and IIP-1160979), NASA (NNX07AD99G), and Microsoft Research.
Abstract: The simulation and potential forecasting of dust storms are of significant interest to public health and the environmental sciences. Dust storms have interannual variability and are typical disruptive events. The computing platform for an operational dust storm forecasting system should support this disruptive pattern by scaling up to enable high-resolution forecasting and massive public access when dust storms occur, and scaling down when no dust storm events occur to save energy and cost. With the capability of providing a large, elastic, and virtualized pool of computational resources, cloud computing has become a new and advantageous computing paradigm for solving scientific problems that traditionally require a large-scale, high-performance cluster. This paper examines the viability of cloud computing for supporting dust storm forecasting. Through a holistic study systematically comparing cloud computing on Amazon EC2 to a traditional high performance computing (HPC) cluster, we find that cloud computing is emerging as a credible solution for (1) supporting dust storm forecasting by spinning up a large group of computing resources in a few minutes to satisfy the disruptive computing requirements of dust storm forecasting, (2) performing high-resolution dust storm forecasting when required, (3) supporting concurrent computing requirements, (4) supporting forecasting of real dust storm events over a large geographic domain, using the Phoenix dust storm of 05 July 2011 as an example, and (5) reducing cost by maintaining low computing support when there are no dust storm events while invoking large amounts of computing resources to perform high-resolution forecasting and respond to a large number of concurrent public accesses.