This paper describes PERCEPOLIS, an educational platform that leverages technological advances, in particular in pervasive computing, to facilitate personalized learning in higher education, while supporting a networked curricular model. Fundamental to PERCEPOLIS is the modular approach to course development. Blended instruction, where students are responsible for perusing certain learning objects outside of class, used in conjunction with the cyberinfrastructure, will allow the focus of face-to-face meetings to shift from lecture to active learning, interactive problem-solving, and reflective instructional tasks. The novelty of PERCEPOLIS lies in its ability to leverage pervasive and ubiquitous computing and communication through intelligent software agents that use a student's academic profile and interests, as well as supplemental information such as his or her learning style, to customize course content. Assessments that gauge the student's mastery of concepts allow self-paced progression through the course. Furthermore, the cyberinfrastructure facilitates the collection of data on student performance and learning at a resolution that far exceeds what is currently available. We believe that such an infrastructure will accelerate the acquisition of knowledge and skills critical to professional engineering practice, while facilitating the study of how this acquisition comes about, yielding insights that may lead to significant changes in pedagogy.
A geospatial cyberinfrastructure is needed to support advanced GIScience research and education activities. However, the heterogeneous and distributed nature of geospatial resources creates enormous obstacles to building a unified and interoperable geospatial cyberinfrastructure. In this paper, we propose the Geospatial Service Web (GSW) to underpin the development of a future geospatial cyberinfrastructure. The GSW improves on traditional spatial data infrastructure by providing highly intelligent geospatial middleware that integrates various geospatial resources over the Internet using interoperable Web service technologies. The development of the GSW focuses on establishing a platform where data, information, and knowledge can be shared and exchanged in an interoperable manner. We describe the conceptual framework and research challenges for the GSW, introduce our recent research toward building one, and present a research agenda.
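The interoperability the GSW builds on comes from open Web service standards such as the OGC Web Map Service (WMS). As a minimal sketch of that style of access, the Python snippet below requests a rendered map from a standards-compliant WMS endpoint; the server URL and layer name are hypothetical placeholders, not services described in the paper.

```python
# Sketch: fetch a map image from any standards-compliant OGC WMS server.
# The endpoint and layer below are invented placeholders.
import requests

WMS_ENDPOINT = "https://example.org/geoserver/wms"  # hypothetical server

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "demo:landcover",       # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "30.0,110.0,31.0,111.0",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "width": 512,
    "height": 512,
    "format": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("landcover.png", "wb") as f:
    f.write(response.content)
```

Because the request is pure standard parameters, the same client code works against any conformant server, which is exactly the interoperability property the GSW middleware relies on.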
DesignSafe addresses the challenges of supporting integrative, data-driven research in natural hazards engineering. It is an end-to-end data management, communications, and analysis platform where users collect, generate, analyze, curate, and publish large data sets from a variety of sources, including experiments, simulations, field research, and post-disaster reconnaissance. DesignSafe achieves its key objectives through: (1) integration with high-performance and cloud-computing resources to support the computational needs of the regional risk assessment community; (2) the ability to curate and publish diverse data structures that emphasize relationships and understandability; and (3) facilitation of real-time communications during natural hazards events and disasters for data and information sharing. The resulting services and tools shorten data cycles for resiliency evaluation, risk-model validation, and forensic studies. This article illustrates salient features of the cyberinfrastructure, summarizing its design principles, architecture, and functionalities, with a focus on case studies that show the impact of DesignSafe on the disaster risk community. The Next Generation Liquefaction project collects and standardizes case histories of earthquake-induced soil liquefaction into a relational database on DesignSafe that permits users to interact with the data. Researchers can use DesignSafe to correlate building dynamic characteristics, derived from building sensor data, with observed damage and recorded ground motions. Reconnaissance groups upload, curate, and publish the wind, seismic, and coastal damage data they gather during field missions, so these datasets are available shortly after a disaster. As part of the education and community outreach efforts of DesignSafe, training materials and collaboration space are also offered to the disaster risk management community.
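The abstract does not publish the Next Generation Liquefaction schema, so the sketch below only illustrates the kind of relational correlation it describes: joining sensor-derived building characteristics with ground motions and observed damage. All table and column names are invented, and the database file is a placeholder.

```python
# Hypothetical sketch of a DesignSafe-style relational query; the schema
# (buildings, ground_motions, damage_obs) is invented for illustration.
import sqlite3

conn = sqlite3.connect("reconnaissance.db")  # placeholder database file
query = """
SELECT b.building_id,
       b.fundamental_period_s,   -- identified from building sensor data
       g.peak_ground_accel_g,    -- from nearby ground-motion recordings
       d.damage_state            -- field-observed damage classification
FROM buildings b
JOIN ground_motions g ON g.site_id = b.site_id
JOIN damage_obs d     ON d.building_id = b.building_id
WHERE g.event_name = ?
ORDER BY g.peak_ground_accel_g DESC;
"""
for row in conn.execute(query, ("2011 Tohoku",)):  # hypothetical event label
    print(row)
```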
For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response, and decision-making. This is especially true for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications: their rich geometry and property information facilitates the development of interactive, efficient, and intelligent data analysis and visualization applications. However, the big-data characteristics of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. The strategy combines: (1) pre-generated and on-the-fly generalization, which automatically determines the proper simplification level by introducing an appropriate distance tolerance to speed up simplification; (2) a progressive attribute transmission method that reduces data size and, therefore, service response time; and (3) compressed data transmission with dynamic selection of the compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance gains were achieved. We expect this work to facilitate real-time spatial feature sharing, visual analytics, and decision-making.
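The abstract leaves the simplification routine unnamed; the classic Douglas-Peucker algorithm is one common way a distance tolerance drives generalization, and the self-contained sketch below shows how the tolerance controls which vertices survive.

```python
# Illustrative Douglas-Peucker line simplification; not necessarily the
# routine used in the paper, but a standard tolerance-driven generalizer.
from math import hypot

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return hypot(px - ax, py - ay)
    # parallelogram area divided by base length gives the height
    return abs(dy * (px - ax) - dx * (py - ay)) / hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Drop vertices whose deviation from the chord is below tolerance."""
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]          # whole span collapses
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right                    # avoid duplicating the pivot

coastline = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(coastline, tolerance=1.0))
```

A larger tolerance yields a coarser, smaller geometry, which is how a service can trade visual fidelity for transmission speed on the fly.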
1 Key concepts underpinning geo-data science: Geoinformatics and Geomathematics. Computers have been used for data collection, management, analysis, and transmission in geoscience for about 70 years, since the 1950s (Merriam, 2001, 2004). The term geoinformatics is widely used to describe such activities, and in real-world practice researchers in both geography and geoscience use it.
The geospatial sciences face grand information technology (IT) challenges in the twenty-first century: data intensity, computing intensity, concurrent-access intensity, and spatiotemporal intensity. These challenges require a computing infrastructure that can: (1) better support the discovery, access, and utilization of data and data processing, relieving scientists and engineers of IT tasks so they can focus on scientific discoveries; (2) provide real-time IT resources to enable real-time applications such as emergency response; (3) deal with access spikes; and (4) provide reliable and scalable service for massive numbers of concurrent users to advance public knowledge. The emergence of cloud computing provides a potential solution: an elastic, on-demand computing platform that integrates observation systems, parameter-extraction algorithms, phenomena simulations, analytical visualization, and decision support, and that delivers social impact and user feedback, the essential elements of the geospatial sciences. We discuss the utilization of cloud computing to support these intensities by reporting on our investigations into how cloud computing could enable the geospatial sciences and how spatiotemporal principles, the kernel of the geospatial sciences, could be utilized to ensure the benefits of cloud computing. Four research examples are presented to analyze how to: (1) search, access, and utilize geospatial data; (2) configure computing infrastructure to enable the computability of intensive simulation models; (3) disseminate and utilize research results for massive numbers of concurrent users; and (4) adopt spatiotemporal principles to support spatiotemporally intensive applications. The paper concludes with a discussion of opportunities and challenges for spatial cloud computing (SCC).
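As a toy illustration of the elasticity argument (ours, not the paper's), the rule below scales worker count with concurrent-request load so that access spikes are absorbed and idle capacity is released; every parameter value is invented.

```python
# Toy proportional-scaling policy of the kind an elastic geospatial
# service might use; thresholds and capacities are illustrative only.
def workers_needed(concurrent_requests, req_per_worker=100,
                   min_workers=2, max_workers=64):
    """Return a worker count proportional to load, with floor and ceiling."""
    needed = -(-concurrent_requests // req_per_worker)  # ceiling division
    return max(min_workers, min(max_workers, needed))

for load in (50, 800, 12000, 200):  # a simulated access spike and recovery
    print(f"{load:>6} concurrent requests -> {workers_needed(load)} workers")
```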
Big Data has emerged in the past few years as a new paradigm providing abundant data and opportunities to improve and/or enable research and decision-support applications of unprecedented value for digital earth applications, including business, science, and engineering. At the same time, Big Data presents challenges for digital earth in storing, transporting, processing, mining, and serving the data. Cloud computing provides fundamental support for addressing these challenges through shared computing resources, including computing, storage, networking, and analytical software; the application of these resources has fostered impressive Big Data advancements. This paper surveys the two frontiers, Big Data and cloud computing, and reviews the advantages and consequences of utilizing cloud computing to tackle Big Data in the digital earth and relevant science domains. Covering a general introduction, sources, challenges, technology status, and research opportunities, the following observations are offered: (i) cloud computing and Big Data enable science discoveries and application developments; (ii) cloud computing provides major solutions for Big Data; (iii) Big Data, spatiotemporal thinking, and various application domains drive the advancement of cloud computing and relevant technologies with new requirements; (iv) the intrinsic spatiotemporal principles of Big Data and the geospatial sciences provide the source for technical and theoretical solutions that optimize cloud computing and Big Data processing; (v) the open availability of Big Data and processing capability poses social challenges of geospatial significance; and (vi) a weave of innovations is transforming Big Data into geospatial research, engineering, and business value. The review closes with future innovations and a research agenda for cloud computing that supports transforming the volume, velocity, variety, and veracity of Big Data into value for local to global digital earth science and applications.
This paper introduces a new concept, distributed geospatial information processing (DGIP), which refers to the processing of geospatial information residing on geographically dispersed computers connected through computer networks, and examines the contribution of DGIP to Digital Earth (DE). DGIP plays a critical role in integrating widely distributed geospatial resources to support a DE envisioned to utilise a wide variety of information. This paper addresses this role from three aspects: 1) sharing Earth data, information, and services through geospatial interoperability, supported by standardisation of contents and interfaces; 2) sharing computing and software resources through a GeoCyberinfrastructure, supported by DGIP middleware; and 3) sharing knowledge within and across domains through ontologies and semantic searches. In view of the long-term research and development required for an operational DE, we discuss the practical contributions that DGIP can be expected to make.
One of the major scientific challenges and societal concerns is making informed decisions to ensure sustainable groundwater availability in the face of deep uncertainty. A major computational requirement associated with this is on-demand computing for risk analysis to support timely decisions. This paper presents a scientific modeling service called 'ModflowOnAzure', which enables large-scale ensemble runs of groundwater flow models to be executed in parallel in the Windows Azure cloud. Several technical issues were addressed, including the conjunctive use of desktop tools in MATLAB to avoid license issues in the cloud, integration of Dropbox with Azure for improved usability ('Drop-and-Compute'), and automated file exchanges between the desktop and the cloud. Two scientific use cases with significant computational speedup are presented. One case, from Arizona, uses six plausible alternative conceptual models and a stochastic streamflow model to evaluate the impacts of different groundwater pumping scenarios. The other, from Texas, performs a global sensitivity analysis of a regional groundwater availability model. Both cases yield uncertainty-analysis results that can inform groundwater planning and sustainability studies.
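A minimal sketch of the ensemble pattern behind 'ModflowOnAzure' follows: many independent groundwater-model runs launched in parallel. Here a local process pool stands in for cloud workers; run_modflow is a hypothetical wrapper, and the scenario directories and an mf2005 (MODFLOW-2005) binary on PATH are assumptions for illustration.

```python
# Sketch of parallel ensemble execution; a process pool stands in for
# cloud VMs, and the scenario directories are hypothetical.
from concurrent.futures import ProcessPoolExecutor
import subprocess

def run_modflow(scenario_dir):
    """Run one ensemble member; assumes an 'mf2005' binary on PATH."""
    result = subprocess.run(["mf2005", "model.nam"], cwd=scenario_dir,
                            capture_output=True, text=True)
    return scenario_dir, result.returncode

if __name__ == "__main__":
    scenarios = [f"runs/scenario_{i:03d}" for i in range(96)]
    with ProcessPoolExecutor(max_workers=16) as pool:
        for name, code in pool.map(run_modflow, scenarios):
            print(name, "ok" if code == 0 else f"failed ({code})")
```

Because the ensemble members share no state, the same pattern scales from a workstation pool to cloud instances, which is what makes 'Drop-and-Compute' style services practical.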
Big Earth Data-Cube infrastructures are becoming increasingly popular for providing Analysis Ready Data, especially for managing satellite time series. These infrastructures build on the concept of a multidimensional data model (the data hypercube) and are complex systems engaging different disciplines and expertise. For this reason, their capacity for interoperability has become a challenge in the Global Change and Earth System science domains. To address this challenge, there is a pressing need in the community to reach a widely agreed definition of Data-Cube infrastructures and their key features. In this respect, a discussion has recently started about the possible facets characterizing a Data-Cube in the Earth Observation domain. This manuscript contributes to that debate by introducing a view-based model of Earth Data-Cube systems for designing their infrastructural architecture and content schemas, with the final goal of enabling and facilitating interoperability. It introduces six modeling views, each described in terms of its main concerns, principal stakeholders, and applicable patterns. The manuscript draws on the Business Intelligence experience with Data Warehouses and multidimensional 'cubes' alongside the more recent and analogous developments in the Earth Observation domain, and puts forward a set of interoperability recommendations based on the modeling views.
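To make the hypercube model concrete, the sketch below builds a small synthetic time/latitude/longitude cube and performs the two operations an Analysis Ready Data cube is meant to make trivial: label-based slicing and reduction. The xarray library is our choice for illustration; the paper does not prescribe an implementation, and the values are synthetic.

```python
# A minimal data hypercube: synthetic monthly NDVI on a 1-degree grid.
import numpy as np
import pandas as pd
import xarray as xr

cube = xr.DataArray(
    np.random.rand(12, 180, 360).astype("float32"),  # synthetic values
    dims=("time", "lat", "lon"),
    coords={
        "time": pd.date_range("2020-01-01", periods=12, freq="MS"),
        "lat": np.arange(-89.5, 90.5, 1.0),
        "lon": np.arange(-179.5, 180.5, 1.0),
    },
    name="ndvi",
)

# Slice a spatio-temporal subset by label, then reduce along time:
# the core interactions a Data-Cube infrastructure standardizes.
summer_europe = cube.sel(time=slice("2020-06", "2020-08"),
                         lat=slice(35, 60), lon=slice(-10, 30))
print(summer_europe.mean(dim="time").shape)
```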
Geospatial simulation models can help us understand the dynamic aspects of Digital Earth. For implementing high-performance simulation models of complex geospatial problems, grid computing and cloud computing are two promising computational frameworks. This research compares the benefits and drawbacks of both in Web-based frameworks by testing a parallel Geographic Information System (GIS) simulation model, Schelling's residential segregation model. The parallel GIS simulation model was tested on XSEDE (a representative grid computing platform) and Amazon EC2 (a representative cloud computing platform). The test results demonstrate that cloud computing platforms can provide almost the same parallel computing capability as high-end grid computing frameworks. However, cloud computing resources are more accessible to individual scientists, easier to request and set up, and offer a more scalable software architecture for on-demand and dedicated Web services. These advantages may attract more geospatial scientists to cloud computing for the development of Digital Earth simulation models in the future.
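For readers unfamiliar with the benchmark workload, the compact serial sketch below implements one common variant of Schelling's segregation model; the paper's parallel GIS version distributes such grids across XSEDE or EC2 nodes, and all parameters here are illustrative.

```python
# Serial Schelling segregation model on a toroidal grid; one simple
# relocation rule among many variants. Parameters are illustrative.
import random

SIZE, EMPTY_FRAC, THRESHOLD = 30, 0.10, 0.5  # grid side, vacancy rate, tolerance

n_agents = int(SIZE * SIZE * (1 - EMPTY_FRAC)) // 2
cells = [1] * n_agents + [2] * n_agents      # two agent groups
cells += [0] * (SIZE * SIZE - len(cells))    # 0 marks an empty cell
random.shuffle(cells)
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def unhappy(r, c):
    """An agent is unhappy if too few of its occupied neighbors match it."""
    me = grid[r][c]
    if me == 0:
        return False
    same = other = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nb = grid[(r + dr) % SIZE][(c + dc) % SIZE]  # wrap at edges
            if nb == me:
                same += 1
            elif nb != 0:
                other += 1
    total = same + other
    return total > 0 and same / total < THRESHOLD

for step in range(50):                        # relocation sweeps
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(r, c)]
    if not movers:
        break
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] == 0]
    random.shuffle(empties)
    for (r, c), (er, ec) in zip(movers, empties):
        grid[er][ec], grid[r][c] = grid[r][c], 0
print("unhappy agents after", step + 1, "sweeps:",
      sum(unhappy(r, c) for r in range(SIZE) for c in range(SIZE)))
```

Each sweep is independent per cell apart from the relocation step, which is why the model parallelizes naturally across grid partitions on either platform.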
Based on various experiences in developing Geodata Infrastructures (GDIs) for scientific applications, this article proposes the concept of a Scientific GDI that scientists in the environmental and earth sciences can use to share and disseminate their research results and related analysis methods. A Scientific GDI is understood as an approach to tackling the science case in Digital Earth and further enhancing e-science for environmental research. Creating a Scientific GDI that supports the research community in efficiently exchanging data and methods across the scientific disciplines underpinning environmental studies poses numerous challenges for today's GDI developments. The paper summarizes requirements and recommendations for publishing scientific geospatial data and for the functionalities a Scientific GDI should provide. Best practices and open issues for the governance and policies of a Scientific GDI are discussed, concluding with a research agenda for the next decade.
Chemical engineering is entering a new Golden Age of practice, thought, and impact, accompanied by great new opportunities and challenges. Five aspects mark this development: a new abundance of hydrocarbons; the evolution of biology into a molecular science; the ubiquity of powerful computational tools; the trend in manufacturing toward being more process-oriented; and the systems approach that is part of ChE education from its first stages. There are important technical challenges, including technology creation and environmental impact, but just as important are a new appreciation for, and attention to, challenges that require societal dialogues about complexity, uncertainty, and evolving and sometimes contradictory requirements. Crucial to all these impacts is enhancing the identity of the profession, which must be based on recognizing that the core of chemical engineering is applying molecular sciences to create value and advance the quality of life.