A detailed and accurate inventory map of landslides is crucial for quantitative hazard assessment and land planning. Traditional methods relying on change detection and object-oriented approaches have been criticized for their dependence on expert knowledge and subjective factors. Recent advances in high-resolution satellite imagery, coupled with the rapid development of artificial intelligence, particularly data-driven deep learning (DL) algorithms such as convolutional neural networks (CNNs), have provided rich feature indicators for landslide mapping, overcoming previous limitations. In this review paper, 77 representative DL-based landslide detection methods applied in various environments over the past seven years were examined. This study analyzed the structures of different DL networks, discussed five main application scenarios, and assessed both the advances and limitations of DL in geological hazard analysis. The results indicate that the growing number of articles per year reflects increasing interest in landslide mapping with artificial intelligence, with U-Net-based structures gaining prominence due to their flexibility in feature extraction and generalization. Finally, the obstacles facing DL in landslide hazard research are explored on the basis of the above content. Challenges such as black-box operation and sample dependence persist, warranting further theoretical research and guiding the future application of DL in landslide detection.
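The U-Net structures the review highlights owe much of their suitability for landslide mapping to skip connections: encoder features are carried across to the decoder so that fine boundary detail survives downsampling. A minimal pure-Python sketch of that idea follows; the toy patch size, max pooling, and nearest-neighbour upsampling are illustrative assumptions, not the architecture of any reviewed method.

```python
# Sketch of the U-Net skip-connection idea: downsample on the encoder path,
# upsample on the decoder path, then fuse decoder context with the matching
# encoder features so fine spatial detail (e.g. landslide boundaries) is kept.

def max_pool2x2(grid):
    """2x2 max pooling over a 2D grid given as a list of lists."""
    return [[max(grid[i][j], grid[i][j + 1], grid[i + 1][j], grid[i + 1][j + 1])
             for j in range(0, len(grid[0]), 2)]
            for i in range(0, len(grid), 2)]

def upsample2x2(grid):
    """Nearest-neighbour 2x upsampling of a 2D grid."""
    out = []
    for row in grid:
        doubled = [v for v in row for _ in (0, 1)]
        out.append(doubled)
        out.append(list(doubled))
    return out

def skip_connect(enc, dec):
    """Channel-wise fusion: pair each encoder value with its decoder value."""
    return [[(e, d) for e, d in zip(er, dr)] for er, dr in zip(enc, dec)]

# Toy 8x8 single-band "image patch".
patch = [[i * 8 + j for j in range(8)] for i in range(8)]

enc = max_pool2x2(patch)               # encoder: 8x8 -> 4x4
dec = upsample2x2(max_pool2x2(enc))    # bottleneck + decoder: 4x4 -> 2x2 -> 4x4
fused = skip_connect(enc, dec)         # 4x4 grid of (detail, context) pairs
print(len(fused), len(fused[0]))       # 4 4
```

Each fused cell pairs a high-resolution encoder response with a coarser decoder response, which is the property that makes U-Net-style networks flexible at delineating landslide extents.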
A detailed inspection of roads requires highly detailed spatial data with sufficient precision to deliver an accurate geometry and to describe road defects visually. This paper presents a novel method for the detection of road defects. The input data for road defect detection comprised point clouds and orthomosaics gathered by mobile mapping technology. The defects were categorized into three major groups according to their geometric primitives: points, lines and polygons. Point objects were detected from matched point clouds, panoramic images and orthophotos, while line and polygon geometries were derived directly from orthomosaics and panoramic images. Besides its geometric position, each defect was assigned a set of attributes: defect type, surface material, center of gravity, area, length, corresponding image of the defect and degree of damage. A spatial dataset comprising defect values with matching data types was created so that attribute analysis could be performed quickly and correctly. The final product is a spatial vector dataset, consisting of points, lines and polygons, which contains attributes with further information alongside the geometry. This paper demonstrates that mobile mapping suits large-scale feature extraction of road infrastructure defects. Owing to its simplicity and flexibility, the presented methodology can easily be adapted to extract further feature types with their attributes. This makes the proposed approach a vital tool for data extraction settings with multiple mobile mapping data analysts, e.g., offline crowdsourcing.
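The output data model described above, one vector feature per defect with geometry plus descriptive attributes, can be sketched as follows. The attribute values, the image file name, and the helper function are illustrative assumptions, not the authors' exact schema or tooling.

```python
# Hedged sketch of a road-defect vector feature: polygon geometry plus the
# attribute set named in the abstract (defect type, surface material,
# center of gravity, area, image, degree of damage).

def polygon_area_and_centroid(ring):
    """Area and center of gravity of a closed (x, y) ring, via the
    shoelace formula; the first vertex must be repeated at the end."""
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return abs(a), (cx / (6.0 * a), cy / (6.0 * a))

# A 1 m x 1 m pothole outline in local map coordinates.
ring = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
area, centroid = polygon_area_and_centroid(ring)

pothole = {
    "geometry": {"type": "Polygon", "coordinates": [ring]},
    "attributes": {
        "defect_type": "pothole",         # illustrative values
        "surface_material": "asphalt",
        "center_of_gravity": centroid,
        "area": area,
        "image": "panorama_0001.jpg",     # hypothetical image file name
        "degree_of_damage": 3,
    },
}
print(pothole["attributes"]["area"], pothole["attributes"]["center_of_gravity"])
```

Keeping geometry-derived attributes (area, center of gravity) precomputed on each feature is what allows the attribute analysis to run quickly without re-reading the geometry.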
As the railway industry accelerates its transformation toward digitalization, intelligence and informatization, the traditional data management methods and visualization capabilities used in monitoring subgrade construction, operation and maintenance can no longer meet future needs. To resolve this, an overall construction scheme is proposed for a high-speed railway subgrade monitoring and display platform based on a three-tier Browser/Server (B/S) architecture, achieving efficient integration of monitoring data and engineering data. Sensor entity models are built with Revit, sensor attribute information is extended following the Industry Foundation Classes (IFC) standard, a precise mapping between IFC files and MySQL database tables is implemented with Java as the intermediary, and constraints are set to associate other engineering information. A high-speed railway subgrade model with integrated sensors is created in Building Information Modeling (BIM) software and, after lightweight processing, visualized in the browser with the Three.js engine. By further linking the sensor models with the monitoring information in the database, real-time display of monitoring data and localization of over-limit alarms are realized. The results show that the platform helps relevant personnel quickly grasp monitoring project information and dynamically and accurately assess the state of the subgrade during construction, operation and maintenance, thereby promoting communication and collaboration among stakeholders and supporting decision-making.
The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain in connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly available Geospatial Search Engine (GSE) that utilizes a web crawler built on top of the Google search engine to search the web for geospatial data. The crawler seeding mechanism combines search terms entered by users with predefined keywords that identify geospatial data services. A procedure runs daily to update map server layers and metadata, and to eliminate servers that have gone offline. The GSE supports Web Map Services, ArcGIS services, and websites that offer geospatial data for download. We applied the GSE to search for all available geospatial services under these formats and provide search results, including the spatial distribution of all obtained services. While enhancements to our GSE and to web crawler technology in general lie ahead, our work represents an important step toward realizing the potential of a publicly accessible tool for discovering the global availability of geospatial data.
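The seeding mechanism described above can be sketched in a few lines: user terms are concatenated with per-service keyword strings to form the seed queries handed to the search engine. The keyword strings below are plausible examples of patterns that identify WMS endpoints, ArcGIS REST services, and download pages; they are assumptions, not the paper's exact keyword list.

```python
# Hedged sketch of the GSE crawler's seed-query builder: one query per
# target service type, combining the user's terms with service keywords.
SERVICE_KEYWORDS = {
    "wms":      '"service=WMS" "request=GetCapabilities"',
    "arcgis":   'inurl:"arcgis/rest/services"',
    "download": '"shapefile" OR "geotiff" download',
}

def seed_queries(user_terms):
    """Build one seed query per target geospatial service type."""
    return {svc: f"{user_terms} {kw}" for svc, kw in SERVICE_KEYWORDS.items()}

queries = seed_queries("land cover")
print(queries["wms"])  # land cover "service=WMS" "request=GetCapabilities"
```

Each resulting query string would then be submitted to the underlying search engine, and the returned URLs probed to confirm they really serve geospatial data before being indexed.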
Funding: Supported by the National Key Research and Development Program of China (2021YFB3901205) and the National Institute of Natural Hazards, Ministry of Emergency Management of China (2023-JBKY-57).
Funding: The project presented in the paper is published with the kind permission of the contributor. The original data were provided by the DataDEV Company, Novi Sad, Republic of Serbia. The paper presents part of the research realized within the project "Multidisciplinary theoretical and experimental research in education and science in the fields of civil engineering, risk management and fire safety and geodesy" conducted by the Department of Civil Engineering and Geodesy, Faculty of Technical Sciences, University of Novi Sad.