We propose a method, called the skewness change algorithm, to separate ground points from vegetation points in discrete-return, small-footprint airborne laser scanner (ALS) data. The method, which makes use of the intensity of the laser scanner data, is especially suited to steep, forested areas: it does not need to take the terrain slope into account, whereas other algorithms must model the change of slope in steep forested terrain. The separated ground points and vegetation points can then be used to estimate a digital terrain model (DTM) and fractional vegetation cover, respectively. A few vegetation points that were misclassified as ground points were removed as noise before DTM generation. The method was tested in an area of 10,000 square meters. A LiteMapper 5600 laser system was used, and the flight was carried out at 700-800 m above ground. In the test area, 1546 ground points were measured in the field with TOPCON GTS-602 and GTS-7002 total stations for validation of the DTM; the mean error is -18.5 cm and the root mean square error (RMSE) is ±20.9 cm. A data trap of 4 m diameter was applied to the ALS data to compute fractional vegetation cover. Fractional vegetation cover was validated against 15 hemispherical photographs, georeferenced to centimeter accuracy by differential GPS. The gap fraction was computed from each hemispherical photograph over a 10° range of zenith angles using the Gap Light Analyzer (GLA). The R² of the regression between the ALS-derived fractional vegetation cover and the respective field measurements is 0.7554. This study therefore presents a method for the simultaneous estimation of the DTM and fractional vegetation cover in forested areas from airborne LiDAR height and intensity data.
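The abstract does not spell out the details of the skewness change algorithm, but skewness-based separation of ALS returns is often implemented as an iterative "skewness balancing" of a distribution. The following is a minimal sketch under that assumption, applied here to intensity: points are peeled off the high-intensity tail until the skewness of the remaining sample is no longer positive, and the remaining points are treated as ground. The tail that is peeled, the stopping rule, and the function names are assumptions for illustration; the paper's actual procedure may differ.

    import numpy as np

    def skewness(x):
        # Sample skewness: E[(x - mean)^3] / std^3.
        x = np.asarray(x, dtype=float)
        mu, s = x.mean(), x.std()
        return 0.0 if s == 0 else np.mean((x - mu) ** 3) / s ** 3

    def separate_by_intensity_skewness(intensity):
        # Hypothetical skewness-balancing sketch: repeatedly drop the
        # highest-intensity return while the remaining intensity
        # distribution is still positively skewed.
        intensity = np.asarray(intensity, dtype=float)
        order = np.argsort(intensity)           # ascending intensity
        keep = len(intensity)
        while keep > 3 and skewness(intensity[order[:keep]]) > 0:
            keep -= 1
        ground = np.zeros(len(intensity), dtype=bool)
        ground[order[:keep]] = True             # kept points -> "ground"
        return ground                           # ~ground -> "vegetation"

Once the returns are labelled, fractional vegetation cover inside a 4 m diameter data trap and the reported validation statistics (mean error, RMSE, R²) can be computed along the following lines. The echo-ratio definition of cover and the helper names are likewise illustrative assumptions; the abstract does not specify the exact estimator used.

    import numpy as np

    def fvc_in_trap(xy, is_vegetation, center, diameter=4.0):
        # Fraction of returns labelled as vegetation inside a circular
        # "data trap" centred on a field plot (echo-ratio definition).
        xy = np.asarray(xy, dtype=float)
        veg = np.asarray(is_vegetation, dtype=float)
        dist = np.linalg.norm(xy - np.asarray(center, dtype=float), axis=1)
        inside = dist <= diameter / 2.0
        return veg[inside].mean() if inside.any() else np.nan

    def mean_error_and_rmse(predicted, reference):
        # Signed mean error and RMSE, e.g. DTM heights vs. total-station points.
        e = np.asarray(predicted, dtype=float) - np.asarray(reference, dtype=float)
        return e.mean(), np.sqrt(np.mean(e ** 2))

    def r_squared(observed, fitted):
        # Coefficient of determination for the cover regression.
        observed = np.asarray(observed, dtype=float)
        fitted = np.asarray(fitted, dtype=float)
        ss_res = np.sum((observed - fitted) ** 2)
        ss_tot = np.sum((observed - observed.mean()) ** 2)
        return 1.0 - ss_res / ss_tot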
Funding: Supported by the National State Key Basic Research Project (Grant No. 2007CB714404), the National Natural Science Foundation of China (Grant No. 40871173), the State Key Laboratory of Remote Sensing Science, China (Grant No. 03Q0030449), and the Key Science and Technology R&D Program of Qinghai Province (Grant No. 2006-6-160-01).