Abstract: The purpose of this paper is to analyse the effectiveness of a solution known as the road condition tool (RCT), based on crowdsourced data from smartphone users in the transport system. The tool, developed by the author of the paper, enables identification and assessment of road pavement defects by analysing the dynamics of vehicle motion in the road network. Transport system users equipped with a smartphone running the RCT mobile application record linear acceleration, speed, and vehicle location data and then, without any intervention, send them to the RCT server database in aggregated form. The aggregated data are processed under a combined time-and-location criterion, and a road pavement condition assessment index is estimated for fixed 10 m long measuring sections. These measuring sections correspond to the road sections defined in the pavement management systems (PMS) used by municipal road infrastructure administration bodies. The results obtained by the proposed road pavement condition assessment method were compared with a reference dataset from the road infrastructure administration body, which conducted surveys using highly specialised measuring equipment. The results of this comparison, performed using binary classifiers, confirm the potential of the RCT solution proposed by the author. The solution makes it possible to monitor the road infrastructure condition globally and continuously via the numerous users of the transport system, which keeps the assessment up to date.
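The aggregation step described above can be sketched as follows. This is a hedged illustration only: the abstract does not publish the RCT index formula, so the RMS statistic, the `section_scores`/`classify` helpers, and the 4.0 m/s² defect threshold are all assumptions introduced here for illustration.

```python
# Illustrative sketch (not the paper's actual RCT index): bucket smartphone
# accelerometer samples into fixed 10 m road sections and score each section.
from math import sqrt
from collections import defaultdict

SECTION_LEN_M = 10.0  # fixed section length matching the PMS road sections


def section_scores(samples):
    """samples: iterable of (distance_along_road_m, vertical_accel_ms2)."""
    buckets = defaultdict(list)
    for dist_m, accel in samples:
        buckets[int(dist_m // SECTION_LEN_M)].append(accel)
    # RMS of vertical acceleration per section as a simple condition proxy
    return {sec: sqrt(sum(a * a for a in acc) / len(acc))
            for sec, acc in buckets.items()}


def classify(score, threshold=4.0):
    """Binary pavement label; the threshold is an illustrative assumption."""
    return "defect" if score >= threshold else "ok"
```

In a real deployment the per-section scores would be aggregated server-side across many users before any classification, which is what keeps the assessment current.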
Abstract: The City of Calgary conducted a comparative study of two traffic data collection techniques, Bluetooth sensors and crowdsourcing, for measuring travel time reliability on two goods movement corridors in Calgary, Alberta. To estimate travel time and speed, we used the output of BluFAX sensors, which operate by monitoring Bluetooth signals at several points along a roadway; TomTom historical traffic data were extracted from the TomTom Traffic Stats portal. To quantify travel time reliability, we applied the buffer index and the planning time index recommended by the FHWA (Federal Highway Administration). The Bluetooth traffic data were treated as the benchmark in this study. Unlike the TomTom traffic data, the data provided by the Bluetooth technology met the minimum recommended sample size requirement, although data processing was time consuming and impractical for long study periods. Our results showed that the crowdsourcing technique can be a viable alternative and can provide travel time reliability estimates with reasonable accuracy when adequate numbers of records are registered. However, the TomTom sample sizes in Calgary were not large enough to yield statistically reliable travel time indices. Further research may verify the accuracy of crowdsourcing technologies for travel time studies.
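The two FHWA reliability measures named in the abstract have standard definitions: the buffer index is the 95th-percentile travel time minus the mean travel time, divided by the mean, and the planning time index is the 95th-percentile travel time divided by the free-flow travel time. The sketch below computes both; the nearest-rank percentile estimator is an implementation choice made here, since FHWA defines the indices, not the estimator.

```python
# FHWA travel time reliability measures from a sample of travel times (minutes).
from math import ceil


def percentile(data, p):
    """Nearest-rank percentile, p in (0, 100]."""
    s = sorted(data)
    return s[ceil(p / 100 * len(s)) - 1]


def buffer_index(travel_times):
    """Extra (buffer) time travellers should add, relative to the mean trip."""
    mean_tt = sum(travel_times) / len(travel_times)
    return (percentile(travel_times, 95) - mean_tt) / mean_tt


def planning_time_index(travel_times, free_flow_tt):
    """Total planned travel time as a multiple of free-flow travel time."""
    return percentile(travel_times, 95) / free_flow_tt
```

Both indices grow as the travel-time distribution's upper tail lengthens, which is why adequate sample sizes matter: the 95th percentile is poorly estimated from few records.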
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 61771083, 61704015), the Program for Changjiang Scholars and Innovative Research Team in University (Grant No. IRT1299), the Special Fund of Chongqing Key Laboratory (CSTC), the Fundamental Science and Frontier Technology Research Project of Chongqing (Grant Nos. cstc2017jcyjAX0380, cstc2015jcyjBX0065), the Scientific and Technological Research Foundation of Chongqing Municipal Education Commission (Grant No. KJ1704083), and the University Outstanding Achievement Transformation Project of Chongqing (Grant No. KJZH17117).
Abstract: Fingerprint-based Bluetooth positioning is a popular indoor positioning technology. However, changes in the indoor environment and in Bluetooth anchor locations have a significant impact on signal distribution, which results in a decline in positioning accuracy. The widespread adoption of Bluetooth positioning is limited by the manual effort needed to collect fingerprints with position labels for fingerprint database construction and updating. To address this problem, this paper presents an adaptive fingerprint database updating approach. First, crowdsourced data including Bluetooth Received Signal Strength (RSS) sequences and the speed and heading of the pedestrian are recorded. Second, the recorded crowdsourced data are fused by Kalman Filtering (KF) and then fed into a trajectory validity analysis model that assigns position labels to the unlabeled RSS data, generating candidate fingerprints. Third, once enough candidate fingerprints have been obtained at each Reference Point (RP), Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is applied to both the original and the candidate fingerprints to filter out those identified as noise, and the mean of the fingerprints in the cluster with the largest data volume is selected as the updated fingerprint of the corresponding RP. Finally, extensive experimental results show that, as the number of candidate fingerprints and update iterations increases, fingerprint-based Bluetooth positioning accuracy is effectively improved.
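The third step above (DBSCAN noise filtering, then the mean of the largest cluster) can be sketched in miniature for scalar RSS values at a single RP. This is a hedged illustration: the `eps`/`min_pts` values and the one-dimensional simplification are assumptions made here, not parameters from the paper, which clusters full fingerprints.

```python
# Minimal 1-D DBSCAN over candidate RSS values at one Reference Point,
# followed by the paper's rule: mean of the largest non-noise cluster.

def dbscan_1d(values, eps=2.0, min_pts=3):
    """Return cluster labels (-1 = noise) for scalar values."""
    labels = [-1] * len(values)
    cluster = 0
    for i, v in enumerate(values):
        if labels[i] != -1:
            continue
        neighbors = [j for j, w in enumerate(values) if abs(w - v) <= eps]
        if len(neighbors) < min_pts:
            continue  # not a core point; stays noise unless absorbed later
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            nbrs = [k for k, w in enumerate(values) if abs(w - values[j]) <= eps]
            if len(nbrs) >= min_pts:  # core point: keep expanding the cluster
                seeds.extend(k for k in nbrs if labels[k] == -1)
        cluster += 1
    return labels


def updated_fingerprint(rss_values, eps=2.0, min_pts=3):
    """Mean RSS of the largest cluster (assumes at least one cluster forms)."""
    labels = dbscan_1d(rss_values, eps, min_pts)
    clusters = {}
    for v, lab in zip(rss_values, labels):
        if lab != -1:
            clusters.setdefault(lab, []).append(v)
    biggest = max(clusters.values(), key=len)
    return sum(biggest) / len(biggest)
```

Taking the largest cluster's mean makes the update robust to outlier candidates, e.g. RSS readings recorded while the anchor was briefly occluded.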
Abstract: Weather forecasting is a critical component in predicting and controlling building energy consumption for better building energy management. Without access to other data sources, onsite observed temperatures or airport temperatures are used in forecast models. In this paper, we present a novel approach that utilizes crowdsourced weather data from neighboring personal weather stations (PWS) to improve weather forecast accuracy around buildings, using a general spatial-temporal modeling framework. The final forecast is an ensemble of local forecasts for the target location using neighboring PWSs. Our approach is distinguished from the existing literature in several respects. First, we leverage crowdsourced weather data from PWSs in addition to public data sources; in this way, the data are at much finer time resolution (e.g., 5-minute frequency) and spatial resolution (e.g., arbitrary locations rather than a grid). Second, our model incorporates spatial-temporal correlations of weather variables between the target building and a set of neighboring PWSs, so that the underlying correlations can be effectively captured to improve forecasting performance. We demonstrate the performance of the proposed framework by comparing it with benchmark models on temperature forecasting for a building at an arbitrary location in San Antonio, Texas, USA. In general, the proposed framework, equipped with a machine learning technique such as Random Forest, improves forecasting by 50% compared with a persistence model and has a 90% chance of outperforming the airport forecast in short-term forecasting. In a real-time setting, the proposed framework provides more accurate temperature forecasts than the airport temperature forecast for most forecast horizons. Moreover, we analyze the sensitivity of the model parameters to gain insight into how crowdsourced data from neighboring personal weather stations impact forecasting performance. Finally, we implement the model in other cities, such as Syracuse and Chicago, to test its performance across different landforms and climate types.
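To make the spatial idea concrete, the sketch below shows a deliberately simplified stand-in: an inverse-distance-weighted combination of neighboring PWS readings for the target location, next to the persistence baseline the paper compares against. The actual framework ensembles local forecasts with machine learning (e.g., Random Forest); the `idw_temperature` helper and its `power` parameter are illustrative assumptions, not the paper's model.

```python
# Simplified spatial stand-in for the PWS-based forecast, plus the
# persistence baseline used as a benchmark in the study.

def idw_temperature(neighbors, power=2):
    """neighbors: list of (distance_km, temperature_c) from nearby PWSs."""
    weights = [1.0 / d ** power for d, _ in neighbors]
    return sum(w * t for w, (_, t) in zip(weights, neighbors)) / sum(weights)


def persistence_forecast(history):
    """Persistence baseline: the next value equals the latest observation."""
    return history[-1]
```

The intuition carries over to the full model: nearer stations are more informative about the target location, and any candidate model must at least beat persistence to be worth deploying.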
Abstract: Crowdsourced data can be used to effectively observe environmental and urban ecosystem processes. Integrating data produced by untrained people into flood forecasting models may allow Early Warning Systems (EWS) to perform better while supporting decision-making to reduce the fatalities and economic losses caused by inundation hazards. In this work, we develop a Data Assimilation (DA) method integrating Volunteered Geographic Information (VGI) with a 2D hydraulic model and test its performance. The proposed framework seeks to extend the capabilities of standard DA approaches, which are based on traditional in situ sensors, by assimilating VGI while accounting for the uncertainties related to the quality, location, and timing of the entire set of observational data. The November 2012 flood in the Italian Tiber River basin was selected as the case study. The results show improvements of the model in terms of uncertainty, with a significant persistence of the model update after the integration of the VGI, even when only a few selected observations gathered from social media are used. This should encourage further research on the use of VGI for EWS, considering the exponential increase in the quality and quantity of smartphone and social media users worldwide.
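The core assimilation idea, weighting an uncertain crowdsourced observation against the model's own uncertainty, can be shown with a scalar Kalman-style update. This is a hedged, one-variable analogue: the paper assimilates VGI into a full 2D hydraulic model, and the variances below are invented for illustration, with the larger observation variance encoding the lower quality of VGI relative to in situ sensors.

```python
# Scalar analogue of the DA step: a model water-level estimate (with
# variance) is corrected by a VGI observation with its own variance.

def assimilate(model_level, model_var, vgi_level, vgi_var):
    """Variance-weighted (Kalman-style) update for one state variable."""
    gain = model_var / (model_var + vgi_var)          # trust in the observation
    level = model_level + gain * (vgi_level - model_level)
    var = (1.0 - gain) * model_var                    # posterior uncertainty
    return level, var
```

Because the posterior variance is always smaller than the prior, even a low-quality VGI report reduces uncertainty; it simply moves the estimate less than a trusted gauge reading would.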
Funding: The authors wish to acknowledge the Australian Government for supporting this research through the Research Training Program (RTP), and Monique Potts, ABC Australia, for providing the 2011 Australian Floods Ushahidi Crowdmap data.
Abstract: Volunteered geographic information (VGI) can be considered a subset of crowdsourced data (CSD), and its popularity has recently increased in a number of application areas. Disaster management is one of its key application areas, in which the potential benefits of VGI and CSD are very high. However, quality issues such as credibility, reliability, and relevance limit many of the advantages of utilising CSD. Credibility issues arise because CSD come from a variety of heterogeneous sources, including both professionals and untrained citizens. VGI and CSD are also highly unstructured, and their quality and metadata are often undocumented. In the 2011 Australian floods, the general public and disaster management administrators used the Ushahidi Crowd-mapping platform to communicate flood-related information extensively, including hazards, evacuations, emergency services, road closures, and property damage. This study assessed the credibility of the Australian Broadcasting Corporation's Ushahidi CrowdMap dataset using a Naïve Bayesian network approach based on models commonly used in spam email detection systems. The results reveal that the spam email detection approach is potentially useful for CSD credibility detection, achieving an accuracy of over 90% with a forced classification methodology.
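The spam-detection-style classifier can be sketched as a word-level Naïve Bayes with Laplace smoothing. This is a hedged toy: the training examples, the bag-of-words features, and the `train`/`classify` helpers are invented for illustration; the study itself applied a Naïve Bayesian network to the Ushahidi CrowdMap dataset with its own feature design.

```python
# Toy Naive Bayes credibility classifier in the style of spam filters.
from collections import Counter
from math import log


def train(docs):
    """docs: list of (text, label). Returns word counts, totals, priors, vocab."""
    counts, totals, priors = {}, Counter(), Counter()
    for text, label in docs:
        priors[label] += 1
        bag = counts.setdefault(label, Counter())
        for word in text.lower().split():
            bag[word] += 1
            totals[label] += 1
    vocab = {w for bag in counts.values() for w in bag}
    return counts, totals, priors, vocab


def classify(text, model):
    """Pick the label maximising log P(label) + sum log P(word | label)."""
    counts, totals, priors, vocab = model
    n = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label in priors:
        lp = log(priors[label] / n)
        for word in text.lower().split():
            # Laplace (add-one) smoothing handles unseen words
            lp += log((counts[label][word] + 1) / (totals[label] + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

The "forced classification" methodology mentioned in the abstract corresponds to always returning the most probable label, as `classify` does, rather than abstaining on low-confidence reports.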
Abstract: Producing a land-use map at the regional scale is a computationally heavy task, yet such maps are critical to most landowners, researchers, and decision-makers, enabling them to make informed decisions for varying objectives. There are two major difficulties in generating land classification maps at the regional scale: the need for large data-sets of training points and the expensive computational cost in terms of both money and time. Volunteered Geographic Information (VGI) opens a new era in mapping and visualizing the physical world by providing an open-access database of valuable georeferenced information collected by volunteer citizens. As one of the most well-known VGI initiatives, OpenStreetMap (OSM) contributes not only road network distribution information but also the potential to use these data to justify and delineate land patterns. Whereas most large-scale mapping approaches, including those at regional and national scales, confuse "land cover" and "land-use", or build the land-use database from modeled land cover data-sets, in this study we clearly distinguished land-use from land cover. Focusing on our prime objective of mapping land-use and management practices, we developed a robust regional land-use mapping approach by integrating OSM data with earth observation remote sensing imagery. Our approach incorporates a vital temporal component into large-scale land-use mapping while effectively eliminating the typically burdensome computation and time/money demands of such work. Furthermore, it produced robust results in our study area: the overall internal accuracy of the classifier was 95.2% and its external accuracy was 74.8%.
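One way OSM data can seed training labels for such a classifier is by mapping OSM `landuse` tag values onto the target class scheme. The sketch below shows that step plus the overall-accuracy metric reported above; the `OSM_TO_CLASS` mapping and the helper names are hypothetical, since the paper does not publish its tag-to-class scheme here.

```python
# Hypothetical OSM tag-to-class mapping for harvesting training labels,
# and the overall-accuracy metric used to score the resulting classifier.

OSM_TO_CLASS = {  # illustrative assumption, not the paper's scheme
    "farmland": "agriculture", "orchard": "agriculture",
    "residential": "urban", "industrial": "urban",
    "forest": "forestry", "meadow": "grassland",
}


def training_labels(osm_features):
    """osm_features: list of dicts carrying an OSM 'landuse' tag."""
    return [OSM_TO_CLASS[f["landuse"]]
            for f in osm_features if f.get("landuse") in OSM_TO_CLASS]


def overall_accuracy(predicted, reference):
    """Fraction of samples where the predicted class matches the reference."""
    hits = sum(p == r for p, r in zip(predicted, reference))
    return hits / len(reference)
```

The gap between internal (95.2%) and external (74.8%) accuracy in the study is exactly what `overall_accuracy` would surface when the reference set comes from independent ground truth rather than held-out OSM-derived labels.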