Funding: Supported by the National Natural Science Foundation of China under Grant 42172161, by the Heilongjiang Provincial Natural Science Foundation of China under Grant LH2020F003, by the Heilongjiang Provincial Department of Education Project of China under Grant UNPYSCT-2020144, by the Innovation Guidance Fund of Heilongjiang Province of China under Grant 15071202202, and by the Science and Technology Bureau Project of Qinhuangdao of China under Grant 202101A226.
Abstract: Spatio-temporal heterogeneous data is the basis for decision-making in many fields, and checking its accuracy can provide data support for those decisions. Due to the randomness, complexity, and global and local correlation of spatio-temporal heterogeneous data in the temporal and spatial dimensions, traditional detection methods cannot guarantee both detection speed and accuracy. Therefore, this article proposes a method for detecting the accuracy of spatio-temporal heterogeneous data by fusing graph convolution and temporal convolution networks. Firstly, a geographic weighting function is introduced and improved to quantify the degree of association between nodes and to calculate weighted adjacency values that simplify the complex topology. Secondly, spatio-temporal convolutional units are designed based on graph convolutional neural networks and temporal convolutional networks to improve detection speed and accuracy. Finally, the proposed method is compared with three methods, ARIMA, T-GCN, and STGCN, in real scenarios to verify its effectiveness in terms of detection speed, detection accuracy, and stability. The experimental results show that the RMSE, MAE, and MAPE of this method are the smallest under both simple and complex connectivity, at 13.82/12.08, 2.77/2.41, and 16.70/14.73, respectively. It also has the shortest detection times of 672.31/887.36, respectively. In addition, the evaluation results remain consistent across different processing time periods and complex topology environments, which indicates that the detection accuracy of this method is the highest and that it has good research value and application prospects.
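The abstract does not specify the exact form of the improved geographic weighting function or of the spatio-temporal convolutional unit, so the following is only a minimal Python sketch of the general idea under stated assumptions: a Gaussian distance-decay kernel (a hypothetical stand-in for the improved weighting function, with a bandwidth and truncation threshold chosen purely for illustration) builds the weighted adjacency matrix, which is symmetrically normalized and used for one graph-convolution step over a window of node readings, followed by a causal temporal convolution; RMSE, MAE, and MAPE are computed as in the reported evaluation.

```python
import numpy as np

def geographic_weighted_adjacency(coords, sigma=1.0, threshold=1e-3):
    """Gaussian distance-decay kernel standing in for the improved geographic
    weighting function; tiny weights are truncated to zero to simplify the topology."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist2 = (diff ** 2).sum(axis=-1)
    w = np.exp(-dist2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)
    w[w < threshold] = 0.0
    return w

def normalized_adjacency(a):
    """Symmetric normalization D^-1/2 (A + I) D^-1/2 used by standard GCNs."""
    a_hat = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def temporal_conv(h, kernel):
    """Causal 1-D convolution along the time axis, one node per row."""
    k = len(kernel)
    padded = np.pad(h, ((0, 0), (k - 1, 0)))          # left-pad so the output stays causal
    return np.stack([np.convolve(row, kernel, mode="valid") for row in padded])

def rmse(y, y_hat):
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def mae(y, y_hat):
    return float(np.mean(np.abs(y - y_hat)))

def mape(y, y_hat):
    return float(np.mean(np.abs((y - y_hat) / y)) * 100.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 10, size=(6, 2))          # six nodes placed in 2-D space
    x = rng.uniform(1.0, 10.0, size=(6, 24))          # 24 time steps of readings per node
    a = geographic_weighted_adjacency(coords, sigma=3.0)
    h = np.maximum(normalized_adjacency(a) @ x, 0.0)  # spatial (graph) convolution + ReLU
    y = temporal_conv(h, kernel=np.array([0.5, 0.3, 0.2]))
    print(rmse(x, y), mae(x, y), mape(x, y))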
Abstract: Data is always a crucial issue of concern, especially during its prediction and computation in the digital revolution. This paper helps in providing an efficient learning mechanism for accurate predictability and reducing redundant data communication. It also discusses the Bayesian analysis that finds the conditional probability of at least two parametric-based predictions for the data. The paper presents a method for improving the performance of Bayesian classification using the combination of a Kalman filter and K-means. The method is applied on a small dataset just for establishing the fact that the proposed algorithm can reduce the time for computing the clusters from data. The proposed Bayesian learning probabilistic model is used to check for statistical noise and other inaccuracies using unknown variables. This scenario is implemented using an efficient machine learning algorithm to perpetuate the Bayesian probabilistic approach. It also demonstrates the generative function for the Kalman-filter-based prediction model and its observations. This paper implements the algorithm using the open-source Python platform and efficiently integrates all the different modules into a piece of code via Common Platform Enumeration (CPE) for Python.
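The paper's exact pipeline, dataset, and parameter settings are not reproduced here; the snippet below is only a hedged Python sketch of the combination the abstract describes: a scalar Kalman filter smooths noisy measurements, a tiny K-means routine compresses them into cluster summaries, and a Gaussian naive-Bayes-style posterior is computed from those summaries. All noise parameters and data are illustrative.

```python
import numpy as np

def kalman_1d(z, q=1e-3, r=0.5):
    """Scalar Kalman filter with a constant-state model: process noise q,
    measurement noise r; returns the filtered estimates."""
    x, p = z[0], 1.0
    out = []
    for zk in z:
        p = p + q                      # predict step
        k = p / (p + r)                # Kalman gain
        x = x + k * (zk - x)           # update with measurement zk
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

def kmeans(x, k=2, iters=50, seed=0):
    """Tiny K-means on 1-D data; returns centroids and labels."""
    rng = np.random.default_rng(seed)
    c = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
        c = np.array([x[labels == j].mean() if np.any(labels == j) else c[j]
                      for j in range(k)])
    return c, labels

def gaussian_posterior(value, means, stds, priors):
    """Naive-Bayes-style posterior over clusters for a single value."""
    lik = np.exp(-0.5 * ((value - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    post = priors * lik
    return post / post.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    raw = np.concatenate([rng.normal(3, 0.4, 100), rng.normal(7, 0.4, 100)])
    smoothed = kalman_1d(raw)                          # denoise before clustering
    centers, labels = kmeans(smoothed, k=2)
    means = np.array([smoothed[labels == j].mean() for j in range(2)])
    stds = np.array([smoothed[labels == j].std() + 1e-9 for j in range(2)])
    priors = np.bincount(labels, minlength=2) / len(labels)
    print(gaussian_posterior(5.0, means, stds, priors))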
Abstract: Secure authentication and accurate localization among Internet of Things (IoT) sensors are pivotal for the functionality and integrity of IoT networks. IoT authentication and localization are intricate and symbiotic, impacting both the security and the operational functionality of IoT systems. Hence, accurate localization and lightweight authentication on resource-constrained IoT devices pose several challenges. To overcome these challenges, recent approaches have used encryption techniques with well-known key infrastructures. However, these methods are inefficient due to the increasing number of data breaches in their localization approaches. The proposed research efficiently integrates the authentication and localization processes in such a way that they complement each other without compromising security or accuracy. The proposed framework aims to detect active attacks within IoT networks, precisely localize malicious IoT devices participating in these attacks, and establish dynamic implicit authentication mechanisms. This integrated framework proposes a Correlation Composition Awareness (CCA) model, which explores innovative approaches to device correlations, enhancing the accuracy of attack detection and localization. Additionally, the framework introduces the Pair Collaborative Localization (PCL) technique, facilitating precise identification of the exact locations of malicious IoT devices. To address device authentication, a Behavior and Performance Measurement (BPM) scheme is developed, ensuring that only trusted devices gain access to the network. This work has been evaluated across various environments and compared against existing models. The results show that the proposed methodology attains 96% attack detection accuracy, 84% localization accuracy, and 98% device authentication accuracy.
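The CCA, PCL, and BPM components are only named in the abstract, so no faithful implementation can be given; the sketch below merely illustrates, in Python, the underlying idea of exploiting device correlations for attack detection: it builds a pairwise correlation matrix from device telemetry and flags devices whose average correlation with their peers falls below a threshold. The telemetry shape and the threshold value are assumptions, not parameters from the paper.

```python
import numpy as np

def flag_low_correlation_devices(telemetry, threshold=0.3):
    """telemetry: (num_devices, num_samples) array of per-device metrics.
    Returns the indices of devices whose mean correlation with the other
    devices falls below `threshold` (a hypothetical cut-off)."""
    corr = np.corrcoef(telemetry)                 # pairwise Pearson correlations
    np.fill_diagonal(corr, np.nan)                # ignore self-correlation
    mean_corr = np.nanmean(corr, axis=1)
    return np.flatnonzero(mean_corr < threshold), mean_corr

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.normal(size=200)
    normal_devices = base + rng.normal(scale=0.1, size=(9, 200))  # correlated peers
    attacker = rng.normal(size=(1, 200))                          # uncorrelated outlier
    flagged, scores = flag_low_correlation_devices(np.vstack([normal_devices, attacker]))
    print("suspicious device indices:", flagged)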
Abstract: Recently, the application of Wireless Sensor Networks (WSNs) has been increasing rapidly. This requires privacy-preserving data aggregation protocols to secure the data from compromise. Preserving the privacy of sensor data is a challenging task. This paper presents a non-linear regression-based data aggregation protocol for preserving the privacy of sensor data. The proposed protocol uses non-linear regression functions to represent the sensor data collected from the sensor nodes. Instead of sending the complete data to the cluster head, the sensor nodes send only the coefficients of the non-linear function, which reduces the communication overhead of the network. The data aggregation is performed on the masked coefficients, and the sink node is able to retrieve the approximated results over the aggregated data. The analysis of experimental results shows that the proposed protocol is able to minimize communication overhead, enhance data aggregation accuracy, and preserve data privacy.
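The protocol's actual non-linear model and coefficient-masking scheme are not given in the abstract; the Python sketch below only illustrates the communication-saving idea: each sensor node fits a small regression model to its readings (a cubic polynomial is used here purely as a hypothetical stand-in for the non-linear function) and transmits just the coefficients, from which the sink reconstructs an approximation of the series.

```python
import numpy as np

def fit_coefficients(readings, degree=3):
    """Node side: fit a low-degree polynomial to the sensed series and
    return only its coefficients instead of the raw readings."""
    t = np.arange(len(readings))
    return np.polyfit(t, readings, deg=degree)

def reconstruct(coeffs, num_samples):
    """Sink side: rebuild an approximation of the series from the coefficients."""
    t = np.arange(num_samples)
    return np.polyval(coeffs, t)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    t = np.arange(60)
    readings = 20 + 5 * np.sin(t / 10.0) + rng.normal(scale=0.3, size=60)
    coeffs = fit_coefficients(readings)              # 4 numbers sent instead of 60
    approx = reconstruct(coeffs, len(readings))
    print("max reconstruction error:", float(np.max(np.abs(readings - approx))))
```

Sending four coefficients in place of sixty raw readings in this toy example is where the claimed overhead reduction comes from; the trade-off is the reconstruction error printed at the end.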
Abstract: The objective of traffic accident reconstruction is to recreate the event, which is necessary for analyzing the collision dynamics used as evidence in court cases. Traffic accident reconstruction and a demonstration of the event require precise data pertaining to scene measurement. However, there are differences between the individual measuring tools and methods related to traffic accident investigation, just as there are differences in the extent of their use and their measurement accuracy. The most commonly applied method is the measuring tape, followed by measurements with total stations and laser rangefinders, while photogrammetry is also becoming increasingly important. The advantages and disadvantages of individual tools and methods affect the required number of investigators, portability, measurement range, applicability depending on the amount of light and the weather conditions, the possibility of remote measurement, data collection time, scope, the option to later process the collected data, and above all the accuracy of the gathered data. The latter is crucial for proving the guilt or innocence of traffic accident participants in court, as inaccurate data can lead to an unjust sentence. Measurement accuracy using the above-mentioned tools and methods also varies depending on which ones are used, as well as on other factors.
Funding: Supported by the Promotive Research Fund for Excellent Young and Middle-aged Scientists of Shandong Province (No. BS2010DX010) and the Project of the Higher Educational Science and Technology Program of Shandong Province (No. J12LN36).
Abstract: In this paper, an Adaptive-Weighted Time-Dimensional and Space-Dimensional (AWTDSD) data aggregation algorithm for a clustered sensor network is proposed for prolonging the lifetime of the network as well as improving the accuracy of the data gathered in the network. AWTDSD contains three phases: (1) the time-dimensional aggregation phase, for eliminating data redundancy; (2) the adaptive-weighted aggregation phase, for further aggregating the data as well as improving the accuracy of the aggregated data; and (3) the space-dimensional aggregation phase, for reducing the size and the amount of data transmitted to the base station. AWTDSD utilizes the correlations between the sensed data to reduce data transmission and to increase data accuracy as well. Experimental results show that AWTDSD can not only save almost half of the total energy consumption but also greatly increase the accuracy of the data monitored by the sensors in the clustered network.
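The three phases are described only at a high level, so the following Python sketch is a hedged interpretation rather than the authors' algorithm: the time-dimensional phase drops consecutive readings that differ by less than a tolerance, the adaptive-weighted phase averages the surviving readings with weights inversely proportional to each node's sample variance, and the space-dimensional phase packs the result into one compact record for the base station. The tolerance value and the inverse-variance weighting rule are assumptions.

```python
import numpy as np

def time_dimensional(readings, tol=0.05):
    """Phase 1: drop a reading if it differs from the previously kept one
    by less than `tol` (redundancy elimination along the time axis)."""
    kept = [readings[0]]
    for r in readings[1:]:
        if abs(r - kept[-1]) >= tol:
            kept.append(r)
    return np.array(kept)

def adaptive_weighted(node_series):
    """Phase 2: weight each node inversely to its sample variance so that
    noisier nodes contribute less to the cluster aggregate."""
    means = np.array([s.mean() for s in node_series])
    variances = np.array([s.var() + 1e-9 for s in node_series])
    weights = (1.0 / variances) / (1.0 / variances).sum()
    return float(np.dot(weights, means)), weights

def space_dimensional(cluster_id, aggregate, num_nodes):
    """Phase 3: the cluster head forwards one compact record to the base
    station instead of every node's full series."""
    return {"cluster": cluster_id, "value": aggregate, "nodes": num_nodes}

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    raw = [25 + rng.normal(scale=s, size=100) for s in (0.1, 0.3, 1.0)]
    reduced = [time_dimensional(r, tol=0.05) for r in raw]
    value, weights = adaptive_weighted(reduced)
    print("cluster aggregate:", round(value, 3), "node weights:", np.round(weights, 3))
    print(space_dimensional("cluster-1", value, num_nodes=len(raw)))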
Abstract: Objective: This study aimed to evaluate the system accuracy of four types of blood glucose monitoring systems (BGMSs) and explore the differences in the system accuracy acceptability of each BGMS against five different standards. Methods: The glucose measurement values obtained from four types of BGMSs (Roche Accu-Chek® Performa, Bayer Contour™ TS, Sinomedisite Glupad® H1 Plus, and Sinocare® Gold-Accu) were evaluated against the reference values obtained from the biochemical analyzer of the central laboratory. The system accuracy acceptability of each BGMS was determined using the criteria specified in five standards, namely the International Organization for Standardization (ISO) 15197:2003, Clinical and Laboratory Standards Institute (CLSI) POCT12-A3, ISO 15197:2013, Chinese Society of Laboratory Medicine (CSLM) consensus, and US Food and Drug Administration (FDA) guidelines. Results: From 2018 to 2022, 10,980 pairs of measurement values were obtained from 366 glucose meters of the four types of BGMSs. Significant correlations were observed between the glucose measurement values from the BGMSs and the reference values from the biochemical analyzer of the central laboratory. The correlation coefficient r was 0.995 for Roche Accu-Chek® Performa, 0.994 for Bayer Contour™ TS, 0.983 for Sinomedisite Glupad® H1 Plus, and 0.997 for Sinocare® Gold-Accu. The acceptability criteria specified in ISO 15197:2003 were met by 100.00% (135/135) of the glucose meters of Roche Accu-Chek® Performa, 100.00% (109/109) of Bayer Contour™ TS, 81.61% (71/87) of Sinomedisite Glupad® H1 Plus, and 100.00% (35/35) of Sinocare® Gold-Accu. In contrast, the acceptability criteria specified in ISO 15197:2013 were met by 99.26% (134/135) of the glucose meters of Roche Accu-Chek® Performa, 88.07% (96/109) of Bayer Contour™ TS, 58.62% (51/87) of Sinomedisite Glupad® H1 Plus, and 91.43% (32/35) of Sinocare® Gold-Accu. Conclusions: Among the four types of BGMSs evaluated, the glucose meters of Roche Accu-Chek® Performa exhibited superior system accuracy. The system accuracy acceptability of each BGMS varied significantly against the acceptability criteria specified in the five different standards.
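As a minimal illustration of the kind of check behind these acceptability percentages, the Python sketch below applies the commonly cited ISO 15197:2013 system-accuracy rule (at least 95% of meter results within ±15 mg/dL of the reference when the reference is below 100 mg/dL, and within ±15% at or above it); the limits are parameterized so other standards' criteria can be substituted, the authoritative wording should be taken from each standard itself, and the data generated here are illustrative, not the study's measurements.

```python
import numpy as np

def meets_accuracy_criterion(meter, reference,
                             low_cutoff=100.0, abs_limit=15.0,
                             rel_limit=0.15, required_fraction=0.95):
    """Fraction of readings within the allowed deviation and whether the lot
    passes. Defaults follow the commonly cited ISO 15197:2013 rule (±15 mg/dL
    below 100 mg/dL, ±15% at or above); pass other limits for other standards."""
    meter, reference = np.asarray(meter, float), np.asarray(reference, float)
    allowed = np.where(reference < low_cutoff, abs_limit, rel_limit * reference)
    within = np.abs(meter - reference) <= allowed
    fraction = float(within.mean())
    return fraction, fraction >= required_fraction

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    reference = rng.uniform(60, 300, size=200)            # mg/dL, illustrative only
    meter = reference * (1 + rng.normal(scale=0.05, size=200))
    r = np.corrcoef(meter, reference)[0, 1]               # correlation with lab analyzer
    frac, passed = meets_accuracy_criterion(meter, reference)
    print(f"r = {r:.3f}, within limits = {frac:.1%}, passes = {passed}")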
Funding: Supported in part by the U.S. National Science Foundation (U.S. NSF) through the U.S. NSF/Department of Energy (DOE) Engineering Research Center Program under Award EEC-1041877 for CURENT.
Abstract: Synchrophasor systems, providing low-latency, high-precision, and time-synchronized measurements to enhance power grid performance, are deployed globally. However, the synchrophasor system, as a physical network, involves communication constraints and data quality issues, which can impact or even disable certain synchrophasor applications. This work investigates the data quality issue for synchrophasor applications. In Part I, the standards of synchrophasor systems and the classifications and data quality requirements of synchrophasor applications are reviewed. Also, actual events involving synchronization signal accuracy, synchrophasor data loss, and latency are counted and analyzed. The review and statistics are expected to provide an overall picture of data accuracy, loss, and latency issues for synchrophasor applications.
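As a simple illustration of how data loss and latency can be counted for a synchrophasor stream, the hedged Python sketch below assumes a fixed reporting rate and per-frame send and arrival timestamps; the 30 frames/s rate, the drop probability, and the latency range are illustrative values, not figures from the paper.

```python
import numpy as np

def data_quality_stats(send_times, arrival_times, reporting_rate=30.0):
    """Counts lost frames and summarizes latency for a synchrophasor stream.
    send_times: PMU timestamps of the frames actually received (seconds).
    arrival_times: when each frame reached the application (seconds)."""
    send_times = np.asarray(send_times, float)
    arrival_times = np.asarray(arrival_times, float)
    expected = int(round((send_times[-1] - send_times[0]) * reporting_rate)) + 1
    loss_rate = 1.0 - len(send_times) / expected
    latency = arrival_times - send_times
    return {
        "expected_frames": expected,
        "received_frames": int(len(send_times)),
        "loss_rate": loss_rate,
        "mean_latency_ms": float(latency.mean() * 1e3),
        "p99_latency_ms": float(np.percentile(latency, 99) * 1e3),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    ts = np.arange(0, 10, 1 / 30.0)                        # 30 frames/s for 10 s
    keep = rng.random(ts.size) > 0.02                      # roughly 2% of frames dropped
    sent = ts[keep]
    arrived = sent + rng.uniform(0.02, 0.08, size=sent.size)  # 20-80 ms delivery latency
    print(data_quality_stats(sent, arrived))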