Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time series into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. The data augmentation method is the addition of noise: Gaussian noise with a noise level of 0.002 is added to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time series into images. This enables the identification of patterns in time series data and assists in capturing their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, and the detected anomaly areas are represented as heat maps. By applying an anomaly map to the original image, it is possible to locate the areas where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when time series data are converted to images. Additionally, when the data are processed as images rather than as raw time series, both the data size and the training time are significantly reduced. The proposed method can provide an important springboard for research on anomaly detection using time series data and helps keep the analysis of complex patterns in data lightweight.
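A minimal numpy sketch of the two preprocessing steps described above: Gaussian-noise augmentation with the stated noise level of 0.002, and a quantile-binned Markov Transition Field image. The bin count, the toy series, and the function names are illustrative assumptions rather than the paper's implementation; the resulting image would then be handed to a detector such as PatchCore.

```python
import numpy as np

def augment_with_noise(x, noise_level=0.002, rng=None):
    """Add zero-mean Gaussian noise to a 1-D series (noise level taken from the abstract)."""
    rng = np.random.default_rng() if rng is None else rng
    return x + rng.normal(0.0, noise_level, size=x.shape)

def markov_transition_field(x, n_bins=8):
    """Map a 1-D series to an MTF image: pixel (i, j) is the probability of moving
    from the quantile bin of x[i] to the quantile bin of x[j]."""
    # Assign each point to a quantile bin
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(x, edges)                      # values in 0..n_bins-1
    # Markov transition matrix between adjacent time steps
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(bins[:-1], bins[1:]):
        W[a, b] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)  # row-normalize, avoid division by zero
    # Spread the transition probabilities over all index pairs
    return W[np.ix_(bins, bins)]

# Toy usage: augment, convert to an image, and pass the image to an image-based detector
series = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.05 * np.random.randn(256)
image = markov_transition_field(augment_with_noise(series))
print(image.shape)   # (256, 256)
```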
With the integration of aviation safety and artificial intelligence, research combining risk assessment and artificial intelligence has become particularly important in the field of risk management, but finding an efficient and accurate risk assessment algorithm remains a challenge for the civil aviation industry. Therefore, an improved risk assessment algorithm (PS-AE-LSTM), based on a long short-term memory (LSTM) network with an autoencoder (AE), is proposed to address the problem that various supervised deep learning algorithms in flight safety cannot adequately handle poor-quality risk-level labels. Firstly, based on the normal distribution characteristics of flight data, a probability severity (PS) model is established to enhance the quality of risk assessment labels. Secondly, an autoencoder is introduced to reconstruct the flight parameter data and improve data quality. Finally, exploiting the time-series nature of flight data, a long short-term memory network is used to classify the risk level and improve the accuracy of risk assessment. A risk assessment experiment was conducted on a fleet landing-phase dataset using the PS-AE-LSTM algorithm to assess the risk level associated with aircraft hard landing events. The results show that the proposed algorithm achieves an accuracy of 86.45% compared with seven baseline models and has excellent risk assessment capability.
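A hedged PyTorch sketch of the AE-plus-LSTM structure the abstract describes: an autoencoder reconstructs the flight-parameter vectors, and an LSTM classifies the risk level of the reconstructed sequence. All layer sizes, the number of risk levels, and the module names are hypothetical, and the PS labeling model is not reproduced here.

```python
import torch
import torch.nn as nn

class AEDenoiser(nn.Module):
    """Autoencoder that reconstructs each flight-parameter vector to reduce noise."""
    def __init__(self, n_params, hidden=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_params, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, n_params)

    def forward(self, x):                 # x: (batch, time, n_params)
        return self.decoder(self.encoder(x))

class RiskLSTM(nn.Module):
    """LSTM classifier mapping a reconstructed flight-parameter sequence to a risk level."""
    def __init__(self, n_params, n_levels, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_params, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_levels)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # classify from the last time step

# Toy forward pass: 32 landings, 100 time steps, 12 parameters, 5 risk levels (all hypothetical)
x = torch.randn(32, 100, 12)
ae, clf = AEDenoiser(12), RiskLSTM(12, 5)
logits = clf(ae(x))
print(logits.shape)                       # torch.Size([32, 5])
```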
Various mobile devices and applications are now used in daily life. These devices require high-speed data processing, low energy consumption, low communication latency, and secure data transmission, especially in 5G and 6G mobile networks. High-security cryptography guarantees that essential data can be transmitted securely; however, it increases energy consumption and reduces data processing speed. Therefore, this study proposes a low-energy data encryption (LEDE) algorithm based on the Advanced Encryption Standard (AES) to improve data transmission security and reduce the energy consumption of encryption in Internet-of-Things (IoT) devices. In the proposed LEDE algorithm, the system time parameter is employed to create a dynamic S-Box that replaces the static S-Box of AES. Tests indicated that six-round LEDE encryption achieves the same security level as 10-round conventional AES encryption. This reduction in rounds gives the LEDE algorithm 67.4% lower energy consumption and 43.9% shorter encryption time than conventional AES; thus, the proposed LEDE algorithm can improve the performance and energy efficiency of IoT edge devices.
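The abstract only states that a system-time parameter drives a dynamic S-Box in place of the static AES S-Box. The Python sketch below illustrates one generic way to derive a shared, invertible byte-substitution table from a time value; it is not the LEDE construction and omits the rest of the cipher.

```python
import time
import random

def dynamic_sbox(timestamp=None):
    """Build a byte-substitution table (and its inverse) from a shared time parameter.
    Only an illustration of a time-seeded permutation, not the LEDE construction."""
    seed = int(timestamp if timestamp is not None else time.time())
    rng = random.Random(seed)
    sbox = list(range(256))
    rng.shuffle(sbox)                       # bijective substitution over all byte values
    inv = [0] * 256
    for i, v in enumerate(sbox):
        inv[v] = i
    return sbox, inv

# Both sides derive the same S-Box from the agreed time parameter
t = 1_700_000_000
sbox, inv_sbox = dynamic_sbox(t)
block = bytes(range(16))
sub = bytes(sbox[b] for b in block)         # SubBytes-like step with the dynamic table
assert bytes(inv_sbox[b] for b in sub) == block
```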
This paper investigates data collection in an unmanned aerial vehicle (UAV)-aided Internet of Things (IoT) network, where a UAV is dispatched to collect data from ground sensors under a practical and accurate probabilistic line-of-sight (LoS) channel. In particular, access points (APs) are introduced to collect data from some sensors in the unlicensed band to improve data collection efficiency. We formulate a mixed-integer non-convex optimization problem to minimize the UAV flight time by jointly designing the UAV 3D trajectory and the sensors' scheduling, while ensuring that the required amount of data can be collected under the limited UAV energy. To solve this non-convex problem, we recast the objective problem into a tractable form. Then, the problem is further divided into several sub-problems that are solved iteratively, and the successive convex approximation (SCA) scheme is applied to solve each non-convex sub-problem. Finally, a bisection search is adopted to speed up the search for the minimum UAV flight time. Simulation results verify that the proposed method effectively shortens the UAV flight time.
Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Currently, one of the commonly used approaches to ocean temperature prediction is data-driven, but research on this approach is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two unknown dependencies between sequences based on the original 3D-OT data without prior knowledge. Temporal and spatial dependencies in the time series are then captured using temporal and graph convolutions. We also integrated dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. We conducted prediction experiments using high-resolution 3D-OT from the Copernicus global ocean physical reanalysis, with data covering the vertical variation of temperature from the sea surface to 1000 m below. We compared five mainstream models commonly used for ocean temperature prediction, and the results showed that the proposed method achieved the best results at all prediction scales.
Remaining useful life (RUL) prediction is one of the most crucial elements in prognostics and health management (PHM). To cope with imperfect prior information, this paper proposes an RUL prediction method based on a nonlinear random coefficient regression (RCR) model that fuses failure time data. Firstly, some useful properties of parameter estimation based on the nonlinear RCR model are given. Based on these properties, the failure time data can reasonably be fused as prior information. Specifically, the fixed parameters are calculated from the field degradation data of the evaluated equipment, and the prior information of the random coefficient is estimated by fusing the failure time data of congeneric equipment. Then, the prior information of the random coefficient is updated online under the Bayesian framework, and the probability density function (PDF) of the RUL is derived while considering the limitation of the failure threshold. Finally, two case studies are used for experimental verification. Compared with the traditional Bayesian method, the proposed method can effectively reduce the influence of imperfect prior information and improve the accuracy of RUL prediction.
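A simplified numpy sketch of the Bayesian step: a linear random coefficient degradation model with a Gaussian prior on the drift coefficient is updated from field data, and an approximate RUL PDF is obtained from the first crossing of a failure threshold. The nonlinear RCR model and the fusion of congeneric failure time data are not reproduced; the prior, noise level, and threshold below are assumptions.

```python
import numpy as np

# Simplified random coefficient regression: X(t) = theta * t + eps, eps ~ N(0, sigma^2).
# In the paper, the prior on theta would be estimated by fusing failure time data of
# congeneric equipment; here it is simply assumed.
mu0, var0, sigma2, threshold = 0.8, 0.25, 0.05, 10.0

def update_theta(ts, xs, mu0, var0, sigma2):
    """Conjugate Bayesian update of the random drift coefficient theta."""
    ts, xs = np.asarray(ts, float), np.asarray(xs, float)
    post_var = 1.0 / (1.0 / var0 + np.sum(ts**2) / sigma2)
    post_mu = post_var * (mu0 / var0 + np.sum(ts * xs) / sigma2)
    return post_mu, post_var

def rul_pdf(mu, var, x_now, t_now, horizon=np.linspace(0.1, 40, 400)):
    """Approximate RUL PDF from the first threshold crossing, using only the posterior
    uncertainty of theta (diffusion is ignored; a deliberately crude simplification)."""
    rem = threshold - x_now
    theta_needed = rem / horizon                     # drift needed to fail exactly at 'horizon'
    dens = np.exp(-(theta_needed - mu) ** 2 / (2 * var)) * rem / (horizon**2 * np.sqrt(2 * np.pi * var))
    return horizon, dens / np.trapz(dens, horizon)   # normalize numerically

# Toy usage: ten degradation observations, then an online RUL estimate
ts = np.arange(1, 11)
xs = 0.7 * ts + np.random.normal(0, np.sqrt(sigma2), 10)
mu, var = update_theta(ts, xs, mu0, var0, sigma2)
h, pdf = rul_pdf(mu, var, xs[-1], ts[-1])
print("posterior drift %.3f, RUL mode %.1f" % (mu, h[np.argmax(pdf)]))
```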
Handling sentiment drifts in real-time Twitter data streams is a challenging task in sentiment classification, because the sentiments of Twitter users change over time. The growing volume of tweets with sentiment drifts has led to the need for an adaptive approach to detect and handle this drift in real time. This work proposes an adaptive learning algorithm-based framework, Twitter Sentiment Drift Analysis-Bidirectional Encoder Representations from Transformers (TSDA-BERT), which introduces a sentiment drift measure to detect drifts and a domain impact score to adaptively retrain the classification model with domain-relevant data in real time. The framework also works on static data by converting it to data streams using the Kafka tool. Experiments conducted on real-time and simulated tweets on sports, health care, and financial topics show that the proposed system is able to detect sentiment drifts and maintain the performance of the classification model, with accuracies of 91%, 87%, and 90%, respectively. Though results are provided only for a few topics as a proof of concept, the framework can be applied to detect sentiment drifts and perform sentiment classification on real-time data streams of any topic.
There are errors in multi-source uncertain time series data. Truth discovery methods for time series data are effective in finding more accurate values, but some have limitations in their usability. To tackle this challenge, we propose a new and convenient truth discovery method to handle time series data. A more accurate sample is closer to the truth and, consequently, to other accurate samples. Because the mutual-confirm relationship between sensors is very similar to the mutual-quote relationship between web pages, we evaluate sensor reliability based on PageRank and then estimate the truth by sensor reliability. Therefore, this method does not rely on smoothness assumptions or prior knowledge of the data. Finally, we validate the effectiveness and efficiency of the proposed method on real-world and synthetic data sets, respectively.
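A compact numpy sketch of the idea: sensors that agree closely "confirm" each other, a PageRank-style power iteration over that confirmation graph yields reliability scores, and the truth is the reliability-weighted average. The similarity measure, damping factor, and function name are illustrative choices, not necessarily the paper's.

```python
import numpy as np

def pagerank_truth(readings, d=0.85, iters=100):
    """readings: (n_sensors, n_times). Sensors 'confirm' each other when their series are
    close; PageRank over the confirmation graph gives reliability weights, and the truth is
    the reliability-weighted average at each time step."""
    n = readings.shape[0]
    # Mutual-confirmation strength: inverse of the mean absolute distance between sensors
    dist = np.array([[np.mean(np.abs(readings[i] - readings[j])) for j in range(n)] for i in range(n)])
    conf = 1.0 / (dist + 1e-9)
    np.fill_diagonal(conf, 0.0)
    P = conf / conf.sum(axis=1, keepdims=True)        # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):                            # power iteration with damping
        r = (1 - d) / n + d * (P.T @ r)
    weights = r / r.sum()
    return weights, weights @ readings                # per-sensor reliability, estimated truth

# Toy usage: three sensors observing the same signal, one of them badly biased
t = np.linspace(0, 1, 50)
truth = np.sin(2 * np.pi * t)
readings = np.vstack([truth + 0.02 * np.random.randn(50),
                      truth + 0.03 * np.random.randn(50),
                      truth + 0.8])                    # unreliable sensor
w, est = pagerank_truth(readings)
print(np.round(w, 3))                                  # the biased sensor gets the lowest weight
```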
The aim of this study is to establish the prevailing conditions of changing climatic trends and change-point dates at four selected meteorological stations, Uyo, Benin, Port Harcourt, and Warri, in the Niger Delta region of Nigeria. Daily (24-hourly) annual maximum series (AMS) data were downscaled using the Indian Meteorological Department (IMD) and modified Chowdury Indian Meteorological Department (MCIMD) models. The Mann-Kendall (MK) trend and Sen's Slope Estimator (SSE) tests showed a statistically significant trend for Uyo and Benin, while Port Harcourt and Warri showed mild trends. The Sen's slope magnitudes and variation rates were 21.6, 10.8, 6.00, and 4.4 mm/decade, respectively. The trend change-point analysis gave initial rainfall change-point dates of 2002, 2005, 1988, and 2000 for Uyo, Benin, Port Harcourt, and Warri, respectively. These results indicate positively changing climatic conditions for rainfall in the study area. The analysis and design of erosion and flood control facilities in the Niger Delta will therefore require non-stationary IDF modelling.
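For reference, the two trend statistics named above can be computed directly; the sketch below is a plain numpy implementation of the Mann-Kendall test (without tie correction) and Sen's slope, run on a toy annual-maximum series rather than the study's data.

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test: statistic S, normal-approximation Z, and two-sided p-value
    (no tie correction, which is usually adequate for continuous rainfall maxima)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return s, z, p

def sens_slope(x):
    """Sen's slope estimator: median of all pairwise slopes (units per time step)."""
    x = np.asarray(x, float)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(len(x) - 1) for j in range(i + 1, len(x))]
    return np.median(slopes)

# Toy annual maximum series (mm) with an upward drift
ams = 60 + 2.0 * np.arange(30) + np.random.normal(0, 8, 30)
print(mann_kendall(ams), sens_slope(ams) * 10, "mm/decade")
```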
Time series forecasting plays an important role in various fields, such as energy, finance, transport, and weather. Temporal convolutional networks (TCNs) based on dilated causal convolution have been widely used in time series forecasting. However, two problems weaken the performance of TCNs. One is that in dilated causal convolution, causal convolution concentrates the receptive fields of outputs in the earlier part of the input sequence, so recent input information is severely lost. The other is that the distribution shift problem in time series has not been adequately solved. To address the first problem, we propose a subsequence-based dilated convolution method (SDC). By using multiple convolutional filters to convolve elements of neighboring subsequences, the method extracts temporal features from a growing receptive field via a growing subsequence rather than a single element. Ultimately, the receptive field of each output element can cover the whole input sequence. To address the second problem, we propose a difference and compensation method (DCM). The method reduces the discrepancies between and within the input sequences by difference operations and then compensates the outputs for the information lost due to the difference operations. Based on SDC and DCM, we further construct a temporal subsequence-based convolutional network with difference (TSCND) for time series forecasting. The experimental results show that TSCND reduces prediction mean squared error by 7.3% and saves runtime compared with state-of-the-art models and the vanilla TCN.
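A minimal sketch of only the DCM idea, assuming a placeholder forecaster: the input is differenced to reduce distribution shift, the differenced sequence is predicted, and the lost level is compensated by cumulative summation from the last observation. The SDC convolution and the full TSCND network are not reproduced.

```python
import numpy as np

def forecast_with_dcm(x, model, horizon):
    """Difference-and-compensation wrapper: 'model' forecasts the differenced series,
    and the lost level information is restored by cumulative summation."""
    dx = np.diff(x)                       # difference operation reduces level shifts
    d_pred = model(dx, horizon)           # any forecaster of the differenced sequence
    return x[-1] + np.cumsum(d_pred)      # compensation: add back the last observed level

# Placeholder "model": repeat the mean of recent differences (stands in for TSCND)
naive = lambda dx, h: np.full(h, dx[-24:].mean())

series = np.linspace(0, 50, 200) + 5 * np.sin(np.arange(200) / 6.0)   # trending toy series
print(forecast_with_dcm(series, naive, horizon=5))
```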
In order to provide important parameters for schedule design, decision-making bases for transit operation management, and references for passengers traveling by bus, bus transit travel time reliability is analyzed and evaluated based on automatic vehicle location (AVL) data. Based on a statistical analysis of bus transit travel times, six indices are proposed: the coefficient of variation, the width of the travel time distribution, the mean commercial speed, the congestion frequency, the planning time index, and the buffer time index. Moreover, a framework for evaluating bus transit travel time reliability is constructed. Finally, a case study of a bus route in Suzhou is conducted. Results show that the proposed evaluation index system is simple and intuitive, and it can effectively reflect the efficiency and stability of bus operations. A distinguishing feature of bus transit travel time reliability is its temporal pattern: it varies across different time periods.
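A small numpy sketch computing the six indices from a set of AVL-derived travel times for one route and period. The exact definitions (for example, using the 15th percentile as a free-flow proxy and a 1.5x threshold for congestion) are common conventions assumed here and may differ from the paper's.

```python
import numpy as np

def travel_time_reliability(tt_minutes, route_km, congested_threshold=1.5):
    """Reliability indices computed from observed travel times (minutes) on one route."""
    tt = np.asarray(tt_minutes, float)
    mean, p95, p15 = tt.mean(), np.percentile(tt, 95), np.percentile(tt, 15)
    free_flow = p15                                    # proxy for uncongested travel time
    return {
        "coef_of_variation": tt.std() / mean,
        "travel_time_width": p95 - p15,                # width of the travel time distribution
        "mean_commercial_speed_kmh": route_km / (mean / 60.0),
        "congestion_frequency": np.mean(tt > congested_threshold * free_flow),
        "planning_time_index": p95 / free_flow,
        "buffer_time_index": (p95 - mean) / mean,
    }

# Toy usage: 200 observed runs on a 12 km route
tt = np.random.gamma(shape=30, scale=1.2, size=200)    # minutes
print(travel_time_reliability(tt, route_km=12.0))
```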
In the field of global change, the relationship between plant phenology and climate, which reflects the response of terrestrial ecosystems to global climate change, has become a key subject of great concern. Using moderate-resolution imaging spectroradiometer (MODIS) enhanced vegetation index (EVI) data collected every eight days during January-July from 2005 to 2008 and the corresponding remote sensing data as experimental materials, we constructed cloud-free images via harmonic analysis of time series (HANTS). The cloud-free images were then processed by the dynamic threshold method to obtain the vegetation phenology in the green-up period and its distribution pattern. The distribution patterns in a freezing-disaster year and a normal year were comparatively analyzed to reveal the effect of the freezing disaster on vegetation phenology in the experimental plot. The results showed that the treated EVI data performed well in monitoring the effect of the freezing disaster on vegetation phenology, accurately reflecting the regions that suffered from the freezing disaster. This result suggests that processing remote sensing data with the HANTS method can effectively monitor the ecological characteristics of vegetation.
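A HANTS-style reconstruction can be sketched as an iterative harmonic least-squares fit that discards strongly low-biased (cloud-contaminated) samples. The code below is a simplified numpy illustration with an assumed harmonic count and rejection rule; it is not the original HANTS software.

```python
import numpy as np

def harmonic_fit(t, y, period, n_harmonics=2, n_iter=3):
    """HANTS-style reconstruction: harmonic least-squares fit, iteratively discarding
    points far below the fit (cloud-contaminated EVI is biased low)."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    keep = np.ones(len(t), bool)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * t / period), np.sin(2 * np.pi * k * t / period)]
    A = np.column_stack(cols)
    for _ in range(n_iter):
        coef, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
        fit = A @ coef
        # Drop points that fall well below the current fit (likely cloud)
        keep &= (y > fit - 2 * np.std(y[keep] - A[keep] @ coef))
    return fit

# Toy usage: 8-day EVI composites for half a year with a few cloudy (low) samples
t = np.arange(0, 184, 8)
evi = 0.3 + 0.2 * np.sin(2 * np.pi * t / 365) + 0.01 * np.random.randn(len(t))
evi[[3, 10]] -= 0.25                                   # simulated cloud contamination
smooth = harmonic_fit(t, evi, period=365)
```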
This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration of the human element and how people fit into the security process. Cyber security, on the other hand, adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect raises moral questions, since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity, including defending people, social norms, and vital infrastructure from online threats, in addition to protecting information and technology. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
Offshore waters provide resources for human beings while, on the other hand, threatening them through marine disasters. Ocean stations are part of offshore observation networks, and the quality of their data is of great significance for exploiting and protecting the ocean. We used hourly mean wave height, temperature, and pressure real-time observation data taken at the Xiaomaidao station (in Qingdao, China) from June 1, 2017, to May 31, 2018, to explore the data quality using eight quality control methods and to identify the most effective method for the Xiaomaidao station. After applying the eight quality control methods, the percentages of the mean wave height, temperature, and pressure data that passed the tests were 89.6%, 88.3%, and 98.6%, respectively. Checked against marine disaster (wave alarm report) data, the values that failed the tests were mainly due to aging observation equipment and missing data transmissions. The mean wave height is often affected by dynamic marine disasters, so the continuity test is not effective for it; a correlation test with other related parameters is more useful for the mean wave height.
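Three of the typical checks mentioned above, a gross range test, a continuity (spike) test, and a cross-parameter correlation test, can be sketched as follows. All thresholds and window lengths are illustrative assumptions, not the station's operational settings.

```python
import numpy as np

def range_test(x, lo, hi):
    """Gross range test: flag values outside physically plausible bounds."""
    return (x >= lo) & (x <= hi)

def spike_test(x, max_step):
    """Continuity (spike) test: flag overly large changes between consecutive hours.
    As the abstract notes, this is unreliable for wave height during marine disasters."""
    ok = np.ones(len(x), bool)
    ok[1:] = np.abs(np.diff(x)) <= max_step
    return ok

def correlation_test(x, y, window=24, min_corr=0.3):
    """Cross-parameter check: flag windows where x no longer co-varies with a related series y."""
    ok = np.ones(len(x), bool)
    for s in range(0, len(x) - window + 1, window):
        c = np.corrcoef(x[s:s + window], y[s:s + window])[0, 1]
        ok[s:s + window] = c >= min_corr
    return ok

# Toy usage on hourly wave-height-like data with a related series from a nearby sensor
h = np.abs(np.random.randn(240)) * 0.5 + 0.8
h_nearby = h + 0.1 * np.random.randn(240)
passed = range_test(h, 0.0, 20.0) & spike_test(h, 3.0) & correlation_test(h, h_nearby)
print("pass rate: %.1f%%" % (100 * passed.mean()))
```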
Recent advances in intelligent transportation systems allow traffic safety studies to extend from historical data-based analyses to real-time applications. This study presents a new method to predict crash likelihood with traffic data collected by discrete loop detectors as well as web-crawled weather data. The matched case-control method and support vector machines (SVMs) were employed to identify the risk status. The adaptive synthetic over-sampling technique was applied to address the imbalanced dataset, and the random forest technique was applied to select the contributing factors and avoid over-fitting. The results indicate that the SVM classifier successfully classified 76.32% of the crashes on the test dataset and 87.52% of the crashes on the overall dataset, which is relatively satisfactory compared with the results of previous studies. Compared with the SVM classifier without the weather data, the classifier with the web-crawled weather data increased the crash prediction accuracy by 1.32% and decreased the false alarm rate by 1.72%, showing the potential value of massive web weather data. The mean impact value method was employed to evaluate the variable effects, and the results are consistent with those of most previous studies. The emerging technique based on discrete traffic data and web weather data proves to be more applicable to real-time safety management on freeways.
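A hedged scikit-learn/imbalanced-learn sketch of the described workflow: random forest importance for selecting contributing factors, adaptive synthetic (ADASYN) over-sampling for the rare crash class, and an SVM for risk classification. The hyperparameters, feature count, and synthetic data are illustrative only, not the study's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from imblearn.over_sampling import ADASYN          # adaptive synthetic over-sampling

def crash_risk_pipeline(X, y, n_keep=10):
    """Random forest picks contributing factors, ADASYN balances the rare crash class,
    and an SVM classifies risk status (parameters are illustrative)."""
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    keep = np.argsort(rf.feature_importances_)[-n_keep:]          # top contributing factors
    X_bal, y_bal = ADASYN(random_state=0).fit_resample(X[:, keep], y)
    svm = SVC(kernel="rbf", probability=True).fit(X_bal, y_bal)
    return svm, keep

# Toy usage with synthetic detector + weather features and a 5% crash rate
X = np.random.randn(2000, 20)
y = (np.random.rand(2000) < 0.05).astype(int)
model, selected = crash_risk_pipeline(X, y)
```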
An earthquake of Ms=6.9 occurred at Gonghe, Qinghai Province, China on April 26, 1990. Three larger aftershocks took place in the same region: Ms=5.0 on May 7, 1990, Ms=6.0 on Jan. 3, 1994, and Ms=5.7 on Feb. 16, 1994. The long-period recordings of the main shock from the China Digital Seismograph Network (CDSN) are deconvolved for the source time functions using the corresponding recordings of the three aftershocks as empirical Green's functions (EGFs). No matter which aftershock is taken as the EGF, the relative source time functions (RSTFs) obtained are nearly identical. The RSTFs suggest that the Ms=6.9 event consists of at least two subevents of approximately equal size whose occurrence times are about 30 s apart; the first has a duration of 12 s and a rise time of about 5 s, and the second has a duration of 17 s and a rise time of about & s. Comparing the RSTFs obtained from the P- and SH-phases, we notice that those from the SH-phases are slightly more complex than those from the P-phases, implying that other, finer subevents exist during the rupture process of the main shock. It is interesting that the results from the EGF deconvolution of long-period waveform data are in good agreement with the results from the moment tensor inversion and from the EGF deconvolution of broadband waveform data. Additionally, the two larger aftershocks are deconvolved for their RSTFs. The deconvolution results show that the processes of the Ms=6.0 event on Jan. 3, 1994 and the Ms=5.7 event on Feb. 16, 1994 are quite simple; both RSTFs are single impulses. The RSTFs of the Ms=6.9 main shock obtained from different stations are azimuthally dependent, their shapes differing slightly from station to station, whereas the RSTFs of the two smaller aftershocks are not azimuthally dependent. The integrations of the RSTFs over the source processes are quite close to each other, i.e., the scalar seismic moments estimated from different stations are in good agreement. Finally, the scalar seismic moments of the three aftershocks are compared. The relative scalar seismic moments of the three aftershocks deduced from the relative scalar seismic moments of the Ms=6.9 main shock are very close to those inverted directly from the EGF deconvolution. The relative scalar seismic moments of the Ms=6.9 main shock calculated using the three aftershocks as EGFs are 22 (with the Ms=6.0 aftershock as EGF), 26 (with the Ms=5.7 aftershock as EGF), and 66 (with the Ms=5.5 aftershock as EGF), respectively. Deduced from these results, the relative scalar seismic moments of the Ms=6.0 to the Ms=5.7 events, the Ms=6.0 to the Ms=5.5 events, and the Ms=5.7 to the Ms=5.5 events are 1.18, 3.00, and 2.54, respectively. The corresponding relative scalar seismic moments calculated directly from the waveform recordings are 1.15, 3.43, and 3.05.
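The EGF deconvolution itself can be sketched as a water-level-stabilized spectral division of the main-shock recording by the aftershock recording. The numpy toy below builds a synthetic "main shock" from two pulses 30 s apart, roughly mirroring the two subevents described above; the water level, sampling interval, and synthetic traces are assumptions.

```python
import numpy as np

def egf_deconvolve(main, egf, dt, water_level=0.01):
    """Relative source time function by frequency-domain division of a main-shock recording
    by an empirical Green's function, stabilized with a water level on the denominator."""
    n = len(main)
    M, G = np.fft.rfft(main, n), np.fft.rfft(egf, n)
    denom = np.maximum(np.abs(G) ** 2, water_level * np.max(np.abs(G) ** 2))
    rstf = np.fft.irfft(M * np.conj(G) / denom, n)
    return np.arange(n) * dt, rstf

# Toy usage: the "main shock" is the EGF convolved with two boxcar pulses
# of 12 s and 17 s duration whose onsets are 30 s apart (dt = 0.5 s)
dt = 0.5
egf = np.random.randn(512) * np.exp(-np.arange(512) * dt / 20.0)   # synthetic small-event record
stf_true = np.zeros(512)
stf_true[20:44] = 1.0          # 12 s subevent
stf_true[80:114] = 1.0         # 17 s subevent, 30 s later
main = np.convolve(egf, stf_true)[:512]
t, rstf = egf_deconvolve(main, egf, dt)
```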
In this paper, we present a cluster-based algorithm for time series outlier mining. We use the discrete Fourier transformation (DFT) to transform time series from the time domain to the frequency domain, so that each time series can be mapped to a point in k-dimensional space. For these points, a cluster-based algorithm is developed to mine the outliers: it first partitions the input points into disjoint clusters and then prunes the clusters that are judged not to contain outliers. Our algorithm has been run on the electrical load time series of a steel enterprise and proved to be effective.
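A self-contained numpy sketch of this pipeline: each series is mapped to its first k DFT magnitudes, the points are clustered with a small k-means (farthest-point seeding), and members of very small clusters are reported as outlier candidates. The paper's cluster-pruning judgment is simplified to this size rule, and all parameters are illustrative.

```python
import numpy as np

def dft_outliers(series_set, k=4, n_clusters=5, min_size=3):
    """Map each series to its first k DFT magnitudes, cluster the points with k-means
    (farthest-point seeding), and flag members of very small clusters as outlier candidates."""
    X = np.abs(np.fft.rfft(series_set, axis=1))[:, :k]          # k-dimensional representation
    centers = [X[0]]
    for _ in range(n_clusters - 1):                             # farthest-point initialization
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(20):                                         # standard k-means iterations
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    sizes = np.bincount(labels, minlength=n_clusters)
    return np.where(sizes[labels] < min_size)[0]                # indices of suspect series

# Toy usage: 50 daily load-like curves, two of them anomalous
t = np.linspace(0, 2 * np.pi, 96)
loads = np.vstack([np.sin(t) + 0.05 * np.random.randn(96) for _ in range(50)])
loads[7] += 2.0                       # level-shifted curve
loads[23] = np.random.randn(96)       # noise-only curve
print(dft_outliers(loads))            # the injected anomalies are expected among the flagged indices
```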