As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data. The detection of atmospheric turbulence is framed as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events. Moreover, comparisons with alternative machine learning techniques indicate that the proposed technique is the strongest-performing methodology among those evaluated. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to established EDR approaches and surpassing the accuracy achieved with the other machine learning algorithms tested. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amid rising environmental and operational hazards.
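No implementation accompanies the abstract; as a rough sketch of symbolic classification via genetic programming on QAR-style features, the open-source gplearn library could be used as follows. The synthetic features, labels, and hyperparameters are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch: symbolic classification with genetic programming for
# turbulence anomaly detection. Feature values and labels are synthetic
# stand-ins; real QAR channels and EDR-derived labels would replace them.
import numpy as np
from gplearn.genetic import SymbolicClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Synthetic stand-in for QAR-derived features (e.g., vertical acceleration
# statistics, airspeed variance); 1 = turbulence event, 0 = smooth flight.
X = rng.normal(size=(2000, 6))
y = (X[:, 0] ** 2 + 0.5 * X[:, 3] > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SymbolicClassifier(
    population_size=1000,
    generations=20,
    function_set=("add", "sub", "mul", "div", "sqrt", "abs"),
    parsimony_coefficient=0.001,   # discourages overly long expressions
    random_state=0,
)
clf.fit(X_train, y_train)

print(clf._program)  # the evolved symbolic expression (the "classifier formula")
print(classification_report(y_test, clf.predict(X_test)))
```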
The co-frequency vibration fault is one of the common faults in the operation of rotating equipment, and real-time diagnosis of co-frequency vibration faults is of great significance for monitoring the health state of the equipment and carrying out vibration suppression. In engineering scenarios, co-frequency vibration faults manifest at the rotational frequency and are therefore difficult to identify, while existing intelligent methods place high demands on hardware and are time-consuming. Therefore, a lightweight convolutional neural network (LW-CNN) algorithm is proposed in this paper to achieve real-time fault diagnosis. The critical parameters of the sliding-window data augmentation method are discussed and verified with simulated and experimental signals. Based on the LW-CNN and data augmentation, real-time intelligent diagnosis of co-frequency faults is realized. Moreover, a real-time detection method is proposed that covers the full pipeline from data acquisition to fault diagnosis. Experiments verify that the LW-CNN and sliding-window methods achieve high accuracy and real-time performance.
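The abstract does not give the augmentation parameters; a minimal sketch of sliding-window data augmentation for 1-D vibration signals, with window length and stride as assumed values, is shown below.

```python
# Hedged sketch: sliding-window data augmentation for 1-D vibration signals.
# Window length and stride are illustrative assumptions, not the paper's values.
import numpy as np

def sliding_window_augment(signal: np.ndarray, window: int = 1024, stride: int = 128) -> np.ndarray:
    """Cut one long vibration record into many overlapping training samples."""
    n_samples = (len(signal) - window) // stride + 1
    return np.stack([signal[i * stride : i * stride + window] for i in range(n_samples)])

# Example: a 1-minute record sampled at 10 kHz becomes thousands of overlapping windows.
fs = 10_000
t = np.arange(0, 60, 1 / fs)
record = np.sin(2 * np.pi * 30 * t) + 0.1 * np.random.randn(t.size)  # 30 Hz rotational component
samples = sliding_window_augment(record, window=1024, stride=128)
print(samples.shape)  # (n_windows, 1024), ready to feed a lightweight CNN
```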
Big data resources are characterized by large scale, wide sources, and strong dynamics. Existing access control mechanisms based on manual policy formulation by security experts suffer from drawbacks such as low policy management efficiency and difficulty in accurately describing the access control policy. To overcome these problems, this paper proposes a big data access control mechanism based on a two-layer permission decision structure. This mechanism extends the attribute-based access control (ABAC) model. Business attributes are introduced in the ABAC model as business constraints between entities. The proposed mechanism implements a two-layer permission decision structure composed of the inherent attributes of access control entities and the business attributes, which constitute the general permission decision algorithm based on logical calculation and the business permission decision algorithm based on a bi-directional long short-term memory (BiLSTM) neural network, respectively. The general permission decision algorithm is used to implement accurate policy decisions, while the business permission decision algorithm implements fuzzy decisions based on the business constraints. The BiLSTM neural network is used to calculate the similarity of the business attributes to realize intelligent, adaptive, and efficient access control permission decisions. Through the two-layer permission decision structure, the complex and diverse big data access control management requirements can be satisfied by considering the security and availability of resources. Experimental results show that the proposed mechanism is effective and reliable. In summary, it can efficiently support the secure sharing of big data resources.
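A minimal sketch of the two-layer idea, an exact logical match on inherent attributes followed by a fuzzy business-attribute decision, is shown below. The string-similarity function is a placeholder assumption standing in for the BiLSTM similarity model, and combining the two layers with a logical OR is likewise an assumption.

```python
# Hedged sketch: two-layer permission decision. Layer 1 does exact logical
# matching of inherent attributes; layer 2 returns a fuzzy decision from
# business-attribute similarity. A real system would replace
# `business_similarity` with the BiLSTM-based scorer described in the paper.
from difflib import SequenceMatcher

RULES = [  # illustrative ABAC policy: attribute name -> required value
    {"role": "analyst", "department": "risk", "resource": "trade_data", "action": "read"},
]

def general_decision(request: dict) -> bool:
    """Layer 1: exact logical match against the policy rules."""
    return any(all(request.get(k) == v for k, v in rule.items()) for rule in RULES)

def business_similarity(a: str, b: str) -> float:
    """Placeholder for the BiLSTM similarity model (assumption)."""
    return SequenceMatcher(None, a, b).ratio()

def business_decision(request: dict, threshold: float = 0.8) -> bool:
    """Layer 2: fuzzy decision on the business attribute."""
    return business_similarity(request.get("business_attr", ""), "quarterly risk reporting") >= threshold

def decide(request: dict) -> bool:
    # Combining the layers with OR is an assumption for illustration.
    return general_decision(request) or business_decision(request)

print(decide({"role": "analyst", "department": "risk",
              "resource": "trade_data", "action": "read"}))                  # True via layer 1
print(decide({"role": "auditor", "business_attr": "quarterly risk report"}))  # True via layer 2
```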
Data trading enables data owners and data requesters to sell and purchase data. With the emergence of blockchain technology, research on blockchain-based data trading systems is receiving a lot of attention. In particular, to reduce the on-chain storage cost, a novel paradigm fusing blockchain and cloud storage has been widely considered a promising data trading platform. Moreover, the fact that data can be used for commercial purposes will encourage users and organizations from various fields to participate in the data marketplace. In the data marketplace, it is a challenge to trade data outsourced to an external cloud securely, in a way that restricts access to the data to authorized users across multiple domains. In this paper, we propose a cross-domain bilateral access control protocol for blockchain-cloud based data trading systems. We consider a system model that consists of domain authorities, data senders, data receivers, a blockchain layer, and a cloud provider. The proposed protocol enables access control and source identification of the outsourced data by leveraging identity-based cryptographic techniques. In the proposed protocol, the outsourced data of the sender is encrypted under the target receiver's identity, and the cloud provider performs policy-match verification on the authorization tags of the sender and receiver generated by the identity-based signature scheme. Therefore, data trading can be achieved only if the identities of the data sender and receiver simultaneously meet the policies specified by each other. To demonstrate efficiency, we evaluate the performance of the proposed protocol and compare it with existing studies.
In petroleum engineering, real-time lithology identification is very important for reservoir evaluation, drilling decisions, and petroleum geological exploration. A lithology identification method while drilling, based on machine learning and mud logging data, is studied in this paper. The method effectively uses downhole parameters collected in real time during drilling to identify lithology and provide a reference for the optimization of drilling parameters. Given the imbalance of lithology samples, the synthetic minority over-sampling technique (SMOTE) and Tomek links were used to balance the sample numbers of five lithologies. Meanwhile, this paper introduces the Tent map, random opposition-based learning, and dynamic perceived probability into the original crow search algorithm (CSA) to establish an improved crow search algorithm (ICSA). The ICSA is used to optimize the hyperparameter combinations of random forest (RF), extremely randomized trees (ET), extreme gradient boosting (XGB), and light gradient boosting machine (LGBM) models. In addition, this study combines the recognition advantages of the four models: the accuracy of lithology identification by the weighted average probability model reaches 0.877. The study thus realizes a high-precision real-time lithology identification method that can provide a lithology reference during the drilling process.
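A rough sketch of the resampling and ensembling steps using common open-source implementations is shown below. The hyperparameters and ensemble weights are placeholders, since the paper tunes them with its improved crow search algorithm (ICSA), which is not reproduced here.

```python
# Hedged sketch: resampling with SMOTE + Tomek links, then a weighted
# soft-probability ensemble of the four tree models named in the abstract.
# Hyperparameters and ensemble weights are illustrative assumptions.
from imblearn.combine import SMOTETomek
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier, VotingClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for imbalanced mud-logging samples over five lithologies.
X, y = make_classification(n_samples=3000, n_classes=5, n_informative=8,
                           weights=[0.5, 0.25, 0.15, 0.07, 0.03], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTETomek(random_state=0).fit_resample(X_train, y_train)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("et", ExtraTreesClassifier(random_state=0)),
                ("xgb", XGBClassifier(eval_metric="mlogloss")),
                ("lgbm", LGBMClassifier())],
    voting="soft",
    weights=[1, 1, 1, 1],   # placeholder; the paper weights the models by skill
)
ensemble.fit(X_bal, y_bal)
print("accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```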
The power Internet of Things (IoT) is a significant trend in technology and a requirement for national strategic development. With the deepening digital transformation of the power grid, China's power system has initially built a power IoT architecture comprising a perception layer, a network layer, and a platform application layer. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, high concurrency of network communications, and weak data security protection. To address these issues, this study optimizes the existing architecture of the power IoT and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved in the power IoT, such as the unified management of the physical model, high concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
Predicting the mechanical behaviors of a structure and perceiving anomalies in advance are essential to ensuring the safe long-term operation of infrastructure. In addition to the incomplete consideration of influencing factors, the prediction time scale of existing studies is coarse. Therefore, this study develops a real-time prediction model that couples the spatio-temporal correlation with external load through an autoencoder network (ATENet) based on structural health monitoring (SHM) data. An autoencoder mechanism is used to acquire a high-level representation of the raw monitoring data at different spatial positions, and a recurrent neural network is applied to capture the temporal correlation in the time series. The obtained temporal-spatial information is then coupled with dynamic loads through a fully connected layer to predict structural performance over the next 12 h. As a case study, the proposed model is applied to SHM data collected from a representative underwater shield tunnel. A robustness study is carried out to verify the reliability and prediction capability of the model. Finally, the ATENet model is compared with several typical models, and the results indicate that it has the best performance. The ATENet model is of great value for predicting the real-time evolution trend of tunnel structures.
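Only the overall structure described in the abstract (an autoencoder over sensor readings, a recurrent network for temporal correlation, and a fully connected fusion with external load) is sketched below in PyTorch; the layer sizes, the GRU choice, and the way the load is concatenated are assumptions.

```python
# Hedged sketch of an ATENet-style model: encode sensor readings per time step,
# model temporal correlation with a GRU, then fuse with external load through a
# fully connected layer to predict the response over the next horizon.
# All layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ATENetSketch(nn.Module):
    def __init__(self, n_sensors=16, latent=8, hidden=32, horizon=12):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_sensors, latent), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(latent, n_sensors))   # reconstruction branch
        self.rnn = nn.GRU(latent, hidden, batch_first=True)          # temporal correlation
        self.head = nn.Linear(hidden + 1, horizon)                   # fuse with scalar load

    def forward(self, x, load):
        # x: (batch, time, n_sensors); load: (batch, 1) external load at prediction time
        z = self.encoder(x)                      # (batch, time, latent)
        recon = self.decoder(z)                  # used for the autoencoder loss
        _, h = self.rnn(z)                       # h: (1, batch, hidden)
        fused = torch.cat([h.squeeze(0), load], dim=1)
        return self.head(fused), recon           # 12-step-ahead prediction + reconstruction

model = ATENetSketch()
x = torch.randn(4, 48, 16)       # 48 past time steps from 16 sensors
load = torch.randn(4, 1)
pred, recon = model(x, load)
print(pred.shape, recon.shape)    # torch.Size([4, 12]) torch.Size([4, 48, 16])
```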
Over the past decade, Graphics Processing Units (GPUs) have revolutionized high-performance computing, playing pivotal roles in advancing fields like IoT, autonomous vehicles, and exascale computing. Despite these advancements, efficiently programming GPUs remains a daunting challenge, often relying on trial-and-error optimization methods. This paper introduces an optimization technique for CUDA programs through a novel Data Layout strategy, aimed at restructuring memory data arrangement to significantly enhance data access locality. Focusing on the dynamic programming algorithm for chained matrix multiplication—a critical operation across various domains including artificial intelligence (AI), high-performance computing (HPC), and the Internet of Things (IoT)—this technique facilitates more localized access. We specifically illustrate the importance of efficient matrix multiplication in these areas, underscoring the technique’s broader applicability and its potential to address some of the most pressing computational challenges in GPU-accelerated applications. Our findings reveal a remarkable reduction in memory consumption and a substantial 50% decrease in execution time for CUDA programs utilizing this technique, thereby setting a new benchmark for optimization in GPU computing.
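As a rough CPU-side illustration of the data-layout argument (not the paper's CUDA implementation), the chained matrix multiplication DP below stores its table diagonal by diagonal in a flat array, so each pass of the algorithm touches a contiguous block of memory; the packing scheme is an assumption for illustration.

```python
# Hedged sketch of the data-layout idea in plain Python/NumPy: the chained
# matrix multiplication DP fills its table one diagonal at a time, so packing
# the table diagonal-by-diagonal in a flat array keeps each pass contiguous
# in memory. This only illustrates the locality argument; the paper's CUDA
# kernels and its specific layout are not reproduced here.
import numpy as np

def matrix_chain_cost(dims):
    """Classic O(n^3) DP, stored in a flat, diagonal-major array."""
    n = len(dims) - 1                        # number of matrices
    # offset[d] = start of diagonal d (chain length d+1) in the flat array
    offset = np.cumsum([0] + [n - d for d in range(n)])
    flat = np.zeros(offset[-1])
    def idx(i, j):                           # cost of multiplying A_i..A_j
        return offset[j - i] + i
    for length in range(2, n + 1):           # one diagonal per chain length
        for i in range(n - length + 1):
            j = i + length - 1
            flat[idx(i, j)] = min(
                flat[idx(i, k)] + flat[idx(k + 1, j)] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return flat[idx(0, n - 1)]

print(matrix_chain_cost([30, 35, 15, 5, 10, 20, 25]))   # 15125.0, the textbook example
```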
This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration of the human element and how people fit into the security process. On the other hand, cyber security adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect presents moral questions, since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity, which include defending people, social norms, and vital infrastructure from threats that arise online, in addition to information and technology protection. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
Real-time health data monitoring is pivotal for bolstering the safety, intelligence, and efficiency of road services within the Internet of Health Things (IoHT) framework. Yet delays in data retrieval can markedly hinder the efficacy of big-data awareness detection systems. To combat this, we advocate a collaborative caching approach involving edge devices and cloud networks. This strategy is devised to streamline the data retrieval path and thereby diminish network strain. Crafting an adept cache processing scheme poses its own set of challenges, especially given the transient nature of monitoring data and the imperative for swift data transmission, intertwined with resource allocation tactics. This paper presents a mobile healthcare solution that harnesses this collaborative caching approach, facilitating fine-grained health monitoring via edge devices. The system capitalizes on cloud computing for intricate health data analytics, especially in pinpointing health anomalies. Given dynamic locational shifts and possible connection disruptions, we architect a hierarchical detection system, particularly for crises. This system caches data efficiently and incorporates a detection utility to assess data freshness and potential lag in response times. Furthermore, we introduce the Cache-Assisted Real-Time Detection (CARD) model, crafted to optimize utility. Addressing the inherent complexity of the NP-hard CARD model, we adopt a greedy algorithm as a solution. Simulations reveal that our collaborative caching technique markedly elevates the Cache Hit Ratio (CHR) and data freshness, outperforming contemporaneous benchmark algorithms. The empirical results underscore the strength and efficiency of our IoHT-based health monitoring solution. To encapsulate, this paper tackles real-time health data monitoring in the IoHT landscape, presenting a joint edge-cloud caching strategy paired with a hierarchical detection system. Our methodology yields enhanced cache efficiency and data freshness, and the numerical results confirm the feasibility and relevance of the model for future real-time health data monitoring systems.
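The CARD utility and the exact greedy algorithm are not given in the abstract; the sketch below shows the general greedy pattern used for such NP-hard cache-placement problems, with a freshness-weighted hit-gain utility as an assumed stand-in.

```python
# Hedged sketch: greedy cache placement for edge devices. The utility below
# (request rate weighted by a freshness factor) is an assumption standing in
# for the paper's CARD utility; the greedy structure mirrors the usual approach
# to such NP-hard placement problems: take the best utility-per-size item until
# the cache is full.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    size: int          # cache space needed
    req_rate: float    # requests per minute
    freshness: float   # 0..1, how quickly the data goes stale when not cached

def greedy_cache(items, capacity):
    chosen, used = [], 0
    # Sort by utility density: freshness-weighted hit gain per unit of space.
    for it in sorted(items, key=lambda i: i.req_rate * i.freshness / i.size, reverse=True):
        if used + it.size <= capacity:
            chosen.append(it.name)
            used += it.size
    return chosen

items = [Item("heart_rate", 2, 120, 0.9), Item("gps_trace", 5, 40, 0.6),
         Item("ecg_window", 8, 25, 0.95), Item("daily_report", 4, 5, 0.2)]
print(greedy_cache(items, capacity=10))   # ['heart_rate', 'gps_trace']
```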
The interleaving/multiplexing technique was used to realize a 200 MHz real-time data acquisition system. Two 100 MHz ADC modules work in parallel, and each ADC outputs data in ping-pong fashion. The design improves the system conversion rate to 200 MHz while reducing the speed of data transport and storage to 50 MHz. High-speed HDPLD and ECL logic parts were used to control the system timing and the memory address. A multi-layer printed board and shielding were used to decrease interference produced by the high-speed circuit, and the system timing was designed carefully. The interleaving/multiplexing technique can greatly improve the system conversion rate while greatly reducing the speed of the external digital interfaces. The design effectively resolves the difficulties of the high-speed system, and experiments proved that the data acquisition system is stable and accurate.
A study of the accessibility of a city’s scenic spots via different travel modes can contribute to optimization of tourism-related transportation while improving tourists’ travel-related satisfaction levels and advancing tourism. We systematically analyzed the accessibility of 56 scenic spots in Xi’an City, China, via car and public transport travel modes using the real-time travel function of the Baidu Maps API (Application Programming Interface) along with spatial analysis methods and the modal accessibility gap index of scenic spots. We obtained the following results. First, maximum and minimum travel times using public transport exceeded those using cars. Moreover, the accessibility of scenic spots via cars and public transport presented a circular spatial pattern of increasing travel time from the center to the periphery. Contrasting with travel by public transport, car travel showed a clear time-space compression effect. Second, accessibility of the scenic spots via cars and public transport showed some spatial heterogeneity, with no clear advantages of car accessibility in the central urban area. However, advantages of car accessibility were increasingly evident moving from the center to the periphery. Third, whereas the correlation of the modal accessibility gap index of scenic spots in Xi’an with global space was significantly positive, local spatial interdependence was only evident in some inner city areas and in marginal areas. Moreover, spatial heterogeneity was evident in two regions but was insignificant in other areas, indicating that the spatial interdependence of the modal accessibility gap index in most scenic spots was not apparent in terms of the overall effect of public transport routes, road networks, and the distribution of scenic spots. The improvement of public transport coverage in marginal areas and the optimization of public transport routes in central urban areas are essential tasks for improving travel using public transport in the future.
Offshore waters provide resources for human beings but, on the other hand, threaten them through marine disasters. Ocean stations are part of offshore observation networks, and the quality of their data is of great significance for exploiting and protecting the ocean. We used hourly mean wave height, temperature, and pressure real-time observation data taken at the Xiaomaidao station (in Qingdao, China) from June 1, 2017, to May 31, 2018, to explore the data quality using eight quality control methods and to identify the most effective method for the Xiaomaidao station. After applying the eight quality control methods, the percentages of the mean wave height, temperature, and pressure data that passed the tests were 89.6%, 88.3%, and 98.6%, respectively. Compared with the marine disaster (wave alarm report) data, the values that failed the tests were mainly due to the influence of aging observation equipment and missing data transmissions. The mean wave height is often affected by dynamic marine disasters, so the continuity test method is not effective; a correlation test with other related parameters would be more useful for the mean wave height.
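The eight quality control methods are not listed in the abstract; the sketch below illustrates two common ones for hourly station data, a range test and a continuity (spike) test, with thresholds chosen for illustration only.

```python
# Hedged sketch: two typical quality-control tests for hourly station data.
# The paper applies eight methods; only a range test and a continuity (spike)
# test are illustrated, and the thresholds below are assumptions, not the
# station's operational limits.
import numpy as np

def range_test(values, lo, hi):
    """Flag samples outside physically plausible bounds."""
    return (values >= lo) & (values <= hi)

def continuity_test(values, max_step):
    """Flag samples that jump more than max_step from the previous hour."""
    ok = np.ones_like(values, dtype=bool)
    ok[1:] = np.abs(np.diff(values)) <= max_step
    return ok

wave_height = np.array([0.6, 0.7, 0.8, 5.9, 0.9, 1.0])   # metres, hourly
passed = range_test(wave_height, 0.0, 20.0) & continuity_test(wave_height, 2.0)
print(passed)                           # the jump to 5.9 m (and back) fails the continuity test
print(100 * passed.mean(), "% passed")
```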
The application and development of a wide-area measurement system (WAMS) has enabled many applications and led to several requirements based on dynamic measurement data. Such data are transmitted as a big-data information flow. To ensure effective transmission of wide-frequency electrical information by the communication protocol of a WAMS, this study performs real-time traffic monitoring and analysis of the data network of a power information system and establishes corresponding network optimization strategies to solve existing transmission problems. This study utilizes the traffic analysis results obtained using the current real-time dynamic monitoring system to design an optimization strategy covering three progressive levels: the underlying communication protocol, the source data, and the transmission process. Optimization of the system structure and scheduling optimization of data information are validated to be feasible and practical via tests.
Glacier disasters occur frequently in alpine regions around the world, but conventional geological disaster measurement technology cannot be used directly for glacier disaster measurement. Hence, in this study, a distributed multi-sensor measurement system for glacier deformation was established by integrating piezoelectric sensing, coded sensing, attitude sensing, and wireless communication technologies. The traditional Modbus protocol was optimized to solve the problem of confused data identification among different acquisition nodes. Through indoor wireless transmission, adaptive performance analysis, error measurement experiments, and a landslide simulation experiment, the performance of the measurement system was analyzed and evaluated. Using unmanned aerial vehicle technology, the reliability and effectiveness of the measurement system were verified on site at the Galongla glacier in southeastern Tibet, China. The results show that the mean absolute percentage errors were only 1.13% and 2.09% for displacement and temperature, respectively. The distributed real-time glacier deformation measurement system provides a new means for assessing the development process of glacier disasters and for disaster prevention and mitigation.
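As a small worked example of the error metric quoted above, the mean absolute percentage error (MAPE) can be computed as follows; the sample values are illustrative, not the paper's field data.

```python
# Hedged sketch: how a mean absolute percentage error (MAPE) such as the 1.13%
# displacement figure would be computed from reference and measured series.
# The sample values below are illustrative only.
import numpy as np

def mape(reference, measured):
    reference, measured = np.asarray(reference, float), np.asarray(measured, float)
    return 100.0 * np.mean(np.abs((measured - reference) / reference))

ref = [10.0, 12.5, 15.2, 20.1]      # e.g. reference displacement readings, mm
meas = [10.1, 12.4, 15.4, 19.9]     # e.g. distributed-system readings, mm
print(f"MAPE = {mape(ref, meas):.2f}%")
```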
Considering the increasing use of information technology with established standards, such as TCP/IP and XML, in modern industrial automation, we present a high cost-performance solution with an FPGA (field programmable gate array) implementation of a novel reliable real-time data transfer system based on the EPA (Ethernet for Plant Automation) protocol and the IEEE 1588 standard. This combination can provide more predictable, real-time communication between automation equipment and precise synchronization between devices. The designed EPA system has been verified on a Xilinx Spartan-3 XC3S1500 and consumed 75% of the total slices. The experimental results show that the novel industrial control system achieves high synchronization precision and provides a 1.59-ps standard deviation between the master device and the slave ones. Such a real-time data transfer system is an excellent candidate for automation equipment that requires precise Ethernet-based synchronization at a comparatively low price.
This paper designs and develops a framework on a distributed computing platform for massive multi-source spatial data using a column-oriented database (HBase). The platform consists of four layers, including an ETL (extraction-transformation-loading) tier, a data processing tier, a data storage tier, and a data display tier, achieving long-term storage, real-time analysis, and querying of massive data. Finally, a real dataset cluster is simulated, made up of 39 nodes including 2 master nodes and 37 data nodes, on which function tests of the data importing and real-time query modules are performed, along with performance tests of HDFS I/O, the MapReduce cluster, batch loading, and real-time querying of massive data. The test results indicate that the platform achieves high performance in terms of response time and linear scalability.
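As a rough illustration of how such a platform's storage tier might be written to and queried from Python, the sketch below uses the happybase client against an HBase Thrift gateway; the table name, column family, and geohash-based row key are assumptions, not the paper's schema.

```python
# Hedged sketch: storing and querying spatial records in HBase from Python via
# the Thrift gateway using the happybase client. Table name, column family,
# and the geohash row-key scheme are illustrative assumptions.
import happybase

conn = happybase.Connection("hbase-thrift-host")   # assumed Thrift server address
table = conn.table("spatial_data")

# Row key = geohash prefix + timestamp, so nearby points sort together and
# range scans stay local; one common pattern for spatial data in HBase.
table.put(b"wx4g0_20240101T0000", {
    b"info:lon": b"116.397",
    b"info:lat": b"39.909",
    b"info:source": b"gps",
})

# Real-time query: scan all records within one geohash cell.
for key, data in table.scan(row_prefix=b"wx4g0_"):
    print(key, data[b"info:lon"], data[b"info:lat"])
```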
Recently, the use of mobile communication devices such as smartphones and cellular phones for field data collection has been increasing, owing to the emergence of embedded Global Positioning System (GPS) receivers and Wi-Fi Internet access. Accurate, timely, and handy field data collection is required for disaster management and emergency quick response. In this article, we introduce a web-based GIS system that collects field data from personal mobile phones through a Post Office Protocol (POP3) mail server. The main objective of this work is to demonstrate to students a real-time field data collection method in which they use their mobile phones to collect field data in a timely and handy manner, for either individual or group surveys in local- or global-scale research.
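A minimal sketch of the mail-polling side of such a system, using Python's standard poplib, is shown below; the host, credentials, and the assumed "lat,lon,comment" message format are placeholders for illustration, since the abstract does not describe the original message format.

```python
# Hedged sketch: polling a POP3 mailbox for field reports sent from mobile
# phones. Host, credentials, and the "lat,lon,comment" body format are
# assumptions for illustration.
import poplib
from email import message_from_bytes

def fetch_field_reports(host, user, password):
    box = poplib.POP3_SSL(host)
    box.user(user)
    box.pass_(password)
    reports = []
    n_messages = len(box.list()[1])
    for i in range(1, n_messages + 1):
        raw = b"\n".join(box.retr(i)[1])
        msg = message_from_bytes(raw)
        body = msg.get_payload(decode=True) or b""
        try:
            lat, lon, comment = body.decode(errors="ignore").strip().split(",", 2)
        except ValueError:
            continue   # skip messages that are not in the expected format
        reports.append({"lat": float(lat), "lon": float(lon), "comment": comment})
    box.quit()
    return reports   # ready to plot as points in a web GIS layer

# reports = fetch_field_reports("pop.example.org", "survey", "secret")
```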
A DMVOCC-MVDA (distributed multiversion optimistic concurrency control with multiversion dynamic adjustment) protocol was presented for processing mobile distributed real-time transactions in mobile broadcast environments. At the mobile hosts, all transactions perform local pre-validation. The local pre-validation process is carried out against the transactions committed at the server in the last broadcast cycle. Transactions that survive local pre-validation must be submitted to the server for final validation. The new protocol eliminates conflicts between mobile read-only and mobile update transactions, and resolves data conflicts flexibly by using multiversion dynamic adjustment of the serialization order to avoid unnecessary transaction restarts. Mobile read-only transactions can be committed without blocking, and the response time of mobile read-only transactions is greatly shortened. The tolerance of mobile transactions to disconnections from the broadcast channel is increased. In global validation, mobile distributed transactions must be checked to ensure distributed serializability across all participants. The simulation results show that the new concurrency control protocol offers better performance than other protocols in terms of miss rate, restart rate, and commit rate. Under a high workload (think time of 1 s), the miss rate of DMVOCC-MVDA is only 14.6%, significantly lower than that of other protocols. The restart rate of DMVOCC-MVDA is only 32.3%, showing that it can effectively reduce the restart rate of mobile transactions. And the commit rate of DMVOCC-MVDA reaches 61.2%, which is obviously higher than that of other protocols.
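As a rough illustration of the pre-validation step only (not the full DMVOCC-MVDA protocol or its dynamic adjustment rules), a transaction's read set can be checked against the write sets of transactions committed in the last broadcast cycle:

```python
# Hedged sketch: the flavour of local pre-validation in optimistic concurrency
# control. A mobile transaction's read set is checked against the write sets of
# transactions committed at the server in the last broadcast cycle; the
# protocol's multiversion dynamic adjustment is not reproduced here.
def pre_validate(read_set: set, committed_write_sets: list[set]) -> bool:
    """Return True if the transaction survives local pre-validation."""
    return all(not (read_set & ws) for ws in committed_write_sets)

# The broadcast cycle reported two committed update transactions:
committed = [{"x", "y"}, {"z"}]
print(pre_validate({"a", "b"}, committed))   # True  -> submit for final validation
print(pre_validate({"x", "q"}, committed))   # False -> restart (or adjust serialization order)
```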
Opinion (sentiment) analysis on big data streams, from the constantly generated text streams on social media networks to hundreds of millions of online consumer reviews, provides organizations in every field with opportunities to discover valuable intelligence from massive user-generated text. However, traditional content analysis frameworks are inefficient at handling the unprecedentedly large volume of unstructured text streams and the complexity of the text analysis tasks required for real-time opinion analysis on big data streams. In this paper, we propose a parallel real-time sentiment analysis system, the Social Media Data Stream Sentiment Analysis Service (SMDSSAS), that performs multiple phases of sentiment analysis of social media text streams effectively in real time, with two fully analytic opinion mining models to combat the scale of text data streams and the complexity of sentiment analysis on unstructured text. We propose two aspect-based opinion mining models, a Deterministic and a Probabilistic sentiment model, for real-time sentiment analysis on user-specified topic-related data streams. Experiments on the Twitter stream traffic captured during the pre-election weeks of the 2016 Presidential election, analyzing public opinion toward the two presidential candidates in real time, showed that the proposed system was able to correctly predict Donald Trump as the winner of the 2016 Presidential election. The cross-validation results showed that the proposed sentiment models, combined with the real-time streaming components of the framework, effectively delivered analysis of the opinions on the two presidential candidates, with an average accuracy of 81% for the Deterministic model and 80% for the Probabilistic model, improvements of 1% to 22% over results reported in the existing literature.
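A minimal sketch of a deterministic, lexicon-based scorer applied to a topic-filtered stream is shown below; the tiny lexicon and the scoring rule are assumptions standing in for the paper's Deterministic model rather than reproducing it.

```python
# Hedged sketch: a deterministic, lexicon-based sentiment score applied to a
# stream of topic-filtered messages. The lexicon and the scoring rule
# (positive minus negative term counts, normalised by length) are illustrative
# assumptions only.
POSITIVE = {"great", "win", "strong", "good", "support"}
NEGATIVE = {"bad", "lose", "weak", "corrupt", "fail"}

def deterministic_score(text: str) -> float:
    tokens = [t.strip(",.!?") for t in text.lower().split()]
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

def stream_sentiment(messages, topic):
    """Score only messages mentioning the topic, as they arrive."""
    for msg in messages:
        if topic.lower() in msg.lower():
            yield msg, deterministic_score(msg)

tweets = ["Candidate A ran a great campaign, strong support",
          "Candidate A will lose, weak and corrupt",
          "Unrelated message about the weather"]
for msg, score in stream_sentiment(tweets, "Candidate A"):
    print(f"{score:+.2f}  {msg}")
```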