With the recent technological developments, massive vehicular ad hoc networks (VANETs) have been established, enabling numerous vehicles and their respective Road Side Unit (RSU) components to communicate with one another. The best way to enhance traffic flow for vehicles and traffic management departments is to share the data they receive, yet VANET systems still lack adequate protection. An effective and safe method of outsourcing is suggested, which reduces computation costs by achieving data security using a homomorphic mapping based on the conjugate operation of matrices. This research proposes a VANET-based data outsourcing system to fix these issues. To keep data outsourcing secure, the suggested model takes cryptography models into account. The fog layer keeps the generated keys for the purpose of vehicle authentication. For controlling and overseeing the outsourced data while preserving privacy, the suggested approach employs a Trusted Certified Auditor (TCA). Using the secret key, the TCA can identify the genuine identity of VANET nodes when harmful messages are detected. The proposed model develops a TCA-based unique static vehicle labeling system using cryptography (TCA-USVLC) for secure data outsourcing and privacy preservation in VANETs. The proposed model calculates the trust of vehicles in 16 ms for an average of 180 vehicles and achieves 98.6% accuracy for data encryption to provide security. It achieved 98.5% accuracy in data outsourcing and 98.6% accuracy in privacy preservation in fog-enabled VANETs. Elliptic curve cryptography models can be applied in the future for better encryption and decryption rates with lightweight cryptography operations.
Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale and active landslide on the south bank of the Yangtze River. The latest monitoring data and site investigations available are analyzed to establish spatial and temporal landslide deformation characteristics. Data mining technology, including two-step clustering and the Apriori algorithm, is then used to identify the dominant triggers of landslide movement. In the data mining process, the two-step clustering method clusters the candidate triggers and displacement rate into several groups, and the Apriori algorithm generates correlation criteria for cause and effect. The analysis considers multiple locations on the landslide and incorporates two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. The analysis shows that the deformation of the Outang landslide is driven by both rainfall and reservoir water, while it varies spatiotemporally mainly because of differences in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: the monthly and bi-monthly cumulative rainfall control the monthly deformation, and the 10-d cumulative rainfall and the 5-d cumulative drop of the reservoir water level dominate the daily deformation of the landslide. It is concluded that the spatiotemporal deformation pattern and the data mining rules associated with precipitation and reservoir water level have the potential to be broadly implemented for improving landslide prevention and control in dam reservoirs and other landslide-prone areas.
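The Apriori rule measures that underlie the trigger analysis above can be sketched in a few lines. The transactions, bin labels, and thresholds below are illustrative stand-ins, not the Outang monitoring data:

```python
# Toy monitoring "transactions": each is a set of discretized conditions
# for one observation period. Bin labels are illustrative, not the paper's.
transactions = [
    {"rain=high", "drawdown=fast", "disp=fast"},
    {"rain=high", "drawdown=slow", "disp=fast"},
    {"rain=low",  "drawdown=fast", "disp=slow"},
    {"rain=high", "drawdown=fast", "disp=fast"},
    {"rain=low",  "drawdown=slow", "disp=slow"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent), the Apriori rule-strength measure."""
    return support(antecedent | consequent) / support(antecedent)

# Candidate rule: high cumulative rainfall -> fast displacement
print(f"support={support({'rain=high', 'disp=fast'}):.2f} "
      f"confidence={confidence({'rain=high'}, {'disp=fast'}):.2f}")
```

Rules whose support and confidence exceed chosen thresholds (in practice one scans all antecedent/consequent pairs) point to the dominant triggers.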
The 6th generation mobile network (6G) is a multi-network interconnection and multi-scenario coexistence network, in which multiple network domains break their original fixed boundaries to form connections and convergence. With the optimization objective of maximizing network utility while ensuring performance-centric weighted fairness among flows, this paper designs a reinforcement learning-based cloud-edge autonomous multi-domain data center network architecture that achieves single-domain autonomy and multi-domain collaboration. Because the utilities of different flows conflict, the bandwidth fairness allocation problem for various types of flows is formulated by considering separately defined reward functions. Regarding the tradeoff between fairness and utility, this paper designs corresponding reward functions for the cases where the flows undergo abrupt changes and smooth changes. In addition, to accommodate the Quality of Service (QoS) requirements of multiple types of flows, this paper proposes a multi-domain autonomous routing algorithm called LSTM+MADDPG. By introducing a Long Short-Term Memory (LSTM) layer into the actor and critic networks, more information about temporal continuity is incorporated, further enhancing adaptability to changes in the dynamic network environment. The LSTM+MADDPG algorithm is compared with the latest reinforcement learning algorithms in experiments on real network topologies and traffic traces, and the results show that LSTM+MADDPG improves the delay convergence speed by 14.6% and delays the onset of packet loss by 18.2% compared with other algorithms.
Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Data-driven methods are currently among the most commonly used for ocean temperature prediction, but research on them is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two types of unknown dependencies between sequences from the original 3D-OT data without prior knowledge. Temporal and spatial dependencies in the time series are then captured using temporal and graph convolutions. We integrated dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. We conducted prediction experiments using high-resolution 3D-OT from the Copernicus global ocean physical reanalysis, with data covering the vertical variation of temperature from the sea surface to 1000 m below. We compared the proposed method with five mainstream models commonly used for ocean temperature prediction, and the results showed that our method achieved the best prediction results at all prediction scales.
Long-runout landslides involve a massive amount of energy and can be extremely hazardous owing to their long movement distance, high mobility, and strong destructive power. Numerical methods have been widely used to predict landslide runout, but a fundamental problem remains: how to determine reliable numerical parameters. This study proposes a framework to predict the runout of potential landslides through multi-source data collaboration and numerical analysis of historical landslide events. Specifically, for historical landslide cases, the landslide-induced seismic signal, geophysical surveys, and any available in-situ drone/phone videos (multi-source data collaboration) can validate the numerical results in terms of landslide dynamics and deposit features and help calibrate the numerical (rheological) parameters. The calibrated parameters can then be used to numerically predict the runout of potential landslides in regions with a geological setting similar to the recorded events. Application of the runout prediction approach to the 2020 Jiashanying landslide in Guizhou, China gives reasonable results in comparison with field observations. The numerical parameters are determined from the multi-source data collaboration analysis of a historical case in the region (the 2019 Shuicheng landslide). The proposed framework for landslide runout prediction can be of great utility for landslide risk assessment and disaster reduction in mountainous regions worldwide.
Building model data organization is often programmed to solve a specific problem, resulting in an inability to organize indoor and outdoor 3D scenes in an integrated manner. In this paper, existing building spatial data models are studied, and the characteristics of the building information modeling standard IFC (Industry Foundation Classes), City Geography Markup Language (CityGML), Indoor Geography Markup Language (IndoorGML), and other models are compared and analyzed. CityGML and IndoorGML face challenges in satisfying diverse application scenarios and requirements due to limitations in their expressive capabilities. This paper proposes combining the semantic information of model objects to effectively partition and organize indoor and outdoor spatial 3D model data and to construct an indoor and outdoor data organization mechanism of "chunk-layer-subobject-entrances-area-detail object." The method is verified by proposing a 3D data organization method for indoor and outdoor space and constructing a 3D visualization system based on it.
This article presents a real-life project that aimed to evaluate the safety of traffic vehicles on an old bridge without any prior data. The project involved various safety inspections, including conventional, static, and dynamic load inspections and safety assessments. After conducting these tests, it was concluded that the structure of the old bridge is relatively safe, with only a few bumps. The bridge can function normally following appropriate treatment. The analysis provides valuable insights into assessing the quality and safety of such bridges to ensure the safe passage of heavy vehicles.
This study aims to improve knowledge of the structure of southwest Cameroon based on the analysis and interpretation of gravity data derived from the SGG-UGM-2 model. A residual anomaly map was first calculated from the Bouguer anomaly map, which is strongly affected by a regional gradient. The residual anomaly map provides information on the variation in subsurface density but not sufficient detail, hence the use of filtering to highlight the structures affecting southwest Cameroon. Three interpretation methods were used: the vertical gradient, the horizontal gradient coupled with upward continuation, and Euler deconvolution. The application of these treatments enabled us to map a large number of gravimetric lineaments materializing density discontinuities. These lineaments are organized along main preferential directions (NW-SE, NNE-SSW, ENE-WSW) and secondary directions (NNW-SSE, NE-SW, N-S, and E-W). Euler solutions indicate depths of up to 7337 m. The results contribute to a deeper understanding of the structural composition of the study area. The resulting structural map vividly illustrates the major tectonic events that shaped the geological framework of the study area and serves as a guide for prospecting subsurface resources (water and hydrocarbons).
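Euler deconvolution, the third method above, solves the homogeneity equation for source position in a least-squares sense. A minimal noise-free sketch over a synthetic buried point mass follows (gravity structural index N = 2; the geometry and the analytic gradients are illustrative, as real workflows derive gradients from gridded field data):

```python
import numpy as np

# Synthetic profile over a buried point mass; constants are dropped.
x0_true, z0_true = 0.0, 7.3           # horizontal position and depth (km), illustrative
x = np.linspace(-20.0, 20.0, 401)
dx, h = x - x0_true, z0_true
r2 = dx**2 + h**2

g = h / r2**1.5                               # vertical attraction
gx = -3.0 * dx * h / r2**2.5                  # horizontal gradient dg/dx
gh = 1.0 / r2**1.5 - 3.0 * h**2 / r2**2.5     # depth gradient dg/dh

# Euler's homogeneity equation (x - x0)*gx + z0*gh = -N*g, rearranged into
# the linear system A @ [x0, z0] = b and solved by least squares.
N = 2.0
A = np.column_stack([gx, -gh])
b = x * gx + N * g
x0_est, z0_est = np.linalg.lstsq(A, b, rcond=None)[0]
print(f"estimated source: x0 = {x0_est:.2f} km, depth = {z0_est:.2f} km")
```

With noise-free synthetic data the least-squares solve recovers the source location essentially exactly; on real grids the equation is solved in moving windows, and the scatter of solutions maps source depths.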
Over the past decade, Graphics Processing Units (GPUs) have revolutionized high-performance computing, playing pivotal roles in advancing fields like IoT, autonomous vehicles, and exascale computing. Despite these advancements, efficiently programming GPUs remains a daunting challenge, often relying on trial-and-error optimization methods. This paper introduces an optimization technique for CUDA programs through a novel Data Layout strategy, aimed at restructuring memory data arrangement to significantly enhance data access locality. Focusing on the dynamic programming algorithm for chained matrix multiplication—a critical operation across various domains including artificial intelligence (AI), high-performance computing (HPC), and the Internet of Things (IoT)—this technique facilitates more localized access. We specifically illustrate the importance of efficient matrix multiplication in these areas, underscoring the technique's broader applicability and its potential to address some of the most pressing computational challenges in GPU-accelerated applications. Our findings reveal a remarkable reduction in memory consumption and a substantial 50% decrease in execution time for CUDA programs utilizing this technique, thereby setting a new benchmark for optimization in GPU computing.
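The locality idea can be illustrated on the chained matrix multiplication DP itself: storing the table diagonal-major puts every cell of a wave-front (the cells computed together) in contiguous memory. This CPU-side sketch shows only the indexing scheme; it does not reproduce the paper's CUDA layout:

```python
def matrix_chain_min_ops(dims):
    """Minimum scalar multiplications to evaluate A1..An, where Ai is dims[i-1] x dims[i]."""
    n = len(dims) - 1
    off = lambda L: L * n - L * (L - 1) // 2   # flat offset where diagonal L starts
    idx = lambda i, j: off(j - i) + i          # diagonal-major flat index of cell (i, j)
    m = [0] * (n * (n + 1) // 2)               # one slot per table cell, diagonals contiguous
    for L in range(1, n):                      # diagonal L = chain length - 1 (the wave-front)
        for i in range(n - L):
            j = i + L
            m[idx(i, j)] = min(
                m[idx(i, k)] + m[idx(k + 1, j)]
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return m[idx(0, n - 1)]

print(matrix_chain_min_ops([10, 30, 5, 60]))   # classic textbook instance -> 4500
```

Cells of diagonal L occupy slots off(L) .. off(L+1)-1, so the threads of one wave-front read and write adjacent memory instead of strided rows.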
The South Yellow Sea basin is filled with Mesozoic-Cenozoic continental sediments overlying pre-Palaeozoic and Mesozoic-Palaeozoic marine sediments. Conventional multi-channel seismic data cannot describe the velocity structure of the marine residual basin in detail, leading to the lack of a deeper understanding of its distribution and lithology owing to strong energy shielding at the top interface of the marine sediments. In this study, we present seismic tomography data from ocean bottom seismographs that describe the NEE-trending velocity distributions of the basin. The results indicate that strong velocity variations occur at shallow crustal levels. Horizontal velocity bodies show good correlation with surface geological features, and multi-layer features exist in the vertical velocity framework (depth: 0-10 km). Analyses of the velocity model, gravity data, magnetic data, multi-channel seismic profiles, and drilling data showed that high-velocity anomalies (>6.5 km/s) of small (thickness: 1-2 km) and large (thickness: >5 km) scales were caused by igneous complexes in the multi-layer structure, which were active during the Palaeogene. Possible locations of well-preserved Mesozoic and Palaeozoic marine strata are limited to the Central Uplift and the western part of the Northern Depression along the wide-angle ocean bottom seismograph array. Following the Indosinian movement, strong compression existed in the Northern Depression during the extensional phase, causing the formation of folds in the middle of the survey line. This study is useful for reconstructing the regional tectonic evolution and delineating the distribution of the marine residual basin in the South Yellow Sea basin.
Because radiation belt electrons can pose a potential threat to the safety of satellites orbiting in space, it is of great importance to develop a reliable model that can predict the highly dynamic variations in outer radiation belt electron fluxes. In the present study, we develop a forecast model of radiation belt electron fluxes based on the data assimilation method, in terms of Van Allen Probe measurements combined with three-dimensional radiation belt numerical simulations. Our forecast model can cover the entire outer radiation belt with a high temporal resolution (1 hour) and a spatial resolution of 0.25 L over a wide range of both electron energy (0.1-5.0 MeV) and pitch angle (5°-90°). On the basis of this model, we forecast hourly electron fluxes for the next 1, 2, and 3 days during an intense geomagnetic storm and evaluate the corresponding prediction performance. Our model can reasonably predict the storm-time evolution of radiation belt electrons with high prediction efficiency (up to ~0.8-1). The best prediction performance is found for ~0.3-3 MeV electrons at L = ~3.25-4.5, which extends to higher L and lower energies with increasing pitch angle. Our results demonstrate that the forecast model developed can be a powerful tool to predict the spatiotemporal changes in outer radiation belt electron fluxes, and the model has both scientific significance and practical implications.
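At its simplest, the assimilation step in such a forecast model is a Kalman/optimal-interpolation update blending a model forecast with a measurement. The scalar sketch below is purely illustrative; the actual model assimilates Van Allen Probe fluxes into a 3-D radiation belt simulation:

```python
def assimilate(x_forecast, p_forecast, y_obs, r_obs):
    """One scalar Kalman update: return analysis state and variance."""
    gain = p_forecast / (p_forecast + r_obs)           # Kalman gain
    x_analysis = x_forecast + gain * (y_obs - x_forecast)
    p_analysis = (1.0 - gain) * p_forecast             # reduced uncertainty
    return x_analysis, p_analysis

# Forecast flux (arbitrary units) with variance 4, observation with variance 1:
# the analysis is pulled toward the more certain observation.
x_a, p_a = assimilate(100.0, 4.0, 110.0, 1.0)
print(x_a, p_a)
```

Repeating this update every hour at each grid point, with the numerical simulation providing the forecast step, yields the rolling 1-3 day flux forecasts described above.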
With the continuous development of deep learning, the Deep Convolutional Neural Network (DCNN) has attracted wide attention in industry due to its high accuracy in image classification. Compared with other DCNN hardware deployment platforms, the Field Programmable Gate Array (FPGA) has the advantages of programmability, low power consumption, parallelism, and low cost. However, the enormous amount of computation in DCNNs and the limited logic capacity of FPGAs restrict the energy efficiency of DCNN accelerators. The traditional sequential sliding window method can improve the throughput of a DCNN accelerator through data multiplexing, but its multiplexing rate is low because it repeatedly reads data between rows. This paper proposes a fast data readout strategy based on a circular sliding window data reading method, which improves the multiplexing rate of data between rows by optimizing the memory access order of input data. In addition, the multiplication bit width of a DCNN accelerator is much smaller than that of the Digital Signal Processing (DSP) blocks on an FPGA, so using a single DSP per multiplication wastes resources. A multiplier sharing strategy is therefore proposed: the accelerator's multiplier is customized so that a single DSP block can complete multiple groups of 4-, 6-, and 8-bit signed multiplications in parallel. Finally, based on these two strategies, an optimized FPGA accelerator is proposed. The accelerator is implemented in Verilog and deployed on a Xilinx VCU118. When recognizing the CIFAR-10 dataset, its energy efficiency is 39.98 GOPS/W, a 1.73× improvement over previous DCNN FPGA accelerators. When recognizing the ImageNet dataset, its energy efficiency is 41.12 GOPS/W, showing 1.28×-3.14× energy efficiency compared with others.
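The multiplier-sharing idea rests on packing several narrow operands into one wide multiplication, with guard bits keeping the partial products in disjoint bit fields. This sketch shows only the unsigned case; the 4/6/8-bit signed variants in the paper require extra correction logic not reproduced here:

```python
# One wide multiply emulating two narrow ones: with 8-bit operands each
# partial product fits in 16 bits, so a 16-bit field separation guarantees
# a*b2 never carries into a*b1 -- the same trick lets one FPGA DSP block
# serve several small multiplications at once.
SHIFT = 16

def packed_mul(a, b1, b2):
    """Return (a*b1, a*b2) computed with a single wide multiplication."""
    assert 0 <= a < 256 and 0 <= b1 < 256 and 0 <= b2 < 256
    wide = a * ((b1 << SHIFT) | b2)            # the one shared multiply
    return wide >> SHIFT, wide & ((1 << SHIFT) - 1)

print(packed_mul(13, 7, 9))   # -> (91, 117)
```

The worst case 255 * 255 = 65025 still fits below 2**16, so the two result fields never interfere.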
Wind and wave data are essential in climatological and engineering design applications. In this study, data from 15 buoys located throughout the South China Sea (SCS) were used to evaluate ERA5 wind and wave data; such an applicability assessment is beneficial for gaining insight into the reliability of the ERA5 data in the SCS. The bias between the ERA5 and observed wind-speed data ranged from -0.78 to 0.99 m/s, indicating that, while underestimation of the ERA5 wind-speed data was dominant, overestimation existed as well. Additionally, the ERA5 data underestimated the annual maximum wind speed by up to 38%, with a correlation coefficient >0.87. The bias between the ERA5 and observed significant wave height (SWH) data varied from -0.24 to 0.28 m. The ERA5 data showed positive SWH bias, which implied a general underestimation at all locations, except those in the Beibu Gulf and the central-western SCS, where overestimation was observed. Under extreme conditions, the annual maximum SWH in the ERA5 data was underestimated by up to 30%. The correlation coefficients between the ERA5 and observed SWH data at all locations were greater than 0.92, except in the central-western SCS (0.84). The bias between the ERA5 and observed mean wave period (MWP) data varied from -0.74 to 0.57 s. The ERA5 data showed negative MWP biases, implying a general overestimation at all locations, except B1 (the Beibu Gulf) and B7 (the northeastern SCS), where underestimation was observed. The correlation coefficient between the ERA5 and observed MWP data in the Beibu Gulf was the smallest (0.56), and those of the other locations fluctuated within a narrow range from 0.82 to 0.90. The intercomparison indicates that, during the analyzed time span, the ERA5 data generally underestimated wind speed and SWH but overestimated MWP. Under non-extreme conditions, the ERA5 wind-speed and SWH data can be used with confidence in most regions of the SCS, except the central-western SCS.
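The evaluation metrics used throughout, mean bias and the Pearson correlation coefficient, can be sketched as follows; the wind-speed series is synthetic, not the SCS buoy data:

```python
import numpy as np

# Toy buoy-vs-reanalysis comparison: mean bias (ERA5 minus observation),
# Pearson correlation, and relative underestimation of the series maximum.
obs  = np.array([4.1, 5.6, 7.9, 10.2, 12.5, 9.3, 6.4])   # buoy wind speed, m/s
era5 = np.array([3.8, 5.2, 7.5,  9.6, 11.7, 8.9, 6.1])   # collocated ERA5 values

bias = float(np.mean(era5 - obs))             # negative => ERA5 underestimates
r = float(np.corrcoef(era5, obs)[0, 1])
under_pct = 100.0 * (obs.max() - era5.max()) / obs.max()

print(f"bias = {bias:.2f} m/s, r = {r:.3f}, "
      f"max underestimated by {under_pct:.1f}%")
```

Computed per buoy and per variable (wind speed, SWH, MWP), these three numbers reproduce the kind of summary statistics reported in the abstract.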
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology has immutability, decentralization, and autonomy, which can greatly improve the inherent defects of the IIoT. In the traditional blockchain, data is stored in a Merkle tree. As data continues to grow, the scale of the proofs used to validate it grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by the IIoT; PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
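The Merkle-proof growth that motivates PVC is easy to quantify: opening one leaf of an n-leaf binary tree takes one sibling hash per level, i.e. ceil(log2 n) hashes. This sketch shows only that baseline; the PVC construction itself is not reproduced here:

```python
# Proof size for authenticating a single leaf in a binary Merkle tree:
# the path to the root needs one sibling hash per level, so the proof
# grows logarithmically with the amount of stored data -- the overhead
# that vector commitments such as PVC aim to reduce.
def merkle_proof_len(n_leaves):
    """Number of sibling hashes needed to open one leaf of an n-leaf tree."""
    length = 0
    while n_leaves > 1:
        n_leaves = (n_leaves + 1) // 2   # one sibling hash per level
        length += 1
    return length

for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, merkle_proof_len(n))        # 10, 20, 30 hashes respectively
```

Going from a thousand to a billion records triples the per-item proof, which is exactly the scaling pressure the abstract describes.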
Accurate prediction of formation pore pressure is essential for predicting fluid flow and managing hydrocarbon production in petroleum engineering. Recent deep learning techniques have been receiving growing interest due to their great potential for pore pressure prediction. However, most traditional deep learning models are less effective at addressing generalization problems. To fill this technical gap, in this work we developed a new adaptive physics-informed deep learning model with high generalization capability that predicts pore pressure values directly from seismic data. Specifically, the new model, named CGP-NN, consists of a novel parametric feature extraction approach (1DCPP), a stacked multilayer gated recurrent model (multilayer GRU), and an adaptive physics-informed loss function. Through machine training, the developed model can automatically select the optimal physical model to constrain the results for each pore pressure prediction. The CGP-NN model generalizes best when the physics-related metric λ = 0.5. A hybrid approach combining the Eaton and Bowers methods is also proposed to build machine-learnable labels, addressing the problem of few labels. To validate the developed model and methodology, a case study on a complex reservoir in the Tarim Basin was performed, demonstrating high accuracy in the pore pressure prediction of new wells along with strong generalization ability. The adaptive physics-informed deep learning approach presented here has potential application in the prediction of pore pressures coupled with multiple genesis mechanisms using seismic data.
To address the problems of a single encryption algorithm, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. Firstly, to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data and adopt a homomorphic encryption algorithm to encrypt data that needs to be calculated. To accelerate encryption, we adopt a dual-thread encryption mode. Finally, the HDFS control module is designed to combine the encryption algorithm with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure of metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrated with the dual-channel storage mode improves encryption storage efficiency by 27.6% on average.
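The dual-thread encryption mode can be sketched with a stand-in keystream cipher (SHA-256 in counter mode), each thread encrypting half of the data; the paper's improved ECC and homomorphic schemes are not reproduced:

```python
from concurrent.futures import ThreadPoolExecutor
import hashlib

def keystream_xor(chunk, key, counter):
    """XOR `chunk` with a SHA-256 counter-mode keystream (stand-in cipher)."""
    out = bytearray()
    for i in range(0, len(chunk), 32):
        block = hashlib.sha256(
            key + counter.to_bytes(8, "big") + i.to_bytes(8, "big")
        ).digest()
        out += bytes(a ^ b for a, b in zip(chunk[i:i + 32], block))
    return bytes(out)

def dual_thread_encrypt(data, key):
    """Split the data in half and encrypt both halves on two threads."""
    half = len(data) // 2
    parts = [(data[:half], 0), (data[half:], 1)]   # distinct counters per half
    with ThreadPoolExecutor(max_workers=2) as ex:
        return b"".join(ex.map(lambda p: keystream_xor(p[0], key, p[1]), parts))

key = b"demo-key"                                   # illustrative key
msg = b"static data stored on the big data platform" * 4
ct = dual_thread_encrypt(msg, key)
```

Because the stand-in is an XOR stream, applying the same function again decrypts; a real deployment would use the paper's ciphers, where true thread-level speedup also depends on the cipher implementation releasing the interpreter lock.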
Ratoon rice, which refers to a second harvest of rice obtained from tillers regenerating from the stubble of the first harvested crop, plays an important role in both food security and agroecology while requiring minimal agricultural inputs. However, accurately identifying ratoon rice crops is challenging due to the similarity of their spectral features with other rice cropping systems (e.g., double rice). Moreover, images with high spatiotemporal resolution are essential, since ratoon rice is generally cultivated in fragmented croplands within regions that frequently exhibit cloudy and rainy weather. In this study, taking Qichun County in Hubei Province, China as an example, we developed a new phenology-based ratoon rice vegetation index (PRVI) for ratoon rice mapping at 30 m spatial resolution using a robust time series generated from Harmonized Landsat and Sentinel-2 (HLS) images. The PRVI, which incorporates the red, near-infrared, and shortwave infrared 1 bands, was developed based on an analysis of spectro-phenological separability and feature selection. Based on actual field samples, the performance of the PRVI for ratoon rice mapping was carefully evaluated by comparing it to several vegetation indices, including the normalized difference vegetation index (NDVI), enhanced vegetation index (EVI), and land surface water index (LSWI). The results suggested that the PRVI could sufficiently capture the specific characteristics of ratoon rice, leading to favorable separability between ratoon rice and other land cover types. Furthermore, the PRVI showed the best performance for identifying ratoon rice in the phenological phases from grain filling and harvesting to tillering of the ratoon crop (GHS-TS2), indicating that only a few images are required to obtain an accurate ratoon rice map. Finally, the PRVI performed better than NDVI, EVI, LSWI, and their combination at the GHS-TS2 stages, with producer's and user's accuracies of 92.22% and 89.30%, respectively. These results demonstrate that the proposed HLS-based PRVI can effectively identify ratoon rice in fragmented croplands at crucial phenological stages, which is promising for identifying the earliest timing of ratoon rice planting and can provide a fundamental dataset for crop management activities.
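The baseline indices the PRVI is compared against have standard formulas, shown below on illustrative surface reflectances; the PRVI formula itself is not given in the abstract and is therefore not reproduced:

```python
# Standard band-ratio vegetation indices computed from surface reflectance
# (0-1). The reflectance values below are illustrative, not HLS samples.
def ndvi(red, nir):
    return (nir - red) / (nir + red)

def lswi(nir, swir1):
    return (nir - swir1) / (nir + swir1)

def evi(blue, red, nir):
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

red, nir, swir1, blue = 0.05, 0.45, 0.20, 0.03
print(f"NDVI = {ndvi(red, nir):.3f}, LSWI = {lswi(nir, swir1):.3f}, "
      f"EVI = {evi(blue, red, nir):.3f}")
```

Like these indices, the PRVI combines the red, NIR, and SWIR1 bands; its advantage, per the abstract, comes from targeting the spectro-phenological stages specific to the ratoon crop.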
Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time-series data into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. The data augmentation method is the addition of noise: Gaussian noise, with the noise level set to 0.002, is added to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time-series data into images. This enables the identification of patterns in time-series data and assists in capturing their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, and the detected anomaly areas are represented as heat maps. By applying an anomaly map to the original image, it is possible to locate the areas where anomalies occur. The performance evaluation shows that both the F1-score and accuracy are high when time-series data are converted to images. Additionally, when processed as images rather than as raw time series, both the data size and the training time were significantly reduced. The proposed method can provide an important springboard for research on anomaly detection using time-series data and helps address problems such as analyzing complex patterns in data in a lightweight manner.
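The two preprocessing steps, Gaussian-noise augmentation at the stated level of 0.002 and the Markov Transition Field conversion, can be sketched as follows; the bin count Q and the input signal are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, sigma=0.002):
    """Gaussian-noise augmentation at the noise level used in the paper."""
    return x + rng.normal(0.0, sigma, size=x.shape)

def mtf(x, Q=8):
    """Minimal Markov Transition Field: MTF[i, j] = P(bin(x_i) -> bin(x_j))."""
    edges = np.quantile(x, np.linspace(0, 1, Q + 1)[1:-1])
    q = np.digitize(x, edges)                 # quantile bin of each sample, 0..Q-1
    W = np.zeros((Q, Q))                      # bin-to-bin transition counts
    np.add.at(W, (q[:-1], q[1:]), 1.0)
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-12)   # row-normalize
    return W[np.ix_(q, q)]                    # expand to an n x n image

x = augment(np.sin(np.linspace(0, 4 * np.pi, 128)))
img = mtf(x)
print(img.shape)                              # one image per series
```

The resulting n x n image encodes the transition probability between the states at every pair of time steps, which is what an image-based detector such as PatchCore then consumes.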
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely the GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study offers a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
Abstract: With recent technological developments, massive vehicular ad hoc networks (VANETs) have been established, enabling numerous vehicles and their respective Road Side Unit (RSU) components to communicate with one another. The best way to enhance traffic flow for vehicles and traffic management departments is to share the data they receive. However, VANET systems need stronger protection. An effective and safe method of outsourcing is suggested, which reduces computation costs while achieving data security using a homomorphic mapping based on the conjugate operation of matrices. This research proposes a VANET-based data outsourcing system to address these issues. To keep data outsourcing secure, the suggested model takes cryptographic models into account. Fog nodes keep the generated keys for the purpose of vehicle authentication. For controlling and overseeing the outsourced data while preserving privacy, the suggested approach employs a Trusted Certified Auditor (TCA). Using the secret key, the TCA can identify the genuine identity of vehicles in the VANET when harmful messages are detected. The proposed model develops a TCA-based unique static vehicle labeling system using cryptography (TCA-USVLC) for secure data outsourcing and privacy preservation in VANETs. The proposed model calculates the trust of vehicles in 16 ms for an average of 180 vehicles and achieves 98.6% accuracy for data encryption to provide security. It achieved 98.5% accuracy in data outsourcing and 98.6% accuracy in privacy preservation in fog-enabled VANETs. Elliptic curve cryptography models can be applied in the future for better encryption and decryption rates with lightweight cryptographic operations.
Funding: Supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021QD032).
Abstract: Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale and active landslide on the south bank of the Yangtze River. The latest monitoring data and site investigations available are analyzed to establish spatial and temporal landslide deformation characteristics. Data mining technology, including two-step clustering and the Apriori algorithm, is then used to identify the dominant triggers of landslide movement. In the data mining process, the two-step clustering method clusters the candidate triggers and displacement rate into several groups, and the Apriori algorithm generates correlation criteria for the cause and effect. The analysis considers multiple locations of the landslide and incorporates two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. This analysis shows that the deformations of the Outang landslide are driven by both rainfall and reservoir water, while its deformation varies spatiotemporally mainly due to the difference in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: the monthly and bi-monthly cumulative rainfall control the monthly deformation, and the 10-d cumulative rainfall and the 5-d cumulative drop of water level in the reservoir dominate the daily deformation of the landslide. It is concluded that the spatiotemporal deformation pattern and data mining rules associated with precipitation and reservoir water level have the potential to be broadly implemented for improving landslide prevention and control in dam reservoirs and other landslide-prone areas.
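The Apriori step described above can be illustrated with a minimal sketch. The item names (discretized triggers and a displacement-rate class) are hypothetical placeholders, not the study's actual categories, and the thresholds are arbitrary; the sketch only shows how frequent itemsets yield trigger-to-deformation association rules.

```python
from itertools import combinations

def apriori_rules(transactions, min_support=0.5, min_confidence=0.8):
    """Toy Apriori: mine frequent itemsets (up to size 3), then emit rules
    (antecedent -> consequent) meeting the confidence threshold."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    support = {}
    for k in (1, 2, 3):
        for cand in combinations(items, k):
            s = sum(1 for t in transactions if set(cand) <= t) / n
            if s >= min_support:
                support[frozenset(cand)] = s
    rules = []
    for itemset, s in support.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for ante in combinations(sorted(itemset), r):
                ante = frozenset(ante)
                if ante in support:
                    conf = s / support[ante]
                    if conf >= min_confidence:
                        rules.append((set(ante), set(itemset - ante), round(conf, 2)))
    return rules

# Hypothetical monthly records: triggers observed together with fast movement
records = [
    {"heavy_rain", "water_drop", "fast_move"},
    {"heavy_rain", "fast_move"},
    {"heavy_rain", "water_drop", "fast_move"},
    {"water_drop"},
]
for ante, cons, conf in apriori_rules(records, min_support=0.5, min_confidence=0.9):
    print(sorted(ante), "->", sorted(cons), conf)
```

With these toy records, heavy rain alone implies fast movement with confidence 1.0, mirroring how the study extracts dominant triggers from co-occurrence patterns.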
Abstract: The 6th generation mobile network (6G) is a multi-network interconnection and multi-scenario coexistence network, in which multiple network domains break their original fixed boundaries to form connections and convergence. With the optimization objective of maximizing network utility while ensuring performance-centric weighted fairness among flows, this paper designs a reinforcement learning-based cloud-edge autonomous multi-domain data center network architecture that achieves single-domain autonomy and multi-domain collaboration. Because the utilities of different flows conflict, the bandwidth fairness allocation problem for various types of flows is formulated by considering differently defined reward functions. Regarding the tradeoff between fairness and utility, this paper designs corresponding reward functions for the cases where the flows undergo abrupt changes and smooth changes. In addition, to accommodate the Quality of Service (QoS) requirements for multiple types of flows, this paper proposes a multi-domain autonomous routing algorithm called LSTM+MADDPG. By introducing a Long Short-Term Memory (LSTM) layer into the actor and critic networks, more information about temporal continuity is added, further enhancing the ability to adapt to changes in the dynamic network environment. The LSTM+MADDPG algorithm is compared with the latest reinforcement learning algorithms through experiments on real network topology and traffic traces, and the experimental results show that LSTM+MADDPG improves the delay convergence speed by 14.6% and delays the onset of packet loss by 18.2% compared with other algorithms.
Funding: The National Key R&D Program of China under contract No. 2021YFC3101603.
Abstract: Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Currently, one of the commonly used methods for ocean temperature prediction is data-driven, but research on this method is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two unknown dependencies between sequences based on the original 3D-OT data without prior knowledge. Temporal and spatial dependencies in the time series were then captured using temporal and graph convolutions. We also integrated dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. In this study, we conducted prediction experiments using high-resolution 3D-OT from the Copernicus global ocean physical reanalysis, with data covering the vertical variation of temperature from the sea surface to 1000 m below. We compared five mainstream models that are commonly used for ocean temperature prediction, and the results showed that our method achieved the best prediction results at all prediction scales.
Funding: Supported by the National Natural Science Foundation of China (41977215).
Abstract: Long-runout landslides involve a massive amount of energy and can be extremely hazardous owing to their long movement distance, high mobility, and strong destructive power. Numerical methods have been widely used to predict landslide runout, but a fundamental problem remains: how to determine reliable numerical parameters. This study proposes a framework to predict the runout of potential landslides through multi-source data collaboration and numerical analysis of historical landslide events. Specifically, for historical landslide cases, the landslide-induced seismic signal, geophysical surveys, and possible in-situ drone/phone videos (multi-source data collaboration) can validate the numerical results in terms of landslide dynamics and deposit features and help calibrate the numerical (rheological) parameters. Subsequently, the calibrated numerical parameters can be used to numerically predict the runout of potential landslides in regions with a geological setting similar to the recorded events. Application of the runout prediction approach to the 2020 Jiashanying landslide in Guizhou, China gives reasonable results in comparison to the field observations. The numerical parameters are determined from the multi-source data collaboration analysis of a historical case in the region (the 2019 Shuicheng landslide). The proposed framework for landslide runout prediction can be of great utility for landslide risk assessment and disaster reduction in mountainous regions worldwide.
Abstract: Building model data organization is often programmed to solve a specific problem, resulting in the inability to organize indoor and outdoor 3D scenes in an integrated manner. In this paper, existing building spatial data models are studied, and the characteristics of the building information modeling standard (IFC), the City Geography Markup Language (CityGML), the Indoor Geography Markup Language (IndoorGML), and other models are compared and analyzed. CityGML and IndoorGML face challenges in satisfying diverse application scenarios and requirements due to limitations in their expressive capabilities. It is proposed to combine the semantic information of model objects to effectively partition and organize indoor and outdoor spatial 3D model data and to construct an indoor-outdoor data organization mechanism of "chunk-layer-subobject-entrances-area-detail object." This method is verified by proposing a 3D data organization method for indoor and outdoor space and constructing a 3D visualization system based on it.
Abstract: This article presents a real-life project that aimed to evaluate the safety of traffic vehicles on old bridges without any prior data. The project involved various safety inspections, including conventional, static, and dynamic load inspections and safety assessments. After conducting these tests, it was concluded that the structure of the old bridge is relatively safe, with only a few bumps. The bridge could function normally following appropriate treatment. The analysis provides valuable insights into the assessment of the quality and safety of such bridges to ensure the safe driving of heavy vehicles.
Abstract: This study aims to improve knowledge of the structure of southwest Cameroon based on the analysis and interpretation of gravity data derived from the SGG-UGM-2 model. A residual anomaly map was first calculated from the Bouguer anomaly map, which is strongly affected by a regional gradient. The residual anomaly map generated provides information on the variation in subsurface density, but does not provide sufficient information, hence the interest in filtering, with the aim of highlighting the structures affecting the southwest Cameroon area. Three interpretation methods were used: the vertical gradient, the horizontal gradient coupled with upward continuation, and Euler deconvolution. The application of these treatments enabled us to map a large number of gravimetric lineaments materializing density discontinuities. These lineaments are organized along main preferential directions (NW-SE, NNE-SSW, ENE-WSW) and secondary directions (NNW-SSE, NE-SW, N-S, and E-W). Euler solutions indicate depths of up to 7337 m. Thanks to the results of this research, significant information has been acquired, contributing to a deeper understanding of the structural composition of the study area. The resulting structural map vividly illustrates the major tectonic events that shaped the geological framework of the study area. It also serves as a guide for prospecting subsurface resources (water and hydrocarbons).
Abstract: Over the past decade, Graphics Processing Units (GPUs) have revolutionized high-performance computing, playing pivotal roles in advancing fields like IoT, autonomous vehicles, and exascale computing. Despite these advancements, efficiently programming GPUs remains a daunting challenge, often relying on trial-and-error optimization methods. This paper introduces an optimization technique for CUDA programs through a novel Data Layout strategy, aimed at restructuring memory data arrangement to significantly enhance data access locality. Focusing on the dynamic programming algorithm for chained matrix multiplication, a critical operation across various domains including artificial intelligence (AI), high-performance computing (HPC), and the Internet of Things (IoT), this technique facilitates more localized access. We specifically illustrate the importance of efficient matrix multiplication in these areas, underscoring the technique's broader applicability and its potential to address some of the most pressing computational challenges in GPU-accelerated applications. Our findings reveal a remarkable reduction in memory consumption and a substantial 50% decrease in execution time for CUDA programs utilizing this technique, thereby setting a new benchmark for optimization in GPU computing.
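The dynamic-programming algorithm for chained matrix multiplication mentioned above can be sketched as follows. This is the textbook sequential formulation, not the paper's CUDA kernel or its Data Layout code; the comments only note where the anti-diagonal structure creates the parallelism and memory-layout opportunity that a GPU implementation exploits.

```python
def matrix_chain_cost(dims):
    """Classic DP table for chained matrix multiplication. dims[i], dims[i+1]
    are the dimensions of matrix i; returns the minimum number of scalar
    multiplications. Each chain length fills one anti-diagonal of the m-table,
    and all cells on that diagonal are independent -- on a GPU they can be
    computed in parallel, and storing each diagonal contiguously lets
    neighbouring threads read neighbouring addresses (the locality idea)."""
    n = len(dims) - 1  # number of matrices in the chain
    m = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):          # one anti-diagonal per chain length
        for i in range(n - length + 1):
            j = i + length - 1
            m[i][j] = min(
                m[i][k] + m[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return m[0][n - 1]

# Parenthesizing (A1 A2) A3 beats A1 (A2 A3) here: 4500 vs. 27000 multiplications
print(matrix_chain_cost([10, 30, 5, 60]))
```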
Funding: The National Natural Science Foundation of China under contract No. 41806048; the Open Fund of the Hubei Key Laboratory of Marine Geological Resources under contract No. MGR202009; the Fund from the Key Laboratory of Deep-Earth Dynamics of the Ministry of Natural Resources, Institute of Geology, Chinese Academy of Geological Sciences under contract No. J1901-16; the Aoshan Science and Technology Innovation Project of Pilot National Laboratory for Marine Science and Technology (Qingdao) under contract No. 2015ASKJ03-Seabed Resources; and the Fund from the Korea Institute of Ocean Science and Technology (KIOST) under contract No. PE99741.
Abstract: The South Yellow Sea basin is filled with Mesozoic-Cenozoic continental sediments overlying pre-Palaeozoic and Mesozoic-Palaeozoic marine sediments. Conventional multi-channel seismic data cannot describe the velocity structure of the marine residual basin in detail, leading to the lack of a deeper understanding of its distribution and lithology owing to strong energy shielding at the top interface of the marine sediments. In this study, we present seismic tomography data from ocean bottom seismographs that describe the NEE-trending velocity distributions of the basin. The results indicate that strong velocity variations occur at shallow crustal levels. Horizontal velocity bodies show good correlation with surface geological features, and multi-layer features exist in the vertical velocity framework (depth: 0-10 km). Analyses of the velocity model, gravity data, magnetic data, multi-channel seismic profiles, and drilling data showed that high-velocity anomalies (>6.5 km/s) of small (thickness: 1-2 km) and large (thickness: >5 km) scales were caused by igneous complexes in the multi-layer structure, which were active during the Palaeogene. Possible locations of good Mesozoic and Palaeozoic marine strata are limited to the Central Uplift and the western part of the Northern Depression along the wide-angle ocean bottom seismograph array. Following the Indosinian movement, strong compression existed in the Northern Depression during the extensional phase, which caused the formation of folds in the middle of the survey line. This study is useful for reconstructing the regional tectonic evolution and delineating the distribution of the marine residual basin in the South Yellow Sea basin.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42025404, 42188101, and 42241143); the National Key R&D Program of China (Grant Nos. 2022YFF0503700 and 2022YFF0503900); the B-type Strategic Priority Program of the Chinese Academy of Sciences (Grant No. XDB41000000); and the Fundamental Research Funds for the Central Universities (Grant No. 2042022kf1012).
Abstract: Because radiation belt electrons can pose a potential threat to the safety of satellites orbiting in space, it is of great importance to develop a reliable model that can predict the highly dynamic variations in outer radiation belt electron fluxes. In the present study, we develop a forecast model of radiation belt electron fluxes based on the data assimilation method, using Van Allen Probe measurements combined with three-dimensional radiation belt numerical simulations. Our forecast model covers the entire outer radiation belt with a high temporal resolution (1 hour) and a spatial resolution of 0.25 L over a wide range of both electron energy (0.1-5.0 MeV) and pitch angle (5°-90°). On the basis of this model, we forecast hourly electron fluxes for the next 1, 2, and 3 days during an intense geomagnetic storm and evaluate the corresponding prediction performance. Our model can reasonably predict the storm-time evolution of radiation belt electrons with high prediction efficiency (up to ~0.8-1). The best prediction performance is found for ~0.3-3 MeV electrons at L = ~3.25-4.5, which extends to higher L and lower energies with increasing pitch angle. Our results demonstrate that the forecast model developed can be a powerful tool to predict the spatiotemporal changes in outer radiation belt electron fluxes, and the model has both scientific significance and practical implications.
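The "prediction efficiency" quoted above is commonly defined as one minus the ratio of the mean squared prediction error to the variance of the observations; the abstract does not spell out its exact formula, so the sketch below is an assumption based on that common usage, with made-up sample values.

```python
def prediction_efficiency(observed, predicted):
    """PE = 1 - MSE / Var(observed). PE -> 1 means the forecast tracks the
    observations; PE <= 0 means it is no better than predicting the mean.
    (A common definition; assumed here, not taken from the paper.)"""
    n = len(observed)
    mean_obs = sum(observed) / n
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    var = sum((o - mean_obs) ** 2 for o in observed) / n
    return 1.0 - mse / var

# Hypothetical log-flux samples: a close forecast scores near 1
obs = [1.0, 2.0, 3.0, 4.0]
pred = [1.1, 1.9, 3.2, 3.8]
print(round(prediction_efficiency(obs, pred), 3))
```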
Funding: Supported in part by the Major Program of the Ministry of Science and Technology of China under Grant 2019YFB2205102, and in part by the National Natural Science Foundation of China under Grants 61974164, 62074166, 61804181, 62004219, 62004220, and 62104256.
Abstract: With the continuous development of deep learning, the Deep Convolutional Neural Network (DCNN) has attracted wide attention in industry due to its high accuracy in image classification. Compared with other DCNN hardware deployment platforms, the Field Programmable Gate Array (FPGA) has the advantages of being programmable, low power consumption, parallelism, and low cost. However, the enormous amount of computation in DCNNs and the limited logic capacity of FPGAs restrict the energy efficiency of DCNN accelerators. The traditional sequential sliding window method can improve the throughput of a DCNN accelerator through data multiplexing, but this method's data multiplexing rate is low because it repeatedly reads data between rows. This paper proposes a fast data readout strategy via a circular sliding window data reading method, which improves the multiplexing rate of data between rows by optimizing the memory access order of the input data. In addition, the multiplication bit width of the DCNN accelerator is much smaller than that of the Digital Signal Processing (DSP) blocks on the FPGA, which means that resources are wasted if one multiplication uses a single DSP. A multiplier sharing strategy is therefore proposed: the multiplier of the accelerator is customized so that a single DSP block can complete multiple groups of 4-, 6-, and 8-bit signed multiplications in parallel. Finally, based on these two strategies, an optimized FPGA accelerator is proposed. The accelerator is implemented in Verilog and deployed on a Xilinx VCU118. When the accelerator recognizes the CIFAR-10 dataset, its energy efficiency is 39.98 GOPS/W, which provides 1.73× better energy efficiency than previous DCNN FPGA accelerators. When the accelerator recognizes the ImageNet dataset, its energy efficiency is 41.12 GOPS/W, which shows 1.28×-3.14× better energy efficiency compared with others.
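A toy model makes the row-reuse argument concrete. Assuming a K-row convolution window sliding vertically over an H-row input, the sequential method re-reads every row of each window position, while a circular (row-buffered) order fetches each input row only once and reuses the K-1 rows already buffered on chip. The functions below count row fetches under those two assumptions; they model the access pattern only and are not the accelerator's RTL.

```python
def rows_read_sequential(H, K):
    """Sequential sliding window: every one of the H-K+1 window positions
    re-reads all K of its rows from memory."""
    return (H - K + 1) * K

def rows_read_circular(H, K):
    """Circular sliding window: the first window reads K rows; each later
    position reads only the one new row, reusing K-1 buffered rows."""
    return K + (H - K)

# For an 8-row input and a 3-row window: 18 fetches vs. 8 fetches
print(rows_read_sequential(8, 3), rows_read_circular(8, 3))
```

The gap widens with larger windows and taller feature maps, which is exactly where the higher multiplexing rate pays off.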
Funding: Supported by the Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai) (No. SML2021SP102); the Key Laboratory of Marine Environmental Survey Technology and Application, Ministry of Natural Resources (Nos. MESTA-2020-C003, MESTA-2020-C004); the Key Research and Development Project of Guangdong Province (No. 2020B1111020003); and the Science and Technology Research Project of Jiangxi Provincial Department of Education (No. GJJ200330).
Abstract: Wind and wave data are essential in climatological and engineering design applications. In this study, data from 15 buoys located throughout the South China Sea (SCS) were used to evaluate the ERA5 wind and wave data. Applicability assessment is beneficial for gaining insight into the reliability of the ERA5 data in the SCS. The bias between the ERA5 and observed wind-speed data ranged from -0.78 to 0.99 m/s. This result indicates that, while underestimation of the ERA5 wind-speed data was dominant, overestimation existed as well. Additionally, the ERA5 data underestimated the annual maximum wind-speed by up to 38%, with a correlation coefficient >0.87. The bias between the ERA5 and observed significant wave height (SWH) data varied from -0.24 to 0.28 m. The ERA5 data showed positive SWH bias, which implied a general underestimation at all locations, except those in the Beibu Gulf and central-western SCS, where overestimation was observed. Under extreme conditions, the annual maximum SWH in the ERA5 data was underestimated by up to 30%. The correlation coefficients between the ERA5 and observed SWH data at all locations were greater than 0.92, except in the central-western SCS (0.84). The bias between the ERA5 and observed mean wave period (MWP) data varied from -0.74 to 0.57 s. The ERA5 data showed negative MWP biases, implying a general overestimation at all locations, except for B1 (the Beibu Gulf) and B7 (the northeastern SCS), where underestimation was observed. The correlation coefficient between the ERA5 and observed MWP data in the Beibu Gulf was the smallest (0.56), and those of the other locations fluctuated within a narrow range from 0.82 to 0.90. The intercomparison indicates that, during the analyzed time-span, the ERA5 data generally underestimated wind-speed and SWH but overestimated MWP. Under non-extreme conditions, the ERA5 wind-speed and SWH data can be used with confidence in most regions of the SCS, except the central-western SCS.
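The bias and correlation statistics behind such an evaluation can be sketched as below. The sample values are hypothetical wind speeds, not the study's buoy data; the sketch just shows that a negative mean difference (model minus observation) indicates underestimation, and that the correlation coefficient is computed independently of the bias.

```python
import math

def bias(model, obs):
    """Mean difference, model minus observation (negative = underestimation)."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical wind-speed samples (m/s): reanalysis vs. buoy
era5 = [5.2, 6.8, 7.1, 9.5]
buoy = [5.6, 7.0, 7.8, 10.1]
print(round(bias(era5, buoy), 2), round(pearson_r(era5, buoy), 3))
```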
Funding: Supported by China's National Natural Science Foundation (Nos. 62072249, 62072056) and the National Science Foundation of Hunan Province (2020JJ2029).
Abstract: With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In a traditional blockchain, data is stored in a Merkle tree. As data continues to grow, the scale of the proofs used to validate it grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by IIoT. PVC can improve the efficiency of traditional VC in the processes of commitment and opening. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
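The proof-size problem with Merkle trees can be seen in a short textbook sketch: for n leaves, a membership proof carries one sibling hash per tree level, i.e. about log2(n) hashes, which is the growth that a vector commitment with constant-size openings (like the proposed PVC) aims to avoid. The code below is the standard Merkle construction, not the paper's PVC scheme.

```python
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root_and_proof(leaves, index):
    """Build a Merkle tree over `leaves`; return (root, proof) where the
    proof for leaf `index` holds one sibling hash per level (log2(n) total)."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        proof.append(level[index ^ 1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(root, leaf, index, proof):
    """Recompute the path to the root from the leaf and its sibling hashes."""
    node = h(leaf)
    for sib in proof:
        node = h(node + sib) if index % 2 == 0 else h(sib + node)
        index //= 2
    return node == root

leaves = [bytes([i]) for i in range(8)]
root, proof = merkle_root_and_proof(leaves, 5)
print(len(proof), verify(root, leaves[5], 5, proof))  # 3 sibling hashes for 8 leaves
```

Doubling the data adds another sibling hash to every proof; a constant-size opening removes that per-level overhead entirely.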
Funding: Funded by the National Natural Science Foundation of China (General Program: No. 52074314, No. U19B6003-05) and the National Key Research and Development Program of China (2019YFA0708303-05).
Abstract: Accurate prediction of formation pore pressure is essential to predict fluid flow and manage hydrocarbon production in petroleum engineering. Recent deep learning techniques have been receiving more interest due to their great potential to deal with pore pressure prediction. However, most traditional deep learning models are less efficient in addressing generalization problems. To fill this technical gap, in this work, we developed a new adaptive physics-informed deep learning model with high generalization capability to predict pore pressure values directly from seismic data. Specifically, the new model, named CGP-NN, consists of a novel parametric features extraction approach (1DCPP), a stacked multilayer gated recurrent model (multilayer GRU), and an adaptive physics-informed loss function. Through machine training, the developed model can automatically select the optimal physical model to constrain the results for each pore pressure prediction. The CGP-NN model has the best generalization when the physics-related metric λ = 0.5. A hybrid approach combining the Eaton and Bowers methods is also proposed to build machine-learnable labels to solve the problem of few labels. To validate the developed model and methodology, a case study on a complex reservoir in the Tarim Basin was further performed to demonstrate the high accuracy of pore pressure prediction for new wells along with the strong generalization ability. The adaptive physics-informed deep learning approach presented here has potential application in the prediction of pore pressures coupled with multiple genesis mechanisms using seismic data.
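The role of the physics-related metric λ can be sketched as a weighted sum of a data-misfit term and a physics-residual penalty. The exact CGP-NN loss is not given in the abstract, so both the weighting form and the inputs below are assumptions for illustration only.

```python
def physics_informed_loss(pred, target, physics_residual, lam=0.5):
    """Composite loss: (1 - lam) * data misfit + lam * physics penalty.
    `physics_residual` would come from a physical pore-pressure model (e.g.
    Eaton or Bowers); here it is an arbitrary placeholder list. The weighting
    form is an assumed sketch, not the paper's actual loss function."""
    n = len(pred)
    data_term = sum((p - t) ** 2 for p, t in zip(pred, target)) / n
    phys_term = sum(r ** 2 for r in physics_residual) / n
    return (1 - lam) * data_term + lam * phys_term

# lam = 0.5 (the value the paper reports as best) balances the two terms
print(physics_informed_loss([1.0, 2.0], [1.5, 2.5], [0.1, -0.1], lam=0.5))
```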
Abstract: In order to address the problems of a single encryption algorithm, such as low encryption efficiency and unreliable metadata for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. Firstly, in order to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data and adopt a homomorphic encryption algorithm to encrypt data that needs to be computed. To accelerate the encryption, we adopt a dual-thread encryption mode. Finally, the HDFS control module is designed to combine the encryption algorithm with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrated with the dual-channel storage mode improves encryption storage efficiency by 27.6% on average.
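The dual-thread encryption mode can be sketched as follows. The toy keystream cipher stands in for the scheme's improved ECC and homomorphic algorithms, which the abstract does not specify, so only the threading structure reflects the described design; in CPython the GIL means this illustrates the work-splitting structure rather than true parallel speedup, and the cipher must not be used for real security.

```python
import hashlib
import threading

def toy_stream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream cipher (SHA-256 in counter mode). Placeholder only;
    encryption and decryption are the same operation."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def dual_thread_encrypt(key, blocks):
    """Dual-thread mode: split the blocks between two worker threads so the
    two halves are processed concurrently, mirroring the described
    dual-thread acceleration."""
    results = [None] * len(blocks)
    def worker(indices):
        for i in indices:
            results[i] = toy_stream_encrypt(key, blocks[i])
    t1 = threading.Thread(target=worker, args=(range(0, len(blocks), 2),))
    t2 = threading.Thread(target=worker, args=(range(1, len(blocks), 2),))
    t1.start(); t2.start(); t1.join(); t2.join()
    return results

blocks = [b"hdfs-block-%d" % i for i in range(4)]
enc = dual_thread_encrypt(b"secret", blocks)
print(all(toy_stream_encrypt(b"secret", c) == b for c, b in zip(enc, blocks)))
```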
基金supported by the National Natural Science Foundation of China(42271360 and 42271399)the Young Elite Scientists Sponsorship Program by China Association for Science and Technology(CAST)(2020QNRC001)the Fundamental Research Funds for the Central Universities,China(2662021JC013,CCNU22QN018)。
Abstract: Ratoon rice, which refers to a second harvest of rice obtained from regenerated tillers originating from the stubble of the first harvested crop, plays an important role in both food security and agroecology while requiring minimal agricultural inputs. However, accurately identifying ratoon rice crops is challenging due to the similarity of their spectral features with other rice cropping systems (e.g., double rice). Moreover, images with a high spatiotemporal resolution are essential since ratoon rice is generally cultivated in fragmented croplands within regions that frequently exhibit cloudy and rainy weather. In this study, taking Qichun County in Hubei Province, China as an example, we developed a new phenology-based ratoon rice vegetation index (PRVI) for ratoon rice mapping at a 30 m spatial resolution using a robust time series generated from Harmonized Landsat and Sentinel-2 (HLS) images. The PRVI, which incorporates the red, near-infrared, and shortwave infrared 1 bands, was developed based on an analysis of spectro-phenological separability and feature selection. Based on actual field samples, the performance of the PRVI for ratoon rice mapping was carefully evaluated by comparing it to several vegetation indices, including the normalized difference vegetation index (NDVI), enhanced vegetation index (EVI), and land surface water index (LSWI). The results suggested that the PRVI could sufficiently capture the specific characteristics of ratoon rice, leading to favorable separability between ratoon rice and other land cover types. Furthermore, the PRVI showed the best performance for identifying ratoon rice in the phenological phases characterized by grain filling and harvesting to tillering of the ratoon crop (GHS-TS2), indicating that only several images are required to obtain an accurate ratoon rice map. Finally, the PRVI performed better than the NDVI, EVI, LSWI, and their combination at the GHS-TS2 stages, with a producer's accuracy and user's accuracy of 92.22% and 89.30%, respectively. These results demonstrate that the proposed PRVI based on HLS data can effectively identify ratoon rice in fragmented croplands at crucial phenological stages, which is promising for identifying the earliest timing of ratoon rice planting and can provide a fundamental dataset for crop management activities.
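The baseline indices the PRVI is compared against have standard band-ratio definitions, sketched below from hypothetical HLS surface reflectances. The PRVI's own formula is not given in the abstract, so it is deliberately not reproduced here.

```python
def ndvi(red, nir):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def lswi(nir, swir1):
    """Land surface water index: (NIR - SWIR1) / (NIR + SWIR1)."""
    return (nir - swir1) / (nir + swir1)

# Hypothetical surface reflectances for one vegetated pixel
red, nir, swir1 = 0.08, 0.42, 0.18
print(round(ndvi(red, nir), 3), round(lswi(nir, swir1), 3))
```

A phenology-based index like the PRVI combines such band ratios across dates so that the ratoon crop's distinctive regrowth stages become separable from double-rice signatures.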
Funding: This research was financially supported by the Ministry of Trade, Industry, and Energy (MOTIE), Korea, under the "Project for Research and Development with Middle Markets Enterprises and DNA (Data, Network, AI) Universities" (AI-based Safety Assessment and Management System for Concrete Structures) (Reference Number P0024559), supervised by the Korea Institute for Advancement of Technology (KIAT).
Abstract: Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time-series data into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. The chosen method of data augmentation is noise injection: Gaussian noise, with the noise level set to 0.002, is added to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time-series data into images. This enables the identification of patterns in time-series data and assists in capturing their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, and the detected anomaly areas are represented as heat maps. This allows for the detection of anomalies, and by applying an anomaly map to the original image, it is possible to capture the areas where anomalies occur. The performance evaluation shows that both the F1-score and accuracy are high when time-series data are converted to images. Additionally, when processed as images rather than as time-series data, there was a significant reduction in both the size of the data and the training time. The proposed method can provide an important springboard for research in the field of anomaly detection using time-series data. Besides, it helps solve problems such as analyzing complex patterns in data in a lightweight manner.
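The noise-injection augmentation and the MTF conversion can be sketched in a few lines. This is a minimal equal-width-bin version of the Markov Transition Field (production code would typically use quantile bins, e.g. via the pyts library); the sigma = 0.002 noise level follows the paper, while the sample series is made up.

```python
import random

def add_gaussian_noise(series, sigma=0.002, seed=0):
    """Noise-injection augmentation: add N(0, sigma) jitter to each sample."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma) for x in series]

def markov_transition_field(series, n_bins=4):
    """Minimal MTF: quantize the series into equal-width bins, estimate the
    first-order transition matrix W between consecutive samples, then set
    MTF[i][j] = W[bin(x_i)][bin(x_j)], giving an image-like matrix that
    preserves the temporal transition structure."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0
    bins = [min(int((x - lo) / width), n_bins - 1) for x in series]
    counts = [[0] * n_bins for _ in range(n_bins)]
    for a, b in zip(bins, bins[1:]):
        counts[a][b] += 1
    W = []
    for row in counts:
        s = sum(row)
        W.append([c / s if s else 0.0 for c in row])
    return [[W[a][b] for b in bins] for a in bins]

series = [0.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 0.0]
noisy = add_gaussian_noise(series)      # augmented copy of the series
mtf = markov_transition_field(series)   # 8 x 8 "image" for the detector
print(len(mtf), len(mtf[0]))
```

The resulting matrix can then be rendered as an image and fed to an image-based detector such as PatchCore.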
Funding: Supported by a Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government (Grant No. 20214000000140, Graduate School of Convergence for Clean Energy Integrated Power Generation); a Korea Basic Science Institute (National Research Facilities and Equipment Center) grant funded by the Ministry of Education (2021R1A6C101A449); and a National Research Foundation of Korea grant funded by the Ministry of Science and ICT (2021R1A2C1095139), Republic of Korea.
Abstract: Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.