Funding: Financially supported by the National Natural Science Foundation of China (Grant No. 52104013) and the China Postdoctoral Science Foundation (Grant No. 2022T150724).
Abstract: Due to the complexity and variability of carbonate formation leakage zones, lost circulation prediction and control is one of the major challenges of carbonate drilling, raising well-control risks and production expenses. This research takes the H oilfield as an example, employs seismic attributes for mud loss prediction, and produces a complete set of pre-drilling mud loss prediction solutions. First, 16 seismic attributes are calculated from the post-stack seismic data, and the mud loss rate per unit footage is specified. The sample set is constructed by extracting each attribute from the seismic traces surrounding 15 typical wells, with an 8:2 ratio between the training set and the test set. With the calibration results for mud loss rate per unit footage, the nonlinear mapping relationship between seismic attributes and mud loss rate per unit footage is established using a mixture density network (MDN) model. Then, the influence of the number of sub-Gaussians and the uncertainty coefficient on the model's prediction is evaluated. Finally, the model is used in conjunction with downhole drilling conditions to assess the risk of mud loss in various layers and along the wellbore trajectory. The study demonstrates that the mean relative errors of the model for the training data and test data are 6.9% and 7.5%, and that R² is 90% and 88%, respectively. The accuracy and efficacy of mud loss prediction can be greatly enhanced by combining 16 seismic attributes with the mud loss rate per unit footage and applying machine learning methods. The mud loss prediction model based on the MDN can not only predict the mud loss rate but also objectively evaluate the prediction based on the quality of the data and the model.
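The mixture density network at the heart of this workflow predicts a full probability distribution rather than a single value, which is what makes the uncertainty evaluation possible. The following is a minimal NumPy sketch of how an MDN output layer turns raw network outputs into a mixture prediction with an uncertainty estimate; it is a generic illustration under assumed names and shapes, not the authors' implementation.

```python
import numpy as np

def mdn_predict(raw, n_components):
    """Turn raw network outputs into a Gaussian mixture prediction.

    raw: 1-D array of length 3 * n_components holding, in order,
    unnormalized mixture logits, component means, and log-sigmas.
    """
    logits = raw[:n_components]
    mu = raw[n_components:2 * n_components]
    log_sigma = raw[2 * n_components:]

    # Softmax gives valid mixture weights that sum to 1.
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()
    sigma = np.exp(log_sigma)  # positive standard deviations

    # Point prediction: mixture mean; uncertainty: mixture std dev.
    mean = np.sum(pi * mu)
    var = np.sum(pi * (sigma**2 + mu**2)) - mean**2
    return float(mean), float(np.sqrt(var))

# Two sub-Gaussians with equal weights and means 0.2 and 0.8:
# the spread between them dominates the predictive uncertainty.
mean, std = mdn_predict(np.array([0.0, 0.0, 0.2, 0.8, -2.0, -2.0]), 2)
```

Increasing the number of sub-Gaussians lets the mixture fit multi-modal loss behavior, at the cost of more parameters to calibrate.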
Funding: Supported by the National Key R&D Program of China (2019YFA0606703) and the Youth Innovation Promotion Association of the Chinese Academy of Sciences (Grant No. Y202025).
Abstract: The 2015/16 El Niño event ranks among the top three of the last 100 years in terms of intensity, but most dynamical models had a relatively low prediction skill for this event before the summer months. Therefore, the attribution of this particular event can help us to understand the cause of super El Niño–Southern Oscillation events and how to forecast them skillfully. The present study applies attribution methods based on a deep learning model to study the key factors related to the formation of this event. A deep learning model is trained using historical simulations from 21 CMIP6 models to predict the Niño-3.4 index. The integrated gradient method is then used to identify the key signals in the North Pacific that determine the evolution of the Niño-3.4 index. These crucial signals are then masked in the initial conditions to verify their roles in the prediction. In addition to confirming the key signals inducing the super El Niño event revealed in previous attribution studies, we identify the combined contribution of the tropical North Atlantic and the South Pacific oceans to the evolution and intensity of this event, emphasizing the crucial role of the interactions among them and the North Pacific. This approach is also applied to other El Niño events, revealing several new precursor signals. This study suggests that the deep learning method is useful in attributing the key factors inducing extreme tropical climate events.
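The integrated gradient method attributes a model's prediction to its input features by averaging gradients along a straight path from a baseline input to the actual input. A minimal sketch on a toy linear model illustrates the idea and its completeness property (attributions sum to the change in the prediction); `f`, `grad_f`, and the weights here are illustrative assumptions, not the study's deep learning model.

```python
import numpy as np

def integrated_gradients(grad_f, x, baseline, steps=50):
    """Attribute f(x) - f(baseline) to input features by averaging
    the gradient along the straight path from baseline to x."""
    alphas = (np.arange(steps) + 0.5) / steps  # midpoint rule
    avg_grad = np.mean(
        [grad_f(baseline + a * (x - baseline)) for a in alphas], axis=0
    )
    return (x - baseline) * avg_grad

# Toy "model": f(x) = w . x, so the gradient is the constant w.
w = np.array([1.0, -2.0, 0.5])
f = lambda x: float(np.dot(w, x))
grad_f = lambda x: w

x = np.array([1.0, 1.0, 2.0])
baseline = np.zeros(3)
attr = integrated_gradients(grad_f, x, baseline)
# Completeness: attr.sum() equals f(x) - f(baseline).
```

In the study's setting, masking the highest-attribution regions of the initial conditions and re-running the prediction is the natural follow-up check on whether those signals truly drive the forecast.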
Abstract: To increase network throughput and improve user experience, a Multi-Service Network Selection Game based on Q-learning (QSNG) strategy is proposed. The strategy obtains a multi-service network utility function through fuzzy inference and comprehensive attribute evaluation, and uses it as the Q-learning reward. Users predict the payoff of network selection strategies through a game algorithm, avoiding access to heavily loaded networks. Meanwhile, a binary exponential backoff algorithm is used to reduce the probability of multiple users concurrently accessing the same network. Simulation results show that the proposed strategy can adaptively switch to the most suitable network according to users' QoS requirements and price preferences. Compared with the Reinforcement Learning with Network-Assisted Feedback (RLNF) strategy and the Radio Network Selection Games (RSG) strategy, the proposed strategy reduces the total number of handovers by 80% and 60%, respectively, increases network throughput by 7% and 8%, respectively, and guarantees system fairness.
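The core of such a strategy is a Q-learning update in which the network utility serves as the reward. A minimal stateless sketch shows how repeated epsilon-greedy updates steer selection toward the highest-utility network; the utility values, rates, and network count are illustrative assumptions, not figures from the paper.

```python
import random
import numpy as np

random.seed(0)

# Hypothetical average utilities of three candidate networks
# (e.g. produced by fuzzy inference over QoS attributes and price).
utilities = [0.3, 0.5, 0.8]

q = np.zeros(3)        # one Q-value per candidate network
alpha, eps = 0.1, 0.1  # learning rate, exploration probability

for _ in range(2000):
    # epsilon-greedy selection over the candidate networks
    a = random.randrange(3) if random.random() < eps else int(np.argmax(q))
    reward = utilities[a] + random.uniform(-0.05, 0.05)  # noisy utility
    q[a] += alpha * (reward - q[a])  # stateless Q-learning update

best = int(np.argmax(q))  # settles on the highest-utility network
```

The full strategy layers a game-theoretic payoff prediction and binary exponential backoff on top of this update so that many users do not converge on the same network simultaneously.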
Funding: Supported by the National Natural Science Foundation of China (NSFC) under grant number 61873274.
Abstract: Contrastive self-supervised representation learning on attributed graph networks with Graph Neural Networks has attracted considerable research interest recently. However, there are still two challenges. First, most real-world systems are multi-relational: entities are linked by different types of relations, and each relation is a view of the graph network. Second, the rich multi-scale information (structure-level and feature-level) of the graph network can be seen as self-supervised signals, which are not fully exploited. A novel contrastive self-supervised representation learning framework on attributed multiplex graph networks with multi-scale information (named CoLM²S) is presented in this study. It mainly contains two components: intra-relation contrastive learning and inter-relation contrastive learning. Specifically, a contrastive self-supervised representation learning framework on attributed single-layer graph networks with multi-scale information (CoLMS) is introduced first, with a graph convolutional network as the encoder to capture intra-relation information from multi-scale structure-level and feature-level self-supervised signals. The structure-level information includes the edge structure and sub-graph structure, and the feature-level information comprises the outputs of different graph convolutional layers. Second, according to the consensus assumption among inter-relations, the CoLM²S framework is proposed to jointly learn the various graph relations in an attributed multiplex graph network to achieve a global consensus node embedding. The proposed method can fully distil the graph information. Extensive experiments on unsupervised node clustering and graph visualisation tasks demonstrate the effectiveness of our method, which outperforms existing competitive baselines.
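Contrastive objectives of this kind typically score two views of each node against each other with an InfoNCE-style loss: embeddings of the same node under different views are pulled together, and all other pairs are pushed apart. A small NumPy sketch of that loss follows; it is a generic formulation under assumed shapes and temperature, not the CoLM²S objective itself.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss: row i of z1 and row i of z2 are
    two views of the same node (positive pair); other rows are negatives."""
    # L2-normalize so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # (n, n) similarity logits
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -float(np.mean(np.diag(log_prob)))   # positives on the diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))                  # 8 node embeddings, dim 16
aligned = info_nce(z, z)                      # matching views: low loss
shifted = info_nce(z, np.roll(z, 1, axis=0))  # mismatched views: higher loss
```

In a multiplex setting, one such term per relation (intra-relation) plus terms across relations (inter-relation) gives the two-component structure described above.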
Funding: Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R319), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by Grant Code 22UQU4340237DSR44. The authors are thankful to the Deanship of Scientific Research at Najran University for funding this work under the Research Groups Funding program, Grant Code NU/RG/SERC/11/4.
Abstract: With the increased advancement of smart industries, cybersecurity has become a vital growth factor in the success of industrial transformation. The Industrial Internet of Things (IIoT), or Industry 4.0, has revolutionized the concepts of manufacturing and production altogether. In Industry 4.0, powerful Intrusion Detection Systems (IDS) play a significant role in ensuring network security. Though various intrusion detection techniques have been developed so far, it is challenging to protect the intricate data of networks, because conventional Machine Learning (ML) approaches are inadequate to address the demands of dynamic IIoT networks. Deep Learning (DL) methods, however, can be employed to identify anonymous intrusions. Therefore, the current study proposes a Hunger Games Search Optimization with Deep Learning-Driven Intrusion Detection (HGSODL-ID) model for the IIoT environment. The presented HGSODL-ID model exploits a linear normalization approach to transform the input data into a useful format. The HGSO algorithm is employed for Feature Selection (HGSO-FS) to reduce the curse of dimensionality. Moreover, Sparrow Search Optimization (SSO) is utilized with a Graph Convolutional Network (GCN) to classify and identify intrusions in the network. Finally, the SSO technique is exploited to fine-tune the hyper-parameters involved in the GCN model. The proposed HGSODL-ID model was experimentally validated using a benchmark dataset, and the results confirmed the superiority of the proposed method over recent approaches.
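The linear normalization step mentioned above is commonly realized as min-max scaling of each feature to [0, 1] before training, so that features with large raw ranges (byte counts, durations) do not dominate the model. A short sketch of that standard preprocessing routine follows; it is a generic implementation, not the paper's exact code.

```python
import numpy as np

def min_max_normalize(X):
    """Linearly rescale each feature (column) to the [0, 1] range,
    a common preprocessing step before feeding traffic data to an IDS model."""
    lo = X.min(axis=0)
    span = X.max(axis=0) - lo
    span[span == 0] = 1.0  # guard constant columns against divide-by-zero
    return (X - lo) / span

# Toy traffic features: e.g. packet count and flow duration per record.
X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 300.0]])
Xn = min_max_normalize(X)
```

In deployment, the minima and spans must be computed on the training split only and reused for test data, otherwise the evaluation leaks information.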