Geomechanical assessment using coupled reservoir-geomechanical simulation is becoming increasingly important for analyzing the potential geomechanical risks in subsurface geological developments. However, a robust and efficient geomechanical upscaling technique for heterogeneous geological reservoirs is lacking to advance the application of three-dimensional (3D) reservoir-scale geomechanical simulation that considers detailed geological heterogeneities. Here, we develop convolutional neural network (CNN) proxies that reproduce the anisotropic nonlinear geomechanical response caused by lithological heterogeneity, and compute upscaled geomechanical properties from the CNN proxies. The CNN proxies are trained on a large dataset of randomly generated, spatially correlated sand-shale realizations as inputs and simulation results of their macroscopic geomechanical response as outputs. The trained CNN models provide upscaled shear strength (R^2 > 0.949), stress-strain behavior (R^2 > 0.925), and volumetric strain changes (R^2 > 0.958) that agree closely with the numerical simulation results while saving over two orders of magnitude of computational time. This is a major advantage: the upscaled geomechanical properties are computed directly from geological realizations without performing local numerical simulations to obtain the geomechanical response. The proposed CNN proxy-based upscaling technique can (1) bridge the gap between fine-scale geocellular models that capture geological uncertainties and the computationally efficient geomechanical models used to assess the geomechanical risks of large-scale subsurface developments, and (2) improve the efficiency of numerical upscaling techniques that rely on local numerical simulations, which otherwise lead to significantly increased computational time for uncertainty quantification over numerous geological realizations.
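The abstract above does not include an implementation; as a minimal illustrative sketch (not the authors' architecture), a small Keras CNN can regress a binary sand-shale realization onto a few upscaled scalar targets. The layer sizes, grid size, and number of targets are assumptions for illustration only, and the data below are synthetic stand-ins for realization/simulation pairs.

```python
# Minimal sketch of a CNN proxy: map a 64x64 sand/shale indicator grid to a few
# upscaled scalar properties (regression). Architecture and targets are illustrative.
import numpy as np
from tensorflow.keras import layers, models

def build_proxy(n_targets=3):
    # Layer sizes are illustrative guesses, not the paper's architecture.
    return models.Sequential([
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_targets),   # upscaled properties (regression outputs)
    ])

# Synthetic stand-in for (realization, simulated macroscopic response) pairs.
X = np.random.randint(0, 2, size=(200, 64, 64, 1)).astype("float32")
y = np.random.rand(200, 3).astype("float32")

model = build_proxy()
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```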
Research on discrete memristor-based neural networks has received much attention. However, current research mainly focuses on memristor-based discrete homogeneous neuron networks, while memristor-coupled discrete heterogeneous neuron networks are rarely reported. In this study, a new four-stable discrete locally active memristor is proposed, and its nonvolatile and locally active properties are verified by its power-off plot and DC V-I diagram. Based on a two-dimensional (2D) discrete Izhikevich neuron and a 2D discrete Chialvo neuron, a heterogeneous discrete neuron network is constructed by using the proposed discrete memristor as a coupling synapse connecting the two heterogeneous neurons. Taking the coupling strength as the control parameter, chaotic firing, periodic firing, and hyperchaotic firing patterns are revealed. In particular, multiple coexisting firing patterns are observed, which are induced by different initial values of the memristor. Phase synchronization between the two heterogeneous neurons is discussed, and it is found that they achieve perfect synchronization at large coupling strength. Furthermore, the effect of Gaussian white noise on synchronization behaviors is also explored. We demonstrate that the presence of noise not only leads to transitions of firing patterns but also achieves phase synchronization between the two heterogeneous neurons at low coupling strength.
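For readers unfamiliar with discrete neuron maps, the sketch below iterates two Chialvo-type maps coupled through a generic memristive synapse. The map equations, memductance form, and parameter values are textbook-style placeholders, not the paper's four-stable memristor or its exact Izhikevich/Chialvo formulation.

```python
# Illustrative only: two Chialvo-type discrete neurons coupled by an assumed
# memristive synapse with a quadratic memductance and a simple state update.
import numpy as np

def chialvo(x, y, I, a=0.89, b=0.6, c=0.28, k0=0.03):
    x_next = x * x * np.exp(y - x) + k0 + I      # membrane-like variable
    y_next = a * y - b * x + c                   # recovery-like variable
    return x_next, y_next

def simulate(steps=5000, k=0.05, eps=0.001):
    x1, y1, x2, y2, w = 0.1, 0.2, 0.3, 0.1, 0.0
    trace = np.zeros((steps, 2))
    for n in range(steps):
        memductance = 0.5 + w * w                # assumed quadratic memductance
        I_syn = k * memductance * (x2 - x1)      # coupling current from neuron 2 to 1
        x1, y1 = chialvo(x1, y1, +I_syn)
        x2, y2 = chialvo(x2, y2, -I_syn)
        w += eps * (x1 - x2)                     # memristor internal-state update
        trace[n] = (x1, x2)
    return trace

trace = simulate()
print("mean firing-variable gap:", np.abs(trace[:, 0] - trace[:, 1]).mean())
```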
In Beyond the Fifth Generation (B5G) heterogeneous edge networks, numerous users are multiplexed on a channel or served on the same frequency resource block, in which case the transmitter applies coding and the receiver uses interference cancellation. Unfortunately, uncoordinated radio resource allocation can reduce system throughput and lead to user inequity. For this reason, in this paper, channel allocation and power allocation problems are formulated to maximize the system sum rate and the minimum user achievable rate. Since the resulting model is non-convex and the decision variables are high-dimensional, a distributed Deep Reinforcement Learning (DRL) framework called distributed Proximal Policy Optimization (PPO) is proposed to allocate resources. Specifically, several simulated agents are trained in a heterogeneous environment to find robust behaviors that perform well in channel assignment and power allocation. Moreover, agents that slow down in the collection stage hinder the learning of other agents. Therefore, a preemption strategy is further proposed in this paper to optimize the distributed PPO, forming DP-PPO and successfully mitigating the straggler problem. The experimental results show that the proposed DP-PPO mechanism improves performance over other DRL methods.
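As background on the PPO component only (a generic sketch, not the paper's DP-PPO implementation), the clipped surrogate objective that each agent optimizes can be computed as below; the clipping parameter 0.2 is the usual default, and the inputs are made-up sample values.

```python
# Generic PPO clipped-surrogate loss (background sketch, not the paper's DP-PPO code).
import numpy as np

def ppo_clip_loss(logp_new, logp_old, advantages, clip_eps=0.2):
    """Negative clipped surrogate objective averaged over samples."""
    ratio = np.exp(logp_new - logp_old)                  # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    return -np.mean(np.minimum(unclipped, clipped))

# Toy usage with made-up log-probabilities and advantages.
rng = np.random.default_rng(0)
print(ppo_clip_loss(rng.normal(-1.0, 0.1, 64), rng.normal(-1.0, 0.1, 64), rng.normal(0, 1, 64)))
```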
Interference management is one of the most important issues in device-to-device (D2D)-enabled heterogeneous cellular networks (HetCNets) due to the coexistence of massive numbers of cellular and D2D devices in which D2D devices reuse the cellular spectrum. To alleviate the interference, an effective interference management approach is to set exclusion zones around the cellular receivers. In this paper, we adopt a stochastic geometry approach to analyze the outage probabilities of cellular and D2D users in D2D-enabled HetCNets. The main difficulties involve three aspects: 1) how to model the location randomness of base stations, cellular users, and D2D users in practical networks; 2) how to capture the randomness and interrelation of cellular and D2D transmissions due to the existence of random exclusion zones; and 3) how to characterize the different types of interference and their impacts on the outage probabilities of cellular and D2D users. We then run extensive Monte-Carlo simulations, which show that our theoretical model is very accurate.
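To illustrate the kind of Monte-Carlo check described above (a simplified sketch without exclusion zones, tiers, or the paper's full HetCNet model), the snippet below drops interferers as a Poisson point process in a disk and estimates the outage probability of a typical link against an SIR threshold; all densities, distances, and the path-loss exponent are assumed values.

```python
# Simplified Monte-Carlo outage estimate for a typical link with Poisson-distributed
# interferers and Rayleigh fading (illustrative; no exclusion zones as in the paper).
import numpy as np

rng = np.random.default_rng(1)

def outage_probability(lam=1e-4, radius=1000.0, d0=50.0, alpha=4.0,
                       sir_threshold_db=0.0, trials=2000):
    theta = 10 ** (sir_threshold_db / 10)
    outages = 0
    for _ in range(trials):
        n = rng.poisson(lam * np.pi * radius ** 2)        # number of interferers
        r = radius * np.sqrt(rng.random(n))               # uniform distances in the disk
        h = rng.exponential(size=n)                        # Rayleigh fading power gains
        interference = np.sum(h * r ** (-alpha)) if n else 0.0
        signal = rng.exponential() * d0 ** (-alpha)
        if interference > 0 and signal / interference < theta:
            outages += 1
    return outages / trials

print("estimated outage probability:", outage_probability())
```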
A heterogeneous information network, which is composed of various types of nodes and edges, has a complex structure and rich information content, and is widely used in social networks, academic networks, e-commerce, and other fields. Link prediction, as a key task for revealing unobserved relationships in a network, is of great significance in heterogeneous information networks. This paper reviews the application of representation-learning-based methods to link prediction in heterogeneous information networks. It introduces the basic concepts of heterogeneous information networks and the theoretical basis of representation learning, and discusses in detail the application of deep learning models to node embedding learning and link prediction. The effectiveness and superiority of these methods on multiple real datasets are demonstrated by experimental verification.
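As a generic illustration of the idea reviewed above (not any specific model from the survey), embedding-based link prediction typically scores a candidate edge by the sigmoid of the inner product of two learned node embeddings; the embeddings below are random stand-ins.

```python
# Generic embedding-based link-prediction scoring: P(link) = sigmoid(z_u . z_v).
import numpy as np

def link_score(z_u, z_v):
    return 1.0 / (1.0 + np.exp(-np.dot(z_u, z_v)))

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))            # stand-in embeddings for 4 nodes
print(round(link_score(emb[0], emb[1]), 3), round(link_score(emb[0], emb[2]), 3))
```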
On multilingual online social networks with global information sharing, the wanton spread of rumors has an enormous negative impact on people's lives. Thus, it is essential to explore the rumor-spreading rules in multilingual environments and formulate corresponding control strategies to reduce the harm caused by rumor propagation. In this paper, considering the multilingual environment and intervention mechanisms in the rumor-spreading process, an improved ignorants-spreaders-1-spreaders-2-removers (I2SR) rumor-spreading model with time delay and nonlinear incidence is established in heterogeneous networks. Firstly, based on the mean-field equations corresponding to the model, the basic reproduction number is derived to ensure the existence of the rumor-spreading equilibrium. Secondly, by applying Lyapunov stability theory and graph theory, the global stability of the rumor-spreading equilibrium is analyzed in detail. In particular, aiming at the lowest control cost, an optimal control scheme is designed to optimize the intervention mechanism, and the optimal control conditions are derived using Pontryagin's minimum principle. Finally, some illustrative examples are provided to verify the effectiveness of the theoretical results. The results show that optimizing the intervention mechanism can effectively reduce the densities of spreaders-1 and spreaders-2 within the expected time, which provides guiding insights for public opinion managers to control rumors.
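As a toy picture of the kind of mean-field dynamics involved (a homogeneous-mixing sketch without delay, degree classes, or the paper's nonlinear incidence and control terms), a simple ignorant/spreader-1/spreader-2/remover system can be integrated by forward Euler; all rates are made up for illustration.

```python
# Toy homogeneous-mixing I2SR-style dynamics integrated by forward Euler.
# Rates, bilinear incidence, and the absence of delay/degree structure are
# simplifying assumptions; this does not reproduce the paper's model.
import numpy as np

def simulate(beta1=0.4, beta2=0.3, gamma=0.1, T=200, dt=0.1):
    I, S1, S2, R = 0.98, 0.01, 0.01, 0.0
    history = []
    for _ in range(int(T / dt)):
        new1 = beta1 * I * S1          # ignorants becoming spreaders of version 1
        new2 = beta2 * I * S2          # ignorants becoming spreaders of version 2
        stop1 = gamma * S1             # spreaders-1 becoming removers
        stop2 = gamma * S2             # spreaders-2 becoming removers
        I += dt * (-new1 - new2)
        S1 += dt * (new1 - stop1)
        S2 += dt * (new2 - stop2)
        R += dt * (stop1 + stop2)
        history.append((I, S1, S2, R))
    return np.array(history)

hist = simulate()
print("final densities (I, S1, S2, R):", np.round(hist[-1], 3))
```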
This paper studies the coordinated planning of transmission tasks in heterogeneous space networks to enable efficient sharing of ground stations across satellite systems. Specifically, we first formulate the coordinated planning problem as a mixed integer linear programming (MILP) problem based on a time-expanded graph. Then, the problem is transformed and reformulated into a consensus optimization framework that can be solved by the satellite systems in parallel. Using the alternating direction method of multipliers (ADMM), a semi-distributed coordinated transmission task planning algorithm is proposed, in which each satellite system plans its own tasks based on local information and limited communication with the coordination center. Simulation results demonstrate that, compared with centralized and fully distributed methods, the proposed semi-distributed coordinated method strikes a better balance among task completion rate, complexity, and the amount of information that must be exchanged.
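For readers unfamiliar with consensus ADMM (a generic sketch on a toy quadratic objective, not the satellite task-planning formulation), each "system" keeps a local copy of the shared variable and the coordination center only averages; the local targets and penalty parameter below are assumptions.

```python
# Generic consensus-ADMM sketch: each "system" i minimizes (x - a_i)^2 locally while
# all copies are driven to a common value z by the coordination step.
import numpy as np

a = np.array([1.0, 4.0, 7.0])        # local targets of three systems (illustrative)
rho = 1.0                             # ADMM penalty parameter
x = np.zeros(3)                       # local copies
u = np.zeros(3)                       # scaled dual variables
z = 0.0                               # consensus variable held by the center

for _ in range(50):
    x = (a + rho * (z - u)) / (1 + rho)   # closed-form local minimization
    z = np.mean(x + u)                    # coordination center: averaging step
    u = u + x - z                         # dual update
print("consensus value:", round(z, 3), "(mean of targets is", a.mean(), ")")
```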
Expert Recommendation (ER) aims to identify domain experts with high expertise and willingness to provide answers to questions in Community Question Answering (CQA) web services. How to model questions and users in the heterogeneous content network is critical to this task. Most traditional methods focus on modeling questions and users based on the textual content left in the community while ignoring the structural properties of heterogeneous CQA networks, and they often suffer from textual data sparsity issues. Recent approaches take advantage of structural proximities between nodes and attempt to fuse the textual content of nodes for modeling. However, they often fail to distinguish nodes' personalized preferences and consider the textual content of only part of the nodes in network embedding learning, while ignoring the semantic relevance of nodes. In this paper, we propose a novel framework that jointly considers structural proximity relations and textual semantic relevance to model users and questions more comprehensively. Specifically, we learn topology-based embeddings through a hierarchical attentive network learning strategy, in which the proximity information and the personalized preferences of nodes are encoded and preserved. Meanwhile, we utilize the nodes' textual content and the text correlation between adjacent nodes to build content-based embeddings through a meta-context-aware skip-gram model. In addition, the user's relative answer quality is incorporated to promote ranking performance. Experimental results show that, by combining deep semantic understanding with structural feature learning, our proposed framework consistently and significantly outperforms state-of-the-art baselines on three real-world datasets. The performance of the proposed work is analyzed in terms of MRR, P@K, and MAP and is shown to be more advanced than existing methodologies.
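The skip-gram with negative sampling loss is the standard building block behind content-based embeddings of the kind mentioned above (a generic sketch; the paper's meta-context-aware variant and hierarchical attention are not reproduced here), with all vectors below drawn at random for illustration.

```python
# Skip-gram with negative sampling: loss for one (center, context) pair plus negatives.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(center, context, negatives):
    """Negative log-likelihood for one positive pair and a set of negative samples."""
    pos = -np.log(sigmoid(center @ context))
    neg = -np.sum(np.log(sigmoid(-negatives @ center)))
    return pos + neg

rng = np.random.default_rng(0)
dim = 16
print(sgns_loss(rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=(5, dim))))
```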
The continuous improvement of cyber threat intelligence sharing mechanisms provides new ideas for dealing with Advanced Persistent Threats (APT). Extracting attack behaviors, i.e., Tactics, Techniques, and Procedures (TTP), from Cyber Threat Intelligence (CTI) can facilitate the profiling of APT actors for an immediate response. However, it is difficult for traditional manual methods to analyze attack behaviors from cyber threat intelligence due to its heterogeneous nature. Based on the Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) description of threat behavior, this paper proposes a threat behavioral knowledge extraction framework that integrates a Heterogeneous Text Network (HTN) and a Graph Convolutional Network (GCN) to address this issue. It leverages the hierarchical correlation relationships of attack techniques and tactics in ATT&CK to construct a text network of heterogeneous cyber threat intelligence. With the help of the Bidirectional Encoder Representations from Transformers (BERT) pretraining model to analyze the contextual semantics of cyber threat intelligence, the task of threat behavior identification is transformed into a text classification task, which automatically extracts attack behavior from CTI and then identifies the malware and advanced threat actors. The experimental results show that the F1 scores reach 94.86% and 92.15% for the multi-label classification tasks of tactics and techniques, respectively. The experiments are extended to verify the method's effectiveness in identifying the malware and threat actors in APT attacks; the F1 scores for the malware and advanced threat actor identification tasks reach 98.45% and 99.48%, which are better than the benchmark models in the experiment and achieve state-of-the-art results. The model can effectively represent threat intelligence text data and acquire knowledge and experience migration by correlating implied features with prior knowledge to compensate for insufficient sample data, improving the classification performance and the recognition of threat behavior in text.
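As background on the GCN component (a single propagation layer in NumPy with a tiny random graph and random weights; not the paper's HTN/BERT pipeline), one layer computes H' = ReLU(D^-1/2 (A + I) D^-1/2 H W) over the graph.

```python
# One GCN propagation layer on a tiny random graph; features stand in for
# text embeddings (e.g., BERT vectors) attached to the graph's nodes.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, in_dim, out_dim = 5, 8, 4

A = (rng.random((n_nodes, n_nodes)) < 0.4).astype(float)
A = np.maximum(A, A.T)                       # undirected adjacency
A_hat = A + np.eye(n_nodes)                  # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization

H = rng.normal(size=(n_nodes, in_dim))       # node features
W = rng.normal(size=(in_dim, out_dim))       # learnable layer weights
H_next = np.maximum(A_norm @ H @ W, 0.0)     # ReLU(normalized propagation)
print(H_next.shape)
```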
In distributed machine learning (DML) based on the parameter server (PS) architecture, an unbalanced communication load distribution across PSs leads to a significant slowdown of model synchronization in heterogeneous networks due to low utilization of bandwidth. To address this problem, a network-aware adaptive PS load distribution scheme is proposed, which accelerates model synchronization by proactively adjusting the communication load on PSs according to network states. We evaluate the proposed scheme on MXNet, a real-world distributed training platform, and the results show that our scheme achieves up to 2.68 times speed-up of model training in a dynamic and heterogeneous network environment.
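One simple way to picture "adjusting communication load according to network states" (an illustrative heuristic, not the scheme evaluated on MXNet) is to repartition the model's parameters across parameter servers in proportion to the measured bandwidth toward each server, so slow links carry less load.

```python
# Illustrative heuristic: split `total_params` parameters across parameter servers
# in proportion to measured bandwidth (values below are assumed).
def proportional_partition(total_params, bandwidths_mbps):
    total_bw = sum(bandwidths_mbps)
    shares = [int(total_params * bw / total_bw) for bw in bandwidths_mbps]
    shares[-1] += total_params - sum(shares)   # absorb rounding remainder
    return shares

print(proportional_partition(10_000_000, [100, 400, 250]))
```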
Real-world complex networks are inherently heterogeneous: they have different types of nodes, attributes, and relationships. In recent years, various methods have been proposed to automatically learn how to encode the structural and semantic information contained in heterogeneous information networks (HINs) into low-dimensional embeddings; this task is called heterogeneous network embedding (HNE). Efficient HNE techniques can benefit various HIN-based machine learning tasks such as node classification, recommender systems, and information retrieval. Here, we provide a comprehensive survey of key advancements in the area of HNE. First, we define an encoder-decoder-based HNE model taxonomy. Then, we systematically overview, compare, and summarize various state-of-the-art HNE models and analyze the advantages and disadvantages of the various model categories to identify the most potentially competitive HNE frameworks. We also summarize the application fields, benchmark datasets, open-source tools, and performance evaluation in the HNE area. Finally, we discuss open issues and suggest promising future directions. We anticipate that this survey will provide deep insights into research in the field of HNE.
The future network world will be embedded with different generations of wireless technologies, such as 3G, 4G, and 5G. At the same time, the development of new devices equipped with multiple interfaces has grown rapidly in recent years. As a consequence, the vertical handover protocol has been developed to provide ubiquitous connectivity in heterogeneous wireless environments. Indeed, with this protocol, users can connect to the Internet through a variety of wireless technologies at any time and anywhere. The main challenge of this protocol is how to select the best access network in terms of Quality of Service (QoS) for users. Many algorithms have been proposed and developed in recent studies to deal with this issue. However, all existing algorithms permit only the selection of one access network from the available networks during the vertical handover process. To cope with this problem, in this paper we propose a new approach based on a k-partite graph. Firstly, we introduce k-partite graph theory to model the vertical handover problem. Secondly, the selection of the best path is performed by a robust and lightweight mechanism based on a cost function and Dijkstra's algorithm. The experimental results show that the proposed approach achieves better QoS performance than existing algorithms for FTP traffic and video streaming.
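To make the graph formulation concrete (a minimal sketch with a made-up cost function and a tiny layered graph, not the paper's QoS model), Dijkstra's algorithm can pick the cheapest path from the user layer through the candidate access networks.

```python
# Dijkstra over a small layered (k-partite-style) graph whose edge weights stand in
# for a QoS cost function; the graph and costs are invented for illustration.
import heapq

def dijkstra(graph, source, target):
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[target]

# Layers: user -> candidate access networks -> gateway; weights model QoS cost.
graph = {
    "user": [("wifi", 2.0), ("4g", 3.5), ("5g", 1.5)],
    "wifi": [("gw", 4.0)], "4g": [("gw", 1.0)], "5g": [("gw", 3.0)],
}
print(dijkstra(graph, "user", "gw"))
```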
5G use cases, for example enhanced mobile broadband (eMBB), massive machine-type communications (mMTC), and ultra-reliable low latency communication (URLLC), need a network architecture capable of sustaining stringent latency and bandwidth requirements; thus, it should be extremely flexible and dynamic. Slicing enables service providers to develop various network slice architectures. As users travel from one coverage region to another, the call must be routed to a slice that meets the same or different expectations. This research aims to develop and evaluate an algorithm for making handover decisions in 5G sliced networks. Rules of thumb that indicate the accuracy of training-data classification schemes within machine learning should be considered for the validation and selection of appropriate machine learning strategies. Therefore, this study discusses the network model's design and the implementation of a self-optimizing Fuzzy Q-learning decision-making algorithm for slice handover. The algorithm's performance is assessed by means of connection-level Quality of Service (QoS) metrics, specifically the probability of a new call being blocked and the probability of a handoff call being dropped. Within the network model, the call admission control (AC) method is modeled by leveraging a supervised learning algorithm as prior knowledge of additional capacity. Moreover, to mitigate high complexity, fuzzy logic and Fuzzy Q-learning are integrated to discretize the state and the corresponding action spaces. The results generated by our proposal surpass those of traditional methods that do not use supervised learning and Fuzzy Q-learning.
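As background on the Q-learning part (a generic tabular update on a toy environment, not the fuzzy-discretized state/action design or the admission-control model of the paper), the standard update applied per decision is sketched below; the reward shaping and state space are invented for illustration.

```python
# Generic tabular Q-learning on a toy handover-style environment.
import numpy as np

n_states, n_actions = 4, 2          # e.g., coarse load levels x {stay, hand over}
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def step(state, action):
    # Toy dynamics: handing over in a loaded state is rewarded, otherwise penalized.
    reward = 1.0 if (state >= 2 and action == 1) or (state < 2 and action == 0) else -1.0
    return rng.integers(n_states), reward

state = 0
for _ in range(5000):
    action = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
print(np.round(Q, 2))
```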
In this paper, we investigate the system performance of a heterogeneous cellular network consisting of a macro cell and a small cell, where each cell has one user and one base station with multiple antennas. The macro base station (MBS) and the small base station (SBS) transmit their confidential messages to the macro user (MU) and the small user (SU), respectively, over their shared spectrum. To enhance the system sum rate (SSR) of the MBS-MU and SBS-SU transmissions, we propose a joint antenna selection combined with optimal power allocation (JAS-OPA) scheme and an independent antenna selection combined with optimal power allocation (IAS-OPA) scheme. The JAS-OPA scheme requires knowledge of the channel state information (CSI) of both the transmission channels and the interference channels, while the IAS-OPA scheme only needs the CSI of the transmission channels. In addition, we analyze conventional round-robin antenna selection combined with optimal power allocation (RR-OPA) as a benchmark scheme. We formulate the SSR maximization problem through the power allocation between the MBS and the SBS and propose iterative OPA algorithms for the JAS-OPA, IAS-OPA, and RR-OPA schemes, respectively. The results show that the OPA schemes outperform equal power allocation in terms of SSR. Moreover, we provide closed-form expressions of the system outage probability (SOP) for the IAS and RR schemes, showing that the SOP performance can be significantly improved by our proposed IAS scheme compared with the RR scheme.
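A simplified picture of the power-allocation subproblem is a brute-force search over the MBS/SBS power split for a two-link interference channel; the channel gains, noise level, and total-power constraint below are made up, and the paper instead derives iterative optimal power allocation combined with antenna selection.

```python
# Brute-force search of the MBS/SBS power split maximizing the sum rate of two
# mutually interfering links (illustrative gains and power budget).
import numpy as np

g11, g22 = 1.0, 0.8        # direct link gains (MBS->MU, SBS->SU)
g12, g21 = 0.15, 0.2       # cross-interference gains
noise, P_total = 0.1, 2.0

best = (None, -np.inf)
for p1 in np.linspace(0.01, P_total - 0.01, 500):
    p2 = P_total - p1
    r1 = np.log2(1 + p1 * g11 / (noise + p2 * g21))
    r2 = np.log2(1 + p2 * g22 / (noise + p1 * g12))
    if r1 + r2 > best[1]:
        best = ((round(p1, 3), round(p2, 3)), r1 + r2)
print("best (P_MBS, P_SBS):", best[0], "sum rate:", round(best[1], 3), "bit/s/Hz")
```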
The vehicular sensor network (VSN) is an important part of intelligent transportation, used for the real-time detection and operation control of vehicles and the real-time transmission of data and information. In the VSN environment, massive amounts of private data generated by vehicles are transmitted in open channels and used by other vehicle users, so it is crucial to maintain high transmission efficiency and high confidentiality of data. To deal with this problem, in this paper we propose a heterogeneous fault-tolerant aggregate signcryption scheme with an equality test (HFTAS-ET). The scheme combines fault tolerance and aggregate signcryption, which not only makes up for the low security of aggregate signatures but also makes up for the fact that aggregate signcryption cannot tolerate invalid signatures. The scheme supports one-pass verification when all signcryptions are valid, and it supports unbounded aggregation when the total number of signcryptions grows dynamically. In addition, this scheme supports a heterogeneous equality test and realizes access control of private data in different cryptographic environments, so as to achieve flexibility in the application of our scheme and enable quick search of plaintext or ciphertext. The security of HFTAS-ET is then demonstrated by strict theoretical analysis. Finally, we conduct strict and standardized experimental operation and performance evaluation, which shows that the scheme has better performance.
The heterogeneous small cell network is one of the most effective solutions to overcome spectrum scarcity for the next generation of mobile networks. Dual connectivity (DC) can improve the throughput of each individual user by allowing concurrent access to two heterogeneous radio networks. In this paper, we propose a joint user association and fair scheduling algorithm (JUAFS) to deal with the resource allocation and load balancing issues of DC heterogeneous small cell networks. Considering the different coverage sizes, numbers of users, and quality-of-experience characteristics of heterogeneous cells, we present proportional fair scheduling for user association among cells and utilize an interference graph to minimize the transmission conflict probability. Simulation results show the performance improvement of the proposed algorithm in spectrum efficiency and fairness compared to existing schemes.
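To illustrate the proportional-fair criterion mentioned above (a single-cell, per-slot sketch; the paper applies it to user association across dual-connectivity cells together with an interference graph), the scheduler picks the user maximizing instantaneous rate divided by its smoothed average throughput; the channel statistics and averaging constant are assumptions.

```python
# Per-slot proportional-fair scheduling sketch: serve the user with the largest
# instantaneous-rate / average-throughput ratio, then update the moving average.
import numpy as np

rng = np.random.default_rng(0)
n_users, slots, tc = 4, 2000, 100.0
avg = np.full(n_users, 1e-3)                             # smoothed throughputs
served = np.zeros(n_users, dtype=int)

for _ in range(slots):
    inst = rng.exponential(scale=[1.0, 2.0, 4.0, 8.0])   # unequal channel qualities
    u = int(np.argmax(inst / avg))                       # proportional-fair metric
    served[u] += 1
    rates = np.zeros(n_users)
    rates[u] = inst[u]
    avg = (1 - 1 / tc) * avg + (1 / tc) * rates          # exponential averaging
print("slots per user:", served)
```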
A variety of wireless communication technologies have been developed to provide services to a large number of users. Future integrated 5G-WLAN wireless networks will support seamless and secure roaming and various types of real-time applications and services, which will be the trend of the next-generation computing paradigm. In this paper, we discuss the privacy and security problems in 5G-WLAN heterogeneous networks and present a logical 5G-WLAN integrated architecture. We also propose a novel USIM- and ECC-based design of handover authentication for next-generation 5G-WLAN heterogeneous networks that can provide secure and seamless Internet connectivity. Our scheme features strong security and better performance in terms of computation cost, energy cost, and storage cost compared with state-of-the-art schemes.
Automatic text summarization (ATS) plays a significant role in Natural Language Processing (NLP). Abstractive summarization produces summaries by identifying and compressing the most important information in a document. However, only relatively few comprehensively evaluated abstractive summarization models work well for specific types of reports, owing to their unstructured and oral-language text characteristics. In particular, Chinese complaint reports, generated by urban complainants and collected by government employees, describe existing resident problems in daily life; meanwhile, the reported problems require speedy responses. Therefore, automatic summarization tasks for these reports have been developed. However, similar to those of traditional summarization models, the generated summaries still suffer from problems of informativeness and conciseness. To address these issues and generate suitably informative and less redundant summaries, a topic-based abstractive summarization method is proposed to obtain global and local features. Additionally, a heterogeneous graph of the original document is constructed using word-level and topic-level features. Experiments and analyses on public review datasets (Yelp and Amazon) and our constructed dataset (Chinese complaint reports) show that the proposed framework effectively improves the performance of the abstractive summarization model for Chinese complaint reports.
In recent years, real-time video streaming has grown in popularity. The growing popularity of the Internet of Things (IoT) and other wireless heterogeneous networks mandates that network resources be carefully apportioned among versatile users in order to achieve the best Quality of Experience (QoE) and performance objectives. Most researchers have focused on Forward Error Correction (FEC) techniques when attempting to strike a balance between QoE and performance. However, as network capacity increases, performance degrades, impacting the live visual experience. Recently, Deep Learning (DL) algorithms have been successfully integrated with FEC to stream video across multiple heterogeneous networks, but these algorithms need to be adapted to improve the experience without increasing packet loss and delay. To address this challenge, this paper proposes a novel intelligent algorithm that streams video in multi-homed heterogeneous networks based on network-centric characteristics. The proposed framework contains modules such as the Intelligent Content Extraction Module (ICEM), the Channel Status Monitor (CSM), and Adaptive FEC (AFEC). This framework adopts the Cognitive Learning-based Scheduling (CLS) module, which works on the deep Reinforced Gated Recurrent Networks (RGRN) principle and is embedded along with the FEC to achieve better performance. The complete framework was developed using the Objective Modular Network Testbed in C++ (OMNET++), Internet networking (INET), and Python 3.10, with Keras as the front end and TensorFlow 2.10 as the back end. With extensive experimentation, the proposed model outperforms other existing intelligent models in terms of improving QoE, minimizing End-to-End Delay (EED), and maintaining the highest accuracy (98%) and a lower Root Mean Square Error (RMSE) value of 0.001.
In this work, we consider the performance analysis of state-dependent priority traffic and scheduling in device-to-device (D2D) heterogeneous networks. There are two priority transmission types of data in wireless communication, such as video or telephone, and the requirements of high priority (HP) data transmission are always met first. If there is a large amount of low priority (LP) data, much of it cannot be sent, causing excessive LP data delay and a high packet dropping probability. In order to solve this problem, the data transmission process of the high priority queue and the low priority queue is studied. By applying a priority jump strategy to the priority queuing model, the queuing process with two priorities of data is modeled as a two-dimensional Markov chain. A state-dependent priority jump queuing strategy is proposed, which can improve the dropping performance of low priority data. The quasi birth and death (QBD) process method and the fixed point iteration method are used to solve the model, and the steady-state probability distribution is further obtained. Then, performance parameters such as the average queue length, average throughput, average delay, and packet dropping probability for both high and low priority data can be expressed. The simulation results verify the correctness of the theoretical derivation. Meanwhile, the proposed priority jump queuing strategy significantly improves the dropping performance of low-priority data.
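A small discrete-time simulation conveys the baseline behavior this kind of scheme improves upon: strict HP-first service with a finite LP buffer and no priority-jump mechanism. The arrival probabilities, buffer size, and single-server slot structure are illustrative assumptions, not the paper's 2D Markov chain or QBD analysis.

```python
# Discrete-time simulation of strict two-priority service with a finite low-priority
# buffer (no priority jumps); reports LP mean delay and drop probability.
import random
from collections import deque

random.seed(0)
p_hp, p_lp, lp_buffer = 0.3, 0.65, 10        # assumed arrival probabilities / buffer
hp_q, lp_q = deque(), deque()
lp_arrived = lp_dropped = lp_served = 0
lp_delay_sum = 0

for t in range(200_000):
    if random.random() < p_hp:
        hp_q.append(t)                        # HP arrival (assumed unbounded queue)
    if random.random() < p_lp:
        lp_arrived += 1
        if len(lp_q) < lp_buffer:
            lp_q.append(t)
        else:
            lp_dropped += 1                   # LP packet dropped when buffer is full
    # one service opportunity per slot, HP always served first
    if hp_q:
        hp_q.popleft()
    elif lp_q:
        lp_delay_sum += t - lp_q.popleft()
        lp_served += 1

print("LP drop probability:", round(lp_dropped / lp_arrived, 4))
print("LP mean delay (slots):", round(lp_delay_sum / lp_served, 2))
```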