Funding: Supported by the National Key R&D Program of China (2022QY2000-02).
Abstract: Accurately recommending candidate news to users is a fundamental challenge for personalized news recommendation systems. Traditional methods usually struggle to learn and capture the complex semantic information in news texts, which leads to unsatisfactory recommendation results. Moreover, these traditional methods favor active users with rich historical behaviors and cannot effectively address the long-tail problem of inactive users. To address these issues, this research presents a novel general framework that integrates Large Language Models (LLMs) and Knowledge Graphs (KGs) into traditional methods. To learn the contextual information of news texts, we use the powerful text-understanding ability of LLMs to generate news representations with rich semantic information; the generated representations are then used to enhance the news encoding in traditional methods. In addition, multi-hop relationships among news entities are mined and the structural information of news is encoded using the KG, alleviating the challenge of the long-tail distribution. Experimental results demonstrate that, compared with various traditional models, the framework significantly improves recommendation performance on evaluation metrics such as AUC, MRR, nDCG@5, and nDCG@10. The successful integration of LLMs and KGs in our framework establishes a feasible way to achieve more accurate personalized news recommendation. Our code is available at https://github.com/Xuan-ZW/LKPNR.
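The fusion of LLM-derived news representations with a conventional news encoder, as described above, could be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' LKPNR implementation; the class name, parameter names (llm_dim, out_dim), and the simple concatenation fusion are hypothetical.

```python
# Hedged sketch (not the authors' LKPNR code): one plausible way to fuse a
# frozen LLM-derived news embedding with a traditional news encoder's output.
import torch
import torch.nn as nn

class LLMEnhancedNewsEncoder(nn.Module):
    """Combines a conventional news encoder with a precomputed LLM embedding."""
    def __init__(self, base_encoder: nn.Module, llm_dim: int, out_dim: int):
        super().__init__()
        self.base_encoder = base_encoder             # e.g. CNN/attention title encoder
        self.llm_proj = nn.Linear(llm_dim, out_dim)  # project LLM vector into shared space
        self.fuse = nn.Linear(2 * out_dim, out_dim)  # concatenation-based fusion

    def forward(self, title_tokens, llm_embedding):
        base_vec = self.base_encoder(title_tokens)   # (batch, out_dim), traditional encoding
        llm_vec = self.llm_proj(llm_embedding)       # (batch, out_dim), semantic LLM view
        return torch.tanh(self.fuse(torch.cat([base_vec, llm_vec], dim=-1)))
```

The enhanced news vector would then feed the unchanged user encoder and click predictor of the traditional model.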
Funding: Supported by the National Key Basic Research and Development Program of China (2021YFC22035-01) and the National Natural Science Foundation of China (U1931137).
Abstract: This paper presents an innovative surrogate modeling method using a graph neural network to compensate for gravitational and thermal deformation in large radio telescopes. Traditionally, rapid compensation is feasible for gravitational deformation but not for temperature-induced deformation. The proposed method enables real-time calculation of the deformation caused by both gravity and temperature. Constructing the surrogate model involves two key steps. First, the gravitational and thermal loads are encoded, which facilitates more efficient learning for the neural network. Second, a graph neural network is employed as an end-to-end model that maps external loads to deformation while preserving the spatial correlations between nodes. Simulation results confirm that the proposed method can estimate the surface deformation of the main reflector in real time and delivers results that are practically indistinguishable from those obtained with finite element analysis. A comparison of the proposed surrogate model with the out-of-focus holography method also yields similar results.
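As a rough illustration of the end-to-end mapping from encoded loads to surface deformation, a minimal message-passing network over the reflector mesh could look like the sketch below. This is an assumption-laden sketch, not the paper's model; the class name, layer sizes, and the dense-adjacency formulation are hypothetical.

```python
# Hedged sketch (not the paper's architecture): a minimal message-passing network
# mapping per-node load encodings to per-node surface deformation on a mesh graph.
import torch
import torch.nn as nn

class LoadToDeformationGNN(nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64, layers: int = 3):
        super().__init__()
        self.encode = nn.Linear(in_dim, hidden)
        self.msg = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(layers))
        self.decode = nn.Linear(hidden, 3)          # per-node (dx, dy, dz) deformation

    def forward(self, node_loads, adjacency):
        # node_loads: (N, in_dim) encoded gravity/thermal loads per mesh node
        # adjacency:  (N, N) normalized adjacency matrix of the reflector mesh
        h = torch.relu(self.encode(node_loads))
        for layer in self.msg:
            h = torch.relu(layer(adjacency @ h))    # aggregate neighboring node features
        return self.decode(h)
```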
Funding: Supported by the National Major Scientific and Technological Special Project for "Significant New Drugs Development" (No. 2018ZX09201008) and the Special Fund Project for Information Development from the Shanghai Municipal Commission of Economy and Information (No. 201701013).
Abstract: Regional healthcare platforms collect clinical data from hospitals in specific areas for the purpose of healthcare management, and reusing these data for clinical research is a common requirement. However, we face challenges such as the inconsistency of terminology in electronic health records (EHR) and the complexities of data quality and data formats on regional healthcare platforms. In this paper, we propose a methodology and process for constructing large-scale cohorts, which form the basis of causal and comparative effectiveness research in epidemiology. We first constructed a Chinese terminology knowledge graph to deal with the diversity of vocabularies on the regional platform. Second, we built special disease case repositories (e.g., a heart failure repository) that use the graph to retrieve related patients and normalize the data. Based on the requirements of a clinical study exploring the effect of statin use on 180-day readmission in patients with heart failure, we built a large-scale retrospective cohort of 29,647 heart failure patients from the heart failure repository. After propensity score matching, a study group (n=6346) and a control group (n=6346) with comparable clinical characteristics were obtained. Logistic regression analysis showed that taking statins was negatively correlated with 180-day readmission in heart failure patients. This paper presents the workflow and an application example of big data mining based on regional EHR data.
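A minimal sketch of the analysis step (propensity score matching followed by logistic regression on the matched cohort) is given below. It is not the platform's actual pipeline; the column names (statin, readmit_180d), the covariate list, and the 1:1 nearest-neighbor matching with replacement are assumptions for illustration only.

```python
# Hedged sketch (not the platform's pipeline): 1:1 nearest-neighbor propensity
# score matching (with replacement, for brevity) followed by logistic regression.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_and_analyze(df: pd.DataFrame, covariates: list[str]) -> float:
    # Propensity of receiving statins given baseline covariates
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["statin"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    treated, control = df[df["statin"] == 1], df[df["statin"] == 0]
    knn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = knn.kneighbors(treated[["ps"]])          # nearest control per treated case
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # Outcome model: 180-day readmission vs. statin use in the matched cohort
    outcome = LogisticRegression(max_iter=1000).fit(
        matched[["statin"]], matched["readmit_180d"])
    return outcome.coef_[0][0]   # log-odds of readmission associated with statin use
```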
Abstract: To make calculation more efficient in practical hydraulic simulations, an improved algorithm was proposed and applied in the field of practical water distribution. The methodology extends traditional loop-equation theory by exploiting the efficiency advantages of graph theory. The use of the spanning-tree technique from graph theory makes the proposed algorithm computationally efficient and simple to implement in code. The algorithms for topology generation and practical implementation are presented in detail in this paper. In an application to a practical urban system, CPU time and memory consumption were reduced while accuracy was greatly improved compared with existing methods.
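A minimal sketch of the spanning-tree step is shown below: build a spanning tree of the pipe network with breadth-first search, and treat each non-tree pipe (chord) as defining one independent loop equation. The data representation (node ids and (node_a, node_b) pipe tuples) is assumed for illustration and is not the paper's data structure.

```python
# Hedged sketch (not the paper's algorithm): BFS spanning tree of a connected pipe
# network; each non-tree pipe (chord) then defines one independent loop.
from collections import deque

def spanning_tree_and_chords(nodes, pipes):
    """nodes: iterable of node ids; pipes: list of (node_a, node_b) tuples."""
    adj = {n: [] for n in nodes}
    for a, b in pipes:
        adj[a].append((b, (a, b)))
        adj[b].append((a, (a, b)))

    start = next(iter(adj))
    visited, tree, queue = {start}, [], deque([start])
    while queue:
        u = queue.popleft()
        for v, pipe in adj[u]:
            if v not in visited:
                visited.add(v)
                tree.append(pipe)      # pipe joins the spanning tree
                queue.append(v)

    chords = [p for p in pipes if p not in tree]   # one loop equation per chord
    return tree, chords
```

For a connected network with N nodes and E pipes, the tree contains N-1 pipes, and the remaining E-N+1 chords correspond exactly to the independent loops whose equations the method solves.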
Funding: Supported by the National Natural Science Foundation of China (Nos. 71471135, 61273035).
Abstract: A modified shifting bottleneck algorithm was proposed to solve large-scale job shop scheduling problems. First, a new structured algorithm was employed for the sub-problems to reduce the computational burden and better suit large-scale instances. A modified cycle-avoidance method, incorporating the disjunctive graph model and a topological sorting algorithm, was applied to guarantee the feasibility of solutions while accounting for delayed precedence constraints. Finally, simulation experiments were carried out to verify the feasibility and effectiveness of the modified method. The results demonstrate that the proposed algorithm can solve large-scale job shop scheduling problems (JSSPs) within a reasonable time while obtaining satisfactory solutions.
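The cycle-avoidance check on the disjunctive graph can be illustrated with a plain topological sort (Kahn's algorithm): after orienting the disjunctive arcs, a failed sort signals an infeasible (cyclic) partial schedule. This is a generic sketch under that assumption, not the paper's exact modified method; the function and variable names are hypothetical.

```python
# Hedged sketch (not the paper's exact method): Kahn's topological sort used as a
# cycle check after orienting disjunctive arcs in the job shop disjunctive graph.
from collections import defaultdict, deque

def is_acyclic(operations, arcs):
    """operations: list of operation ids; arcs: directed (pred, succ) pairs,
    including fixed conjunctive arcs and currently oriented disjunctive arcs."""
    indeg = {op: 0 for op in operations}
    succ = defaultdict(list)
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1

    queue = deque(op for op, d in indeg.items() if d == 0)
    visited = 0
    while queue:
        u = queue.popleft()
        visited += 1
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return visited == len(operations)   # every operation sorted -> no cycle
```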