Purpose: This paper aims to address the limitations in existing research on the evolution of knowledge flow networks by proposing a meso-level institutional field knowledge flow network evolution model (IKM). The purpose is to simulate the construction process of a knowledge flow network using knowledge organizations as units and to investigate its effectiveness in replicating institutional field knowledge flow networks. Design/Methodology/Approach: The IKM model enhances the preferential attachment and growth observed in scale-free BA networks, while incorporating three adjustment parameters to simulate the selection of connection targets and the types of nodes involved in the network evolution process, and uses the PageRank algorithm to calculate the significance of nodes within the knowledge flow network. To compare its performance, the BA and DMS models are also employed to simulate the network. Pearson coefficient analysis is conducted on the simulated networks generated by the IKM, BA and DMS models, as well as on the actual network. Findings: The research findings demonstrate that the IKM model outperforms the BA and DMS models in replicating the institutional field knowledge flow network. It provides comprehensive insights into the evolution mechanism of knowledge flow networks in the scientific research realm. The model also exhibits potential applicability to other knowledge networks that involve knowledge organizations as node units. Research Limitations: This study has some limitations. Firstly, it primarily focuses on the evolution of knowledge flow networks within the field of physics, neglecting other fields. Additionally, the analysis is based on a specific set of data, which may limit the generalizability of the findings. Future research could address these limitations by exploring knowledge flow networks in diverse fields and utilizing broader datasets. Practical Implications: The proposed IKM model offers practical implications for the construction and analysis of knowledge flow networks within institutions. It provides a valuable tool for understanding and managing knowledge exchange between knowledge organizations. The model can aid in optimizing knowledge flow and enhancing collaboration within organizations. Originality/Value: This research highlights the significance of meso-level studies in understanding knowledge organization and its impact on knowledge flow networks. The IKM model demonstrates its effectiveness in replicating institutional field knowledge flow networks and offers practical implications for knowledge management in institutions. Moreover, the model has the potential to be applied to other knowledge networks that are formed by knowledge organizations as node units.
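As a rough illustration of the kind of analysis this abstract describes, the sketch below grows a plain BA preferential-attachment network with networkx, scores nodes with PageRank, and compares the scores of a simulated and a reference network with a Pearson coefficient. It is not the authors' IKM implementation: the three IKM adjustment parameters are not given in the abstract and are omitted, and aligning the two networks by node id is purely for illustration.

```python
# A minimal sketch (not the authors' IKM code) of the evaluation pipeline described above:
# grow a preferential-attachment (BA-style) network, rank nodes with PageRank, and compare
# the ranking against a reference network via Pearson correlation.
import networkx as nx
from scipy.stats import pearsonr

def grow_ba_like(n_nodes: int, m_edges: int, seed: int = 0) -> nx.Graph:
    """Scale-free growth with preferential attachment (plain BA; the IKM adjustment
    parameters are not published in the abstract, so they are left out)."""
    return nx.barabasi_albert_graph(n_nodes, m_edges, seed=seed)

def node_significance(g: nx.Graph) -> list:
    """PageRank scores, ordered by node id so two graphs on the same node set align."""
    pr = nx.pagerank(g, alpha=0.85)
    return [pr[v] for v in sorted(pr)]

simulated = grow_ba_like(500, 3, seed=1)
reference = grow_ba_like(500, 3, seed=2)   # stand-in for the empirical knowledge flow network
r, p = pearsonr(node_significance(simulated), node_significance(reference))
print(f"Pearson r = {r:.3f} (p = {p:.3g})")
```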
Accurately recommending candidate news to users is a basic challenge for personalized news recommendation systems. Traditional methods usually struggle to learn and acquire the complex semantic information in news texts, resulting in unsatisfactory recommendation results. Besides, these traditional methods are more friendly to active users with rich historical behaviors; however, they cannot effectively solve the long-tail problem of inactive users. To address these issues, this research presents a novel general framework that combines Large Language Models (LLM) and Knowledge Graphs (KG) with traditional methods. To learn the contextual information of news text, we use LLMs' powerful text understanding ability to generate news representations with rich semantic information, and the generated news representations are then used to enhance the news encoding in traditional methods. In addition, multi-hop relationships of news entities are mined and the structural information of news is encoded using KG, thus alleviating the challenge of the long-tail distribution. Experimental results demonstrate that, compared with various traditional models, the framework significantly improves recommendation performance on evaluation indicators such as AUC, MRR, nDCG@5 and nDCG@10. The successful integration of LLM and KG in our framework has established a feasible way to achieve more accurate personalized news recommendation. Our code is available at https://github.com/Xuan-ZW/LKPNR.
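The framework is evaluated with standard ranking metrics (AUC, MRR, nDCG@k). The following self-contained sketch shows how two of those metrics are typically computed; it is generic evaluation code, not part of the LKPNR repository linked above.

```python
# Generic ranking-metric sketch for MRR and nDCG@k (not the authors' evaluation script).
import numpy as np

def mrr(rels: np.ndarray) -> float:
    """rels: 0/1 relevance labels ordered by predicted score (descending)."""
    hits = np.flatnonzero(rels)
    return 0.0 if hits.size == 0 else 1.0 / (hits[0] + 1)

def ndcg_at_k(rels, k: int) -> float:
    rels = np.asarray(rels, dtype=float)[:k]
    if rels.sum() == 0:
        return 0.0
    discounts = 1.0 / np.log2(np.arange(2, rels.size + 2))   # positions 1..k
    dcg = float((rels * discounts).sum())
    idcg = float((np.sort(rels)[::-1] * discounts).sum())
    return dcg / idcg

ranked_labels = np.array([0, 1, 0, 1, 0])   # hypothetical clicked/not-clicked news, ranked by score
print(mrr(ranked_labels), ndcg_at_k(ranked_labels, 5))
```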
Objective: To explore the impact of a continuous precision nursing model on patients' Knowledge, Attitudes, and Practices (KAP) and cardiac function during the nursing process of patients undergoing percutaneous coronary angiography and stent implantation. Methods: Ninety patients who underwent percutaneous coronary angiography and stent implantation in our hospital from April 2022 to April 2023 were selected and randomly divided into the control group (45 cases), in which routine nursing support was carried out during the treatment process, and the observation group (45 cases), in which the continuous precision nursing model was carried out during the treatment process. Comparisons were made between the two groups of patients on their KAP, cardiac function, and quality of life during recovery. Results: There was no difference in the left ventricular ejection fraction (LVEF), cardiac output (CO), and cardiac index (CI) levels before the intervention. After the intervention, the levels of cardiac function in the observation group were higher than those of the control group (P<0.05). There was no difference in the Exercise of Self-Care Agency (ESCA) self-care ability scale scores before the intervention. After the intervention, the observation group had higher ESCA scores than the control group (P<0.05). Conclusion: Implementation of a continuous precision nursing model in the care of patients undergoing percutaneous coronary angiography and stent implantation improved the patients' cardiac function and KAP, and promoted recovery.
Knowledge distillation, as a pivotal technique in the field of model compression, has been widely applied across various domains. However, the problem of student model performance being limited due to inherent biases in the teacher model during the distillation process still persists. To address the inherent biases in knowledge distillation, we propose a de-biased knowledge distillation framework tailored for binary classification tasks. For the pre-trained teacher model, biases in the soft labels are mitigated through knowledge infusion and label de-biasing techniques. Based on this, a de-biased distillation loss is introduced, allowing the de-biased labels to replace the soft labels as the fitting target for the student model. This approach enables the student model to learn from the corrected model information, achieving high-performance deployment on lightweight student models. Experiments conducted on multiple real-world datasets demonstrate that deep learning models compressed under the de-biased knowledge distillation framework significantly outperform traditional response-based and feature-based knowledge distillation models across various evaluation metrics, highlighting the effectiveness and superiority of the de-biased knowledge distillation framework in model compression.
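A hedged PyTorch sketch of the loss structure this abstract describes, in which de-biased teacher labels replace the raw soft labels as the student's fitting target. The debias step below is a placeholder (a simple class-prior re-weighting with an assumed prior); the paper's actual knowledge-infusion and label de-biasing procedures are its contribution and are not reproduced here.

```python
# Sketch of a de-biased response-based distillation loss for a binary task.
import torch
import torch.nn.functional as F

def debias(teacher_logits: torch.Tensor, prior: float = 0.3) -> torch.Tensor:
    """Stand-in de-biasing: re-weight the teacher's binary soft labels toward an assumed
    class prior. The paper's de-biasing procedure is different and not shown here."""
    p = torch.sigmoid(teacher_logits)
    return p * prior / (p * prior + (1.0 - p) * (1.0 - prior))

def debiased_distillation_loss(student_logits, teacher_logits, targets, alpha=0.7):
    soft = debias(teacher_logits).detach()                       # corrected soft labels
    kd = F.binary_cross_entropy_with_logits(student_logits, soft)
    ce = F.binary_cross_entropy_with_logits(student_logits, targets.float())
    return alpha * kd + (1.0 - alpha) * ce                       # distillation + hard-label terms

student_logits = torch.randn(8, requires_grad=True)
teacher_logits = torch.randn(8)
targets = torch.randint(0, 2, (8,))
loss = debiased_distillation_loss(student_logits, teacher_logits, targets)
loss.backward()
print(loss.item())
```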
With the construction of new power systems, the power grid has become extremely large, with an increasing proportion of new energy and AC/DC hybrid connections. The dynamic characteristics and fault patterns of the power grid are complex; additionally, power grid control is difficult, operation risks are high, and the task of fault handling is arduous. Traditional power-grid fault handling relies primarily on human experience. Differences in, and gaps in, the knowledge reserves of control personnel restrict the accuracy and timeliness of fault handling. Therefore, this mode of operation is no longer suitable for the requirements of new systems. Based on the multi-source heterogeneous data of power grid dispatch, this paper proposes a joint entity-relationship extraction method for power-grid dispatch fault processing based on a pre-trained model, constructs a knowledge graph of power-grid dispatch fault processing, and designs and develops a fault-processing auxiliary decision-making system based on the knowledge graph. The system was applied in a provincial dispatch control center, where it effectively improved the accident-handling ability and the intelligence level of accident management and control of the power grid.
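To make the knowledge-graph step concrete, the sketch below stores hypothetical (head, relation, tail) triples for dispatch fault handling in a networkx multigraph and queries them for handling recommendations. The entities, relations, and fault names are invented; the paper's pre-trained joint entity-relationship extraction model is not reproduced.

```python
# Toy triple store for a dispatch-fault knowledge graph (illustrative entities only).
import networkx as nx

triples = [
    ("Line AB-1 fault", "caused_by", "lightning strike"),
    ("Line AB-1 fault", "handled_by", "switch to backup line AB-2"),
    ("switch to backup line AB-2", "requires", "load transfer approval"),
]

kg = nx.MultiDiGraph()
for head, relation, tail in triples:
    kg.add_edge(head, tail, relation=relation)

# retrieve recommended handling steps for a given fault node
fault = "Line AB-1 fault"
for _, tail, data in kg.out_edges(fault, data=True):
    if data["relation"] == "handled_by":
        print(f"{fault} -> {data['relation']} -> {tail}")
```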
Cyber Threat Intelligence (CTI) is a valuable resource for cybersecurity defense, but it also poses challenges due to its multi-source and heterogeneous nature. Security personnel may be unable to use CTI effectively to understand the condition and trend of a cyberattack and respond promptly. To address these challenges, we propose a novel approach that consists of three steps. First, we construct the attack and defense analysis of the cybersecurity ontology (ADACO) model by integrating multiple cybersecurity databases. Second, we develop the threat evolution prediction algorithm (TEPA), which can automatically detect threats at device nodes, correlate and map multi-source threat information, and dynamically infer the threat evolution process. TEPA leverages knowledge graphs to represent comprehensive threat scenarios and achieves better performance in simulated experiments by combining structural and textual features of entities. Third, we design the intelligent defense decision algorithm (IDDA), which can provide intelligent recommendations for security personnel regarding the most suitable defense techniques. IDDA outperforms the baseline methods in the comparative experiment.
In order to solve the problem of modeling product configuration knowledge at the semantic level to successfully implement the mass customization strategy, an approach to ontology-based configuration knowledge modeling, combining semantic web technologies, was proposed. A general configuration ontology was developed to provide a common concept structure for modeling the configuration knowledge and rules of specific product domains. The OWL web ontology language and the semantic web rule language (SWRL) were used to formally represent the configuration ontology, domain configuration knowledge and rules, to enhance the consistency, maintainability and reusability of all the configuration knowledge. The configuration knowledge modeling of a customizable personal computer family shows that the approach can provide explicit, computer-understandable knowledge semantics for specific product configuration domains and can efficiently support automatic configuration tasks for complex products.
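A minimal rdflib sketch of representing a slice of configuration knowledge as OWL-style triples, assuming illustrative class and property names (Component, CPU, hasPart, PersonalComputer) in a made-up namespace. SWRL rules, which the paper uses for configuration constraints, are not expressible in plain rdflib and are omitted here.

```python
# Tiny configuration-ontology sketch with rdflib (illustrative names, not the paper's ontology).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

CFG = Namespace("http://example.org/config#")
g = Graph()
g.bind("cfg", CFG)

g.add((CFG.Component, RDF.type, OWL.Class))
g.add((CFG.CPU, RDF.type, OWL.Class))
g.add((CFG.CPU, RDFS.subClassOf, CFG.Component))
g.add((CFG.PersonalComputer, RDF.type, OWL.Class))
g.add((CFG.hasPart, RDF.type, OWL.ObjectProperty))

g.add((CFG.pc01, RDF.type, CFG.PersonalComputer))      # one configured instance
g.add((CFG.cpu_i7, RDF.type, CFG.CPU))
g.add((CFG.cpu_i7, RDFS.label, Literal("Quad-core CPU option")))
g.add((CFG.pc01, CFG.hasPart, CFG.cpu_i7))

print(g.serialize(format="turtle"))
```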
For an exact description of thread information in question and answer (QnA) web forums, it is proposed to construct a QnA knowledge presentation model in the English language; an entire solution for the QnA knowledge system is then presented, including data gathering, platform building and application design. With a pre-defined dictionary and grammatical analysis, the model draws semantic information, grammatical information and knowledge confidence into IR methods, in the form of statement sets and term sets with semantic links. Theoretical analysis shows that the statement model can provide an exact presentation of QnA knowledge, breaking through the limits of the original QnA patterns and being adaptable to various query demands; the semantic links between terms can assist the statement model in deducing new knowledge from existing knowledge. The model makes use of both information retrieval (IR) and natural language processing (NLP) features, strengthening its knowledge presentation ability. Many knowledge-based applications built upon this model can be improved, providing better performance.
In order to find the completeness threshold, which offers a practical method of making bounded model checking complete, an over-approximation of the completeness threshold is presented. First, the past tense operator is introduced into a linear temporal logic of knowledge, and a new temporal epistemic logic, LTLKP, is obtained, so that LTLKP can naturally and precisely describe a system's reliability. Secondly, a set of prior algorithms is designed to calculate the maximal reachable depth and the length of the longest loop-free path in the structure, based on graph structure theory. Finally, some theorems are proposed to show how to approximate the completeness threshold with the diameter and the recurrence diameter. The proposed work resolves the completeness threshold problem so that the completeness of bounded model checking can be guaranteed.
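The two bounds named here can be illustrated on a toy transition system: a diameter-style bound comes from the longest shortest path from the initial state, and a recurrence-diameter-style bound from the longest loop-free path. The sketch below uses networkx on an invented five-edge graph; the loop-free search is exponential in general and is only meant for small examples.

```python
# Diameter vs. recurrence-diameter style bounds on a toy Kripke-like transition system.
import networkx as nx
from itertools import chain

G = nx.DiGraph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 3)])   # invented transition relation
init = 0

# diameter-style bound: longest shortest path from the initial state
d = max(nx.single_source_shortest_path_length(G, init).values())

# recurrence-diameter-style bound: longest loop-free (simple) path from the initial state
# (exponential in general -- fine only for small models)
rd = max(len(p) - 1 for p in chain.from_iterable(
    nx.all_simple_paths(G, init, t) for t in G.nodes if t != init))

print(d, rd)   # either value can serve as an over-approximation basis for the threshold
```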
Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, there is a problem that the models do not directly use explicit information from knowledge sources existing outside. To augment this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose to use the latest pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model can achieve better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
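As a hedged sketch of the language-model half of this approach, the snippet below scores a CommonsenseQA-style (question, choice) pair with ALBERT via the Hugging Face transformers API (it downloads the albert-base-v2 checkpoint and needs the sentencepiece package installed). The knowledge-graph extraction and schema graph expansion components are the paper's contribution and are not reproduced; the linear scoring head is untrained and purely illustrative.

```python
# Encode a (question, answer choice) pair with ALBERT and produce a plausibility score.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
encoder = AutoModel.from_pretrained("albert-base-v2")
scorer = torch.nn.Linear(encoder.config.hidden_size, 1)    # untrained head, illustration only

question = "Where would you find a single shirt for sale?"
choice = "clothing store"
inputs = tokenizer(question, choice, return_tensors="pt")   # sentence-pair encoding

with torch.no_grad():
    cls_vec = encoder(**inputs).last_hidden_state[:, 0]     # [CLS]-position representation
    print(scorer(cls_vec).item())                           # per-choice score (random until trained)
```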
Configuration knowledge is a dynamic information set that is continuously evolving and being enriched. The product model is the instantiation of configuration knowledge, and the evolution of configuration knowledge is the essential inherent reason for the model's dynamic evolvement. In the traditional model evolvement process, the inheritance and reuse of configuration knowledge were always ignored. To solve the above problem, the multistage rhombus evolution mode of configuration knowledge is discussed in this paper. A product model based on configuration knowledge is put forward at different levels to achieve the model's dynamic evolvement and automatic upgrading. The evolving configuration knowledge drives the product model to evolve directly according to the rule of up-layer evolvement. Furthermore, a new configuration knowledge reuse and optimization technology is presented to inherit and reuse the foregone configuration knowledge in the course of model evolvement. Finally, the air separation equipment related to the project is taken as an example to illustrate that the presented model evolvement and configuration knowledge reuse technology are valid and practical.
A knowledge model with temporal and spatial characteristics for the quantitative design of a cultural pattern in wheat production, using systems analysis and dynamic modeling techniques, was developed for wheat management as a decision-making tool in digital farming. The fundamental relationships and algorithms of wheat growth indices and management criteria to cultivars, ecological environments, and production levels were derived from the existing literature and research data to establish a knowledge model system for quantitative wheat management using Visual C++. The system designed a cultural management plan for general management guidelines and crop regulation indices for time-course control criteria during the wheat-growing period. The cultural management plan module included submodels to determine target grain yield and quality, cultivar choice, sowing date, population density, sowing rate, fertilization strategy, and water management, whereas the crop regulation indices module included submodels for suitable development stages, dynamic growth indices, source-sink indices, and nutrient indices. Evaluation of the knowledge model by design studies on the basis of data sets of different eco-sites, cultivars, and soil types indicated a favorable performance of the model system in recommending growth indices and management criteria under diverse conditions. Practical application of the knowledge model system in comparative field experiments produced yield gains of 2.4% to 16.5%. Thus, the presented knowledge model system overcame some of the difficulties of traditional wheat management patterns and expert systems, and laid a foundation for facilitating the digitization of wheat management.
Based on research concerning the dynamic relationships of winter wheat growth to environments and production conditions, a winter wheat model for selecting suitable sowing date, population density and sowing rate under different varieties and spatial and temporal environments was developed. Case studies on sowing date, with the data sets of five different eco-sites, three climatic years and soil fertility levels, and on population density and sowing rate, with the data sets of two different variety types, three different soil types, soil fertility levels, sowing dates and grain yield levels, indicate a good model performance for decision-making.
By analyzing and extracting the research progress on nitrogen fertilization in wheat, a dynamic knowledge model for management decision-making on total nitrogen rate and the ratios of organic to inorganic and of basal to dressing nitrogen under different environments and cultivars in wheat was developed, following the principle of nutrient balance and integrating the quantitative effects of grain yield and quality targets, soil characters, variety traits and water management levels. Case studies on the nitrogen fertilization model with the data sets of different eco-sites, cultivars, soil fertility levels, grain yield and quality targets and water management levels indicate a good performance of the model system in decision-making and wide applicability.
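A small sketch of the nutrient-balance reasoning this abstract refers to: the total nitrogen rate is derived from the gap between target crop uptake and indigenous soil supply divided by a recovery efficiency, then split by organic/inorganic and basal/topdressing ratios. Every coefficient below is an illustrative placeholder, not a value from the paper's knowledge model.

```python
# Nutrient-balance sketch for a nitrogen fertilization plan (all coefficients assumed).
def nitrogen_plan(target_yield_t_ha: float,
                  n_uptake_per_t: float = 28.0,    # kg N taken up per tonne of grain (assumed)
                  soil_n_supply: float = 60.0,     # kg N/ha supplied by the soil (assumed)
                  recovery_eff: float = 0.42,      # fertilizer N recovery efficiency (assumed)
                  organic_ratio: float = 0.3,      # organic : total N (assumed)
                  basal_ratio: float = 0.5):       # basal : total N (assumed)
    total_n = max(0.0, (target_yield_t_ha * n_uptake_per_t - soil_n_supply) / recovery_eff)
    return {
        "total_N_kg_ha": round(total_n, 1),
        "organic_N_kg_ha": round(total_n * organic_ratio, 1),
        "inorganic_N_kg_ha": round(total_n * (1 - organic_ratio), 1),
        "basal_N_kg_ha": round(total_n * basal_ratio, 1),
        "topdressing_N_kg_ha": round(total_n * (1 - basal_ratio), 1),
    }

print(nitrogen_plan(target_yield_t_ha=7.5))
```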
By applying the system analysis principle and mathematical modeling techniques to a knowledge expression system for crop cultural management, the fundamental relationships and quantitative algorithms of wheat growth and management indices to variety types, ecological environments and production levels were analysed and extracted, and a dynamic knowledge model with temporal and spatial characters for wheat management (WheatKnow) was developed. By adopting software component characteristics such as language independence, re-usability and portable system maintenance, and by further integrating the wheat growth simulation model (WheatGrow) and an intelligent system for wheat management, a comprehensive and digital knowledge model, growth model and component-based decision support system for wheat management (MBDSSWM) was established on the platforms of Visual C++ and Visual Basic. The MBDSSWM realized the effective integration and coupling of the prediction and decision-making functions for digital crop management.
BACKGROUND: Cerebrovascular disease (CVD) poses a serious threat to human health and safety. Thus, developing a reasonable exercise program plays an important role in the long-term recovery and prognosis of patients with CVD. Studies have shown that predictive nursing can improve the quality of care and that the information-knowledge-attitude-practice (IKAP) nursing model has a positive impact on patients who suffered a stroke. Few studies have combined these two nursing models to treat CVD. AIM: To explore the effect of the IKAP nursing model combined with predictive nursing on the Fugl-Meyer motor function (FMA) score, Barthel index score, and disease knowledge mastery rate in patients with CVD. METHODS: A total of 140 patients with CVD treated at our hospital between December 2019 and September 2021 were randomly divided into two groups, with 70 patients in each. The control group received routine nursing, while the observation group received the IKAP nursing model combined with predictive nursing. Both groups were observed for self-care ability, motor function, and disease knowledge mastery rate after one month of nursing. RESULTS: There was no clear difference between the Barthel index and FMA scores of the two groups before nursing (P>0.05); however, their scores increased after nursing. This increase was more apparent in the observation group, and the difference was statistically significant (P<0.05). The rates of disease knowledge mastery, timely medication, appropriate exercise, and reasonable diet were significantly higher in the observation group than in the control group (P<0.05). The satisfaction rate in the observation group (97.14%) was significantly higher than that in the control group (81.43%; P<0.05). CONCLUSION: The IKAP nursing model, combined with predictive nursing, is more effective than routine nursing in the care of patients with CVD, and it can significantly improve the Barthel index and FMA scores with better knowledge acquisition, as well as produce high satisfaction in patients. Moreover, it can be widely used in the clinical setting.
Event extraction stands as a significant endeavor within the realm of information extraction, aspiring to automatically extract structured event information from vast volumes of unstructured text. Extracting event elements from multi-modal data remains a challenging task due to the presence of a large number of images and overlapping event elements in the data. Although researchers have proposed various methods to accomplish this task, most existing event extraction models cannot address these challenges because they are only applicable to text scenarios. To solve the above issues, this paper proposes a multi-modal event extraction method based on knowledge fusion. Specifically, for event-type recognition, we use a meticulous pipeline approach that integrates multiple pre-trained models. This approach enables a more comprehensive capture of the multidimensional event semantic features present in military texts, thereby enhancing the interconnectedness of information between trigger words and events. For event element extraction, we propose a method for constructing a priori templates that combine event types with corresponding trigger words. This approach facilitates the acquisition of fine-grained input samples containing event trigger words, thus enabling the model to understand the semantic relationships between elements in greater depth. Furthermore, a fusion method for the spatial mapping of textual event elements and image elements is proposed to reduce the category number overload and effectively achieve multi-modal knowledge fusion. The experimental results based on the CCKS 2022 dataset show that our method has achieved competitive results, with a comprehensive evaluation F1-score of 53.4% for the model. These results validate the effectiveness of our method in extracting event elements from multi-modal data.
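The a priori template idea can be sketched as pairing each recognized event type with its trigger word to build a fine-grained extraction prompt. The event schema, trigger words, and template wording below are invented for illustration and do not come from the CCKS 2022 setup.

```python
# Illustrative construction of event-type + trigger-word templates for element extraction.
EVENT_SCHEMA = {
    "Conflict/Attack": ["attacked", "struck"],
    "Movement/Transport": ["deployed", "transported"],
}

def build_prior_templates(sentence: str) -> list:
    """Return one fine-grained prompt per (event type, trigger word) hit in the sentence."""
    templates = []
    for event_type, triggers in EVENT_SCHEMA.items():
        for trig in triggers:
            if trig in sentence:
                templates.append(
                    f"[{event_type}] trigger word: {trig}. "
                    f"Extract the elements of this event from: {sentence}")
    return templates

for t in build_prior_templates("The unit was deployed to the coastal region on Monday."):
    print(t)
```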
In this paper, a novel method of ultra-lightweight convolutional neural network (CNN) design based on neural architecture search (NAS) and knowledge distillation (KD) is proposed. It can realize the automatic construction of a space target inverse synthetic aperture radar (ISAR) image recognition model with ultra-lightweight structure and high accuracy. This method introduces the NAS method into radar image recognition for the first time, which solves the time-consuming and labor-consuming problems in the artificial design of the space target ISAR image automatic recognition model (STIIARM). On this basis, the NAS model's knowledge is transferred to a student model with lower computational complexity by the flow of the solution procedure (FSP) distillation method. Thus, the decline in recognition accuracy caused by the direct compression of model structural parameters can be effectively avoided, and an ultra-lightweight STIIARM can be obtained. In the method, the Inverted Linear Bottleneck (ILB) and Inverted Residual Block (IRB) are first taken as each block's basic structure in the CNN, and the expansion ratio, output filter size, number of IRBs, and convolution kernel size are set as the search parameters to construct a hierarchical decomposition search space. Then, the recognition accuracy and computational complexity are taken as the objective function and constraint conditions, respectively, and the global optimization model of the CNN architecture search is established. Next, the simulated annealing (SA) algorithm is used as the search strategy to directly search out the lightweight and high-accuracy STIIARM. After that, based on the three principles of similar block structure, the same corresponding channel number, and the minimum computational complexity, the more lightweight student model is designed, and the FSP matrix pairing between the NAS model and the student model is completed. Finally, by minimizing the loss between the FSP matrix pairs of the NAS model and the student model, the student model's weight adjustment is completed. Thus the ultra-lightweight and high-accuracy STIIARM is obtained. The proposed method's effectiveness is verified by simulation experiments on an ISAR image dataset of five types of space targets.
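A hedged sketch of the FSP (flow of the solution procedure) distillation step: an FSP matrix is computed between two feature maps of a network, and the student is penalized for the distance between its FSP matrices and the teacher's. This is the generic FSP formulation; the NAS-derived teacher and the paper's student architecture are replaced here by random feature maps of assumed shapes.

```python
# FSP matrix and FSP distillation loss between paired feature maps (generic formulation).
import torch

def fsp_matrix(feat_a: torch.Tensor, feat_b: torch.Tensor) -> torch.Tensor:
    """feat_a: (N, C1, H, W), feat_b: (N, C2, H, W) -> FSP matrices of shape (N, C1, C2)."""
    _, _, h, w = feat_a.shape
    return torch.einsum("nahw,nbhw->nab", feat_a, feat_b) / (h * w)

def fsp_loss(teacher_pair, student_pair) -> torch.Tensor:
    g_t = fsp_matrix(*teacher_pair)
    g_s = fsp_matrix(*student_pair)
    return torch.mean((g_t.detach() - g_s) ** 2)

# hypothetical feature maps from matching stages of the teacher (NAS) and student models
t1, t2 = torch.randn(4, 32, 16, 16), torch.randn(4, 64, 16, 16)
s1 = torch.randn(4, 32, 16, 16, requires_grad=True)
s2 = torch.randn(4, 64, 16, 16, requires_grad=True)

loss = fsp_loss((t1, t2), (s1, s2))
loss.backward()
print(loss.item())
```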
To improve the efficiency and accuracy of carbonate reservoir research, a unified reservoir knowledge base linking geological knowledge management with reservoir research is proposed. The reservoir knowledge base serves high-quality analysis, evaluation, description and geological modeling of reservoirs. The knowledge framework is divided into three categories: technical service standards, technical research methods, and professional knowledge and cases related to geological objects. To build a knowledge base, it is necessary, first, to form a knowledge classification system and knowledge description standards; second, to sort out theoretical understandings and various technical methods for different geological objects and work out a technical service standard package according to the technical standard; and third, to collect typical outcrop and reservoir cases, constantly expand the content of the knowledge base through systematic extraction, sorting and saving, and construct professional knowledge about geological objects. Through the use of an encyclopedia-based collaborative editing architecture, knowledge construction and sharing can be realized. Geological objects and related attribute parameters can be automatically extracted by using natural language processing (NLP) technology, and outcrop data can be collected by using modern fine measurement technology, to enhance the efficiency of knowledge acquisition, extraction and sorting. In this paper, the geological modeling of fracture-cavity reservoirs in the Tarim Basin is taken as an example to illustrate the construction of a knowledge base for carbonate reservoirs and its application in the geological modeling of fracture-cavity carbonate reservoirs.
With the rapid development of the Internet of Things (IoT), the automation of edge-side equipment has emerged as a significant trend. The existing fault diagnosis methods impose heavy computing and storage loads, and most of them involve computational redundancy, which is not suitable for deployment on edge devices with limited resources and capabilities. This paper proposes a novel two-stage edge-side fault diagnosis method based on double knowledge distillation. First, we offer a clustering-based self-knowledge distillation approach (Cluster KD), which takes the mean value of the sample diagnosis results, clusters them, and takes the clustering results as terms of the loss function. It utilizes the correlations between faults of the same type to improve the accuracy of the teacher model, especially for fault categories with high similarity. Then, the double knowledge distillation framework uses ordinary knowledge distillation to build a lightweight model for edge-side deployment. We propose a two-stage edge-side fault diagnosis method (TSM) that separates fault detection and fault diagnosis into different stages: in the first stage, a fault detection model based on a denoising auto-encoder (DAE) is adopted to achieve fast fault responses; in the second stage, a diverse convolution model with variance weighting (DCMVW) is used to diagnose faults in detail, extracting features from micro and macro perspectives. Through comparison experiments conducted on two fault datasets, it is proven that the proposed method has high accuracy, low delays, and small computation, which makes it suitable for intelligent edge-side fault diagnosis. In addition, experiments show that our approach has a smooth training process and good balance.
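A hedged sketch of the clustering-based self-distillation (Cluster KD) idea: cluster the model's own (detached) prediction vectors and add a loss term pulling each prediction toward its cluster centroid, so that highly similar fault categories reinforce one another. The exact loss in the paper may differ; the class count, cluster count, and weighting factor below are assumptions.

```python
# Clustering-based regularization term added to an ordinary classification loss.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans

def cluster_kd_term(logits: torch.Tensor, n_clusters: int = 4) -> torch.Tensor:
    probs = F.softmax(logits, dim=1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    assignments = torch.as_tensor(km.fit_predict(probs.detach().cpu().numpy()))
    centroids = torch.as_tensor(km.cluster_centers_, dtype=probs.dtype)   # constants w.r.t. the gradient
    return F.mse_loss(probs, centroids[assignments])

logits = torch.randn(64, 10, requires_grad=True)   # hypothetical diagnosis outputs over 10 fault classes
targets = torch.randint(0, 10, (64,))
loss = F.cross_entropy(logits, targets) + 0.1 * cluster_kd_term(logits)
loss.backward()
print(loss.item())
```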