The presented research illustrates the applicability and productiveness of the systematic literature review methodology, a non-empirical methodology, in the geological sciences, particularly volcanology. The systematic literature review is a replicable, rigorous, and transparent methodology for synthesizing existing literature to answer questions on a specific topic. The synthesis allows for knowledge consolidation, such as the identification of knowledge gaps. In our illustration of this methodology, we focused on the expanding knowledge about the magma pathway at Mount Cameroon, one of Africa's active volcanoes. Our synthesis of the relevant international geoscience research literature is based on a framework of knowledge about the magma pathway beneath a typical basaltic volcano. The framework has three primary components: magma supply, storage, and transport to erupting vents. Distributed across these components are twelve secondary components. The result is a consolidated overall understanding of the magma pathway at Mount Cameroon that was previously non-existent or fragmented. The gaps in this understanding (such as in magma supply rates, timescales of chamber processes, and magma ascent rates) may be addressed in future research. Another key implication of the presented research lies in its proof of concept of the systematic literature review as an applicable qualitative research methodology in the study of volcanoes.
Systematic review and meta-analysis are techniques that attempt to combine the findings of similar studies and deliver quantitative summaries of the research literature [1]. A systematic review of the research literature identifies the common research methods, research designs, sample sizes, parameters, survey instruments, etc. used by a group of researchers. This study serves that purpose by identifying the common research methodologies, dependent variables, sample sizes, moderators, and mediators used in technology-adoption studies that utilize the UTAUT2 model. The research collected over 59 published articles and conducted descriptive analytics. The results reveal performance expectancy/perceived usefulness, trust, and habit as the best predictors of consumer behavioural intention towards the adoption of mobile applications. Behavioural intention was the best predictor of use behaviour among the 57 articles selected. The mean sample size was 274, with a mean of 25 questionnaire items. SPSS and AMOS were the most common software packages across the 57 studies; 32 of those studies used the UTAUT1 model, while 14 incorporated the UTAUT2 model. Two further promising predictors also emerged: perceived risk on behavioural intention and habit on use behaviour.
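The descriptive analytics step described above is straightforward to reproduce; the sketch below uses only the Python standard library, and the article records and field names are hypothetical, purely for illustration.

```python
from collections import Counter
from statistics import mean

# Hypothetical per-article extraction records (illustrative only; not the
# actual dataset behind the review).
articles = [
    {"model": "UTAUT1", "sample_size": 310, "items": 28, "software": "SPSS"},
    {"model": "UTAUT1", "sample_size": 250, "items": 24, "software": "AMOS"},
    {"model": "UTAUT2", "sample_size": 262, "items": 23, "software": "SPSS"},
]

mean_n = mean(a["sample_size"] for a in articles)        # mean sample size
mean_items = mean(a["items"] for a in articles)          # mean questionnaire length
model_counts = Counter(a["model"] for a in articles)     # UTAUT1 vs UTAUT2 usage
software_counts = Counter(a["software"] for a in articles)

print(mean_n, mean_items, model_counts.most_common())
```

The same pattern scales to any of the extracted attributes (moderators, mediators, analysis software) by swapping the dictionary key fed to `Counter`.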
The exponential use of artificial intelligence (AI) to solve and automate complex tasks has catapulted its popularity, generating some challenges that need to be addressed. While AI is a powerful means to discover interesting patterns and obtain predictive models, the use of these algorithms comes with a great responsibility, as an incomplete or unbalanced set of training data or an improper interpretation of the models' outcomes could result in misleading conclusions that ultimately could become very dangerous. For these reasons, it is important to rely on expert knowledge when applying these methods. However, not every user can count on this specific expertise; non-AI-expert users could also benefit from applying these powerful algorithms to their domain problems, but they need basic guidelines to get the most out of AI models. The goal of this work is to present a systematic review of the literature analyzing studies whose outcomes are explainable rules and heuristics for selecting suitable AI algorithms given a set of input features. The systematic review follows the methodology proposed by Kitchenham and other authors in the field of software engineering. As a result, 9 papers that tackle AI algorithm recommendation through tangible and traceable rules and heuristics were collected. The small number of retrieved papers suggests a lack of reporting of explicit rules and heuristics when testing the suitability and performance of AI algorithms.
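As a toy illustration of what such explainable, traceable recommendation rules look like (not taken from any of the nine reviewed papers; the thresholds and algorithm names are invented for this sketch):

```python
# A toy, traceable rule set of the kind the reviewed studies report:
# given coarse dataset features, return an algorithm suggestion. Every
# branch is an explicit, inspectable rule rather than a learned model.
def recommend_algorithm(n_samples: int, n_features: int, labeled: bool) -> str:
    if not labeled:
        return "k-means clustering"          # no labels -> unsupervised method
    if n_samples < 1_000:
        return "decision tree"               # small data -> interpretable model
    if n_features > n_samples:
        return "regularized linear model"    # wide data -> control overfitting
    return "gradient-boosted trees"          # otherwise a strong default

print(recommend_algorithm(n_samples=500, n_features=10, labeled=True))
```

The value of such rule sets, as the review notes, is precisely that each recommendation can be traced back to an explicit condition a non-expert can read.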
The growing global requirement for food and the need for sustainable farming in an era of a changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models effectively deal with such challenges. This research paper comprehensively analyses recent advancements in crop yield prediction from January 2016 to March 2024. In addition, it analyses the effectiveness of the various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield. In total, 125 articles were reviewed on crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL), and DL-based prediction models, as well as input parameter selection. We conducted the research by setting five objectives and discussing them after analyzing the selected research papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used for estimating model performance. We also discuss the ethical and social impacts of AI on agriculture. Although the various approaches presented in the scientific literature have delivered impressive predictions, they are complicated by the intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Therefore, thorough research is required to deal with the challenges of predicting agricultural output.
The integration of environmental, social, and governance (ESG) principles has become a pivotal factor in shaping sustainable and responsible corporate practices. The present study investigates the integration of ESG principles within corporate governance models in Asia-Pacific countries, focusing on socialization. By examining the governance culture, legal frameworks, and corporate practices in these representative countries, the paper delineates a strategic framework for embedding social governance into corporate strategies. The study introduces a Cultural, Economic, Legal, and Political (CELP) framework to assess corporate social governance, investigating the correlation between business practices and social changes. Through a systematic literature review and detailed thematic analysis, this paper aims to offer actionable insights and recommendations, guiding corporations in their transition towards more sustainable and socially responsible business practices.
Artificial intelligence (AI) has garnered significant interest within the educational domain over the past decade, promising to revolutionise teaching and learning. This paper provides a comprehensive overview of systematic reviews conducted from 2010 to 2023 on the implementation of AI in K-12 education. By synthesising findings from ten selected systematic reviews, this study explores the multifaceted opportunities and challenges posed by AI in education. The analysis reveals several key findings: AI's potential to personalise learning, enhance student motivation, and improve teaching efficiency is highlighted as a major strength. However, the study also identifies critical concerns, including teacher resistance, high implementation costs, ethical considerations, and the need for extensive teacher training. These findings represent the most significant insights from the analysis, while additional findings further underscore the complexity and scope of AI integration in educational settings. The study employs a SWOT analysis to summarise these insights, identifying key areas for future research and policy development. This review aims to guide educators, policymakers, and researchers in effectively leveraging AI to enhance educational outcomes while addressing its inherent challenges.
This paper focuses on facilitating state-of-the-art applications of big data analytics (BDA) architectures and infrastructures in the telecommunications (telecom) industrial sector. Telecom companies deal with terabytes to petabytes of data on a daily basis, and IoT applications in telecom are further contributing to this data deluge. Recent advances in BDA have exposed new opportunities to get actionable insights from telecom big data. These benefits and the fast-changing BDA technology landscape make it important to investigate existing BDA applications in the telecom sector. To that end, we initially identify published research on BDA applications in telecom through a systematic literature review, through which we filter 38 articles and categorize them into frameworks, use cases, literature reviews, white papers, and experimental validations. We also discuss the benefits and challenges mentioned in these articles. We find that the experiments are all proofs of concept (POC) on a severely limited BDA technology stack (compared with the available technology stack); i.e., we did not find any work focusing on a full-fledged BDA implementation in an operational telecom environment. To facilitate such applications at the research level, we propose a state-of-the-art lambda architecture for BDA pipeline implementation (called LambdaTel), based entirely on open-source BDA technologies and the standard Python language, along with relevant guidelines. We discovered only one research paper that presented a relatively limited lambda architecture, using the proprietary AWS cloud infrastructure. We believe LambdaTel presents a clear roadmap for telecom industry practitioners to implement and enhance BDA applications in their enterprises.
The advent of healthcare information management systems (HIMSs) continues to produce large volumes of healthcare data for patient care and for compliance and regulatory requirements at a global scale. Analysis of this big data allows for boundless potential outcomes for discovering knowledge. Big data analytics (BDA) in healthcare can, for instance, help determine the causes of diseases, generate effective diagnoses, enhance QoS guarantees by increasing the efficiency of healthcare delivery and the effectiveness and viability of treatments, generate accurate predictions of readmissions, enhance clinical care, and pinpoint opportunities for cost savings. However, BDA implementations in any domain are generally complicated and resource-intensive, with a high failure rate and no roadmap or success strategies to guide practitioners. In this paper, we present a comprehensive roadmap for deriving insights from BDA in the healthcare (patient care) domain, based on the results of a systematic literature review. We initially determine big data characteristics for healthcare and then review BDA applications to healthcare in academic research, focusing particularly on NoSQL databases. We also identify the limitations and challenges of these applications and justify the potential of NoSQL databases to address these challenges and further enhance BDA healthcare research. We then propose and describe a state-of-the-art BDA architecture called Med-BDA for the healthcare domain, which solves all current BDA challenges and is based on the latest zeta big data paradigm. We also present success strategies to ensure the working of Med-BDA, along with outlining the major benefits of BDA applications to healthcare. Finally, we compare our work with other related literature reviews across twelve hallmark features to justify the novelty and importance of our work. These contributions are collectively unique and present a clear roadmap for clinical administrators, practitioners, and professionals to successfully implement BDA initiatives in their organizations.
BACKGROUND Gastroenteropancreatic neuroendocrine tumours (GEP-NETs) are slow-growing cancers that arise from diffuse endocrine cells in the gastrointestinal tract (GI-NETs) or the pancreas (P-NETs). They are relatively uncommon, accounting for 2% of all gastrointestinal malignancies. The usual treatment options for advanced GEP-NET patients with metastatic disease include chemotherapy, biological therapies, and peptide receptor radionuclide therapy. Understanding the impact of treatment on GEP-NET patients is paramount given the nature of the disease. Health-related quality of life (HRQoL) is increasingly important as a concept reflecting the patients' perspective in conjunction with disease presentation, severity, and treatment. AIM To conduct a systematic literature review identifying literature reporting HRQoL data in patients with GEP-NETs between January 1985 and November 2019. METHODS The PRISMA guiding principles were applied. MEDLINE, Embase, and the Cochrane Library were searched. Data extracted from the publications included type of study, patient population data (mid-gut/hind-gut/GI-NET/P-NET), sample size, intervention/comparators, HRQoL instruments, average and spread of overall and sub-scores, and follow-up time for data collection. RESULTS Forty-three publications met the inclusion criteria. The heterogeneous nature of the different study populations was evident; the percentage of female participants ranged between 30% and 60%, whilst average age ranged from 53.8 to 67.0 years. Eight studies investigated GI-NET patients only, six studies focused exclusively on P-NET patients, and the remaining studies involved both patient populations or did not report the location of the primary tumour. The most commonly used instrument was the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-C30 (n = 28), with consistent results across studies; the GI-NET-specific module, Quality of Life Questionnaire-GINET21, was used in six of these studies. A number of randomised trials demonstrated no HRQoL differences between active treatment and placebo arms. The Phase III NETTER-1 study provides the best data available for advanced GEP-NET patients; it shows that peptide receptor radionuclide therapy can significantly improve GEP-NET patients' HRQoL. CONCLUSION HRQoL instruments offer a means to monitor patients' general disease condition, disease progression, and their physical and mental well-being. However, instruments including the commonly used European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-C30 and GINET21 lack validation and a defined minimal clinically important difference specifically for GI-NET and P-NET patients.
Locus of control (LOC) theory, developed by Rotter, suggests that there are two main patterns in how people attribute the failure or success of their life events: an external locus of control and an internal one. The way individuals act is determined by their expectations regarding their specific behaviors and the value they attach to these expectations. For instance, people in the internal category are more likely to attribute their life events to their own behaviors, skills, and attitudes, while people in the external category tend to attribute their acts to fate, chance, and other exterior factors that are outside their control. The aim of this systematic literature review was to define the fundamental concepts of LOC theory, to investigate the theory's major findings concerning LOC and procrastination, job satisfaction, and performance, and lastly, to discuss the practical use of the theory in the organizational context.
Improved rice varieties (IRVs) have played a significant role in establishing food security and improving livelihoods in the Global South since their introduction in the 1960s. However, the adoption of new IRVs has remained relatively low. This low adoption poses a challenge to rice-producing and rice-consuming countries as they are increasingly threatened by production shortages, malnutrition, and poor rice quality. Many empirical studies have attempted to identify the determinants influencing the adoption of IRVs by distinguishing the characteristics of adopters and non-adopters. This review found a consensus on the important determinants influencing the adoption of IRVs in the Global South. Findings synthesized from 99 studies suggest that the variables examined most extensively (farm size, education, information access, and farm location) are not necessarily the most important determinants of adoption when a weighted analysis is undertaken. Terrain, seed source, and technology-related attributes (perceived yield, maturity, ease of use, marketability, and technical efficiency) are more important determinants of adoption, with determinants changing according to adoption type (probability or intensity of adoption), variety type, and region. Recommendations for future adoption studies include: incorporating more technology-specific variables, increasing research on overlooked regions and variety types, shifting away from predominantly static analysis by capturing the dynamics of the adoption process, and considering potential biases in analyses. This review will facilitate the development of targeted interventions and policies that promote IRV adoption in the Global South.
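Why a weighted analysis can reorder determinants can be sketched in a few lines; the studies, determinants, and the sample-size weighting scheme below are invented for illustration (the review's actual weighting procedure is not detailed in this summary).

```python
from collections import defaultdict

# Invented study records: which determinants each study found significant,
# and each study's sample size n (used here as the weight).
studies = [
    {"n": 120, "significant": ["farm size", "terrain"]},
    {"n": 300, "significant": ["terrain", "seed source"]},
    {"n": 80,  "significant": ["farm size"]},
]

raw_votes = defaultdict(int)   # unweighted: one vote per study
weighted = defaultdict(int)    # weighted: votes scaled by sample size
for s in studies:
    for d in s["significant"]:
        raw_votes[d] += 1
        weighted[d] += s["n"]

# Unweighted, "farm size" ties with "terrain"; weighting breaks the tie
# in favour of the determinant backed by larger studies.
top_weighted = max(weighted, key=weighted.get)
print(dict(raw_votes), dict(weighted), top_weighted)
```

The point of the toy example is only that a determinant reported by many small studies can rank below one reported by fewer but larger studies, which mirrors the review's contrast between extensively examined and genuinely important determinants.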
Autonomous vehicle (AV) technology has the potential to significantly improve the safety and efficiency of the transportation and logistics industry. Full-scale AV testing is limited by time, space, and cost, while simulation-based testing often lacks the necessary accuracy of AV and environmental modeling. In recent years, several initiatives have emerged to test autonomous software and hardware on scaled vehicles. This systematic literature review provides an overview of the literature surrounding small-scale self-driving cars, summarizing the autonomous platforms currently deployed and focusing on software and hardware developments in this field. Studies published in English-language journals or conference papers that present small-scale testing of self-driving cars were included. The Web of Science, Scopus, Springer Link, Wiley, ACM Digital Library, and TRID databases were used for the literature search. The systematic literature search found 38 eligible studies. Research gaps in the reviewed papers were identified to provide guidance for future research. Some key takeaways emerging from this manuscript are: (i) there is a need to improve the models and neural network architectures used in autonomous driving systems, as most papers present only preliminary results; (ii) enlarging datasets and sharing databases can help in developing more reliable control policies and reducing bias and variance in the training process; (iii) the use of small-scale vehicles to ensure safety is a major benefit, and incorporating data about unsafe driving behaviors and infrastructure problems can improve the accuracy of predictive models.
Innovation capabilities (ICs) represent a crucial source of competitive advantage for firms. However, the literature on ICs is extensive, leading to a diverse understanding of their nature and measurement. A notable gap exists in delineating the dimensions constituting ICs. This article aims to address this gap by identifying and pinpointing the various dimensions of ICs through a systematic literature review (SLR). The initial step involves identifying the diverse dimensions used in ICs, providing a distinctive insight for assessing their metrics. Notably, this SLR stands out as the only comprehensive analysis of the various IC dimensions, organizing them coherently. Examining 103 articles from the Web of Science and Scopus databases spanning 2001 to 2022, the results reveal an amalgam of scales and associated approaches for IC measurement. This study contributes to the literature by systematically identifying and analyzing the main dimensions employed by researchers to measure ICs. Additionally, it highlights the foundational theoretical approaches of the identified studies. In practical terms, the study consolidates and presents the identified dimensions and metrics in integrative tables, offering researchers and companies valuable insights into the diverse innovation paths that impact performance.
This study comprehensively analyzes the status, characteristics, focal points, and evolving trends of global research on "stroke risk analysis" over the past four years (2020–2023), aiming to provide insights for directing future research endeavors. Using the Newcastle-Ottawa Scale, 63 high-quality research papers were selected and subjected to a systematic literature review. In terms of research methods, stroke risk analysis has evolved from clinical trials (e.g., establishing control groups, using authoritative scales) towards statistical and data analysis methods (e.g., decision tree analysis). Regarding research factors, early studies primarily focused on pathological factors associated with hemorrhagic and ischemic stroke, such as hypertension, hyperlipidemia, and diabetes. Research from the past two years indicates a shift towards emerging factors, including temperature conditions, air quality, and Corona Virus Disease 2019 (COVID-19). In terms of application domains, stroke research covers a broad range of fields but mainly focuses on exploring risk factors, interventions during the diagnosis and treatment stages, and rehabilitation, with clinical diagnosis, treatment, and drug intervention studies being predominant. While the research landscape is becoming increasingly diversified and comprehensive, there remains a need for more comprehensive and in-depth studies on novel topics, as well as integrated applications of research methods, presenting ample opportunities for exploring dependent variables in future stroke research.
Blockchain is considered by many to be a disruptive core technology. Although many researchers have realized the importance of blockchain, blockchain research is still in its infancy. Consequently, this study reviews the current academic research on blockchain, especially in the subject area of business and economics. Based on a systematic review of the literature retrieved from the Web of Science service, we explore the top-cited articles, most productive countries, and most common keywords. Additionally, we conduct a clustering analysis and identify the following five research themes: "economic benefit," "blockchain technology," "initial coin offerings," "fintech revolution," and "sharing economy." Recommendations on future research directions and practical applications are also provided in this paper.
The negative cardiorespiratory health outcomes due to extreme temperatures and air pollution are widely studied, but knowledge about the effectiveness of the implementation of adaptive mechanisms remains unclear. The objective of this paper is to explore the evidence on adaptive mechanisms for cardiorespiratory diseases regarding extreme temperatures and air pollution by comparing the results of two systematic literature review (SLR) processes sharing the same initial research question but led by two research groups with different academic backgrounds working in the same multidisciplinary team. We start by presenting the methodological procedures and results of the SLR conducted by the research group composed mainly of researchers with a background in geography (the geographical strategy). We then compare these results with those achieved in the SLR led by the research group with a background in epidemiology (the epidemiological strategy). Both SLRs were developed under the EU Horizon 2020 project "EXHAUSTION". The results showed: 1) a lack of evidence regarding the effectiveness of adaptation measures, owing to the limited number of studies on the topic, the preponderance of studies dedicated to heat extremes, and the imbalance between different adaptation measures; 2) that the choice of search terms in the geographical strategy, despite appearing more comprehensive at first sight, ended up retrieving fewer results, but it brought in new studies that can complement the results of the epidemiological strategy. Therefore, it is suggested that, to strengthen the empirical evidence on the effectiveness of adaptation measures, strong multidisciplinary teams should work together in preparing SLRs on topics of great complexity, such as the one presented in this paper.
What is open innovation? There are different definitions of open innovation, depending on at least three parameters: the source, ownership, or users of the knowledge linked to innovation. The aim of this paper is to conduct a systematic literature review, to map open innovation studies, and to re-conceptualize openness along two dimensions: the degree of technology convergence and the ontology of openness. In particular, we propose a classification of open innovation based on the distinction between the originator/developer of the knowledge and the user. Users are a ubiquitous category because they can be originators as well as customers of the innovation itself. Therefore, we point out that there are three types of open innovation, whose degree of openness is defined along an ontological dimension: at the users' level, at an industry level, and among different fields or industries. A firm's structure affects its propensity to adopt open innovation, as well as the type of innovation itself. Finally, we identify another literature gap: the relationship between the open innovation model and grand challenges. Even though open innovation seems ideally connected to grand challenges, and many industries actually adopt this model, a gap seems to emerge in the literature. Therefore, we propose a conceptual model for future research.
Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations' networks with honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and/or methods, which is required for effective visual perception of the useful attack information hidden in attack data. Additionally, honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we review the visualization practices and methods commonly used in the discovery and communication of attack patterns based on honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 papers having a high impact. Most honeypot papers conducted summary statistics of honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data using scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results. While a significant number of automated visualization tools incorporate visualization standards by default, the construction of effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally an interdisciplinary collaboration with peers. We therefore suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
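A box-plot-style five-number summary of the kind the review recommends can be computed without any plotting library; the packet sizes below are synthetic, purely for illustration.

```python
from statistics import quantiles

# Synthetic packet sizes (bytes) from a hypothetical honeypot capture.
packet_sizes = [60, 60, 64, 74, 90, 120, 150, 150, 420, 576, 1500, 1500, 1500]

# Quartiles underpinning a box plot (statistics default "exclusive" method).
q1, q2, q3 = quantiles(packet_sizes, n=4)
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr   # standard whisker fences
outliers = [x for x in packet_sizes if x < lower or x > upper]

print((q1, q2, q3), outliers)
```

Unlike a pie or bar chart of the same capture, the quartiles and fences immediately expose the spread and skew of the traffic, which is the critical-evaluation property the review attributes to box plots and histograms.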
Objective To review domestic and foreign economic studies on CDK4/6 inhibitors in the first-line or second-line treatment of HR+/HER2− advanced breast cancer, and to analyze their main methodologies and research results. Methods A systematic literature review was used to search the PubMed, EMBASE, Cochrane Library, CNKI, CBM, and Wanfang databases. The incremental cost-effectiveness ratio was taken as the main outcome index, and all pharmacoeconomic evaluations with CDK4/6 inhibitors (such as palbociclib, ribociclib, and abemaciclib) as intervention measures were included. The quality of the included articles was evaluated according to the Quality of Health Economic Studies instrument, and the included literature was then analyzed. Results and Conclusion A total of 16 pharmacoeconomic evaluation studies were included, conducted mainly from the perspective of national healthcare systems or third-party payers. Only 2 studies focused on second-line treatment; the remainder addressed first-line treatment. In terms of model structure, 7 studies adopted a Markov model, 6 adopted a partitioned survival model (PSM), and 3 adopted a discrete event simulation (DES) model. The base-case results showed that a CDK4/6 inhibitor combined with an endocrine regimen was not cost-effective compared with an endocrine-alone regimen at each country's conventional willingness-to-pay (WTP) threshold. The uncertainty analyses included deterministic sensitivity analysis and probabilistic sensitivity analysis. The included studies are all high-quality cost-utility analyses, which can provide evidence to support health-related decision-makers and a methodological reference for the economic evaluation of other targeted drugs.
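The review's main outcome index, the incremental cost-effectiveness ratio (ICER), is simple arithmetic; a minimal sketch with made-up cost and QALY figures and a hypothetical willingness-to-pay threshold:

```python
def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    """ICER = (C_new - C_old) / (E_new - E_old): extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: CDK4/6 inhibitor + endocrine therapy vs. endocrine therapy alone.
# All figures are invented for illustration (costs in dollars, effects in QALYs).
value = icer(cost_new=180_000, cost_old=60_000, effect_new=3.0, effect_old=2.5)
wtp = 100_000  # hypothetical willingness-to-pay threshold per QALY

print(value, value <= wtp)  # ICER above WTP -> not cost-effective at this threshold
```

Comparing the ICER against a WTP threshold is exactly the judgment the reviewed base-case analyses report: an ICER above the country's conventional WTP value means the combination regimen is judged not economical.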
Biotechnology policies and regulations must be revised and updated to reflect the most recent advances in plantbreeding technology. New Plant Breeding Techniques(NPBT) such as gene editing have been applied to address...Biotechnology policies and regulations must be revised and updated to reflect the most recent advances in plantbreeding technology. New Plant Breeding Techniques(NPBT) such as gene editing have been applied to address the myriad of challenges in plant breeding, while the use of NPBT as emerging biotechnological tools raises legal and ethical concerns. This study aims to highlight how gene editing is operationalized in the existing literature and examine the critical issues of ethical and legal issues of gene editing for plant breeding. We carried out a systematic literature review(SLR) to provide the current states of ethical and legal discourses surrounding this topic. We also identified critical research priority areas and policy gaps that must be addressed when designing the future governance of gene editing in plant breeding.展开更多
Abstract: The presented research illustrates the applicability and productiveness of the systematic literature review methodology, a non-empirical methodology, in the geological sciences, particularly volcanology. The systematic literature review methodology is a replicable, rigorous, and transparent methodology for synthesizing existing literature to answer questions on a specific topic. The synthesis allows for knowledge consolidation, such as identifying knowledge gaps. In our illustration of this methodology, we focused on the expanding knowledge about the magma pathway at Mount Cameroon, one of Africa’s active volcanoes. Our synthesis of the relevant international geoscience research literature is based on a framework of knowledge about the magma pathway beneath a typical basaltic volcano. The framework has three primary components: magma supply, storage, and transport to erupting vents. Across these components is a total of twelve secondary components. The result is an overall understanding of the magma pathway at Mount Cameroon that was previously fragmented or non-existent. The remaining gaps in this understanding (such as in magma supply rates, timescales of chamber processes, and magma ascent rates) may be addressed in future research. Another key implication of the presented research lies in the proof of concept of the systematic literature review methodology as an applicable qualitative research methodology in the study of volcanoes.
Abstract: Systematic review and meta-analysis are techniques which attempt to associate the findings from similar studies and deliver quantitative summaries of the research literature [1]. A systematic review of the research literature identifies the common research methods, research designs, sample sizes, parameters, survey instruments, etc. used by a group of researchers. This study pursues that purpose by identifying the common research methodologies, dependent variables, sample sizes, moderators, and mediators used in technology-adoption studies that utilize the UTAUT2 model. This research collected 59 published articles and conducted descriptive analytics. The results revealed performance expectancy/perceived usefulness, trust, and habit as the best predictors of consumer behavioural intentions towards the adoption of mobile applications. Behavioural intention was the best predictor of use behaviour among the 57 articles selected. The mean sample size was 274, with a mean of 25 questionnaire items. SPSS and AMOS were the most common software packages used across the 57 studies; 32 of those studies used the UTAUT1 model, while 14 incorporated the UTAUT2 model. Two further promising predictors emerged: perceived risk on behavioural intention and habit on use behaviour.
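The descriptive analytics reported above (mean sample size, counts of studies per model) can be reproduced with a few lines of plain Python. The records below are invented placeholders standing in for the review's extraction sheet, chosen so the mean echoes the reported 274.

```python
from statistics import mean

# Hypothetical extraction sheet: one record per reviewed article (invented values).
articles = [
    {"model": "UTAUT1", "sample_size": 310, "items": 24},
    {"model": "UTAUT2", "sample_size": 265, "items": 28},
    {"model": "UTAUT1", "sample_size": 247, "items": 23},
]

# Descriptive summaries of the kind the review reports.
mean_sample = mean(a["sample_size"] for a in articles)

model_counts = {}
for a in articles:
    model_counts[a["model"]] = model_counts.get(a["model"], 0) + 1

print(round(mean_sample), model_counts)
```

With a real extraction sheet, the same two summaries give the mean sample size and the UTAUT1/UTAUT2 split reported in the abstract.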
Funding: funded by the Spanish Government Ministry of Economy and Competitiveness through the DEFINES Project (Grant No. TIN2016-80172-R) and the Ministry of Science and Innovation through the AVisSA Project (Grant No. PID2020-118345RBI00); supported by the Spanish Ministry of Education and Vocational Training under an FPU Fellowship (FPU17/03276).
Abstract: The exponential use of artificial intelligence (AI) to solve and automate complex tasks has catapulted its popularity, generating some challenges that need to be addressed. While AI is a powerful means to discover interesting patterns and obtain predictive models, the use of these algorithms comes with a great responsibility, as an incomplete or unbalanced set of training data or an improper interpretation of the models’ outcomes could result in misleading conclusions that ultimately could become very dangerous. For these reasons, it is important to rely on expert knowledge when applying these methods. However, not every user can count on this specific expertise; non-AI-expert users could also benefit from applying these powerful algorithms to their domain problems, but they need basic guidelines to obtain the most out of AI models. The goal of this work is to present a systematic review of the literature to analyze studies whose outcomes are explainable rules and heuristics for selecting suitable AI algorithms given a set of input features. The systematic review follows the methodology proposed by Kitchenham and other authors in the field of software engineering. As a result, 9 papers that tackle AI algorithm recommendation through tangible and traceable rules and heuristics were collected. The reduced number of retrieved papers suggests a lack of reporting of explicit rules and heuristics when testing the suitability and performance of AI algorithms.
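A rule set of the kind this review searched for, i.e. tangible, traceable heuristics mapping input features to a suitable algorithm, can be sketched as a plain branching function. The thresholds and recommendations below are illustrative assumptions, not rules extracted from the nine collected papers.

```python
def recommend_algorithm(n_samples: int, n_features: int, labeled: bool) -> str:
    """Toy rule-based recommender: each branch is an explicit, traceable heuristic."""
    if not labeled:
        return "k-means"            # no labels -> clustering
    if n_samples < 1_000:
        return "decision tree"      # small data -> simple, interpretable model
    if n_features > n_samples:
        return "linear SVM"         # wide data -> margin-based linear model
    return "gradient boosting"      # large tabular data -> ensemble method

print(recommend_algorithm(500, 20, labeled=True))  # decision tree
```

Because every recommendation is a readable branch, a non-AI-expert user can trace exactly why a given algorithm was suggested, which is the explainability property the reviewed heuristics aim for.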
Abstract: The growing global requirement for food and the need for sustainable farming in an era of a changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models effectively deal with such challenges. This research paper comprehensively analyses recent advancements in crop yield prediction from January 2016 to March 2024. In addition, it analyses the effectiveness of various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield. The total number of articles reviewed for crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL), and DL-based prediction models and input parameter selection is 125. We conducted the research by setting up five objectives and discussing them after analyzing the selected research papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used for estimating model performance. We also discuss the ethical and social impacts of AI on agriculture. Although various approaches presented in the scientific literature have delivered impressive predictions, they are complicated by intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Therefore, thorough research is required to deal with the challenges of predicting agricultural output.
Abstract: The integration of environmental, social, and governance (ESG) principles has become a pivotal factor in shaping sustainable and responsible corporate practices. The present study investigates the integration of ESG principles within corporate governance models in Asia-Pacific countries, focusing on socialization. By examining the governance culture, legal frameworks, and corporate practices in these representative countries, the paper delineates a strategic framework for embedding social governance into corporate strategies. The study introduces a Cultural, Economic, Legal, and Political (CELP) framework to assess corporate social governance, investigating the correlation between business practices and social changes. Through a systematic literature review and detailed thematic analysis, this paper aims to offer actionable insights and recommendations, guiding corporations in their transition towards more sustainable and socially responsible business practices.
Abstract: Artificial intelligence (AI) has garnered significant interest within the educational domain over the past decade, promising to revolutionise teaching and learning. This paper provides a comprehensive overview of systematic reviews conducted from 2010 to 2023 on the implementation of AI in K-12 education. By synthesising findings from ten selected systematic reviews, this study explores the multifaceted opportunities and challenges posed by AI in education. The analysis reveals several key findings: AI’s potential to personalise learning, enhance student motivation, and improve teaching efficiency are highlighted as major strengths. However, the study also identifies critical concerns, including teacher resistance, high implementation costs, ethical considerations, and the need for extensive teacher training. These findings represent the most significant insights from the analysis, while additional findings further underscore the complexity and scope of AI integration in educational settings. The study employs a SWOT analysis to summarise these insights, identifying key areas for future research and policy development. This review aims to guide educators, policymakers, and researchers in effectively leveraging AI to enhance educational outcomes while addressing its inherent challenges.
Funding: supported in part by the Big Data Analytics Laboratory (BDALAB) at the Institute of Business Administration under a research grant approved by the Higher Education Commission of Pakistan (www.hec.gov.pk), and by the Darbi company (www.darbi.io).
Abstract: This paper focuses on facilitating state-of-the-art applications of big data analytics (BDA) architectures and infrastructures in the telecommunications (telecom) industrial sector. Telecom companies are dealing with terabytes to petabytes of data on a daily basis. IoT applications in telecom are further contributing to this data deluge. Recent advances in BDA have exposed new opportunities to get actionable insights from telecom big data. These benefits and the fast-changing BDA technology landscape make it important to investigate existing BDA applications in the telecom sector. To this end, we initially determine published research on BDA applications in telecom through a systematic literature review, through which we filter 38 articles and categorize them as frameworks, use cases, literature reviews, white papers, and experimental validations. We also discuss the benefits and challenges mentioned in these articles. We find that the experiments are all proofs of concept (POC) on a severely limited BDA technology stack (as compared to the available technology stack); i.e., we did not find any work focusing on a full-fledged BDA implementation in an operational telecom environment. To facilitate such applications at the research level, we propose a state-of-the-art lambda architecture for BDA pipeline implementation (called Lambda Tel), based completely on open-source BDA technologies and the standard Python language, along with relevant guidelines. We discovered only one research paper which presented a relatively limited lambda architecture using the proprietary AWS cloud infrastructure. We believe Lambda Tel presents a clear roadmap for telecom industry practitioners to implement and enhance BDA applications in their enterprises.
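The lambda architecture mentioned above pairs a batch layer computed over the full dataset with a speed layer over newly arrived records, merged at query time by a serving layer. The sketch below is a minimal plain-Python illustration of that three-layer pattern with invented call-record fields; it is not the actual Lambda Tel stack, which the paper builds from open-source BDA technologies.

```python
# Minimal lambda-architecture sketch: batch + speed layers merged at query time.
batch_store = {}  # precomputed views over the full (historical) dataset
speed_store = {}  # incremental views over records not yet batch-processed

def batch_layer(records):
    """Recompute per-subscriber call totals from the full history."""
    batch_store.clear()
    for r in records:
        batch_store[r["subscriber"]] = batch_store.get(r["subscriber"], 0) + r["minutes"]

def speed_layer(record):
    """Fold a newly arrived record into the real-time view."""
    speed_store[record["subscriber"]] = speed_store.get(record["subscriber"], 0) + record["minutes"]

def serving_layer(subscriber):
    """Merge batch and real-time views to answer a query."""
    return batch_store.get(subscriber, 0) + speed_store.get(subscriber, 0)

batch_layer([{"subscriber": "A", "minutes": 30}, {"subscriber": "A", "minutes": 12}])
speed_layer({"subscriber": "A", "minutes": 5})
print(serving_layer("A"))  # 47
```

In a production pipeline the two stores would be distributed systems (e.g. a batch engine feeding a serving database, with a stream processor as the speed layer), but the merge-at-query-time logic is the same.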
Funding: supported by two research grants provided by the Karachi Institute of Economics and Technology (KIET) and the Big Data Analytics Laboratory at the Institute of Business Administration (IBA, Karachi).
Abstract: The advent of healthcare information management systems (HIMSs) continues to produce large volumes of healthcare data for patient care and for compliance and regulatory requirements at a global scale. Analysis of this big data allows for boundless potential outcomes for discovering knowledge. Big data analytics (BDA) in healthcare can, for instance, help determine causes of diseases, generate effective diagnoses, enhance QoS guarantees by increasing the efficiency of healthcare delivery and the effectiveness and viability of treatments, generate accurate predictions of readmissions, enhance clinical care, and pinpoint opportunities for cost savings. However, BDA implementations in any domain are generally complicated and resource-intensive, with a high failure rate and no roadmap or success strategies to guide practitioners. In this paper, we present a comprehensive roadmap to derive insights from BDA in the healthcare (patient care) domain, based on the results of a systematic literature review. We initially determine big data characteristics for healthcare and then review BDA applications to healthcare in academic research, focusing particularly on NoSQL databases. We also identify the limitations and challenges of these applications and justify the potential of NoSQL databases to address these challenges and further enhance BDA healthcare research. We then propose and describe a state-of-the-art BDA architecture called Med-BDA for the healthcare domain, which solves all current BDA challenges and is based on the latest zeta big data paradigm. We also present success strategies to ensure the working of Med-BDA, along with outlining the major benefits of BDA applications to healthcare. Finally, we compare our work with other related literature reviews across twelve hallmark features to justify the novelty and importance of our work. The aforementioned contributions of our work are collectively unique and clearly present a roadmap for clinical administrators, practitioners and professionals to successfully implement BDA initiatives in their organizations.
Abstract: BACKGROUND Gastroenteropancreatic neuroendocrine tumours (GEP-NETs) are slow-growing cancers that arise from diffuse endocrine cells in the gastrointestinal tract (GI-NETs) or the pancreas (P-NETs). They are relatively uncommon, accounting for 2% of all gastrointestinal malignancies. The usual treatment options in advanced GEP-NET patients with metastatic disease include chemotherapy, biological therapies, and peptide receptor radionuclide therapy. Understanding the impact of treatment on GEP-NET patients is paramount given the nature of the disease. Health-related quality of life (HRQoL) is increasingly important as a concept reflecting the patients’ perspective in conjunction with the disease presentation, severity and treatment. AIM To conduct a systematic literature review to identify literature reporting HRQoL data in patients with GEP-NETs between January 1985 and November 2019. METHODS The PRISMA guiding principles were applied. MEDLINE, Embase and the Cochrane Library were searched. Data extracted from the publications included type of study, patient population data (mid-gut/hind-gut/GI-NET/P-NET), sample size, intervention/comparators, HRQoL instruments, average and data spread of overall and sub-scores, and follow-up time for data collection. RESULTS Forty-three publications met the inclusion criteria. The heterogeneous nature of the different study populations was evident; the percentage of female participants ranged between 30%-60%, whilst average age ranged from 53.8 to 67.0 years. Eight studies investigated GI-NET patients only, six studies focused exclusively on P-NET patients, and the remaining studies involved both patient populations or did not report the location of the primary tumour. The most commonly used instrument was the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-C30 (n=28), with consistent results across studies; the GI-NET-specific module Quality of Life Questionnaire-GINET21 was used in six of these studies. A number of randomised trials demonstrated no HRQoL changes between active treatment and placebo arms. The Phase III NETTER-1 study provides the best data available for advanced GEP-NET patients; it shows that peptide receptor radionuclide therapy can significantly improve GEP-NET patients’ HRQoL. CONCLUSION HRQoL instruments offer a means to monitor patients’ general disease condition, disease progression and their physical and mental well-being. Instruments including the commonly used European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-C30 and GINET21, however, lack validation and a defined minimal clinically important difference specifically for GI-NET and P-NET patients.
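The QLQ-C30 scores discussed above are linearly transformed to a 0-100 scale from the mean of the item responses. The sketch below follows the published EORTC scoring approach as I recall it, so it should be verified against the official scoring manual before any real use; the item responses are invented.

```python
def qlq_scale_score(item_responses, item_range=3, functional=True):
    """Linear transformation of a QLQ-style scale to 0-100.

    Items are answered on a 1..4 scale (range = 3). Functional scales are
    reversed so that higher scores mean better functioning; symptom scales
    keep higher = worse. Check the official EORTC scoring manual before use.
    """
    raw = sum(item_responses) / len(item_responses)  # raw score = item mean
    if functional:
        return (1 - (raw - 1) / item_range) * 100
    return ((raw - 1) / item_range) * 100

# Physical-functioning items all answered "1" (not at all limited) -> best score.
print(qlq_scale_score([1, 1, 1, 1, 1]))                              # 100.0
# Fatigue-type symptom items answered 3, 3, 2 -> moderate symptom burden.
print(round(qlq_scale_score([3, 3, 2], functional=False), 1))
```

The lack of a defined minimal clinically important difference noted in the conclusion means that even a correctly computed score change is hard to interpret for GI-NET and P-NET patients specifically.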
Abstract: Locus of control (LOC) theory, which was developed by Rotter, suggests that there are two main ways people attribute the failure or success of their life events: an external locus of control and an internal one. The way that individuals act is determined by their expectations of their specific behaviors and the value that they attach to these expectations. For instance, people who fit in the internal category are more likely to attribute their life events to their own behaviors, skills, and attitudes, while people who fit in the external category tend to attribute their acts to fate, chance, and other exterior factors that are out of their control. The aim of this systematic literature review was to define the fundamental concept of LOC theory, to investigate major findings of the theory relating LOC to procrastination, job satisfaction, and performance, and lastly, to discuss the practical use of the theory in the organizational context.
Abstract: Improved rice varieties (IRVs) have played a significant role in establishing food security and improving livelihoods in the Global South since their introduction in the 1960s. However, the adoption of new IRVs has remained relatively low. This low adoption poses a challenge to rice-producing and rice-consuming countries as they are increasingly threatened by production shortages, malnutrition, and poor rice quality. Many empirical studies have attempted to identify the determinants influencing the adoption of IRVs by distinguishing the characteristics of adopters and non-adopters. This review showed a consensus on the important determinants influencing the adoption of IRVs in the Global South. Findings synthesized from 99 studies suggested that the variables examined most extensively (farm size, education, information access and farm location) are not necessarily the most important determinants of adoption when undertaking a weighted analysis. Terrain, source of seed and technology-related attributes (perceived yield, maturity, ease of use, marketability and technical efficiency) are more important determinants of adoption, with determinants changing according to adoption type (probability or intensity of adoption), variety type and region. The recommendations for future adoption studies include: incorporating more technology-specific variables, increasing research on overlooked regions and variety types, shifting away from predominantly static analysis by capturing the dynamics of the adoption process, and considering potential biases in analyses. This review will facilitate the development of targeted interventions and policies that promote IRV adoption in the Global South.
Funding: funded by the Brazilian National Council for Scientific and Technological Development (CNPq), under research grant number 408186/2021-6.
Abstract: Autonomous vehicle (AV) technology has the potential to significantly improve the safety and efficiency of the transportation and logistics industry. Full-scale AV testing is limited by time, space, and cost, while simulation-based testing often lacks the necessary accuracy of AV and environmental modeling. In recent years, several initiatives have emerged to test autonomous software and hardware on scaled vehicles. This systematic literature review provides an overview of the literature surrounding small-scale self-driving cars, summarizing the autonomous platforms currently deployed and focusing on the software and hardware developments in this field. Studies published in English-language journals or conference papers that present small-scale testing of self-driving cars were included. The Web of Science, Scopus, Springer Link, Wiley, ACM Digital Library, and TRID databases were used for the literature search. The systematic literature search found 38 eligible studies. Research gaps in the reviewed papers were identified to provide guidance for future research. Some key takeaways emerging from this manuscript are: (i) there is a need to improve the models and neural network architectures used in autonomous driving systems, as most papers present only preliminary results; (ii) enlarging datasets and sharing databases can help in developing more reliable control policies and in reducing bias and variance in the training process; (iii) using small-scale vehicles to ensure safety is a major benefit, and incorporating data about unsafe driving behaviors and infrastructure problems can improve the accuracy of predictive models.
Abstract: Innovation capabilities (ICs) represent a crucial source of competitive advantage for firms. However, the literature on ICs is extensive, leading to a diverse understanding of their nature and measurement. A notable gap exists in delineating the dimensions constituting ICs. This article aims to address this gap by identifying and pinpointing the various dimensions of ICs through a systematic literature review (SLR). The initial step involves identifying the diverse dimensions used in ICs, providing a distinctive insight for assessing their metrics. Notably, this SLR stands out as the only comprehensive analysis of the various IC dimensions, organizing them coherently. Examining 103 articles from the Web of Science and Scopus databases spanning 2001 to 2022, the results reveal an amalgam of scales and associated approaches for IC measurement. This study contributes to the literature by systematically identifying and analyzing the main dimensions employed by researchers to measure ICs. Additionally, it highlights the foundational theoretical approaches of the identified studies. In practical terms, the study consolidates and presents the identified dimensions and metrics in integrative tables, offering researchers and companies valuable insights into the diverse innovation paths that impact performance.
Funding: funded by a 2020 National Social Science Fund grant (grant number: 20BTQ073) and the Special Fund for the “Community Medicine and Health Management Research Project” of the Shanghai Society of Integrated Traditional Chinese and Western Medicine (grant number: 2023SQ19).
Abstract: This study comprehensively analyzes the status, characteristics, focal points, and evolving trends of global research on “stroke risk analysis” over the past four years (2020–2023), aiming to provide insights for directing future research endeavors. Using the Newcastle-Ottawa Scale, 63 high-quality research papers were selected and subjected to a systematic literature review. In terms of research methods, stroke risk analysis research has evolved from clinical trials (e.g., establishing control groups, using authoritative scales) towards statistical and data analysis methods (e.g., decision tree analysis). Regarding research factors, early studies primarily focused on pathological factors associated with hemorrhagic and ischemic stroke, such as hypertension, hyperlipidemia, and diabetes. Research from the past two years indicates a shift towards emerging factors, including temperature conditions, air quality, and Corona Virus Disease 2019 (COVID-19). In terms of application domains, stroke research covers a broad range of fields but mainly focuses on exploring risk factors, interventions during the diagnosis and treatment stages, and rehabilitation, with clinical diagnosis, treatment, and drug intervention studies being predominant. While the research landscape is becoming increasingly diversified and comprehensive, there remains a need for more comprehensive and in-depth studies on novel topics, as well as integrated applications of research methods, presenting ample opportunities for exploring dependent variables in future stroke research.
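The decision-tree analysis that recent stroke-risk studies favor rests on impurity-minimizing splits over risk factors. The sketch below hand-rolls a single Gini-based split on invented blood-pressure data to make the mechanism concrete; real studies would use a full tree learner over many factors.

```python
def gini(labels):
    """Gini impurity of a binary (0/1) label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Find the threshold on one risk factor that minimizes weighted Gini impurity."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

# Toy data: systolic blood pressure (mmHg) vs. stroke occurrence (invented values).
bp = [118, 124, 131, 145, 152, 160]
stroke = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(bp, stroke)
print(threshold, impurity)  # 131 0.0 -- a clean split at 131 mmHg
```

A tree learner simply repeats this split search recursively on each resulting subgroup, which is why its output reads as interpretable risk rules.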
Funding: supported by grants from the National Natural Science Foundation of China (Nos. 71701168 and 71701034).
Abstract: Blockchain is considered by many to be a disruptive core technology. Although many researchers have realized the importance of blockchain, research on blockchain is still in its infancy. Consequently, this study reviews the current academic research on blockchain, especially in the subject area of business and economics. Based on a systematic review of the literature retrieved from the Web of Science service, we explore the top-cited articles, the most productive countries, and the most common keywords. Additionally, we conduct a clustering analysis and identify the following five research themes: “economic benefit,” “blockchain technology,” “initial coin offerings,” “fintech revolution,” and “sharing economy.” Recommendations on future research directions and practical applications are also provided in this paper.
Funding: This research was conducted in the framework of the EXHAUSTION project. The project has received funding from the European Union’s Horizon 2020 Research and Innovation Program (Grant No. 820655).
Abstract: The negative cardiorespiratory health outcomes due to extreme temperatures and air pollution are widely studied, but knowledge about the effectiveness of the implementation of adaptive mechanisms remains unclear. The objective of this paper is to explore the evidence on adaptive mechanisms for cardiorespiratory diseases regarding extreme temperatures and air pollution by comparing the results of two systematic literature review (SLR) processes sharing the same initial research question but led by two research groups with different academic backgrounds working in the same multidisciplinary team. We start by presenting the methodological procedures and the results of the SLR conducted by the research group composed mainly of researchers with a background in geography (named the geographical strategy). We then compare these results with those achieved in the SLR led by the research group with a background in epidemiology (named the epidemiological strategy). Both SLRs were developed under the EU Horizon 2020 Project “EXHAUSTION”. The results showed: 1) a lack of evidence regarding the effectiveness of adaptation measures, namely due to the limited number of studies on the topic, the preponderance of studies dedicated to heat extremes, and the imbalance between different adaptation measures; 2) that the choice of search terms in the geographical strategy, despite being more comprehensive at first sight, ended up retrieving fewer results, but it brought in new studies that can complement the results of the epidemiological strategy. Therefore, it is suggested that, to strengthen the empirical evidence on the effectiveness of adaptation measures, powerful multidisciplinary teams should work together in the preparation of SLRs on topics of great complexity, such as the one presented in this paper.
Abstract: What is open innovation? There are different definitions of open innovation, depending, at least, on three parameters: the source, ownership, or users of the knowledge linked to innovation. The aim of the paper is to conduct a systematic literature review, to map open innovation studies, and to re-conceptualize openness according to two dimensions: the degree of technology convergence and the ontology of openness. In particular, we propose a classification of open innovation based on the distinction between the originator/developer of the knowledge and the user. Users are a ubiquitous category, because they can be originators as well as customers of the innovation itself. Therefore, we point out that there are three types of open innovation, whose degree of openness is defined according to an ontological dimension: at the users’ level, at an industry level, and among different fields or industries. A firm’s structure affects its propensity to adopt open innovation, and the type of innovation itself. Finally, we identify another literature gap: the relationship between the open innovation model and grand challenges. Even if open innovation seems to be ideally connected to grand challenges and many industries actually adopt this model, there seems to be a gap in the literature. Therefore, we propose a conceptual model for future research.
Abstract: Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations’ networks with Honeypots and effectively analyzing attack traffic for detection of zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and/or methods, which is required for effective visual perception of the useful attack information hidden in attack data. Additionally, Honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we reviewed visualization practices and methods commonly used in the discovery and communication of attack patterns based on Honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 papers having a high impact. Most Honeypot papers conducted summary statistics of Honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed Honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, and commonly visualized attack data using scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results. While a significant number of automated visualization tools have incorporated visualization standards by default, the construction of effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally, an interdisciplinary collaboration with peers.
We, therefore, suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
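The box plots the review found lacking are built from a five-number summary, which is simple to compute directly from traffic metrics. The packet sizes below are invented honeypot traffic values used only to illustrate the computation.

```python
from statistics import median

def five_number_summary(values):
    """Five-number summary underlying a box plot: min, Q1, median, Q3, max."""
    s = sorted(values)
    mid = len(s) // 2
    lower = s[:mid]                              # values below the median
    upper = s[mid + 1:] if len(s) % 2 else s[mid:]  # values above the median
    return min(s), median(lower), median(s), median(upper), max(s)

# Hypothetical honeypot packet sizes (bytes), including an outsized attack payload.
packet_sizes = [60, 64, 64, 128, 512, 514, 1500]
print(five_number_summary(packet_sizes))
```

Unlike a bar or pie chart of the same data, the spread between Q3 and the maximum immediately flags the outsized payload, which is exactly the kind of critical evaluation the review argues box plots enable.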
Funding: the cooperative project of Shenyang Pharmaceutical University and Zhonglian Medicine, “Research on the Marketing Strategy of Imported Drugs in China” (2020-0-4-048).
Abstract: Objective To review the domestic and foreign economic studies on CDK4/6 inhibitors in the first-line or second-line treatment of HR+/HER2- advanced breast cancer, and to analyze the main methodologies and research results. Methods A systematic literature review was used to search the PubMed, EMBASE, Cochrane Library, CNKI, CBM, and Wanfang databases. The incremental cost-effectiveness ratio was taken as the main outcome index, and all pharmacoeconomic evaluations with CDK4/6 inhibitors (such as Palbociclib, Ribociclib, and Abemaciclib) as intervention measures were included. The quality of the included articles was evaluated according to the Quality of Health Economic Studies Instrument, and the included literature was then analyzed. Results and Conclusion A total of 16 pharmacoeconomic evaluation studies were included, mainly conducted from the perspective of national healthcare systems or third-party payers. Only 2 studies focused on second-line treatment; the remaining studies addressed first-line treatment. In terms of model structure, 7 studies adopted the Markov model, 6 studies adopted the PSM (partitioned survival) model, and 3 studies adopted the DES (discrete event simulation) model. The base-case analysis results showed that a CDK4/6 inhibitor combined with an endocrine regimen was not economical compared with an endocrine-alone regimen when the threshold was the conventional willingness-to-pay (WTP) value of each country. The uncertainty analyses included deterministic sensitivity analysis and probabilistic sensitivity analysis. The included studies are all cost-utility analyses with high quality-evaluation scores, which can provide evidence support for health-related decision-makers, as well as methodological reference for the economic evaluation of other targeted drugs.
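The incremental cost-effectiveness ratio (ICER) used above as the main outcome index is the extra cost per extra unit of health effect, judged against a willingness-to-pay threshold. The costs, QALY values, and threshold below are invented for illustration; they are not figures from the included studies.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def cost_effective(icer_value, wtp_threshold):
    """An intervention is deemed cost-effective if its ICER is at or below the WTP threshold."""
    return icer_value <= wtp_threshold

# Hypothetical figures: CDK4/6 inhibitor + endocrine therapy vs. endocrine therapy alone.
ratio = icer(cost_new=180_000, effect_new=3.25, cost_old=60_000, effect_old=2.25)
print(ratio, cost_effective(ratio, wtp_threshold=100_000))  # 120000.0 False
```

With these invented numbers the combination costs 120,000 per QALY gained, above the assumed 100,000 WTP threshold, which mirrors the "not economical at conventional WTP values" pattern the review reports.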
Funding: supported by the Ministry of Higher Education (MoHE) Malaysia under the Fundamental Research Grant Scheme (No. FRGS/1/2021/SS0/UM/02/6) and the Universiti Malaya Research University Grant (No. RU004A-2020).
Abstract: Biotechnology policies and regulations must be revised and updated to reflect the most recent advances in plant breeding technology. New Plant Breeding Techniques (NPBT) such as gene editing have been applied to address the myriad challenges in plant breeding, while the use of NPBT as emerging biotechnological tools raises legal and ethical concerns. This study aims to highlight how gene editing is operationalized in the existing literature and to examine the critical ethical and legal issues of gene editing for plant breeding. We carried out a systematic literature review (SLR) to capture the current state of the ethical and legal discourses surrounding this topic. We also identified critical research priority areas and policy gaps that must be addressed when designing the future governance of gene editing in plant breeding.