This paper proposes teaching reforms in communication engineering majors, emphasizing the implementation of digital and adaptive teaching methodologies, integrating emerging technologies, breaking free from the constraints of traditional education, and fostering high-caliber talents. The reform measures encompass fundamental data collection, recognition of individual characteristics, recommendation of adaptive learning resources, process-oriented teaching management, adaptive student guidance and early warning systems, personalized evaluation, and the construction of an integrated service platform. These measures, when combined, form a comprehensive system that is expected to enhance teaching quality and efficiency, and facilitate student development.
Digital Twin (DT) supports real-time analysis and provides a reliable simulation platform in the Internet of Things (IoT). The creation and application of DT hinge on large amounts of data, which puts pressure on the application of Artificial Intelligence (AI) for DT descriptions and intelligent decision-making. Federated Learning (FL) is a cutting-edge technology that enables geographically dispersed devices to collaboratively train a shared global model locally rather than relying on a data center to perform model training. Therefore, DT can benefit by combining with FL, successfully solving the "data island" problem in traditional AI. However, FL still faces serious challenges, such as single-point failures, poisoning attacks, and the lack of effective incentive mechanisms. Before the successful deployment of DT, we should tackle these issues in FL. Researchers from industry and academia have recognized the potential of introducing Blockchain Technology (BT) into FL to overcome these challenges: BT, acting as a distributed and immutable ledger, can store data in a secure, traceable, and trusted manner. However, to the best of our knowledge, a comprehensive literature review on this topic is still missing. In this paper, we review existing works on blockchain-enabled FL and visualize their prospects with DT. To this end, we first propose evaluation requirements with respect to security, fault-tolerance, fairness, efficiency, cost-saving, profitability, and support for heterogeneity. Then, we classify the existing literature according to the functionalities of BT in FL and analyze their advantages and disadvantages based on the proposed evaluation requirements. Finally, we discuss open problems in the existing literature and the future of DT supported by blockchain-enabled FL, based on which we further propose some directions for future research.
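The FL mechanism these surveyed works build on can be illustrated with a minimal federated-averaging sketch, in which a server aggregates locally trained model weights and raw data never leaves the clients. The toy linear model, learning rate, and data below are illustrative assumptions, not taken from any surveyed paper:

```python
# Minimal federated-averaging (FedAvg) sketch: each client trains locally
# and only model weights, never raw data, are sent to the aggregator.
# The one-parameter linear model and all numbers are toy assumptions.

def local_update(weights, data, lr=0.01, epochs=5):
    """One client's local training: gradient descent on y ~ w*x."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: data-size-weighted average of weights."""
    total = sum(client_sizes)
    return sum(w * n for w, n in zip(client_weights, client_sizes)) / total

# Two clients whose private data follow the same trend y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]
global_w = 0.0
for _ in range(20):  # communication rounds
    updates = [local_update(global_w, d) for d in clients]
    global_w = fed_avg(updates, [len(d) for d in clients])

print(round(global_w, 2))  # → 2.0 (the shared trend, learned without pooling data)
```

The blockchain role discussed in the review would replace the central `fed_avg` server with an on-chain, tamper-evident aggregation step.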
In the rapidly evolving landscape of today's digital economy, Financial Technology (Fintech) emerges as a transformative force, propelled by the dynamic synergy between Artificial Intelligence (AI) and Algorithmic Trading. Our in-depth investigation delves into the intricacies of merging Multi-Agent Reinforcement Learning (MARL) and Explainable AI (XAI) within Fintech, aiming to refine Algorithmic Trading strategies. Through meticulous examination, we uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm, employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions. These AI-infused Fintech platforms harness collective intelligence to unearth trends, mitigate risks, and provide tailored financial guidance, fostering benefits for individuals and enterprises navigating the digital landscape. Our research holds the potential to revolutionize finance, opening doors to fresh avenues for investment and asset management in the digital age. Additionally, our statistical evaluation yields encouraging results, with metrics such as Accuracy = 0.85, Precision = 0.88, and F1 Score = 0.86, reaffirming the efficacy of our approach within Fintech and emphasizing its reliability and innovative prowess.
The Autonomous Underwater Glider (AUG) is a kind of prevailing underwater intelligent internet vehicle and occupies a dominant position in industrial applications, in which path planning is an essential problem. Due to the complexity and variability of the ocean, accurate environment modeling and flexible path planning algorithms are pivotal challenges. Traditional models mainly utilize mathematical functions, which are not complete and reliable. Most existing path planning algorithms depend on the environment and lack flexibility. To overcome these challenges, we propose a path planning system for underwater intelligent internet vehicles. It applies digital twins and sensor data to map the real ocean environment to a virtual digital space, which provides a comprehensive and reliable environment for path simulation. We design a value-based reinforcement learning path planning algorithm and explore the optimal network structure parameters. The path simulation is controlled by a closed-loop model integrated into the terminal vehicle through edge computing. The integration of state input enriches the learning of neural networks and helps to improve generalization and flexibility. The task-related reward function promotes the rapid convergence of the training. The experimental results prove that our reinforcement learning based path planning algorithm has great flexibility and can effectively adapt to a variety of different ocean conditions.
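The value-based idea behind such a planner can be sketched with tabular Q-learning on a toy grid standing in for the ocean environment. The paper's actual algorithm uses deep networks with digital-twin state input; the grid, reward, and hyperparameters below are illustrative assumptions only:

```python
import random

# Toy value-based RL: tabular Q-learning on a 4x4 grid "ocean",
# start (0,0), goal (3,3). Grid, rewards, and hyperparameters are
# illustrative assumptions, not the paper's setup.
random.seed(0)
N, GOAL = 4, (3, 3)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up
Q = {((r, c), a): 0.0 for r in range(N) for c in range(N) for a in range(4)}

def step(s, a):
    dr, dc = ACTIONS[a]
    ns = (min(max(s[0] + dr, 0), N - 1), min(max(s[1] + dc, 0), N - 1))
    return ns, (10.0 if ns == GOAL else -1.0)  # task-related reward

for _ in range(500):  # training episodes with epsilon-greedy exploration
    s = (0, 0)
    while s != GOAL:
        a = random.randrange(4) if random.random() < 0.2 else \
            max(range(4), key=lambda x: Q[(s, x)])
        ns, rew = step(s, a)
        target = rew + 0.9 * max(Q[(ns, x)] for x in range(4))
        Q[(s, a)] += 0.5 * (target - Q[(s, a)])  # TD update
        s = ns

# Greedy rollout follows the learned values to the goal.
s, path = (0, 0), [(0, 0)]
while s != GOAL and len(path) < 20:
    s, _ = step(s, max(range(4), key=lambda x: Q[(s, x)]))
    path.append(s)
print(len(path) - 1)  # shortest-path length (6) once the values have converged
```

The paper replaces the Q table with a neural network fed by digital-twin state, but the update rule is the same temporal-difference target.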
The rapid development of emerging technologies, such as edge intelligence and digital twins, has added momentum towards the development of the Industrial Internet of Things (IIoT). However, the massive amount of data generated by the IIoT, coupled with heterogeneous computation capacity across IIoT devices and users' data privacy concerns, has posed challenges towards achieving industrial edge intelligence (IEI). To achieve IEI, in this paper, we propose a semi-federated learning framework where a portion of the data with higher privacy is kept locally and a portion of the less private data can be potentially uploaded to the edge server. In addition, we leverage digital twins to overcome the problem of computation capacity heterogeneity of IIoT devices through the mapping of physical entities. We formulate a synchronization latency minimization problem which jointly optimizes edge association and the proportion of uploaded non-private data. As the joint problem is NP-hard and combinatorial, and taking into account the reality of large-scale device training, we develop a multi-agent hybrid action deep reinforcement learning (DRL) algorithm to find the optimal solution. Simulation results show that our proposed DRL algorithm can reduce latency and achieve better convergence performance for semi-federated learning compared to benchmark algorithms.
Embracing software product lines (SPLs) is pivotal in the dynamic landscape of contemporary software development. However, the flexibility and global distribution inherent in modern systems pose significant challenges to managing SPL variability, underscoring the critical importance of robust cybersecurity measures. This paper advocates for leveraging machine learning (ML) to address variability management issues and fortify the security of SPLs. In the context of the broader special issue theme on innovative cybersecurity approaches, our proposed ML-based framework offers an interdisciplinary perspective, blending insights from computing, social sciences, and business. Specifically, it employs ML for demand analysis, dynamic feature extraction, and enhanced feature selection in distributed settings, contributing to cyber-resilient ecosystems. Our experiments demonstrate the framework's superiority, emphasizing its potential to boost productivity and security in SPLs. As digital threats evolve, this research catalyzes interdisciplinary collaborations, aligning with the special issue's goal of breaking down academic barriers to strengthen digital ecosystems against sophisticated attacks while upholding ethics, privacy, and human values.
With the widespread use of lithium-ion batteries in electric vehicles, energy storage, and mobile terminals, there is an urgent need to develop cathode materials with specific properties. However, existing material control synthesis routes based on repetitive experiments are often costly and inefficient, which is unsuitable for the broader application of novel materials. The development of machine learning and its combination with materials design offers a potential pathway for optimizing materials. Here, we present a design synthesis paradigm for developing high-energy Ni-rich cathodes with thermal/kinetic simulation and propose a coupled image-morphology machine learning model. The paradigm can accurately predict the reaction conditions required for synthesizing cathode precursors with specific morphologies, helping to shorten the experimental duration and reduce costs. After the model-guided design synthesis, cathode materials with different morphological characteristics can be obtained, and the best shows a high discharge capacity of 206 mAh g^(−1) at 0.1 C and 83% capacity retention after 200 cycles. This work provides guidance for designing cathode materials for lithium-ion batteries, which may point the way to a fast and cost-effective direction for controlling the morphology of all types of particles.
A general prediction model for seven heavy metals was established using the heavy metal contents of 207 soil samples measured by a portable X-ray fluorescence spectrometer (pXRF) and six environmental factors as model correction coefficients. The eXtreme Gradient Boosting (XGBoost) model was used to fit the relationship between the content of heavy metals and environmental characteristics to evaluate the soil ecological risk of the smelting site. The results demonstrated that the generalized prediction model developed for Pb, Cd, and As was highly accurate, with fitted coefficient (R^2) values of 0.911, 0.950, and 0.835, respectively. Topsoil presented the highest ecological risk, and high potential ecological risk existed at some positions at different depths due to the high mobility of Cd. Generally, the application of machine learning significantly increased the accuracy of pXRF measurements and identified key environmental factors. The adapted potential ecological risk assessment emphasized the need to focus on Pb, Cd, and As in future site remediation efforts.
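The gradient-boosting principle behind XGBoost can be illustrated with a bare-bones ensemble of regression stumps, each fit to the residuals of the ensemble so far. This is a sketch of the idea only; the study itself used the XGBoost library with real environmental covariates, and the data below are invented:

```python
# Minimal gradient boosting for regression: each stump fits the
# residuals of the current ensemble, the core idea XGBoost applies
# at scale. Data and hyperparameters are invented for illustration.

def fit_stump(xs, residuals):
    """Best single-split stump minimising squared error."""
    best = None
    for thr in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= thr]
        right = [r for x, r in zip(xs, residuals) if x > thr]
        if not left or not right:
            continue
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lv) ** 2 for r in left) + sum((r - rv) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, thr, lv, rv)
    _, thr, lv, rv = best
    return lambda x: lv if x <= thr else rv

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [3.0, 3.2, 7.9, 8.1, 12.0, 12.2]   # e.g. a metal content vs. one covariate
ensemble, lr = [], 0.5
pred = [0.0] * len(xs)
for _ in range(50):  # boosting rounds with shrinkage lr
    resid = [y - p for y, p in zip(ys, pred)]
    stump = fit_stump(xs, resid)
    ensemble.append(stump)
    pred = [p + lr * stump(x) for p, x in zip(pred, xs)]

mse = sum((y - p) ** 2 for y, p in zip(ys, pred)) / len(ys)
print(round(mse, 4))  # small after 50 rounds: the ensemble fits the trend
```

XGBoost adds regularised tree learning and second-order gradients, but the residual-fitting loop above is the same principle.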
Limited by battery and computing resources, the computation-intensive tasks generated by Internet of Things (IoT) devices cannot all be processed by the devices themselves. Mobile edge computing (MEC) is a suitable solution for this problem, as the generated tasks can be offloaded from IoT devices to MEC. In this paper, we study the problem of dynamic task offloading for digital twin-empowered MEC. Digital twin techniques are applied to provide information about the environment and to share the training data of the agents deployed on IoT devices. We formulate the task offloading problem with the goal of maximizing the energy efficiency and the workload balance among the edge servers (ESs). Then, we reformulate the problem as a Markov decision process (MDP) and design a deep reinforcement learning (DRL)-based energy-efficient task offloading (DEETO) algorithm to solve it. Comparative experiments show the superiority of our DEETO algorithm in improving energy efficiency and balancing the workload.
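The workload-balance objective here can be made concrete with a simple greedy baseline that assigns each task to the currently least-loaded edge server (the longest-processing-time rule). This is not the paper's DEETO algorithm, just an intuition-building sketch with toy numbers:

```python
# Greedy offloading baseline for intuition (not the paper's DEETO
# algorithm): assign each task to the edge server that currently has
# the smallest load, a proxy for workload balance. Toy numbers.

def greedy_offload(task_loads, n_servers):
    loads = [0.0] * n_servers
    placement = []
    for t in sorted(task_loads, reverse=True):  # big tasks first (LPT rule)
        i = min(range(n_servers), key=lambda s: loads[s])
        loads[i] += t
        placement.append((t, i))
    return loads, placement

tasks = [7.0, 5.0, 4.0, 3.0, 3.0, 2.0]   # task compute demands
loads, placement = greedy_offload(tasks, 2)
print(loads)  # → [12.0, 12.0]: perfectly balanced on this toy instance
```

A DRL agent like DEETO learns such assignments from state feedback instead of a fixed rule, which lets it also account for energy and channel dynamics.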
Over the past two decades, digital microfluidic biochips have been in much demand for safety-critical and biomedical applications and are increasingly important in point-of-care analysis, drug discovery, and immunoassays, among other areas. However, for complex bioassays, finding routes for the transportation of droplets in an electrowetting-on-dielectric digital biochip while maintaining their discreteness is a challenging task. In this study, we propose a deep reinforcement learning-based droplet routing technique for digital microfluidic biochips. The technique is implemented on a distributed architecture to optimize the possible paths for predefined source-target pairs of droplets. The actors of the technique calculate the possible routes of the source-target pairs and store the experience in a replay buffer, and the learner fetches the experiences and updates the routing paths. The proposed algorithm was applied to benchmark suites I and III as two different test benches, and it achieved significant improvements over state-of-the-art techniques.
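The underlying source-target routing task can be sketched with a deterministic breadth-first-search baseline on an electrode grid with blocked cells; the learned DRL router addresses the harder multi-droplet, discreteness-preserving version. The grid size, blockage, and endpoints below are toy assumptions:

```python
from collections import deque

# Deterministic BFS routing baseline for one droplet on a biochip grid
# (the paper's method is learned DRL routing; this only illustrates the
# single source-target routing task). Grid and endpoints are toys.

def route(grid_w, grid_h, blocked, src, dst):
    """Shortest 4-neighbour path avoiding blocked cells, or None."""
    prev, frontier = {src: None}, deque([src])
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == dst:
            path, cur = [], dst
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_w and 0 <= ny < grid_h \
                    and (nx, ny) not in blocked and (nx, ny) not in prev:
                prev[(nx, ny)] = (x, y)
                frontier.append((nx, ny))
    return None

# A wall of blocked electrodes forces a detour around column x=1.
path = route(5, 5, blocked={(1, 1), (1, 2), (1, 3)}, src=(0, 2), dst=(4, 2))
print(len(path) - 1)  # → 8 electrode moves (4 without the blockage)
```

BFS handles one droplet optimally; routing many droplets that must never become adjacent is what motivates the replay-buffer-trained learner in the paper.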
In this paper, to deal with the heterogeneity in federated learning (FL) systems, a knowledge distillation (KD) driven training framework for FL is proposed, where each user can select its neural network model on demand and distill knowledge from a big teacher model using its own private dataset. To overcome the challenge of training the big teacher model on resource-limited user devices, the digital twin (DT) is exploited so that the teacher model can be trained at the DT located on the server with sufficient computing resources. Then, during model distillation, each user can update the parameters of its model at either the physical entity or the digital agent. The joint problem of model selection, training offloading, and resource allocation for users is formulated as a mixed integer programming (MIP) problem. To solve the problem, Q-learning and optimization are jointly used, where Q-learning selects models for users and determines whether to train locally or on the server, and optimization is used to allocate resources for users based on the output of Q-learning. Simulation results show that the proposed DT-assisted KD framework and joint optimization method can significantly improve the average accuracy of users while reducing the total delay.
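The distillation step in such a framework typically minimises the divergence between temperature-softened teacher and student outputs. A self-contained sketch of that loss follows; the logits and temperature are illustrative values, not taken from the paper:

```python
import math

# Knowledge-distillation loss sketch: the student is trained to match
# the teacher's temperature-softened class distribution. The logits
# and the temperature T are illustrative assumptions.

def softmax(logits, T=1.0):
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher_soft || student_soft), scaled by T^2 as in Hinton et al."""
    p = softmax(teacher_logits, T)   # soft targets from the big model
    q = softmax(student_logits, T)   # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

teacher = [8.0, 2.0, 1.0]
aligned = [7.5, 2.2, 0.9]     # student close to the teacher
opposed = [1.0, 8.0, 2.0]     # student disagreeing with the teacher
print(kd_loss(teacher, aligned) < kd_loss(teacher, opposed))  # → True
```

In the proposed framework this loss is computed against a teacher hosted at the server-side DT, so each user's small model can differ in architecture yet still learn from the same teacher.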
The connected autonomous vehicle is considered an effective way to improve transport safety and efficiency. To overcome the limited sensing and computing capabilities of individual vehicles, we design a digital twin assisted decision-making framework for the Internet of Vehicles, by leveraging the integration of communication, sensing, and computing. In this framework, the digital twin entities residing on the edge can effectively communicate and cooperate with each other to plan sub-targets for their respective vehicles, while the vehicles only need to achieve the sub-targets by generating a sequence of atomic actions. Furthermore, we propose a hierarchical multi-agent reinforcement learning approach to implement the framework, which can be trained in an end-to-end way. In the proposed approach, the communication interval of digital twin entities can adapt to a time-varying environment. Extensive experiments on driving decision-making have been performed in traffic junction scenarios of different difficulties. The experimental results show that the proposed approach can largely improve collaboration efficiency while reducing communication overhead.
Mechanical metamaterials such as auxetic materials have attracted great interest due to their unusual properties, which are dictated by their architectures. However, these architected materials usually have low stiffness because of the bending or rotation deformation mechanisms in their microstructures. In this work, a convolutional neural network (CNN) based self-learning multi-objective optimization is performed to design digital composite materials. The CNN models have undergone rigorous training using randomly generated two-phase digital composite materials, along with their corresponding Poisson's ratios and stiffness values. The CNN models are then used for designing composite material structures with the minimum Poisson's ratio at a given volume fraction constraint. Furthermore, we have designed composite materials with optimized stiffness while exhibiting a desired Poisson's ratio (negative, zero, or positive). The optimized designs have been successfully and efficiently obtained, and their validity has been confirmed through finite element analysis results. This self-learning multi-objective optimization model offers a promising approach for achieving comprehensive multi-objective optimization.
We propose a high-accuracy, artifact-free, single-frame digital holographic phase demodulation scheme for relatively low-carrier-frequency holograms: deep learning assisted variational Hilbert quantitative phase imaging (DL-VHQPI). The method, incorporating a conventional deep neural network into a complete physical model utilizing the idea of residual compensation, reliably and robustly recovers the quantitative phase information of the test objects. It can significantly alleviate phase artifacts caused by spectrum overlapping in the slightly off-axis digital holographic system. Compared to conventional end-to-end networks (without a physical model), the proposed method can reduce the dataset size dramatically while maintaining the imaging quality and model generalization. DL-VHQPI is quantitatively studied by numerical simulation, and a live-cell experiment is designed to demonstrate the method's practicality in biological research. The proposed idea of the deep-learning-assisted physical model might be extended to diverse computational imaging techniques.
Automatic identification of cyberbullying is a problem that is gaining traction, especially in the Machine Learning area. Not only is it complicated, but it has also become a pressing necessity, considering how social media has become an integral part of adolescents' lives and how serious the impacts of cyberbullying and online harassment can be, particularly among teenagers. This paper contains a systematic literature review of modern strategies, machine learning methods, and technical means for detecting cyberbullying and aggressive behavior by individuals in the information space of the Internet. We undertake an in-depth review of 13 papers from four scientific databases. The article provides an overview of the scientific literature to analyze the problem of cyberbullying detection from the point of view of machine learning and natural language processing. In this review, we consider a cyberbullying detection framework on social media platforms, which includes data collection, data processing, feature selection, feature extraction, and the application of machine learning to classify whether texts contain cyberbullying or not. This article seeks to guide future research on this topic toward a more consistent perspective with the phenomenon's description and depiction, allowing future solutions to be more practical and effective.
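The reviewed pipeline (collection → processing → feature extraction → classification) can be sketched end-to-end with bag-of-words features and a multinomial Naive Bayes classifier, one of the simplest methods the surveyed papers consider. The miniature corpus and labels below are invented placeholders, not a real dataset:

```python
import math
from collections import Counter

# Toy text-classification pipeline: tokenise, extract bag-of-words
# features, and classify with multinomial Naive Bayes (Laplace
# smoothing). The miniature corpus is an invented placeholder.

train = [
    ("you are awesome great job", 0),   # 0 = benign
    ("thanks for sharing this", 0),
    ("nobody likes you loser", 1),      # 1 = bullying
    ("you are so stupid and ugly", 1),
]

vocab = {w for text, _ in train for w in text.split()}
counts = {0: Counter(), 1: Counter()}
priors = Counter(label for _, label in train)
for text, label in train:
    counts[label].update(text.split())

def log_prob(text, label):
    total = sum(counts[label].values())
    lp = math.log(priors[label] / len(train))
    for w in text.split():
        if w in vocab:  # ignore out-of-vocabulary words
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
    return lp

def predict(text):
    return max((0, 1), key=lambda lbl: log_prob(text, lbl))

print(predict("you are stupid"), predict("great job sharing"))  # → 1 0
```

Real systems in the review replace the word counts with richer features (embeddings, user metadata) and the classifier with deep models, but the pipeline stages are the same.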
Machine learning (ML) models provide great opportunities to accelerate novel material development, offering a virtual alternative to laborious and resource-intensive empirical methods. In this work, the second of a two-part study, an ML approach is presented that offers accelerated digital design of Mg alloys. A systematic evaluation of four ML regression algorithms was explored to rationalise the complex relationships in Mg-alloy data and to capture the composition-processing-property patterns. Cross-validation and hold-out set validation techniques were utilised for unbiased estimation of model performance. Using atomic and thermodynamic properties of the alloys, feature augmentation was examined to define the most descriptive representation spaces for the alloy data. Additionally, a graphical user interface (GUI) webtool was developed to facilitate the use of the proposed models in predicting the mechanical properties of new Mg alloys. The results demonstrate that the random forest regression model and the neural network are robust models for predicting the ultimate tensile strength and ductility of Mg alloys, with accuracies of ~80% and ~70%, respectively. The models developed in this work are a step towards high-throughput screening of novel candidates for target mechanical properties and provide ML-guided alloy design.
Soil water content (SWC) is one of the critical indicators in various fields such as geotechnical engineering and agriculture. To avoid the time-consuming, destructive, and laborious drawbacks of conventional SWC measurements, image-based SWC prediction is considered based on recent advances in quantitative soil color analysis. In this study, a promising method based on the Gaussian-fitting gray histogram is proposed for extracting characteristic parameters by analyzing soil images, aiming to alleviate the interference of complex surface conditions with color information extraction. In addition, an identity matrix consisting of 32 characteristic parameters from eight color spaces is constituted to describe the multi-dimensional information of the soil images. Meanwhile, a subset of 10 parameters is identified through three variable analytical methods. Then, four machine learning models for SWC prediction, based on partial least squares regression (PLSR), random forest (RF), support vector machine regression (SVMR), and Gaussian process regression (GPR), are established using the 32 and 10 characteristic parameters, and their performance is compared. The results show that the characteristic parameters obtained by Gaussian fitting can effectively reduce the interference from soil surface conditions. The RGB, CIEXYZ, and CIELCH color spaces and lightness parameters, as the inputs, are more suitable for the SWC prediction models. Furthermore, it is found that the 10 parameters could also serve as optimal and generalizable predictors without considerably reducing prediction accuracy, and the GPR model has the best prediction performance (R^2 ≥ 0.95, RMSE ≤ 2.01%, RPD ≥ 4.95, and RPIQ ≥ 6.37). The proposed image-based SWC predictive models, combining effective color information and machine learning, can achieve rapid and highly precise SWC prediction, providing valuable insights for mapping soil moisture fields.
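Gaussian process regression, the best performer in this comparison, can be sketched in a few lines: an RBF kernel, a noise jitter on the diagonal, and one linear solve for the predictive mean. The 1-D toy input stands in for the colour-feature parameters; the length-scale, noise level, and data are all illustrative assumptions:

```python
import math

# Minimal GP regression sketch: RBF kernel plus noise jitter, and the
# predictive mean k_*^T (K + noise*I)^{-1} y. The 1-D toy data stand in
# for the colour-feature inputs; all values are assumptions.

def rbf(a, b, ell=1.0):
    return math.exp(-((a - b) ** 2) / (2 * ell ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting (A is small and PD)."""
    n = len(A)
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)                      # (K + noise*I)^{-1} y
    return sum(rbf(x_star, x) * a for x, a in zip(xs, alpha))

xs = [0.0, 1.0, 2.0, 3.0]                     # e.g. a lightness feature
ys = [5.0, 9.0, 14.0, 18.0]                   # e.g. measured SWC (%)
print(round(gp_predict(xs, ys, 1.0), 2))      # ≈ 9.0 at a training point
```

Unlike PLSR or RF point estimates, a full GPR also yields a predictive variance from the same kernel matrix, which is one reason it suits soil-moisture mapping.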
Individuals with special needs learn more slowly than their peers, and they need repetition for learning to become permanent. However, in crowded classrooms, it is difficult for a teacher to deal with each student individually. This problem can be overcome by using supportive education applications. However, the majority of such applications are not designed for special education, and therefore they are not as efficient as expected. Special education students differ from their peers in terms of their development, characteristics, and educational qualifications. The handwriting skills of individuals with special needs are lower than those of their peers, which makes the task of Handwriting Recognition (HWR) more difficult. To overcome this problem, we propose a new personalized handwriting verification system that validates digits from the handwriting of special education students. The system uses a Convolutional Neural Network (CNN) created and trained from scratch. The dataset used was obtained by collecting the handwriting of the students with the help of a tablet: a special education center was visited, and the handwritten figures of the students were collected under the supervision of special education teachers. The system is designed as a person-dependent system, as every student has their own writing style. Overall, the system achieves promising results, reaching a recognition accuracy of about 94%, and it is ready to be integrated with a mobile application designed to teach digits to special education students.
To realize high-accuracy physical-cyber digital twin (DT) mapping in a manufacturing system, a huge amount of data needs to be collected and analyzed in real time. Traditional DT systems are deployed on cloud or edge servers independently, but they are hard to apply in real production systems due to high interaction or execution delays. This results in low consistency in the temporal dimension of the physical-cyber model. In this work, we propose a novel efficient edge-cloud DT manufacturing system, which is inspired by resource scheduling technology. Specifically, an edge-cloud collaborative DT system deployment architecture is first constructed. Then, deterministic and uncertainty-aware adaptive optimization strategies are presented to choose a more powerful server for running DT-based applications. We model the adaptive optimization problems as dynamic programming problems and propose a novel collaborative clustering parallel Q-learning (CCPQL) algorithm and a prediction-based CCPQL to solve them. The proposed approach reduces the total delay with a higher convergence rate. Numerical simulation results are provided to validate the approach, which has great potential in dynamic and complex industrial internet environments.
Purpose: This study aims to explore Chilean students' digital technology usage patterns and approaches to learning.
Design/Approach/Methods: We conducted this study in two stages. In the first, we worked with one semester of learning management system (LMS), library, and student records data. We performed a k-means cluster analysis to identify groups with similar usage patterns. In the second stage, we invited students from the emerging clusters to participate in group interviews. Thematic analysis was employed to analyze them.
Findings: Three groups were identified: 1) digital library users/high performers, who adopted deeper approaches to learning, obtained higher marks, and used learning resources to integrate materials and expand understanding; 2) LMS and physical library users/mid-performers, who adopted mainly strategic approaches, obtained marks close to average, and used learning resources for studying in an organized manner to get good marks; and 3) lower users of LMS and library/mid-low performers, who adopted mainly a surface approach, obtained mid-to-lower-than-average marks, and used learning resources for minimum content understanding.
Originality/Value: We demonstrated the importance of combining learning analytics data with qualitative methods to make sense of how digital technology usage patterns and approaches to learning are associated with learning resource use. Practical recommendations are presented.
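The first-stage clustering can be illustrated with a minimal k-means over two usage features per student. The records, the two features (LMS logins, library visits), and k = 2 are toy assumptions for clarity; the study clustered richer one-semester data and found three groups:

```python
# Minimal k-means sketch over two usage features per student:
# (LMS logins, library visits). The records and k=2 are toy
# assumptions; the study itself clustered richer semester data.

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre.
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # Update step: move each centre to its group mean.
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

students = [(40, 25), (38, 30), (42, 28),   # heavy users
            (5, 2), (8, 1), (6, 3)]         # light users
centers, groups = kmeans(students, [(0, 0), (50, 50)])
print(sorted(len(g) for g in groups))  # → [3, 3]: the two usage profiles
```

The study then made sense of such clusters qualitatively, interviewing students from each group rather than stopping at the quantitative partition.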
Funding (teaching reforms in communication engineering): 2024 Education and Teaching Reform Project of Hainan Tropical Ocean University (RHYxgnw2024-16).
Funding (blockchain-enabled FL survey): Supported in part by the National Natural Science Foundation of China under Grant 62072351; in part by the Academy of Finland under Grants 308087, 335262, 345072, and 350464; in part by the Open Project of Zhejiang Lab under Grant 2021PD0AB01; and in part by the 111 Project under Grant B16037.
Funding: This project was funded by the Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, under Grant No. IFPIP-1127-611-1443. The authors therefore acknowledge with thanks DSR's technical and financial support.
Abstract: In the rapidly evolving landscape of today's digital economy, Financial Technology (Fintech) emerges as a transformative force, propelled by the dynamic synergy between Artificial Intelligence (AI) and algorithmic trading. Our in-depth investigation delves into the intricacies of merging Multi-Agent Reinforcement Learning (MARL) and Explainable AI (XAI) within Fintech, aiming to refine algorithmic trading strategies. Through meticulous examination, we uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm, employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions. These AI-infused Fintech platforms harness collective intelligence to unearth trends, mitigate risks, and provide tailored financial guidance, benefiting individuals and enterprises navigating the digital landscape. Our research holds the potential to open fresh avenues for investment and asset management in the digital age. Additionally, our statistical evaluation yields encouraging results, with metrics of Accuracy = 0.85, Precision = 0.88, and F1 score = 0.86, reaffirming the efficacy of our approach within Fintech and emphasizing its reliability.
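The reported Accuracy, Precision, and F1 figures follow the standard confusion-matrix definitions; a minimal computation (with an invented binary label vector, not the paper's data) looks like this:

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# invented toy labels: 8 trading signals, 1 = profitable trade predicted
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
```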
Funding: Supported by the National Natural Science Foundation of China (No. 61871283).
Abstract: The Autonomous Underwater Glider (AUG) is a prevailing type of underwater intelligent internet vehicle and occupies a dominant position in industrial applications, in which path planning is an essential problem. Due to the complexity and variability of the ocean, accurate environment modeling and flexible path planning algorithms are pivotal challenges. Traditional models mainly rely on mathematical functions, which are neither complete nor reliable, and most existing path planning algorithms depend on the environment and lack flexibility. To overcome these challenges, we propose a path planning system for underwater intelligent internet vehicles. It applies digital twins and sensor data to map the real ocean environment into a virtual digital space, which provides a comprehensive and reliable environment for path simulation. We design a value-based reinforcement learning path planning algorithm and explore the optimal network structure parameters. The path simulation is controlled by a closed-loop model integrated into the terminal vehicle through edge computing. The integration of state input enriches the learning of the neural networks and helps to improve generalization and flexibility, and a task-related reward function promotes rapid convergence of the training. The experimental results show that our reinforcement-learning-based path planning algorithm is highly flexible and can effectively adapt to a variety of ocean conditions.
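A value-based planner of the kind described can be illustrated with tabular Q-learning on a toy obstacle grid. The grid size, reward shaping, and hyperparameters below are illustrative assumptions, not the paper's neural-network design:

```python
import random

def plan_path(grid_w, grid_h, start, goal, obstacles, episodes=3000):
    """Tabular Q-learning path planner: state = grid cell, four move
    actions, reward shaped so the agent reaches the goal quickly."""
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    q = {(x, y): [0.0] * 4 for x in range(grid_w) for y in range(grid_h)}
    alpha, gamma, eps = 0.5, 0.9, 0.2
    rng = random.Random(0)
    for _ in range(episodes):
        s = start
        for _ in range(100):
            a = rng.randrange(4) if rng.random() < eps else max(range(4), key=lambda i: q[s][i])
            nx, ny = s[0] + moves[a][0], s[1] + moves[a][1]
            if not (0 <= nx < grid_w and 0 <= ny < grid_h) or (nx, ny) in obstacles:
                nxt, r = s, -1.0         # bump into wall/obstacle: stay, penalty
            elif (nx, ny) == goal:
                nxt, r = (nx, ny), 10.0  # reached the target
            else:
                nxt, r = (nx, ny), -0.1  # step cost favors short paths
            q[s][a] += alpha * (r + gamma * max(q[nxt]) - q[s][a])
            s = nxt
            if s == goal:
                break
    path, s = [start], start  # greedy rollout of the learned policy
    while s != goal and len(path) < 30:
        a = max(range(4), key=lambda i: q[s][i])
        nx, ny = s[0] + moves[a][0], s[1] + moves[a][1]
        if not (0 <= nx < grid_w and 0 <= ny < grid_h) or (nx, ny) in obstacles:
            break
        s = (nx, ny)
        path.append(s)
    return path

path = plan_path(4, 4, (0, 0), (3, 3), obstacles={(1, 1), (2, 2)})
```

The paper replaces this lookup table with a neural network so the value function generalizes across the continuous, changing ocean state supplied by the digital twin.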
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62001168, and in part by the Foundation and Application Research Grant of Guangzhou under Grant 202102020515.
Abstract: The rapid development of emerging technologies such as edge intelligence and digital twins has added momentum to the development of the Industrial Internet of Things (IIoT). However, the massive amount of data generated by the IIoT, coupled with heterogeneous computation capacity across IIoT devices and users' data privacy concerns, poses challenges to achieving industrial edge intelligence (IEI). To achieve IEI, we propose a semi-federated learning framework in which the portion of the data with higher privacy is kept locally, while the less private portion can be uploaded to the edge server. In addition, we leverage digital twins to overcome the computation capacity heterogeneity of IIoT devices through the mapping of physical entities. We formulate a synchronization latency minimization problem that jointly optimizes edge association and the proportion of uploaded non-private data. As the joint problem is NP-hard and combinatorial, and taking into account the reality of large-scale device training, we develop a multi-agent hybrid-action deep reinforcement learning (DRL) algorithm to find the optimal solution. Simulation results show that the proposed DRL algorithm reduces latency and achieves better convergence for semi-federated learning than benchmark algorithms.
Funding: Supported via funding from the Ministry of Defense, Government of Pakistan, under Project Number AHQ/95013/6/4/8/NASTP (ACP), titled "Development of ICT and Artificial Intelligence Based Precision Agriculture Systems Utilizing Dual-Use Aerospace Technologies-GREENAI".
Abstract: Embracing software product lines (SPLs) is pivotal in the dynamic landscape of contemporary software development. However, the flexibility and global distribution inherent in modern systems pose significant challenges to managing SPL variability, underscoring the critical importance of robust cybersecurity measures. This paper advocates leveraging machine learning (ML) to address variability management issues and fortify the security of SPLs. In the context of the broader special-issue theme on innovative cybersecurity approaches, the proposed ML-based framework offers an interdisciplinary perspective, blending insights from computing, the social sciences, and business. Specifically, it employs ML for demand analysis, dynamic feature extraction, and enhanced feature selection in distributed settings, contributing to cyber-resilient ecosystems. Our experiments demonstrate the framework's superiority, emphasizing its potential to boost productivity and security in SPLs. As digital threats evolve, this research catalyzes interdisciplinary collaboration, aligning with the special issue's goal of breaking down academic barriers to strengthen digital ecosystems against sophisticated attacks while upholding ethics, privacy, and human values.
Funding: Supported by the National Natural Science Foundation of China (52072036) and the Key Research and Development Program of Henan Province, China (231111242500).
Abstract: With the widespread use of lithium-ion batteries in electric vehicles, energy storage, and mobile terminals, there is an urgent need to develop cathode materials with specific properties. However, existing controlled-synthesis routes based on repeated experiments are often costly and inefficient, which is unsuitable for the broader application of novel materials. The development of machine learning and its combination with materials design offers a potential pathway for optimizing materials. Here, we present a design-synthesis paradigm for developing high-energy Ni-rich cathodes with thermal/kinetic simulation and propose a coupled image-morphology machine learning model. The paradigm can accurately predict the reaction conditions required for synthesizing cathode precursors with specific morphologies, helping to shorten experimental duration and reduce costs. After model-guided design synthesis, cathode materials with different morphological characteristics can be obtained; the best shows a high discharge capacity of 206 mAh g^(-1) at 0.1 C and 83% capacity retention after 200 cycles. This work provides guidance for designing cathode materials for lithium-ion batteries and may point to a fast and cost-effective direction for controlling the morphology of all types of particles.
Funding: Financially supported by the National Key Research and Development Program of China (No. 2019YFC1803601), the Fundamental Research Funds for the Central Universities of Central South University, China (No. 2023ZZTS0801), the Postgraduate Innovative Project of Central South University, China (No. 2023XQLH068), and the Postgraduate Scientific Research Innovation Project of Hunan Province, China (No. QL20230054).
Abstract: A general prediction model for seven heavy metals was established using the heavy metal contents of 207 soil samples measured by a portable X-ray fluorescence spectrometer (pXRF) and six environmental factors as model correction coefficients. The eXtreme Gradient Boosting (XGBoost) model was used to fit the relationship between heavy metal content and environmental characteristics to evaluate the soil ecological risk of a smelting site. The results demonstrated that the generalized prediction models developed for Pb, Cd, and As were highly accurate, with fitted coefficients (R^2) of 0.911, 0.950, and 0.835, respectively. Topsoil presented the highest ecological risk, and high potential ecological risk existed at some positions at various depths due to the high mobility of Cd. Overall, the application of machine learning significantly increased the accuracy of pXRF measurements and identified key environmental factors. The adapted potential ecological risk assessment emphasized the need to focus on Pb, Cd, and As in future site remediation efforts.
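XGBoost's underlying idea, additively fitting weak learners to the residuals of the current ensemble, can be shown with plain boosted regression stumps on one synthetic feature. XGBoost itself adds regularization, second-order gradients, and full tree learning; the data here are invented:

```python
def fit_stump(x, residuals):
    """Least-squares regression stump: best single threshold on one feature."""
    best = None
    for t in sorted(set(x))[:-1]:
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    return best[1:]  # (threshold, left value, right value)

def gradient_boost(x, y, rounds=50, lr=0.1):
    """Additive boosting: each stump fits the current residuals,
    the scheme XGBoost extends with regularization and trees."""
    base = sum(y) / len(y)
    pred = [base] * len(y)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        t, lv, rv = fit_stump(x, resid)
        stumps.append((t, lv, rv))
        pred = [p + lr * (lv if xi <= t else rv) for xi, p in zip(x, pred)]
    return lambda xq: base + sum(lr * (lv if xq <= t else rv) for t, lv, rv in stumps)

# invented toy data: metal content rising with one environmental factor
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.0, 1.2, 2.8, 3.1, 5.0, 5.2]
predict = gradient_boost(x, y)
```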
Funding: This work was partly supported by the Project of Cultivation for Young Top-Notch Talents of Beijing Municipal Institutions (No. BPHR202203225), the Young Elite Scientists Sponsorship Program by BAST (BYESS2023031), and the National Key Research and Development Program (No. 2022YFF0604502).
Abstract: Limited by battery and computing resources, the computation-intensive tasks generated by Internet of Things (IoT) devices cannot all be processed locally. Mobile edge computing (MEC) is a suitable solution to this problem: the generated tasks can be offloaded from IoT devices to MEC. In this paper, we study dynamic task offloading for digital twin-empowered MEC. Digital twin techniques are applied to provide information about the environment and to share the training data of the agents deployed on IoT devices. We formulate the task offloading problem with the goal of maximizing energy efficiency and balancing the workload among the edge servers (ESs). We then reformulate the problem as a Markov decision process (MDP) and design a DRL-based energy-efficient task offloading (DEETO) algorithm to solve it. Comparative experiments show the superiority of the DEETO algorithm in improving energy efficiency and balancing the workload.
Abstract: Over the past two decades, digital microfluidic biochips have been in high demand for safety-critical and biomedical applications and are increasingly important in point-of-care analysis, drug discovery, and immunoassays, among other areas. However, for complex bioassays, finding routes for transporting droplets in an electrowetting-on-dielectric digital biochip while maintaining their discreteness is a challenging task. In this study, we propose a deep reinforcement learning-based droplet routing technique for digital microfluidic biochips. The technique is implemented on a distributed architecture to optimize the possible paths for predefined source-target pairs of droplets. The actors of the technique compute the possible routes of the source-target pairs and store the experience in a replay buffer, while the learner fetches the experiences and updates the routing paths. The proposed algorithm was applied to benchmark suites I and III as two different test benches, and it achieved significant improvements over state-of-the-art techniques.
Funding: Supported by the National Key Research and Development Program of China (2020YFB1807700), the National Natural Science Foundation of China (NSFC) under Grant No. 62071356, and the Chongqing Key Laboratory of Mobile Communications Technology under Grant cqupt-mct202202.
Abstract: In this paper, to deal with heterogeneity in federated learning (FL) systems, a knowledge distillation (KD)-driven training framework for FL is proposed, in which each user can select its neural network model on demand and distill knowledge from a large teacher model using its own private dataset. To overcome the challenge of training the large teacher model on resource-limited user devices, a digital twin (DT) is exploited so that the teacher model can be trained at the DT located on a server with sufficient computing resources. During model distillation, each user can then update the parameters of its model at either the physical entity or the digital agent. The joint problem of model selection, training offloading, and resource allocation for users is formulated as a mixed-integer programming (MIP) problem. To solve it, Q-learning and optimization are used jointly: Q-learning selects models for users and determines whether to train locally or on the server, and optimization allocates resources for users based on the output of Q-learning. Simulation results show that the proposed DT-assisted KD framework and joint optimization method can significantly improve the average accuracy of users while reducing the total delay.
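The distillation step itself, independent of the DT and Q-learning machinery, typically minimizes a temperature-softened KL divergence between teacher and student outputs (the standard KD objective of Hinton et al.; this paper may weight or combine it differently). A minimal sketch with invented logits:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(l / temperature) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    as is conventional so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

teacher = [4.0, 1.0, 0.5]
aligned = distillation_loss([4.0, 1.0, 0.5], teacher)  # student matches teacher
off = distillation_loss([0.5, 1.0, 4.0], teacher)      # student disagrees
```

In practice this soft-label term is combined with the ordinary cross-entropy on ground-truth labels, letting a small student model track the teacher's full output distribution rather than only its top prediction.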
Funding: Supported in part by the Natural Science Foundation of China under Grants 62001054, 62272053, and 61901191; in part by the Natural Science Foundation of Shandong Province of China under Grant ZR2020LZH005; and in part by the Fundamental Research Funds for the Central Universities.
Abstract: The connected autonomous vehicle is considered an effective way to improve transport safety and efficiency. To overcome the limited sensing and computing capabilities of individual vehicles, we design a digital twin-assisted decision-making framework for the Internet of Vehicles that leverages the integration of communication, sensing, and computing. In this framework, the digital twin entities residing on the edge can effectively communicate and cooperate with each other to plan sub-targets for their respective vehicles, while the vehicles only need to achieve the sub-targets by generating a sequence of atomic actions. Furthermore, we propose a hierarchical multi-agent reinforcement learning approach to implement the framework, which can be trained end to end. In the proposed approach, the communication interval of the digital twin entities adapts to the time-varying environment. Extensive experiments on driving decision-making have been performed in traffic junction scenarios of different difficulties. The experimental results show that the proposed approach can largely improve collaboration efficiency while reducing communication overhead.
Abstract: Mechanical metamaterials such as auxetic materials have attracted great interest due to their unusual properties, which are dictated by their architectures. However, these architected materials usually have low stiffness because of the bending or rotation deformation mechanisms in their microstructures. In this work, a convolutional neural network (CNN)-based self-learning multi-objective optimization is performed to design digital composite materials. The CNN models are trained rigorously on randomly generated two-phase digital composite materials, along with their corresponding Poisson's ratios and stiffness values. The CNN models are then used to design composite material structures with the minimum Poisson's ratio under a given volume fraction constraint. Furthermore, we design composite materials with optimized stiffness while exhibiting a desired Poisson's ratio (negative, zero, or positive). The optimized designs are obtained efficiently, and their validity is confirmed by finite element analysis. This self-learning multi-objective optimization model offers a promising approach to comprehensive multi-objective optimization.
Funding: We are grateful for financial support from the National Natural Science Foundation of China (61905115, 62105151, 62175109, U21B2033, 62227818); the Leading Technology of Jiangsu Basic Research Plan (BK20192003); the Youth Foundation of Jiangsu Province (BK20190445, BK20210338); the Biomedical Competition Foundation of Jiangsu Province (BE2022847); the Key National Industrial Technology Cooperation Foundation of Jiangsu Province (BZ2022039); the Fundamental Research Funds for the Central Universities (30920032101); the Open Research Fund of the Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense (JSGP202105, JSGP202201); and the National Science Center, Poland (2020/37/B/ST7/03629). The authors thank F. Sun for her contribution to this paper in terms of language expression and grammatical correction.
Abstract: We propose a high-accuracy, artifact-free, single-frame digital holographic phase demodulation scheme for relatively low-carrier-frequency holograms: deep learning-assisted variational Hilbert quantitative phase imaging (DL-VHQPI). The method, which incorporates a conventional deep neural network into a complete physical model using the idea of residual compensation, reliably and robustly recovers the quantitative phase information of the test objects. It can significantly alleviate the phase artifacts caused by spectrum overlapping in slightly off-axis digital holographic systems. Compared with conventional end-to-end networks (without a physical model), the proposed method can reduce the dataset size dramatically while maintaining imaging quality and model generalization. DL-VHQPI is studied quantitatively by numerical simulation, and a live-cell experiment demonstrates the method's practicality in biological research. The proposed idea of a deep learning-assisted physical model may be extended to diverse computational imaging techniques.
Abstract: Automatic identification of cyberbullying is a problem that is gaining traction, especially in machine learning. Not only is it complicated, but it has also become a pressing necessity, considering how social media has become an integral part of adolescents' lives and how serious the impacts of cyberbullying and online harassment can be, particularly among teenagers. This paper presents a systematic literature review of modern strategies, machine learning methods, and technical means for detecting cyberbullying and individual aggressive behavior on the Internet. We undertake an in-depth review of 13 papers from four scientific databases, analyzing the problem of cyberbullying detection from the perspectives of machine learning and natural language processing. In this review, we consider a cyberbullying detection framework for social media platforms that includes data collection, data processing, feature selection, feature extraction, and the application of machine learning to classify whether texts contain cyberbullying. This article seeks to guide future research on this topic toward a more consistent perspective on the phenomenon's description and depiction, allowing future solutions to be more practical and effective.
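The reviewed pipeline (collection, processing, feature extraction, classification) can be miniaturized as a bag-of-words Naive Bayes baseline, a common starting point in the surveyed literature. The four example messages and their labels are invented, and real detectors use far richer features and models:

```python
import math
from collections import Counter

def tokenize(text):
    """Minimal processing step: lowercase, split, strip punctuation."""
    return [w.strip(".,!?").lower() for w in text.split()]

class NaiveBayes:
    """Multinomial Naive Bayes with Laplace smoothing over word counts."""

    def fit(self, texts, labels):
        self.classes = sorted(set(labels))
        self.counts = {c: Counter() for c in self.classes}
        self.priors = {c: labels.count(c) / len(labels) for c in self.classes}
        for text, label in zip(texts, labels):
            self.counts[label].update(tokenize(text))
        self.vocab = set(w for c in self.classes for w in self.counts[c])
        return self

    def predict(self, text):
        def log_prob(c):
            total = sum(self.counts[c].values())
            lp = math.log(self.priors[c])
            for w in tokenize(text):
                lp += math.log((self.counts[c][w] + 1) / (total + len(self.vocab)))
            return lp
        return max(self.classes, key=log_prob)

# invented toy corpus standing in for collected social media posts
texts = ["you are stupid and ugly", "nobody likes you loser",
         "great game last night", "see you at practice tomorrow"]
labels = ["bully", "bully", "ok", "ok"]
clf = NaiveBayes().fit(texts, labels)
```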
Funding: The authors acknowledge the support of the Monash-IITB Academy Scholarship and the Australian Research Council for funding the present research (DP190103592).
Abstract: Machine learning (ML) models provide great opportunities to accelerate novel material development, offering a virtual alternative to laborious and resource-intensive empirical methods. In this work, the second of a two-part study, an ML approach is presented that offers accelerated digital design of Mg alloys. A systematic evaluation of four ML regression algorithms was carried out to rationalise the complex relationships in Mg-alloy data and to capture the composition-processing-property patterns. Cross-validation and hold-out set validation techniques were utilised for unbiased estimation of model performance. Using atomic and thermodynamic properties of the alloys, feature augmentation was examined to define the most descriptive representation spaces for the alloy data. Additionally, a graphical user interface (GUI) web tool was developed to facilitate the use of the proposed models in predicting the mechanical properties of new Mg alloys. The results demonstrate that the random forest regression and neural network models are robust for predicting the ultimate tensile strength and ductility of Mg alloys, with accuracies of ~80% and ~70%, respectively. The models developed in this work are a step towards high-throughput screening of novel candidates for target mechanical properties and provide ML-guided alloy design.
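The cross-validation protocol mentioned above amounts to generating shuffled k-fold train/validation index splits; the fold count and seed below are arbitrary illustrative choices:

```python
import random

def k_fold_indices(n_samples, k=5, seed=0):
    """Shuffled k-fold split: every sample lands in exactly one
    validation fold, the basis of unbiased performance estimates."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [([j for f in folds[:i] + folds[i + 1:] for j in f], folds[i])
            for i in range(k)]

splits = k_fold_indices(10, k=5)  # 5 (train, validation) index pairs
```

Each model is fitted k times, once per training partition, and scored on the corresponding held-out fold; averaging the k scores gives the reported accuracy estimate.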
Abstract: Soil water content (SWC) is a critical indicator in fields such as geotechnical engineering and agriculture. To avoid the time-consuming, destructive, and laborious drawbacks of conventional SWC measurements, image-based SWC prediction is considered, building on recent advances in quantitative soil color analysis. In this study, a method based on the Gaussian-fitted gray histogram is proposed for extracting characteristic parameters from soil images, aiming to alleviate the interference of complex surface conditions with color information extraction. In addition, an identity matrix consisting of 32 characteristic parameters from eight color spaces is constituted to describe the multi-dimensional information of the soil images, and a subset of 10 parameters is identified through three variable analytical methods. Four machine learning models for SWC prediction, based on partial least squares regression (PLSR), random forest (RF), support vector machine regression (SVMR), and Gaussian process regression (GPR), are then established using the 32 and 10 characteristic parameters, and their performance is compared. The results show that the characteristic parameters obtained by Gaussian fitting can effectively reduce interference from soil surface conditions. The RGB, CIEXYZ, and CIELCH color spaces and the lightness parameters are more suitable inputs for the SWC prediction models. Furthermore, the 10-parameter subset serves as an optimal and generalizable set of predictors without considerably reducing prediction accuracy, and the GPR model has the best prediction performance (R^2 ≥ 0.95, RMSE ≤ 2.01%, RPD ≥ 4.95, and RPIQ ≥ 6.37). The proposed image-based SWC predictive models, combining effective color information with machine learning, achieve rapid and highly precise SWC prediction, providing valuable insights for mapping soil moisture fields.
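The Gaussian-fitting idea, summarizing a gray-level histogram by the parameters of a fitted Gaussian so that surface clutter contributes less than the dominant intensity mode, can be approximated by moment matching. This is a simplified stand-in for the paper's fitting procedure, applied to an invented histogram:

```python
import math

def gaussian_fit_histogram(bins, counts):
    """Moment-based Gaussian fit of a gray-level histogram: the weighted
    mean and standard deviation characterize the dominant intensity mode."""
    total = sum(counts)
    mean = sum(b * c for b, c in zip(bins, counts)) / total
    var = sum(c * (b - mean) ** 2 for b, c in zip(bins, counts)) / total
    return mean, math.sqrt(var)

# invented symmetric histogram centered on gray level 120
bins = [100, 110, 120, 130, 140]
counts = [2, 8, 20, 8, 2]
mu, sigma = gaussian_fit_histogram(bins, counts)
```

The resulting (mu, sigma) pairs per color channel are the kind of compact characteristic parameters that would then feed the PLSR/RF/SVMR/GPR regressors.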
Abstract: Individuals with special needs learn more slowly than their peers and need repetition for learning to be permanent. However, in crowded classrooms it is difficult for a teacher to deal with each student individually. This problem can be overcome with supportive education applications, but the majority of such applications are not designed for special education and are therefore not as efficient as expected. Special education students differ from their peers in their development, characteristics, and educational qualifications, and their handwriting skills are lower than those of their peers, which makes handwriting recognition (HWR) more difficult. To overcome this problem, we propose a new personalized handwriting verification system that validates digits from the handwriting of special education students. The system uses a convolutional neural network (CNN) created and trained from scratch. The dataset was obtained by collecting the students' handwriting with a tablet: a special education center was visited, and the handwritten figures of the students were collected under the supervision of special education teachers. The system is designed as a person-dependent system, as every student has their own writing style. Overall, the system achieves promising results, reaching a recognition accuracy of about 94%, and is ready to be integrated with a mobile application designed to teach digits to special education students.
Funding: Supported by the 2019 Industrial Internet Innovation Development Project of the Ministry of Industry and Information Technology of P.R. China, "Comprehensive Security Defense Platform Project for Industrial/Enterprise Networks"; Research on Key Technologies of Wireless Edge Intelligent Collaboration for Industrial Internet Scenarios (L202017); the Natural Science Foundation of China, No. 61971050; and the BUPT Excellent Ph.D. Students Foundation (CX2020214).
Abstract: To realize high-accuracy physical-cyber digital twin (DT) mapping in a manufacturing system, a huge amount of data needs to be collected and analyzed in real time. Traditional DT systems are deployed in cloud or edge servers independently, but they are hard to apply in real production systems due to high interaction or execution delay, which results in low temporal consistency of the physical-cyber model. In this work, we propose a novel, efficient edge-cloud DT manufacturing system inspired by resource scheduling technology. Specifically, an edge-cloud collaborative DT system deployment architecture is first constructed. Then, deterministic and uncertainty-aware adaptive optimization strategies are presented to choose a more powerful server for running DT-based applications. We model the adaptive optimization problems as dynamic programming problems and propose a novel collaborative clustering parallel Q-learning (CCPQL) algorithm, together with a prediction-based variant, to solve them. The proposed approach reduces the total delay with a higher convergence rate. Numerical simulation results validate the approach, which has great potential in dynamic and complex industrial internet environments.
Funding: Supported by the Iniciativa Milenio, Agencia Nacional de Investigación y Desarrollo (ANID) (grant Millennium Nucleus, NMEdSup), and Fondecyt Regular, Agencia Nacional de Investigación y Desarrollo (grant number 1161413).
Abstract: Purpose: This study aims to explore Chilean students' digital technology usage patterns and approaches to learning. Design/Approach/Methods: We conducted this study in two stages. In the first, we worked with one semester of learning management system (LMS), library, and student records data and performed a k-means cluster analysis to identify groups with similar usage patterns. In the second stage, we invited students from the emerging clusters to participate in group interviews, which were examined with thematic analysis. Findings: Three groups were identified: 1) digital library users/high performers, who adopted deeper approaches to learning, obtained higher marks, and used learning resources to integrate materials and expand understanding; 2) LMS and physical library users/mid performers, who adopted mainly strategic approaches, obtained marks close to average, and used learning resources to study in an organized manner and get good marks; and 3) lower users of LMS and library/mid-low performers, who adopted mainly a surface approach, obtained mid-to-lower-than-average marks, and used learning resources for minimum content understanding. Originality/Value: We demonstrate the importance of combining learning analytics data with qualitative methods to make sense of how digital technology usage patterns and approaches to learning are associated with learning resource use. Practical recommendations are presented.
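The first-stage clustering can be sketched with plain Lloyd's k-means on two usage features. The points below are invented stand-ins for per-student LMS/library usage counts, and the deterministic seeding is a simplification of the usual random initialization:

```python
def k_means(points, k, iters=10):
    """Plain Lloyd's k-means on small feature vectors, with naive
    deterministic seeding (evenly spaced points as initial centers)."""
    step = max(1, len(points) // k)
    centers = points[::step][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            i = min(range(k), key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute each center as its cluster's centroid
        centers = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# invented usage vectors: (LMS logins, library visits) per student
points = [(1, 2), (2, 1), (1, 1), (8, 9), (9, 8), (9, 9)]
centers, clusters = k_means(points, k=2)
```

In the study's setting, each resulting cluster is a candidate usage profile whose members are then interviewed, linking the quantitative grouping to the qualitative stage.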