Journal Articles
31,061 articles found
A Power Data Anomaly Detection Model Based on Deep Learning with Adaptive Feature Fusion
1
Authors: Xiu Liu, Liang Gu, Xin Gong, Long An, Xurui Gao, Juying Wu. Computers, Materials & Continua (SCIE, EI), 2024, No. 6, pp. 4045-4061 (17 pages)
With the popularisation of intelligent power, power devices have different shapes, numbers, and specifications. This means that power data exhibits distributional variability, so the model learning process cannot sufficiently extract data features, which seriously affects the accuracy and performance of anomaly detection. Therefore, this paper proposes a deep learning-based anomaly detection model for power data, which integrates a data alignment enhancement technique based on random sampling and an adaptive feature fusion method leveraging dimension reduction. To address the distributional variability of power data, this paper develops a sliding window-based data adjustment method for the model, which solves the problems of high-dimensional feature noise and low-dimensional missing data. To address the problem of insufficient feature fusion, an adaptive feature fusion method based on feature dimension reduction and dictionary learning is proposed to improve the anomaly detection accuracy of the model. To verify the effectiveness of the proposed method, we conducted comparisons through ablation experiments. The experimental results show that, compared with traditional anomaly detection methods, the proposed method not only has an advantage in model accuracy but also reduces the amount of parameter computation during feature matching and improves detection speed.
Keywords: Data alignment, dimension reduction, feature fusion, data anomaly detection, deep learning
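A hedged sketch of the sliding-window data adjustment step the abstract describes: variable-length power readings are cut into fixed-size overlapping windows, with short tails padded so every window has the same dimensionality. The window length, stride, and padding policy are assumptions for illustration, not taken from the paper.

```python
# Sliding-window alignment sketch (window/stride/padding are assumptions).

def sliding_window_align(series, window=4, step=2):
    """Return fixed-size overlapping windows over a 1-D reading sequence."""
    windows = []
    for start in range(0, len(series), step):
        chunk = list(series[start:start + window])
        chunk += [chunk[-1]] * (window - len(chunk))   # pad a short tail
        windows.append(chunk)
    return windows

readings = [3.1, 3.0, 2.9, 7.8, 3.2, 3.1]   # one spike in otherwise flat data
w = sliding_window_align(readings)
print(w)
```

Every window then has the same dimensionality, so a downstream detector sees uniformly shaped inputs regardless of the original record length.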
Perpendicular-Cutdepth: Perpendicular Direction Depth Cutting Data Augmentation Method
2
Authors: Le Zou, Linsong Hu, Yifan Wang, Zhize Wu, Xiaofeng Wang. Computers, Materials & Continua (SCIE, EI), 2024, No. 4, pp. 927-941 (15 pages)
Depth estimation is an important task in computer vision. Collecting data at scale for monocular depth estimation is challenging, as this task requires simultaneously capturing RGB images and depth information. Therefore, data augmentation is crucial for this task. Existing data augmentation methods often employ pixel-wise transformations, which may inadvertently disrupt edge features. In this paper, we propose a data augmentation method for monocular depth estimation, which we refer to as the Perpendicular-Cutdepth method. This method involves cutting real-world depth maps along perpendicular directions and pasting them onto input images, thereby diversifying the data without compromising edge features. To validate the effectiveness of the algorithm, we compared it against current mainstream data augmentation algorithms using an existing convolutional neural network (CNN). Additionally, to verify the algorithm's applicability to Transformer networks, we designed an encoder-decoder network structure based on the Transformer to assess the generalization of our proposed algorithm. Experimental results demonstrate that, in the field of monocular depth estimation, our proposed method, Perpendicular-Cutdepth, outperforms traditional data augmentation methods. On the indoor dataset NYU, our method increases accuracy from 0.900 to 0.907 and reduces the error rate from 0.357 to 0.351. On the outdoor dataset KITTI, our method improves accuracy from 0.9638 to 0.9642 and decreases the error rate from 0.060 to 0.0598.
Keywords: Perpendicular, depth estimation, data augmentation
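A hedged sketch of a perpendicular (vertical-strip) cut-and-paste in the spirit of Perpendicular-Cutdepth: a full-height strip of the depth map is pasted onto the RGB input. The strip position and width are illustrative choices here, not the paper's sampling rule.

```python
# Vertical-strip depth paste sketch (strip placement is an assumption).

def perpendicular_cutdepth(image, depth, x0, width):
    """Paste columns [x0, x0 + width) of `depth` onto a copy of `image`.

    Cutting along full columns leaves vertical edges intact, which is the
    stated motivation for cutting in the perpendicular direction.
    """
    out = [row[:] for row in image]          # copy so the input stays untouched
    for y in range(len(image)):
        for x in range(x0, min(x0 + width, len(image[0]))):
            out[y][x] = depth[y][x]
    return out

img = [[0] * 6 for _ in range(4)]            # toy 4x6 "RGB" image
dep = [[9] * 6 for _ in range(4)]            # toy aligned depth map
aug = perpendicular_cutdepth(img, dep, x0=2, width=2)
print(aug[0])
```

Because the pasted region spans whole columns, any vertical edge in the scene is either fully kept or fully replaced, never partially corrupted.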
Benchmark experiment on slab ^(238)U with D-T neutrons for validation of evaluated nuclear data
3
Authors: Yan-Yan Ding, Yang-Bo Nie, Yue Zhang, Zhi-Jie Hu, Qi Zhao, Huan-Yu Zhang, Kuo-Zhi Xu, Shi-Yu Zhang, Xin-Yi Pan, Chang-Lin Lan, Jie Ren, Xi-Chao Ruan. Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 145-159 (15 pages)
A benchmark experiment on ^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy. The leakage neutron spectra within the energy range of 0.8-16 MeV at 60° and 120° were measured using the time-of-flight method. The samples were prepared as rectangular slabs with a 30 cm square base and thicknesses of 3, 6, and 9 cm. The leakage neutron spectra were also calculated using the MCNP-4C program based on the latest evaluated files of ^(238)U neutron data from CENDL-3.2, ENDF/B-VIII.0, JENDL-5.0, and JEFF-3.3. Based on the comparison, the deficiencies and improvements in the ^(238)U evaluated nuclear data were analyzed. The results showed the following. (1) The calculated results for CENDL-3.2 significantly overestimated the measurements in the energy interval of elastic scattering at 60° and 120°. (2) The calculated results for CENDL-3.2 overestimated the measurements in the energy interval of inelastic scattering at 120°. (3) The calculated results for CENDL-3.2 significantly overestimated the measurements in the 3-8.5 MeV energy interval at 60° and 120°. (4) The calculated results with JENDL-5.0 were generally consistent with the measurements.
Keywords: Leakage neutron spectra, uranium, D-T neutron source, evaluated nuclear data
An Innovative K-Anonymity Privacy-Preserving Algorithm to Improve Data Availability in the Context of Big Data
4
Authors: Linlin Yuan, Tiantian Zhang, Yuling Chen, Yuxiang Yang, Huang Li. Computers, Materials & Continua (SCIE, EI), 2024, No. 4, pp. 1561-1579 (19 pages)
The development of technologies such as big data and blockchain has brought convenience to life, but at the same time privacy and security issues are becoming increasingly prominent. The K-anonymity algorithm is an effective privacy-preserving algorithm with low computational complexity that can safeguard users' privacy by anonymizing big data. However, the algorithm currently suffers from focusing only on improving user privacy while ignoring data availability. In addition, ignoring the impact of quasi-identifier attributes on sensitive attributes reduces the usability of the processed data for statistical analysis. Based on this, we propose a new K-anonymity algorithm to solve the privacy-security problem in the context of big data while guaranteeing improved data usability. Specifically, we construct a new information loss function based on information quantity theory. Considering that different quasi-identifier attributes have different impacts on sensitive attributes, we set weights for each quasi-identifier attribute when designing the information loss function. In addition, to reduce information loss, we improve K-anonymity in two ways. First, we make the information loss smaller than in the original table while guaranteeing privacy, based on common artificial intelligence algorithms, i.e., the greedy algorithm and the 2-means clustering algorithm. Second, we improve the 2-means clustering algorithm by designing a mean-center method to select the initial center of mass. Meanwhile, we design the K-anonymity algorithm of this scheme based on the constructed information loss function, the improved 2-means clustering algorithm, and the greedy algorithm, which reduces information loss. Finally, we experimentally demonstrate the effectiveness of the algorithm in improving the effect of 2-means clustering and reducing information loss.
Keywords: Blockchain, big data, K-anonymity, 2-means clustering, greedy algorithm, mean-center method
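A hedged sketch of a weighted information-loss function of the general shape the abstract describes: per-attribute weights applied to normalized generalization ranges of an anonymized equivalence class. The exact formula in the paper may differ; the attributes, domains, and weights below are invented for illustration.

```python
# Weighted information-loss sketch (formula shape and weights are assumptions).

def weighted_info_loss(group, domains, weights):
    """Information loss of one anonymized equivalence class.

    For each quasi-identifier, loss is the generalized interval width
    divided by the attribute's full domain width, scaled by its weight.
    """
    loss = 0.0
    for attr, wgt in weights.items():
        values = [rec[attr] for rec in group]
        lo, hi = domains[attr]
        loss += wgt * (max(values) - min(values)) / (hi - lo)
    return loss

records = [{"age": 30, "zip": 10}, {"age": 40, "zip": 12}]
domains = {"age": (0, 100), "zip": (0, 100)}
weights = {"age": 0.7, "zip": 0.3}   # age assumed to influence the sensitive attribute more
print(round(weighted_info_loss(records, domains, weights), 3))
```

A clustering or greedy grouping step would then prefer equivalence classes that minimize this weighted loss, trading anonymity group shape against statistical usability.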
Performance Analysis of Support Vector Machine (SVM) on Challenging Datasets for Forest Fire Detection
5
Authors: Ankan Kar, Nirjhar Nath, Utpalraj Kemprai, Aman. International Journal of Communications, Network and System Sciences, 2024, No. 2, pp. 11-29 (19 pages)
This article delves into the analysis of the performance and utilization of Support Vector Machines (SVMs) for the critical task of forest fire detection using image datasets. With the increasing threat of forest fires to ecosystems and human settlements, the need for rapid and accurate detection systems is of utmost importance. SVMs, renowned for their strong classification capabilities, exhibit proficiency in recognizing patterns associated with fire within images. By training on labeled data, SVMs acquire the ability to identify distinctive attributes associated with fire, such as flames, smoke, or alterations in the visual characteristics of the forest area. The article thoroughly examines the use of SVMs, covering crucial elements like data preprocessing, feature extraction, and model training. It rigorously evaluates parameters such as accuracy, efficiency, and practical applicability. The knowledge gained from this study aids in the development of efficient forest fire detection systems, enabling prompt responses and improving disaster management. Moreover, the correlation between SVM accuracy and the difficulties presented by high-dimensional datasets is carefully investigated and demonstrated through a revealing case study. The relationship between accuracy scores and the different resolutions used for resizing the training datasets is also discussed. These comprehensive studies result in a definitive overview of the difficulties faced and the potential sectors requiring further improvement and focus.
Keywords: Support Vector Machine, challenging datasets, forest fire detection, classification
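A hedged sketch of the core classifier: a linear SVM trained with hinge-loss sub-gradient descent on toy two-number "image features" (mean redness, mean brightness). The features, data, and hyper-parameters are invented for illustration; the article works with real image datasets, and a library SVM would normally be used in practice.

```python
# Linear SVM via hinge-loss sub-gradient descent (toy data; all features invented).

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """y in {-1, +1}; returns weights w and bias b (Pegasos-style updates)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:      # inside the margin: hinge-loss gradient step
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:               # outside: only the regularizer shrinks w
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

# fire-like images score high on both toy features
X = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.3], [0.1, 0.2]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
pred = [1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else -1 for x in X]
print(pred)
```

On real images, the two toy numbers would be replaced by a high-dimensional feature vector, which is exactly where the article's accuracy-versus-dimensionality discussion becomes relevant.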
Evaluating the Efficacy of Latent Variables in Mitigating Data Poisoning Attacks in the Context of Bayesian Networks: An Empirical Study
6
Authors: Shahad Alzahrani, Hatim Alsuwat, Emad Alsuwat. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 5, pp. 1635-1654 (20 pages)
Bayesian networks are a powerful class of graphical decision models used to represent causal relationships among variables. However, the reliability and integrity of learned Bayesian network models are highly dependent on the quality of incoming data streams. One of the primary challenges with Bayesian networks is their vulnerability to adversarial data poisoning attacks, wherein malicious data is injected into the training dataset to negatively influence the Bayesian network models and impair their performance. In this research paper, we propose an efficient framework for detecting data poisoning attacks against Bayesian network structure learning algorithms. Our framework utilizes latent variables to quantify the amount of belief between every two nodes in each causal model over time. We use this methodology to tackle an important issue with data poisoning attacks in the context of Bayesian networks. With regard to four different forms of data poisoning attacks, we specifically aim to strengthen the security and dependability of Bayesian network structure learning techniques, such as the PC algorithm. In doing so, we explore the complexity of this area and offer workable methods for identifying and mitigating these covert threats. Additionally, our research investigates one particular use case, the "Visit to Asia" network, exploring the practical consequences of using uncertainty as a way to spot cases of data poisoning. Our results demonstrate the promising efficacy of latent variables in detecting and mitigating the threat of data poisoning attacks. Additionally, our proposed latent-based framework proves to be sensitive in detecting malicious data poisoning attacks in the context of streaming data.
Keywords: Bayesian networks, data poisoning attacks, latent variables, structure learning algorithms, adversarial attacks
A Cross-matching Service for Data Center of Xinjiang Astronomical Observatory
7
Authors: Hai-Long Zhang, Jie Wang, Xin-Chen Ye, Wan-Qiong Wang, Jia Li, Ya-Zhou Zhang, Xu Du, Han Wu, Ting Zhang. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 1, pp. 119-127 (9 pages)
Cross-matching is a key technique for achieving the fusion of multi-band astronomical catalogs. Owing to differences in equipment such as various astronomical telescopes, the existence of measurement errors, and the proper motions of celestial bodies, the same celestial object will have different positions in different catalogs, making it difficult to integrate multi-band or full-band astronomical data. In this study, we propose an online cross-matching method based on pseudo-spherical indexing techniques and develop a service combined with a high-performance computing system (Taurus) to improve cross-matching efficiency, designed for the Data Center of Xinjiang Astronomical Observatory. Specifically, we use the Quad Tree Cube to divide the sphere into blocks for celestial objects, map the 2D space composed of R.A. and decl. to a 1D space, and achieve correspondence between real celestial objects and spherical patches. Finally, we verify the performance of the service using the Gaia 3 and PPMXL catalogs. Meanwhile, we send the matching results to the VO tools Topcat and Aladin respectively to obtain visual results. The experimental results show that the service effectively solves the speed bottleneck of cross-matching caused by frequent I/O and significantly improves the retrieval and matching speed of massive astronomical data.
Keywords: virtual observatory tools, astronomical databases: miscellaneous, catalogs
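A hedged sketch of the idea behind pseudo-spherical indexing: flatten (R.A., decl.) into a 1-D cell id so candidate matches come from a cell lookup instead of an all-pairs scan. A plain lat/long grid and planar distance stand in for the paper's Quad Tree Cube scheme and true angular separation; cell size and match radius are illustrative.

```python
# 2D-to-1D sky indexing sketch (flat grid stands in for Quad Tree Cube).
import math
from collections import defaultdict

CELL = 0.5  # cell size in degrees; an illustrative choice

def cell_index(ra, dec):
    """Flatten a 2-D sky position into a single integer cell id."""
    return int((dec + 90.0) // CELL) * int(360 / CELL) + int(ra // CELL)

def cross_match(cat_a, cat_b, radius=0.1):
    index = defaultdict(list)
    for j, (ra, dec) in enumerate(cat_b):
        index[cell_index(ra, dec)].append(j)
    matches = []
    for i, (ra, dec) in enumerate(cat_a):
        for dra in (-CELL, 0.0, CELL):        # scan the 3x3 neighborhood
            for ddec in (-CELL, 0.0, CELL):
                for j in index.get(cell_index(ra + dra, dec + ddec), ()):
                    rb, db = cat_b[j]
                    # planar distance: a simplification of angular separation
                    if math.hypot(ra - rb, dec - db) <= radius:
                        matches.append((i, j))
    return matches

a = [(10.00, -30.00), (250.3, 45.2)]
b = [(10.02, -30.01), (120.0, 0.0)]
print(cross_match(a, b))
```

The payoff is that each source touches only a handful of cells, so the I/O and comparison cost grows with the local source density rather than with the full catalog size.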
Cloud Datacenter Selection Using Service Broker Policies: A Survey
8
Authors: Salam Al-E'mari, Yousef Sanjalawe, Ahmad Al-Daraiseh, Mohammad Bany Taha, Mohammad Aladaileh. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 4, pp. 1-41 (41 pages)
Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers, whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB) orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges like costs and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse underscores the paramount importance of adept CSB policies in DC selection, highlighting the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
Keywords: Cloud computing, cloud service broker, datacenter selection, quality-of-service, user request
PSRDP: A Parallel Processing Method for Pulsar Baseband Data
9
Authors: Ya-Zhou Zhang, Hai-Long Zhang, Jie Wang, Xin-Chen Ye, Shuang-Qiang Wang, Xu Du, Han Wu, Ting Zhang, Shao-Cong Guo, Meng Zhang. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 1, pp. 300-310 (11 pages)
To address the problem of real-time processing of ultra-wide-bandwidth pulsar baseband data, we designed and implemented a pulsar baseband data processing algorithm (PSRDP) based on GPU parallel computing technology. PSRDP can perform operations such as baseband data unpacking, channel separation, coherent dedispersion, Stokes detection, phase and folding period prediction, and folding integration on GPU clusters. We tested the algorithm using J0437-4715 pulsar baseband data generated by the CASPSR and Medusa backends of the Parkes telescope, and J0332+5434 pulsar baseband data generated by the self-developed backend of the Nan Shan Radio Telescope, obtaining the pulse profiles of each baseband dataset. Through experimental analysis, we found that the pulse profiles generated by the PSRDP algorithm are essentially consistent with the processing results of the Digital Signal Processing Software for Pulsar Astronomy (DSPSR), which verifies the effectiveness of the PSRDP algorithm. Furthermore, using the same baseband data, we compared the processing speed of PSRDP with that of DSPSR, and the results showed that PSRDP was no slower than DSPSR. The theoretical and technical experience gained from the PSRDP algorithm research in this article lays a technical foundation for the real-time processing of ultra-wide-bandwidth pulsar baseband data from the QTT (Qi Tai radio Telescope).
Keywords: (stars:) pulsars: general, methods: data analysis, techniques: miscellaneous
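A hedged sketch of the folding-integration step named in the abstract: samples of a periodic signal are averaged into phase bins of the known pulse period. Real pipelines fold dedispersed, channelized baseband data on GPUs; this toy folds a plain 1-D series, with time measured in sample units so the arithmetic stays exact.

```python
# Folding-integration sketch (toy 1-D series; real pipelines fold channelized data).

def fold(samples, dt, period, nbins=8):
    """Average `samples` (spacing `dt`) into `nbins` phase bins of `period`."""
    sums = [0.0] * nbins
    counts = [0] * nbins
    for k, s in enumerate(samples):
        phase = (k * dt % period) / period          # phase in [0, 1)
        b = int(phase * nbins)
        sums[b] += s
        counts[b] += 1
    return [t / c if c else 0.0 for t, c in zip(sums, counts)]

period, dt = 8.0, 1.0          # an 8-sample pulse period, unit sampling
# a pulse in the third phase bin of every rotation, zero elsewhere
series = [1.0 if k % 8 == 2 else 0.0 for k in range(800)]
profile = fold(series, dt, period)
print(profile)
```

Averaging 100 rotations into one profile is what lifts a weak pulse out of the noise; the GPU version in the paper does the same accumulation per frequency channel in parallel.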
A Blind Batch Encryption and Public Ledger-Based Protocol for Sharing Sensitive Data
10
Authors: Zhiwei Wang, Nianhua Yang, Qingqing Chen, Wei Shen, Zhiying Zhang. China Communications (SCIE, CSCD), 2024, No. 1, pp. 310-322 (13 pages)
For the goals of security and privacy preservation, we propose a blind-batch-encryption- and public-ledger-based data sharing protocol that allows the integrity of sensitive data to be audited by a public ledger and allows privacy information to be preserved. Data owners can tightly manage their data with efficient revocation and grant only one-time adaptive access for the fulfillment of the requester. We prove that our protocol is semantically secure, blind, and secure against oblivious requesters and malicious file keepers. We also provide security analysis in the context of four typical attacks.
Keywords: blind batch encryption, data sharing, one-time adaptive access, public ledger, security and privacy
Sports Prediction Model through Cloud Computing and Big Data Based on Artificial Intelligence Method
11
Authors: Aws I. Abu Eid, Achraf Ben Miled, Ahlem Fatnassi, Majid A. Nawaz, Ashraf F. A. Mahmoud, Faroug A. Abdalla, Chams Jabnoun, Aida Dhibi, Firas M. Allan, Mohammed Ahmed Elhossiny, Salem Belhaj, Imen Ben Mohamed. Journal of Intelligent Learning Systems and Applications, 2024, No. 2, pp. 53-79 (27 pages)
This article delves into the intricate relationship between big data, cloud computing, and artificial intelligence, shedding light on their fundamental attributes and interdependence. It explores the seamless amalgamation of AI methodologies within cloud computing and big data analytics, encompassing the development of a cloud computing framework built on the robust foundation of the Hadoop platform, enriched by AI learning algorithms. Additionally, it examines the creation of a predictive model empowered by tailored artificial intelligence techniques. Rigorous simulations are conducted to extract valuable insights, facilitating method evaluation and performance assessment, all within the dynamic Hadoop environment, thereby reaffirming the precision of the proposed approach. The results and analysis section reveals compelling findings derived from comprehensive simulations within the Hadoop environment. These outcomes demonstrate the efficacy of the Sport AI Model (SAIM) framework in enhancing the accuracy of sports-related outcome predictions. Through meticulous mathematical analyses and performance assessments, integrating AI with big data emerges as a powerful tool for optimizing decision-making in sports. The discussion section extends the implications of these results, highlighting the potential for SAIM to revolutionize sports forecasting, strategic planning, and performance optimization for players and coaches. The combination of big data, cloud computing, and AI offers a promising avenue for future advancements in sports analytics. This research underscores the synergy between these technologies and paves the way for innovative approaches to sports-related decision-making and performance enhancement.
Keywords: Artificial intelligence, machine learning, Apache Spark, big data, SAIM
A NOVEL STOCHASTIC HEPATITIS B VIRUS EPIDEMIC MODEL WITH SECOND-ORDER MULTIPLICATIVE α-STABLE NOISE AND REAL DATA
12
Authors: Anwarud Din, Yassine Sabbar, Peng Wu. Acta Mathematica Scientia (SCIE, CSCD), 2024, No. 2, pp. 752-788 (37 pages)
This work presents an advanced and detailed analysis of the mechanisms of hepatitis B virus (HBV) propagation in an environment characterized by variability and stochasticity. Based on some biological features of the virus and the stated assumptions, the corresponding deterministic model is formulated, which takes into consideration the effect of vaccination. This deterministic model is extended to a stochastic framework by considering a new form of disturbance which makes it possible to simulate strong and significant fluctuations. The long-term behaviors of the virus are predicted by using stochastic differential equations with second-order multiplicative α-stable jumps. By developing the assumptions and employing novel theoretical tools, the threshold parameter responsible for ergodicity (persistence) and extinction is provided. The theoretical results of the current study are validated by numerical simulations, and parameter estimation is also performed. Moreover, we obtain the following new and interesting findings: (a) in each class, the average time depends on the value of α; (b) the second-order noise has an inverse effect on the spread of the virus; (c) the shapes of the population densities at the stationary level change quickly at certain values of α. These three conclusions can provide a solid research base for further investigation in the field of biological and ecological modeling.
Keywords: HBV model, nonlinear perturbation, probabilistic bifurcation, long-run forecast, numerical simulation
Robust and Trustworthy Data Sharing Framework Leveraging On-Chain and Off-Chain Collaboration
13
Authors: Jinyang Yu, Xiao Zhang, Jinjiang Wang, Yuchen Zhang, Yulong Shi, Linxuan Su, Leijie Zeng. Computers, Materials & Continua (SCIE, EI), 2024, No. 2, pp. 2159-2179 (21 pages)
The proliferation of Internet of Things (IoT) systems has resulted in the generation of substantial data, presenting new challenges in reliable storage and trustworthy sharing. Conventional distributed storage systems are hindered by centralized management and lack traceability, while blockchain systems are limited by low capacity and high latency. To address these challenges, the present study investigates the reliable storage and trustworthy sharing of IoT data and presents a novel system architecture that integrates on-chain and off-chain data management systems. This architecture, integrating blockchain and distributed storage technologies, provides high-capacity, high-performance, traceable, and verifiable data storage and access. The on-chain system, built on Hyperledger Fabric, manages the metadata, verification data, and permission information of the raw data. The off-chain system, implemented using IPFS Cluster, ensures reliable storage of and efficient access to massive files. A collaborative storage server is designed to integrate on-chain and off-chain operation interfaces, facilitating comprehensive data operations. We provide a unified access interface for user-friendly system interaction. Extensive testing validates the system's reliability and stable performance. The proposed approach significantly enhances storage capacity compared to standalone blockchain systems, and rigorous reliability tests consistently yield positive outcomes. With average upload and download throughputs of roughly 20 and 30 MB/s, respectively, the system's throughput surpasses that of the blockchain system by a factor of 4 to 18.
Keywords: On-chain and off-chain collaboration, blockchain, distributed storage system, Hyperledger Fabric, IPFS Cluster
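A hedged sketch of the on-chain/off-chain split described in the abstract: bulk data lives in an off-chain store while the chain keeps only its hash and owner metadata for integrity checks. Plain dicts and a list stand in for IPFS Cluster and Hyperledger Fabric; none of their real APIs appear here.

```python
# On-chain/off-chain split sketch (dicts stand in for Fabric and IPFS).
import hashlib

off_chain = {}   # content-addressed file store (IPFS stand-in)
on_chain = []    # append-only metadata ledger (Fabric stand-in)

def share(owner, payload):
    """Store the payload off-chain; record its hash and owner on-chain."""
    cid = hashlib.sha256(payload).hexdigest()
    off_chain[cid] = payload
    on_chain.append({"owner": owner, "cid": cid})
    return cid

def fetch(cid):
    """Return the payload only if it is recorded on-chain and untampered."""
    payload = off_chain.get(cid)
    recorded = any(r["cid"] == cid for r in on_chain)
    intact = payload is not None and hashlib.sha256(payload).hexdigest() == cid
    return payload if (recorded and intact) else None

cid = share("sensor-42", b"temperature=21.5")
assert fetch(cid) == b"temperature=21.5"
off_chain[cid] = b"tampered"     # off-chain tampering...
print(fetch(cid))                 # ...fails the on-chain hash check
```

The design point is that the ledger stays small (one hash per file) while integrity and traceability still cover the full off-chain payload.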
Multi-Perspective Data Fusion Framework Based on Hierarchical BERT: Provide Visual Predictions of Business Processes
14
Authors: Yongwang Yuan, Xiangwei Liu, Ke Lu. Computers, Materials & Continua (SCIE, EI), 2024, No. 1, pp. 1227-1252 (26 pages)
Predictive Business Process Monitoring (PBPM) is a significant research area in Business Process Management (BPM) aimed at accurately forecasting future behavioral events. At present, deep learning methods are widely used in PBPM research, but no method has been effective in fusing data information into the control flow for multi-perspective process prediction. Therefore, this paper proposes a process prediction method based on hierarchical BERT and multi-perspective data fusion. First, the first-layer BERT network learns the correlations between different categories of attribute data. Then, the attribute data is integrated into a weighted event-level feature vector and input into the second-layer BERT network to learn the impact and priority relationships of each event on future predicted events. Next, the multi-head attention mechanism within the framework is visualized for analysis, helping to understand the decision-making logic of the framework and providing visual predictions. Finally, experimental results show that the predictive accuracy of the framework surpasses current state-of-the-art research methods and significantly enhances the predictive performance of BPM.
Keywords: Business process prediction monitoring, deep learning, attention mechanism, BERT, multi-perspective
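A hedged sketch of the event-level fusion step: per-attribute feature vectors are combined into one weighted event vector before the second-level sequence model sees it. The weights and dimensions below are illustrative; in the paper the weighting emerges from the first-layer BERT rather than being fixed by hand.

```python
# Weighted event-level fusion sketch (weights are illustrative, not learned here).

def fuse_event(attribute_vectors, weights):
    """Normalized weighted sum of equal-length attribute vectors."""
    total = sum(weights)
    fused = [0.0] * len(attribute_vectors[0])
    for vec, wgt in zip(attribute_vectors, weights):
        for i, v in enumerate(vec):
            fused[i] += (wgt / total) * v
    return fused

# toy event: activity, resource, and timestamp embeddings (2-D)
event = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
print(fuse_event(event, weights=[2.0, 1.0, 1.0]))
```

The fused vector gives the second-level model one token per event, which is what lets attribute information flow into the control-flow sequence.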
Review of artificial intelligence applications in astronomical data processing
15
Authors: Hailong Zhang, Jie Wang, Yazhou Zhang, Xu Du, Han Wu, Ting Zhang. Astronomical Techniques and Instruments (CSCD), 2024, No. 1, pp. 1-15 (15 pages)
Artificial Intelligence (AI) is an interdisciplinary research field with widespread applications. It aims at developing theoretical, methodological, technological, and applied systems that simulate, enhance, and assist human intelligence. Recently, notable accomplishments of artificial intelligence technology have been achieved in astronomical data processing, establishing this technology as central to numerous astronomical research areas such as radio astronomy, stellar and Galactic (Milky Way) studies, exoplanet surveys, cosmology, and solar physics. This article systematically reviews representative applications of artificial intelligence technology to astronomical data processing, with comprehensive descriptions of specific cases: pulsar candidate identification, fast radio burst detection, gravitational wave detection, spectral classification, and radio frequency interference mitigation. Furthermore, it discusses possible future applications to provide perspectives for astronomical research in the artificial intelligence era.
Keywords: Astronomical techniques, astronomical methods, astroinformatics
A Prediction-Based Multi-Objective VM Consolidation Approach for Cloud Data Centers
16
Authors: Xialin Liu, Junsheng Wu, Lijun Chen, Jiyuan Hu. Computers, Materials & Continua (SCIE, EI), 2024, No. 7, pp. 1601-1631 (31 pages)
Virtual machine (VM) consolidation aims to run VMs on the least number of physical machines (PMs). Optimal consolidation significantly reduces energy consumption (EC), quality-of-service (QoS) degradation in applications, and resource wastage. This paper proposes a prediction-based multi-objective VM consolidation approach to search for the best mapping between VMs and PMs with good timeliness and practical value. We use a hybrid model based on Auto-Regressive Integrated Moving Average (ARIMA) and Support Vector Regression (SVR), called HPAS, as the prediction model, and consolidate VMs onto PMs based on the prediction results of HPAS, aiming to minimize the total EC, performance degradation (PD), migration cost (MC), and resource wastage (RW) simultaneously. Experimental results using a Microsoft Azure trace show that the proposed approach has better prediction accuracy and outperforms the multi-objective consolidation approach without prediction (i.e., Non-dominated Sorting Genetic Algorithm 2, NSGA-2) and renowned Overload Host Detection (OHD) approaches without prediction, such as Linear Regression (LR), Median Absolute Deviation (MAD), and Inter-Quartile Range (IQR).
Keywords: VM consolidation, prediction, multi-objective optimization, machine learning
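A hedged sketch of the hybrid-prediction idea: one model fits the utilization series, and a second model corrects its residuals. Ordinary least squares and a "last residual carries over" rule stand in for ARIMA and SVR, which need library support; the trace values are invented.

```python
# Hybrid forecast sketch (least squares + residual rule stand in for ARIMA + SVR).

def fit_line(ys):
    """Least-squares line through (0, y0), (1, y1), ..."""
    n = len(ys)
    mx, my = (n - 1) / 2, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in enumerate(ys))
             / sum((x - mx) ** 2 for x in range(n)))
    return slope, my - slope * mx

def hybrid_forecast(history):
    slope, intercept = fit_line(history)
    fitted = [intercept + slope * t for t in range(len(history))]
    residuals = [y - f for y, f in zip(history, fitted)]
    # residual model: assume the newest residual persists one step ahead
    return intercept + slope * len(history) + residuals[-1]

cpu = [20, 22, 25, 27, 30, 31, 34]   # toy CPU-utilization trace (%)
print(round(hybrid_forecast(cpu), 2))
```

A consolidation manager would feed such one-step-ahead forecasts into its placement search, migrating VMs away from hosts predicted to overload rather than reacting after the fact.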
A Non-Parametric Scheme for Identifying Data Characteristic Based on Curve Similarity Matching
17
Authors: Quanbo Ge, Yang Cheng, Hong Li, Ziyi Ye, Yi Zhu, Gang Yao. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, No. 6, pp. 1424-1437 (14 pages)
To accurately identify the distribution characteristics of Gaussian-like noises in unmanned aerial vehicle (UAV) state estimation, this paper proposes a non-parametric scheme based on curve similarity matching. In the framework of the proposed scheme, a Parzen window (kernel density estimation, KDE) method using sliding window technology is applied to roughly estimate the sample probability density, a precise data probability density function (PDF) model is constructed with the least squares method and K-fold cross-validation, and the testing result based on the evaluation method is obtained from analyses of data characteristics such as curve shape, abruptness, and symmetry. Comparison simulations with classical methods and a UAV flight experiment show that the proposed scheme has higher recognition accuracy than classical methods for some kinds of Gaussian-like data, which provides a better reference for the design of Kalman filters (KF) in complex water environments.
Keywords: curve similarity matching; Gaussian-like noise; non-parametric scheme; Parzen window
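The Parzen-window step described in the abstract can be illustrated compactly: a Gaussian kernel is placed on each sample in the sliding window and the kernels are averaged to form a rough density estimate. This is a minimal sketch only; the bandwidth value, the toy noise window, and the function name are assumptions, and the paper's curve-matching and cross-validation stages are not shown.

```python
# Minimal Parzen-window (Gaussian KDE) density estimate over a
# sliding window of noise samples. Bandwidth and data are illustrative.
import math

def parzen_density(x, samples, bandwidth=0.5):
    """Gaussian-kernel density estimate at point x."""
    norm = 1.0 / (math.sqrt(2 * math.pi) * bandwidth * len(samples))
    return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                      for s in samples)

window = [-0.4, -0.1, 0.0, 0.2, 0.3]   # sliding window of noise samples
print(round(parzen_density(0.0, window), 3))   # density near the samples
print(round(parzen_density(3.0, window), 6))   # density far from the samples
```

The resulting density curve is what the scheme then compares, via similarity matching, against candidate PDF shapes.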
Autonomous UAV 3D trajectory optimization and transmission scheduling for sensor data collection on uneven terrains
18
Authors: Andrey V. Savkin, Satish C. Verma, Wei Ni — Defence Technology, SCIE EI CAS CSCD, 2023, Issue 12, pp. 154-160 (7 pages)
This paper considers a time-constrained problem in which an Unmanned Aerial Vehicle (UAV), a typical Unmanned Aerial System (UAS), collects data from a network of ground sensors located on uneven terrain. The ground sensors harvest renewable energy and are equipped with batteries and data buffers; the sensor model takes both buffer and battery limitations into account. An asymptotically globally optimal method of joint UAV 3D trajectory optimization and data transmission scheduling is developed. The method maximizes the amount of data transmitted to the UAV without losses or excessive delays while minimizing the UAV's propulsion energy. The algorithm for optimal trajectory optimization and transmission scheduling is based on dynamic programming, and computer simulations demonstrate its effectiveness.
Keywords: unmanned aerial system (UAS); unmanned aerial vehicle (UAV); wireless sensor networks; UAS-assisted data collection; 3D trajectory optimization; data transmission scheduling
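The dynamic-programming idea can be sketched on a toy version of the problem: discretise the flight path into stages, allow a few candidate altitudes per stage, and recursively keep the best trade-off between data collected and energy spent on altitude changes. Everything below (altitudes, data rates, costs, weights) is hypothetical illustration, not the paper's model.

```python
# Hypothetical DP sketch: stages along the UAV path, a few candidate
# altitudes per stage; the recursion maximises collected data minus a
# weighted climb-energy cost. All numbers are made up.

ALTITUDES = [20, 40, 60]                  # candidate altitudes (m)
DATA_RATE = {20: 5.0, 40: 3.5, 60: 2.0}   # Mb collected per stage
CLIMB_COST = 0.1                          # energy per metre of altitude change

def plan(stages, weight=1.0):
    """Best objective value over all altitude sequences of given length."""
    # best[h] = max objective of reaching the current stage at altitude h
    best = {h: DATA_RATE[h] for h in ALTITUDES}
    for _ in range(stages - 1):
        best = {
            h: DATA_RATE[h] + max(
                best[p] - weight * CLIMB_COST * abs(h - p) for p in ALTITUDES
            )
            for h in ALTITUDES
        }
    return max(best.values())

print(plan(4))
```

With these toy numbers the optimum is simply to stay low at every stage; in the paper's full formulation the state also tracks sensor buffers, batteries, and the transmission schedule.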
Human Stress Recognition by Correlating Vision and EEG Data
19
Authors: S. Praveenkumar, T. Karthick — Computer Systems Science & Engineering, SCIE EI, 2023, Issue 6, pp. 2417-2433 (17 pages)
Because stress has such a powerful impact on human health, we must be able to identify it automatically in our everyday lives. A human activity recognition (HAR) system uses data from several kinds of sensors to recognize and evaluate human actions automatically. Using the multimodal DEAP dataset (Database for Emotion Analysis using Physiological Signals), this paper presents a deep learning (DL) technique for effectively detecting human stress. Combining vision-based and sensor-based approaches to stress recognition improves the efficiency of current recognition systems and allows probable actions to be predicted in advance. Based on visual and EEG (electroencephalogram) data, this research aims to enhance performance and extract the dominant characteristics of stress detection. For the stress identification test, we utilized the DEAP dataset, which includes video and EEG data, and we demonstrate that combining video and EEG characteristics increases overall performance, with the suggested stochastic features providing the most accurate results. In the first step, a CNN (Convolutional Neural Network) extracts feature vectors from the video frames and the EEG data; feature-level (FL) fusion then combines the features extracted from the two modalities. We use XGBoost as the classifier model to predict stress. The stress recognition accuracy of the proposed method is compared to that of existing methods: Decision Tree (DT), Random Forest (RF), AdaBoost, Linear Discriminant Analysis (LDA), and K-Nearest Neighbors (KNN). Compared with existing state-of-the-art approaches, the suggested DL methodology combining multimodal and heterogeneous inputs may improve stress identification.
Keywords: mental stress; physiological data; XGBoost; feature fusion; DEAP; video data; EEG; CNN; HAR
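Feature-level fusion as described above amounts to concatenating the per-sample video and EEG feature vectors before classification. The toy sketch below illustrates only that step: the CNN extractor and XGBoost classifier of the paper are replaced by fixed made-up feature vectors and a nearest-centroid rule, so every number and name here is a hypothetical placeholder.

```python
# Toy feature-level (FL) fusion: concatenate video and EEG feature
# vectors, then classify with a simple nearest-centroid rule.
# Feature values are invented for illustration.

def fuse(video_feat, eeg_feat):
    """Feature-level fusion: concatenation of the two vectors."""
    return video_feat + eeg_feat

def nearest_centroid(sample, centroids):
    """Return the label whose centroid is closest in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

centroids = {
    "stressed": fuse([0.9, 0.8], [0.7, 0.9]),
    "calm":     fuse([0.1, 0.2], [0.3, 0.1]),
}
sample = fuse([0.8, 0.7], [0.6, 0.8])   # fused features of a new recording
print(nearest_centroid(sample, centroids))
```

The design point FL fusion illustrates is that the classifier sees one joint vector, so it can exploit cross-modal correlations that separate per-modality classifiers would miss.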
Intelligent Electrocardiogram Analysis in Medicine:Data,Methods,and Applications
20
Authors: Yu-Xia Guan, Ying An, Feng-Yi Guo, Wei-Bai Pan, Jian-Xin Wang — Chinese Medical Sciences Journal, CAS CSCD, 2023, Issue 1, pp. 38-48 (11 pages)
Electrocardiogram (ECG) is a low-cost, simple, fast, and non-invasive test. It reflects the heart's electrical activity and provides valuable diagnostic clues about the health of the entire body. ECG has therefore been widely used in biomedical applications such as arrhythmia detection, disease-specific detection, mortality prediction, and biometric recognition. In recent years, ECG-related studies have been carried out using a variety of publicly available datasets, with many differences in the datasets used, data preprocessing methods, targeted challenges, and modeling and analysis techniques. Here we systematically summarize and analyze ECG-based automatic analysis methods and applications. Specifically, we first review 22 commonly used public ECG datasets and provide an overview of data preprocessing processes. We then describe some of the most widely used applications of ECG signals and analyze the advanced methods involved in these applications. Finally, we elucidate some of the challenges in ECG analysis and provide suggestions for further research.
Keywords: electrocardiogram; database; preprocessing; machine learning; medical big data analysis
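A preprocessing step common in ECG pipelines of the kind this survey covers is baseline-wander removal; one simple (if crude) version subtracts a moving-average estimate of the slowly drifting baseline from the raw trace. The sketch below uses a synthetic drift-plus-oscillation signal, so the window width and all signal parameters are illustrative assumptions, not values from the paper.

```python
# Crude baseline-wander removal: subtract a centered moving average
# (the estimated baseline) from the raw signal. Synthetic data only.
import math

def moving_average(signal, width):
    """Centered moving average with edge clamping."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def remove_baseline(signal, width=51):
    baseline = moving_average(signal, width)
    return [s - b for s, b in zip(signal, baseline)]

# Synthetic trace: slow linear drift plus a fast heartbeat-like oscillation.
t = [i / 100 for i in range(400)]                            # 100 Hz, 4 s
raw = [0.5 * x + 0.1 * math.sin(2 * math.pi * 8 * x) for x in t]
clean = remove_baseline(raw)
print(round(max(clean, key=abs), 3))   # residual amplitude after detrending
```

After subtraction the large drift is gone and only the fast component (plus small edge artifacts) remains; real pipelines typically use high-pass or median filtering for the same purpose.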