Journal Articles
336,918 articles found
Uncertainties of ENSO-related Regional Hadley Circulation Anomalies within Eight Reanalysis Datasets
1
Authors: Yadi LI, Xichen LI, Juan FENG, Yi ZHOU, Wenzhu WANG, Yurong HOU. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, No. 1, pp. 115-140 (26 pages)
El Niño-Southern Oscillation (ENSO), the leading mode of global interannual variability, usually intensifies the Hadley Circulation (HC) while constraining its meridional extension, leading to an equatorward movement of the jet system. Previous studies have investigated the response of the HC to ENSO events using different reanalysis datasets and evaluated their capability in capturing the main features of ENSO-associated HC anomalies. However, these studies mainly focused on the global HC, represented by a zonal-mean mass stream function (MSF). Comparatively fewer studies have evaluated HC responses from a regional perspective, partly due to the prerequisite of the Stokes MSF, which prevents us from integrating a regional HC. In this study, we adopt a recently developed technique to construct the three-dimensional structure of the HC and evaluate the capability of eight state-of-the-art reanalyses in reproducing the regional HC response to ENSO events. Results show that all eight reanalyses reproduce the spatial structure of HC responses well, with an intensified HC around the central-eastern Pacific but weakened circulations around the Indo-Pacific warm pool and tropical Atlantic. The spatial correlation coefficient of the three-dimensional HC anomalies among the different datasets is always larger than 0.93. However, these datasets may not capture the amplitudes of the HC responses well. This uncertainty is especially large for ENSO-associated equatorially asymmetric HC anomalies, with the maximum amplitude in the Climate Forecast System Reanalysis (CFSR) being about 2.7 times the minimum value in the Twentieth Century Reanalysis (20CR). One should be careful when using reanalysis data to evaluate the intensity of ENSO-associated HC anomalies.
Keywords: regional Hadley circulation; ENSO; atmosphere-ocean interaction; reanalysis data
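The abstract above refers to the zonal-mean mass stream function (MSF) used to represent the global HC. For reference, a standard textbook definition (not the paper's regional, three-dimensional construction) integrates the zonal-mean meridional wind $[\bar{v}]$ in pressure coordinates:

$$\Psi(\varphi, p) = \frac{2\pi a \cos\varphi}{g} \int_{0}^{p} [\bar{v}](\varphi, p')\,\mathrm{d}p',$$

where $a$ is Earth's radius, $g$ is the gravitational acceleration, $\varphi$ is latitude, and $p$ is pressure.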
Dealing with the Data Imbalance Problem in Pulsar Candidate Sifting Based on Feature Selection
2
Authors: Haitao Lin, Xiangru Li. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 2, pp. 125-137 (13 pages)
Pulsar detection has become an active research topic in radio astronomy recently. One of the essential procedures for pulsar detection is pulsar candidate sifting (PCS), a procedure for identifying potential pulsar signals in a survey. However, pulsar candidates are always class-imbalanced, as most candidates are non-pulsars such as RFI and only a tiny part of them are from real pulsars. Class imbalance can greatly affect the performance of machine learning (ML) models, resulting in a heavy cost as some real pulsars are misjudged. To deal with the problem, this work focuses on techniques of choosing relevant features to discriminate pulsars from non-pulsars, known as feature selection. Feature selection is a process of selecting a subset of the most relevant features from a feature pool. The distinguishing features between pulsars and non-pulsars can significantly improve the performance of the classifier even if the data are highly imbalanced. In this work, an algorithm for feature selection called the K-fold Relief-Greedy (KFRG) algorithm is designed. KFRG is a two-stage algorithm. In the first stage, it filters out irrelevant features according to their K-fold Relief scores, while in the second stage, it removes redundant features and selects the most relevant features by a forward greedy search strategy. Experiments on the data set of the High Time Resolution Universe survey verified that ML models based on KFRG are capable of PCS, correctly separating pulsars from non-pulsars even if the candidates are highly class-imbalanced.
Keywords: methods: data analysis; (stars:) pulsars: general; methods: statistical
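As an illustration of the two-stage filter-then-greedy selection described in the abstract, the following is a minimal sketch rather than the authors' KFRG implementation: mutual information stands in for the K-fold Relief score, and the RandomForestClassifier, cross-validation settings, and candidate-pool size are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

# Toy imbalanced data standing in for pulsar candidate features.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=6,
                           weights=[0.95, 0.05], random_state=0)

# Stage 1: filter out weakly relevant features by a relevance score
# (mutual information here; the paper uses K-fold Relief scores).
scores = mutual_info_classif(X, y, random_state=0)
candidates = [i for i in np.argsort(scores)[::-1] if scores[i] > 0][:10]

# Stage 2: forward greedy search, adding the feature that most improves
# cross-validated F1 until no remaining candidate helps.
selected, best_f1 = [], 0.0
clf = RandomForestClassifier(n_estimators=100, random_state=0)
improved = True
while improved and candidates:
    improved = False
    for f in list(candidates):
        trial = selected + [f]
        f1 = cross_val_score(clf, X[:, trial], y, cv=5, scoring="f1").mean()
        if f1 > best_f1:
            best_f1, best_feature, improved = f1, f, True
    if improved:
        selected.append(best_feature)
        candidates.remove(best_feature)

print("selected features:", selected, "CV F1:", round(best_f1, 3))
```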
Pavement Cracks Coupled With Shadows: A New Shadow-Crack Dataset and A Shadow-Removal-Oriented Crack Detection Approach (Cited: 2)
3
Authors: Lili Fan, Shen Li, Ying Li, Bai Li, Dongpu Cao, Fei-Yue Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, No. 7, pp. 1593-1607 (15 pages)
Automatic pavement crack detection is a critical task for maintaining pavement stability and driving safety. The task is challenging because shadows on the pavement may have an intensity similar to that of the crack, which interferes with crack detection performance. To date, there is still a lack of efficient algorithm models and training datasets to deal with the interference brought by shadows. To fill this gap, we made several contributions as follows. First, we proposed a new pavement shadow and crack dataset, which contains a variety of shadow and pavement pixel size combinations. It also covers all common cracks (linear cracks and network cracks), placing higher demands on crack detection methods. Second, we designed a two-step shadow-removal-oriented crack detection approach, SROCD, which improves the performance of the algorithm by first removing the shadow and then detecting the crack. In addition to shadows, the method can cope with other noise disturbances. Third, we explored the mechanism of how shadows affect crack detection. Based on this mechanism, we propose a data augmentation method based on the difference in brightness values, which can adapt to brightness changes caused by seasonal and weather variations. Finally, we introduced a residual feature augmentation algorithm to detect small cracks that can predict sudden disasters, and the algorithm improves the overall performance of the model. We compare our method with state-of-the-art methods on existing pavement crack datasets and the shadow-crack dataset, and the experimental results demonstrate the superiority of our method.
Keywords: automatic pavement crack detection; data augmentation compensation; deep learning; residual feature augmentation; shadow removal; shadow-crack dataset
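A minimal sketch of brightness-difference data augmentation in the general spirit described above (not the paper's exact method); the scaling factors and the use of OpenCV's HSV conversion are assumptions.

```python
import cv2
import numpy as np

def brightness_augment(image_bgr, factors=(0.6, 0.8, 1.2, 1.4)):
    """Generate augmented copies by rescaling the brightness (V) channel,
    simulating illumination changes such as shadows or weather/season shifts."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    out = []
    for f in factors:
        shifted = hsv.copy()
        shifted[..., 2] = np.clip(shifted[..., 2] * f, 0, 255)
        out.append(cv2.cvtColor(shifted.astype(np.uint8), cv2.COLOR_HSV2BGR))
    return out

# Usage (hypothetical file name): augmented = brightness_augment(cv2.imread("crack.jpg"))
```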
Heterogeneous decentralised machine unlearning with seed model distillation (Cited: 1)
4
Authors: Guanhua Ye, Tong Chen, Quoc Viet Hung Nguyen, Hongzhi Yin. CAAI Transactions on Intelligence Technology (SCIE, EI), 2024, No. 3, pp. 608-619 (12 pages)
As some recent information security legislation has endowed users with unconditional rights to be forgotten by any trained machine learning model, personalised IoT service providers have to take unlearning functionality into consideration. The most straightforward method to unlearn users' contribution is to retrain the model from the initial state, which is not realistic in high-throughput applications with frequent unlearning requests. Though some machine unlearning frameworks have been proposed to speed up the retraining process, they fail to match decentralised learning scenarios. A decentralised unlearning framework called the heterogeneous decentralised unlearning framework with seed (HDUS) is designed, which uses distilled seed models to construct erasable ensembles for all clients. Moreover, the framework is compatible with heterogeneous on-device models, representing stronger scalability in real-world applications. Extensive experiments on three real-world datasets show that our HDUS achieves state-of-the-art performance.
Keywords: data mining; data privacy; machine learning
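The framework above relies on seed models distilled from clients' on-device models. The following is a generic knowledge-distillation sketch (teacher-to-seed soft-label transfer), not the HDUS algorithm itself; the temperature, loss weighting, and model sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-label KL distillation (teacher -> seed/student) with hard-label CE."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy example: distil a larger on-device model into a small seed model.
teacher = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))
seed = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4))
opt = torch.optim.Adam(seed.parameters(), lr=1e-3)

x = torch.randn(32, 16)
y = torch.randint(0, 4, (32,))
with torch.no_grad():
    t_logits = teacher(x)
loss = distillation_loss(seed(x), t_logits, y)
loss.backward()
opt.step()
```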
Enhanced prediction of anisotropic deformation behavior using machine learning with data augmentation (Cited: 1)
5
Authors: Sujeong Byun, Jinyeong Yu, Seho Cheon, Seong Ho Lee, Sung Hyuk Park, Taekyung Lee. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 186-196 (11 pages)
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
Keywords: plastic anisotropy; compression; annealing; machine learning; data augmentation
Benchmark experiment on slab ^(238)U with D-T neutrons for validation of evaluated nuclear data (Cited: 1)
6
Authors: Yan-Yan Ding, Yang-Bo Nie, Yue Zhang, Zhi-Jie Hu, Qi Zhao, Huan-Yu Zhang, Kuo-Zhi Xu, Shi-Yu Zhang, Xin-Yi Pan, Chang-Lin Lan, Jie Ren, Xi-Chao Ruan. Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 145-159 (15 pages)
A benchmark experiment on ^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy. The leakage neutron spectra within energy levels of 0.8-16 MeV at 60° and 120° were measured using the time-of-flight method. The samples were prepared as rectangular slabs with a 30 cm square base and thicknesses of 3, 6, and 9 cm. The leakage neutron spectra were also calculated using the MCNP-4C program based on the latest evaluated files of ^(238)U neutron data from CENDL-3.2, ENDF/B-VIII.0, JENDL-5.0, and JEFF-3.3. Based on the comparison, the deficiencies and improvements in the ^(238)U evaluated nuclear data were analyzed. The results showed the following. (1) The calculated results for CENDL-3.2 significantly overestimated the measurements in the energy interval of elastic scattering at 60° and 120°. (2) The calculated results of CENDL-3.2 overestimated the measurements in the energy interval of inelastic scattering at 120°. (3) The calculated results for CENDL-3.2 significantly overestimated the measurements in the 3-8.5 MeV energy interval at 60° and 120°. (4) The calculated results with JENDL-5.0 were generally consistent with the measurement results.
Keywords: leakage neutron spectra; uranium; D-T neutron source; evaluated nuclear data
A landslide monitoring method using data from unmanned aerial vehicle and terrestrial laser scanning with insufficient and inaccurate ground control points (Cited: 1)
7
Authors: Jiawen Zhou, Nan Jiang, Congjiang Li, Haibo Li. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024, No. 10, pp. 4125-4140 (16 pages)
Non-contact remote sensing techniques, such as terrestrial laser scanning (TLS) and unmanned aerial vehicle (UAV) photogrammetry, have been globally applied for landslide monitoring in high and steep mountainous areas. These techniques acquire terrain data and enable ground deformation monitoring. However, the practical application of these technologies still faces many difficulties due to complex terrain, limited access and dense vegetation. For instance, when monitoring high and steep slopes, the TLS sightline can be obstructed, and the accuracy of the UAV model may be compromised by the absence of ground control points (GCPs). This paper proposes a TLS- and UAV-based method for monitoring landslide deformation in high mountain valleys using traditional real-time kinematics (RTK)-based control points (RCPs), low-precision TLS-based control points (TCPs) and assumed control points (ACPs) to achieve high-precision surface deformation analysis under obstructed vision and impassable conditions. The effects of GCP accuracy, GCP quantity and automatic tie point (ATP) quantity on the accuracy of UAV modeling and surface deformation analysis were comprehensively analyzed. The results show that the proposed method allows the monitoring accuracy of landslides to exceed the accuracy of the GCPs themselves by adding additional low-accuracy GCPs. The proposed method was implemented for monitoring the Xinhua landslide in Baoxing County, China, and was validated against data from multiple sources.
Keywords: landslide monitoring; data fusion; terrestrial laser scanning (TLS); unmanned aerial vehicle (UAV); model reconstruction
A 28/56 Gb/s NRZ/PAM-4 dual-mode transceiver with 1/4 rate reconfigurable 4-tap FFE and half-rate slicer in a 28-nm CMOS (Cited: 1)
8
Authors: Yukun He, Zhao Yuan, Kanan Wang, Renjie Tang, Yunxiang He, Xian Chen, Zhengyang Ye, Xiaoyan Gui. Journal of Semiconductors (EI, CAS, CSCD), 2024, No. 6, pp. 35-46 (12 pages)
A 28/56 Gb/s NRZ/PAM-4 dual-mode transceiver (TRx) designed in a 28-nm complementary metal-oxide-semiconductor (CMOS) process is presented in this article. A voltage-mode (VM) driver featuring a 4-tap reconfigurable feed-forward equalizer (FFE) is employed in the quarter-rate transmitter (TX). The half-rate receiver (RX) incorporates a continuous-time linear equalizer (CTLE), a 3-stage high-speed slicer with multi-clock-phase sampling, and a clock and data recovery (CDR) circuit. The experimental results show that the TRx operates at a maximum speed of 56 Gb/s with chip-on-board (COB) assembly. The 28 Gb/s NRZ eye diagram shows a far-end vertical eye opening of 210 mV with a single-ended output amplitude of 351 mV, and the 56 Gb/s PAM-4 eye diagram exhibits far-end eye openings of 33 mV (upper eye), 31 mV (mid eye), and 28 mV (lower eye) with a single-ended output amplitude of 353 mV. The recovered 14 GHz clock from the RX exhibits random jitter (RJ) of 469 fs and deterministic jitter (DJ) of 8.76 ps. The 875 Mb/s de-multiplexed data features a 593 ps horizontal eye opening with 32.02 ps RJ, at a bit-error rate (BER) of 10^(-5) (0.53 UI). The power dissipation of the TX and RX is 125 and 181.4 mW, respectively, from a 0.9-V supply.
Keywords: transceiver (TRx); feed-forward equalizer (FFE); clock and data recovery (CDR); continuous-time linear equalizer (CTLE)
SHT-based public auditing protocol with error tolerance in FDL-empowered IoVs
9
Authors: Kui Zhu, Yongjun Ren, Jian Shen, Pandi Vijayakumar, Pradip Kumar Sharma. Digital Communications and Networks (SCIE, CSCD), 2024, No. 1, pp. 142-149 (8 pages)
With the intelligentization of the Internet of Vehicles (IoVs), Artificial Intelligence (AI) technology is becoming more and more essential, especially deep learning. Federated Deep Learning (FDL) is a novel distributed machine learning technology that is able to address challenges such as data security, privacy risks, and huge communication overheads from big raw data sets. However, FDL can only guarantee data security and privacy among multiple clients during data training. If the data sets stored locally in clients are corrupted, including being tampered with or lost, the training results of FDL in intelligent IoVs will be negatively affected. In this paper, we are the first to design a secure data auditing protocol to guarantee the integrity and availability of data sets in FDL-empowered IoVs. Specifically, the cuckoo filter and Reed-Solomon codes are utilized to guarantee error tolerance, including efficient corrupted-data locating and recovery. In addition, a novel data structure, the Skip Hash Table (SHT), is designed to optimize data dynamics. Finally, we illustrate the security of the scheme with the Computational Diffie-Hellman (CDH) assumption on bilinear groups. Sufficient theoretical analyses and performance evaluations demonstrate the security and efficiency of our scheme for data sets in FDL-empowered IoVs.
Keywords: Internet of Vehicles; federated deep learning; data security; data auditing; data locating and recovery
VKFQ:A Verifiable Keyword Frequency Query Framework with Local Differential Privacy in Blockchain
10
Authors: Youlin Ji, Bo Yin, Ke Gu. Computers, Materials & Continua (SCIE, EI), 2024, No. 3, pp. 4205-4223 (19 pages)
With its tamper-proof and traceable properties, blockchain technology has been widely used in the field of data sharing. How to preserve individual privacy while enabling efficient data queries is one of the primary issues with secure data sharing. In this paper, we study verifiable keyword frequency (KF) queries with local differential privacy in blockchain. Both numerical and keyword attributes are present in data objects; the latter are sensitive and require privacy protection. However, prior studies in blockchain have the problem of a trilemma in privacy protection and are unable to handle KF queries. We propose an efficient framework that protects data owners' privacy on keyword attributes while enabling quick and verifiable query processing for KF queries. The framework computes an estimate of a keyword's frequency and is efficient in query time and verification object (VO) size. A utility-optimized local differential privacy technique is used for privacy protection. The data owner adds noise locally into the data based on local differential privacy so that an attacker cannot infer the owner of the keywords while keeping the difference in the probability distribution of the KF within the privacy budget. We propose the VB-cm tree as the authenticated data structure (ADS). The VB-cm tree combines the Verkle tree and the Count-Min sketch (CM-sketch) to lower the VO size and query time. The VB-cm tree uses the vector commitment to verify the query results. The fixed-size CM-sketch, which summarizes the frequency of multiple keywords, is used to estimate the KF via hashing operations. We conduct an extensive evaluation of the proposed framework. The experimental results show that, compared to the Merkle B+ tree, the query time is reduced by 52.38%, and the VO size is reduced by more than one order of magnitude.
Keywords: security; data sharing; blockchain; data query; privacy protection
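To illustrate the Count-Min sketch component used above for fixed-size keyword-frequency estimation, here is a minimal generic CM-sketch in Python (not the paper's VB-cm tree; the width, depth, and hashing scheme are assumptions).

```python
import hashlib

class CountMinSketch:
    """Fixed-size frequency estimator: increments d counters per keyword and
    answers queries with the minimum of those counters (never underestimates)."""
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _indexes(self, keyword):
        for row in range(self.depth):
            digest = hashlib.sha256(f"{row}:{keyword}".encode()).hexdigest()
            yield row, int(digest, 16) % self.width

    def add(self, keyword, count=1):
        for row, col in self._indexes(keyword):
            self.table[row][col] += count

    def estimate(self, keyword):
        return min(self.table[row][col] for row, col in self._indexes(keyword))

# Usage
cms = CountMinSketch()
for kw in ["alpha", "alpha", "beta", "gamma", "alpha"]:
    cms.add(kw)
print(cms.estimate("alpha"))  # 3 (may overestimate, never underestimates)
```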
Traffic Flow Prediction with Heterogeneous Spatiotemporal Data Based on a Hybrid Deep Learning Model Using Attention-Mechanism
11
Authors: Jing-Doo Wang, Chayadi Oktomy Noto Susanto. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 8, pp. 1711-1728 (18 pages)
A significant obstacle in intelligent transportation systems (ITS) is the capacity to predict traffic flow. Recent advancements in deep neural networks have enabled the development of models to represent traffic flow accurately. However, accurately predicting traffic flow at the individual road level is extremely difficult due to the complex interplay of spatial and temporal factors. This paper proposes a technique for predicting short-term traffic flow data using an architecture that utilizes convolutional bidirectional long short-term memory (Conv-BiLSTM) with attention mechanisms. Prior studies neglected to include data pertaining to factors such as holidays, weather conditions, and vehicle types, which are interconnected and significantly impact the accuracy of forecast outcomes. In addition, this research incorporates recurring monthly periodic pattern data, which significantly enhances the accuracy of forecast outcomes. The experimental findings demonstrate a performance improvement of 21.68% when incorporating the vehicle type feature.
Keywords: traffic flow prediction; spatiotemporal data; heterogeneous data; Conv-BiLSTM; data-centric; intra-data
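A minimal Keras sketch of a Conv-BiLSTM forecaster with a simple additive attention layer over time steps, in the general spirit of the architecture named above; the layer sizes, window length, and feature count are assumptions, not the paper's configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

timesteps, n_features = 24, 8   # e.g., 24 past intervals, 8 heterogeneous features

inputs = layers.Input(shape=(timesteps, n_features))
x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Simple additive attention over time steps: score each step, softmax, weighted sum.
scores = layers.Dense(1, activation="tanh")(x)            # (batch, timesteps, 1)
weights = layers.Softmax(axis=1)(scores)                   # attention weights
context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])

outputs = layers.Dense(1)(context)                         # next-interval flow
model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
```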
A Power Data Anomaly Detection Model Based on Deep Learning with Adaptive Feature Fusion
12
Authors: Xiu Liu, Liang Gu, Xin Gong, Long An, Xurui Gao, Juying Wu. Computers, Materials & Continua (SCIE, EI), 2024, No. 6, pp. 4045-4061 (17 pages)
With the popularisation of intelligent power, power devices have different shapes, numbers and specifications. This means that the power data has distributional variability, and the model learning process cannot achieve sufficient extraction of data features, which seriously affects the accuracy and performance of anomaly detection. Therefore, this paper proposes a deep learning-based anomaly detection model for power data, which integrates a data alignment enhancement technique based on random sampling and an adaptive feature fusion method leveraging dimension reduction. Aiming at the distributional variability of power data, this paper develops a sliding window-based data adjustment method for this model, which solves the problem of high-dimensional feature noise and low-dimensional missing data. To address the problem of insufficient feature fusion, an adaptive feature fusion method based on feature dimension reduction and dictionary learning is proposed to improve the anomaly detection accuracy of the model. In order to verify the effectiveness of the proposed method, we conducted effectiveness comparisons through ablation experiments. The experimental results show that, compared with traditional anomaly detection methods, the method proposed in this paper not only has an advantage in model accuracy, but also reduces the amount of parameter calculation of the model in the process of feature matching and improves the detection speed.
Keywords: data alignment; dimension reduction; feature fusion; data anomaly detection; deep learning
A privacy-preserving method for publishing data with multiple sensitive attributes
13
Authors: Tong Yi, Minyong Shi, Wenqian Shang, Haibin Zhu. CAAI Transactions on Intelligence Technology (SCIE, EI), 2024, No. 1, pp. 222-238 (17 pages)
Overgeneralisation may happen because most studies on data publishing for multiple sensitive attributes (SAs) have not considered the personalised privacy requirement. Furthermore, sensitive information disclosure may also be caused by these personalised requirements. To address the matter, this article develops a personalised data publishing method for multiple SAs. According to the requirements of individuals, the new method partitions SA values into two categories, private values and public values, and breaks the association between them for privacy guarantees. For the private values, this paper applies the process of anonymisation, while the public values are released without this process. An algorithm is designed to achieve the privacy mode, where the selectivity is determined by the sensitive value frequency and undesirable objects. The experimental results show that the proposed method can provide more information utility when compared with previous methods. The theoretical analyses and experiments also indicate that privacy can be guaranteed even though the public values are known to an adversary. The overgeneralisation and privacy breach caused by the personalised requirement can be avoided by the new method.
Keywords: data privacy; data publishing
Enhancing Data Analysis and Automation: Integrating Python with Microsoft Excel for Non-Programmers
14
Authors: Osama Magdy, Ali Mohamed Breik, Tarek Aly, Atef Tayh Nour El-Din Raslan, Mervat Gheith. Journal of Software Engineering and Applications, 2024, No. 6, pp. 530-540 (11 pages)
Microsoft Excel is essential for the End-User Approach (EUA), offering versatility in data organization, analysis, and visualization, as well as widespread accessibility. It fosters collaboration and informed decision-making across diverse domains. Conversely, Python is indispensable for professional programming due to its versatility, readability, extensive libraries, and robust community support. It enables efficient development, advanced data analysis, data mining, and automation, catering to diverse industries and applications. However, one primary issue when using Microsoft Excel with Python libraries is compatibility and interoperability. While Excel is a widely used tool for data storage and analysis, it may not seamlessly integrate with Python libraries, leading to challenges in reading and writing data, especially in complex or large datasets. Additionally, manipulating Excel files with Python may not always preserve formatting or formulas accurately, potentially affecting data integrity. Moreover, dependency on Excel's graphical user interface (GUI) for automation can limit scalability and reproducibility compared to Python's scripting capabilities. This paper covers an integration solution for empowering non-programmers to leverage Python's capabilities within the familiar Excel environment. This enables users to perform advanced data analysis and automation tasks without requiring extensive programming knowledge. Based on soliciting feedback from non-programmers who have tested the integration solution, the case study shows how the solution evaluates the ease of implementation, performance, and compatibility of Python with Excel versions.
Keywords: Python; End-User Approach; Microsoft Excel; data analysis; integration; spreadsheet; programming; data visualization
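As a small illustration of driving Excel workbooks from Python in the spirit of the paper, here is a generic pandas/openpyxl sketch (not the authors' integration solution; the file name, sheet names, and columns are assumptions).

```python
import pandas as pd
from openpyxl import load_workbook

# Create a small workbook so the example is self-contained.
raw = pd.DataFrame({"month": ["Jan", "Jan", "Feb"], "amount": [120, 80, 200]})
raw.to_excel("sales.xlsx", sheet_name="Raw", index=False)

# Read the worksheet back into a DataFrame for analysis.
df = pd.read_excel("sales.xlsx", sheet_name="Raw")

# A typical non-programmer task automated in a few lines: monthly totals.
summary = df.groupby("month", as_index=False)["amount"].sum()

# Append the result as a new sheet without discarding the existing sheets.
with pd.ExcelWriter("sales.xlsx", engine="openpyxl", mode="a",
                    if_sheet_exists="replace") as writer:
    summary.to_excel(writer, sheet_name="Summary", index=False)

# openpyxl can also inspect the workbook directly (sheet names, cells, formulas).
print(load_workbook("sales.xlsx").sheetnames)
```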
Applying Hybrid Clustering in Pulsar Candidate Sifting with Multi-modality for FAST Survey
15
Authors: Zi-Yi You, Yun-Rong Pan, Zhi Ma, Li Zhang, Shuo Xiao, Dan-Dan Zhang, Shi-Jun Dang, Ru-Shuang Zhao, Pei Wang, Ai-Jun Dong, Jia-Tao Jiang, Ji-Bing Leng, Wei-An Li, Si-Yao Li. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 3, pp. 283-296 (14 pages)
Pulsar search is always the basis of pulsar navigation, gravitational wave detection and other research topics. Currently, the volume of pulsar candidates collected by the Five-hundred-meter Aperture Spherical radio Telescope (FAST) shows an explosive growth rate that has brought challenges for its pulsar candidate filtering system. In particular, the multi-view heterogeneous data and the class imbalance between true pulsars and non-pulsar candidates have negative effects on traditional single-modal supervised classification methods. In this study, a multi-modal and semi-supervised learning based pulsar candidate sifting algorithm is presented, which adopts a hybrid ensemble clustering scheme of density-based and partition-based methods combined with a feature-level fusion strategy for input data and a data partition strategy for parallelization. Experiments on both the High Time Resolution Universe Survey II (HTRU2) and actual FAST observation data demonstrate that the proposed algorithm can identify pulsars excellently: on HTRU2, the precision and recall rates of its parallel mode reach 0.981 and 0.988, respectively. On FAST data, those of its parallel mode reach 0.891 and 0.961; meanwhile, the running time also significantly decreases with the increment of parallel nodes within limits. Thus, we conclude that our algorithm could be a feasible approach to large-scale pulsar candidate sifting for FAST drift scan observations.
Keywords: methods: data analysis; surveys; methods: numerical
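A minimal sketch of combining a density-based and a partition-based clusterer on fused multi-view features, in the general spirit of the hybrid ensemble described above; DBSCAN and KMeans are stand-ins, and the paper's specific ensemble scheme, fusion strategy, and parameters are not reproduced.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans
from sklearn.preprocessing import StandardScaler

# Toy multi-view features (e.g., profile-based and DM-curve-based views) fused
# at the feature level by simple concatenation after scaling.
rng = np.random.default_rng(0)
view_a = rng.normal(size=(500, 6))
view_b = rng.normal(size=(500, 4))
X = StandardScaler().fit_transform(np.hstack([view_a, view_b]))

# Density-based pass: isolates dense cores and flags sparse points as noise (-1).
db_labels = DBSCAN(eps=0.9, min_samples=10).fit_predict(X)

# Partition-based pass: assigns the noise points to a KMeans cluster so every
# candidate ends up in some group for later (semi-)supervised labelling.
km = KMeans(n_clusters=max(db_labels.max() + 1, 2), n_init=10, random_state=0).fit(X)
hybrid = np.where(db_labels == -1, km.labels_, db_labels)

print("DBSCAN clusters:", db_labels.max() + 1,
      "| noise points re-assigned:", int((db_labels == -1).sum()))
```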
Real-time Abnormal Detection of GWAC Light Curve based on Wavelet Transform Combined with GRU-Attention
16
Authors: Hao Li, Qing Zhao, Long Shao, Tao Liu, Chenzhou Cui, Yunfei Xu. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 5, pp. 151-168 (18 pages)
Astronomy has entered the era of time-domain astronomy, and the study of the time-varying light curves of various types of objects is of great significance in revealing the physical properties and evolutionary history of celestial bodies. The Ground-based Wide Angle Cameras (GWAC) telescope, on which this paper is based, has observed more than 10 million light curves, and the detection of anomalies in the light curves can be used to rapidly identify rare transient phenomena such as gravitational microlensing events from the massive data. However, traditional statistically based anomaly detection methods cannot realize fast processing of massive data. In this paper, we propose a Discrete Wavelet (DW)-Gate Recurrent Unit (GRU)-Attention light curve warning model. The wavelet transform performs well in data noise reduction and feature extraction, providing richer and more stable input features for a neural network, while the neural network provides a more flexible and powerful output model for the wavelet transform. Comparison experiments show an average improvement of 61% over the previous pure long short-term memory (LSTM) model, and an average improvement of 53.5% over the previous GRU model. Because the efficiency and accuracy of anomaly detection in previous work were not good enough, the method proposed in this paper achieves higher efficiency and accuracy by incorporating the attention mechanism to find the key parts of the light curve that determine the anomalies. These parts are assigned higher weights, and in actual anomaly detection, anomalies are detected with an average rate of 83.35%; comparing the DW-GRU-Attention model with the DW-LSTM model, the detection F1 score is improved by 5.75% on average while requiring less training time, thus providing valuable information and guidance for astronomical observation and research.
Keywords: methods: data analysis; stars: variables: general; techniques: photometric
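To illustrate the discrete-wavelet preprocessing step described above, here is a minimal PyWavelets denoising sketch for a 1D light curve; the wavelet family, decomposition level, and thresholding rule are assumptions, and the paper's DW-GRU-Attention network is not reproduced.

```python
import numpy as np
import pywt

def wavelet_denoise(flux, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients of a discrete wavelet decomposition,
    yielding a smoother light curve to feed into a downstream GRU/attention model."""
    coeffs = pywt.wavedec(flux, wavelet, level=level)
    # Universal threshold estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(flux)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(flux)]

# Toy light curve: smooth variation plus noise and one transient dip.
t = np.linspace(0, 10, 512)
flux = 1.0 + 0.05 * np.sin(2 * np.pi * t) + 0.02 * np.random.randn(512)
flux[300:305] -= 0.3
clean = wavelet_denoise(flux)
print(clean.shape)
```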
X-Ray Source Classification Using Machine Learning:A Study with EP-WXT Pathfinder LEIA
17
Authors: Xiaoxiong Zuo, Yihan Tao, Yuan Liu, Yunfei Xu, Wenda Zhang, Haiwu Pan, Hui Sun, Zhen Zhang, Chenzhou Cui, Weimin Yuan. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 8, pp. 175-195 (21 pages)
X-ray observations play a crucial role in time-domain astronomy. The Einstein Probe (EP), a recently launched X-ray astronomical satellite, emerges as a forefront player in the field of time-domain astronomy and high-energy astrophysics. With a focus on systematic surveys in the soft X-ray band, EP aims to discover high-energy transients and monitor variable sources in the universe. To achieve these objectives, a quick and reliable classification of observed sources is essential. In this study, we developed a machine learning classifier for autonomous source classification using data from the EP-WXT Pathfinder, the Lobster Eye Imager for Astronomy (LEIA), and EP-WXT simulations. The proposed Random Forest classifier, built on selected features derived from light curves, energy spectra, and location information, achieves an accuracy of approximately 95% on EP simulation data and 98% on LEIA observational data. The classifier is integrated into the LEIA data processing pipeline, serving as a tool for manual validation and rapid classification during observations. This paper presents an efficient method for the classification of X-ray sources based on single observations, along with implications of the most effective features for the task. This work facilitates rapid source classification for the EP mission and also provides valuable insights into feature selection and classification techniques for enhancing the efficiency and accuracy of X-ray source classification that can be adapted to other X-ray telescope data.
Keywords: methods: data analysis; X-rays: binaries; stars: variables: general; X-rays: bursts
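A minimal scikit-learn sketch of a Random Forest source classifier over tabular features of the kind listed above; the feature names, class labels, and hyperparameters are illustrative assumptions, not the EP/LEIA pipeline configuration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy feature table: light-curve, spectral and positional summaries per source.
rng = np.random.default_rng(42)
n = 1000
X = pd.DataFrame({
    "mean_rate": rng.gamma(2.0, 1.0, n),           # light-curve features
    "variability_index": rng.random(n),
    "hardness_ratio": rng.normal(0.0, 1.0, n),     # spectral feature
    "galactic_latitude": rng.uniform(-90, 90, n),  # location feature
})
y = rng.choice(["star", "AGN", "X-ray binary"], size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
clf.fit(X_tr, y_tr)

print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("feature importances:", dict(zip(X.columns, clf.feature_importances_.round(3))))
```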
Research on Enhanced Contraband Dataset ACXray Based on ETL
18
Authors: Xueping Song, Jianming Yang, Shuyu Zhang, Jicun Zhang. Computers, Materials & Continua (SCIE, EI), 2024, No. 6, pp. 4551-4572 (22 pages)
To address the shortage of public datasets for customs X-ray images of contraband and the difficulties in deploying trained models in engineering applications, a method has been proposed that employs the Extract-Transform-Load (ETL) approach to create an X-ray dataset of contraband items. Initially, X-ray scatter image data is collected and cleaned. Using Kafka message queues and the Elasticsearch (ES) distributed search engine, the data is transmitted in real time to cloud servers. Subsequently, contraband data is annotated using a combination of neural networks and manual methods to improve annotation efficiency, and a mean-hash algorithm is implemented for quick image retrieval. The method of integrating targets with backgrounds has enhanced the X-ray contraband image data, increasing the number of positive samples. Finally, an Airport Customs X-ray dataset (ACXray) compatible with customs business scenarios has been constructed, featuring an increased number of positive contraband samples. Experimental tests using three datasets to train the Mask Region-based Convolutional Neural Network (Mask R-CNN) algorithm, evaluated on 400 real customs images, revealed that the recognition accuracy of algorithms trained with Security Inspection X-ray (SIXray) and Occluded Prohibited Items X-ray (OPIXray) decreased by 16.3% and 15.1%, respectively, while the accuracy of the algorithm trained on the ACXray dataset was almost unaffected. This indicates that the ACXray dataset-trained algorithm possesses strong generalization capabilities and is more suitable for customs detection scenarios.
Keywords: X-ray; contraband; ETL; data enhancement; dataset
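To illustrate the mean-hash image retrieval step mentioned above, here is a minimal average-hash sketch in Python using Pillow and NumPy; the 8x8 hash size, file names, and Hamming-distance threshold are assumptions, not the paper's parameters.

```python
import numpy as np
from PIL import Image

def mean_hash(path, hash_size=8):
    """Average hash: downscale to hash_size x hash_size greyscale, then set each
    bit according to whether the pixel is above the mean intensity."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return (pixels > pixels.mean()).flatten()

def hamming(h1, h2):
    """Number of differing bits; small distances indicate near-duplicate images."""
    return int(np.count_nonzero(h1 != h2))

# Usage (hypothetical files): flag near-duplicates before annotating a new scan.
# h_new = mean_hash("new_scan.png")
# h_old = mean_hash("indexed_scan.png")
# if hamming(h_new, h_old) <= 5:
#     print("likely duplicate, skip annotation")
```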
Improving Diversity with Multi-Loss Adversarial Training in Personalized News Recommendation
19
Authors: Ruijin Xue, Shuang Feng, Qi Wang. Computers, Materials & Continua (SCIE, EI), 2024, No. 8, pp. 3107-3122 (16 pages)
Users' interests are often diverse and multi-grained, and their underlying intents even more so. Effectively capturing users' interests and uncovering the relationships between diverse interests are key to news recommendation. Meanwhile, diversity is an important metric for evaluating news recommendation algorithms, as users tend to reject excessive homogeneous information in their recommendation lists. However, recommendation models themselves lack diversity awareness, making it challenging to achieve a good balance between the accuracy and diversity of news recommendations. In this paper, we propose a news recommendation algorithm that achieves good performance in both accuracy and diversity. Unlike most existing works that solely optimize accuracy or employ more features to meet diversity, the proposed algorithm leverages the diversity-aware capability of the model. First, we introduce an augmented user model to fully capture user intent and the behavioral guidance users might undergo as a result. Specifically, we focus on the relationship between the original clicked news and the augmented clicked news. Moreover, we propose an effective adversarial training method for diversity (AT4D), which is a pluggable component that can enhance both the accuracy and diversity of news recommendation results. Extensive experiments on real-world datasets confirm the efficacy of the proposed algorithm in improving both the accuracy and diversity of news recommendations.
Keywords: news recommendation; diversity; accuracy; data augmentation
A Quick Calculation Method for Radiation Pattern of Submillimeter Telescope with Deformation and Displacement
20
Authors: Jia You, Yi-Wei Yao, Zheng Wang. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 3, pp. 249-265 (17 pages)
The radiation pattern captures the electromagnetic performance of reflector antennas, which is significantly affected by the deformation of the primary reflector due to gravity and the displacement of the secondary reflector. During the design process of large reflector antennas, a substantial amount of time is often dedicated to iteratively adjusting structural parameters and validating electromagnetic performance. To improve the efficiency of the design process, we first propose an approximate calculation method for the optical path difference (OPD) caused by the deformation of the primary reflector under gravity and the displacement of the secondary reflector. Then an OPD fitting function based on modified Zernike polynomials is proposed to capture the phase difference of radiation over the aperture plane, from which the radiation pattern can be obtained quickly by the aperture field integration method. Numerical experiments demonstrate the effectiveness of the proposed quick calculation method for analyzing the radiation pattern of a 10.4 m submillimeter telescope antenna at its highest operating frequency of 856 GHz. In comparison with the numerical simulation method based on GRASP (an antenna electromagnetic analysis tool combining physical optics (PO) and the physical theory of diffraction (PTD)), the quick calculation method reduces the time for radiation pattern analysis from more than one hour to less than two minutes. Furthermore, the quick calculation method exhibits excellent accuracy for the figure of merit (FOM) of the radiation pattern. Therefore, the proposed quick calculation method can obtain the radiation pattern with high speed and accuracy. Compared to the time-consuming numerical simulation method (PO and PTD), it can be employed for quick analysis of the radiation pattern for the lateral displacement of the secondary reflector and the deformation of the primary reflector under gravity during the design process of a reflector antenna.
Keywords: instrumentation: high angular resolution; methods: data analysis; methods: analytical
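A minimal sketch of least-squares fitting of an OPD map over the aperture with a few low-order Zernike terms; standard Zernike polynomials are used here, and the paper's modified Zernike basis and aperture-field integration are not reproduced. The grid size, chosen terms, and coefficients are assumptions.

```python
import numpy as np

# Sample the aperture on a unit-disc grid (rho, theta).
n_grid = 64
x, y = np.meshgrid(np.linspace(-1, 1, n_grid), np.linspace(-1, 1, n_grid))
rho, theta = np.hypot(x, y), np.arctan2(y, x)
mask = rho <= 1.0

# A few low-order Zernike terms: piston, tilts, defocus, astigmatism, coma.
basis = np.stack([
    np.ones_like(rho),                       # piston
    rho * np.cos(theta),                     # x tilt
    rho * np.sin(theta),                     # y tilt
    2 * rho**2 - 1,                          # defocus
    rho**2 * np.cos(2 * theta),              # astigmatism
    (3 * rho**3 - 2 * rho) * np.cos(theta),  # coma
], axis=-1)[mask]

# Toy OPD map (e.g., from gravity deformation plus subreflector shift) with noise.
true_coeffs = np.array([0.0, 0.3, -0.1, 0.5, 0.2, 0.05])
opd = basis @ true_coeffs + 1e-3 * np.random.randn(mask.sum())

# Least-squares fit of the coefficients; these feed the aperture-phase model.
coeffs, *_ = np.linalg.lstsq(basis, opd, rcond=None)
print(np.round(coeffs, 3))
```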