Journal Articles
62 articles found
1. Credit Card Fraud Detection Using Improved Deep Learning Models
Authors: Sumaya S. Sulaiman, Ibraheem Nadher, Sarab M. Hameed. Computers, Materials & Continua (SCIE, EI), 2024, No. 1, pp. 1049-1069 (21 pages)
Fraud of credit cards is a major issue for financial organizations and individuals. As fraudulent actions become more complex, demand for better fraud detection systems is rising. Deep learning approaches have shown promise in several fields, including detecting credit card fraud. However, the efficacy of these models depends heavily on the careful selection of appropriate hyperparameters. This paper introduces models that integrate deep learning with hyperparameter tuning techniques to learn the patterns and relationships within credit card transaction data, thereby improving fraud detection. Three deep learning models are proposed: AutoEncoder (AE), Convolutional Neural Network (CNN), and Long Short-Term Memory (LSTM), to investigate how hyperparameter adjustment impacts the efficacy of deep learning models used to identify credit card fraud. Experiments conducted on a European credit card fraud dataset with different hyperparameters and the three deep learning models demonstrate that the proposed models achieve a tradeoff between detection rate and precision, making them effective at accurately predicting credit card fraud. The results show that LSTM significantly outperformed AE and CNN in terms of accuracy (99.2%), detection rate (93.3%), and area under the curve (96.3%). The proposed models surpass those of existing studies and are expected to make a significant contribution to the field of credit card fraud detection.
Keywords: card fraud detection; hyperparameter tuning; deep learning; autoencoder; convolutional neural network; long short-term memory; resampling
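The abstract above attributes much of the gain to hyperparameter tuning rather than to any single architecture. As a minimal illustration of the idea (not the paper's method), here is an exhaustive grid search over a toy objective; the search space and the `toy_score` stand-in are hypothetical.

```python
import itertools

def grid_search(objective, space):
    """Try every hyperparameter combination and keep the best-scoring one.

    `space` maps each hyperparameter name to a list of candidate values.
    """
    names = list(space)
    best_params, best_score = None, float("-inf")
    for combo in itertools.product(*(space[n] for n in names)):
        params = dict(zip(names, combo))
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy stand-in for a validation score; it peaks at lr=0.01, units=64.
def toy_score(p):
    return -abs(p["lr"] - 0.01) * 100 - abs(p["units"] - 64) / 64

space = {"lr": [0.1, 0.01, 0.001], "units": [16, 32, 64, 128]}
best, score = grid_search(toy_score, space)
```

In practice random search or Bayesian optimization replaces the exhaustive loop once the space grows, but the interface stays the same.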
2. Imbalanced Data Classification Using SVM Based on Improved Simulated Annealing Featuring Synthetic Data Generation and Reduction (Cited by 1)
Authors: Hussein Ibrahim Hussein, Said Amirul Anwar, Muhammad Imran Ahmad. Computers, Materials & Continua (SCIE, EI), 2023, No. 4, pp. 547-564 (18 pages)
Imbalanced data classification is one of the major problems in machine learning. An imbalanced dataset typically has significant differences in the number of data samples between its classes. In most cases, the performance of a machine learning algorithm such as the Support Vector Machine (SVM) is affected when dealing with an imbalanced dataset: classification accuracy is skewed toward the majority class, and poor results are exhibited in the prediction of minority-class samples. In this paper, a hybrid approach combining a data pre-processing technique and the SVM algorithm based on improved Simulated Annealing (SA) is proposed. First, a data pre-processing technique is proposed that primarily addresses the resampling strategy for handling imbalanced datasets: data are synthetically generated to equalize the number of samples between classes, followed by a reduction step to remove redundant and duplicated data. Next, the balanced dataset is used to train an SVM. Since this algorithm requires an iterative search for the best penalty parameter during training, an improved SA algorithm is proposed for this task, in which a new acceptance criterion for candidate solutions is introduced to enhance the accuracy of the optimization process. Experiments on ten publicly available imbalanced datasets demonstrate higher classification accuracy with the proposed approach than with a conventional SVM implementation. An average accuracy of 89.65% on binary classification demonstrates the good performance of the proposed approach.
Keywords: imbalanced data; resampling technique; data reduction; support vector machine; simulated annealing
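The pre-processing pipeline described above (generate extra minority samples, then reduce duplicates) can be sketched as follows. Plain random duplication stands in for the paper's synthetic generation step, and the helper names are ours, not the authors'.

```python
import random

def balance_by_oversampling(majority, minority, seed=0):
    """Duplicate minority samples at random until both classes are equal-sized."""
    rng = random.Random(seed)
    boosted = list(minority)
    while len(boosted) < len(majority):
        boosted.append(rng.choice(minority))
    return majority, boosted

def deduplicate(samples):
    """Reduction step: drop exact duplicates, keeping first occurrences."""
    seen, out = set(), []
    for s in samples:
        if s not in seen:
            seen.add(s)
            out.append(s)
    return out

majority = [(0.0, 1.0), (0.2, 0.9), (0.4, 1.1), (0.6, 0.8), (0.8, 1.2), (1.0, 1.0)]
minority = [(0.1, 0.1), (0.9, 0.2)]
maj, mino = balance_by_oversampling(majority, minority)
unique_minority = deduplicate(mino)
```

A SMOTE-style generator would interpolate between minority neighbours instead of duplicating, which is closer to the "synthetic data generation" in the title.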
3. Comparison of uniform resampling and nonuniform sampling direct-reconstruction methods in k-space for FD-OCT
Authors: Yanrong Yang, Yun Dai, Yuehua Zhou, Yaliang Yang. Journal of Innovative Optical Health Sciences (SCIE, EI, CSCD), 2023, No. 5, pp. 93-106 (14 pages)
The nonuniform distribution of the interference spectrum in wavenumber (k) space is a key issue limiting the imaging quality of Fourier-domain optical coherence tomography (FD-OCT). At present, the reconstruction quality at different depths among the variety of k-space processing methods remains uncertain. Using simulated and experimental interference spectra at different depths, the effects of six common processing methods, comprising uniform resampling (linear interpolation (LI), cubic spline interpolation (CSI), time-domain interpolation (TDI), and K-B window convolution) and nonuniform sampling direct-reconstruction (Lomb periodogram (LP) and nonuniform discrete Fourier transform (NDFT)), on the reconstruction quality of FD-OCT were quantitatively analyzed and compared in this work. The results obtained from simulated and experimental data were consistent. From the experimental results, the averaged peak intensity, axial resolution, and signal-to-noise ratio (SNR) of NDFT at depths from 0.5 to 3.0 mm were improved by about 1.9 dB, 1.4 times, and 11.8 dB, respectively, compared to the averaged indices of all the uniform resampling methods at all depths. Similarly, the improvements in the above three indices for LP were 2.0 dB, 1.4 times, and 11.7 dB, respectively. The analysis method and results of this work are helpful for selecting an appropriate k-space processing method to improve the imaging quality of FD-OCT.
Keywords: optical coherence tomography; signal processing; uniform resampling; nonuniform sampling direct-reconstruction; reconstruction quality
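Of the methods compared above, the nonuniform discrete Fourier transform (NDFT) is the simplest to sketch: the spectrum is evaluated directly at the true, nonuniform sample positions, with no resampling step at all. A minimal pure-Python version on a toy single-tone signal follows; the normalization convention and the synthetic jittered grid are our assumptions.

```python
import cmath, math

def ndft(samples, positions, n_freqs):
    """Nonuniform discrete Fourier transform, evaluated directly.

    `positions` are sample locations normalized to [0, 1); for uniform
    positions k/N this reduces to the ordinary DFT sum.
    """
    return [sum(s * cmath.exp(-2j * math.pi * f * p)
                for s, p in zip(samples, positions))
            for f in range(n_freqs)]

# A single tone sampled on a slightly nonuniform grid still peaks at its
# true frequency bin when transformed directly.
N, tone = 32, 5
positions = [(k + 0.1 * math.sin(k)) / N for k in range(N)]
samples = [math.cos(2 * math.pi * tone * p) for p in positions]
spectrum = [abs(v) for v in ndft(samples, positions, N // 2)]
peak = max(range(len(spectrum)), key=spectrum.__getitem__)
```

The direct sum costs O(N^2) versus O(N log N) for resample-then-FFT, which is the usual trade-off motivating the comparison in the paper.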
4. ESR-PINNs: Physics-informed neural networks with expansion-shrinkage resampling selection strategies
Authors: 刘佳楠, 侯庆志, 魏建国, 孙泽玮. Chinese Physics B (SCIE, EI, CAS, CSCD), 2023, No. 7, pp. 337-346 (10 pages)
Neural network methods have been widely used in many fields of scientific research with the rapid increase of computing power. Physics-informed neural networks (PINNs) have received much attention as a major breakthrough in solving partial differential equations with neural networks. In this paper, a resampling technique based on an expansion-shrinkage point (ESP) selection strategy is developed to dynamically modify the distribution of training points according to the performance of the neural network. In this new approach, both training points with slight changes in residual values and training points with large residuals are taken into account. To make the distribution of training points more uniform, the concept of continuity is further introduced and incorporated. This method successfully addresses the issue that the network becomes ill-conditioned or even fails to train due to extensive alteration of the training point distribution. The effectiveness of the improved physics-informed neural networks with expansion-shrinkage resampling is demonstrated through a series of numerical experiments.
Keywords: physics-informed neural networks; resampling; partial differential equation
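The core of any residual-based resampling scheme for PINNs is reweighting training points toward large PDE residuals while keeping coverage elsewhere. The sketch below shows only that generic step, not the paper's ESP strategy or its continuity term, and the `residual` function is a toy stand-in.

```python
import random

def resample_training_points(points, residual, n_keep, n_new, lo, hi, seed=0):
    """Keep the points with the largest PDE residuals and top up with
    uniform random draws, so low-residual regions are still visited."""
    rng = random.Random(seed)
    kept = sorted(points, key=residual, reverse=True)[:n_keep]
    fresh = [rng.uniform(lo, hi) for _ in range(n_new)]
    return kept + fresh

# Toy residual that is largest near x = 0.5 (e.g. a steep interior layer).
residual = lambda x: 1.0 / (1e-3 + (x - 0.5) ** 2)

points = [i / 20 for i in range(21)]
new_points = resample_training_points(points, residual,
                                      n_keep=5, n_new=5, lo=0.0, hi=1.0)
```

In a real PINN loop the residual would be evaluated by automatic differentiation of the network at each candidate point, and this resampling step would run every few thousand optimizer iterations.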
5. Deep Learning Based Sentiment Analysis of COVID-19 Tweets via Resampling and Label Analysis
Authors: Mamoona Humayun, Danish Javed, Nz Jhanjhi, Maram Fahaad Almufareh, Saleh Naif Almuayqil. Computer Systems Science & Engineering (SCIE, EI), 2023, No. 10, pp. 575-591 (17 pages)
Twitter has emerged as a platform whose users produce new data every day that can be utilized for various purposes. People express their unique ideas and views on multiple topics, thus providing vast knowledge. Sentiment analysis is critical from the corporate and political perspectives as it can impact decision-making. Since the proliferation of COVID-19, detecting the sentiment of COVID-19-related tweets has become an important challenge so that people's opinions can be tracked. The purpose of this research is to detect people's sentiment on this problem with limited data, which can be challenging considering the various textual characteristics that must be analyzed. Hence, this research presents a deep learning-based model that combines random minority oversampling with class label analysis to achieve the best results for sentiment analysis. It specifically uses class label analysis to deal with the multiclass problem by combining class labels that share a similar overall sentiment, which can be particularly helpful with smaller datasets. Furthermore, the proposed model integrates various preprocessing steps with random minority oversampling and several deep learning algorithms, both standard and bi-directional. The research explores these algorithms' impact on sentiment analysis tasks and concludes that bi-directional neural networks provide no advantage over standard neural networks, which in fact produce slightly better results than their bi-directional counterparts. The experimental results validate that the model offers excellent results, with a validation accuracy of 92.5% and an F1 measure of 0.92.
Keywords: bi-directional deep learning; resampling; random minority oversampling; sentiment analysis; class label analysis
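The two resampling ideas in this abstract, merging class labels that share an overall sentiment and random minority oversampling, can be sketched together. The five-point label scheme and the merge table are illustrative assumptions, not the paper's exact label set.

```python
import random
from collections import Counter

# Hypothetical fine-grained labels collapsed to a shared overall sentiment.
MERGE = {"very negative": "negative", "negative": "negative",
         "neutral": "neutral",
         "positive": "positive", "very positive": "positive"}

def merge_and_oversample(texts, labels, seed=0):
    """Collapse fine-grained labels, then randomly duplicate minority-class
    examples until every class matches the majority count."""
    rng = random.Random(seed)
    by_class = {}
    for t, l in zip(texts, (MERGE[l] for l in labels)):
        by_class.setdefault(l, []).append(t)
    target = max(len(v) for v in by_class.values())
    out_t, out_l = [], []
    for l, items in sorted(by_class.items()):
        boosted = list(items)
        while len(boosted) < target:
            boosted.append(rng.choice(items))
        out_t.extend(boosted)
        out_l.extend([l] * target)
    return out_t, out_l

texts = ["t%d" % i for i in range(8)]
labels = ["very negative", "negative", "negative", "neutral",
          "positive", "positive", "very positive", "positive"]
X, y = merge_and_oversample(texts, labels)
counts = Counter(y)
```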
6. Tactical Asset Allocation of the National Social Security Fund from a Business-Cycle Perspective (Cited by 3)
Authors: 庞杰, 王光伟. 《社会保障研究》 (Social Security Studies, CSSCI), 2016, No. 4, pp. 73-80 (8 pages)
In accordance with the Interim Measures for the Administration of National Social Security Fund Investment, this paper divides the Fund's portfolio into six asset classes: bank deposits, treasury bonds, corporate bonds, financial bonds, securities investment funds, and stocks. Using monthly macroeconomic indicators, China's business cycle is divided into four phases: recession, recovery, boom, and stagflation. Following Merrill Lynch's investment clock theory, the returns of the six asset classes are computed and analyzed for each phase. Under the constraints of the Fund's investment regulations, a mean-variance model is then used to simulate the optimal asset allocation. The simulation results show that bond assets should be weighted as heavily as possible during recession; during recovery, the weight of cash should be reduced in favor of bonds and entrusted securities assets; during boom, the bond allocation should be lowered; and during stagflation, cash holdings should be increased as much as possible. To address the inherent lack of robustness of the mean-variance model, the simulated allocations are further refined with a resampling method, making the resulting allocation more stable and reliable.
Keywords: National Social Security Fund; investment clock theory; mean-variance model; tactical asset allocation; resampling method
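The resampling correction mentioned at the end of this abstract follows the general idea of resampled mean-variance optimization: re-estimate the inputs on bootstrap samples of the return history, solve each time, and average the resulting weights. A two-asset sketch under strong simplifying assumptions (uncorrelated assets, a score-based allocation rule, toy return series; none of this is the paper's actual model):

```python
import random, statistics

def resampled_weights(returns_a, returns_b, n_draws=200, seed=0):
    """Average two-asset mean-variance-style weights over bootstrap
    resamples of the return history, as a robustness fix for plain
    mean-variance allocation."""
    rng = random.Random(seed)
    var_a = statistics.pvariance(returns_a)
    var_b = statistics.pvariance(returns_b)
    weights = []
    for _ in range(n_draws):
        sample_a = [rng.choice(returns_a) for _ in returns_a]
        sample_b = [rng.choice(returns_b) for _ in returns_b]
        mu_a = max(statistics.mean(sample_a), 0.0)  # no-short clip
        mu_b = max(statistics.mean(sample_b), 0.0)
        score_a, score_b = mu_a / var_a, mu_b / var_b
        total = score_a + score_b
        weights.append(0.5 if total == 0 else score_a / total)
    return statistics.mean(weights)

bonds  = [0.010, 0.012, 0.008, 0.011, 0.009]   # steady low returns
stocks = [0.08, -0.05, 0.12, -0.02, 0.06]      # volatile higher returns
w_bonds = resampled_weights(bonds, stocks)
```

Averaging over resamples smooths out the extreme corner solutions that a single mean-variance run tends to produce, which is the robustness benefit the abstract alludes to.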
7. A New Method for Ride Quality Analysis of Vehicle
Authors: 林建辉, 张洁, 高燕, 李天瑞. Journal of Donghua University (English Edition) (EI, CAS), 2006, No. 6, pp. 134-136 (3 pages)
Current spectral analysis for evaluating rail ride quality is based on data sampled at a uniform time interval, but a train's velocity fluctuates in motion, which results in non-uniform intervals between consecutive samples. The accuracy of routine spectral analysis is therefore in doubt when it is applied to evaluating rail ride quality. This paper presents a new approach, re-sampling with variable frequency, to eliminate the influence of the train's uneven velocity. Its feature is that no precision measurement of the train's moving speed is needed. Experimental results from a rolling-stock vibration test bed show that the method is valid.
Keywords: rail ride quality; non-uniform sampling; resampling and reconstruction; signal processing
8. An improved resampling algorithm for rolling element bearing fault diagnosis under variable rotational speeds
Authors: 赵德尊, 李建勇, 程卫东, 温伟刚. Journal of Southeast University (English Edition) (EI, CAS), 2017, No. 2, pp. 150-158 (9 pages)
In order to address the issues of computational accuracy and efficiency in traditional resampling algorithms for rolling element bearing fault diagnosis, an equal division impulse-based (EDI-based) resampling algorithm is proposed. First, the time marks of every rising edge of the rotating-speed pulse and the corresponding amplitudes of the faulty bearing vibration signal are determined. Then, each pair of adjacent rotating-speed pulses is divided equally, and the time marks within them and the corresponding vibration signal amplitudes are obtained by an interpolation algorithm. Finally, all the time marks and corresponding vibration amplitudes are arranged, and the time marks are transformed into the angle domain to obtain the resampled signal. Speed-up and speed-down faulty bearing signals are employed to verify the validity of the proposed method, and experimental results show that it is effective for diagnosing faulty bearings. Furthermore, traditional order tracking techniques are applied to the experimental bearing signals, and the results show that the proposed method produces more accurate outcomes in less computation time.
Keywords: rolling element bearing; fault diagnosis; variable rotational speed; equal division impulse-based resampling
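Both this entry and entry 7 rest on the same operation: converting a signal sampled at constant time intervals into one sampled at constant shaft-angle increments, using tachometer pulse times. A minimal linear-interpolation sketch follows; the EDI refinements are not reproduced, and the signal and pulse times are synthetic.

```python
def angular_resample(times, values, pulse_times, points_per_rev):
    """Resample a vibration signal at constant shaft-angle increments.

    Each adjacent pair of tacho pulses spans one revolution; it is divided
    into equal angle steps and the signal is linearly interpolated there.
    """
    def interp(t):
        # Linear interpolation of (times, values) at time t.
        for i in range(len(times) - 1):
            if times[i] <= t <= times[i + 1]:
                frac = (t - times[i]) / (times[i + 1] - times[i])
                return values[i] + frac * (values[i + 1] - values[i])
        raise ValueError("t outside record")
    resampled = []
    for t0, t1 in zip(pulse_times, pulse_times[1:]):
        for j in range(points_per_rev):  # equal angle steps in [t0, t1)
            resampled.append(interp(t0 + (t1 - t0) * j / points_per_rev))
    return resampled

# Accelerating shaft: revolutions get shorter, but the angle grid stays even.
times = [i * 0.001 for i in range(2001)]
values = [t * 1000.0 for t in times]   # toy ramp signal
pulse_times = [0.0, 1.0, 1.8]          # one tacho pulse per revolution
sig = angular_resample(times, values, pulse_times, points_per_rev=4)
```

After this step an ordinary FFT of `sig` yields an order spectrum, in which speed-dependent "frequency smear" collapses into fixed order lines.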
9. The application on order analysis for the rotating machinery with LabVIEW (Cited by 5)
Authors: Yu Zhouxiang, Wang Shaohong, Xu Kang, Liu Bin. 《仪器仪表学报》 (Chinese Journal of Scientific Instrument, EI, CAS, CSCD, PKU Core), 2016, Suppl. 1, pp. 157-161 (5 pages)
Order analysis is regarded as one of the most significant methods for monitoring and analyzing rotating machinery subject to the phenomenon of "frequency smear". Order analysis based on resampling is a signal processing technique that converts constant-time-interval sampling into constant-angle-interval sampling as the rotational speed varies. This paper investigates the superiority of order analysis through its implementation and, by comparing the advantages and disadvantages of spectrum analysis and order analysis, discusses order analysis from a deeper perspective.
Keywords: order analysis; resampling; rotating machinery; LabVIEW
10. Point mass filter based matching algorithm in gravity aided underwater navigation (Cited by 8)
Authors: HAN Yurong, WANG Bo, DENG Zhihong, FU Mengyin. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2018, No. 1, pp. 152-159 (8 pages)
Gravity-aided inertial navigation is a hot issue in applications of the underwater autonomous vehicle (UAV). Since the matching process is conducted against a gravity anomaly database tabulated as a digital model with a resolution of 2' × 2', a filter model based on vehicle position is derived, and the particularity of the inertial navigation system (INS) output is employed to estimate a parameter in the system model. Meanwhile, a matching algorithm based on the point mass filter (PMF) is applied, and several optimal selection strategies are discussed. It is found that the point mass filter algorithm based on the deterministic resampling method has better practicability. The reliability and accuracy of the algorithm are verified via simulation tests.
Keywords: gravity-aided navigation; inertial navigation system (INS); point mass filter (PMF); deterministic resampling
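A point mass filter represents the position belief as probability masses on a fixed grid; its measurement update is a pointwise multiply-and-normalize against the map-predicted measurement. A 1-D toy sketch, where the linear `anomaly_map` and the Gaussian noise model are illustrative assumptions, not the paper's setup:

```python
import math

def pmf_update(grid, prior, measurement, map_func, noise_std):
    """One point-mass-filter measurement update on a fixed position grid:
    multiply the prior mass at each node by the (Gaussian) measurement
    likelihood and renormalize."""
    likes = [math.exp(-0.5 * ((measurement - map_func(x)) / noise_std) ** 2)
             for x in grid]
    post = [p * l for p, l in zip(prior, likes)]
    total = sum(post)
    return [p / total for p in post]

# Toy 1-D "gravity anomaly map": anomaly value as a function of position x.
anomaly_map = lambda x: 2.0 * x + 1.0

grid = [i * 0.1 for i in range(101)]      # candidate positions 0..10
prior = [1.0 / len(grid)] * len(grid)     # uniform prior
post = pmf_update(grid, prior, anomaly_map(4.0), anomaly_map, noise_std=0.5)
best = grid[max(range(len(post)), key=post.__getitem__)]
```

The deterministic resampling discussed in the paper concerns re-gridding this mass function as the belief tightens, rather than the random resampling used in particle filters.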
11. Probabilistic outlier detection for sparse multivariate geotechnical site investigation data using Bayesian learning (Cited by 3)
Authors: Shuo Zheng, Yu-Xin Zhu, Dian-Qing Li, Zi-Jun Cao, Qin-Xuan Deng, Kok-Kwang Phoon. Geoscience Frontiers (SCIE, CAS, CSCD), 2021, No. 1, pp. 425-439 (15 pages)
Various uncertainties arising during the acquisition of geoscience data may result in anomalous data instances (i.e., outliers) that do not conform to the expected pattern of regular data instances. With sparse multivariate data obtained from geotechnical site investigation, it is impossible to identify outliers with certainty, due both to the distortion of the statistics of geotechnical parameters caused by outliers and to the associated statistical uncertainty resulting from data sparsity. This paper develops a probabilistic outlier detection method for sparse multivariate data obtained from geotechnical site investigation. The proposed approach quantifies the outlying probability of each data instance based on Mahalanobis distance and flags as outliers those instances with outlying probabilities greater than 0.5. It tackles the distortion of statistics estimated from a dataset with outliers by a resampling technique, and rationally accounts for statistical uncertainty by Bayesian machine learning. Moreover, the approach also provides an exclusive method to determine the outlying components of each outlier. It is illustrated and verified using simulated and real-life datasets, and is shown to properly identify outliers among sparse multivariate data, and their corresponding outlying components, in a probabilistic manner. It can significantly reduce the masking effect (i.e., missing some actual outliers due to the distortion of statistics by the outliers and statistical uncertainty). It is also found that outliers among sparse multivariate data instances significantly affect the construction of the multivariate distribution of geotechnical parameters for uncertainty quantification. This emphasizes the necessity of a data cleaning process (e.g., outlier detection) for uncertainty quantification based on geoscience data.
Keywords: outlier detection; site investigation; sparse multivariate data; Mahalanobis distance; resampling by half-means; Bayesian machine learning
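The "resampling by half-means" keyword refers to a robust screening idea: fit location and scale on random half-samples so that outliers cannot contaminate every fit. A 1-D sketch follows; the paper works with multivariate Mahalanobis distances, so this univariate distance is only illustrative.

```python
import random, statistics

def half_mean_outlier_scores(data, n_rounds=200, seed=0):
    """Resampling by half-means, 1-D version: repeatedly take a random half
    of the data, fit its mean and standard deviation, and record how far
    each point sits from that half-sample fit."""
    rng = random.Random(seed)
    n = len(data)
    scores = [0.0] * n
    for _ in range(n_rounds):
        half = rng.sample(range(n), n // 2)
        m = statistics.mean(data[i] for i in half)
        s = statistics.stdev(data[i] for i in half)
        for i in range(n):
            scores[i] += abs(data[i] - m) / (s + 1e-12)
    return [sc / n_rounds for sc in scores]

data = [5.0, 5.1, 4.9, 5.2, 4.8, 5.0, 5.1, 9.0]   # last point is anomalous
scores = half_mean_outlier_scores(data)
worst = scores.index(max(scores))
```

Because many half-samples exclude the outlier entirely, its averaged distance stays large even though it would inflate a single full-sample standard deviation.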
12. Jackknife based generalized resampling reliability approach for rock slopes and tunnels stability analyses with limited data: Theory and applications (Cited by 3)
Authors: Akshay Kumar, Gaurav Tiwari. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2022, No. 3, pp. 714-730 (17 pages)
An efficient resampling reliability approach was developed to consider the effect of statistical uncertainties in input properties, arising from insufficient data, when estimating the reliability of rock slopes and tunnels. This approach considers uncertainties in both the distribution parameters (mean and standard deviation) and the distribution types of input properties. Further, the approach was generalized to handle complex problems with explicit/implicit performance functions (PFs), single/multiple PFs, and correlated/non-correlated input properties. It couples a resampling statistical tool, the jackknife, with advanced reliability tools such as Latin hypercube sampling (LHS), Sobol's global sensitivity, the moving least squares response surface method (MLS-RSM), and Nataf's transformation. The developed approach was demonstrated on four cases of different types, and the results were compared with a recently developed bootstrap-based resampling reliability approach. They show that the approach is accurate and significantly more efficient than the bootstrap-based approach. The proposed approach reflects the effect of statistical uncertainties in input properties by estimating distributions/confidence intervals of the reliability index/probability of failure instead of fixed-point estimates. Further, sufficiently accurate results were obtained by considering uncertainties in distribution parameters only and ignoring those in distribution types.
Keywords: statistical uncertainty; resampling reliability; moving least squares response surface method (MLS-RSM); Sobol's global sensitivity; correlation coefficient
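The jackknife underlying this approach is leave-one-out resampling: recompute the estimator n times, each time omitting one observation, and use the spread of those estimates for bias and standard-error estimates. A generic sketch (for the sample mean, the jackknife SE reduces to the classical standard error of the mean, which makes it easy to check):

```python
import math, statistics

def jackknife(estimator, data):
    """Leave-one-out resampling: return the bias-corrected jackknife
    estimate and its standard error for any scalar-valued estimator."""
    n = len(data)
    loo = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = statistics.mean(loo)
    se = math.sqrt((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo))
    est = n * estimator(data) - (n - 1) * mean_loo
    return est, se

data = [4.0, 5.0, 6.0, 5.5, 4.5, 5.0]
est, se = jackknife(statistics.mean, data)
```

In the paper's setting `estimator` would be a full reliability analysis (returning a reliability index), so the n reruns are what the MLS-RSM surrogate makes affordable.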
13. A Comprehensive Evaluation of PAN-Sharpening Algorithms Coupled with Resampling Methods for Image Synthesis of Very High Resolution Remotely Sensed Satellite Data (Cited by 6)
Authors: Shridhar D. Jawak, Alvarinho J. Luis. Advances in Remote Sensing, 2013, No. 4, pp. 332-344 (13 pages)
The merging of a panchromatic (PAN) image with a multispectral satellite image (MSI) to increase the spatial resolution of the MSI while simultaneously preserving its spectral information is classically referred to as PAN-sharpening. We employed a recent dataset derived from the very high resolution WorldView-2 satellite (PAN and MSI) for two test sites (one over an urban area and the other over Antarctica) to comprehensively evaluate the performance of six existing PAN-sharpening algorithms: Gram-Schmidt (GS), Ehlers fusion (EF), modified hue-intensity-saturation (Mod-HIS), high pass filtering (HPF), the Brovey transform (BT), and wavelet-based principal component analysis (W-PC). Quality assessment of the sharpened images was carried out using 20 quality indices. We also analyzed the performance of nearest neighbour (NN), bilinear interpolation (BI), and cubic convolution (CC) resampling methods to test their practicability in the PAN-sharpening process. Our results indicate that the comprehensive performance of the PAN-sharpening methods decreased in the following order: GS > W-PC > EF > HPF > Mod-HIS > BT, while the resampling methods followed the order: NN > BI > CC.
Keywords: PAN-sharpening; WorldView-2; resampling methods
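Of the three resampling methods compared above, bilinear interpolation (BI) blends the four nearest pixels. A minimal align-corners sketch on a 2x2 toy grid (nearest neighbour would instead copy the single closest pixel):

```python
def bilinear_resample(img, new_h, new_w):
    """Upsample a 2-D grid with bilinear interpolation (align-corners style)."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(new_h):
        y = r * (h - 1) / (new_h - 1)
        y0 = min(int(y), h - 2)      # top source row
        fy = y - y0
        row = []
        for c in range(new_w):
            x = c * (w - 1) / (new_w - 1)
            x0 = min(int(x), w - 2)  # left source column
            fx = x - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
            bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

img = [[0.0, 10.0],
       [20.0, 30.0]]
up = bilinear_resample(img, 3, 3)   # center pixel is the 4-pixel average
```

Cubic convolution extends the same idea to a 4x4 neighbourhood with a cubic kernel, trading sharper edges for more computation, which matches the NN > BI > CC ranking reported above.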
14. Outlier Detection in Near Infra-Red Spectra with Self-Organizing Map (Cited by 2)
Authors: 李晓霞, 李刚, 林凌, 刘玉良, 王焱, 李健, 杜江. Transactions of Tianjin University (EI, CAS), 2005, No. 2, pp. 129-132 (4 pages)
A new method to detect multiple outliers in multivariate data is proposed. It is a combination of minimum subsets, resampling, and the self-organizing map (SOM) algorithm introduced by Kohonen, which provides a robust way with a neural network. In this method, the number and organization of the neurons are selected according to the characteristics of the spectra; e.g., spectral data often change linearly with the concentration of the components and are often measured repeatedly, so the spatial distribution of the neurons can be arranged accordingly. With this method, all the outliers in the spectra can be detected, which cannot be achieved by the traditional method, and the speed of computation is higher than that of the traditional neural network method. Simulation and experimental results show that this method is simple, effective, and intuitive, and that all the outliers in the spectra can be detected in a short time. It is useful when associated with the regression model in near infra-red research.
Keywords: outlier; near infra-red spectra; minimum subsets; resampling; self-organizing map
15. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends (Cited by 3)
Authors: Mehmood Awan, Yannick Le Moullec, Peter Koch, Fred Harris. ZTE Communications, 2012, No. 1, pp. 54-62, 70 (10 pages)
In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with a modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M-data-load time period. We present a load-process architecture (LPA) and a runtime architecture (RA), based on a serial polyphase structure, which have different scheduling. In LPA, N subfilters are loaded and then M subfilters are processed at a clock rate that is a multiple of the input data rate; this is necessary to meet the output time constraint of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within the N-data-load time while simultaneously loading N subfilters; this requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors for maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is used as a case study, and an analysis of area, time, and power for their FPGA architectures is given. For resource-optimized SDR front-ends, RA is superior for reducing operating clock rates and dynamic power consumption. RA is also superior for reducing area resources, except when indices are prestored in LUTs.
Keywords: SDR; FPGA; digital front-ends; polyphase filter bank; embedded resampling
16. Polyphase Filter Banks for Embedded Sample Rate Changes in Digital Radio Front-Ends (Cited by 3)
Authors: Mehmood Awan, Yannick Le Moullec, Peter Koch, Fred Harris. ZTE Communications, 2011, No. 4, pp. 3-9 (7 pages)
This paper presents efficient processing engines for software-defined radio (SDR) front-ends. These engines, based on a polyphase channelizer, perform arbitrary sample-rate changes, frequency selection, and bandwidth control. The paper presents an M-path polyphase filter bank based on a modified N-path polyphase filter. Such a system allows resampling by arbitrary ratios while performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. This resampling technique is based on a sliding cyclic data load interacting with cyclic-shifted coefficients. A non-maximally-decimated polyphase filter bank (where the number of data loads is not equal to the number of M subfilters) processes M subfilters in a time period that is less than or greater than the M data loads. A polyphase filter bank with five different resampling modes is used as a case study for embedded resampling in SDR front-ends. These modes are (i) maximally decimated, (ii) under-decimated, (iii) over-decimated, and combined up- and down-sampling with (iv) a single stride length and (v) multiple stride lengths. These modes can be used to obtain any required rational sampling-rate change in an SDR front-end based on a polyphase channelizer, and for translation to and from arbitrary center frequencies that are unrelated to the output sample rates.
Keywords: SDR; digital front-ends; polyphase filter bank; embedded resampling
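The efficiency argument behind polyphase decimation in both ZTE papers is that only the retained output samples are ever computed: the filter taps are split into M subfilters, one per input phase. A scalar sketch verifying that the polyphase form matches filter-then-discard (taps and signal are arbitrary toy data, not a designed filter):

```python
def decimate_direct(signal, taps, m):
    """Reference: run the full FIR filter, then keep every m-th output."""
    full = [sum(taps[k] * signal[n - k] for k in range(len(taps)) if n - k >= 0)
            for n in range(len(signal))]
    return full[::m]

def decimate_polyphase(signal, taps, m):
    """Polyphase decimator: tap k = p + m*j lives in path p; each retained
    output sums the m path contributions, and the discarded outputs are
    never computed at all."""
    out = []
    for n in range(0, len(signal), m):
        acc = 0.0
        for p in range(m):
            for j, h in enumerate(taps[p::m]):
                idx = n - p - m * j
                if idx >= 0:
                    acc += h * signal[idx]
        out.append(acc)
    return out

taps = [0.5, 0.3, 0.2, 0.1, 0.05, 0.025]
signal = [float(i % 7) for i in range(30)]
ref = decimate_direct(signal, taps, 3)
poly = decimate_polyphase(signal, taps, 3)
```

In hardware, each path additionally runs at the low output rate and sees only every m-th input sample, which is the clock-rate and power saving the LPA/RA schedules above exploit.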
17. Transient characteristics verification method for DC transformer used in flexible HVDC system (Cited by 2)
Authors: Qi Nie, Haoliang Hu, Dengyun Li, Boyang Liu, He Li, Qianzhu Xiong. Global Energy Interconnection, 2019, No. 2, pp. 180-187 (8 pages)
Previous studies have proposed higher requirements for the transient characteristics of DC transformers used in flexible high-voltage direct current (HVDC) systems: faster sampling speed, wider bandwidth for the control and protection signal, and ultimately suppression of large transient fault currents. In this study, after analyzing the key technologies involved, a method is proposed for verifying the transient characteristics of a DC transformer used in a flexible HVDC system, based on resampling technology and LabVIEW measurement technology. A laboratory experiment on the transient characteristics of a full-fiber electronic DC transformer is conducted, and the results show that the verification method can be employed for frequency response and step response verification of a DC transformer at 10% of the rated voltage and current, and can ultimately improve the screening of DC transformers.
Keywords: DC transformer; step response; transient characteristics; resampling technology; verification method
18. Extracting useful high-frequency information from wide-field electromagnetic data using time-domain signal reconstruction (Cited by 1)
Authors: LING Fan, YANG Yang, LI Gang, ZHOU Chang-yu, HUANG Min, WANG Xin, ZHANG Heng, ZHU Yu-zhen, SUN Huai-feng. Journal of Central South University (SCIE, EI, CAS, CSCD), 2022, No. 11, pp. 3767-3778 (12 pages)
The wide-field electromagnetic method is widely used in hydrocarbon exploration, mineral deposit detection, and geological disaster prediction. However, apparent resistivity and normalized field amplitude above 2048 Hz often exhibit upward warping in the data, making geophysical inversion and interpretation challenging. The cumulative error of the crystal oscillator in signal transmission and acquisition contributes to the upturned apparent resistivity curve. To address this, a high-frequency information extraction method based on time-domain signal reconstruction is proposed, which helps record a complete current data sequence and estimate the crystal oscillator error of the transmitted signal. Accounting for the recorded error, the received signal is corrected using a set of reconstruction algorithms. After processing, the high-frequency component of the wide-field electromagnetic data is no longer upturned, and accurate high-frequency information is extracted from the signal. The proposed method therefore effectively extracts the high-frequency components of all wide-field electromagnetic data.
Keywords: wide-field electromagnetic method; crystal oscillator error; time series signal; resampling; signal reconstruction
19. Particle filter based on iterated importance density function and parallel resampling (Cited by 1)
Authors: 武勇, 王俊, 曹运合. Journal of Central South University (SCIE, EI, CAS, CSCD), 2015, No. 9, pp. 3427-3439 (13 pages)
The design, analysis, and parallel implementation of the particle filter (PF) were investigated. First, to tackle the particle degeneracy problem in the PF, an iterated importance density function (IIDF) was proposed, in which a new term associated with the current measurement information (CMI) was introduced into the expression for the sampled particles. Through repeated use of the least squares estimate, the CMI can be integrated into the sampling stage in an iterative manner, leading to greatly improved sampling quality. Running the IIDF yields an iterated PF (IPF). Subsequently, a parallel resampling (PR) scheme was proposed for the parallel implementation of the IPF; its main idea is the same as systematic resampling (SR) but it is performed differently. The PR directly uses the integer part of the product of the particle weight and particle number as the number of times a particle is replicated, and it simultaneously eliminates the particles with the smallest weights; these are the two key differences from SR. The detailed implementation procedures of the PR-based IPF on a graphics processing unit are presented last. The performance of the IPF, the PR, and their parallel implementations is illustrated via a one-dimensional numerical simulation and a practical application to passive radar target tracking.
Keywords: particle filter; iterated importance density function; least squares estimate; parallel resampling; graphics processing unit
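The parallel resampling described above replicates each particle floor(w_i x N) times and fills the remainder while dropping the smallest weights, in contrast to classic systematic resampling, whose cumulative-sum sweep is inherently sequential. Both can be sketched in a few lines; the replication sketch is our reading of the abstract, not the authors' exact scheme.

```python
import random

def systematic_resample(weights, seed=0):
    """Classic systematic resampling: one random offset, N evenly spaced
    pointers swept through the cumulative weight distribution."""
    n = len(weights)
    total = sum(weights)
    u0 = random.Random(seed).random()
    positions = [(u0 + i) / n for i in range(n)]
    indexes, cum, j = [], weights[0] / total, 0
    for pos in positions:
        while pos > cum:          # sequential sweep: hard to parallelize
            j += 1
            cum += weights[j] / total
        indexes.append(j)
    return indexes

def integer_replication_resample(weights):
    """Sketch of the parallel-resampling idea: particle i is replicated
    int(w_i * N) times; the shortfall goes to the largest weights, so the
    smallest-weight particles are eliminated outright."""
    n = len(weights)
    total = sum(weights)
    counts = [int(w / total * n) for w in weights]
    shortfall = n - sum(counts)
    by_weight = sorted(range(n), key=lambda i: weights[i], reverse=True)
    for i in by_weight[:shortfall]:
        counts[i] += 1
    return [i for i, c in enumerate(counts) for _ in range(c)]

weights = [0.1, 0.4, 0.3, 0.15, 0.05]
sr = systematic_resample(weights)
pr = integer_replication_resample(weights)
```

The integer-replication counts depend only on each particle's own weight, so every particle's count can be computed in parallel on a GPU, which is the point of the scheme.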
20. Variation of Q value before and after the 1999 Xiuyan, Liaoning Province, M=5.4 earthquake on the basis of analysis on attenuation dispersion of P waves (Cited by 1)
Authors: 刘希强, 孙庆文, 李红, 石玉燕, 季爱东, 王峰吉. Acta Seismologica Sinica (English Edition) (EI, CSCD), 2005, No. 5, pp. 16-26 (11 pages)
A method for determining the medium quality factor is developed on the basis of analyzing the attenuation dispersion of the first-arriving P wave. To enhance the signal-to-noise ratio, improve the measurement resolution, and reduce systematic error, a data resampling technique was applied. The group velocity delay of the P wave was derived using an improved multi-filtering method. Based on a linear viscoelastic relaxation model, the medium quality factor Qm and its associated error at the 95% confidence level were deduced. Applying the method to the seismic records of the Xiuyan M=5.4 earthquake sequence yields the following results: (1) High Qm started to appear from Nov. 9, 1999; the events giving high deduced Qm values clustered in a region with epicentral distances between 32 and 46 km from the Yingkou station, and this Qm-versus-distance observation obviously deviates from the normal trend of Qm increasing linearly with distance. (2) The average Qm before the 29 Dec. 1999 M=5.4 earthquake is 460, the average Qm between the M=5.4 event and the 12 Jan. 2000 M=5.1 earthquake is 391, and the average Qm after the M=5.1 event is 204.
Keywords: P wave; data resampling; multi-filtering; anelasticity of medium; inversion of medium quality factor Qm; variation before and after earthquake