Journal Articles
50 articles found
1. Credit Card Fraud Detection Using Improved Deep Learning Models
Authors: Sumaya S. Sulaiman, Ibraheem Nadher, Sarab M. Hameed. Computers, Materials & Continua (SCIE, EI), 2024, No. 1, pp. 1049-1069 (21 pages)
Fraud of credit cards is a major issue for financial organizations and individuals. As fraudulent actions become more complex, a demand for better fraud detection systems is rising. Deep learning approaches have shown promise in several fields, including detecting credit card fraud. However, the efficacy of these models is heavily dependent on the careful selection of appropriate hyperparameters. This paper introduces models that integrate deep learning models with hyperparameter tuning techniques to learn the patterns and relationships within credit card transaction data, thereby improving fraud detection. Three deep learning models: AutoEncoder (AE), Convolution Neural Network (CNN), and Long Short-Term Memory (LSTM) are proposed to investigate how hyperparameter adjustment impacts the efficacy of deep learning models used to identify credit card fraud. The experiments conducted on a European credit card fraud dataset using different hyperparameters and three deep learning models demonstrate that the proposed models achieve a tradeoff between detection rate and precision, leading these models to be effective in accurately predicting credit card fraud. The results demonstrate that LSTM significantly outperformed AE and CNN in terms of accuracy (99.2%), detection rate (93.3%), and area under the curve (96.3%). These proposed models have surpassed those of existing studies and are expected to make a significant contribution to the field of credit card fraud detection.
Keywords: card fraud detection; hyperparameter tuning; deep learning; autoencoder; convolution neural network; long short-term memory; resampling
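A hyperparameter search loop of the kind the paper describes can be sketched generically (an illustrative skeleton, not the authors' tuning code; the search space and objective below are placeholders for real model training):

```python
import random

def random_search(space, objective, n_trials=20, seed=0):
    """Generic hyperparameter random search skeleton: draw random
    configurations from `space` and keep the highest-scoring one."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}  # sample a config
        score = objective(cfg)                              # e.g. validation AUC
        if best is None or score > best[0]:
            best = (score, cfg)
    return best
```

In practice `objective` would train and validate one of the AE/CNN/LSTM models for the given configuration.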
2. Comparison of uniform resampling and nonuniform sampling direct-reconstruction methods in k-space for FD-OCT
Authors: Yanrong Yang, Yun Dai, Yuehua Zhou, Yaliang Yang. Journal of Innovative Optical Health Sciences (SCIE, EI, CSCD), 2023, No. 5, pp. 93-106 (14 pages)
The nonuniform distribution of the interference spectrum in wavenumber k-space is a key issue limiting the imaging quality of Fourier-domain optical coherence tomography (FD-OCT). At present, the reconstruction quality at different depths among a variety of processing methods in k-space is still uncertain. Using simulated and experimental interference spectra at different depths, the effects of six common processing methods, including uniform resampling (linear interpolation (LI), cubic spline interpolation (CSI), time-domain interpolation (TDI), and K-B window convolution) and nonuniform sampling direct-reconstruction (Lomb periodogram (LP) and nonuniform discrete Fourier transform (NDFT)), on the reconstruction quality of FD-OCT were quantitatively analyzed and compared in this work. The results obtained using simulated and experimental data were coincident. From the experimental results, the averaged peak intensity, axial resolution, and signal-to-noise ratio (SNR) of NDFT at depths from 0.5 to 3.0 mm were improved by about 1.9 dB, 1.4 times, and 11.8 dB, respectively, compared to the averaged indices of all the uniform resampling methods at all depths. Similarly, the improvements of the above three indices for LP were 2.0 dB, 1.4 times, and 11.7 dB, respectively. The analysis method and the results obtained in this work are helpful for selecting an appropriate processing method in k-space, so as to improve the imaging quality of FD-OCT.
Keywords: optical coherence tomography; signal processing; uniform resampling; nonuniform sampling direct-reconstruction; reconstruction quality
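The NDFT used for nonuniform direct reconstruction can be sketched in a few lines of NumPy (a direct O(NM) evaluation; normalizing the sample positions to [0, 1) is an assumption — an FD-OCT wavenumber axis would be rescaled into this range first):

```python
import numpy as np

def ndft(samples, positions, n_out):
    """Nonuniform discrete Fourier transform: evaluate
    X[k] = sum_n x[n] * exp(-2j*pi*k*p[n]) for samples x taken at
    arbitrary normalized positions p in [0, 1)."""
    k = np.arange(n_out)
    return np.exp(-2j * np.pi * np.outer(k, positions)) @ samples
```

For uniformly spaced positions p[n] = n/N this reduces to the ordinary DFT, which makes it easy to sanity-check against `np.fft.fft`.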
3. ESR-PINNs: Physics-informed neural networks with expansion-shrinkage resampling selection strategies
Authors: 刘佳楠, 侯庆志, 魏建国, 孙泽玮. Chinese Physics B (SCIE, EI, CAS, CSCD), 2023, No. 7, pp. 337-346 (10 pages)
Neural network methods have been widely used in many fields of scientific research with the rapid increase of computing power. The physics-informed neural networks (PINNs) have received much attention as a major breakthrough in solving partial differential equations using neural networks. In this paper, a resampling technique based on the expansion-shrinkage point (ESP) selection strategy is developed to dynamically modify the distribution of training points in accordance with the performance of the neural networks. In this new approach both training points with slight changes in residual values and training points with large residuals are taken into account. In order to make the distribution of training points more uniform, the concept of continuity is further introduced and incorporated. This method successfully addresses the issue that the neural network becomes ill or even crashes due to the extensive alteration of training point distribution. The effectiveness of the improved physics-informed neural networks with expansion-shrinkage resampling is demonstrated through a series of numerical experiments.
Keywords: physics-informed neural networks; resampling; partial differential equation
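Residual-driven reselection of collocation points, of which the ESP strategy is a refinement, can be sketched as follows (a generic residual-based scheme, not the paper's ESP algorithm; `frac_top` is an illustrative parameter):

```python
import numpy as np

def residual_resample_points(candidates, residuals, n_keep, frac_top=0.7, rng=None):
    """Generic residual-based collocation resampling: keep the candidate
    points with the largest PDE residuals, plus a random remainder so the
    training-point distribution does not collapse onto a few regions."""
    rng = np.random.default_rng(rng)
    n_top = int(frac_top * n_keep)
    top = np.argsort(residuals)[::-1][:n_top]            # worst-fit points
    rest = np.setdiff1d(np.arange(len(candidates)), top)
    rand = rng.choice(rest, size=n_keep - n_top, replace=False)
    return candidates[np.concatenate([top, rand])]
```

The random remainder plays the role the paper assigns to continuity/uniformity: it prevents the extensive redistribution of training points that can make the network ill-conditioned.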
4. Imbalanced Data Classification Using SVM Based on Improved Simulated Annealing Featuring Synthetic Data Generation and Reduction
Authors: Hussein Ibrahim Hussein, Said Amirul Anwar, Muhammad Imran Ahmad. Computers, Materials & Continua (SCIE, EI), 2023, No. 4, pp. 547-564 (18 pages)
Imbalanced data classification is one of the major problems in machine learning. An imbalanced dataset typically has significant differences in the number of data samples between its classes. In most cases, the performance of a machine learning algorithm such as the Support Vector Machine (SVM) is affected when dealing with an imbalanced dataset. The classification accuracy is mostly skewed toward the majority class, and poor results are exhibited in the prediction of minority-class samples. In this paper, a hybrid approach combining a data pre-processing technique and the SVM algorithm based on improved Simulated Annealing (SA) is proposed. Firstly, a data pre-processing technique, which primarily addresses the resampling strategy for handling imbalanced datasets, is proposed. In this technique, data are first synthetically generated to equalize the number of samples between classes, followed by a reduction step to remove redundant and duplicated data. Next is the training of the balanced dataset using SVM. Since this algorithm requires an iterative process to search for the best penalty parameter during training, an improved SA algorithm is proposed for this task. In this improvement, a new acceptance criterion for the solutions accepted by the SA algorithm is introduced to enhance the accuracy of the optimization process. Experimental work based on ten publicly available imbalanced datasets has demonstrated higher accuracy in classification tasks using the proposed approach in comparison with the conventional implementation of SVM. An average accuracy of 89.65% for binary classification demonstrates the good performance of the proposed work.
Keywords: imbalanced data; resampling technique; data reduction; support vector machine; simulated annealing
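The synthetic-generation step for equalizing class counts can be sketched SMOTE-style (illustrative only; the paper's exact generation and reduction rules are not reproduced here):

```python
import numpy as np

def synth_oversample(minority, n_new, rng=None):
    """SMOTE-style synthetic generation sketch: each new point is a random
    interpolation between two randomly chosen minority-class samples."""
    rng = np.random.default_rng(rng)
    i = rng.integers(0, len(minority), n_new)
    j = rng.integers(0, len(minority), n_new)
    t = rng.random((n_new, 1))                      # interpolation fraction
    return minority[i] + t * (minority[j] - minority[i])
```

A reduction pass (e.g. dropping near-duplicates) would then prune the combined set, as the abstract describes.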
5. Deep Learning Based Sentiment Analysis of COVID-19 Tweets via Resampling and Label Analysis
Authors: Mamoona Humayun, Danish Javed, NZ Jhanjhi, Maram Fahaad Almufareh, Saleh Naif Almuayqil. Computer Systems Science & Engineering (SCIE, EI), 2023, No. 10, pp. 575-591 (17 pages)
Twitter has emerged as a platform that produces new data every day through its users, which can be utilized for various purposes. People express their unique ideas and views on multiple topics, thus providing vast knowledge. Sentiment analysis is critical from the corporate and political perspectives as it can impact decision-making. Since the proliferation of COVID-19, it has become an important challenge to detect the sentiment of COVID-19-related tweets so that people's opinions can be tracked. The purpose of this research is to detect the sentiment of people regarding this problem with limited data, as this can be challenging considering the various textual characteristics that must be analyzed. Hence, this research presents a deep learning-based model that utilizes the positives of random minority oversampling combined with class label analysis to achieve the best results for sentiment analysis. This research specifically focuses on utilizing class label analysis to deal with the multiclass problem by combining the class labels with a similar overall sentiment. This can be particularly helpful when dealing with smaller datasets. Furthermore, our proposed model integrates various preprocessing steps with random minority oversampling and various deep learning algorithms, including standard and bi-directional deep learning algorithms. This research explores several algorithms and their impact on sentiment analysis tasks, and concludes that bidirectional neural networks do not provide any advantage over standard neural networks, as standard neural networks provide slightly better results than their bidirectional counterparts. The experimental results validate that our model offers excellent results, with a validation accuracy of 92.5% and an F1 measure of 0.92.
Keywords: bi-directional deep learning; resampling; random minority oversampling; sentiment analysis; class label analysis
6. Tactical Asset Allocation of the National Social Security Fund from a Business-Cycle Perspective (Cited by 3)
Authors: 庞杰, 王光伟. 《社会保障研究》 (Social Security Studies, CSSCI), 2016, No. 4, pp. 73-80 (8 pages)
Following the Interim Measures for the Investment Management of the National Social Security Fund, this paper divides the Fund's portfolio into six asset classes: bank deposits, treasury bonds, corporate bonds, financial bonds, securities investment funds, and stocks. Using monthly macroeconomic indicators, China's business cycle is divided into four phases: recession, recovery, boom, and stagflation. Based on Merrill Lynch's investment clock theory, the returns of the six asset classes in each phase are computed and analyzed. Subject to the constraints of the Fund's investment management measures, a mean-variance model is then used to simulate the optimal asset allocation. The simulation results suggest that bond holdings should be maximized in the recession phase; in recovery, the weight of cash should be reduced while the weights of bonds and entrusted securities assets are raised; in boom, the bond allocation should be cut; and in stagflation, cash holdings should be increased as much as possible. Furthermore, to address the inherent lack of robustness of the mean-variance model, the resampling method is applied to improve the simulated results, making the resulting allocations more stable and reliable.
Keywords: National Social Security Fund; investment clock theory; mean-variance model; tactical asset allocation; resampling method
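The resampling fix for mean-variance instability (Michaud-style resampled efficiency) can be sketched for the minimum-variance case (a simplified illustration; the Fund's regulatory weight constraints and the full mean-variance objective are omitted):

```python
import numpy as np

def resampled_minvar_weights(returns, n_draws=200, rng=None):
    """Michaud-style resampled weights sketch: each draw re-estimates the
    covariance from a bootstrap sample of the return history, computes the
    closed-form minimum-variance weights w = inv(S)1 / (1'inv(S)1), and the
    draws are averaged to damp estimation error."""
    rng = np.random.default_rng(rng)
    T, n = returns.shape
    ones = np.ones(n)
    acc = np.zeros(n)
    for _ in range(n_draws):
        boot = returns[rng.integers(0, T, T)]             # bootstrap the history
        cov = np.cov(boot, rowvar=False) + 1e-8 * np.eye(n)
        w = np.linalg.solve(cov, ones)
        acc += w / w.sum()                                # min-variance weights
    return acc / n_draws
```

Averaging over resampled estimates is what makes the final allocation far less sensitive to sampling noise in the inputs than a single mean-variance solution.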
7. The application on order analysis for the rotating machinery with LabVIEW (Cited by 5)
Authors: Yu Zhouxiang, Wang Shaohong, Xu Kang, Liu Bin. 《仪器仪表学报》 (Chinese Journal of Scientific Instrument, EI, CAS, CSCD, PKU Core), 2016, Suppl. 1, pp. 157-161 (5 pages)
Order analysis is regarded as one of the most significant methods for monitoring and analyzing rotating machinery, since it counters the phenomenon of "frequency smear". Order analysis based on resampling is a signal-processing step that converts constant-time-interval sampling into constant-angle-interval sampling as the rotational speed varies. The superiority of order analysis is investigated through its implementation, and by comparing the advantages and disadvantages of spectrum analysis and order analysis, the paper discusses order analysis from a deeper perspective.
Keywords: order analysis; resampling; rotating machinery; LabVIEW
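The core of resampling-based order analysis — converting constant-time samples to constant-angle samples — can be sketched with interpolation (a computed-order-tracking sketch, not the LabVIEW implementation; the shaft angle is assumed known at each time stamp, e.g. from a tachometer):

```python
import numpy as np

def angular_resample(t, x, shaft_angle, samples_per_rev):
    """Computed-order-tracking sketch: interpolate a signal sampled at
    constant time intervals onto constant shaft-angle increments.
    `shaft_angle` is the monotonic cumulative angle (in revolutions) at each t."""
    n_rev = shaft_angle[-1] - shaft_angle[0]
    # Uniform angle grid with samples_per_rev points per revolution
    theta = shaft_angle[0] + np.arange(int(n_rev * samples_per_rev)) / samples_per_rev
    t_theta = np.interp(theta, shaft_angle, t)   # time at each target angle
    return np.interp(t_theta, t, x)              # signal value at those times
```

An FFT of the angle-domain output then yields a spectrum in orders (multiples of shaft speed) rather than in Hz, which is what removes the frequency smear under varying speed.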
8. Point mass filter based matching algorithm in gravity aided underwater navigation (Cited by 7)
Authors: Han Yurong, Wang Bo, Deng Zhihong, Fu Mengyin. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2018, No. 1, pp. 152-159 (8 pages)
Gravity-aided inertial navigation is a hot issue in the applications of the underwater autonomous vehicle (UAV). Since the matching process is conducted with a gravity anomaly database tabulated in the form of a digital model with a resolution of 2′ × 2′, a filter model based on vehicle position is derived, and the particularity of the inertial navigation system (INS) output is employed to estimate a parameter in the system model. Meanwhile, a matching algorithm based on the point mass filter (PMF) is applied and several optimal selection strategies are discussed. It is found that the point mass filter algorithm based on the deterministic resampling method has better practicability. The reliability and accuracy of the algorithm are verified via simulation tests.
Keywords: gravity-aided navigation; inertial navigation system (INS); point mass filter (PMF); deterministic resampling
9. Probabilistic outlier detection for sparse multivariate geotechnical site investigation data using Bayesian learning (Cited by 3)
Authors: Shuo Zheng, Yu-Xin Zhu, Dian-Qing Li, Zi-Jun Cao, Qin-Xuan Deng, Kok-Kwang Phoon. Geoscience Frontiers (SCIE, CAS, CSCD), 2021, No. 1, pp. 425-439 (15 pages)
Various uncertainties arising during the acquisition of geoscience data may result in anomalous data instances (i.e., outliers) that do not conform with the expected pattern of regular data instances. With sparse multivariate data obtained from geotechnical site investigation, it is impossible to identify outliers with certainty due to the distortion of the statistics of geotechnical parameters caused by outliers, and the associated statistical uncertainty resulting from data sparsity. This paper develops a probabilistic outlier detection method for sparse multivariate data obtained from geotechnical site investigation. The proposed approach quantifies the outlying probability of each data instance based on Mahalanobis distance and determines outliers as those data instances with outlying probabilities greater than 0.5. It tackles the distortion of statistics estimated from a dataset with outliers by a re-sampling technique, and rationally accounts for the statistical uncertainty by Bayesian machine learning. Moreover, the proposed approach also suggests an exclusive method to determine the outlying components of each outlier. The proposed approach is illustrated and verified using simulated and real-life datasets. It is shown that the proposed approach properly identifies outliers among sparse multivariate data, and their corresponding outlying components, in a probabilistic manner. It can significantly reduce the masking effect (i.e., missing some actual outliers due to the distortion of statistics by the outliers and statistical uncertainty). It is also found that outliers among sparse multivariate data instances significantly affect the construction of the multivariate distribution of geotechnical parameters for uncertainty quantification. This emphasizes the necessity of data cleaning (e.g., outlier detection) for uncertainty quantification based on geoscience data.
Keywords: outlier detection; site investigation; sparse multivariate data; Mahalanobis distance; resampling by half-means; Bayesian machine learning
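The combination of half-sample resampling and Mahalanobis distance can be sketched as follows (a simplified frequentist stand-in for the paper's Bayesian scheme; `top_frac` and the exact thresholding convention are illustrative assumptions):

```python
import numpy as np

def mahalanobis(X, mean, cov):
    """Row-wise Mahalanobis distance of X from (mean, cov)."""
    d = X - mean
    return np.sqrt(np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d))

def outlying_probability(X, n_draws=200, top_frac=0.1, rng=None):
    """Half-means-style resampling sketch: estimate mean/covariance from
    random half-samples and count how often each point falls among the
    top `top_frac` of Mahalanobis distances."""
    rng = np.random.default_rng(rng)
    n = len(X)
    hits = np.zeros(n)
    k = max(1, int(top_frac * n))
    for _ in range(n_draws):
        half = rng.choice(n, n // 2, replace=False)
        mean = X[half].mean(axis=0)
        cov = np.cov(X[half], rowvar=False) + 1e-9 * np.eye(X.shape[1])
        hits[np.argsort(mahalanobis(X, mean, cov))[-k:]] += 1
    return hits / n_draws
```

Estimating the statistics from half-samples is what limits the distortion an outlier can exert on its own distance score, mitigating the masking effect the abstract describes.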
10. Jackknife based generalized resampling reliability approach for rock slopes and tunnels stability analyses with limited data: Theory and applications (Cited by 3)
Authors: Akshay Kumar, Gaurav Tiwari. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2022, No. 3, pp. 714-730 (17 pages)
An efficient resampling reliability approach was developed to consider the effect of statistical uncertainties in input properties, arising due to insufficient data, when estimating the reliability of rock slopes and tunnels. This approach considers the effect of uncertainties in both the distribution parameters (mean and standard deviation) and the distribution types of input properties. Further, the approach was generalized to make it capable of analyzing complex problems with explicit/implicit performance functions (PFs), single/multiple PFs, and correlated/non-correlated input properties. It couples the jackknife resampling statistical tool with advanced reliability tools such as Latin hypercube sampling (LHS), Sobol's global sensitivity, the moving least square-response surface method (MLS-RSM), and Nataf's transformation. The developed approach was demonstrated for four cases encompassing different types. Results were compared with a recently developed bootstrap-based resampling reliability approach, and show that the approach is accurate and significantly more efficient than the bootstrap-based approach. The proposed approach reflects the effect of statistical uncertainties in input properties by estimating distributions/confidence intervals of the reliability index/probability of failure, instead of fixed-point estimates. Further, sufficiently accurate results were obtained by considering uncertainties in distribution parameters only and ignoring those in distribution types.
Keywords: statistical uncertainty; resampling reliability; moving least square-response surface method (MLS-RSM); Sobol's global sensitivity; correlation coefficient
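The delete-one jackknife at the heart of the approach can be sketched in a few lines (generic jackknife standard error of an arbitrary statistic, not the paper's full reliability pipeline):

```python
import numpy as np

def jackknife(data, stat):
    """Delete-one jackknife: return the leave-one-out replicates of `stat`
    and the jackknife estimate of its standard error."""
    n = len(data)
    reps = np.array([stat(np.delete(data, i)) for i in range(n)])
    se = np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2))
    return reps, se
```

For the sample mean, this standard error reduces exactly to the familiar s/√n, which gives a quick correctness check; for reliability analysis, `stat` would be a full reliability-index computation on the reduced sample.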
11. A Comprehensive Evaluation of PAN-Sharpening Algorithms Coupled with Resampling Methods for Image Synthesis of Very High Resolution Remotely Sensed Satellite Data (Cited by 6)
Authors: Shridhar D. Jawak, Alvarinho J. Luis. Advances in Remote Sensing, 2013, No. 4, pp. 332-344 (13 pages)
The merging of a panchromatic (PAN) image with a multispectral satellite image (MSI) to increase the spatial resolution of the MSI, while simultaneously preserving its spectral information, is classically referred to as PAN-sharpening. We employed a recent dataset derived from the very high resolution WorldView-2 satellite (PAN and MSI) for two test sites (one over an urban area and the other over Antarctica) to comprehensively evaluate the performance of six existing PAN-sharpening algorithms. The algorithms under consideration were Gram-Schmidt (GS), Ehlers fusion (EF), modified hue-intensity-saturation (Mod-HIS), high pass filtering (HPF), the Brovey transform (BT), and wavelet-based principal component analysis (W-PC). Quality assessment of the sharpened images was carried out using 20 quality indices. We also analyzed the performance of the nearest neighbour (NN), bilinear interpolation (BI), and cubic convolution (CC) resampling methods to test their practicability in the PAN-sharpening process. Our results indicate that the comprehensive performance of the PAN-sharpening methods decreased in the following order: GS > W-PC > EF > HPF > Mod-HIS > BT, while the resampling methods followed the order: NN > BI > CC.
Keywords: PAN-sharpening; WorldView-2; resampling methods
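One of the evaluated methods, the Brovey transform, is simple enough to sketch (assuming the MS bands have already been resampled to the PAN grid; the uniform band weighting in the intensity term is an assumption):

```python
import numpy as np

def brovey_sharpen(ms, pan, eps=1e-12):
    """Brovey transform PAN-sharpening sketch: scale each multispectral band
    by the ratio of the PAN image to the per-pixel mean of the MS bands.
    ms has shape (bands, H, W); pan has shape (H, W)."""
    intensity = ms.mean(axis=0)
    return ms * (pan / (intensity + eps))
```

The resampling step being compared in the paper (NN, BI, or CC) is exactly the operation that brings `ms` onto the PAN grid before this ratio is applied.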
12. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends (Cited by 3)
Authors: Mehmood Awan, Yannick Le Moullec, Peter Koch, Fred Harris. ZTE Communications, 2012, No. 1, pp. 54-62, 70 (10 pages)
In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with a modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M-data-load time period. We present a load-process architecture (LPA) and a runtime architecture (RA), based on a serial polyphase structure, which have different scheduling. In LPA, N subfilters are loaded, and then M subfilters are processed at a clock rate that is a multiple of the input data rate. This is necessary to meet the output time constraint of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within the N-data-load time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors for maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is used as a case study, and an analysis of area, time, and power for their FPGA architectures is given. For resource-optimized SDR front-ends, RA is superior for reducing operating clock rates and dynamic power consumption. RA is also superior for reducing area resources, except when indices are prestored in LUTs.
Keywords: SDR; FPGA; digital front-ends; polyphase filter bank; embedded resampling
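The arithmetic saving behind polyphase resampling can be illustrated in software (a behavioral NumPy model of decimation by M, not the FPGA architecture; it reproduces `conv(x, h)` downsampled by M using M branch filters that each run at the low output rate):

```python
import numpy as np

def polyphase_decimate(x, h, M):
    """Behavioral polyphase decimation by M: split the FIR filter h into M
    branch filters h[p*M + r] and sum the branch outputs. Numerically
    equivalent to np.convolve(x, h)[::M]."""
    h = np.concatenate([h, np.zeros((-len(h)) % M)])   # pad h to a multiple of M
    n_out = (len(x) + len(h) - 1 + M - 1) // M         # ceil(full-conv length / M)
    y = np.zeros(n_out)
    for r in range(M):
        hr = h[r::M]                                   # branch filter h[p*M + r]
        if r == 0:
            ur = x[::M]                                # branch input x[q*M]
        else:
            ur = np.concatenate([[0.0], x[M - r::M]])  # branch input x[q*M - r]
        br = np.convolve(hr, ur)                       # runs at the output rate
        m = min(len(br), n_out)
        y[:m] += br[:m]
    return y
```

Each branch convolution runs at 1/M of the input rate, which is the property the paper's LPA/RA schedules exploit to trade clock rate against area on the FPGA.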
13. Polyphase Filter Banks for Embedded Sample Rate Changes in Digital Radio Front-Ends (Cited by 3)
Authors: Mehmood Awan, Yannick Le Moullec, Peter Koch, Fred Harris. ZTE Communications, 2011, No. 4, pp. 3-9 (7 pages)
This paper presents efficient processing engines for software-defined radio (SDR) front-ends. These engines, based on a polyphase channelizer, perform arbitrary sample-rate changes, frequency selection, and bandwidth control. This paper presents an M-path polyphase filter bank based on a modified N-path polyphase filter. Such a system allows resampling by arbitrary ratios while performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. This resampling technique is based on a sliding cyclic data load interacting with cyclic-shifted coefficients. A non-maximally-decimated polyphase filter bank (where the number of data loads is not equal to the number of M subfilters) processes M subfilters in a time period that is less than or greater than the M data loads. A polyphase filter bank with five different resampling modes is used as a case study for embedded resampling in SDR front-ends. These modes are (i) maximally decimated, (ii) under-decimated, (iii) over-decimated, and combined up- and down-sampling with (iv) single stride length and (v) multiple stride lengths. These modes can be used to obtain any required rational sampling-rate change in an SDR front-end based on a polyphase channelizer. They can also be used for translation to and from arbitrary center frequencies that are unrelated to the output sample rates.
Keywords: SDR; digital front-ends; polyphase filter bank; embedded resampling
14. Transient characteristics verification method for DC transformer used in flexible HVDC system (Cited by 2)
Authors: Qi Nie, Haoliang Hu, Dengyun Li, Boyang Liu, He Li, Qianzhu Xiong. Global Energy Interconnection, 2019, No. 2, pp. 180-187 (8 pages)
Previous studies have proposed higher requirements for the transient characteristics of a DC transformer used in a flexible high-voltage direct current (HVDC) system, in order to achieve faster sampling speed, meet wider bandwidth requirements of the control and protection signal, and eventually suppress large transient fault currents. In this study, a verification method for the transient characteristics of a DC transformer used in a flexible HVDC system is proposed, based on resampling technology and LabVIEW measurement technology, after analyzing the key technology for transient characteristics verification of a DC transformer. A laboratory experiment on the transient characteristics of a full-fiber electronic DC transformer is conducted, and the experimental results show that such a verification method can be employed for frequency-response and step-response verification of a DC transformer at 10% of the rated voltage and current, and can eventually improve the screening of DC transformers.
Keywords: DC transformer; step response; transient characteristic; resampling technology; verification method
15. Variation of Q value before and after the 1999 Xiuyan, Liaoning Province, M=5.4 earthquake on the basis of analysis on attenuation dispersion of P waves (Cited by 1)
Authors: 刘希强, 孙庆文, 李红, 石玉燕, 季爱东, 王峰吉. Acta Seismologica Sinica (English Edition) (EI, CSCD), 2005, No. 5, pp. 16-26 (11 pages)
A method for determining the medium quality factor is developed on the basis of analyzing the attenuation dispersion of the first-arriving P wave period. In order to enhance the signal-to-noise ratio, improve the resolution in measurement, and reduce systematic error, we applied a data resampling technique. The group velocity delay of the P wave was derived by using an improved multi-filtering method. Based on a linear viscoelastic relaxation model, we deduced the medium quality factor Qm and its associated error at the 95% confidence level. Applying the method to the seismic records of the Xiuyan M=5.4 earthquake sequence, we obtained the following results: (1) High Qm started to appear from Nov. 9, 1999. The events yielding the deduced high Qm values clustered in a region with epicentral distances between 32 and 46 km to the Yingkou station. This Qm-versus-distance observation obviously deviates from the normal trend of Qm linearly increasing with distance. (2) The average Qm before the 29 Dec. 1999 M=5.4 earthquake is 460, while the average Qm between the M=5.4 event and the 12 Jan. 2000 M=5.1 earthquake is 391, and the average Qm after the M=5.1 event is 204.
Keywords: P wave; data resampling; multi-filtering; anelasticity of medium; inversion of medium quality factor; Qm variation before and after earthquake
16. Resampling Factor Estimation via Dual-Stream Convolutional Neural Network (Cited by 1)
Authors: Shangjun Luo, Junwei Luo, Wei Lu, Yanmei Fang, Jinhua Zeng, Shaopei Shi, Yue Zhang. Computers, Materials & Continua (SCIE, EI), 2021, No. 1, pp. 647-657 (11 pages)
The estimation of image resampling factors is an important problem in image forensics. Among all resampling factor estimation methods, spectrum-based methods are among the most widely used and have attracted a lot of research interest. However, because of inherent ambiguity, spectrum-based methods fail to discriminate upscale and downscale operations without any prior information. In general, the application of resampling leaves detectable traces in both the spatial domain and the frequency domain of a resampled image. Firstly, the resampling process introduces correlations between neighboring pixels; as a result, a set of periodic pixels that are correlated to their neighbors can be found in a resampled image. Secondly, a resampled image has distinct and strong peaks in its spectrum, while the spectrum of the original image has no clear peaks. Hence, in this paper, we propose a dual-stream convolutional neural network for image resampling factor estimation. One of the two streams is a gray stream whose purpose is to extract resampling-trace features directly from the rescaled images. The other is a frequency stream that discovers the differences in spectrum between rescaled and original images. The features from the two streams are then fused to construct a feature representation including the resampling traces left in the spatial and frequency domains, which is later fed into a softmax layer for resampling factor estimation. Experimental results show that the proposed method is effective for resampling factor estimation and outperforms some CNN-based methods.
Keywords: image forensics; image resampling detection; parameter estimation; convolutional neural network
17. Improved Method for Hilbert Instantaneous Frequency Estimation (Cited by 3)
Authors: Bo Lin, Liu Xiaofeng, Qin Shuren. Chinese Journal of Mechanical Engineering (SCIE, EI, CAS, CSCD), 2007, No. 6, pp. 94-98 (5 pages)
In the mechanical fault detection and diagnosis field, it is more and more important to analyze the instantaneous frequency (IF) character of complex vibration signals. An improved IF estimation method is put forward, aiming at the shortcomings of the traditional Hilbert transform. It is based on the Hilbert transform in the wavelet domain. With the help of the relationship between the real part and the imaginary part obtained from the complex coefficients of the continuous wavelet transform, or from the analytical signal reconstructed in wavelet packet decomposition, the instantaneous phase function of the subcomponent is extracted. In order to improve the precision of the estimated IF, means such as linear regression, adaptive filtering, and resampling are applied to the instantaneous phase obtained; then the central differencing operator is used to get the desired IF. Simulation results with synthetic and gearbox fault signals are included to illustrate the proposed method.
Keywords: instantaneous frequency; Hilbert transform; wavelet transform; linear regression; adaptive filtering; resampling
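The classical Hilbert-transform IF estimator that the paper improves on can be sketched with an FFT-based analytic signal (a textbook baseline, not the wavelet-domain method; phase differencing matches the central-differencing step the abstract mentions):

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Classical Hilbert IF estimation: build the analytic signal by zeroing
    negative frequencies in the FFT, unwrap its phase, and differentiate to
    get frequency in Hz."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(np.fft.fft(x) * h)       # x + j*Hilbert{x}
    phase = np.unwrap(np.angle(analytic))
    return np.gradient(phase) * fs / (2 * np.pi)    # central differencing
```

For a pure tone the estimate is constant at the tone frequency; for multicomponent vibration signals, this is where the paper's wavelet-domain decomposition becomes necessary before the phase is extracted.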
18. Particle Filter Object Tracking Algorithm Based on Sparse Representation and Nonlinear Resampling (Cited by 3)
Authors: Zheyi Fan, Shuqin Weng, Jiao Jiang, Yixuan Zhu, Zhiwen Liu. Journal of Beijing Institute of Technology (EI, CAS), 2018, No. 1, pp. 51-57 (7 pages)
Object tracking with abrupt motion is an important research topic and has attracted wide attention. To obtain accurate tracking results, an improved particle filter tracking algorithm based on sparse representation and nonlinear resampling is proposed in this paper. First, the sparse representation is used to compute particle weights by considering the fact that the weights are sparse when the object moves abruptly, so the potential object region can be predicted more precisely. Then, a nonlinear resampling process is proposed by utilizing the nonlinear sorting strategy, which can solve the problem of particle diversity impoverishment caused by traditional resampling methods. Experimental results based on videos containing objects with various abrupt motions have demonstrated the effectiveness of the proposed algorithm.
Keywords: object tracking; abrupt motion; particle filter; sparse representation; nonlinear resampling
19. Modified sequential importance resampling filter (Cited by 1)
Authors: Yong Wu, Jun Wang, Xiaoyong L, Yunhe Cao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2015, No. 3, pp. 441-449 (9 pages)
In order to deal with the particle degeneracy and impoverishment problems existing in particle filters, a modified sequential importance resampling (MSIR) filter is proposed. In this filter, the resampling is translated into an evolutionary process, just like biological evolution. A particle generator is constructed, which introduces the current measurement information (CMI) into the resampled particles. In the evolution, new particles are first produced through the particle generator, each of which is essentially an unbiased estimation of the current true state. Then, new and old particles are recombined for the sake of raising the diversity among the particles. Finally, those particles which have low quality are eliminated. Through the evolution, all the particles retained are regarded as the optimal ones, and these particles are utilized to update the current state. By using the proposed resampling approach, not only is the CMI incorporated into each resampled particle, but the particle degeneracy and the loss of diversity among the particles are also mitigated, resulting in improved estimation accuracy. Simulation results show the superiority of the proposed filter over the standard sequential importance resampling (SIR) filter, the auxiliary particle filter, and the unscented Kalman particle filter.
Keywords: sequential importance resampling (SIR); evolution; current measurement information (CMI); unbiased estimation
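The standard SIR baseline that the MSIR filter modifies can be sketched for a scalar state (an illustrative sketch; the transition `f`, observation `h`, and Gaussian noise levels are assumptions, not the paper's simulation setup):

```python
import numpy as np

def sir_step(particles, weights, y, f, h, q_std, r_std, rng=None):
    """One cycle of a standard SIR particle filter: propagate through the
    transition model, reweight by the Gaussian likelihood of measurement y,
    and multinomially resample back to uniform weights."""
    rng = np.random.default_rng(rng)
    n = len(particles)
    particles = f(particles) + rng.normal(0.0, q_std, n)       # propagate
    w = weights * np.exp(-0.5 * ((y - h(particles)) / r_std) ** 2)
    w /= w.sum()                                               # normalize
    idx = rng.choice(n, n, p=w)                                # resample
    return particles[idx], np.full(n, 1.0 / n)
```

Note that the measurement only enters through the weights here; the MSIR filter's particle generator instead injects the current measurement information into the resampled particles themselves.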
20. Performance of Resampling Algorithms Based on Particle Filter in Video Target Tracking
Authors: 韩华, 王裕明, 张玉金, 胡一帆. Journal of Donghua University (English Edition) (EI, CAS), 2016, No. 5, pp. 745-748 (4 pages)
The particle filter is a common algorithm in video target tracking, but it still has some shortcomings, for example, the particle degradation phenomenon. The general solution to this problem is to introduce a resampling step. At present, four kinds of resampling algorithms are widely used: multinomial resampling, residual resampling, stratified resampling, and systematic resampling. In this paper, the performances of these four resampling algorithms were analyzed in terms of realization principle, uniform distribution theory, and computational complexity. Finally, through a series of video target tracking experiments, the systematic resampling algorithm had the smallest computational load, the shortest running time, and the maximum number of effective particles. It can therefore be concluded that, in the field of video target tracking, the systematic resampling algorithm has more advantages than the other three algorithms in both running time and the number of effective particles.
Keywords: video target tracking; multinomial resampling; residual resampling; stratified resampling; systematic resampling
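Systematic resampling, the winner of the comparison, can be sketched in a few lines (its low cost comes from needing only one uniform draw and one O(N) pass over the cumulative weights):

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: match N evenly spaced positions, offset by a
    single uniform draw, against the cumulative weight function."""
    rng = np.random.default_rng(rng)
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one draw, N strata
    c = np.cumsum(weights)
    c[-1] = 1.0                                     # guard against rounding
    return np.searchsorted(c, positions)            # ancestor indices
```

Because the positions are evenly spaced, each particle's offspring count can differ from N times its weight by less than one, which is why the method preserves more effective particles than multinomial resampling.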