Global variance reduction is a bottleneck in Monte Carlo shielding calculations.The global variance reduction problem requires that the statistical error of the entire space is uniform.This study proposed a grid-AIS m...Global variance reduction is a bottleneck in Monte Carlo shielding calculations.The global variance reduction problem requires that the statistical error of the entire space is uniform.This study proposed a grid-AIS method for the global variance reduction problem based on the AIS method,which was implemented in the Monte Carlo program MCShield.The proposed method was validated using the VENUS-Ⅲ international benchmark problem and a self-shielding calculation example.The results from the VENUS-Ⅲ benchmark problem showed that the grid-AIS method achieved a significant reduction in the variance of the statistical errors of the MESH grids,decreasing from 1.08×10^(-2) to 3.84×10^(-3),representing a 64.00% reduction.This demonstrates that the grid-AIS method is effective in addressing global issues.The results of the selfshielding calculation demonstrate that the grid-AIS method produced accurate computational results.Moreover,the grid-AIS method exhibited a computational efficiency approximately one order of magnitude higher than that of the AIS method and approximately two orders of magnitude higher than that of the conventional Monte Carlo method.展开更多
The condensation tracking algorithm uses a prior transition probability as the proposal distribution, which does not make full use of the current observation. In order to overcome this shortcoming, a new face tracking...The condensation tracking algorithm uses a prior transition probability as the proposal distribution, which does not make full use of the current observation. In order to overcome this shortcoming, a new face tracking algorithm based on particle filter with mean shift importance sampling is proposed. First, the coarse location of the face target is attained by the efficient mean shift tracker, and then the result is used to construct the proposal distribution for particle propagation. Because the particles obtained with this method can cluster around the true state region, particle efficiency is improved greatly. The experimental results show that the performance of the proposed algorithm is better than that of the standard condensation tracking algorithm.展开更多
Based on the observation of importance sampling and second order information about the failure surface of a structure, an importance sampling region is defined in V-space which is obtained by rotating a U-space at the...Based on the observation of importance sampling and second order information about the failure surface of a structure, an importance sampling region is defined in V-space which is obtained by rotating a U-space at the point of maximum likelihood. The sampling region is a hyper-ellipsoid that consists of the sampling ellipse on each plane of main curvature in V-space. Thus, the sampling probability density function can be constructed by the sampling region center and ellipsoid axes. Several examples have shown the efficiency and generality of this method.展开更多
The reliability and sensitivity analyses of stator blade regulator usually involve complex characteristics like highnonlinearity,multi-failure regions,and small failure probability,which brings in unacceptable computi...The reliability and sensitivity analyses of stator blade regulator usually involve complex characteristics like highnonlinearity,multi-failure regions,and small failure probability,which brings in unacceptable computing efficiency and accuracy of the current analysismethods.In this case,by fitting the implicit limit state function(LSF)with active Kriging(AK)model and reducing candidate sample poolwith adaptive importance sampling(AIS),a novel AK-AIS method is proposed.Herein,theAKmodel andMarkov chainMonte Carlo(MCMC)are first established to identify the most probable failure region(s)(MPFRs),and the adaptive kernel density estimation(AKDE)importance sampling function is constructed to select the candidate samples.With the best samples sequentially attained in the reduced candidate samples and employed to update the Kriging-fitted LSF,the failure probability and sensitivity indices are acquired at a lower cost.The proposed method is verified by twomulti-failure numerical examples,and then applied to the reliability and sensitivity analyses of a typical stator blade regulator.Withmethods comparison,the proposed AK-AIS is proven to hold the computing advantages on accuracy and efficiency in complex reliability and sensitivity analysis problems.展开更多
In this paper, an importance sampling maximum likelihood(ISML) estimator for direction-of-arrival(DOA) of incoherently distributed(ID) sources is proposed. Starting from the maximum likelihood estimation description o...In this paper, an importance sampling maximum likelihood(ISML) estimator for direction-of-arrival(DOA) of incoherently distributed(ID) sources is proposed. Starting from the maximum likelihood estimation description of the uniform linear array(ULA), a decoupled concentrated likelihood function(CLF) is presented. A new objective function based on CLF which can obtain a closed-form solution of global maximum is constructed according to Pincus theorem. To obtain the optimal value of the objective function which is a complex high-dimensional integral,we propose an importance sampling approach based on Monte Carlo random calculation. Next, an importance function is derived, which can simplify the problem of generating random vector from a high-dimensional probability density function(PDF) to generate random variable from a one-dimensional PDF. Compared with the existing maximum likelihood(ML) algorithms for DOA estimation of ID sources, the proposed algorithm does not require initial estimates, and its performance is closer to CramerRao lower bound(CRLB). The proposed algorithm performs better than the existing methods when the interval between sources to be estimated is small and in low signal to noise ratio(SNR)scenarios.展开更多
In order to deal with the particle degeneracy and impov- erishment problems existed in particle filters, a modified sequential importance resampling (MSIR) filter is proposed. In this filter, the resampling is trans...In order to deal with the particle degeneracy and impov- erishment problems existed in particle filters, a modified sequential importance resampling (MSIR) filter is proposed. In this filter, the resampling is translated into an evolutional process just like the biological evolution. A particle generator is constructed, which introduces the current measurement information (CMI) into the resampled particles. In the evolution, new particles are first pro- duced through the particle generator, each of which is essentially an unbiased estimation of the current true state. Then, new and old particles are recombined for the sake of raising the diversity among the particles. Finally, those particles who have low quality are eliminated. Through the evolution, all the particles retained are regarded as the optimal ones, and these particles are utilized to update the current state. By using the proposed resampling approach, not only the CMI is incorporated into each resampled particle, but also the particle degeneracy and the loss of diver- sity among the particles are mitigated, resulting in the improved estimation accuracy. Simulation results show the superiorities of the proposed filter over the standard sequential importance re- sampling (SIR) filter, auxiliary particle filter and unscented Kalman particle filter.展开更多
The design, analysis and parallel implementation of particle filter(PF) were investigated. Firstly, to tackle the particle degeneracy problem in the PF, an iterated importance density function(IIDF) was proposed, wher...The design, analysis and parallel implementation of particle filter(PF) were investigated. Firstly, to tackle the particle degeneracy problem in the PF, an iterated importance density function(IIDF) was proposed, where a new term associating with the current measurement information(CMI) was introduced into the expression of the sampled particles. Through the repeated use of the least squares estimate, the CMI can be integrated into the sampling stage in an iterative manner, conducing to the greatly improved sampling quality. By running the IIDF, an iterated PF(IPF) can be obtained. Subsequently, a parallel resampling(PR) was proposed for the purpose of parallel implementation of IPF, whose main idea was the same as systematic resampling(SR) but performed differently. The PR directly used the integral part of the product of the particle weight and particle number as the number of times that a particle was replicated, and it simultaneously eliminated the particles with the smallest weights, which are the two key differences from the SR. The detailed implementation procedures on the graphics processing unit of IPF based on the PR were presented at last. The performance of the IPF, PR and their parallel implementations are illustrated via one-dimensional numerical simulation and practical application of passive radar target tracking.展开更多
It is assumed that the storm wave takes place once a year during the design period, and Nhistories of storm waves are generated on the basis of wave spectrum corresponding to the N-year design period. The responses of...It is assumed that the storm wave takes place once a year during the design period, and Nhistories of storm waves are generated on the basis of wave spectrum corresponding to the N-year design period. The responses of the breakwater to the N histories of storm waves in the N-year design period are calculated by mass-spring-dashpot mode and taken as a set of samples. The failure probability of caisson breakwaters during the design period of N years is obtained by the statistical analysis of many sets of samples. It is the key issue to improve the efficiency of the common Monte Carlo simulation method in the failure probability estimation of caisson breakwaters in the complete life cycle. In this paper, the kernel method of importance sampling, which can greatly increase the efficiency of failure probability calculation of caisson breakwaters, is proposed to estimate the failure probability of caisson breakwaters in the complete life cycle. The effectiveness of the kernel method is investigated by an example. It is indicated that the calculation efficiency of the kernel method is over 10 times the common Monte Carlo simulation method.展开更多
The process of changing the channel associated with the current connection while a call is in progress is under consideration. The estimation of dropping rate in handover process of a one dimensional traffic system is...The process of changing the channel associated with the current connection while a call is in progress is under consideration. The estimation of dropping rate in handover process of a one dimensional traffic system is discussed. To reduce the sample size of simulation, dropping calls at base station is considered as rare event and simulated with importance sampling - one of rare event simulation approaches. The simulation results suggest the sample size can be tremendously reduced by using importance sampling.展开更多
We investigate the phenomena of spontaneous symmetry breaking for φ^4 model on a square lattice in the parameter space by using the potential importance samplingmethod, which was proposed by Milchev, Heermann, and Bi...We investigate the phenomena of spontaneous symmetry breaking for φ^4 model on a square lattice in the parameter space by using the potential importance samplingmethod, which was proposed by Milchev, Heermann, and Binder [J. Star. Phys. 44 (1986) 749]. The critical values of the parameters allow us to determine the phase diagram of the model. At the same time, some relevant quantifies such as susceptibility and specific heat are also obtained.展开更多
Value at Risk (VaR) is an important tool for estimating the risk of a financial portfolio under significant loss. Although Monte Carlo simulation is a powerful tool for estimating VaR, it is quite inefficient since th...Value at Risk (VaR) is an important tool for estimating the risk of a financial portfolio under significant loss. Although Monte Carlo simulation is a powerful tool for estimating VaR, it is quite inefficient since the event of significant loss is usually rare. Previous studies suggest that the performance of the Monte Carlo simulation can be improved by impor-tance sampling if the market returns follow the normality or the distributions. The first contribution of our paper is to extend the importance sampling method for dealing with jump-diffusion market returns, which can more precisely model the phenomenon of high peaks, heavy tails, and jumps of market returns mentioned in numerous empirical study papers. This paper also points out that for portfolios of which the huge loss is triggered by significantly distinct events, naively applying importance sampling method can result in poor performance. The second contribution of our paper is to develop the hybrid importance sampling method for the aforementioned problem. Our method decomposes a Monte Carlo simulation into sub simulations, and each sub simulation focuses only on one huge loss event. Thus the perform-ance for each sub simulation is improved by importance sampling method, and overall performance is optimized by determining the allotment of samples to each sub simulation by Lagrange’s multiplier. Numerical experiments are given to verify the superiority of our method.展开更多
The aim of this study is to investigate the impacts of the sampling strategy of landslide and non-landslide on the performance of landslide susceptibility assessment(LSA).The study area is the Feiyun catchment in Wenz...The aim of this study is to investigate the impacts of the sampling strategy of landslide and non-landslide on the performance of landslide susceptibility assessment(LSA).The study area is the Feiyun catchment in Wenzhou City,Southeast China.Two types of landslides samples,combined with seven non-landslide sampling strategies,resulted in a total of 14 scenarios.The corresponding landslide susceptibility map(LSM)for each scenario was generated using the random forest model.The receiver operating characteristic(ROC)curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy.The results showed that higher accuracies were achieved when using the landslide core as positive samples,combined with non-landslide sampling from the very low zone or buffer zone.The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA,which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.展开更多
This paper contributes to the structural reliability problem by presenting a novel approach that enables for identification of stochastic oscillatory processes as a critical input for given mechanical models. Identifi...This paper contributes to the structural reliability problem by presenting a novel approach that enables for identification of stochastic oscillatory processes as a critical input for given mechanical models. Identification development follows a transparent image processing paradigm completely independent of state-of-the-art structural dynamics, aiming at delivering a simple and wide purpose method. Validation of the proposed importance sampling strategy is based on multi-scale clusters of realizations of digitally generated non-stationary stochastic processes. Good agreement with the reference pure Monte Carlo results indicates a significant potential in reducing the computational task of first passage probabilities estimation, an important feature in the field of e.g., probabilistic seismic design or risk assessment generally.展开更多
为应对基于游戏的学习平台在知识追踪应用方面的不足,本研究利用Field Day Lab提供的教育游戏用户日志进行深入分析。采用方差法和Null Importance方法对数据集进行降维处理,并结合K折交叉验证与LightGBM算法,建立了一个高效的预测模型...为应对基于游戏的学习平台在知识追踪应用方面的不足,本研究利用Field Day Lab提供的教育游戏用户日志进行深入分析。采用方差法和Null Importance方法对数据集进行降维处理,并结合K折交叉验证与LightGBM算法,建立了一个高效的预测模型。此外,通过集成Logistic模型,构建起Stacking模型。研究表明,该模型在验证集上的Macro-F1值显著提升至0.699,同时也在测试集上显示出优异的泛化能力。本研究为教育游戏领域的知识追踪提供了创新方法,并为游戏开发与教育实践提供了宝贵参考,支持教育游戏的开发者为学生创造更有效的学习体验。展开更多
In this paper,we establish a new multivariate Hermite sampling series involving samples from the function itself and its mixed and non-mixed partial derivatives of arbitrary order.This multivariate form of Hermite sam...In this paper,we establish a new multivariate Hermite sampling series involving samples from the function itself and its mixed and non-mixed partial derivatives of arbitrary order.This multivariate form of Hermite sampling will be valid for some classes of multivariate entire functions,satisfying certain growth conditions.We will show that many known results included in Commun Korean Math Soc,2002,17:731-740,Turk J Math,2017,41:387-403 and Filomat,2020,34:3339-3347 are special cases of our results.Moreover,we estimate the truncation error of this sampling based on localized sampling without decay assumption.Illustrative examples are also presented.展开更多
This study presents the design of a modified attributed control chart based on a double sampling(DS)np chart applied in combination with generalized multiple dependent state(GMDS)sampling to monitor the mean life of t...This study presents the design of a modified attributed control chart based on a double sampling(DS)np chart applied in combination with generalized multiple dependent state(GMDS)sampling to monitor the mean life of the product based on the time truncated life test employing theWeibull distribution.The control chart developed supports the examination of the mean lifespan variation for a particular product in the process of manufacturing.Three control limit levels are used:the warning control limit,inner control limit,and outer control limit.Together,they enhance the capability for variation detection.A genetic algorithm can be used for optimization during the in-control process,whereby the optimal parameters can be established for the proposed control chart.The control chart performance is assessed using the average run length,while the influence of the model parameters upon the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design withmultiple linear regression.A comparative study was conducted based on the out-of-control average run length,in which the developed control chart offered greater sensitivity in the detection of process shifts while making use of smaller samples on average than is the case for existing control charts.Finally,to exhibit the utility of the developed control chart,this paper presents its application using simulated data with parameters drawn from the real set of data.展开更多
The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms,yielding outstanding achievements across diverse domains.Nonetheless,self-atten...The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms,yielding outstanding achievements across diverse domains.Nonetheless,self-attention mechanisms falter when applied to datasets with intricate semantic content and extensive dependency structures.In response,this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network(DSLD),which adopts a diffusion sampling method to capture more comprehensive semantic information of the data.Additionally,themodel leverages the joint correlation information of labels and data to introduce the computation of text representation,correcting semantic representationbiases in thedata,andincreasing the accuracyof semantic representation.Ultimately,the model computes the corresponding classification results by synthesizing these rich data semantic representations.Experiments on seven benchmark datasets show that our proposed model achieves competitive results compared to state-of-the-art methods.展开更多
The rapid advancement and broad application of machine learning(ML)have driven a groundbreaking revolution in computational biology.One of the most cutting-edge and important applications of ML is its integration with...The rapid advancement and broad application of machine learning(ML)have driven a groundbreaking revolution in computational biology.One of the most cutting-edge and important applications of ML is its integration with molecular simulations to improve the sampling efficiency of the vast conformational space of large biomolecules.This review focuses on recent studies that utilize ML-based techniques in the exploration of protein conformational landscape.We first highlight the recent development of ML-aided enhanced sampling methods,including heuristic algorithms and neural networks that are designed to refine the selection of reaction coordinates for the construction of bias potential,or facilitate the exploration of the unsampled region of the energy landscape.Further,we review the development of autoencoder based methods that combine molecular simulations and deep learning to expand the search for protein conformations.Lastly,we discuss the cutting-edge methodologies for the one-shot generation of protein conformations with precise Boltzmann weights.Collectively,this review demonstrates the promising potential of machine learning in revolutionizing our insight into the complex conformational ensembles of proteins.展开更多
Peer-to-peer(P2P)overlay networks provide message transmission capabilities for blockchain systems.Improving data transmission efficiency in P2P networks can greatly enhance the performance of blockchain systems.Howev...Peer-to-peer(P2P)overlay networks provide message transmission capabilities for blockchain systems.Improving data transmission efficiency in P2P networks can greatly enhance the performance of blockchain systems.However,traditional blockchain P2P networks face a common challenge where there is often a mismatch between the upper-layer traffic requirements and the underlying physical network topology.This mismatch results in redundant data transmission and inefficient routing,severely constraining the scalability of blockchain systems.To address these pressing issues,we propose FPSblo,an efficient transmission method for blockchain networks.Our inspiration for FPSblo stems from the Farthest Point Sampling(FPS)algorithm,a well-established technique widely utilized in point cloud image processing.In this work,we analogize blockchain nodes to points in a point cloud image and select a representative set of nodes to prioritize message forwarding so that messages reach the network edge quickly and are evenly distributed.Moreover,we compare our model with the Kadcast transmission model,which is a classic improvement model for blockchain P2P transmission networks,the experimental findings show that the FPSblo model reduces 34.8%of transmission redundancy and reduces the overload rate by 37.6%.By conducting experimental analysis,the FPS-BT model enhances the transmission capabilities of the P2P network in blockchain.展开更多
For the problem of slow search and tortuous paths in the Rapidly Exploring Random Tree(RRT)algorithm,a feedback-biased sampling RRT,called FS-RRT,is proposedbasedon RRT.Firstly,toimprove the samplingefficiency of RRT ...For the problem of slow search and tortuous paths in the Rapidly Exploring Random Tree(RRT)algorithm,a feedback-biased sampling RRT,called FS-RRT,is proposedbasedon RRT.Firstly,toimprove the samplingefficiency of RRT to shorten the search time,the search area of the randomtree is restricted to improve the sampling efficiency.Secondly,to obtain better information about obstacles to shorten the path length,a feedback-biased sampling strategy is used instead of the traditional random sampling,the collision of the expanding node with an obstacle generates feedback information so that the next expanding node avoids expanding within a specific angle range.Thirdly,this paper proposes using the inverse optimization strategy to remove redundancy points from the initial path,making the path shorter and more accurate.Finally,to satisfy the smooth operation of the robot in practice,auxiliary points are used to optimize the cubic Bezier curve to avoid path-crossing obstacles when using the Bezier curve optimization.The experimental results demonstrate that,compared to the traditional RRT algorithm,the proposed FS-RRT algorithm performs favorably against mainstream algorithms regarding running time,number of search iterations,and path length.Moreover,the improved algorithm also performs well in a narrow obstacle environment,and its effectiveness is further confirmed by experimental verification.展开更多
基金supported by the Platform Development Foundation of the China Institute for Radiation Protection(No.YP21030101)the National Natural Science Foundation of China(General Program)(Nos.12175114,U2167209)+1 种基金the National Key R&D Program of China(No.2021YFF0603600)the Tsinghua University Initiative Scientific Research Program(No.20211080081).
文摘Global variance reduction is a bottleneck in Monte Carlo shielding calculations.The global variance reduction problem requires that the statistical error of the entire space is uniform.This study proposed a grid-AIS method for the global variance reduction problem based on the AIS method,which was implemented in the Monte Carlo program MCShield.The proposed method was validated using the VENUS-Ⅲ international benchmark problem and a self-shielding calculation example.The results from the VENUS-Ⅲ benchmark problem showed that the grid-AIS method achieved a significant reduction in the variance of the statistical errors of the MESH grids,decreasing from 1.08×10^(-2) to 3.84×10^(-3),representing a 64.00% reduction.This demonstrates that the grid-AIS method is effective in addressing global issues.The results of the selfshielding calculation demonstrate that the grid-AIS method produced accurate computational results.Moreover,the grid-AIS method exhibited a computational efficiency approximately one order of magnitude higher than that of the AIS method and approximately two orders of magnitude higher than that of the conventional Monte Carlo method.
基金The National Natural Science Foundation of China(No60672094)
文摘The condensation tracking algorithm uses a prior transition probability as the proposal distribution, which does not make full use of the current observation. In order to overcome this shortcoming, a new face tracking algorithm based on particle filter with mean shift importance sampling is proposed. First, the coarse location of the face target is attained by the efficient mean shift tracker, and then the result is used to construct the proposal distribution for particle propagation. Because the particles obtained with this method can cluster around the true state region, particle efficiency is improved greatly. The experimental results show that the performance of the proposed algorithm is better than that of the standard condensation tracking algorithm.
文摘Based on the observation of importance sampling and second order information about the failure surface of a structure, an importance sampling region is defined in V-space which is obtained by rotating a U-space at the point of maximum likelihood. The sampling region is a hyper-ellipsoid that consists of the sampling ellipse on each plane of main curvature in V-space. Thus, the sampling probability density function can be constructed by the sampling region center and ellipsoid axes. Several examples have shown the efficiency and generality of this method.
基金supported by the National Natural Science Foundation of China under Grant Nos.52105136,51975028China Postdoctoral Science Foundation under Grant[No.2021M690290]the National Science and TechnologyMajor Project under Grant No.J2019-IV-0002-0069.
文摘The reliability and sensitivity analyses of stator blade regulator usually involve complex characteristics like highnonlinearity,multi-failure regions,and small failure probability,which brings in unacceptable computing efficiency and accuracy of the current analysismethods.In this case,by fitting the implicit limit state function(LSF)with active Kriging(AK)model and reducing candidate sample poolwith adaptive importance sampling(AIS),a novel AK-AIS method is proposed.Herein,theAKmodel andMarkov chainMonte Carlo(MCMC)are first established to identify the most probable failure region(s)(MPFRs),and the adaptive kernel density estimation(AKDE)importance sampling function is constructed to select the candidate samples.With the best samples sequentially attained in the reduced candidate samples and employed to update the Kriging-fitted LSF,the failure probability and sensitivity indices are acquired at a lower cost.The proposed method is verified by twomulti-failure numerical examples,and then applied to the reliability and sensitivity analyses of a typical stator blade regulator.Withmethods comparison,the proposed AK-AIS is proven to hold the computing advantages on accuracy and efficiency in complex reliability and sensitivity analysis problems.
基金supported by the basic research program of Natural Science in Shannxi province of China (2021JQ-369)。
文摘In this paper, an importance sampling maximum likelihood(ISML) estimator for direction-of-arrival(DOA) of incoherently distributed(ID) sources is proposed. Starting from the maximum likelihood estimation description of the uniform linear array(ULA), a decoupled concentrated likelihood function(CLF) is presented. A new objective function based on CLF which can obtain a closed-form solution of global maximum is constructed according to Pincus theorem. To obtain the optimal value of the objective function which is a complex high-dimensional integral,we propose an importance sampling approach based on Monte Carlo random calculation. Next, an importance function is derived, which can simplify the problem of generating random vector from a high-dimensional probability density function(PDF) to generate random variable from a one-dimensional PDF. Compared with the existing maximum likelihood(ML) algorithms for DOA estimation of ID sources, the proposed algorithm does not require initial estimates, and its performance is closer to CramerRao lower bound(CRLB). The proposed algorithm performs better than the existing methods when the interval between sources to be estimated is small and in low signal to noise ratio(SNR)scenarios.
基金supported by the National Natural Science Foundation of China(61372136)
文摘In order to deal with the particle degeneracy and impov- erishment problems existed in particle filters, a modified sequential importance resampling (MSIR) filter is proposed. In this filter, the resampling is translated into an evolutional process just like the biological evolution. A particle generator is constructed, which introduces the current measurement information (CMI) into the resampled particles. In the evolution, new particles are first pro- duced through the particle generator, each of which is essentially an unbiased estimation of the current true state. Then, new and old particles are recombined for the sake of raising the diversity among the particles. Finally, those particles who have low quality are eliminated. Through the evolution, all the particles retained are regarded as the optimal ones, and these particles are utilized to update the current state. By using the proposed resampling approach, not only the CMI is incorporated into each resampled particle, but also the particle degeneracy and the loss of diver- sity among the particles are mitigated, resulting in the improved estimation accuracy. Simulation results show the superiorities of the proposed filter over the standard sequential importance re- sampling (SIR) filter, auxiliary particle filter and unscented Kalman particle filter.
基金Project(61372136) supported by the National Natural Science Foundation of China
文摘The design, analysis and parallel implementation of particle filter(PF) were investigated. Firstly, to tackle the particle degeneracy problem in the PF, an iterated importance density function(IIDF) was proposed, where a new term associating with the current measurement information(CMI) was introduced into the expression of the sampled particles. Through the repeated use of the least squares estimate, the CMI can be integrated into the sampling stage in an iterative manner, conducing to the greatly improved sampling quality. By running the IIDF, an iterated PF(IPF) can be obtained. Subsequently, a parallel resampling(PR) was proposed for the purpose of parallel implementation of IPF, whose main idea was the same as systematic resampling(SR) but performed differently. The PR directly used the integral part of the product of the particle weight and particle number as the number of times that a particle was replicated, and it simultaneously eliminated the particles with the smallest weights, which are the two key differences from the SR. The detailed implementation procedures on the graphics processing unit of IPF based on the PR were presented at last. The performance of the IPF, PR and their parallel implementations are illustrated via one-dimensional numerical simulation and practical application of passive radar target tracking.
基金financially supported by the National Natural Science Foundation of China(Grant No.51279128)the Innovative Research Groups Science Foundation of China(Grant No.51321065)the Construction Science and Technology Project of Ministry of Transport of the People's Republic of China(Grant No.2013328224070)
文摘It is assumed that the storm wave takes place once a year during the design period, and Nhistories of storm waves are generated on the basis of wave spectrum corresponding to the N-year design period. The responses of the breakwater to the N histories of storm waves in the N-year design period are calculated by mass-spring-dashpot mode and taken as a set of samples. The failure probability of caisson breakwaters during the design period of N years is obtained by the statistical analysis of many sets of samples. It is the key issue to improve the efficiency of the common Monte Carlo simulation method in the failure probability estimation of caisson breakwaters in the complete life cycle. In this paper, the kernel method of importance sampling, which can greatly increase the efficiency of failure probability calculation of caisson breakwaters, is proposed to estimate the failure probability of caisson breakwaters in the complete life cycle. The effectiveness of the kernel method is investigated by an example. It is indicated that the calculation efficiency of the kernel method is over 10 times the common Monte Carlo simulation method.
文摘The process of changing the channel associated with the current connection while a call is in progress is under consideration. The estimation of dropping rate in handover process of a one dimensional traffic system is discussed. To reduce the sample size of simulation, dropping calls at base station is considered as rare event and simulated with importance sampling - one of rare event simulation approaches. The simulation results suggest the sample size can be tremendously reduced by using importance sampling.
文摘We investigate the phenomena of spontaneous symmetry breaking for φ^4 model on a square lattice in the parameter space by using the potential importance samplingmethod, which was proposed by Milchev, Heermann, and Binder [J. Star. Phys. 44 (1986) 749]. The critical values of the parameters allow us to determine the phase diagram of the model. At the same time, some relevant quantifies such as susceptibility and specific heat are also obtained.
文摘Value at Risk (VaR) is an important tool for estimating the risk of a financial portfolio under significant loss. Although Monte Carlo simulation is a powerful tool for estimating VaR, it is quite inefficient since the event of significant loss is usually rare. Previous studies suggest that the performance of the Monte Carlo simulation can be improved by impor-tance sampling if the market returns follow the normality or the distributions. The first contribution of our paper is to extend the importance sampling method for dealing with jump-diffusion market returns, which can more precisely model the phenomenon of high peaks, heavy tails, and jumps of market returns mentioned in numerous empirical study papers. This paper also points out that for portfolios of which the huge loss is triggered by significantly distinct events, naively applying importance sampling method can result in poor performance. The second contribution of our paper is to develop the hybrid importance sampling method for the aforementioned problem. Our method decomposes a Monte Carlo simulation into sub simulations, and each sub simulation focuses only on one huge loss event. Thus the perform-ance for each sub simulation is improved by importance sampling method, and overall performance is optimized by determining the allotment of samples to each sub simulation by Lagrange’s multiplier. Numerical experiments are given to verify the superiority of our method.
文摘The aim of this study is to investigate the impacts of the sampling strategy of landslide and non-landslide on the performance of landslide susceptibility assessment(LSA).The study area is the Feiyun catchment in Wenzhou City,Southeast China.Two types of landslides samples,combined with seven non-landslide sampling strategies,resulted in a total of 14 scenarios.The corresponding landslide susceptibility map(LSM)for each scenario was generated using the random forest model.The receiver operating characteristic(ROC)curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy.The results showed that higher accuracies were achieved when using the landslide core as positive samples,combined with non-landslide sampling from the very low zone or buffer zone.The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA,which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.
文摘This paper contributes to the structural reliability problem by presenting a novel approach that enables for identification of stochastic oscillatory processes as a critical input for given mechanical models. Identification development follows a transparent image processing paradigm completely independent of state-of-the-art structural dynamics, aiming at delivering a simple and wide purpose method. Validation of the proposed importance sampling strategy is based on multi-scale clusters of realizations of digitally generated non-stationary stochastic processes. Good agreement with the reference pure Monte Carlo results indicates a significant potential in reducing the computational task of first passage probabilities estimation, an important feature in the field of e.g., probabilistic seismic design or risk assessment generally.
文摘为应对基于游戏的学习平台在知识追踪应用方面的不足,本研究利用Field Day Lab提供的教育游戏用户日志进行深入分析。采用方差法和Null Importance方法对数据集进行降维处理,并结合K折交叉验证与LightGBM算法,建立了一个高效的预测模型。此外,通过集成Logistic模型,构建起Stacking模型。研究表明,该模型在验证集上的Macro-F1值显著提升至0.699,同时也在测试集上显示出优异的泛化能力。本研究为教育游戏领域的知识追踪提供了创新方法,并为游戏开发与教育实践提供了宝贵参考,支持教育游戏的开发者为学生创造更有效的学习体验。
文摘In this paper,we establish a new multivariate Hermite sampling series involving samples from the function itself and its mixed and non-mixed partial derivatives of arbitrary order.This multivariate form of Hermite sampling will be valid for some classes of multivariate entire functions,satisfying certain growth conditions.We will show that many known results included in Commun Korean Math Soc,2002,17:731-740,Turk J Math,2017,41:387-403 and Filomat,2020,34:3339-3347 are special cases of our results.Moreover,we estimate the truncation error of this sampling based on localized sampling without decay assumption.Illustrative examples are also presented.
基金the Science,Research and Innovation Promotion Funding(TSRI)(Grant No.FRB660012/0168)managed under Rajamangala University of Technology Thanyaburi(FRB66E0646O.4).
文摘This study presents the design of a modified attributed control chart based on a double sampling(DS)np chart applied in combination with generalized multiple dependent state(GMDS)sampling to monitor the mean life of the product based on the time truncated life test employing theWeibull distribution.The control chart developed supports the examination of the mean lifespan variation for a particular product in the process of manufacturing.Three control limit levels are used:the warning control limit,inner control limit,and outer control limit.Together,they enhance the capability for variation detection.A genetic algorithm can be used for optimization during the in-control process,whereby the optimal parameters can be established for the proposed control chart.The control chart performance is assessed using the average run length,while the influence of the model parameters upon the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design withmultiple linear regression.A comparative study was conducted based on the out-of-control average run length,in which the developed control chart offered greater sensitivity in the detection of process shifts while making use of smaller samples on average than is the case for existing control charts.Finally,to exhibit the utility of the developed control chart,this paper presents its application using simulated data with parameters drawn from the real set of data.
基金the Communication University of China(CUC230A013)the Fundamental Research Funds for the Central Universities.
文摘The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms,yielding outstanding achievements across diverse domains.Nonetheless,self-attention mechanisms falter when applied to datasets with intricate semantic content and extensive dependency structures.In response,this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network(DSLD),which adopts a diffusion sampling method to capture more comprehensive semantic information of the data.Additionally,themodel leverages the joint correlation information of labels and data to introduce the computation of text representation,correcting semantic representationbiases in thedata,andincreasing the accuracyof semantic representation.Ultimately,the model computes the corresponding classification results by synthesizing these rich data semantic representations.Experiments on seven benchmark datasets show that our proposed model achieves competitive results compared to state-of-the-art methods.
基金Project supported by the National Key Research and Development Program of China(Grant No.2023YFF1204402)the National Natural Science Foundation of China(Grant Nos.12074079 and 12374208)+1 种基金the Natural Science Foundation of Shanghai(Grant No.22ZR1406800)the China Postdoctoral Science Foundation(Grant No.2022M720815).
文摘The rapid advancement and broad application of machine learning(ML)have driven a groundbreaking revolution in computational biology.One of the most cutting-edge and important applications of ML is its integration with molecular simulations to improve the sampling efficiency of the vast conformational space of large biomolecules.This review focuses on recent studies that utilize ML-based techniques in the exploration of protein conformational landscape.We first highlight the recent development of ML-aided enhanced sampling methods,including heuristic algorithms and neural networks that are designed to refine the selection of reaction coordinates for the construction of bias potential,or facilitate the exploration of the unsampled region of the energy landscape.Further,we review the development of autoencoder based methods that combine molecular simulations and deep learning to expand the search for protein conformations.Lastly,we discuss the cutting-edge methodologies for the one-shot generation of protein conformations with precise Boltzmann weights.Collectively,this review demonstrates the promising potential of machine learning in revolutionizing our insight into the complex conformational ensembles of proteins.
基金This present research work was supported by the National Key R&D Program of China(No.2021YFB2700800)the GHfund B(No.202302024490).
文摘Peer-to-peer(P2P)overlay networks provide message transmission capabilities for blockchain systems.Improving data transmission efficiency in P2P networks can greatly enhance the performance of blockchain systems.However,traditional blockchain P2P networks face a common challenge where there is often a mismatch between the upper-layer traffic requirements and the underlying physical network topology.This mismatch results in redundant data transmission and inefficient routing,severely constraining the scalability of blockchain systems.To address these pressing issues,we propose FPSblo,an efficient transmission method for blockchain networks.Our inspiration for FPSblo stems from the Farthest Point Sampling(FPS)algorithm,a well-established technique widely utilized in point cloud image processing.In this work,we analogize blockchain nodes to points in a point cloud image and select a representative set of nodes to prioritize message forwarding so that messages reach the network edge quickly and are evenly distributed.Moreover,we compare our model with the Kadcast transmission model,which is a classic improvement model for blockchain P2P transmission networks,the experimental findings show that the FPSblo model reduces 34.8%of transmission redundancy and reduces the overload rate by 37.6%.By conducting experimental analysis,the FPS-BT model enhances the transmission capabilities of the P2P network in blockchain.
基金provided by Shaanxi Province’s Key Research and Development Plan(No.2022NY-087).
文摘For the problem of slow search and tortuous paths in the Rapidly Exploring Random Tree(RRT)algorithm,a feedback-biased sampling RRT,called FS-RRT,is proposedbasedon RRT.Firstly,toimprove the samplingefficiency of RRT to shorten the search time,the search area of the randomtree is restricted to improve the sampling efficiency.Secondly,to obtain better information about obstacles to shorten the path length,a feedback-biased sampling strategy is used instead of the traditional random sampling,the collision of the expanding node with an obstacle generates feedback information so that the next expanding node avoids expanding within a specific angle range.Thirdly,this paper proposes using the inverse optimization strategy to remove redundancy points from the initial path,making the path shorter and more accurate.Finally,to satisfy the smooth operation of the robot in practice,auxiliary points are used to optimize the cubic Bezier curve to avoid path-crossing obstacles when using the Bezier curve optimization.The experimental results demonstrate that,compared to the traditional RRT algorithm,the proposed FS-RRT algorithm performs favorably against mainstream algorithms regarding running time,number of search iterations,and path length.Moreover,the improved algorithm also performs well in a narrow obstacle environment,and its effectiveness is further confirmed by experimental verification.