Funding: supported by the National Natural Science Foundation of China (Grant Nos. 51975227 and 12272144).
Abstract: Stiffened structures have great potential for improving mechanical performance, and the study of their stability is of great interest. In this paper, the optimization of the critical buckling load factor for curved grid stiffeners is solved by using the level-set-based density method, where the shape and cross section (including thickness and width) of the stiffeners can be optimized simultaneously. The grid stiffeners are a combination of many single stiffeners, each of which is projected by its corresponding level set function. The thickness and width of each stiffener are treated as independent design variables in the projection applied to each level set function. In addition, the path of each single stiffener is described by the zero iso-contour of its level set function. All the single stiffeners are combined by the p-norm method to obtain the stiffener grid. The proposed method is validated on several numerical examples of critical buckling load factor optimization.
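As a rough illustration of the projection-and-combination step described above, the sketch below builds two single-stiffener densities from level set functions and merges them with a p-norm. The straight-line paths, the smoothed-Heaviside projection, and the exponent p = 8 are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

nx, ny = 100, 100
X, Y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))

def stiffener_density(phi, width, beta=30.0):
    """Project a signed level-set field onto a 0-1 density strip of the
    given half-width, via a smoothed Heaviside (sigmoid) function."""
    return 1.0 / (1.0 + np.exp(beta * (np.abs(phi) - width)))

# Two illustrative level set functions whose zero iso-contours are the
# stiffener paths (a diagonal line and a horizontal line here).
phi1 = Y - X    # zero contour: y = x
phi2 = Y - 0.3  # zero contour: y = 0.3

# Each stiffener gets its own width in its own projection.
rho1 = stiffener_density(phi1, width=0.03)
rho2 = stiffener_density(phi2, width=0.05)

# Combine the single-stiffener densities into one grid with a p-norm,
# a smooth, differentiable approximation of the pointwise max().
p = 8.0
rho_grid = np.minimum((rho1**p + rho2**p)**(1.0 / p), 1.0)
print(rho_grid.shape, rho_grid.max())
```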
Abstract: Decision-theoretic interval estimation requires the use of loss functions that, typically, take into account the size and the coverage of the sets. We here consider the class of monotone loss functions that, under quite general conditions, guarantee Bayesian optimality of highest posterior probability sets. We focus on three specific families of monotone losses, namely the linear, the exponential and the rational losses, which differ in the way the sizes of the sets are penalized. Within the standard yet important set-up of a normal model we propose: 1) an optimality analysis, to compare the solutions yielded by the alternative classes of losses; 2) a regret analysis, to evaluate the additional loss of standard non-optimal intervals of fixed credibility. The article uses an application to a clinical trial as an illustrative example.
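As a rough illustration of the regret analysis under the linear loss, the sketch below finds the Bayes-optimal half-width of a symmetric (hence HPD) interval for a normal posterior and compares it against the conventional 95% interval. The loss constant k and the posterior parameters are illustrative choices, not values from the article.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

mu, sigma = 0.0, 1.0  # posterior theta | x ~ N(mu, sigma^2)
k = 0.1               # per-unit penalty on interval length (linear loss)

def expected_linear_loss(h):
    """Posterior expected loss k*length + P(theta outside) of the
    symmetric HPD interval [mu - h, mu + h]."""
    coverage = 2.0 * norm.cdf(h / sigma) - 1.0
    return k * 2.0 * h + (1.0 - coverage)

# Bayes-optimal half-width under the linear loss.
opt = minimize_scalar(expected_linear_loss, bounds=(0.0, 10.0 * sigma),
                      method="bounded")
h_star = opt.x

# Regret of the conventional fixed-credibility 95% interval.
h_95 = norm.ppf(0.975) * sigma
regret = expected_linear_loss(h_95) - expected_linear_loss(h_star)
print(f"optimal half-width {h_star:.3f}, 95% half-width {h_95:.3f}, "
      f"regret {regret:.4f}")
```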
Funding: Projects 61671462, 61471383, 61671463 and 61304103 supported by the National Natural Science Foundation of China; Project ZR2012FQ004 supported by the Natural Science Foundation of Shandong Province, China.
Abstract: As a typical implementation of the probability hypothesis density (PHD) filter, the sequential Monte Carlo PHD (SMC-PHD) filter is widely employed in highly nonlinear systems. However, the particle impoverishment problem introduced by the resampling step, together with the high computational burden, may lead to performance degradation and restrict the use of the SMC-PHD filter in practical applications. In this work, a novel SMC-PHD filter based on particle compensation is proposed to solve the above problems. First, based on a comprehensive analysis of the particle impoverishment problem, a new particle generating mechanism is developed to compensate the particles. Then, all the particles are integrated into the SMC-PHD filter framework. Simulation results demonstrate that, in comparison with the SMC-PHD filter, the proposed PC-SMC-PHD filter overcomes the particle impoverishment problem and improves the processing rate for a given tracking accuracy in different scenarios.
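The sketch below illustrates the impoverishment-and-compensation idea in one dimension: systematic resampling of a sharply peaked weight distribution collapses the particle set onto a few distinct states, after which fresh particles are merged back in. The Gaussian-jitter compensation used here is only a stand-in for the paper's particle generating mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def systematic_resample(particles, weights):
    """Standard systematic resampling: one stratified draw per particle."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]

def compensate(particles, n_new, spread=0.5):
    """Generate n_new compensation particles by jittering randomly chosen
    survivors, restoring diversity to the collapsed particle cloud."""
    base = rng.choice(particles, size=n_new)
    return np.concatenate([particles,
                           base + spread * rng.standard_normal(n_new)])

# Sharply peaked weights concentrate the resampled set on few states.
particles = rng.normal(0.0, 2.0, size=200)
weights = np.exp(-0.5 * (particles - 1.0) ** 2 / 0.01)
weights /= weights.sum()

survivors = systematic_resample(particles, weights)
print("distinct states after resampling:", np.unique(survivors).size)

refreshed = compensate(survivors, n_new=50)
print("distinct states after compensation:", np.unique(refreshed).size)
```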
Funding: co-supported by the National Natural Science Foundation of China (No. 61171127), the NSF of China (No. 60972024), and the NSTMP of China (Nos. 2011ZX03003-001-02 and 2012ZX03001007-003).
Abstract: It is understood that the forward-backward probability hypothesis density (PHD) smoothing algorithms proposed recently can significantly improve the state estimation of targets. However, our analyses in this paper show that they cannot give a good cardinality (i.e., number of targets) estimate. This is because backward smoothing ignores the effect of temporary track dropping caused by forward filtering and/or anomalous smoothing resulting from target deaths. To cope with this problem, a novel PHD smoothing algorithm, called the variable-lag PHD smoother, is developed here; before backward smoothing, it adds a detection step that identifies whether the filtered cardinality varies within the smoothing lag. The analytical results show that the proposed smoother can almost eliminate the influence of temporary track dropping and anomalous smoothing, while significantly improving both the cardinality and state estimates. Simulation results on two multi-target tracking scenarios verify the effectiveness of the proposed smoother.
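A minimal sketch of the detection step is given below: backward smoothing is restricted to the most recent stretch of scans over which the (rounded) filtered cardinality stays constant, so that the smoother never spans a target birth/death or a temporary track drop. The function name and the rounding rule are illustrative assumptions, not the paper's exact formulation.

```python
from typing import List

def effective_lag(filtered_cardinalities: List[float], max_lag: int) -> int:
    """Return the largest lag <= max_lag over which the rounded filtered
    cardinality estimate does not change; backward smoothing is then
    applied only over this stable stretch of scans."""
    est = [round(n) for n in filtered_cardinalities]
    current = est[-1]
    lag = 0
    for n in reversed(est[:-1]):  # walk backwards from the latest scan
        if n != current or lag == max_lag:
            break
        lag += 1
    return lag

# Example: the cardinality estimate briefly drops to 2 (a temporary
# track drop), so the usable lag is cut to 2 instead of the full 5.
history = [3.1, 2.9, 2.0, 3.0, 3.0, 2.9]
print(effective_lag(history, max_lag=5))  # -> 2
```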