Funding: Research supported in part by the National Natural Science Foundation of China and by a grant from the Ministry of Education of China.
Abstract: Yamamuro in [1] defines strong and weak transience of Markov processes, gives a criterion for strong transience of Feller processes, and further discusses strong and weak transience of Ornstein-Uhlenbeck type processes. In this article, the authors weaken the Feller property required in [1] to the weak Feller property and discuss the strong transience of operator-self-similar Markov processes.
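For readers unfamiliar with the terminology, the following is a sketch of the standard transience dichotomy as it is usually stated for Markov processes; Yamamuro's exact formulation in [1] may differ in detail, so treat this as an illustrative convention rather than the paper's definition. A Markov process $(X_t)$ started at $x$ is transient if, for every compact set $K$,

```latex
\int_0^\infty P_x(X_t \in K)\,dt < \infty,
```

and a transient process is called strongly transient if the stronger moment condition

```latex
\int_0^\infty t\,P_x(X_t \in K)\,dt < \infty
```

holds for every compact $K$; it is weakly transient if it is transient but not strongly transient. Intuitively, strong transience requires the occupation time of compact sets to decay fast enough that its first moment in $t$ is finite.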
Abstract: Conjugate gradient methods are an important class of methods for unconstrained optimization, especially when the dimension is large. In 2001, Dai and Liao proposed a new conjugacy condition, based on which two nonlinear conjugate gradient methods were constructed. Using a trust-region idea, this paper gives a self-adaptive technique for the two methods. The numerical results show that this technique works well on the given nonlinear optimization test problems.
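To make the setting concrete, the sketch below implements a basic nonlinear conjugate gradient method with the Dai-Liao update formula, beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), paired with a simple Armijo backtracking line search. The fixed parameter t, the safeguards, and the line-search constants are illustrative choices; the paper's self-adaptive trust-region technique for tuning the method is not reproduced here.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Dai-Liao (2001) direction update and a
    backtracking (Armijo) line search. The parameter t is held fixed,
    which is a simplification of the self-adaptive scheme discussed
    in the abstract above."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:
            beta = 0.0                      # restart with steepest descent
        else:
            beta = g_new @ (y - t * s) / denom   # Dai-Liao formula
        d = -g_new + beta * d
        if g_new @ d >= 0:                  # safeguard: enforce descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Toy example: minimize the convex quadratic f(x) = x1^2 + 10*x2^2
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = dai_liao_cg(f, grad, [3.0, -2.0])
```

On this quadratic the iterates converge to the minimizer at the origin; the Dai-Liao conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k reduces to the classical Hestenes-Stiefel condition when t = 0.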