Funding: National Natural Science Foundation of China (Nos. 41561082, 41161061).
Abstract: Fast convergence without dependence on the initial value is the key to solving large-angle relative orientation. Therefore, a hybrid conjugate gradient algorithm is proposed in this paper. The concrete process is: ① a stochastic hill climbing (SHC) algorithm applies a random disturbance to the given initial values of the relative orientation elements, generating new values that guarantee the optimization direction; ② in the local optimization, a superlinearly convergent conjugate gradient method replaces the steepest descent method in relative orientation to improve the convergence rate; ③ the global convergence condition is that the calculation error is less than the prescribed limit error. The comparison experiment shows that the proposed method is independent of the initial value, and achieves higher accuracy with fewer iterations.
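The following is a minimal sketch of the hybrid scheme outlined in steps ①–③, assuming a user-supplied scalar residual function for the relative orientation elements; the function and parameter names are illustrative placeholders, not the authors' implementation, and SciPy's general-purpose CG routine stands in for the superlinearly convergent conjugate gradient method described in the abstract.

```python
# Hypothetical sketch: SHC perturbation + conjugate-gradient local search,
# repeated until the orientation error drops below a prescribed limit.
import numpy as np
from scipy.optimize import minimize

def hybrid_relative_orientation(residual, x0, tol=1e-6, step=0.1,
                                max_rounds=100, seed=None):
    """residual: callable returning the scalar orientation error for a
    parameter vector (e.g. the five relative orientation elements);
    x0: initial guess, which the method should not depend on."""
    rng = np.random.default_rng(seed)
    best_x = np.asarray(x0, dtype=float)
    best_f = residual(best_x)
    for _ in range(max_rounds):
        # (1) Stochastic hill climbing: random disturbance of the current best.
        trial = best_x + rng.normal(scale=step, size=best_x.shape)
        # (2) Local optimization with a conjugate gradient method.
        res = minimize(residual, trial, method="CG")
        if res.fun < best_f:          # keep only improving moves
            best_x, best_f = res.x, res.fun
        # (3) Global stop: calculation error below the prescribed limit.
        if best_f < tol:
            break
    return best_x, best_f
```

In this sketch the random disturbance plays the role of escaping the basin of a poor initial value, while the conjugate gradient step supplies the superlinear local convergence the abstract attributes to the replacement of steepest descent.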