Funding: CNPq, Brazil (301035/93-8); University of Macao (RG010/99-00S/JXQ/FST)
Abstract: The generalized least squares (LS) problem min_x (Ax - b)^T W^{-1} (Ax - b) appears in many application areas. Here W is an m × m symmetric positive definite matrix and A is an m × n matrix with m ≥ n. Since the problem has many solutions in the rank-deficient case, special preconditioning techniques are adopted to obtain the minimum 2-norm solution. A block SOR method and the preconditioned conjugate gradient (PCG) method are proposed here. Convergence and the optimal relaxation parameter for the block SOR method are studied. An error bound for the PCG method is given. A comparison of these methods is presented. Some remarks on the implementation of the methods and on the operation cost are given as well.
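For orientation only, the sketch below (Python with NumPy/SciPy) illustrates the problem being described: in the full-column-rank case, the generalized LS problem is equivalent to the weighted normal equations A^T W^{-1} A x = A^T W^{-1} b, which can be solved with an off-the-shelf conjugate gradient routine. This is not the block SOR or PCG scheme proposed by the authors; the matrix sizes, random data, and full-rank assumption are purely illustrative.

```python
# Illustrative sketch (not the authors' block SOR / PCG methods):
# solve min_x (Ax - b)^T W^{-1} (Ax - b) via the weighted normal equations
# A^T W^{-1} A x = A^T W^{-1} b, assuming A has full column rank.
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
m, n = 20, 5                              # illustrative sizes, m >= n
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# W must be m x m symmetric positive definite; build one from a random factor.
F = rng.standard_normal((m, m))
W = F @ F.T + m * np.eye(m)

Winv_A = np.linalg.solve(W, A)            # W^{-1} A
Winv_b = np.linalg.solve(W, b)            # W^{-1} b
M = A.T @ Winv_A                          # A^T W^{-1} A (n x n, SPD when A has full rank)
rhs = A.T @ Winv_b                        # A^T W^{-1} b

x, info = cg(M, rhs)                      # conjugate gradient on the normal equations
assert info == 0, "CG did not converge"
print("x =", x)
```

In the rank-deficient case addressed by the paper, A^T W^{-1} A is singular, plain CG on the normal equations is not sufficient, and additional preconditioning is needed to single out the minimum 2-norm solution.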