Volume 2012, Issue 1, Article ID 563586
Research Article
Open Access

Comparison Results on Preconditioned GAOR Methods for Weighted Linear Least Squares Problems

Guangbin Wang (Corresponding Author)
Department of Mathematics, Qingdao University of Science and Technology, Qingdao 266061, China

Yanli Du
Department of Mathematics, Qingdao University of Science and Technology, Qingdao 266061, China

Fuping Tan
Department of Mathematics, Shanghai University, Shanghai 200444, China
First published: 26 September 2012
Citations: 3
Academic Editor: Zhongxiao Jia

Abstract

We present preconditioned generalized accelerated overrelaxation (GAOR) methods for solving weighted linear least squares problems. We compare the spectral radii of the iteration matrices of the preconditioned and original methods. The comparison results show that the preconditioned GAOR methods converge faster than the GAOR method whenever the GAOR method is convergent. Finally, we give a numerical example to confirm our theoretical results.

1. Introduction

Consider the weighted linear least squares problem
()
where W is the variance-covariance matrix. The problem arises in many scientific applications; a typical source is parameter estimation in mathematical modeling.
This problem has been discussed in many books and articles. In order to solve it, one has to solve a nonsingular linear system of the form
()
where
()
is an invertible matrix with
()

Yuan [1] proposed a generalized SOR (GSOR) method to solve the linear system (1.2); afterwards, Yuan and Jin [2] established a generalized AOR (GAOR) method for the same system. In [3, 4], the authors studied the convergence of the GAOR method for solving the linear system Hy = f and gave regions of convergence: in [3] for consistently ordered coefficient matrices, and in [4] for diagonally dominant coefficient matrices.

In order to solve the linear system (1.2) using the GAOR method, we split H as
()
Then, for ω ≠ 0, the GAOR method is defined by
()
where
()
is the iteration matrix and
()
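Whatever the concrete form of the GAOR operator, the scheme above is a stationary fixed-point iteration, and such an iteration converges for every starting vector exactly when the spectral radius of its iteration matrix is less than 1; this is why the comparisons below are stated in terms of spectral radii. A minimal self-contained sketch in Python/NumPy (the matrix M and vector g are arbitrary illustrations, not the paper's operators):

```python
import numpy as np

def spectral_radius(M):
    """Largest eigenvalue modulus of M."""
    return np.max(np.abs(np.linalg.eigvals(M)))

def stationary_iteration(M, g, y0, tol=1e-12, max_iter=10_000):
    """Iterate y <- M y + g until successive iterates agree to tol."""
    y = y0
    for _ in range(max_iter):
        y_new = M @ y + g
        if np.linalg.norm(y_new - y, np.inf) < tol:
            return y_new
        y = y_new
    return y

# Arbitrary example with rho(M) < 1, so the iteration converges.
M = np.array([[0.2, 0.1],
              [0.3, 0.4]])
g = np.array([1.0, 2.0])

assert spectral_radius(M) < 1        # the convergence condition
y = stationary_iteration(M, g, np.zeros(2))
# The limit is the fixed point, i.e., it satisfies (I - M) y = g.
residual = np.linalg.norm((np.eye(2) - M) @ y - g)
```

Shrinking the spectral radius of the iteration matrix, as the preconditioners below aim to do, directly improves the asymptotic convergence rate of such an iteration.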
In order to decrease the spectral radius of Lr,ω, an effective approach is to precondition the linear system (1.2), namely,
()
then the preconditioned GAOR method can be defined by
()
where
()

In [5], the authors presented three kinds of preconditioners for the preconditioned modified accelerated overrelaxation (MAOR) method for solving systems of linear equations and showed that the convergence rate of the preconditioned method is better than that of the original method whenever the original method is convergent.

This paper is organized as follows. In Section 2, we give an important definition and known results as preliminaries. In Section 3, we propose two preconditioners and give comparison theorems between the preconditioned and original methods; these results show that the preconditioned GAOR methods converge faster than the GAOR method whenever the GAOR method is convergent. In Section 4, we give a numerical example to confirm our theoretical results.

2. Preliminaries

We need the following definition and results.

Definition 2.1. Let A = (aij) ∈ Rn×n and B = (bij) ∈ Rn×n. We say A ≥ B if aij ≥ bij for all i, j = 1,2, …, n.

This definition can be carried over to vectors by identifying them with n × 1 matrices.
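The entrywise order of Definition 2.1 is exactly what elementwise comparison computes in NumPy; a brief illustration (the matrices below are arbitrary examples):

```python
import numpy as np

# Definition 2.1: A >= B means a_ij >= b_ij for every entry.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

A_ge_B = bool(np.all(A >= B))   # True: every entry of A dominates B
B_ge_A = bool(np.all(B >= A))   # False: e.g., b_11 = 1 < 2 = a_11
```

Note that this is only a partial order: for two generic matrices, neither A ≥ B nor B ≥ A need hold.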

In this paper, ρ(·) denotes the spectral radius of a matrix.

Lemma 2.2 (see [6]). Let A ∈ Rn×n be nonnegative and irreducible. Then

  • (i)

    A  has a positive real eigenvalue equal to its spectral radius ρ(A);

  • (ii)

    for ρ(A), there corresponds an eigenvector x > 0.

Lemma 2.3 (see [7]). Let A ∈ Rn×n be nonnegative and irreducible. If

()
for some nonnegative vector x, then α < ρ(A) < β and x is a positive vector.
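Lemmas 2.2 and 2.3 are standard Perron-Frobenius facts, and they can be checked numerically on a small example. The matrix A below is an arbitrary nonnegative irreducible example (not from the paper), and the bounds α and β are built from the componentwise ratios of Ax to x:

```python
import numpy as np

# An arbitrary nonnegative, irreducible (here even symmetric) matrix.
A = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(np.abs(eigvals))
rho = np.abs(eigvals[k])

# Lemma 2.2: rho(A) is itself an eigenvalue of A ...
assert np.isclose(eigvals[k].real, rho)
# ... with a positive eigenvector (normalize so the dominant entry is +1).
x = eigvecs[:, k].real
x = x / x[np.argmax(np.abs(x))]
assert np.all(x > 0)

# Lemma 2.3: if alpha*x <= A x <= beta*x with strict inequality in some
# component, then alpha < rho(A) < beta.  Take x = (1,1,1)^T:
v = np.ones(3)
Av = A @ v                       # componentwise ratios (A v)_i / v_i
alpha, beta = Av.min() - 0.1, Av.max() + 0.1
assert alpha < rho < beta
```

For this A the eigenvalues are -2 and 1 ± √3, so ρ(A) = 1 + √3 ≈ 2.73, which indeed lies strictly between the chosen bounds α = 1.9 and β = 3.1.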

3. Comparison Results

We consider the preconditioned linear system
()
where and with
()
S is a p × p matrix with 1 < p < n.
We take S as follows:
()
Now, we obtain two preconditioned linear systems with coefficient matrices
()
where
()
We split (i = 1,2) as
()
then the preconditioned GAOR methods for solving (3.1) are defined as follows:
()
where
()
are iteration matrices and
()

Now, we give comparison results between the preconditioned GAOR methods defined by (3.7) and the corresponding GAOR method defined by (1.6).

Theorem 3.1. Let be the iteration matrices associated with the GAOR and preconditioned GAOR methods, respectively. If the matrix H in (1.2) is irreducible with C ≤ 0, U ≤ 0, B1 ≥ 0, B2 ≥ 0, 0 < ω ≤ 1, 0 ≤ r < 1, bi,i+1 > 0, bi+1,i > 0 for some i ∈ {2, …, p}, when 0 ≤ bii < 1  (i ∈ {2, …, p}),

()
or when bii ≥ 1, αi > 0, βi > 0  (i ∈ {2, …, p}), then either
()
or
()

Proof. By direct computation, we have

()

Since 0 < ω ≤ 1, 0 ≤ r < 1, C ≤ 0, U ≤ 0, B1 ≥ 0, B2 ≥ 0, then

()
and Lr,ω is nonnegative. Since H is irreducible, it follows from (3.13) that Lr,ω is irreducible as well.

Similarly, one can prove that the preconditioned iteration matrix is nonnegative and irreducible.

By Lemma 2.2, there is a positive vector x such that

()
where λ = ρ(Lr,ω). Since the matrix H is nonsingular, λ ≠ 1; hence either λ < 1 or λ > 1.

Now, from (3.15) and by the definitions of Lr,ω and , we have

()

Since bi,i+1 > 0, bi+1,i > 0 for some i ∈ {2, …, p}, when 0 ≤ bii < 1 (i ∈ {2, …, p}),

()
or when bii ≥ 1, αi > 0, βi > 0 (i ∈ {2, …, p}), then S1 ≥ 0 and S1 ≠ 0. So we have , .

If λ < 1, then , .

By Lemma 2.3, the inequality (3.11) is proved.

If λ > 1, then , .

By Lemma 2.3, the inequality (3.12) is proved.

Theorem 3.2. Let be the iteration matrices associated with the GAOR and preconditioned GAOR methods, respectively. If the matrix H in (1.2) is irreducible with C ≤ 0, U ≤ 0, B1 ≥ 0, B2 ≥ 0, 0 < ω ≤ 1, 0 ≤ r < 1, bi,1 > 0, b1,i > 0 for some i ∈ {2,3, …, p}, when 0 ≤ b11 < 1, 0 < βi < 1/(1 − b11),

()
or when b11 ≥ 1, αi > 0, βi > 0, i ∈ {2,3, …, p}, then either
()
or
()

Theorem 3.2 can be proved by an argument analogous to that of Theorem 3.1.

4. Numerical Example

Now, we present an example to illustrate our theoretical results.

Example 4.1. The coefficient matrix H in (1.2) is given by

()
where , , , and with
()

Table 1 displays the spectral radii of the corresponding iteration matrices for some randomly chosen parameters r, ω, and p. The parameters αi and βi are chosen randomly subject to the conditions of the two theorems.

From Table 1, we see that these results accord with Theorems 3.1 and 3.2: when ρ < 1, the preconditioned radii satisfy ρ2 ≤ ρ1 ≤ ρ, and when ρ > 1 (the last two rows), the ordering is reversed.

Table 1. The spectral radii of the GAOR and preconditioned GAOR iteration matrices.
n ω r p ρ ρ1 ρ2
5 0.95 0.7 3 0.1450 0.1384 0.1348
10 0.9 0.85 5 0.2782 0.2726 0.2695
15 0.95 0.8 5 0.3834 0.3808 0.3796
20 0.75 0.65 10 0.6350 0.6317 0.6297
25 0.7 0.55 8 0.7872 0.7861 0.7855
30 0.65 0.55 16 0.9145 0.9136 0.9130
40 0.6 0.5 10 1.1426 1.1433 1.1436
50 0.6 0.5 10 1.3668 1.3683 1.3691
  • Here ρ = ρ(Lr,ω), and ρ1 and ρ2 denote the spectral radii of the two preconditioned GAOR iteration matrices.
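The pattern the theorems describe can be checked mechanically against the tabulated values: in every convergent row the preconditioned radii are smaller than ρ, and in every divergent row they are larger. A small script over the values transcribed from Table 1:

```python
# Rows of Table 1 as (rho, rho1, rho2), transcribed from the article.
rows = [
    (0.1450, 0.1384, 0.1348),
    (0.2782, 0.2726, 0.2695),
    (0.3834, 0.3808, 0.3796),
    (0.6350, 0.6317, 0.6297),
    (0.7872, 0.7861, 0.7855),
    (0.9145, 0.9136, 0.9130),
    (1.1426, 1.1433, 1.1436),
    (1.3668, 1.3683, 1.3691),
]

for rho, rho1, rho2 in rows:
    if rho < 1:
        # Convergent case: preconditioning shrinks the spectral radius.
        assert rho2 <= rho1 <= rho
    else:
        # Divergent case: the ordering is reversed.
        assert rho2 >= rho1 >= rho
```

This is only a consistency check on the reported numbers, not a proof; the strict inequalities themselves are the content of Theorems 3.1 and 3.2.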

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant no. 11001144) and the Science and Technology Program of Shandong Universities of China (J10LA06).
