Volume 2012, Issue 1, Article ID 782960
Research Article
Open Access

A Hybrid Gradient-Projection Algorithm for Averaged Mappings in Hilbert Spaces

Ming Tian (Corresponding Author)

College of Science, Civil Aviation University of China, Tianjin 300300, China
Min-Min Li

College of Science, Civil Aviation University of China, Tianjin 300300, China
First published: 25 July 2012
Academic Editor: Hong-Kun Xu

Abstract

It is well known that the gradient-projection algorithm (GPA) is very useful in solving constrained convex minimization problems. In this paper, we combine a general iterative method with the gradient-projection algorithm to propose a hybrid gradient-projection algorithm and prove that the sequence generated by the hybrid gradient-projection algorithm converges in norm to a minimizer of the constrained convex minimization problem which also solves a certain variational inequality.

1. Introduction

Let H be a real Hilbert space and C a nonempty closed and convex subset of H. Consider the following constrained convex minimization problem:
min_{x∈C} f(x), (1.1)
where f : C → ℝ is a convex and continuously Fréchet differentiable function. The gradient ∇f satisfies the following Lipschitz condition:
∥∇f(x) − ∇f(y)∥ ≤ L∥x − y∥, for all x, y ∈ C, (1.2)
where L > 0. Assume that the minimization problem (1.1) is consistent, and let S denote its solution set.
It is well known that the gradient-projection algorithm is very useful in dealing with constrained convex minimization problems and has been studied extensively ([1–5] and the references therein). It has recently been applied to solve split feasibility problems [6–10]. Levitin and Polyak [1] considered the following gradient-projection algorithm:
x_{n+1} := Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (1.3)
Let the step sizes {λ_n} satisfy
0 < liminf_{n→∞} λ_n ≤ limsup_{n→∞} λ_n < 2/L. (1.4)
It is proved that the sequence {xn} generated by (1.3) converges weakly to a minimizer of (1.1).
Xu proved that under certain appropriate conditions on {αn} and {λn} the sequence {xn} defined by the following relaxed gradient-projection algorithm:
x_{n+1} = (1 − α_n)x_n + α_n Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0, (1.5)
converges weakly to a minimizer of (1.1) [11].
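For concreteness, the iteration (1.3) is easy to state in code. The sketch below is purely illustrative and not taken from the paper: it minimizes the convex quadratic f(x) = (1/2)∥Ax − b∥² over the closed ball C = {x : ∥x∥ ≤ r}; the problem data, the constant step size λ_n ≡ 1/L, and the iteration count are all assumptions made for this example.

```python
import numpy as np

def project_ball(x, r=1.0):
    """Projection onto the closed ball C = {x : ||x|| <= r}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

def gradient_projection(A, b, r=1.0, n_iter=500):
    """Classical GPA (1.3) for f(x) = 0.5*||Ax - b||^2 over C = {x : ||x|| <= r}.
    The gradient A^T(Ax - b) is Lipschitz with constant L = ||A^T A||_2,
    and any constant step size in (0, 2/L) is admissible."""
    L = np.linalg.norm(A.T @ A, 2)        # Lipschitz constant of grad f
    lam = 1.0 / L                         # step size lambda_n = 1/L for all n
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = project_ball(x - lam * grad, r)   # x_{n+1} = Proj_C(x_n - lam*grad)
    return x

# usage on a small random instance (illustrative data only)
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
print(gradient_projection(A, b))
```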

Since the Lipschitz continuity of the gradient of f implies that ∇f is inverse strongly monotone (ism) [12, 13], its complement can be an averaged mapping. Recall that a mapping T is nonexpansive if and only if it is Lipschitz with Lipschitz constant not more than one, that a mapping is averaged if and only if it can be expressed as a proper convex combination of the identity mapping and a nonexpansive mapping, and that a mapping T is said to be ν-inverse strongly monotone if and only if ⟨x − y, Tx − Ty⟩ ≥ ν∥Tx − Ty∥² for all x, y ∈ H, where the number ν > 0. Recall also that the composite of finitely many averaged mappings is averaged; that is, if each of the mappings T_1, …, T_N is averaged, then so is the composite T_1 ∘ ⋯ ∘ T_N [14]. In particular, an averaged mapping is a nonexpansive mapping [15]. As a result, the GPA can be rewritten as the composite of a projection and an averaged mapping, which is again an averaged mapping.
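To make the averagedness claim concrete, here is the standard decomposition (a routine computation, not reproduced from the paper). Since the L-Lipschitz gradient of the convex function f is (1/L)-ism, we have, for 0 < λ < 2/L,

I − λ∇f = (1 − λL/2)I + (λL/2)(I − (2/L)∇f),

where I − (2/L)∇f is nonexpansive precisely because ∇f is (1/L)-ism. Hence I − λ∇f is averaged, and Proj_C(I − λ∇f), being the composite of the firmly nonexpansive (hence averaged) projection Proj_C with an averaged mapping, is again averaged.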

Generally speaking, in infinite-dimensional Hilbert spaces, GPA has only weak convergence. Xu [11] provided a modification of GPA so that strong convergence is guaranteed. He considered the following hybrid gradient-projection algorithm:
x_{n+1} = θ_n h(x_n) + (1 − θ_n)Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (1.6)
It is proved that if the sequences {θn} and {λn} satisfy appropriate conditions, the sequence {xn} generated by (1.6) converges in norm to a minimizer of (1.1) which solves the variational inequality
⟨(I − h)x*, x − x*⟩ ≥ 0, for all x ∈ S. (1.7)
On the other hand, Ming Tian [16] introduced the following general iterative algorithm for solving the variational inequality
x_{n+1} = α_n γ f(x_n) + (I − μα_n F)Tx_n, n ≥ 0, (1.8)
where T is a nonexpansive mapping, F is a κ-Lipschitzian and η-strongly monotone operator with κ > 0, η > 0, and f is a contraction with coefficient 0 < α < 1. Then he proved that, if {α_n} satisfies appropriate conditions, the sequence {x_n} generated by (1.8) converges strongly to the unique solution of the variational inequality
⟨(μF − γf)x̃, x − x̃⟩ ≥ 0, for all x ∈ Fix T. (1.9)
In this paper, motivated and inspired by the research work in this direction, we will combine the iterative method (1.8) with the gradient-projection algorithm (1.3) and consider the following hybrid gradient-projection algorithm:
x_{n+1} = θ_n γ h(x_n) + (I − θ_n μF)Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (1.10)
We will prove that if the sequence {θn} of parameters and the sequence {λn} of parameters satisfy appropriate conditions, then the sequence {xn} generated by (1.10) converges in norm to a minimizer of (1.1) which solves the variational inequality (VI)
⟨(μF − γh)x*, x − x*⟩ ≥ 0, for all x ∈ S, (1.11)
where S is the solution set of the minimization problem (1.1).
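The following is a minimal numerical sketch of iteration (1.10), offered only as an illustration and not taken from the paper: it reuses the toy quadratic-over-a-ball problem above and makes concrete, assumed choices for F, h, μ, γ, θ_n, and λ_n that are consistent with the standing hypotheses (F(x) = Bx with B symmetric positive definite, h a ρ-contraction toward a fixed anchor point).

```python
import numpy as np

def project_ball(x, r=1.0):
    """Projection onto the closed ball C = {x : ||x|| <= r}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

def hybrid_gpa(A, b, B, anchor, r=1.0, rho=0.5, n_iter=2000):
    """Sketch of iteration (1.10),
       x_{n+1} = theta_n*gamma*h(x_n) + (I - theta_n*mu*F) Proj_C(x_n - lam_n*grad f(x_n)),
    for f(x) = 0.5*||Ax - b||^2 over C = {x : ||x|| <= r}, with F(x) = Bx
    (B symmetric positive definite, hence kappa-Lipschitzian and eta-strongly
    monotone) and the rho-contraction h(x) = rho*x + (1 - rho)*anchor."""
    L = np.linalg.norm(A.T @ A, 2)            # Lipschitz constant of grad f
    eigs = np.linalg.eigvalsh(B)
    eta, kappa = eigs.min(), eigs.max()       # constants of F
    mu = eta / kappa**2                       # any value in (0, 2*eta/kappa^2)
    tau = mu * (eta - mu * kappa**2 / 2.0)
    gamma = 0.5 * tau / rho                   # any value in (0, tau/rho)
    h = lambda x: rho * x + (1.0 - rho) * anchor

    x = np.zeros(A.shape[1])
    for n in range(n_iter):
        theta = 1.0 / (n + 2)                 # theta_n -> 0, sum theta_n = infinity
        lam = 1.0 / L                         # constant step size in (0, 2/L)
        y = project_ball(x - lam * (A.T @ (A @ x - b)), r)
        x = theta * gamma * h(x) + y - theta * mu * (B @ y)   # theta*gamma*h(x) + (I - theta*mu*F)y
    return x

# usage (illustrative data only): hybrid_gpa(A, b, B=np.eye(5), anchor=np.ones(5))
```

The specific choices θ_n = 1/(n + 2) and constant λ_n are meant only to satisfy the standing assumptions; they are not prescribed by the paper.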

2. Preliminaries

This section collects some lemmas which will be used in the proofs for the main results in the next section. Some of them are known; others are not hard to derive.

Throughout this paper, we write x_n ⇀ x to indicate that the sequence {x_n} converges weakly to x, and x_n → x to indicate that {x_n} converges strongly to x. ω_w(x_n) denotes the weak ω-limit set of the sequence {x_n}.

Lemma 2.1 (see [17]). Assume that {a_n} is a sequence of nonnegative real numbers such that

a_{n+1} ≤ (1 − γ_n)a_n + γ_nδ_n + β_n, n ≥ 0, (2.1)
where {γ_n} and {β_n} are sequences in [0,1] and {δ_n} is a sequence in ℝ such that
  • (i)

Σ_{n=0}^∞ γ_n = ∞;

  • (ii)

either limsup_{n→∞} δ_n ≤ 0 or Σ_{n=0}^∞ |γ_nδ_n| < ∞;

  • (iii)

Σ_{n=0}^∞ β_n < ∞.

Then lim_{n→∞} a_n = 0.

Lemma 2.2 (see [18]). Let C be a closed and convex subset of a Hilbert space H, and let T : C → C be a nonexpansive mapping with Fix T ≠ ∅. If {x_n} is a sequence in C weakly converging to x and if {(I − T)x_n} converges strongly to y, then (I − T)x = y.

Lemma 2.3. Let H be a Hilbert space, let C be a nonempty closed and convex subset of H, let h : C → C be a contraction with coefficient 0 < ρ < 1, and let F : C → C be a κ-Lipschitzian and η-strongly monotone operator with κ, η > 0. Then, for 0 < γ < μη/ρ,

⟨(μF − γh)x − (μF − γh)y, x − y⟩ ≥ (μη − γρ)∥x − y∥², for all x, y ∈ C. (2.2)
That is, μFγh is strongly monotone with coefficient μηγρ.
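The proof is omitted in the text; for completeness, here is the short estimate behind (2.2), using only the η-strong monotonicity of F, the ρ-contraction property of h, and the Cauchy–Schwarz inequality:

⟨(μF − γh)x − (μF − γh)y, x − y⟩ = μ⟨Fx − Fy, x − y⟩ − γ⟨h(x) − h(y), x − y⟩ ≥ μη∥x − y∥² − γρ∥x − y∥² = (μη − γρ)∥x − y∥²,

and the coefficient μη − γρ is positive whenever 0 < γ < μη/ρ.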

Lemma 2.4. Let C be a closed convex subset of a real Hilbert space H, and let x ∈ H and y ∈ C. Then y = P_C x if and only if there holds the inequality

⟨x − y, z − y⟩ ≤ 0, for all z ∈ C. (2.3)
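As a quick, purely illustrative sanity check (not part of the paper), the characterization (2.3) can be observed numerically for the projection onto a closed ball:

```python
import numpy as np

rng = np.random.default_rng(1)
r, d = 1.0, 5
x = 3.0 * rng.standard_normal(d)                                  # a point of H (outside C here)
y = x if np.linalg.norm(x) <= r else r * x / np.linalg.norm(x)    # y = P_C(x), C = {z : ||z|| <= r}

# Lemma 2.4 / (2.3): <x - y, z - y> <= 0 for every z in C.
vals = []
for _ in range(10_000):
    z = rng.standard_normal(d)
    z = r * rng.uniform() ** (1.0 / d) * z / np.linalg.norm(z)    # a uniformly random point of C
    vals.append(np.dot(x - y, z - y))
print(max(vals))   # expected to be <= 0 (up to floating-point rounding)
```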

3. Main Results

Let H be a real Hilbert space, and let C be a nonempty closed and convex subset of H such that C ± C ⊂ C. Assume that the minimization problem (1.1) is consistent, and let S denote its solution set. Assume that the gradient ∇f satisfies the Lipschitz condition (1.2). Since S is a closed convex subset, the nearest point projection from H onto S is well defined. Recall also that a contraction on C is a self-mapping h of C such that ∥h(x) − h(y)∥ ≤ ρ∥x − y∥ for all x, y ∈ C, where ρ ∈ [0,1) is a constant. Let F be a κ-Lipschitzian and η-strongly monotone operator on C with κ, η > 0. Denote by Π the collection of all contractions on C, namely,
Π := {h ∣ h : C → C is a contraction}. (3.1)
Now, given h ∈ Π with coefficient 0 < ρ < 1 and s ∈ (0,1), let 0 < μ < 2η/κ² and 0 < γ < μ(η − μκ²/2)/ρ = τ/ρ. Assume that λ_s is continuous with respect to s and, in addition, that λ_s ∈ [a, b] ⊂ (0, 2/L). Consider a mapping X_s on C defined by
X_s(x) := sγh(x) + (I − sμF)Proj_C(x − λ_s∇f(x)), x ∈ C. (3.2)
It is easy to see that X_s is a contraction. Set V_s := Proj_C(I − λ_s∇f). It is obvious that V_s is a nonexpansive mapping. We can rewrite X_s(x) as
X_s(x) = sγh(x) + (I − sμF)V_s(x). (3.3)
First observe that for s ∈ (0,1), we can get
(3.4)
Indeed, we have
(3.5)
Hence, Xs has a unique fixed point, denoted xs, which uniquely solves the fixed-point equation
x_s = sγh(x_s) + (I − sμF)Proj_C(x_s − λ_s∇f(x_s)). (3.6)
The next proposition summarizes the properties of {xs}.

Proposition 3.1. Let xs be defined by (3.6).

  • (i)

{x_s} is bounded for s ∈ (0, 1/τ).

  • (ii)

    lim s→0xs − ProjC(Iλsf)(xs)∥ = 0.

  • (iii)

x_s defines a continuous curve from (0, 1/τ) into H.

Proof. (i) Take a point in S; then we have

(3.7)
It follows that
(3.8)
Hence, {xs} is bounded.

(ii) By the definition of {xs}, we have

(3.9)
Since {x_s} is bounded, so are {h(x_s)} and {F Proj_C(I − λ_s∇f)(x_s)}, and hence (ii) follows from (3.9) by letting s → 0.

(iii) Take s, s0 ∈ (0,1/τ), and we have

(3.10)
Therefore,
(3.11)
Therefore, x_s → x_{s_0} as s → s_0. This means that x_s is continuous.

Our main result in the following shows that {xs} converges in norm to a minimizer of (1.1) which solves some variational inequality.

Theorem 3.2. Assume that {xs} is defined by (3.6), then xs converges in norm as s → 0 to a minimizer of (1.1) which solves the variational inequality

(3.12)
Equivalently, we have Proj_S(I − (μF − γh))x* = x*.

Proof. We first show the uniqueness of a solution of the variational inequality (3.12). By Lemma 2.3, μF − γh is strongly monotone, so the variational inequality (3.12) has only one solution. Let x* ∈ S denote the unique solution of (3.12).

To prove that x_s → x* as s → 0, we write, for a given point in S,

(3.13)
It follows that
(3.14)
Hence,
(3.15)
From this we can derive that
(3.16)
Since {x_s} is bounded as s → 0, we see that if {s_n} is a sequence in (0,1) such that s_n → 0 and x_{s_n} ⇀ x̄, then by (3.16), x_{s_n} → x̄. We may further assume that λ_{s_n} → λ due to condition (1.4). Notice that Proj_C(I − λ∇f) is nonexpansive. It turns out that
(3.17)
From the boundedness of {x_s} and lim_{s→0} ∥Proj_C(I − λ_s∇f)x_s − x_s∥ = 0, we conclude that
(3.18)
Since x_{s_n} → x̄, by Lemma 2.2, we obtain

x̄ = Proj_C(I − λ∇f)x̄. (3.19)

This shows that x̄ ∈ S.

We next prove that x̄ is a solution of the variational inequality (3.12). Since

(3.20)
we can derive that
(3.21)
Therefore, for any point in S,
(3.22)
Since Proj_C(I − λ_s∇f) is nonexpansive, we obtain that I − Proj_C(I − λ_s∇f) is monotone, that is,
(3.23)
Taking the limit through s = s_n → 0 ensures that x̄ is a solution to (3.12). That is to say,
(3.24)
Hence x̄ = x* by uniqueness. Therefore, x_s → x* as s → 0. The variational inequality (3.12) can be written as
(3.25)
So, by Lemma 2.4, it is equivalent to the fixed-point equation
Proj_S(I − (μF − γh))x* = x*. (3.26)

Taking F = A and μ = 1 in Theorem 3.2, we get the following corollary.

Corollary 3.3. We have that {xs} converges in norm as s → 0 to a minimizer of (1.1) which solves the variational inequality

⟨(A − γh)x*, x − x*⟩ ≥ 0, for all x ∈ S. (3.27)
Equivalently, we have Proj_S(I − (A − γh))x* = x*.

Taking F = I, μ = 1, γ = 1 in Theorem 3.2, we get the following.

Corollary 3.4. Let z_s ∈ H be the unique fixed point of the contraction z ↦ sh(z) + (1 − s)Proj_C(I − λ_s∇f)(z). Then {z_s} converges in norm as s → 0 to the unique solution of the variational inequality

⟨(I − h)x*, x − x*⟩ ≥ 0, for all x ∈ S. (3.28)

Finally, we consider the following hybrid gradient-projection algorithm,
x_{n+1} = θ_n γ h(x_n) + (I − θ_n μF)Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (3.29)
Assume that the sequence {λ_n} satisfies the condition (1.4) and, in addition, that the following conditions are satisfied for {θ_n} ⊂ (0,1) and {λ_n} (an admissible example is given after the list):
  • (i)

    θn → 0;

  • (ii)

Σ_{n=0}^∞ θ_n = ∞;

  • (iii)

Σ_{n=0}^∞ |θ_{n+1} − θ_n| < ∞;

  • (iv)

Σ_{n=0}^∞ |λ_{n+1} − λ_n| < ∞.
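As an illustration (this particular choice is ours, not one prescribed in the paper), the sequences θ_n = 1/(n + 2) and λ_n = (1/L)(1 + 1/(n + 2)) satisfy all of the above: θ_n → 0 and Σ_n θ_n = ∞, the differences |θ_{n+1} − θ_n| and |λ_{n+1} − λ_n| telescope and hence are summable, and λ_n ∈ (1/L, 3/(2L)] ⊂ (0, 2/L), so condition (1.4) holds as well.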

Theorem 3.5. Assume that the minimization problem (1.1) is consistent and the gradient ∇f satisfies the Lipschitz condition (1.2). Let {xn} be generated by algorithm (3.29) with the sequences {θn} and {λn} satisfying the above conditions. Then, the sequence {xn} converges in norm to x* that is obtained in Theorem 3.2.

Proof. (1) The sequence {x_n} is bounded. Setting

(3.30)
Indeed, we have, for any point in S,
(3.31)
By induction,
(3.32)
In particular, {x_n} is bounded.

(2) We prove that ∥x_{n+1} − x_n∥ → 0 as n → ∞. Let M be a constant such that

(3.33)
We compute
(3.34)
(3.35)
Combining (3.34) and (3.35), we can obtain
(3.36)
Apply Lemma 2.1 to (3.36) to conclude that ∥x_{n+1} − x_n∥ → 0 as n → ∞.

(3) We prove that ω_w(x_n) ⊂ S. Let x̂ ∈ ω_w(x_n), and assume that x_{n_j} ⇀ x̂ for some subsequence {x_{n_j}} of {x_n}. We may further assume that λ_{n_j} → λ due to condition (1.4). Set V := Proj_C(I − λ∇f). Notice that V is nonexpansive and Fix V = S. It turns out that

(3.37)
So Lemma 2.2 guarantees that ω_w(x_n) ⊂ Fix V = S.

(4) We prove that x_n → x* as n → ∞, where x* is the unique solution of the VI (3.12). First observe that there is some point in ω_w(x_n) ⊂ S such that

(3.38)

We now compute

(3.39)
Applying Lemma 2.1 to the inequality (3.39), together with (3.38), we get ∥x_n − x*∥ → 0 as n → ∞.

Corollary 3.6 (see [11]). Let {x_n} be generated by the following algorithm:

x_{n+1} = θ_n h(x_n) + (1 − θ_n)Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (3.40)
Assume that the sequence {λ_n} satisfies the conditions (1.4) and (iv) and that {θ_n} ⊂ [0,1] satisfies the conditions (i)–(iii). Then {x_n} converges in norm to the point x* obtained in Corollary 3.4.

Corollary 3.7. Let {xn} be generated by the following algorithm:

x_{n+1} = θ_n γ h(x_n) + (I − θ_n A)Proj_C(x_n − λ_n∇f(x_n)), n ≥ 0. (3.41)
Assume that the sequences {θ_n} and {λ_n} satisfy the conditions stated before Theorem 3.5; then {x_n} converges in norm to the point x* obtained in Corollary 3.3.

Acknowledgments

Ming Tian is supported in part by the Fundamental Research Funds for the Central Universities (the Special Fund of Science in Civil Aviation University of China, No. ZXH2012K001) and by the Science Research Foundation of Civil Aviation University of China (No. 2012KYM03).
