Convergence Rates in the Law of Large Numbers for Arrays of Banach Valued Martingale Differences
Abstract
We study the convergence rates in the law of large numbers for arrays of Banach valued martingale differences. Under a simple moment condition, we give sufficient conditions for the complete convergence of arrays of Banach valued martingale differences; we also give a criterion for the convergence of such arrays. In the special case where the array of Banach valued martingale differences is a sequence of independent and identically distributed real valued random variables, our result contains the theorems of Hsu-Robbins-Erdös (1947, 1949, and 1950), Spitzer (1956), and Baum and Katz (1965). In the real valued single martingale case, it generalizes the results of Alsmeyer (1990). The consideration of Banach valued martingale arrays (rather than a single Banach valued martingale) makes the results well suited to the study of weighted sums of identically distributed Banach valued random variables, for which we prove new theorems on the rates of convergence in the law of large numbers. The results are established in a more general setting, for sums of infinitely many Banach valued martingale differences. The obtained results improve and extend those of Ghosal and Chandra (1998).
1. Introduction
The consideration of a Banach valued martingale array (rather than a single Banach valued martingale) makes our results well suited to the study of weighted sums of identically distributed Banach valued random variables. Many authors have contributed to this subject. Gut [20] and Lanzinger and Stadtmüller [21] considered weighted sums of i.i.d. random variables. Li et al. [9] and Wang et al. [22] studied weighted sums of independent random variables. Yu [23] considered weighted sums of martingale differences (see also the references therein). Ghosal and Chandra [19] considered weighted sums of arrays of martingale differences. As applications of our main results, we generalize or improve some of their results. For example, we prove a new theorem about the convergence rate for weighted sums of identically distributed Banach valued martingale differences.
We also mention that Baum-Katz type theorems in various dependent setups have been studied by many authors. For example, Li et al. [24] studied moving average processes; Shao [25, 26] and Szewczak [27] considered mixing conditions; Baek and Park [28] studied negatively dependent random variables; Liang [29], Liang and Su [30], Kuczmaszewska [31], Kruglov [32], and Ko [33] studied negatively associated random variables.
The rest of the paper is organized as follows. In Section 2, we establish maximal inequalities for Banach valued martingales. In Section 3, we present our main results on the convergence rates for Banach valued martingale arrays, which improve and complete Theorem 2 of Ghosal and Chandra [19]. In Section 4, we consider the important special case of triangular Banach valued martingale arrays and obtain an extension of Theorems 1 and 2 of Alsmeyer [18]. We also generalize a result of Chow and Teicher (cf. [34, page 393]) about the complete convergence of sums of independent real valued random variables. In Section 5, we study the convergence rates for the maxima of sequences of arbitrary Banach valued random variables, in order to obtain further equivalent conditions on the convergence rates for Banach valued martingales in the following section. In Section 6, we consider the convergence rates for Banach valued martingales. Our results extend Theorems 1–4 of Baum and Katz [6] for i.i.d. real valued random variables and generalize Theorems 1 and 2 of Alsmeyer [18]. As applications, in Section 7, we obtain new results on the convergence rates for weighted sums of Banach valued martingale differences, which extend Theorems 2 and 3 of Lanzinger and Stadtmüller [21] on weighted sums of the form . In Section 8, we consider more general weighted sums of Banach valued martingale differences, for which we extend Theorem 3.3 of Baxter et al. [35], Corollary 1 of Ghosal and Chandra [19], and Theorems 2.2–2.4 of Li et al. [9], and generalize Theorem 2 of Yu [23].
For notation, as usual, we write ℕ* = {1, 2, …}, ℕ = {0} ∪ ℕ*, and ℝ = (−∞, ∞).
2. Maximal Inequalities for Banach Valued Martingales
In this section, we show new maximal inequalities for Banach valued martingales.
In the following, we consider relations among , , , and ℙ{∥Sn∥ > ε}.
Our first theorem describes relations between and for an adapted sequence of 𝔹-valued random variables .
Theorem 1. Let be an adapted sequence of 𝔹-valued random variables. Then, for any ε, γ > 0, and q ≥ 1,
Our second theorem shows relations between and for a sequence of 𝔹-valued martingale differences : that is, for each (integer) 1 ≤ j ≤ n, Xj is ℱj measurable and belongs to , and 𝔼[Xj∣ℱj−1] = 0 a.s.
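Recall that a separable Banach space 𝔹 is said to be γ-smooth (1 < γ ≤ 2) if its modulus of smoothness is of order O(τ^γ). By a classical result of Pisier, in such a space there is a constant Cγ > 0 such that every finite sequence of 𝔹-valued martingale differences (Xj)1≤j≤n satisfies
𝔼∥X1 + ⋯ + Xn∥^γ ≤ Cγ(𝔼∥X1∥^γ + ⋯ + 𝔼∥Xn∥^γ).
In particular, the real line and every Hilbert space are 2-smooth, and the L^p spaces are min(p, 2)-smooth (1 < p < ∞). This is the property of 𝔹 assumed in Theorem 2 below and in the sequel.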
Theorem 2. Let be a finite sequence of 𝔹-valued martingale differences. For any ε > 0, γ ∈ (1,2], q ≥ 1, and L ∈ ℕ, if 𝔹 is γ-smooth, then
Corollary 3. Let {(Xj, ℱj)} j≥1 be a sequence of 𝔹-valued martingale differences. Suppose that, for some γ ∈ (1,2],
We get Theorems 1 and 2 by a refinement of the method of Alsmeyer [18].
Proof of Theorem 1. The first inequality is obvious. We only consider the second one. Clearly,
Proof of Theorem 2. The first inequality is obvious, because if max 1≤j≤n∥Sj∥ ≤ ε, then
(a) We first prove that
Let M be the largest j ∈ [1, m] such that T(j) ≤ m. Then, T(M) ≤ m.
We will prove that M ≥ L + 1. Suppose that M ≤ L. Then, by the definition of M, T(M + 1) > m so that
Therefore, T(j) < ∞ for all j ∈ [1, L + 1]. Thus, (23) holds.
(b) We next give an estimate of . For 0 ≤ r ≤ n,
(c) We finally give an upper bound for the term on the right-hand side of (24), using (32). Set
A simple calculation shows that
3. Convergence Rates for Arrays of Banach Valued Martingale Differences
In this section, we consider the convergence rates in the law of large numbers for arrays of Banach valued martingale differences.
Theorem 4. Assume that for some γ ∈ (1,2], as n → ∞,
Proof. Notice that, by Corollary 3, the condition (50) implies the a.s. convergence of Sn,∞. Equation (51) comes from (50) as
We are interested in the convergence rates of the probabilities ℙ{sup j≥1∥Snj∥ > ε} and ℙ{∥Sn,∞∥ > ε}. We will describe their rates of convergence by comparing them with an auxiliary function ϕ(n) and by considering the convergence of the related series.
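When ϕ ≡ 1, such series conditions express complete convergence in the sense of Hsu and Robbins: recall that a sequence (Yn) of random variables is said to converge completely to 0 if
Σ_{n≥1} ℙ{∥Yn∥ > ε} < ∞ for every ε > 0,
and that, by the Borel–Cantelli lemma, complete convergence implies Yn → 0 a.s.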
We begin with some relations among , ℙ{sup j≥1∥Xnj∥ > ε}, ℙ{sup j≥1∥Snj∥ > ε}, and ℙ{∥Sn,∞∥ > ε}.
Lemma 5. Let ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2], q ≥ 1 and some integer L ≥ 0,
Lemma 6. Let ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2] and q ∈ [1, ∞),
Proof. The conclusion comes directly from Theorem 1.
Lemma 7. Let ϕ : ℕ → [0, ∞) be a positive function. Suppose that (58) holds for some γ ∈ (1,2] and q ∈ [1, ∞). Then,
Proof. The equivalence is an immediate consequence of Corollary 3.
Theorem 8. Let ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2], q ∈ [1, ∞] and λ ∈ (0, q),
Remark 9. The condition (62) holds if for some γ ∈ (1,2], r ∈ ℝ and ε1 > 0,
Proof of Theorem 8. Notice that when (62) holds for q = ∞ and some λ ∈ (0, ∞), then for q ∈ (λ, ∞),
Theorem 10. Let ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2], q ∈ [1, ∞] and λ ∈ (0, q),
Notice that, by (69), 𝔼mn(γ) < ∞ for each n ≥ 1 with ϕ(n) > 0, so that Sn,∞ is well defined (cf. Corollary 3). When ϕ(n) = 0, we use the convention that the associated term containing ϕ(n) as a factor is defined by 0.
When q = ∞, γ = 2, and {(Xnj, ℱnj)} j≥1 is a sequence of real-valued martingale differences, the implication “(70)⇒(72)” reduces to Theorem 2 of Ghosal and Chandra [19]. (Although the condition does not appear in Theorem 2 of [19], it is implicitly used in its proof.) So, our result improves and completes that of Ghosal and Chandra [19] in the sense that we prove the equivalence between (70) and (72) (not just the implication “(70)⇒(72)”) under much weaker conditions.
Remark 11. Theorem 10 also holds if mn(γ) is replaced by for some M ≥ 1. In fact, the case M ≥ 2 can be reduced to the case M = 1 by considering the subsequences {(Xn,lM+i, ℱn,lM+i)} l≥0 (1 ≤ i ≤ M) of {(Xnk, ℱnk)} k≥1, which are still sequences of 𝔹-valued martingale differences.
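To illustrate this reduction with a worked identity (in the notation above, and assuming, as in Corollary 3, that Sn,∞ denotes the a.s. limit of the partial sums Snj), one may write
Sn,∞ = Σ_{1≤i≤M} Σ_{l≥0} Xn,lM+i,
where each inner series is a series of martingale differences with respect to the filtration (ℱn,lM+i) l≥0; indeed, by the tower property, 𝔼[Xn,lM+i ∣ ℱn,(l−1)M+i] = 𝔼[𝔼[Xn,lM+i ∣ ℱn,lM+i−1] ∣ ℱn,(l−1)M+i] = 0 a.s. The case M = 1 can then be applied to each of the M subsequences, and the conclusion for Sn,∞ follows by the triangle inequality.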
Corollary 12. Suppose that (67) holds for some γ ∈ (1,2], r ∈ ℝ, and ε1 > 0. Then one has the implications (70)⇔(71)⇔(72)⇒(73).
Proof of Theorem 10. As in the proof of Theorem 8, we can assume that q < ∞. Since as L → ∞, q(1 + L)/(q + L) → q > λ, we can choose an integer L ≥ 0 large enough such that q(1 + L)/(q + L) > λ. Let n0 be large enough such that for all n ≥ n0. Then,
4. Convergence Rates for Triangular Arrays of Banach Valued Martingale Differences
In this section, we consider the convergence rates in the law of large numbers for triangular arrays of Banach valued martingale differences.
Theorem 13. Let α ∈ ℝ. Assume that for some γ ∈ (1,2], as n → ∞,
Proof. It suffices to apply Theorem 4 for the array of 𝔹-valued martingale differences {(Ynj, 𝒢nj), j ≥ 1, n ≥ 1} defined by
We are interested in the convergence rates of the probabilities ℙ{max 1≤j≤n∥Snj∥ > εn^α} and ℙ{∥Snn∥ > εn^α}. We will describe their rates of convergence by comparing them with an auxiliary function ϕ(n) and by considering the convergence of the related series.
We begin with some relations among , ℙ{max 1≤j≤n∥Xnj∥ > εn^α}, ℙ{max 1≤j≤n∥Snj∥ > εn^α}, and ℙ{∥Snn∥ > εn^α}.
Theorem 14. Let α ∈ ℝ and ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2], q ∈ [1, ∞] and λ ∈ (0, q),
Corollary 15. Let 1/2 < α ≤ 1 and b ≥ 0. Let l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^b l(n). Suppose that for some γ ∈ (1/α, 2], q ∈ [1, ∞] with q > b/(γα − 1),
Remark 16. It is obvious that (86) holds with q = ∞ if for some constant K > 0, all n ≥ 1 and j ≥ 1,
Proof of Theorem 14. It suffices to apply Theorem 8 for the array of 𝔹-valued martingale differences {(Ynj, 𝒢nj), j ≥ 1, n ≥ 1} defined by (80).
Proof of Corollary 15. Since γ > 1/α, we have
Theorem 17. Let α ∈ ℝ and ϕ : ℕ → [0, ∞) be a positive function. Suppose that for some γ ∈ (1,2], q ∈ [1, ∞] and λ ∈ (0, q),
Corollary 18. Let α > 1/2 and b ≥ 0. Let l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^(b−1) l(n). Suppose that (86) holds for some γ ∈ (max {1/α, 1}, 2] and q ∈ [1, ∞] with q > b/(γα − 1). If 𝔹 is γ-smooth, then one has the implications (91)⇔(92)⇔(93)⇒(94).
For a single real-valued martingale, when l(n) = 1 and α ≤ 1, Corollary 18 reduces to Alsmeyer's result in [18]. We notice that the consideration of a triangular array makes the result well suited to the study of weighted sums of identically distributed 𝔹-valued random variables of the form .
Proof of Theorem 17. It suffices to apply Theorem 10 for the array of 𝔹-valued martingale differences {(Ynj, 𝒢nj), j ≥ 1, n ≥ 1} defined by (80).
Proof of Corollary 18. Notice that
Then,
As a special case, we obtain the following extension of a result of Chow and Teicher [34, page 393] about the complete convergence of sums of independent random variables.
Corollary 20. Let {(Xnj, ℱnj)} 1≤j≤n (n ≥ 1) be sequences of identically distributed 𝔹-valued martingale differences. Let p ∈ [1,4). Suppose that (86) holds for some γ ∈ (max {p/2, 1}, 2] and q ∈ [1, ∞] with q > p/(2γ − p). If 𝔹 is γ-smooth, then if and only if
When {Xnj} are rowwise independent real-valued martingale differences, the sufficiency in Corollary 20 was proved in [34, page 393].
5. Convergence Rate for the Maxima of Arbitrary Banach Valued Random Variables
In this section, we study the convergence rate for the maxima of an arbitrary sequence of Banach valued random variables, in order to obtain further equivalent conditions on the convergence rate for a Banach valued martingale in Section 6.
We are interested in the convergence rates of and . Notice that for any ε > 0 if and only if a.s. So, our results in this section describe the rate of convergence for the almost sure convergence of .
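Recall the standard equivalence behind this observation: for a sequence (Yn) of 𝔹-valued random variables,
Yn → 0 a.s. if and only if, for every ε > 0, ℙ{sup j≥n∥Yj∥ > ε} → 0 as n → ∞.
The series conditions below quantify the speed of this convergence.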
The following result shows that and ℙ{M(ε) > εnα} have similar asymptotic properties. More precise comparisons will be given in Theorems 22 and 24.
Lemma 21. Let α > 0. Then, for any υ ≥ 1 and any ε > 0,
Proof. The first inequality of (102) is obvious. If
Assume that for some v0 > 0 and all v ≥ v0, (103) holds (with the notation introduced in the lemma). Then, there exists υ2 = υ2(v0, b, α) > 0, such that for all υ1 ≥ υ2,
Theorem 22. Let α, b > 0 and ϕ(n) = n^b l(n). Then, the following assertions are equivalent:
Proof. We use Lemma 21. By the second inequality of (102), we see that (112) implies (111); by the first inequality of (102), we know that (111) implies (110). As (103) implies (104), we see that (110) implies (112). Thus (110), (111), and (112) are all equivalent.
Lemma 23. Let α > −1. Then for some n0, c1, c2 > 0 and all N ≥ n0,
Proof. Without loss of generality, we suppose that l(s) has the form (101) with c(s) ≡ 1. Therefore, for δ ∈ (0, α + 1), s^δ l(s) is increasing in [n1, ∞) for some n1 > 0 large enough. Consequently, for some positive constants c0, c2, and c3 (which may depend on n1) and all N ≥ n1,
Theorem 24. Let α, b > 0 and ϕ(n) = n^(b−1) l(n). Then, the following assertions are equivalent:
Proof. We proceed as in [34, page 394] where similar results were established for l(n) = 1 and real-valued random variables.
(a) We first prove that (117) is equivalent to
(b) We next remark that (119) is equivalent to
(c) We now prove that (121) implies (124). Set β = b/α. We have
(d) We then conclude that (117), (118), and (119) are equivalent. By (a), (b), and (c), we see that (117) implies (119). By Lemma 21, we have the implications: (119)⇒(118)⇒(117).
(e) We finally prove that (119) and (120) are equivalent. We have
6. Convergence Rates for Banach Valued Martingales
In this section, we consider the convergence rate in the law of large numbers for a sequence of Banach valued martingales. We will obtain more equivalent conditions than in Section 4, using the results of Section 5.
Theorem 25. Let α, b > 0, l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^b l(n). Suppose that for some γ ∈ (1,2], q ∈ [1, ∞] and λ ∈ (0, q),
Notice that, compared with Theorem 14, Theorem 25 contains the additional conditions (139) and (140). When ϕ(n) = n^b and for i.i.d. real-valued random variables, the implications (135)⇒(139)⇒(138) with o(1) of Theorem 25 contain Theorem 4 of Baum and Katz [6].
Remark 26. The conclusions of Theorem 25 remain valid if mn(γ) is replaced by
In fact, the case M ≥ 2 can be reduced to the case M = 1 by considering the subsequences {(XlM+i, ℱlM+i)} l≥0 (1 ≤ i ≤ M) of {(Xj, ℱj)} j≥1, which are still sequences of 𝔹-valued martingale differences.
Corollary 27. Let 1/2 < α ≤ 1 and b > 0. Let l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^b l(n). Suppose that for some γ ∈ (1/α, 2], q ∈ [1, ∞] with q > b/(γα − 1),
Proof of Theorem 25. Applying Theorem 14 to Xnj = Xj, ℱnj = ℱj (1 ≤ j ≤ n), and ϕ(n) = nbl(n), we get the implications (135)⇔(136)⇔(137)⇒(138). Applying Theorem 22 to Yn = ∥Sn∥, we know that (137) and (140) are equivalent. Obviously, we have the implications (140)⇒(139)⇒(138). Therefore, we have proved the implications (135)⇔(136)⇔(137)⇔(140)⇒(139)⇒(138).
Proof of Corollary 27. Since γ > 1/α, we have
Theorem 28. Let α, b > 0. Let l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^(b−1) l(n). For any ε > 0, set
Compared with Theorem 17, in Theorem 28 we have the additional conditions (149), (150), (151), (154), and (155).
Remark 29. As in Theorem 25, the conclusions of Theorem 28 remain valid if mn(γ) is replaced by
Corollary 30. Let α > 1/2, b > 0. Let l(·) > 0 be a function slowly varying at ∞ and ϕ(n) = n^(b−1) l(n). Suppose that for some γ ∈ (max {1,1/α}, 2], q ∈ [1, ∞] with q > b/(γα − 1), (142) holds. If 𝔹 is γ-smooth, then one has the implications (153)⇔(154)⇔(155)⇔(156)⇔(148)⇔(149)⇔(150)⇒(151)⇒(152).
If the Xj's are identically distributed real-valued random variables, then (156) is equivalent to the moment condition . So, Corollary 30 contains Theorems 1, 2, and 3 of Baum and Katz [6] when ϕ(n) = n^(b−1) and for i.i.d. real-valued random variables. When ϕ(n) = n^(b−1), α ≤ 1 and for real-valued martingale differences, Corollary 30 was proved by Alsmeyer [18, Theorems 1 and 2].
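For comparison, recall one standard formulation of the classical results just mentioned: if X, X1, X2, … are i.i.d. real-valued random variables with partial sums Sn = X1 + ⋯ + Xn, and if αp ≥ 1 with α > 1/2, then
Σ_{n≥1} n^(αp−2) ℙ{|Sn| > εn^α} < ∞ for every ε > 0
if and only if 𝔼|X|^p < ∞ and, when α ≤ 1, 𝔼X = 0. The case α = 1, p = 2 is the Hsu-Robbins-Erdös theorem, and the case α = p = 1 is Spitzer's theorem.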
Proof of Theorem 28. Applying Theorem 10 to
7. Convergence Rates for Weighted Sums of Banach Valued Martingale Differences of the Form
We will need the following elementary result.
Lemma 31. Let (Yj) j≥1 be an arbitrary sequence of 𝔹-valued random variables. If there exist n0, K > 0, such that for some q ∈ [1, ∞] and all n ≥ n0,
Proof. Let S0 = 0, , n ≥ 1. Then,
The following theorem is a Marcinkiewicz-Zygmund type strong law of large numbers for the weighted sums (161).
Theorem 32. Let {(Xj, ℱj)} j≥1 be 𝔹-valued martingale differences. Suppose that for some γ ∈ (1,2], there exist n0, K > 0, such that for all n ≥ n0,
Notice that when a = 1 and for real-valued martingale differences, the result (167) is implied by the classical Marcinkiewicz-Zygmund strong laws of large numbers. Also, it is evident that (167) holds if and only if ℙ{sup m≥n(∥Tm∥/m^(a+α)) > ε} → 0 for any ε > 0. So, (168) describes the convergence rates in the Marcinkiewicz-Zygmund strong laws of large numbers (167).
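Recall the classical Marcinkiewicz-Zygmund strong law referred to here: if (Xj) j≥1 are i.i.d. real-valued random variables with 𝔼X1 = 0 and 𝔼|X1|^p < ∞ for some 1 ≤ p < 2, then
n^(−1/p)(X1 + ⋯ + Xn) → 0 a.s.;
conversely, this a.s. convergence forces 𝔼|X1|^p < ∞.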
Proof of Theorem 32. Clearly,
To establish a general Baum-Katz type theorem for the weighted sums (161), we first introduce a definition and a technical lemma.
Definition 33. For a function Rρ regularly varying at ∞ of index ρ ≠ 0, one defines its inverse function.
Notice that when ρ > 0, Rρ(x) is strictly increasing for x large enough with lim x→∞Rρ(x) = +∞, so that is well defined on [u0, ∞) for u0 > 0 large enough. For simplicity, we always make the convention that if u ∈ [0, u0), so that is well defined on [0, ∞). We make a similar convention in the case where ρ < 0.
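As a simple illustration of the definition and of this convention (with functions of our own choosing): if Rρ(x) = x^ρ with ρ > 0, then the inverse function is u^(1/ρ); if Rρ(x) = x^ρ log x for x ≥ e, then the inverse function is asymptotically equivalent to (ρ/log u)^(1/ρ) u^(1/ρ) as u → ∞. In both cases the inverse is regularly varying at ∞ of index 1/ρ, in accordance with Lemma 34 below.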
The following lemma shows that the inverse function of a regularly varying function of index ρ ≠ 0 remains regularly varying.
Lemma 34. If Rρ(x) = x^ρ l(x) is regularly varying at ∞ of index ρ ≠ 0, where l(x) is of the canonical form , then its inverse function is regularly varying at ∞ of index 1/ρ, where is slowly varying at ∞.
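Here the canonical form of a slowly varying function refers to the Karamata representation: every function l slowly varying at ∞ can be written, for x large enough, as
l(x) = c(x) exp{∫_a^x (ε(t)/t) dt},
where c(x) → c ∈ (0, ∞) and ε(t) → 0 as t → ∞. Typical examples are constants, powers of log x, and iterated logarithms.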
Proof. Let y = Rρ(x). Define . We have
Theorem 35. Let {(Xj, ℱj)} j≥1 be identically distributed 𝔹-valued martingale differences and b ≥ 0. Suppose that for some γ ∈ (1,2] and q ∈ [1, ∞] with q > b/(γ(α + 1) − 1), γ(α + 1) − 1 > 0 and γ(a − 1) + 1 > 0,
(a) When a ≠ (b − α)/(b + 1),
(b) When a = (b − α)/(b + 1), (183) is implied by
Remark 36. Theorem 35 also holds if (182) is replaced by
Of particular interest are the cases where the slowly varying functions l and l1 are constants or powers of the logarithmic function, which will be studied in the following corollaries. We first consider the case where l and l1 are constants.
Corollary 37. Let {(Xj, ℱj)} j≥1 be identically distributed 𝔹-valued martingale differences and b ≥ 0. Suppose that (182) holds for some γ ∈ (1,2] and q ∈ [1, ∞] with q > b/(γ(α + 1) − 1), γ(α + 1) − 1 > 0, and γ(a − 1) + 1 > 0. If 𝔹 is γ-smooth, then
Notice that the condition on implies in particular , giving . Therefore, the conclusion of the corollary is interesting only when the exponents in (189) are greater than γ.
When (Xj) j≥1 are i.i.d. real-valued random variables and a ∈ (0,1), Corollary 37 yields the sufficiency parts of Theorems 2 and 3(a)(i) and (iii) of Lanzinger and Stadtmüller [21].
We then consider the case where l(x) = (log₊x)^p (p ∈ ℝ) and l1(x) = (log₊x)^β (β ∈ ℝ).
Corollary 38. Let {(Xj, ℱj)} j≥1 be identically distributed 𝔹-valued martingale differences and b ≥ 0. Suppose that (182) holds for some γ ∈ (1,2] and q ∈ [1, ∞] with q > b/(γ(α + 1) − 1), γ(α + 1) − 1 > 0, and γ(a − 1) + 1 > 0. If 𝔹 is γ-smooth, then for β, p ∈ ℝ,
In the case where (Xj) j≥1 are i.i.d. real-valued random variables and the maximum max 1≤j≤n∥Tj∥ is replaced by ∥Tn∥, Corollary 38 yields, if a ∈ (0,1), β = 0, and p = −1, the sufficiency part of Theorem 3(a)(ii) of Lanzinger and Stadtmüller [21]; if a = 0, β = 1, and p < 0, p ≠ −1, it yields the sufficiency parts of Theorem 3(b) of Lanzinger and Stadtmüller [21].
Proof of Theorem 35. Notice that is slowly varying at ∞ by Proposition 1.5.9a in [39, page 26]. Set
Lemma 39. Let b ≥ 0, α > −1, a > −α, and let l(·) and l1(·) be slowly varying at ∞. Let X1 be any 𝔹-valued random variable. Then, the following assertions hold.
Proof. We proceed as in the proof of Lemma 3.4 of Gut [20]. We distinguish three cases, according to whether a > 1, a = 1, or a < 1. By choosing a smooth version, we can suppose that l1 is differentiable (cf. [39]).
Case 1 (a > 1). In (198), we use the change of variables
Case 2 (a = 1). By the change of variables
Case 3 (−α < a < 1). By the change of variables
We distinguish three cases, according to whether a < (b − α)/(b + 1), a > (b − α)/(b + 1), or a = (b − α)/(b + 1).
(i) Suppose that a < (b − α)/(b + 1). By Proposition 1.5.8 of [39, page 26], we have as u → ∞,
(ii) Suppose that a > (b − α)/(b + 1). By Proposition 1.5.10 of [39, page 27], we have as u → ∞,
(iii) Suppose that a = (b − α)/(b + 1). In this case, (209) reduces to
8. Convergence of Weighted Sums of Banach Valued Martingale Differences of the Form
In this section, we consider more general weighted sums of Banach valued martingale differences than those considered in Section 7.
Let (Xj) j≥1 be a sequence of i.i.d. random variables with 𝔼Xj = 0, and let {anj, j ≥ 1, n ≥ 1} be an array of real numbers. The study of the convergence of weighted sums as n → ∞ is a classical subject; see for example, Salem and Zygmund [40], Hill [41], Hanson and Koopman [42], Pruitt [43], Franck and Hanson [44], Chow [45], Chow and Lai [46], and Stout [47]. Pruitt [43] found a necessary and sufficient condition for in probability and a sufficient condition for a.s. Baxter et al. [35] also showed a sufficient condition for a.s. Li et al. [9] studied the complete convergence of weighted sums of independent random variables of the form . Yu [23] and Ghosal and Chandra [19] considered the same problem for martingale differences (Xnj). We will extend or improve some of the aforementioned works.
8.1. Law of Large Numbers for Weighted Sums of Banach Valued Martingale Differences
Let (𝔹, ∥·∥) be a separable Banach space. In this subsection, we find sufficient conditions for the convergence of weighted sums of 𝔹-valued martingale differences (Xnj).
In the following, we consider the same problem for 𝔹-valued martingale differences (Xnj).
Theorem 40. Let {(Xnj, ℱnj)} j≥1, n ≥ 1, be sequences of 𝔹-valued martingale differences. Let {anj, j ≥ 1, n ≥ 1} be an array of real numbers satisfying and An = sup j≥1 | anj | → 0 as n → ∞. Suppose that for some γ ∈ (1,2], there exists K > 0, such that
In the following, we also consider a similar problem for arrays of 𝔹-valued martingale differences (Xnj).
Theorem 41. Let {(Xnj, ℱnj)} j≥1, n ≥ 1, be sequences of 𝔹-valued martingale differences. Let {anj, j ≥ 1, n ≥ 1} be an array of real numbers satisfying and max j≥1 | anj | = O(n^(−β)), β > 0. Suppose that, for some γ ∈ (1,2], there exists a constant K > 0 such that for all j ≥ 1, n ≥ 1. If 𝔹 is γ-smooth, then
Proof. Set Ynj = anjXnj, j ≥ 1, n ≥ 1; then {(Ynj, ℱnj), j ≥ 1, n ≥ 1} is an array of 𝔹-valued martingale differences and, for any n ≥ 1,
If additionally γ > 1 + 1/β, then (229) implies that
The following theorem extends Theorem 3.3 of Baxter et al. [35].
Theorem 42. Let {(Xj, ℱj)} j≥1 be 𝔹-valued martingale differences. Suppose that for some γ ∈ (1,2], there exists K > 0, such that for all j ≥ 1,
When (Xj, j ≥ 1) are i.i.d. real-valued random variables, α ≤ 1, and δ = 1, (240) reduces to Theorem 3.3 of Baxter et al. [35].
8.2. Complete Convergence of Weighted Sums of Banach Valued Martingale Differences
Let (𝔹, ∥·∥) be a separable Banach space. In this subsection, we consider complete convergence of weighted sums of 𝔹-valued martingale differences (Xnj) of the form . We extend and improve Corollary 1 of Ghosal and Chandra [19] and Theorems 2.2–2.4 of Li et al. [9]. We also generalize Theorem 2 of Yu [23].
Theorem 43. Let {(Xnj, ℱnj)} j≥1, n ≥ 1, be sequences of 𝔹-valued martingale differences. Let α > 0 and {anj, j ≥ 1, n ≥ 1} be an array of real numbers. If there exists γ ∈ (1,2] such that
In the case of square-integrable real-valued martingale differences, the result was proved by Ghosal and Chandra in Corollary 1 of [19] if, additionally, for some δ < 2α.
Theorem 43 generalizes Theorem 2 of Yu [23] in two directions: first, we extend from sequences of L^p (p ≥ 2) martingale differences to sequences of L^γ (γ ∈ (1,2]) 𝔹-valued martingale differences; second, we do not need the condition for some δ < α.
Proof of Theorem 43. Set Ynj = n^(−α) anj Xnj. From (249), we have
Theorem 44. Let {(Xnj, ℱnj)} j≥1, n ≥ 1, be sequences of 𝔹-valued martingale differences. Let {anj, j ≥ 1, n ≥ 1} be an array of real numbers. Suppose that for some constants γ ∈ (1,2], K > 0, α ∈ (0,1/2] and δ < γα,
When (Xnj) j≥1, n ≥ 1, is a sequence of zero mean independent real-valued random variables and γ = 2, Theorem 44 reduces to Theorem 2.3 of Li et al. [9].
Proof of Theorem 44. Set Ynj = n^(−α) anj Xnj, j ≥ 1, n ≥ 1; then {(Ynj, ℱnj), j ≥ 1, n ≥ 1} is an array of 𝔹-valued martingale differences. By (255) and (257), for any n ≥ 1, we have
Theorem 45. Let be a triangular array of identically distributed 𝔹-valued martingale differences. Let β > −1, and let {anj, 1 ≤ j ≤ n} be a triangular array of positive numbers satisfying
When {Xnj} = {Xj} are the same sequence of i.i.d. real-valued random variables and for all n ≥ 1, Theorem 45 reduces to the sufficiency of Theorem 2.4 of Li et al. [9].
Proof of Theorem 45. Set
Acknowledgment
The author is most grateful to editor Dumitru Motreanu and an anonymous referee for their careful reading and insightful comments. This work has been partially supported by the Research Fund of Beijing International Studies University (no. 13Bb023) and Doctoral Research Start-up Funds Projects of Beijing International Studies University.