
The Annals of Probability
1997, Vol. 25, No. 3, 1502–1513

ESTIMATION OF MOMENTS OF SUMS OF INDEPENDENT REAL RANDOM VARIABLES¹

By Rafał Latała

Warsaw University

For the sum $S = \sum X_i$ of a sequence $(X_i)$ of independent symmetric (or nonnegative) random variables, we give lower and upper estimates of moments of $S$. The estimates are exact, up to some universal constants, and extend the previous results for particular types of variables $X_i$.

Introduction. Let $X_1, X_2, \ldots$ be a sequence of independent real random variables and let $S = \sum X_i$. In the last few years several papers have appeared in which exact estimates (up to some constants) were found for the moments of $S$, that is, for the quantities
$$\|S\|_p = \bigl(E|S|^p\bigr)^{1/p}.$$
The growth of moments is closely related to the behavior of the tails of $S$. Precise tail estimates (up to some constants) were found in [7], and independently in [8] and [6], Chapter 4, in the case $X_i = a_i\varepsilon_i$, where $a_i \in \mathbb{R}$ and $(\varepsilon_i)$ is the Bernoulli sequence. In [2] estimates for moments were given in this case. This result was generalized in [1] to the case $X_i = a_iY_i$, with $a_i \in \mathbb{R}$ and $(Y_i)$ i.i.d. symmetric random variables with logarithmically concave tails. In [4] estimates for moments of $S$ were established when the $X_i$ are symmetric random variables with logarithmically convex tails.

In this paper we give simple formulas for estimating moments which hold in the general case when the $X_i$ are independent symmetric or nonnegative random variables (Theorems 1 and 2). In particular, using them we easily derive the above-mentioned results. As a simple application, we also prove that the constants $C_p$ in the Rosenthal inequality
$$\Bigl\|\sum X_i\Bigr\|_p \le C_p\max\Bigl\{\Bigl\|\sum X_i\Bigr\|_2,\ \Bigl(\sum\|X_i\|_p^p\Bigr)^{1/p}\Bigr\}$$
are of order $p/\ln p$; compare [5].

Definitions and notation. Let us define the following functions on $\mathbb{R}$, for $p > 0$:
$$\varphi_p(x) = |1+x|^p, \qquad \tilde\varphi_p(x) = \frac{\varphi_p(x) + \varphi_p(-x)}{2}.$$

Received January 1996; revised October 1996.

¹Research partially supported by Foundation for Polish Science and KBN Grant 2 P301 022 07.

AMS 1991 subject classifications. 60E15, 60G50.

Key words and phrases. Estimation of moments, sums of independent random variables, Rosenthal inequality.

For a random variable $X$ we define
$$\phi_p(X) = E\varphi_p(X),$$
and for a sequence $(X_i)$ of independent nonnegative (resp. symmetric) random variables we define the following Orlicz norm:
$$\|(X_i)\|_p = \inf\Bigl\{t > 0:\ \sum_i \ln\phi_p\Bigl(\frac{X_i}{t}\Bigr) \le p\Bigr\}.$$
For two functions $f, g$ we write $f \sim g$ to signify that for some constant $C$, $C^{-1}f \le g \le Cf$.
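As an illustration (not part of the paper), the quantity $\|(X_i)\|_p$ can be approximated numerically, since for nonnegative $X_i$ the map $t \mapsto \sum_i\ln\phi_p(X_i/t)$ is nonincreasing. A minimal Monte Carlo sketch in Python, with helper names of our own choosing:

```python
import numpy as np

def latala_norm(samples, p, tol=1e-4):
    """Approximate ||(X_i)||_p = inf{t > 0 : sum_i ln E phi_p(X_i/t) <= p},
    with phi_p(x) = |1 + x|**p, from Monte Carlo draws of each X_i.
    `samples` is a list of 1-D arrays; samples[i] holds draws of X_i."""
    def excess(t):
        # sum_i ln E|1 + X_i/t|^p, minus the target level p
        return sum(np.log(np.mean(np.abs(1.0 + s / t) ** p)) for s in samples) - p

    lo, hi = 1e-8, 1.0
    while excess(hi) > 0:      # grow the bracket until the constraint holds
        hi *= 2.0
    while hi - lo > tol * hi:  # bisect down to the infimum
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if excess(mid) > 0 else (lo, mid)
    return hi

# Example: five i.i.d. exponential variables, p = 4.
rng = np.random.default_rng(0)
print(latala_norm([rng.exponential(size=100_000) for _ in range(5)], p=4))
```

By Theorem 1 below, this quantity matches $\|X_1 + \cdots + X_n\|_p$ up to universal constants.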

1. Nonnegative random variables. Let us begin with the following simple lemma.

Lemma 1. For $X_1, \ldots, X_n$ independent nonnegative random variables we have
$$\phi_p(X_1+\cdots+X_n) \le \phi_p(X_1)\cdots\phi_p(X_n).$$

Proof. Obviously it is enough to prove Lemma 1 for $n = 2$, and this reduces to the observation that
$$\varphi_p(x+y) \le \varphi_p(x)\varphi_p(y) \quad\text{for } x, y \ge 0,$$
which holds since $1 + x + y \le (1+x)(1+y)$. ✷

Lemma 2. If $X, Y$ are independent nonnegative random variables, then
$$\phi_p\bigl(2X + \phi_p(X)^{2/p}Y\bigr) \ge \phi_p(X)\phi_p(Y).$$

Proof. First let us notice that
$$\varphi_p(tx) \ge t^{p/2}\varphi_p(x) \quad\text{for } t \ge 1,\ x \ge 1,$$
which, after taking $p$th roots, reduces to $1 + tx \ge \sqrt t\,(1+x)$. Hence

(1) $\quad E\varphi_p\bigl(2X + \phi_p(X)^{2/p}Y\bigr)I_{\{Y\ge1\}} \ge E\varphi_p\bigl(\phi_p(X)^{2/p}Y\bigr)I_{\{Y\ge1\}} \ge \phi_p(X)\,E\varphi_p(Y)I_{\{Y\ge1\}}.$

Since for $0 \le y < 1$ and $x \ge 0$ we have $\varphi_p\bigl(2x + \phi_p(X)^{2/p}y\bigr) \ge \varphi_p\bigl((1+y)x + y\bigr) = \varphi_p(y)\varphi_p(x)$, it follows that

(2) $\quad E\varphi_p\bigl(2X + \phi_p(X)^{2/p}Y\bigr)I_{\{Y<1\}} \ge \phi_p(X)\,E\varphi_p(Y)I_{\{Y<1\}}.$

This and (1) give the proof of Lemma 2. ✷

Lemma 3. If $X_1, X_2, \ldots, X_n$ are independent nonnegative random variables such that $\phi_p(X_1)\cdots\phi_p(X_n) \le e^p$, then
$$\phi_p\bigl(2e^2(X_1+\cdots+X_n)\bigr) \ge \phi_p(X_1)\cdots\phi_p(X_n).$$

Proof. Let $Y_k = 2\bigl(\phi_p(X_1)\cdots\phi_p(X_k)\bigr)^{2/p}(X_1+\cdots+X_k)$. We prove by induction that

(3) $\quad\phi_p(Y_k) \ge \phi_p(X_1)\cdots\phi_p(X_k).$

For $k = 1$ it is obvious, so assume that (3) holds for some $k$. Then $Y_{k+1} \ge 2X_{k+1} + \phi_p(X_{k+1})^{2/p}Y_k$, so by the monotonicity of $\varphi_p$ on $[0,\infty)$ and the previous lemma,
$$\phi_p(Y_{k+1}) \ge \phi_p\bigl(2X_{k+1} + \phi_p(X_{k+1})^{2/p}Y_k\bigr) \ge \phi_p(X_{k+1})\phi_p(Y_k) \ge \phi_p(X_1)\cdots\phi_p(X_{k+1}).$$
Finally, since $\phi_p(X_1)\cdots\phi_p(X_n) \le e^p$, we have $Y_n \le 2e^2(X_1+\cdots+X_n)$, and the lemma follows from (3) with $k = n$. ✷

Theorem 1. Let $X_1, X_2, \ldots, X_n$ be a sequence of independent nonnegative random variables and $p > 0$. Then the following inequalities hold:
$$\frac{e-1}{2e^2}\,\|(X_i)\|_p \le \|X_1+\cdots+X_n\|_p \le e\,\|(X_i)\|_p \quad\text{for } p \ge 1$$
and
$$\frac{(e^p-1)^{1/p}}{2e^2}\,\|(X_i)\|_p \le \|X_1+\cdots+X_n\|_p \le e\,\|(X_i)\|_p \quad\text{for } p \le 1.$$

Proof. Let us assume that
$$\sum_i\ln\phi_p\Bigl(\frac{X_i}{t}\Bigr) = p,$$
so that $\phi_p(X_1/t)\cdots\phi_p(X_n/t) = e^p$. By Lemma 1,
$$\phi_p\Bigl(\frac{X_1+\cdots+X_n}{t}\Bigr) \le e^p.$$
However, $\varphi_p(x) \ge x^p$ for $x \ge 0$, so for any nonnegative random variable $Z$, $\phi_p(Z) \ge \|Z\|_p^p$, and therefore
$$\|X_1+\cdots+X_n\|_p \le et.$$
To show the other inequality, let us observe that by Lemma 3,

(4) $\quad\phi_p\Bigl(2e^2\,\dfrac{X_1+\cdots+X_n}{t}\Bigr) \ge e^p.$

However, for any nonnegative random variable $Z$,

(5) $\quad\phi_p(Z) \le \bigl(1 + \|Z\|_p\bigr)^p \quad\text{for } p \ge 1$

by the triangle inequality. For $p \le 1$, since $\varphi_p(x) \le 1 + x^p$ for $x \ge 0$, we have that

(6) $\quad\phi_p(Z) \le 1 + \|Z\|_p^p \quad\text{for } p \le 1.$

From (4), (5) and (6) we obtain the desired lower estimates, and this completes the proof. ✷

In the particular case of i.i.d. nonnegative r.v., Theorem 1 yields the following result of S. J. Montgomery-Smith (private communication).

Corollary 1. If $p \ge 1$ and $X, X_1, \ldots, X_n$ are i.i.d. nonnegative random variables, then
$$\|X_1+\cdots+X_n\|_p \sim \sup\Bigl\{\frac{p}{s}\Bigl(\frac{n}{p}\Bigr)^{1/s}\|X\|_s:\ \max\Bigl(1, \frac{p}{n}\Bigr) \le s \le p\Bigr\}.$$

Proof. By Theorem 1 we have
$$\|X_1+\cdots+X_n\|_p \sim \inf\bigl\{t > 0:\ \phi_p(X/t) \le e^{p/n}\bigr\}.$$
First assume that $\phi_p(X/t) \le e^{p/n}$ and $\max(1, p/n) \le s \le p$. Then, since for $x \ge 0$,
$$\varphi_p(x) = \bigl((1+x)^{p/s}\bigr)^s \ge \Bigl(1 + \frac{p}{s}x\Bigr)^s \ge 1 + \Bigl(\frac{p}{s}\Bigr)^sx^s,$$
we obtain
$$\Bigl(\frac{p}{s}\Bigr)^s\frac{\|X\|_s^s}{t^s} \le e^{p/n} - 1.$$
If $n \ge p$, then $e^{p/n} - 1 \le ep/n$, so that
$$t \ge e^{-1}\frac{p}{s}\Bigl(\frac{n}{p}\Bigr)^{1/s}\|X\|_s,$$
and if $n \le p$ and $s \ge p/n$, then $(e^{p/n}-1)^{1/s} \le e$, and so we obtain
$$t \ge e^{-1}\frac{p}{s}\|X\|_s \ge e^{-1}\frac{p}{s}\Bigl(\frac{n}{p}\Bigr)^{1/s}\|X\|_s.$$
To estimate from the other side, we may assume that
$$\sup\Bigl\{\frac{p}{s}\Bigl(\frac{n}{p}\Bigr)^{1/s}\|X\|_s:\ \max\Bigl(1,\frac{p}{n}\Bigr) \le s \le p\Bigr\} = t.$$
Since for $x \ge 0$,

(7) $\quad\varphi_p(x) \le \displaystyle\sum_{k<p}\binom{p}{k}x^k + x^p$

and $\binom{p}{k} \le (ep/k)^k$, if $n \ge p$ we have that
$$\phi_p\Bigl(\frac{X}{2et}\Bigr) \le \sum_{k<p}\binom{p}{k}\frac{\|X\|_k^k}{(2et)^k} + \frac{\|X\|_p^p}{(2et)^p} \le 1 + \frac{p}{n} \le e^{p/n}.$$
If $p \ge n$, we have $(p/n)^{1/k} \le k^{1/k} < e$ for $k \ge p/n$. Also $\|X\|_k \le \|X\|_{p/n}$ for $k \le p/n$. Therefore from (7) we obtain
$$\phi_p\Bigl(\frac{X}{2et}\Bigr) \le \exp\Bigl(\frac{p\|X\|_{p/n}}{2et}\Bigr) + \sum_{p/n<k<p}\Bigl(\frac{ep}{k}\Bigr)^k\frac{\|X\|_k^k}{(2et)^k} + \frac{\|X\|_p^p}{(2et)^p} \le e^{p/2n} + \frac{p}{n} \le e^{p/n}. \quad✷$$
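As a numerical illustration of Corollary 1 (ours, not the paper's; the helper names are assumptions), one can compare a Monte Carlo estimate of $\|X_1+\cdots+X_n\|_p$ with the supremum on the right-hand side:

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

def mc_sum_moment(n, p, sampler, trials=100_000):
    # Monte Carlo estimate of ||X_1 + ... + X_n||_p for i.i.d. nonnegative X_i
    s = sampler(size=(trials, n)).sum(axis=1)
    return np.mean(s ** p) ** (1.0 / p)

def ms_equivalent(n, p, norm_s):
    # sup over max(1, p/n) <= s <= p of (p/s)(n/p)^{1/s} ||X||_s,
    # where norm_s(s) = ||X||_s = (E X^s)^{1/s}
    grid = np.linspace(max(1.0, p / n), p, 200)
    return max((p / s) * (n / p) ** (1.0 / s) * norm_s(s) for s in grid)

# Example: X ~ Exp(1), for which ||X||_s = Gamma(s + 1)^{1/s}.
n, p = 20, 6
print(mc_sum_moment(n, p, rng.exponential),
      ms_equivalent(n, p, lambda s: gamma(s + 1) ** (1.0 / s)))
```

The two printed numbers should agree up to the universal constants of Theorem 1.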

2. Symmetric random variables.

Lemma 4. For any $p \ge 2$ and real numbers $a < b < c < d$ satisfying the condition $a + d = b + c = 2$, the function
$$f(t) = |a+t|^p + |b-t|^p + |c-t|^p + |d+t|^p$$
is nondecreasing for $t \ge 0$.

Proof. Since $f$ is convex, it is enough to check that $f'(0) \ge 0$. But
$$p^{-1}f'(0) = |a|^{p-2}a - |b|^{p-2}b - |c|^{p-2}c + |d|^{p-2}d = g(d-1) - g(c-1),$$
where
$$g(s) = |1+s|^{p-2}(1+s) + |1-s|^{p-2}(1-s).$$
So it is enough to show that the function $g$ is nondecreasing on $[0,\infty)$. This is true since
$$g'(s) = (p-1)\bigl((1+s)^{p-2} - (1-s)^{p-2}\bigr) \ge 0 \quad\text{for } s \in [0,1]$$
and
$$g'(s) = (p-1)\bigl((1+s)^{p-2} - (s-1)^{p-2}\bigr) \ge 0 \quad\text{for } s \in [1,\infty). \quad✷$$

Lemma 5. For $X_1, \ldots, X_n$ independent symmetric random variables and $p \ge 2$ we have
$$\phi_p(X_1+\cdots+X_n) \le \phi_p(X_1)\cdots\phi_p(X_n).$$

Proof. The proof easily reduces to the case $n = 2$ and $X_1 = x\varepsilon_1$, $X_2 = y\varepsilon_2$, with $0 \le y \le x$. In this case, the claim becomes the inequality
$$\tilde\varphi_p(x+y) + \tilde\varphi_p(x-y) \le 2\tilde\varphi_p(x)\tilde\varphi_p(y).$$
This follows from Lemma 4, applied to $a = 1-x-y$, $b = 1-x+y$, $c = 1+x-y$ and $d = 1+x+y$: the left-hand side equals $\frac12f(0)$ and the right-hand side equals $\frac12f(xy)$. ✷

Lemma 6. If $t \ge 1$, $x \ge 1$ and $p \ge 1$, then

(8) $\quad\tilde\varphi_p(tx) \ge t^{p/2}\tilde\varphi_p(x).$

Proof. Let us fix $x \ge 1$ and define, for $t \ge 1$,
$$f(t) = \ln\tilde\varphi_p(tx) - \frac{p}{2}\ln t.$$
We have to show that $f(t) \ge f(1)$. This is true since $f$ is nondecreasing on $[1,\infty)$, because
$$f'(t) = \frac{p}{2t}\cdot\frac{(tx-1)(tx+1)^{p-1} + (tx+1)(tx-1)^{p-1}}{(tx-1)^p + (tx+1)^p} \ge 0. \quad✷$$

Lemma 7. If $X_1, X_2, \ldots, X_n$ are independent symmetric random variables such that $\phi_p(X_1)\cdots\phi_p(X_n) \le e^p$, then for $p \ge 2$,
$$\phi_p\bigl(2e^2(X_1+\cdots+X_n)\bigr) \ge \phi_p(X_1)\cdots\phi_p(X_n).$$

Proof. Following the proof of Lemma 3, it is enough to show that
$$\phi_p\bigl(2X + \phi_p(X)^{2/p}Y\bigr) \ge \phi_p(X)\phi_p(Y)$$
for independent symmetric variables $X$ and $Y$. By the convexity of $\varphi_p$, we obtain $E\varphi_p(a+b\varepsilon) \le E\varphi_p(a+c\varepsilon)$ for real numbers $a, b, c$ such that $|b| \le |c|$. Therefore, since $\phi_p(X) \ge 1$, we have for any real numbers $x, y$ with $|y| \le 1$,
$$E\tilde\varphi_p(\varepsilon_1x)\tilde\varphi_p(\varepsilon_2y) = \tilde\varphi_p(x)\tilde\varphi_p(y) = E\varphi_p\bigl(\varepsilon_2y + \varepsilon_1(x + \varepsilon_2xy)\bigr) \le E\varphi_p(\varepsilon_2y + 2\varepsilon_1x) \le E\varphi_p\bigl(2\varepsilon_1x + \phi_p(X)^{2/p}\varepsilon_2y\bigr) = E\tilde\varphi_p\bigl(2\varepsilon_1x + \phi_p(X)^{2/p}\varepsilon_2y\bigr).$$
So we may proceed as in the proof of Lemma 2, using Lemma 6 and the above inequality. ✷

Now, proceeding exactly as in the case of nonnegative random variables, we derive the following from Lemmas 5 and 7.

Theorem 2. Let $X_1, X_2, \ldots, X_n$ be a sequence of independent symmetric random variables, and $p \ge 2$. Then the following inequalities hold:
$$\frac{e-1}{2e^2}\,\|(X_i)\|_p \le \|X_1+\cdots+X_n\|_p \le e\,\|(X_i)\|_p.$$
Also, in a similar way as in the nonnegative case, we prove the following.

Corollary 2. If $p \ge 2$ and $X, X_1, \ldots, X_n$ are i.i.d. symmetric random variables, then we have
$$\|X_1+\cdots+X_n\|_p \sim \sup\Bigl\{\frac{p}{s}\Bigl(\frac{n}{p}\Bigr)^{1/s}\|X\|_s:\ \max\Bigl(2, \frac{p}{n}\Bigr) \le s \le p\Bigr\}.$$

Remark 1. If we change $\ln$ in the definition of $\|(X_i)\|_p$ to $\log_a$ for some $a > 1$, then the lower constants in Theorems 1 and 2 change to $(a-1)/(2a^2)$ and the upper constants change to $a$. The lowest ratio of these constants is obtained for $a = 3/2$.

Remark 2. If $X_i$ are independent, mean zero random variables, and $(\varepsilon_i)$ is a Bernoulli sequence independent of $(X_i)$, then
$$\frac12\Bigl\|\sum X_i\Bigr\|_p \le \Bigl\|\sum\varepsilon_iX_i\Bigr\|_p \le 2\Bigl\|\sum X_i\Bigr\|_p.$$
Hence we may obtain Theorem 2 for mean zero random variables, with slightly worse constants, by setting $\phi_p(X_i) = \phi_p(\varepsilon_iX_i) = E\tilde\varphi_p(X_i)$.

Remark 3. If $p < 2$, then by Khintchine's inequality we have, for independent symmetric random variables $X_i$,
$$c_p\Bigl\|\Bigl(\sum X_i^2\Bigr)^{1/2}\Bigr\|_p \le \Bigl\|\sum X_i\Bigr\|_p \le \Bigl\|\Bigl(\sum X_i^2\Bigr)^{1/2}\Bigr\|_p,$$
where the $c_p$ are positive constants depending only on $p$. So we may use Theorem 1 to obtain some estimates of moments for $p < 2$.

3. Examples of applications. We give a few examples of random variables $X_i$ for which one can compute functions $M_p(X_i)(t)$ equivalent to $\frac1p\ln\phi_p(tX_i)$, in the sense that
$$\Bigl\|\sum a_iX_i\Bigr\|_p \sim \inf\Bigl\{t > 0:\ \sum_iM_p(X_i)\Bigl(\frac{a_i}{t}\Bigr) \le 1\Bigr\}.$$
We will assume that $p \ge 2$ and use the following simple estimates of $\tilde\varphi_p$:

(9) $\quad\tilde\varphi_p(x) \ge 1 + \dfrac{p(p-1)}{4}x^2 \ge 1 + \dfrac{p^2}{8}x^2,$

(10) $\quad\tilde\varphi_p(x) \le \cosh(px) \le 1 + p^2x^2 \quad\text{for } p|x| \le 1$

and

(11) $\quad\max\bigl\{\tfrac12(1+x)^p,\ 1+x^p\bigr\} \le \tilde\varphi_p(x) \le (1+x)^p \le e^{px} \quad\text{for } x \ge 0.$

3.1. Let $\varepsilon$ be a symmetric Bernoulli variable, that is, $P(\varepsilon = \pm1) = 1/2$, and let
$$M_p(\varepsilon)(t) = \begin{cases} |t| & \text{if } p|t| \ge 1,\\ pt^2 & \text{if } p|t| \le 1.\end{cases}$$
Then by a simple calculation we get $\ln\phi_p(t\varepsilon) \le pM_p(\varepsilon)(t)$ by (10) and (11), and $\ln\phi_p(4t\varepsilon) \ge p\min\{1, M_p(\varepsilon)(t)\}$ by (9) and (11). Hence Theorem 2 yields the following result (cf. [2]):
$$\Bigl\|\sum a_i\varepsilon_i\Bigr\|_p \sim \sum_{i\le p}a_i + \sqrt p\Bigl(\sum_{i>p}a_i^2\Bigr)^{1/2},$$
where $(\varepsilon_i)$ is a sequence of independent symmetric Bernoulli variables, and $(a_i)$ is a nonincreasing sequence of nonnegative numbers.
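This equivalence is easy to probe numerically; a small Monte Carlo sketch (ours, with hypothetical helper names):

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_moment(a, p, trials=200_000):
    # Monte Carlo estimate of ||sum a_i eps_i||_p for Rademacher signs eps_i
    eps = rng.choice([-1.0, 1.0], size=(trials, len(a)))
    return np.mean(np.abs(eps @ a) ** p) ** (1.0 / p)

def equivalent(a, p):
    # Section 3.1 equivalent: the sum of the p largest coefficients
    # plus sqrt(p) times the l2 norm of the remaining ones
    a = np.sort(np.abs(a))[::-1]
    k = int(p)
    return a[:k].sum() + np.sqrt(p) * np.linalg.norm(a[k:])

a = 1.0 / np.arange(1, 51)   # a nonincreasing test sequence
for p in (2, 4, 8):
    print(p, mc_moment(a, p), equivalent(a, p))
```

The ratio of the two printed values stays bounded, in line with the universal constants of Theorem 2.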

3.2. We may generalize the previous example. Let $X$ be a symmetric random variable with logarithmically concave tails; that is, $P(|X| \ge t) = e^{-N(t)}$ for $t \ge 0$, where $N:\mathbb{R}_+\to\mathbb{R}_+\cup\{\infty\}$ is a convex function. Since it is only a matter of multiplying $X$ by some constant, we will assume that

(12) $\quad\inf\{t > 0:\ N(t) \ge 1\} = 1.$

In this case we will set
$$M_p(X)(t) = \begin{cases} p^{-1}N^*(p|t|) & \text{if } p|t| \ge 2,\\ pt^2 & \text{if } p|t| < 2,\end{cases}$$
where $N^*(t) = \sup\{ts - N(s):\ s > 0\}$. We will prove that

(13) $\quad\ln\phi_p(tX/4) \le pM_p(X)(t)$

and

(14) $\quad p\min\{1, M_p(X)(t)\} \le \ln\phi_p(e^3tX).$
By the symmetry of $X$ we may assume that $t > 0$. If $pt \ge 2$, then by (11) and integration by parts,
$$\phi_p(tX/4) \le Ee^{pt|X|/4} = 1 + \int_0^\infty e^{s-N(4s/pt)}\,ds \le 1 + e^{N^*(pt)/2}\int_0^\infty e^{-s}\,ds \le 1 + e^{N^*(pt)/2} \le e^{N^*(pt)},$$
since $2s - N(4s/pt) \le N^*(pt/2) \le N^*(pt)/2$ and $N^*(pt) \ge 1$. If $pt < 2$, then $t < 1$. By the convexity of $N$ and the normalization (12), we get $N(x) \ge x$ for $x \ge 1$. Hence
$$EX^2 \le 1 + \int_1^\infty x^2e^{-x}\,dx = 1 + 5e^{-1} \le 3$$
and
$$E\bigl(1+t|X|/4\bigr)^pI_{\{pt|X|\ge4\}} \le \int_{4/pt}^\infty(1+tx/4)^pe^{-x}\,dx \le \int_{4/pt}^\infty e^{-x/2}\,dx\ \sup_{x\ge4/pt}(1+tx/4)^pe^{-x/2} \le 2e\,\frac{t^2p^2}{8}.$$
Therefore, by (10) and (11) we obtain
$$\phi_p(tX/4) \le E\bigl(1 + p^2t^2X^2/16\bigr)I_{\{pt|X|<4\}} + E\bigl(1+t|X|/4\bigr)^pI_{\{pt|X|\ge4\}} \le 1 + t^2p^2,$$
and (13) follows. To prove the second estimate, let us first assume that $pt < 2$. Then by (12) we have $EX^2 \ge e^{-1}$, and by (9) it follows that
$$\ln\phi_p(e^3tX) \ge \ln\bigl(1 + p^2t^2e^5/8\bigr) \ge p^2t^2.$$
Now let $pt \ge 2$; then $N^*(pt) \ge 1$. If $p \ge N(1/t)$, then by (11) we obtain
$$\phi_p(e^3tX) \ge \tfrac12\bigl(1+e^3\bigr)^pe^{-N(1/t)} \ge e^p.$$
So we need only consider the case when $N^*(pt) = pts - N(s)$ for some $1/(pt) \le s \le 1/t$. But in this case, by (11),
$$\phi_p(e^3tX) \ge \tfrac12\bigl(1 + e^3ts\bigr)^pe^{-N(s)} \ge e^{pts-N(s)} = e^{N^*(pt)}.$$
The proof of (14) is complete.
From (13) and (14) we obtain the following slight generalization of the result of [1]:
$$\Bigl\|\sum a_iX_i\Bigr\|_p \sim \inf\Bigl\{t > 0:\ \sum_{i\le p}N_i^*\Bigl(\frac{pa_i}{t}\Bigr) \le p\Bigr\} + \sqrt p\Bigl(\sum_{i>p}a_i^2\Bigr)^{1/2},$$
where $(X_i)$ is a sequence of independent symmetric random variables with logarithmically concave tails, normalized so that $\inf\{t:\ P(|X_i| \ge t) \le e^{-1}\} = 1$, $N_i(t) = -\ln P(|X_i| \ge t)$, $(a_i)$ is a nonincreasing sequence of nonnegative numbers and $p \ge 2$.
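The infimum above can be evaluated numerically once $N$ is given; here is a rough sketch (ours; the grid sizes, function names and the test tail $N(t) = t^2$ are our choices, not the paper's):

```python
import numpy as np

def n_star(N, u, s_grid):
    # grid approximation of the Legendre transform N*(u) = sup_{s>0}(u*s - N(s))
    return float(np.max(u * s_grid - N(s_grid)))

def logconcave_moment(a, p, N):
    """Section 3.2 equivalent for ||sum a_i X_i||_p when every X_i has tail
    P(|X_i| >= t) = exp(-N(t)), N convex and normalized as in (12):
    inf{t > 0 : sum_{i<=p} N*(p a_i/t) <= p} + sqrt(p)(sum_{i>p} a_i^2)^{1/2}."""
    s_grid = np.linspace(1e-6, 50.0, 20_000)
    a = np.sort(np.abs(np.asarray(a)))[::-1]
    head, tail = a[: int(p)], a[int(p):]
    budget = lambda t: sum(n_star(N, p * ai / t, s_grid) for ai in head)

    lo, hi = 1e-9, 1.0
    while budget(hi) > p:      # grow the bracket until the constraint holds
        hi *= 2.0
    for _ in range(60):        # bisect down to the infimum
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if budget(mid) > p else (lo, mid)
    return hi + np.sqrt(p) * np.linalg.norm(tail)

# Gaussian-type tails N(t) = t^2 satisfy (12); coefficients a_i = 1/i.
print(logconcave_moment(1.0 / np.arange(1, 21), p=4.0, N=lambda s: s ** 2))
```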

3.3. Let $X$ be a symmetric random variable with logarithmically convex tails; that is, $P(|X| \ge t) = e^{-N(t)}$ for $t \ge 0$, where $N:\mathbb{R}_+\to\mathbb{R}_+$ is a concave function, and let
$$M_p(X)(t) = \max\bigl\{|t|^p\|X\|_p^p,\ pt^2\|X\|_2^2\bigr\}.$$
We will prove that in this case

(15) $\quad\ln\phi_p(e^{-2}tX) \le \max\bigl\{|t|^p\|X\|_p^p,\ p^2t^2\|X\|_2^2\bigr\} \le pM_p(X)(t)$

and

(16) $\quad p\min\{1, M_p(X)(t)\} \le \ln\phi_p(e^2tX).$
Since $tX$ also has logarithmically convex tails, we may assume that $t = 1$. First let $C = \max\{\|X\|_p^p,\ p^2\|X\|_2^2\}$. Then by (10) and (11) we have

(17) $\quad\phi_p(e^{-2}X) \le E\bigl(1 + e^{-4}p^2X^2\bigr)I_{\{e^{-2}p|X|\le1\}} + Ee^{e^{-2}p|X|}I_{\{1\le e^{-2}p|X|\le p\}} + 2^pe^{-2p}E|X|^pI_{\{e^{-2}p|X|\ge p\}}.$

Integrating by parts, we obtain
$$Ee^{e^{-2}p|X|}I_{\{e^2\le p|X|\le e^2p\}} \le e^{1-N(e^2/p)} + \int_1^pe^{t-N(te^2/p)}\,dt,$$
but from Chebyshev's inequality
$$e^{-N(e^2)} \le Ce^{-2p} \quad\text{and}\quad e^{-N(e^2/p)} \le Ce^{-4}.$$
Hence, by the concavity of $N$, if $t = \lambda\cdot1 + (1-\lambda)p$, we get
$$e^{-N(te^2/p)} \le e^{-\lambda N(e^2/p)-(1-\lambda)N(e^2)} \le Ce^{-4\lambda-2p(1-\lambda)} \le Ce^{-2t}.$$
Therefore,
$$Ee^{e^{-2}p|X|}I_{\{e^2\le p|X|\le e^2p\}} \le Ce^{-3} + C\int_1^pe^{-t}\,dt \le C\bigl(e^{-3} + e^{-1}\bigr).$$
Finally, from (17) it follows that
$$\ln\phi_p(e^{-2}X) \le \ln\bigl(1 + C(e^{-4} + e^{-3} + e^{-1} + e^{-p})\bigr) \le \ln(1+C) \le C,$$
and (15) is proved. Let us now establish (16). We may suppose that $\phi_p(e^2X) \le e^p$; otherwise (16) follows trivially. But then, from (11), we have that $\|X\|_p \le e^{-1}$. Therefore, from Chebyshev's inequality, $N(1) \ge p$, and by the concavity of $N$ we have $N(x) \ge px$ for $x \le 1$. Hence
$$EX^2I_{\{|X|\le1\}} \le \int_0^1 2xe^{-px}\,dx \le 2p^{-2}$$
and
$$EX^2I_{\{|X|>1\}} \le E|X|^p \le e^{-2p}\phi_p(e^2X) \le e^{-p} \le p^{-2}.$$
Therefore $p^2EX^2 \le 3$, and hence by (9),
$$\ln\phi_p(e^2X) \ge \ln\Bigl(1 + \frac{p^2}{8}e^4EX^2\Bigr) \ge p^2EX^2.$$
By (11) we also have
$$\ln\phi_p(e^2X) \ge \ln\bigl(1 + e^{2p}E|X|^p\bigr) \ge p\min\bigl\{\|X\|_p^p,\ 1\bigr\},$$
and (16) is shown.


From (15) and (16) there immediately follows the result of [4], which states that
$$\Bigl\|\sum X_i\Bigr\|_p \sim \Bigl(\sum E|X_i|^p\Bigr)^{1/p} + \sqrt p\Bigl(\sum EX_i^2\Bigr)^{1/2}$$
for $p \ge 2$ and $(X_i)$ a sequence of independent symmetric random variables with logarithmically convex tails.
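A quick Monte Carlo check of this equivalence (our sketch; the tail $N(t) = \sqrt t$ and all names are illustrative choices):

```python
import numpy as np
from math import gamma, sqrt

rng = np.random.default_rng(2)

# Symmetric X with P(|X| >= t) = exp(-sqrt(t)): N(t) = sqrt(t) is concave,
# so the tails are logarithmically convex. Inverse-CDF: |X| = log(1/U)^2.
p, n, trials = 4, 30, 200_000
u = rng.uniform(size=(trials, n))
x = rng.choice([-1.0, 1.0], size=(trials, n)) * np.log(1.0 / u) ** 2
lhs = np.mean(np.abs(x.sum(axis=1)) ** p) ** (1.0 / p)

# Right side of the equivalence: since |X| is the square of an Exp(1)
# variable, E|X|^r = Gamma(2r + 1); in particular E X^2 = Gamma(5) = 24.
rhs = (n * gamma(2 * p + 1)) ** (1.0 / p) + sqrt(p) * sqrt(n * gamma(5))
print(lhs, rhs, lhs / rhs)
```

The printed ratio should be of order one, reflecting the universal constants hidden in $\sim$.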

Lemma 8. If $X_i$ are independent nonnegative random variables, then for $p \ge 1$ and $c > 0$ we have

(18) $\quad\|(X_i)\|_p \le 2\max\Bigl\{\dfrac{(1+c)^p}{cp}\displaystyle\sum EX_i,\ \bigl(1+c^{-1}\bigr)p^{-1/p}\Bigl(\displaystyle\sum EX_i^p\Bigr)^{1/p}\Bigr\}.$

If $X_i$ are independent symmetric random variables, then we have for $p \ge 3$ and $c \in (0,1)$

(19) $\quad\|(X_i)\|_p \le 2\max\Bigl\{\dfrac{(1+c)^{p/2}}{c\sqrt p}\Bigl(\displaystyle\sum EX_i^2\Bigr)^{1/2},\ \bigl(1+c^{-1}\bigr)p^{-1/p}\Bigl(\displaystyle\sum E|X_i|^p\Bigr)^{1/p}\Bigr\}$

and for $p \in [2,3]$

(20) $\quad\|(X_i)\|_p \le 2\max\Bigl\{\Bigl(\displaystyle\sum EX_i^2\Bigr)^{1/2},\ 2p^{-1/p}\Bigl(\displaystyle\sum E|X_i|^p\Bigr)^{1/p}\Bigr\}.$

Proof. Since the function $(1+x)^p$ is convex for $p \ge 1$, the function $x^{-1}\bigl((1+x)^p - 1\bigr)$ is nondecreasing on $(0,\infty)$. Hence $\varphi_p(x) \le 1 + (1+c)^pc^{-1}x$ for $0 \le x \le c$, and so
$$\varphi_p(x) \le 1 + (1+c)^pc^{-1}x + \bigl(1+c^{-1}\bigr)^px^p \quad\text{for } x \ge 0.$$
Therefore
$$\ln\phi_p(X_i) \le (1+c)^pc^{-1}EX_i + \bigl(1+c^{-1}\bigr)^pEX_i^p,$$
and (18) follows.

To prove the inequalities for symmetric r.v., let us put $f(x) = x^{-2}\bigl((1+x)^p + (1-x)^p - 2\bigr)$ and $g(x) = x^3f'(x)$, for $|x| \le 1$. We have $g(0) = g'(0) = 0$ and
$$g''(x) = p(p-1)(p-2)x\bigl((1+x)^{p-3} - (1-x)^{p-3}\bigr).$$
Hence for $p \ge 3$, $f(x)$ is nondecreasing. Therefore for $c \in (0,1)$ and $|x| \le c$ we have $\tilde\varphi_p(x) - 1 \le f(c)x^2/2 \le c^{-2}(1+c)^px^2$. Therefore
$$\tilde\varphi_p(x) \le 1 + (1+c)^pc^{-2}x^2 + \bigl(1+c^{-1}\bigr)^p|x|^p.$$
As above, this implies (19). If $2 \le p \le 3$, $f(x)$ is nonincreasing; hence for $|x| \le 1$ we have $\tilde\varphi_p(x) \le 1 + px^2$. Therefore for any $x$ we have
$$\tilde\varphi_p(x) \le 1 + px^2 + 2^p|x|^p,$$
and (20) follows.
From Theorems 1 and 2 and Lemma 8 (taking $c = \ln p/p$) we obtain the following result.

Corollary 3. There exists a universal constant $K$ such that if $X_i$ are independent nonnegative random variables and $p \ge 1$, then
$$\Bigl\|\sum X_i\Bigr\|_p \le K\frac{p}{\ln p}\max\Bigl\{\sum EX_i,\ \Bigl(\sum EX_i^p\Bigr)^{1/p}\Bigr\},$$
and if $X_i$ are independent symmetric random variables and $p \ge 2$, then
$$\Bigl\|\sum X_i\Bigr\|_p \le K\frac{p}{\ln p}\max\Bigl\{\Bigl(\sum EX_i^2\Bigr)^{1/2},\ \Bigl(\sum E|X_i|^p\Bigr)^{1/p}\Bigr\}.$$
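To see where the order $p/\ln p$ comes from, here is a sketch of the computation behind the first bound (our reconstruction of the compressed step): take $c = \ln p/p$ in (18). Then
$$(1+c)^p \le e^{cp} = p, \quad\text{so}\quad \frac{(1+c)^p}{cp} \le \frac{p}{\ln p}, \qquad 1 + c^{-1} = 1 + \frac{p}{\ln p} \le \frac{2p}{\ln p},$$
and $p^{-1/p} \le 1$, so (18) yields $\|(X_i)\|_p \le \frac{4p}{\ln p}\max\bigl\{\sum EX_i,\ (\sum EX_i^p)^{1/p}\bigr\}$; combined with the upper bound of Theorem 1 this gives the first inequality with $K = 4e$. The symmetric case follows in the same way from (19), (20) and Theorem 2.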

Remark 4. If we put $c = (2s-1)^{-1}$ in Lemma 8, then Theorem 2 yields the following one-dimensional version of the result of Pinelis (cf. [9] and [10]). For independent symmetric random variables $X_i$ and $p \ge 2$ we have
$$\Bigl\|\sum X_i\Bigr\|_p \le K\min\Bigl\{sA_p + \sqrt s\,e^{p/s}A_2:\ 1 \le s \le p\Bigr\} \sim A_p + \sqrt p\,A_2 + \frac{pA_p}{\ln\bigl(2 + \sqrt p\,A_p/A_2\bigr)},$$
where $A_r = \bigl(\sum E|X_i|^r\bigr)^{1/r}$ and $K$ is a universal constant.

Acknowledgments. This work grew out of useful discussions with Professor Paweł Hitczenko concerning the moments of three-point valued random variables. Remarks of Krzysztof Oleszkiewicz simplified the function $\varphi_p$ and some of the proofs. Finally, this paper would never have appeared without the encouragement of Professor Stanisław Kwapień and his belief in the existence of a simple formula for the moments.

REFERENCES
[1] Gluskin, E. D. and Kwapień, S. (1995). Tail and moment estimates for sums of independent
random variables with logarithmically concave tails. Studia Math. 114 303–309.
[2] Hitczenko, P. (1993). Domination inequality for martingale transforms of Rademacher se-
quence. Israel J. Math. 84 161–178.
[3] Hitczenko, P. and Kwapień, S. (1994). On the Rademacher series. In Probability in Banach
Spaces 9 (J. Hoffmann-Jorgensen, J. Kuelbs and M. B. Marcus, eds.) 31–36. Birkhäuser,
Boston.

[4] Hitczenko, P., Montgomery-Smith, S. J. and Oleszkiewicz, K. (1997). Moment inequalities for linear combinations of certain i.i.d. symmetric random variables. Studia Math. 123 15–42.
[5] Johnson, W. B., Schechtman, G. and Zinn, J. (1985). Best constants in moment inequali-
ties for linear combinations of independent and exchangeable random variables. Ann.
Probab. 13 234–253.
[6] Ledoux, M. and Talagrand, M. (1991). Probability in Banach Spaces. Springer, Berlin.
[7] Montgomery, H. L. and Odlyzko, A. M. (1988). Large deviations of sums of independent
random variables. Acta Arith. 49 425–434.
[8] Montgomery-Smith, S. J. (1990). The distribution of Rademacher sums. Proc. Amer. Math.
Soc. 109 517–522.
[9] Pinelis, I. (1994). Optimum bounds for the distribution of martingales in Banach spaces.
Ann. Probab. 22 1694–1706.
[10] Pinelis, I. (1995). Optimum bounds on moments of sums of independent random vectors.
Siberian Adv. Math. 5 141–150.

Institute of Mathematics
Warsaw University
Banacha 2
02-097 Warszawa
Poland
E-mail: [email protected]
