applet-magic.com Thayer Watkins Silicon Valley & Tornado Alley USA

The Expected Value of Sample Maximums as a Function of Sample Size

Let x be a stochastic variable with a probability density function p(x) and a cumulative probability distribution
function P(x). Thus the probability of an observation being less than or equal to x is P(x). Assume that P(x) is
strictly increasing wherever 0 < P(x) < 1, so that it has an inverse function P^{-1}(z). Note that
P^{-1}(½)=x_{median}. Let x_{max} be defined to be the lowest value of x such that P(x)=1. Likewise x_{min}
is the largest x such that P(x)=0. Note that x_{max} and x_{min} may or may not be finite.

The probability density function for the sample maximum of a sample of size n, q_{n}(x), is obtained as follows:
q_{n}(x)dx is the probability of getting (n-1) observations which are less than or equal to x and one observation
in the interval [x, x+dx]. The extreme observation can occur at any one of n places in the sample. Thus the
probability density is

q_{n}(x) = n[P(x)]^{n-1}p(x)
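As a sanity check on this formula, consider the uniform distribution on [0, 1] (chosen here purely for illustration; it is not discussed in the text). There P(x) = x and p(x) = 1, so q_{n}(x) = n x^{n-1} and the expected sample maximum is ∫_0^1 x·n x^{n-1}dx = n/(n+1). A minimal Monte Carlo sketch comparing simulation against that closed form:

```python
# Check of q_n(x) = n[P(x)]^{n-1} p(x) for the uniform distribution
# on [0, 1]: there q_n(x) = n x^{n-1}, so E[sample max] = n/(n+1).
import random

def sample_max_mean(n, trials=200_000, seed=0):
    """Monte Carlo estimate of the expected maximum of n uniform(0,1) draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.random() for _ in range(n))
    return total / trials

def exact_mean(n):
    """Closed form for the uniform case: ∫_0^1 x · n x^{n-1} dx = n/(n+1)."""
    return n / (n + 1)

for n in (2, 5, 10):
    print(n, round(sample_max_mean(n), 4), round(exact_mean(n), 4))
```

The simulated means agree with n/(n+1) to within Monte Carlo error, which supports the density formula in this case.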

Note that

q_{n}(x)dx = n[P(x)]^{n-1}p(x)dx = d[P(x)^{n}]

The expected value of the sample maximum is

M_{n} = ∫_{-∞}^{∞} xq_{n}(x)dx

Let z=[P(x)]^{n}, so that x=P^{-1}(z^{1/n}) and dz = n[P(x)]^{n-1}p(x)dx = q_{n}(x)dx. Changing the variable of
integration in the above expression to z then results in

M_{n} = ∫_{0}^{1} P^{-1}(z^{1/n})dz
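This change-of-variable formula can be checked numerically. For the unit exponential distribution (an illustrative choice, not one used in the text), P(x) = 1 − exp(−x) and hence P^{-1}(u) = −ln(1 − u); the expected maximum of n independent unit exponentials is known to be the harmonic number H_{n} = 1 + 1/2 + … + 1/n. A rough midpoint-rule sketch of the integral:

```python
# Check of M_n = ∫_0^1 P^{-1}(z^{1/n}) dz for the unit exponential
# distribution, where P^{-1}(u) = -ln(1 - u).  The expected maximum
# of n iid unit exponentials equals the harmonic number H_n.
import math

def M_n(n, steps=100_000):
    """Midpoint-rule evaluation of ∫_0^1 -ln(1 - z^(1/n)) dz."""
    h = 1.0 / steps
    total = 0.0
    for k in range(steps):
        z = (k + 0.5) * h          # midpoint of the k-th subinterval
        total += -math.log(1.0 - z ** (1.0 / n))
    return total * h

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (1, 3, 5):
    print(n, round(M_n(n), 4), round(harmonic(n), 4))
```

The integrand has an integrable logarithmic singularity at z = 1, which the midpoint rule handles well enough for a few digits of agreement with H_{n}.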

Now consider the limit of M_{n} as n increases without bound and note that the limit of a continuous function of a
variable is equal to the function of the limit of the variable. Since lim_{n→∞} z^{1/n} = 1 for 0 < z ≤ 1, it
follows that

lim_{n→∞} M_{n} = ∫_{0}^{1} P^{-1}(1)dz = x_{max}
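For the uniform distribution on [0, 1] (again purely illustrative), P^{-1}(u) = u and M_{n} = ∫_0^1 z^{1/n}dz = n/(n+1), which makes the approach of M_{n} to x_{max} = 1 explicit:

```python
# Illustration of M_n → x_max using the uniform distribution on [0, 1]:
# there P^{-1}(u) = u, so M_n = ∫_0^1 z^(1/n) dz = n/(n+1) → 1 = x_max.
def M_n_uniform(n):
    """Closed form of ∫_0^1 z^(1/n) dz for the uniform case."""
    return n / (n + 1)

for n in (1, 10, 100, 1000):
    print(n, M_n_uniform(n))
```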

Now consider the characteristic function of the distribution of the sample maximum,

Φ_{n}(ω) = ∫_{-∞}^{∞} exp(−iωx)q_{n}(x)dx = ∫_{-∞}^{∞} exp(−iωx)d[P(x)^{n}]

Integrating by parts, and handling the boundary terms formally using lim_{x→∞}P(x)=1 and lim_{x→−∞}P(x)=0,
the above reduces to

Φ_{n}(ω) = iω ∫_{-∞}^{∞} exp(−iωx)[P(x)]^{n}dx

The integral on the right is the characteristic function of the n-th power of P(x), which can be expressed as the
n-fold convolution product of the characteristic function of P(x). The characteristic function of P(x) is the
characteristic function of p(x) divided by iω.