Let $X$ be a $K\times 1$ random vector with known distribution. Let an $L\times 1$ random vector $Y$ be a function of $X$:
$$Y = g(X)$$
where $g:\mathbb{R}^K \rightarrow \mathbb{R}^L$. How do we derive the distribution of $Y$ from the distribution of $X$?
Although there is no general answer to this question, there are some special cases in which the distribution of $Y$ can be easily derived from the distribution of $X$. We discuss these cases below.

In the cases in which the function $g$ is one-to-one (hence invertible) and the random vector $X$ is either discrete or continuous, there are readily applicable formulae for the distribution of $Y$. We report these formulae below.
When $X$ is a discrete random vector, the joint probability mass function of $Y$ is given by the following proposition.

Proposition (probability mass of a one-to-one function)
Let $X$ be a $K\times 1$ discrete random vector with support $R_X$ and joint probability mass function $p_X(x)$. Let $g:\mathbb{R}^K \rightarrow \mathbb{R}^K$ be one-to-one on the support of $X$. Then, the support of $Y = g(X)$ is
$$R_Y = \{ y = g(x) : x \in R_X \}$$
and its probability mass function $p_Y(y)$ is as follows. If $y \in R_Y$, then
$$p_Y(y) = p_X\left(g^{-1}(y)\right)$$
where $g^{-1}(y)$ is the inverse of $g$. If $y \notin R_Y$, then trivially $p_Y(y) = 0$.
Example
Let $X$ be a $2\times 1$ discrete random vector and denote its components by $X_1$ and $X_2$. Let the support of $X$ be
$$R_X = \left\{ \begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}2\\0\end{bmatrix} \right\}$$
and its joint probability mass function be
$$p_X(x) = \begin{cases} 1/3 & \text{if } x = [1\ \ 1]^{\top} \\ 2/3 & \text{if } x = [2\ \ 0]^{\top} \\ 0 & \text{otherwise} \end{cases}$$
Let
$$Y = g(X) = \begin{bmatrix} X_1 + X_2 \\ X_1 - X_2 \end{bmatrix}$$
The support of $Y$ is
$$R_Y = \left\{ \begin{bmatrix}2\\0\end{bmatrix}, \begin{bmatrix}2\\2\end{bmatrix} \right\}$$
The inverse function $g^{-1}$ is
$$g^{-1}(y) = \begin{bmatrix} (y_1+y_2)/2 \\ (y_1-y_2)/2 \end{bmatrix}$$
The joint probability mass function of $Y$ is
$$p_Y(y) = p_X\left(g^{-1}(y)\right) = \begin{cases} 1/3 & \text{if } y = [2\ \ 0]^{\top} \\ 2/3 & \text{if } y = [2\ \ 2]^{\top} \\ 0 & \text{otherwise} \end{cases}$$
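As an illustration of the proposition, the following sketch (the support, probabilities, and map are my own choices, not taken from the text) pushes a probability mass function through a one-to-one map and its inverse:

```python
# Sketch: pmf of Y = g(X) for a one-to-one g on a discrete support.
# The support and probabilities below are illustrative examples.

# pmf of X, keyed by tuples (x1, x2)
p_X = {(1, 1): 1/3, (2, 0): 2/3}

def g(x):
    x1, x2 = x
    return (x1 + x2, x1 - x2)          # one-to-one on the support above

def g_inv(y):
    y1, y2 = y
    return ((y1 + y2) // 2, (y1 - y2) // 2)

# The support of Y is the image of the support of X under g.
R_Y = {g(x) for x in p_X}

# p_Y(y) = p_X(g^{-1}(y)) for y in R_Y, and 0 otherwise.
p_Y = {y: p_X[g_inv(y)] for y in R_Y}

print(p_Y)
```

Because $g$ is one-to-one on the support, each $y$ pulls back to exactly one $x$, so probabilities transfer without any summation.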
When $X$ is a continuous random vector and $g$ is differentiable, then $Y$ is also continuous, and its joint probability density function is given by the following proposition.

Proposition (density of a one-to-one function)
Let $X$ be a $K\times 1$ continuous random vector with support $R_X$ and joint probability density function $f_X(x)$. Let $g:\mathbb{R}^K \rightarrow \mathbb{R}^K$ be one-to-one and differentiable on the support of $X$. Denote by $J_{g^{-1}}(y)$ the Jacobian matrix of $g^{-1}(y)$, i.e.,
$$J_{g^{-1}}(y) = \begin{bmatrix} \dfrac{\partial g_1^{-1}(y)}{\partial y_1} & \cdots & \dfrac{\partial g_1^{-1}(y)}{\partial y_K} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial g_K^{-1}(y)}{\partial y_1} & \cdots & \dfrac{\partial g_K^{-1}(y)}{\partial y_K} \end{bmatrix}$$
where $g_l^{-1}(y)$ is the $l$-th component of $g^{-1}(y)$ and $y_k$ is the $k$-th component of $y$. Then, the support of $Y = g(X)$ is
$$R_Y = \{ y = g(x) : x \in R_X \}$$
If the determinant of the Jacobian matrix satisfies
$$\det J_{g^{-1}}(y) \neq 0 \quad \text{for all } y \in R_Y$$
then the joint probability density function of $Y$ is
$$f_Y(y) = \begin{cases} f_X\left(g^{-1}(y)\right) \left| \det J_{g^{-1}}(y) \right| & \text{if } y \in R_Y \\ 0 & \text{otherwise} \end{cases}$$
See: Poirier, D. J. (1995) Intermediate statistics and econometrics: a comparative approach, MIT Press.
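The proposition can be sanity-checked numerically: simulate $X$, map it through $g$, and compare an event probability under the simulated $Y$ with the same probability obtained by integrating the density $f_X(g^{-1}(y))\,|\det J_{g^{-1}}(y)|$. A minimal sketch, with a map and densities of my own choosing:

```python
# Monte Carlo sanity check of the change-of-variables formula.
# The example map and densities are illustrative, not from the text.
import math
import random

random.seed(0)

# X = (X1, X2), independent Exp(1): f_X(x) = exp(-x1 - x2) on [0, inf)^2.
# g(x) = (x1 + x2, x1 - x2); the inverse has |det Jacobian| = 1/2, so the
# formula gives f_Y(y) = (1/2) exp(-y1) on the region y1 >= |y2|.

n = 200_000
hits = 0
for _ in range(n):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    y1, y2 = x1 + x2, x1 - x2          # y = g(x)
    if y1 <= 1.0:
        hits += 1
mc_prob = hits / n

# Integrating f_Y over {y1 <= 1, |y2| <= y1}:
# int_0^1 (1/2) e^{-y1} * (2 y1) dy1 = int_0^1 y1 e^{-y1} dy1 = 1 - 2/e
exact = 1 - 2 / math.e
print(mc_prob, exact)
```

Both numbers are close to 0.264, as expected.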
A special case of the above proposition obtains when the function $g$ is a linear one-to-one mapping.

Proposition
Let $X$ be a $K\times 1$ continuous random vector with joint probability density $f_X(x)$. Let $Y$ be a $K\times 1$ random vector such that
$$Y = a + BX$$
where $a$ is a constant $K\times 1$ vector and $B$ is a constant $K\times K$ invertible matrix. Then, $Y$ is a continuous random vector whose probability density function $f_Y(y)$ satisfies
$$f_Y(y) = \frac{1}{\left|\det(B)\right|} f_X\left(B^{-1}(y-a)\right)$$
where $\det(B)$ is the determinant of $B$.

In this case the inverse function is
$$g^{-1}(y) = B^{-1}(y-a)$$
The Jacobian matrix is
$$J_{g^{-1}}(y) = B^{-1}$$
When $y \in R_Y$, the joint density of $Y$ is
$$f_Y(y) = f_X\left(B^{-1}(y-a)\right) \left|\det\left(B^{-1}\right)\right| = \frac{1}{\left|\det(B)\right|} f_X\left(B^{-1}(y-a)\right)$$
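A sketch of the linear case, with illustrative values of $a$ and $B$ chosen by me: if $X$ is uniform on $[0,1]^2$, the proposition says that $f_Y$ is constant and equal to $1/|\det(B)|$ on the image parallelogram.

```python
# Sketch of the linear special case Y = a + B X.
# If X is uniform on [0,1]^2, then f_Y(y) = 1/|det B| on the image
# parallelogram. The values of a and B below are illustrative.
import random

random.seed(1)

a = (1.0, -2.0)
B = ((2.0, 1.0),
     (0.0, 3.0))

det_B = B[0][0] * B[1][1] - B[0][1] * B[1][0]    # = 6.0
density_on_image = 1 / abs(det_B)                 # f_Y is constant = 1/6

# Monte Carlo cross-check: E[Y] must equal a + B . E[X] = a + B . (1/2, 1/2).
n = 100_000
s1 = s2 = 0.0
for _ in range(n):
    x1, x2 = random.random(), random.random()
    s1 += a[0] + B[0][0] * x1 + B[0][1] * x2
    s2 += a[1] + B[1][0] * x1 + B[1][1] * x2
mean_y = (s1 / n, s2 / n)
print(det_B, density_on_image, mean_y)
```

The image of the unit square under $x \mapsto a + Bx$ has area $|\det(B)|$, so a constant density $1/|\det(B)|$ integrates to one over it, consistent with the formula.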
Example
Let $X$ be a $2\times 1$ random vector with support
$$R_X = [0,\infty) \times [0,\infty)$$
and joint probability density function
$$f_X(x) = \begin{cases} \exp(-x_1-x_2) & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$
where $x_1$ and $x_2$ are the two components of $x$. Define a $2\times 1$ random vector $Y$ with components $Y_1$ and $Y_2$ as follows:
$$Y_1 = \exp(-X_1), \qquad Y_2 = \exp(-X_2)$$
The inverse function $g^{-1}$ is defined by
$$x_1 = -\ln(y_1), \qquad x_2 = -\ln(y_2)$$
The Jacobian matrix of $g^{-1}$ is
$$J_{g^{-1}}(y) = \begin{bmatrix} -1/y_1 & 0 \\ 0 & -1/y_2 \end{bmatrix}$$
Its determinant is
$$\det J_{g^{-1}}(y) = \frac{1}{y_1 y_2}$$
The support of $Y_1$ is $(0,1]$. The support of $Y_2$ is $(0,1]$, and the support of $Y$ is
$$R_Y = (0,1] \times (0,1]$$
For $y \in R_Y$, the joint probability density function of $Y$ is
$$f_Y(y) = f_X\left(g^{-1}(y)\right) \left|\det J_{g^{-1}}(y)\right| = \exp\left(\ln y_1 + \ln y_2\right) \frac{1}{y_1 y_2} = y_1 y_2 \cdot \frac{1}{y_1 y_2} = 1$$
while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.
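The transformation $Y = \exp(-X)$ of a unit exponential produces a uniform variable (a one-to-one, differentiable map of the kind covered by the proposition), which is easy to confirm by simulation; sample size and seed below are arbitrary:

```python
# Simulation check: if X ~ Exp(1), then Y = exp(-X) is uniform on (0, 1].
import math
import random

random.seed(2)
n = 100_000
ys = [math.exp(-random.expovariate(1.0)) for _ in range(n)]

mean_y = sum(ys) / n                               # should be near 1/2
frac_below_quarter = sum(y <= 0.25 for y in ys) / n  # should be near 1/4
print(mean_y, frac_below_quarter)
```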
When the components of $X$ are independent and
$$Y = X_1 + X_2 + \ldots + X_K$$
then the distribution of $Y$ can be derived using the convolution formulae illustrated in the lecture entitled Sums of independent random variables.
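The convolution formula $p_Y(k)=\sum_j p_{X_1}(j)\,p_{X_2}(k-j)$ for the sum of two independent discrete variables can be sketched as follows (two fair dice, an example of my own):

```python
# Convolution of the pmfs of two independent fair dice:
# p_Y(k) = sum_j p1(j) * p2(k - j)
p1 = {i: 1/6 for i in range(1, 7)}
p2 = {i: 1/6 for i in range(1, 7)}

p_sum = {}
for j, pj in p1.items():
    for k, pk in p2.items():
        p_sum[j + k] = p_sum.get(j + k, 0.0) + pj * pk

print(p_sum[7])   # 6/36, the most likely total
```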
The joint moment generating function of $Y$, provided it exists, can be computed as
$$M_Y(t) = \operatorname{E}\left[\exp\left(t^{\top} Y\right)\right] = \operatorname{E}\left[\exp\left(t^{\top} g(X)\right)\right]$$
using the transformation theorem. If $M_Y(t)$ is recognized as the joint moment generating function of a known distribution, then such a distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint moment generating function, provided the latter exists).
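For a random variable with finite support, the transformation theorem reduces the expectation to a finite sum, so $M_Y(t)$ can be computed and "recognized" directly. A scalar sketch with an illustrative pmf and map of my own:

```python
# Transformation theorem sketch: M_Y(t) = E[exp(t * g(X))], computed by
# summing over the (finite, illustrative) support of X.
import math

p_X = {0: 0.4, 1: 0.6}    # X ~ Bernoulli(0.6), an illustrative choice

def g(x):
    return 2 * x + 1       # Y = g(X) = 2X + 1

def mgf_Y(t):
    return sum(p * math.exp(t * g(x)) for x, p in p_X.items())

# Recognition step: Y takes the value 1 with prob. 0.4 and 3 with prob. 0.6,
# so its mgf must be M_Y(t) = 0.4 e^t + 0.6 e^{3t}.
t = 0.7
print(mgf_Y(t), 0.4 * math.exp(t) + 0.6 * math.exp(3 * t))
```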
The joint characteristic function of $Y$ can be computed as
$$\varphi_Y(t) = \operatorname{E}\left[\exp\left(i\,t^{\top} Y\right)\right] = \operatorname{E}\left[\exp\left(i\,t^{\top} g(X)\right)\right]$$
using the transformation theorem. If $\varphi_Y(t)$ is recognized as the joint characteristic function of a known distribution, then such a distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint characteristic function).
Below you can find some exercises with explained solutions.
Exercise 1
Let $X$ be a uniform random variable with support $R_X = [0,1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be a continuous random variable, independent of $X$, with support $R_Y = [0,\infty)$ and probability density function
$$f_Y(y) = \begin{cases} \exp(-y) & \text{if } y \in [0,\infty) \\ 0 & \text{otherwise} \end{cases}$$
Let
$$Z = g(X,Y) = \begin{bmatrix} X \\ X + Y \end{bmatrix}$$
Find the joint probability density function of the random vector $Z$.

Solution
Since $X$ and $Y$ are independent, their joint probability density function is equal to the product of their marginal density functions:
$$f_{XY}(x,y) = f_X(x) f_Y(y)$$
The support of $X$ is $[0,1]$ and the support of $Y$ is $[0,\infty)$. The support of $[X\ \ Y]^{\top}$ is
$$R_{XY} = [0,1] \times [0,\infty)$$
The function $g$ is one-to-one and its inverse $g^{-1}$ is defined by
$$x = z_1, \qquad y = z_2 - z_1$$
with Jacobian matrix
$$J_{g^{-1}}(z) = \begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}$$
The determinant of the Jacobian matrix is
$$\det J_{g^{-1}}(z) = 1$$
which is different from zero for any $z$ belonging to $R_Z$. The formula for the joint probability density function of $Z$ is
$$f_Z(z) = f_{XY}\left(g^{-1}(z)\right) \left|\det J_{g^{-1}}(z)\right|$$
and
$$R_Z = \left\{ z \in \mathbb{R}^2 : 0 \le z_1 \le 1,\ z_2 \ge z_1 \right\}$$
which implies
$$f_Z(z) = \begin{cases} \exp(z_1 - z_2) & \text{if } 0 \le z_1 \le 1 \text{ and } z_2 \ge z_1 \\ 0 & \text{otherwise} \end{cases}$$
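A derived joint density of this kind can be checked by simulation. Assuming $X$ uniform on $[0,1]$ and an independent unit exponential $Y$, the density $f_Z(z)=\exp(z_1-z_2)$ on $\{0\le z_1\le 1,\ z_2\ge z_1\}$ integrates to $P(Z_2\le 1)=1/e$, which a simulation reproduces:

```python
# Monte Carlo check of f_Z(z) = exp(z1 - z2) on {0 <= z1 <= 1, z2 >= z1},
# assuming X ~ Uniform[0,1] and Y ~ Exp(1) independent.
import math
import random

random.seed(3)
n = 200_000
hits = 0
for _ in range(n):
    x = random.random()
    y = random.expovariate(1.0)
    z1, z2 = x, x + y                  # z = g(x, y)
    if z2 <= 1.0:
        hits += 1
mc = hits / n

# Integrating the density:
# P(Z2 <= 1) = int_0^1 int_{z1}^{1} e^{z1 - z2} dz2 dz1
#            = int_0^1 (1 - e^{z1 - 1}) dz1 = 1/e
exact = 1 / math.e
print(mc, exact)
```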
Exercise 2
Let $X$ be a $2\times 1$ random vector with support
$$R_X = (0,1] \times (0,1]$$
and joint probability density function
$$f_X(x) = \begin{cases} x_1 + x_2 & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$
where $x_1$ and $x_2$ are the two components of $x$. Define a $2\times 1$ random vector $Y$ with components $Y_1$ and $Y_2$ as follows:
$$Y_1 = X_1^2, \qquad Y_2 = X_2^2$$
Find the joint probability density function of the random vector $Y$.

Solution
The inverse function $g^{-1}$ is defined by
$$x_1 = \sqrt{y_1}, \qquad x_2 = \sqrt{y_2}$$
The Jacobian matrix of $g^{-1}$ is
$$J_{g^{-1}}(y) = \begin{bmatrix} \dfrac{1}{2\sqrt{y_1}} & 0 \\ 0 & \dfrac{1}{2\sqrt{y_2}} \end{bmatrix}$$
Its determinant is
$$\det J_{g^{-1}}(y) = \frac{1}{4\sqrt{y_1 y_2}}$$
The support of $Y_1$ is $(0,1]$. The support of $Y_2$ is $(0,1]$. The support of $Y$ is
$$R_Y = (0,1] \times (0,1]$$
For $y \in R_Y$, the joint probability density function of $Y$ is
$$f_Y(y) = f_X\left(g^{-1}(y)\right) \left|\det J_{g^{-1}}(y)\right| = \left(\sqrt{y_1} + \sqrt{y_2}\right) \frac{1}{4\sqrt{y_1 y_2}} = \frac{1}{4\sqrt{y_2}} + \frac{1}{4\sqrt{y_1}}$$
while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.
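A simulation check of this kind of result, assuming $f_X(x)=x_1+x_2$ on the unit square and $Y_i=X_i^2$: under the derived density $f_Y$, $P(Y_1\le 1/4)=P(X_1\le 1/2)=3/8$, which rejection sampling from $f_X$ reproduces:

```python
# Check by rejection sampling: with f_X(x) = x1 + x2 on (0,1]^2 and
# Y = (X1^2, X2^2), the derived density implies P(Y1 <= 1/4) = 3/8.
import random

random.seed(4)
n = 200_000
hits = accepted = 0
while accepted < n:
    x1, x2 = random.random(), random.random()
    # Rejection step: accept a uniform proposal with probability
    # proportional to the target density x1 + x2 (max value 2).
    if random.random() <= (x1 + x2) / 2:
        accepted += 1
        y1 = x1 * x1                   # first component of Y = g(X)
        if y1 <= 0.25:
            hits += 1
mc = hits / n
print(mc)   # close to 3/8 = 0.375
```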
Please cite as:
Taboga, Marco (2021). "Functions of random vectors and their distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/functions-of-random-vectors.