The joint moment generating function (joint mgf) is a multivariate generalization of the moment generating function.
Similarly to the univariate case, a joint mgf uniquely determines the joint distribution of its associated random vector, and it can be used to derive the cross-moments of the distribution by partial differentiation.
If you are not familiar with the univariate concept, you are advised to first read the lecture on moment generating functions.
Let us start with a formal definition.
Definition Let $X$ be a $K\times 1$ random vector. If the expected value
$$\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(t_{1}X_{1}+\cdots+t_{K}X_{K}\right)\right]$$
exists and is finite for all real vectors $t$ belonging to a closed rectangle $H$:
$$H=\left[-h_{1},h_{1}\right]\times\cdots\times\left[-h_{K},h_{K}\right]\subseteq\mathbb{R}^{K}$$
with $h_{i}>0$ for all $i=1,\ldots,K$, then we say that $X$ possesses a joint moment generating function, and the function $M_{X}:H\to\mathbb{R}$ defined by
$$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]$$
is called the joint moment generating function of $X$.
Not all random vectors possess a joint mgf. However, all random vectors possess a joint characteristic function, a transform that enjoys properties similar to those enjoyed by the joint mgf.
As an example, we derive the joint mgf of a standard multivariate normal random vector.
Example Let $X$ be a $K\times 1$ standard multivariate normal random vector. Its support is
$$R_{X}=\mathbb{R}^{K}$$
and its joint probability density function is
$$f_{X}(x)=(2\pi)^{-K/2}\exp\left(-\tfrac{1}{2}x^{\top}x\right)$$
As explained in the lecture entitled Multivariate normal distribution, the components of $X$ are mutually independent standard normal random variables, because the joint probability density function of $X$ can be written as
$$f_{X}(x)=\prod_{i=1}^{K}f\left(x_{i}\right)$$
where $x_{i}$ is the $i$-th entry of $x$ and $f$ is the probability density function of a standard normal random variable:
$$f\left(x_{i}\right)=\frac{1}{\sqrt{2\pi}}\exp\left(-\tfrac{1}{2}x_{i}^{2}\right)$$
Therefore, the joint mgf of $X$ can be derived as follows:
$$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(\sum_{i=1}^{K}t_{i}X_{i}\right)\right]=\mathrm{E}\left[\prod_{i=1}^{K}\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}\mathrm{E}\left[\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}M_{X_{i}}\left(t_{i}\right)$$
where we have used the fact that the expected value of a product of mutually independent random variables equals the product of their expected values. Since the mgf of a standard normal random variable is
$$M_{X_{i}}\left(t_{i}\right)=\exp\left(\tfrac{1}{2}t_{i}^{2}\right)$$
and is defined for any $t_{i}\in\mathbb{R}$, the joint mgf
$$M_{X}(t)=\prod_{i=1}^{K}\exp\left(\tfrac{1}{2}t_{i}^{2}\right)=\exp\left(\tfrac{1}{2}t^{\top}t\right)$$
is defined for any $t\in\mathbb{R}^{K}$.
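The closed-form mgf derived above can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the sample size and the evaluation point are arbitrary choices, not part of the derivation. It compares a Monte Carlo estimate of $\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]$ with $\exp\left(\tfrac{1}{2}t^{\top}t\right)$ in the bivariate case:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(size=(1_000_000, 2))  # draws from a standard bivariate normal

t = np.array([0.3, -0.5])                     # arbitrary evaluation point
empirical = np.mean(np.exp(X @ t))            # sample analogue of E[exp(t'X)]
theoretical = np.exp(0.5 * t @ t)             # exp(t't / 2)
print(empirical, theoretical)                 # the two numbers should nearly coincide
```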
The next proposition shows how the joint mgf can be used to derive the cross-moments of a random vector.
Proposition If a random vector $X$ possesses a joint mgf $M_{X}(t)$, then $X$ possesses finite cross-moments of order $n$, for any $n\in\mathbb{N}$. Furthermore, if we define a cross-moment of order $n$ as
$$\mu_{X}\left(n_{1},\ldots,n_{K}\right)=\mathrm{E}\left[X_{1}^{n_{1}}X_{2}^{n_{2}}\cdots X_{K}^{n_{K}}\right]$$
where $n_{1},\ldots,n_{K}\in\mathbb{Z}_{+}$ and $n=n_{1}+\cdots+n_{K}$, then
$$\mu_{X}\left(n_{1},\ldots,n_{K}\right)=\left.\frac{\partial^{n}M_{X}(t)}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\right|_{t=0}$$
where the derivative on the right-hand side is the $n$-th order partial derivative of $M_{X}(t)$ evaluated at the point $t=0$.
We do not provide a rigorous proof of this proposition (see, e.g., Pfeiffer 1978 and DasGupta 2010). The main intuition, however, is quite simple: since differentiation is a linear operation and the expected value is a linear operator, we can differentiate through the expected value, provided appropriate technical conditions (omitted here) are satisfied:
$$\frac{\partial^{n}M_{X}(t)}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}=\frac{\partial^{n}}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\frac{\partial^{n}}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\exp\left(t^{\top}X\right)\right]$$
Evaluating this derivative at the point $t=0$, we obtain
$$\left.\frac{\partial^{n}M_{X}(t)}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\right|_{t=0}=\mathrm{E}\left[X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\right]=\mu_{X}\left(n_{1},\ldots,n_{K}\right)$$
The following example shows how this proposition can be applied.
Example Let us continue with the previous example. The joint mgf of a standard multivariate normal random vector is
$$M_{X}(t)=\exp\left(\tfrac{1}{2}t^{\top}t\right)$$
The second cross-moment $\mathrm{E}\left[X_{1}X_{2}\right]$ can be computed by taking the second cross-partial derivative of the joint mgf:
$$\mathrm{E}\left[X_{1}X_{2}\right]=\left.\frac{\partial^{2}M_{X}(t)}{\partial t_{1}\partial t_{2}}\right|_{t=0}=\left.t_{1}t_{2}\exp\left(\tfrac{1}{2}t^{\top}t\right)\right|_{t=0}=0$$
as expected, since the components of $X$ are mutually independent with zero means.
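The differentiation in this example can also be carried out symbolically. Below is a minimal sketch with SymPy; the two-dimensional case is chosen purely for concreteness:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = sp.exp((t1**2 + t2**2) / 2)     # joint mgf of a standard bivariate normal vector

cross = sp.diff(M, t1, t2)          # second cross-partial derivative of the mgf
print(sp.simplify(cross))           # t1*t2*exp(t1**2/2 + t2**2/2)
print(cross.subs({t1: 0, t2: 0}))   # 0, the second cross-moment E[X1*X2]
```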
One of the most important properties of the joint mgf is that it completely characterizes the joint distribution of a random vector.
Proposition Let $X$ and $Y$ be two $K\times 1$ random vectors possessing joint mgfs $M_{X}(t)$ and $M_{Y}(t)$. Denote by $F_{X}(x)$ and $F_{Y}(y)$ their joint distribution functions. Then $X$ and $Y$ have the same joint distribution, that is, $F_{X}=F_{Y}$, if and only if they have the same joint mgfs:
$$M_{X}(t)=M_{Y}(t)$$
for every $t$ belonging to a closed rectangle on which both mgfs are well-defined.
The reader may refer to Feller (2008) for a rigorous proof. The informal proof given here is almost identical to that given for the univariate case; we confine our attention to the case in which $X$ and $Y$ are discrete random vectors taking only finitely many values. As far as the left-to-right direction of the implication is concerned, it suffices to note that if $X$ and $Y$ have the same distribution, then
$$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(t^{\top}Y\right)\right]=M_{Y}(t)$$
The right-to-left direction of the implication is proved as follows. Denote by $R_{X}$ and $R_{Y}$ the supports of $X$ and $Y$, and by $p_{X}(x)$ and $p_{Y}(y)$ their joint probability mass functions. Define the union of the two supports:
$$R=R_{X}\cup R_{Y}$$
and denote its members by $x_{1},\ldots,x_{m}$. The joint mgf of $X$ can be written as
$$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\sum_{j=1}^{m}p_{X}\left(x_{j}\right)\exp\left(t^{\top}x_{j}\right)$$
By the same line of reasoning, the joint mgf of $Y$ can be written as
$$M_{Y}(t)=\sum_{j=1}^{m}p_{Y}\left(x_{j}\right)\exp\left(t^{\top}x_{j}\right)$$
If $X$ and $Y$ have the same joint mgf, then
$$M_{X}(t)=M_{Y}(t)$$
for any $t$ belonging to a closed rectangle where the two mgfs are well-defined, and
$$\sum_{j=1}^{m}p_{X}\left(x_{j}\right)\exp\left(t^{\top}x_{j}\right)=\sum_{j=1}^{m}p_{Y}\left(x_{j}\right)\exp\left(t^{\top}x_{j}\right)$$
Rearranging terms, we obtain
$$\sum_{j=1}^{m}\left[p_{X}\left(x_{j}\right)-p_{Y}\left(x_{j}\right)\right]\exp\left(t^{\top}x_{j}\right)=0$$
Because the functions $t\mapsto\exp\left(t^{\top}x_{j}\right)$ associated with distinct points $x_{j}$ are linearly independent, this equality can hold for every $t$ only if
$$p_{X}\left(x_{j}\right)=p_{Y}\left(x_{j}\right)$$
for every $j$. As a consequence, the joint probability mass functions of $X$ and $Y$ are equal, which implies that their joint distribution functions are also equal.
This proposition is used very often in applications where one needs to demonstrate that two joint distributions are equal. In such applications, proving equality of the joint moment generating functions is often much easier than proving equality of the joint distribution functions.
The following sections contain more details about the joint mgf.
Joint mgf of a linear transformation

Let $X$ be a $K\times 1$ random vector possessing joint mgf $M_{X}(t)$. Define
$$Y=A+BX$$
where $A$ is an $L\times 1$ constant vector and $B$ is an $L\times K$ constant matrix. Then the random vector $Y$ possesses a joint mgf $M_{Y}(t)$, and
$$M_{Y}(t)=\exp\left(t^{\top}A\right)M_{X}\left(B^{\top}t\right)$$
Using the definition of joint mgf, we get
$$M_{Y}(t)=\mathrm{E}\left[\exp\left(t^{\top}Y\right)\right]=\mathrm{E}\left[\exp\left(t^{\top}A+t^{\top}BX\right)\right]=\exp\left(t^{\top}A\right)\mathrm{E}\left[\exp\left(\left(B^{\top}t\right)^{\top}X\right)\right]=\exp\left(t^{\top}A\right)M_{X}\left(B^{\top}t\right)$$
If $M_{X}(t)$ is defined on a closed rectangle $H$, then $M_{Y}(t)$ is defined on another closed rectangle whose shape and location depend on $A$ and $B$.
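As an illustration, the transformation rule can be verified symbolically for a standard bivariate normal $X$ (so that $M_{X}(s)=\exp\left(\tfrac{1}{2}s^{\top}s\right)$), in which case $Y=A+BX$ is normal with mean $A$ and covariance $BB^{\top}$, whose mgf is known in closed form. A minimal SymPy sketch; the specific $A$ and $B$ are illustrative assumptions only:

```python
import sympy as sp

t = sp.Matrix(sp.symbols('t1 t2'))
A = sp.Matrix([1, -2])                         # constant 2x1 vector (illustrative)
B = sp.Matrix([[2, 0], [1, 3]])                # constant 2x2 matrix (illustrative)

def M_X(s):
    # joint mgf of a standard bivariate normal vector: exp(s's / 2)
    return sp.exp((s.T * s)[0, 0] / 2)

lhs = sp.exp((t.T * A)[0, 0]) * M_X(B.T * t)   # exp(t'A) * M_X(B't)

# Y = A + BX is normal with mean A and covariance BB', so its mgf is:
rhs = sp.exp((t.T * A)[0, 0] + (t.T * B * B.T * t)[0, 0] / 2)
print(sp.simplify(lhs - rhs))                  # 0
```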
Joint mgf of a random vector with independent entries

Let $X$ be a $K\times 1$ random vector whose entries $X_{1}$, ..., $X_{K}$ are mutually independent random variables, each possessing an mgf. Denote the mgf of the $i$-th entry of $X$ by $M_{X_{i}}\left(t_{i}\right)$. Then the joint mgf of $X$ is
$$M_{X}(t)=\prod_{i=1}^{K}M_{X_{i}}\left(t_{i}\right)$$
This fact is demonstrated as follows:
$$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\prod_{i=1}^{K}\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}\mathrm{E}\left[\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}M_{X_{i}}\left(t_{i}\right)$$
where the second-to-last equality follows from the mutual independence of $X_{1}$, ..., $X_{K}$.
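As an illustration of the product rule, consider (purely as an assumed example) a vector whose first entry is standard normal and whose second entry is exponential with rate 1, independent of the first. The factor mgfs are $\exp\left(\tfrac{1}{2}t_{1}^{2}\right)$ and $1/\left(1-t_{2}\right)$ for $t_{2}<1$, so the joint mgf should be their product. A minimal Monte Carlo sketch; the sample size and evaluation point are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X1 = rng.standard_normal(n)                   # standard normal entry
X2 = rng.exponential(scale=1.0, size=n)       # independent exponential(1) entry

t1, t2 = 0.4, 0.3                             # arbitrary point with t2 < 1
empirical = np.mean(np.exp(t1 * X1 + t2 * X2))
theoretical = np.exp(t1**2 / 2) / (1 - t2)    # product of the two factor mgfs
print(empirical, theoretical)                 # the two numbers should nearly coincide
```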
Joint mgf of a sum of mutually independent random vectors

Let $X_{1}$, ..., $X_{n}$ be mutually independent random vectors, all of dimension $K\times 1$, each possessing a joint mgf. Let $Z$ be their sum:
$$Z=\sum_{i=1}^{n}X_{i}$$
Then the joint mgf of $Z$ is the product of the joint mgfs of $X_{1}$, ..., $X_{n}$:
$$M_{Z}(t)=\prod_{i=1}^{n}M_{X_{i}}(t)$$
This fact descends from the properties of mutually independent random vectors and from the definition of joint mgf:
$$M_{Z}(t)=\mathrm{E}\left[\exp\left(t^{\top}Z\right)\right]=\mathrm{E}\left[\exp\left(t^{\top}\sum_{i=1}^{n}X_{i}\right)\right]=\mathrm{E}\left[\prod_{i=1}^{n}\exp\left(t^{\top}X_{i}\right)\right]=\prod_{i=1}^{n}\mathrm{E}\left[\exp\left(t^{\top}X_{i}\right)\right]=\prod_{i=1}^{n}M_{X_{i}}(t)$$
where the fourth equality follows from the mutual independence of $X_{1}$, ..., $X_{n}$.
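A quick numerical illustration, using an assumed example: the sum of three independent standard bivariate normal vectors, whose joint mgfs were derived above, should have joint mgf $\exp\left(\tfrac{1}{2}t^{\top}t\right)^{3}$. A minimal sketch with arbitrary constants:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
# sum of three independent standard bivariate normal vectors
Z = sum(rng.standard_normal((n, 2)) for _ in range(3))

t = np.array([0.2, 0.1])
empirical = np.mean(np.exp(Z @ t))
theoretical = np.exp(0.5 * t @ t) ** 3        # product of the three joint mgfs
print(empirical, theoretical)                 # the two numbers should nearly coincide
```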
Solved exercises

Some solved exercises on joint moment generating functions can be found below.
Exercise 1

Let $X$ be a $2\times 1$ discrete random vector and denote its components by $X_{1}$ and $X_{2}$. Let the support of $X$ be a finite set $R_{X}\subset\mathbb{R}^{2}$ and let its joint probability mass function $p_{X}(x)$ assign a strictly positive probability to each point of $R_{X}$. Derive the joint moment generating function of $X$, if it exists.
By the definition of joint moment generating function, we have
$$M_{X}\left(t_{1},t_{2}\right)=\mathrm{E}\left[\exp\left(t_{1}X_{1}+t_{2}X_{2}\right)\right]=\sum_{x\in R_{X}}p_{X}(x)\exp\left(t_{1}x_{1}+t_{2}x_{2}\right)$$
The joint moment generating function exists and is well-defined because the above expected value, being a finite sum of exponentials, exists for any $\left(t_{1},t_{2}\right)\in\mathbb{R}^{2}$.
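In code, this computation is just a finite sum over the support. The sketch below uses a hypothetical support and pmf (the exercise's specific numbers are not reproduced here); the same pattern applies to any finite discrete vector:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')

# hypothetical support points and probabilities, for illustration only
support = [(1, 1), (2, 0), (0, 3)]
probs = [sp.Rational(1, 2), sp.Rational(1, 4), sp.Rational(1, 4)]

# M_X(t1, t2) = sum over the support of p_X(x) * exp(t1*x1 + t2*x2)
M = sum(p * sp.exp(t1 * x1 + t2 * x2) for p, (x1, x2) in zip(probs, support))
print(M)  # a finite sum of exponentials, hence defined for all (t1, t2)
```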
Exercise 2

Let $X=\left[X_{1}\ X_{2}\right]^{\top}$ be a $2\times 1$ random vector with joint moment generating function $M_{X}\left(t_{1},t_{2}\right)$. Derive the expected value of $X_{1}$.
The moment generating function of $X_{1}$ is obtained by setting $t_{2}=0$ in the joint moment generating function:
$$M_{X_{1}}\left(t_{1}\right)=M_{X}\left(t_{1},0\right)$$
The expected value of $X_{1}$ is obtained by taking the first derivative of its moment generating function and evaluating it at $t_{1}=0$:
$$\mathrm{E}\left[X_{1}\right]=\left.\frac{dM_{X_{1}}\left(t_{1}\right)}{dt_{1}}\right|_{t_{1}=0}$$
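The following SymPy sketch runs through the same steps with a hypothetical joint mgf standing in for the one given in the exercise; it corresponds to a vector equal to $\left[0\ 0\right]^{\top}$ with probability $1/3$ and to $\left[1\ 1\right]^{\top}$ with probability $2/3$:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = sp.Rational(1, 3) + sp.Rational(2, 3) * sp.exp(t1 + t2)  # hypothetical joint mgf

M_X1 = M.subs(t2, 0)                    # mgf of X1: M_X(t1, 0)
EX1 = sp.diff(M_X1, t1).subs(t1, 0)     # E[X1] = first derivative at zero
print(EX1)                              # 2/3
```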
Exercise 3

Let $X=\left[X_{1}\ X_{2}\right]^{\top}$ be a $2\times 1$ random vector with joint moment generating function $M_{X}\left(t_{1},t_{2}\right)$. Derive the covariance between $X_{1}$ and $X_{2}$.
We can use the following covariance formula:
$$\mathrm{Cov}\left[X_{1},X_{2}\right]=\mathrm{E}\left[X_{1}X_{2}\right]-\mathrm{E}\left[X_{1}\right]\mathrm{E}\left[X_{2}\right]$$
The moment generating function of $X_{1}$ is
$$M_{X_{1}}\left(t_{1}\right)=M_{X}\left(t_{1},0\right)$$
and the expected value of $X_{1}$ is obtained by taking the first derivative of this moment generating function and evaluating it at $t_{1}=0$:
$$\mathrm{E}\left[X_{1}\right]=\left.\frac{dM_{X_{1}}\left(t_{1}\right)}{dt_{1}}\right|_{t_{1}=0}$$
Similarly, the moment generating function of $X_{2}$ is
$$M_{X_{2}}\left(t_{2}\right)=M_{X}\left(0,t_{2}\right)$$
and the expected value of $X_{2}$ is
$$\mathrm{E}\left[X_{2}\right]=\left.\frac{dM_{X_{2}}\left(t_{2}\right)}{dt_{2}}\right|_{t_{2}=0}$$
The second cross-moment of $X$ is computed by taking the second cross-partial derivative of the joint moment generating function and evaluating it at $t_{1}=t_{2}=0$:
$$\mathrm{E}\left[X_{1}X_{2}\right]=\left.\frac{\partial^{2}M_{X}\left(t_{1},t_{2}\right)}{\partial t_{1}\partial t_{2}}\right|_{t_{1}=0,t_{2}=0}$$
Plugging these three quantities into the covariance formula yields $\mathrm{Cov}\left[X_{1},X_{2}\right]$.
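The same computation can be scripted. The sketch below reuses the hypothetical joint mgf from the previous exercise (not the expression originally given here) and evaluates the three ingredients of the covariance formula:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = sp.Rational(1, 3) + sp.Rational(2, 3) * sp.exp(t1 + t2)  # hypothetical joint mgf

EX1 = sp.diff(M.subs(t2, 0), t1).subs(t1, 0)     # E[X1]
EX2 = sp.diff(M.subs(t1, 0), t2).subs(t2, 0)     # E[X2]
EX1X2 = sp.diff(M, t1, t2).subs({t1: 0, t2: 0})  # E[X1*X2]
print(EX1X2 - EX1 * EX2)                         # covariance: 2/9
```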
References

DasGupta, A. (2010) Fundamentals of probability: a first course, Springer.
Feller, W. (2008) An introduction to probability theory and its applications, Volume 2, Wiley.
Pfeiffer, P. E. (1978) Concepts of probability theory, Dover Publications.