Appendix A: Review of Vectors and Matrices

In this appendix, we briefly review some algebra and properties of vectors and matrices. No proofs are given as they can be found in standard textbooks on matrices (e.g., Graybill, 1969).

An m × n real-valued matrix is an m × n array of real numbers. For example,

    A = [  2  5  8 ]
        [ -1  3  4 ]

is a 2 × 3 matrix. This matrix has two rows and three columns. In general, an m × n matrix is written as

    A = [a_ij] = [ a_11  a_12  ⋯  a_1n ]
                 [ a_21  a_22  ⋯  a_2n ]
                 [   ⋮     ⋮    ⋱   ⋮  ]
                 [ a_m1  a_m2  ⋯  a_mn ]

The positive integers m and n are the row dimension and column dimension of A. The real number a_ij is referred to as the (i, j)th element of A. In particular, the elements a_ii are the diagonal elements of the matrix.

An m × 1 matrix forms an m-dimensional column vector, and a 1 × n matrix is an n-dimensional row vector. In the literature, a vector often means a column vector. If m = n, then the matrix is a square matrix. If a_ij = 0 for i ≠ j and m = n, then the matrix A is a diagonal matrix. If a_ij = 0 for i ≠ j and a_ii = 1 for all i, then A is the m × m identity matrix, which is commonly denoted by I_m or simply I if the dimension is clear.
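As a quick numerical illustration of these definitions, here is a minimal sketch using NumPy (the choice of NumPy is ours; the text itself is language-neutral):

```python
import numpy as np

# A 3 x 3 diagonal matrix: all off-diagonal elements are zero.
D = np.diag([1.0, 2.0, 3.0])
assert D[0, 1] == 0.0 and D[1, 1] == 2.0

# The 3 x 3 identity matrix: diagonal elements equal to 1.
I = np.eye(3)

# Multiplying by a conformable identity matrix leaves A unchanged.
A = np.array([[2.0, 5.0, 8.0],
              [-1.0, 3.0, 4.0]])   # a 2 x 3 matrix
assert np.allclose(A @ I, A)
```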

The n × m matrix

    A' = [a_ji]

is the transpose of the matrix A. For example,

    [ 2  -1 ]
    [ 5   3 ]    is the transpose of    [  2  5  8 ]
    [ 8   4 ]                           [ -1  3  4 ]

We use the notation A' = [a'_ij] to denote the transpose of A. From the definition, a'_ij = a_ji and (A')' = A. If A' = A, then A is a symmetric matrix.
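These transpose rules can be verified numerically; a short NumPy sketch (NumPy is our illustrative choice, not part of the text):

```python
import numpy as np

A = np.array([[2.0, 5.0, 8.0],
              [-1.0, 3.0, 4.0]])   # 2 x 3

# Transposing swaps the row and column dimensions.
assert A.T.shape == (3, 2)

# (A')' = A
assert np.array_equal(A.T.T, A)

# A symmetric matrix equals its own transpose.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
assert np.array_equal(S, S.T)
```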

Basic Operations

Suppose that A = [a_ij]_(m×n) and C = [c_ij]_(p×q) are two matrices with dimensions given in the subscript. Let b be a real number. Some basic matrix operations are defined next:

  • Addition: A + C = [a_ij + c_ij]_(m×n) if m = p and n = q.
  • Subtraction: A − C = [a_ij − c_ij]_(m×n) if m = p and n = q.
  • Scalar multiplication: bA = [b·a_ij]_(m×n).
  • Multiplication: AC = [Σ_v a_iv c_vj]_(m×q) provided that n = p.

When the dimensions of matrices satisfy the condition for multiplication to take place, the two matrices are said to be conformable. An example of matrix multiplication is

    [  2  1 ] [  1   2 ]   [ 2·1 + 1·(−2)    2·2 + 1·(−1)  ]   [  0   3 ]
    [ -1  3 ] [ -2  -1 ] = [ −1·1 + 3·(−2)  −1·2 + 3·(−1) ] = [ -7  -5 ]

Important rules of matrix operations include (a) (AC)D = A(CD) and (b) AC ≠ CA in general.
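The operations and the two rules above can be checked with a minimal NumPy sketch (NumPy is an assumption of ours; the text does not prescribe software):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [-1.0, 3.0]])
C = np.array([[1.0, 2.0],
              [-2.0, -1.0]])
D = np.array([[0.0, 1.0],
              [1.0, 0.0]])

add = A + C          # elementwise addition (dimensions must match)
scaled = 3.0 * A     # scalar multiplication
prod = A @ C         # matrix multiplication (columns of A = rows of C)

# Rule (a): matrix multiplication is associative.
assert np.allclose((A @ C) @ D, A @ (C @ D))

# Rule (b): matrix multiplication is not commutative in general.
assert not np.array_equal(A @ C, C @ A)
```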

Inverse, Trace, Eigenvalue, and Eigenvector

A square m × m matrix A is nonsingular or invertible if there exists a unique matrix C such that AC = CA = I_m, the m × m identity matrix. In this case, C is called the inverse matrix of A and is denoted by C = A^(−1).

The trace of A is the sum of its diagonal elements [i.e., tr(A) = Σ_i a_ii]. It is easy to see that (a) tr(A + C) = tr(A) + tr(C), (b) tr(A) = tr(A'), and (c) tr(AC) = tr(CA) provided that the two matrices are conformable.

A number λ and an m × 1 vector e, possibly complex valued, are a right eigenvalue and eigenvector pair of the matrix A if Ae = λe. There are m possible eigenvalues for the matrix A. For a real-valued matrix A, complex eigenvalues occur in conjugate pairs. The matrix A is nonsingular if and only if all of its eigenvalues are nonzero. Denote the eigenvalues by {λ_i | i = 1, …, m}; we have tr(A) = Σ_{i=1}^m λ_i. In addition, the determinant of the matrix A can be defined as |A| = Π_{i=1}^m λ_i. For a general definition of determinant of a matrix, see a standard textbook on matrices (e.g., Graybill, 1969).

Finally, the rank of the matrix A is the number of nonzero eigenvalues of the symmetric matrix A'A. Also, for a nonsingular matrix A, |A^(−1)| = 1/|A|.
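The relations between trace, determinant, eigenvalues, and the inverse can be confirmed numerically; a minimal NumPy sketch (our choice of tool):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The inverse satisfies A A^{-1} = A^{-1} A = I.
Ainv = np.linalg.inv(A)
assert np.allclose(A @ Ainv, np.eye(2))

eigvals = np.linalg.eigvals(A)
# tr(A) equals the sum of the eigenvalues.
assert np.isclose(np.trace(A), eigvals.sum())
# |A| equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), eigvals.prod())
# |A^{-1}| = 1 / |A| for a nonsingular A.
assert np.isclose(np.linalg.det(Ainv), 1.0 / np.linalg.det(A))

# Rank: number of nonzero eigenvalues of A'A.
assert np.linalg.matrix_rank(A) == 2
```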

Positive-Definite Matrix

A square m × m matrix A is a positive-definite matrix if (a) A is symmetric and (b) all eigenvalues of A are positive. Alternatively, A is a positive-definite matrix if for any nonzero m-dimensional vector b, we have b'Ab > 0.

Useful properties of a positive-definite matrix A include (a) all eigenvalues of A are real and positive, and (b) the matrix can be decomposed as

    A = PΛP',

where Λ = diag{λ_1, …, λ_m} is a diagonal matrix consisting of all eigenvalues of A and P = [e_1, …, e_m] is an m × m matrix consisting of the m right eigenvectors of A. It is common to write the eigenvalues as λ1 ≥ λ2 ≥ ⋯ ≥ λm and the eigenvectors as e_1, …, e_m such that Ae_i = λ_i e_i and e_i'e_i = 1. In addition, these eigenvectors are orthogonal to each other, namely e_i'e_j = 0 if i ≠ j, provided that the eigenvalues are distinct. The matrix P is an orthogonal matrix and the decomposition is referred to as the spectral decomposition of the matrix A. Consider, for example, the simple 2 × 2 matrix

    A = [ 2  1 ]
        [ 1  2 ]

which is positive definite. Simple calculations show that

    [ 2  1 ] [ 1 ]     [ 1 ]        [ 2  1 ] [  1 ]     [  1 ]
    [ 1  2 ] [ 1 ] = 3 [ 1 ],       [ 1  2 ] [ -1 ] = 1 [ -1 ]

Therefore, 3 and 1 are eigenvalues of A with normalized eigenvectors e_1 = (1/√2)(1, 1)' and e_2 = (1/√2)(1, −1)', respectively. It is easy to verify that the spectral decomposition holds; that is,

    [ 1/√2   1/√2 ] [ 2  1 ] [ 1/√2   1/√2 ]   [ 3  0 ]
    [ 1/√2  −1/√2 ] [ 1  2 ] [ 1/√2  −1/√2 ] = [ 0  1 ]
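The spectral decomposition of this 2 × 2 example can be reproduced with a short NumPy sketch (NumPy is an assumption; any linear algebra library would do):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric matrices: eigenvalues in ascending order,
# eigenvectors returned as orthonormal columns of P.
lam, P = np.linalg.eigh(A)
assert np.allclose(lam, [1.0, 3.0])

# P is orthogonal: P'P = I.
assert np.allclose(P.T @ P, np.eye(2))

# Spectral decomposition: A = P diag(lam) P'.
assert np.allclose(P @ np.diag(lam) @ P.T, A)
```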

For a symmetric matrix A, there exists a lower triangular matrix L with diagonal elements being 1 and a diagonal matrix G such that A = LGL'; see Chapter 1 of Strang (1980). If A is positive definite, then the diagonal elements of G are positive. In this case, we have

    A = LG^(1/2)G^(1/2)L' = (LG^(1/2))(LG^(1/2))' = MM',

where M = LG^(1/2) is again a lower triangular matrix and the square root is taken element by element. Such a decomposition is called the Cholesky decomposition of A. This decomposition shows that a positive-definite matrix A can be diagonalized as

    L^(−1)A(L')^(−1) = L^(−1)A(L^(−1))' = G.

Since L is a lower triangular matrix with unit diagonal elements, L^(−1) is also a lower triangular matrix with unit diagonal elements. Consider again the prior 2 × 2 matrix A. It is easy to verify that

    L = [  1   0 ]        G = [ 2   0  ]
        [ 0.5  1 ],           [ 0  1.5 ]

satisfy A = LGL'. In addition,

    M = LG^(1/2) = [  √2     0  ]
                   [ √2/2  √1.5 ]

and A = MM'.
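The Cholesky factor M, and from it the unit-diagonal L and diagonal G, can be recovered numerically; a minimal sketch using NumPy (the recovery of L and G from M is our own illustration, assuming a positive-definite A):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# NumPy returns the lower triangular factor M with A = M M'.
M = np.linalg.cholesky(A)
assert np.allclose(M @ M.T, A)

# Recover L (unit diagonal) and G from M = L G^{1/2}:
d = np.diag(M)          # square roots of the diagonal of G
L = M / d               # divide column j by d[j], so diag(L) = 1
G = np.diag(d ** 2)

assert np.allclose(np.diag(L), 1.0)
assert np.allclose(L @ G @ L.T, A)
```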

Vectorization and Kronecker Product

Writing an m × n matrix A in its columns as A = [a_1, …, a_n], we define the stacking operation as vec(A) = (a_1', a_2', …, a_n')', which is an mn × 1 vector. For two matrices A_(m×n) and C_(p×q), the Kronecker product between A and C is

    A ⊗ C = [ a_11 C  a_12 C  ⋯  a_1n C ]
            [ a_21 C  a_22 C  ⋯  a_2n C ]
            [   ⋮       ⋮     ⋱    ⋮   ]
            [ a_m1 C  a_m2 C  ⋯  a_mn C ]

which is an mp × nq matrix. For example, assume that

    A = [  2  1 ]        C = [  4  -1  3 ]
        [ -1  3 ],           [ -2   5  2 ]

Then vec(A) = (2, −1, 1, 3)', vec(C) = (4, −2, −1, 5, 3, 2)', and

    A ⊗ C = [  8  -2   6    4  -1   3 ]
            [ -4  10   4   -2   5   2 ]
            [ -4   1  -3   12  -3   9 ]
            [  2  -5  -2   -6  15   6 ]
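This example can be reproduced directly; a short NumPy sketch (note that vec, column stacking, corresponds to Fortran-order flattening in NumPy):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [-1.0, 3.0]])
C = np.array([[4.0, -1.0, 3.0],
              [-2.0, 5.0, 2.0]])

# vec stacks the columns of a matrix into one long vector.
vecA = A.flatten(order="F")
assert np.array_equal(vecA, [2.0, -1.0, 1.0, 3.0])

# The Kronecker product of a 2x2 and a 2x3 matrix is 4x6.
K = np.kron(A, C)
assert K.shape == (4, 6)
# Its top-left p x q block is a_11 * C.
assert np.allclose(K[:2, :3], 2.0 * C)
```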

Assuming that the dimensions are appropriate, we have the following useful properties for the two operators:

1. A ⊗ C ≠ C ⊗ A in general.

2. (A ⊗ C)' = A' ⊗ C'.

3. A ⊗ (C + D) = A ⊗ C + A ⊗ D.

4. (A ⊗ C)(D ⊗ F) = (AD) ⊗ (CF).

5. If A and C are invertible, then (A ⊗ C)^(−1) = A^(−1) ⊗ C^(−1).

6. For square matrices A and C, tr(A ⊗ C) = tr(A)tr(C).

7. vec(A + C) = vec(A) + vec(C).

8. vec(ABC) = (C' ⊗ A)vec(B).

9. tr(AC) = vec(C')'vec(A) = vec(A')'vec(C).

10.

    tr(ABC) = vec(A')'(C' ⊗ I)vec(B) = vec(A')'(I ⊗ B)vec(C)
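Several of these identities are easy to verify on random matrices; a minimal NumPy sketch (the random test matrices are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))

def vec(M):
    """Column-stacking operator: Fortran-order flattening."""
    return M.flatten(order="F")

# Property 6: tr(A kron C) = tr(A) tr(C).
assert np.isclose(np.trace(np.kron(A, C)), np.trace(A) * np.trace(C))

# Property 8: vec(ABC) = (C' kron A) vec(B).
assert np.allclose(vec(A @ B @ C), np.kron(C.T, A) @ vec(B))

# Property 9: tr(AC) = vec(C')' vec(A).
assert np.isclose(np.trace(A @ C), vec(C.T) @ vec(A))
```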

In multivariate statistical analysis, we often deal with symmetric matrices. It is therefore convenient to generalize the stacking operation to the half-stacking operation, which consists of elements on or below the main diagonal. Specifically, for a symmetric k × k matrix A, define

    vech(A) = (a_1', a_2*', …, a_k*')',

where a_1 is the first column of A, and a_i* = (a_ii, a_{i+1,i}, …, a_ki)' is a (k − i + 1)-dimensional vector. The dimension of vech(A) is k(k + 1)/2. For example, suppose that k = 3. Then we have vech(A) = (a_11, a_21, a_31, a_22, a_32, a_33)', which is a six-dimensional vector.
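NumPy has no built-in half-stacking operator, so a minimal sketch of one (the helper name `vech` follows the text; its implementation is ours):

```python
import numpy as np

def vech(A):
    """Half-stacking: elements on or below the main diagonal, column by column."""
    k = A.shape[0]
    return np.concatenate([A[i:, i] for i in range(k)])

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])   # symmetric, k = 3

v = vech(A)
assert v.shape == (6,)            # k(k+1)/2 = 6
assert np.array_equal(v, [1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
```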
