A.3. Matrices 903
However, for sizes larger than 4 × 4 there are no simple formulae, and Cramer's rule also becomes infeasible for such matrices. Gaussian elimination is then the method of preference, and it can be used to solve u = Mv ⇒ v = M^{-1}u, that is, an implicit solution, as is the case for Cramer's rule. However, Gaussian elimination can also be used to compute the matrix M^{-1} explicitly. Consider the system in Equation A.39, where u and v are arbitrary vectors of the same dimension as M and I (the identity matrix):

Mu = Iv. (A.39)

Performing Gaussian elimination on this system until M has been transformed into the identity matrix I means that the right-hand identity matrix has become the inverse M^{-1}. Thus, u and v are, in fact, not of any particular use; they are merely a means for expressing Equation A.39 in a mathematically sound way.
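As an illustration of the explicit approach, the following is a minimal sketch of Gauss-Jordan elimination on the augmented system [M | I]. The function name `invert` and the singularity tolerance are our own choices for this example, not from the text:

```python
def invert(M):
    """Invert a square matrix via Gauss-Jordan elimination.

    Augment M with the identity I and row-reduce until the left half
    becomes I; the right half is then M^{-1} (cf. Equation A.39).
    """
    n = len(M)
    # Build the augmented matrix [M | I].
    A = [list(row) + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        # Partial pivoting: pick the remaining row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        A[col], A[pivot] = A[pivot], A[col]
        # Scale the pivot row so the pivot element becomes 1.
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        # Eliminate the pivot column from all other rows.
        for r in range(n):
            if r != col:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    # Return the right half, which now holds the inverse.
    return [row[n:] for row in A]
```

For example, `invert([[4.0, 7.0], [2.0, 6.0]])` yields [[0.6, -0.7], [-0.2, 0.4]], the inverse one also obtains from the 2 × 2 adjoint formula.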
LU decomposition is another method that can be used to compute the inverse efficiently. However, a discussion of Gaussian elimination and LU decomposition is beyond the scope of this text. Virtually any book on linear algebra [741, 742] or on numerical methods, such as the one by Press et al. [1034], describes these methods.
Some important rules of computation for the inverse of matrices are: i) (M^{-1})^T = (M^T)^{-1}, ii) (MN)^{-1} = N^{-1}M^{-1}.
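These two identities can be checked numerically for a pair of 2 × 2 matrices. The helpers below (`inv2`, `mul`, `transpose`) are hypothetical names introduced only for this sketch, using the closed-form 2 × 2 inverse (adjoint divided by determinant):

```python
def inv2(M):
    # Closed-form inverse of a 2 x 2 matrix: adjoint over determinant.
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mul(A, B):
    # Standard 2 x 2 matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(M):
    return [[M[j][i] for j in range(2)] for i in range(2)]

def close(A, B):
    # Elementwise comparison with a small tolerance.
    return all(abs(A[i][j] - B[i][j]) < 1e-9
               for i in range(2) for j in range(2))

M = [[1.0, 2.0], [3.0, 4.0]]
N = [[2.0, 0.0], [1.0, 2.0]]

assert close(transpose(inv2(M)), inv2(transpose(M)))  # rule i)
assert close(inv2(mul(M, N)), mul(inv2(N), inv2(M)))  # rule ii)
```

Note the order reversal in rule ii): inverting a product applies the inverses in the opposite order, just as with transposes.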
Eigenvalue and Eigenvector Computation
The solution to the eigenvalue problem has a large range of uses. For
example, one application area is the computation of tight bounding volumes
(see Section 17.4). The problem is stated as follows:
Ax = λx, (A.40)

where λ is a scalar.^7 The matrix A has to be square (say, of size n × n) and x ≠ 0; then, if x fulfills this equation, x is said to be an eigenvector of A, and λ is its associated eigenvalue. Rearranging the terms of Equation A.40 yields Equation A.41:
(λI − A)x = 0. (A.41)
This equation has a solution if and only if p_A(λ) = det(λI − A) = 0, where the function p_A(λ) is called the characteristic polynomial of A. The eigenvalues, λ_0, ..., λ_{n−1}, are thus the solutions of p_A(λ) = 0. Focus for a while on a particular eigenvalue λ_i of A. Then x_i is its corresponding eigenvector if (λ_iI − A)x_i = 0, which means that once the eigenvalues have been found, the eigenvectors can be found via Gaussian elimination.
^7 We use λ (even though this is not consistent with our notation) because that is what most texts use.
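For a 2 × 2 matrix the characteristic polynomial is a quadratic, λ² − trace(A)λ + det(A) = 0, so the eigenvalues follow directly from the quadratic formula, and an eigenvector can be read off one row of (λ_iI − A)x_i = 0. The helper names `eig2` and `eigvec2` below are our own, and the sketch assumes real eigenvalues (as for a symmetric A):

```python
import math

def eig2(A):
    """Eigenvalues of a 2 x 2 matrix with real spectrum.

    Solves p_A(lambda) = lambda^2 - trace*lambda + det = 0
    (Equation A.41 specialized to n = 2).
    """
    (a, b), (c, d) = A
    tr = a + d
    det = a * d - b * c
    disc = math.sqrt(tr * tr - 4.0 * det)  # real for symmetric A
    return (tr + disc) / 2.0, (tr - disc) / 2.0

def eigvec2(A, lam):
    # Eigenvector for eigenvalue lam from the first row of (lam*I - A)x = 0:
    # (lam - a)*x0 - b*x1 = 0, so x = (b, lam - a) works whenever b != 0.
    (a, b), (_, _) = A
    return (b, lam - a) if b != 0.0 else (1.0, 0.0)
```

For A = [[2, 1], [1, 2]], `eig2` gives the eigenvalues 3 and 1, and `eigvec2(A, 3.0)` gives the eigenvector (1, 1), which indeed satisfies Ax = 3x.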