

What Is the Second Difference Matrix?
source link: https://nhigham.com/2022/01/31/what-is-the-second-difference-matrix/

The second difference matrix is the tridiagonal matrix $T_n$ with diagonal elements $2$ and sub- and superdiagonal elements $-1$:

$$T_n = \begin{bmatrix}
2 & -1 & & \\
-1 & 2 & \ddots & \\
& \ddots & \ddots & -1 \\
& & -1 & 2
\end{bmatrix} \in \mathbb{R}^{n\times n}.$$

It arises when a second derivative is approximated by the central second difference $f''(x) \approx \bigl(f(x+h) - 2f(x) + f(x-h)\bigr)/h^2$. (Accordingly, the second difference matrix is sometimes defined as $-T_n$.) In MATLAB, $T_n$ can be generated by gallery('tridiag',n), which is returned as a sparse matrix.
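The MATLAB call above is the article's one-liner. For illustration, here is a dependency-free Python sketch (the helper name is mine, not from the article) that builds $T_n$ and confirms the central-difference action on samples of $f(x) = x^2$, whose second derivative is constant:

```python
import math

def second_difference_matrix(n):
    """Return the n-by-n second difference matrix T_n as nested lists."""
    return [[2 if i == j else -1 if abs(i - j) == 1 else 0
             for j in range(n)] for i in range(n)]

# Applying T_n to samples of f(x) = x^2 on a uniform grid with spacing h
# reproduces -h^2 f''(x) = -2 h^2 at grid points away from the boundary,
# since row i of T_n forms -f[i-1] + 2 f[i] - f[i+1].
n, h = 8, 0.1
f = [(h * (i + 1)) ** 2 for i in range(n)]   # samples at h, 2h, ..., nh
T = second_difference_matrix(n)
Tf = [sum(T[i][j] * f[j] for j in range(n)) for i in range(n)]
```

The first and last entries of `Tf` differ because the matrix implicitly imposes zero boundary values; the interior entries all equal $-2h^2$.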
This is Gil Strang’s favorite matrix. The photo, from his home page, shows a birthday cake representation of the matrix.
The second difference matrix is symmetric positive definite. The easiest way to see this is to define the full rank rectangular matrix

$$B_n = \begin{bmatrix}
1 & & & \\
-1 & 1 & & \\
& \ddots & \ddots & \\
& & -1 & 1 \\
& & & -1
\end{bmatrix} \in \mathbb{R}^{(n+1)\times n}$$

and note that $T_n = B_n^TB_n$. The factorization corresponds to representing a central difference as the product of a forward difference and a backward difference. Other properties of the second difference matrix are that it is diagonally dominant, a Toeplitz matrix, and an $M$-matrix.
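As a quick sanity check of the factorization, a short dependency-free Python sketch (helper names mine) builds the rectangular difference matrix and confirms that the product reproduces the tridiagonal $2$, $-1$ pattern:

```python
def B_matrix(n):
    """(n+1)-by-n backward difference matrix: 1 on the diagonal, -1 below it."""
    return [[1 if i == j else -1 if i == j + 1 else 0
             for j in range(n)] for i in range(n + 1)]

def second_difference_matrix(n):
    """The n-by-n second difference matrix."""
    return [[2 if i == j else -1 if abs(i - j) == 1 else 0
             for j in range(n)] for i in range(n)]

n = 6
B = B_matrix(n)
# Form B^T B entrywise; it should equal the second difference matrix exactly.
BtB = [[sum(B[k][i] * B[k][j] for k in range(n + 1)) for j in range(n)]
       for i in range(n)]
```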
Cholesky Factorization
In an LU factorization $T_n = LU$ the pivots (the diagonal entries of $U$) are $2$, $3/2$, $4/3$, …, $(n+1)/n$. Hence the pivots form a decreasing sequence tending to $1$ as $n \to \infty$. The diagonal of the Cholesky factor contains the square roots of the pivots. This means that in the Cholesky factorization $T_n = R^TR$ with $R$ upper bidiagonal, $r_{kk} = \sqrt{(k+1)/k}$ and $r_{kk} \to 1$ as $k \to \infty$.
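The pivot pattern is easy to confirm in exact arithmetic. A minimal Python sketch using the standard library's Fraction type (function name mine) runs Gaussian elimination without pivoting and collects the pivots:

```python
import math
from fractions import Fraction

def lu_pivots(n):
    """Pivots of Gaussian elimination (no pivoting) on T_n, in exact arithmetic."""
    A = [[Fraction(2) if i == j else Fraction(-1) if abs(i - j) == 1 else Fraction(0)
          for j in range(n)] for i in range(n)]
    pivots = []
    for k in range(n):
        pivots.append(A[k][k])
        for i in range(k + 1, n):          # eliminate entries below the pivot
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
    return pivots

piv = lu_pivots(8)                          # expect 2, 3/2, 4/3, ..., 9/8
r_diag = [math.sqrt(float(p)) for p in piv] # diagonal of the Cholesky factor R
```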
Determinant, Inverse, Condition Number
Since the determinant is the product of the pivots, $\det(T_n) = 2 \cdot \frac{3}{2} \cdot \frac{4}{3} \cdots \frac{n+1}{n} = n+1$.

The inverse of $T_n$ is full, with $(i,j)$ element $i(n-j+1)/(n+1)$ for $j \ge i$ (and it is symmetric). For example,

$$T_4^{-1} = \frac{1}{5}\begin{bmatrix}
4 & 3 & 2 & 1 \\
3 & 6 & 4 & 2 \\
2 & 4 & 6 & 3 \\
1 & 2 & 3 & 4
\end{bmatrix}.$$

The $2$-norm condition number satisfies $\kappa_2(T_n) \approx 4(n+1)^2/\pi^2$ for large $n$ (as follows from the formula (1) below for the eigenvalues).
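Both the closed-form inverse and the condition number estimate can be checked numerically; the following Python sketch (helper names mine) multiplies the claimed inverse against $T_n$ and compares the exact eigenvalue ratio with the asymptote:

```python
import math

def T(n):
    """The n-by-n second difference matrix."""
    return [[2 if i == j else -1 if abs(i - j) == 1 else 0
             for j in range(n)] for i in range(n)]

def T_inverse(n):
    """Inverse via the closed form: (i,j) entry i*(n-j+1)/(n+1) for j >= i (1-based)."""
    def entry(i, j):
        i, j = min(i, j), max(i, j)        # the inverse is symmetric
        return i * (n - j + 1) / (n + 1)
    return [[entry(i + 1, j + 1) for j in range(n)] for i in range(n)]

n = 7
A, Ainv = T(n), T_inverse(n)
P = [[sum(A[i][k] * Ainv[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]                    # should be the identity

# Condition number: exact ratio of extreme eigenvalues vs. the asymptote.
lam = lambda k: 4 * math.sin(k * math.pi / (2 * (n + 1))) ** 2
kappa_exact = lam(n) / lam(1)
kappa_asym = 4 * (n + 1) ** 2 / math.pi ** 2
```

Already for $n = 7$ the asymptote agrees with the exact condition number to within a few percent.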
Eigenvalues and Eigenvectors
The eigenvalues of $T_n$ are

$$\lambda_k = 4\sin^2\left(\frac{k\theta}{2}\right), \quad k = 1\colon n, \qquad (1)$$

where $\theta = \pi/(n+1)$, with corresponding eigenvector

$$v_k = \bigl[\sin(k\theta),\ \sin(2k\theta),\ \dots,\ \sin(nk\theta)\bigr]^T.$$

The matrix $Q$ with $q_{ij} = \sqrt{2/(n+1)}\,\sin(ij\theta)$ is therefore an eigenvector matrix for $T_n$: $Q^T T_n Q = \mathrm{diag}(\lambda_k)$.
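The eigenpairs can be verified directly: the following Python sketch (function name mine) forms $T_n v_k - \lambda_k v_k$ for each claimed pair and checks that the residual vanishes to rounding error.

```python
import math

n = 9
theta = math.pi / (n + 1)
T = [[2 if i == j else -1 if abs(i - j) == 1 else 0
      for j in range(n)] for i in range(n)]

def eig_residual(k):
    """Max-norm residual of T v_k - lambda_k v_k for the claimed eigenpair."""
    lam = 4 * math.sin(k * theta / 2) ** 2
    v = [math.sin(j * k * theta) for j in range(1, n + 1)]
    Tv = [sum(T[i][j] * v[j] for j in range(n)) for i in range(n)]
    return max(abs(Tv[i] - lam * v[i]) for i in range(n))

residuals = [eig_residual(k) for k in range(1, n + 1)]
```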
Variations
Various modifications of the second difference matrix arise and similar results can be derived. For example, consider the matrix $V_n$ obtained by changing the $(n,n)$ element to $1$:

$$V_n = \begin{bmatrix}
2 & -1 & & \\
-1 & \ddots & \ddots & \\
& \ddots & 2 & -1 \\
& & -1 & 1
\end{bmatrix}.$$

It can be shown that $V_n^{-1}$ has $(i,j)$ element $\min(i,j)$ and eigenvalues $4\sin^2\left(\frac{(2k-1)\pi}{2(2n+1)}\right)$, $k = 1\colon n$.
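These claims are easy to spot-check numerically. A Python sketch (variable names mine) multiplies $V_n$ by the claimed inverse and compares the claimed eigenvalues against the trace of $V_n$, which is $2n-1$, and the determinant, which is $1$ (by the recurrence $\det(V_n) = \det(T_{n-1}) - \det(T_{n-2}) = n - (n-1)$):

```python
import math

n = 6
V = [[2 if i == j else -1 if abs(i - j) == 1 else 0
      for j in range(n)] for i in range(n)]
V[n - 1][n - 1] = 1                        # change the (n,n) element to 1

# Claimed inverse: (i,j) element min(i,j) in 1-based indexing.
M = [[min(i, j) + 1 for j in range(n)] for i in range(n)]
P = [[sum(V[i][k] * M[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]                    # should be the identity

# Claimed eigenvalues: their sum must match trace(V) = 2n - 1
# and their product must match det(V) = 1.
lams = [4 * math.sin((2 * k - 1) * math.pi / (2 * (2 * n + 1))) ** 2
        for k in range(1, n + 1)]
```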
If we perturb the $(1,1)$ element of $V_n$ to $1$ we obtain a singular matrix, but suppose we perturb the $(1,1)$ element to $3$:

$$W_n = \begin{bmatrix}
3 & -1 & & \\
-1 & 2 & \ddots & \\
& \ddots & \ddots & -1 \\
& & -1 & 1
\end{bmatrix}.$$

The inverse is $W_n^{-1} = \frac{1}{2}G_n$, where $G_n$ with $(i,j)$ element $2\min(i,j) - 1$ is a matrix of Givens, and the eigenvalues are $4\sin^2\left(\frac{(2k-1)\pi}{4n}\right)$, $k = 1\colon n$.
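A Python sketch (names mine) confirms the singularity of the first perturbation, the Givens inverse, and the eigenvalue formula via the trace, $2n$, and the determinant, $2$:

```python
import math

n = 6
W = [[2 if i == j else -1 if abs(i - j) == 1 else 0
      for j in range(n)] for i in range(n)]
W[0][0], W[n - 1][n - 1] = 3, 1            # perturbed corner elements

# With (1,1) element 1 instead of 3 every row sums to zero, so the
# vector of ones is a null vector and the matrix is singular.
S = [row[:] for row in W]
S[0][0] = 1
row_sums = [sum(row) for row in S]

# Givens's matrix: (i,j) element 2*min(i,j) - 1 (1-based); claim W^{-1} = G/2.
G = [[2 * (min(i, j) + 1) - 1 for j in range(n)] for i in range(n)]
P = [[sum(W[i][k] * G[k][j] / 2 for k in range(n)) for j in range(n)]
     for i in range(n)]                    # should be the identity

# Claimed eigenvalues: sum must match trace(W) = 2n, product must match det(W) = 2.
lams = [4 * math.sin((2 * k - 1) * math.pi / (4 * n)) ** 2
        for k in range(1, n + 1)]
```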
Notes
The factorization $T_n = B_n^TB_n$ is noted by Strang (2012).
For a derivation of the eigenvalues and eigenvectors see Todd (1977, p. 155 ff.). For the eigenvalues of $V_n$ see Fortiana and Cuadras (1997). Givens's matrix is described by Newman and Todd (1958) and Todd (1977).
References
This is a minimal set of references, which contain further useful references within.
- J. Fortiana and C. N. Cuadras, A Family of Matrices, the Discretized Brownian Bridge, and Distance-Based Regression, Linear Algebra Appl. 264, 173–188, 1997.
- Morris Newman and John Todd, The Evaluation of Matrix Inversion Programs, J. Soc. Indust. Appl. Math. 6(4), 466–476, 1958.
- Gilbert Strang, Essays in Linear Algebra, Wellesley-Cambridge Press, Wellesley, MA, USA, 2012. Chapter A.3: “My Favorite Matrix”.
- John Todd, Basic Numerical Mathematics, Vol. 2: Numerical Algebra, Birkhäuser, Basel, and Academic Press, New York, 1977.