A Matrix-Valued Generalization of Cross Product for
Two Vectors in R^{3} with Application to a Matrix Operator for
Computing the Curl of the Curl of a Vector Field

Donald R. Burleson, Ph.D.

Copyright (c) 2012; all rights reserved.

Let a = (a_{1}, a_{2}, a_{3}) and b = (b_{1}, b_{2}, b_{3}) be any two vectors
in the space R^{3}. It is well known that the cross product

a × b = (a_{2}b_{3} - a_{3}b_{2}, a_{3}b_{1} - a_{1}b_{3}, a_{1}b_{2} - a_{2}b_{1})

can alternatively be computed as a matrix product of the form

                              [   0      -b_{3}    b_{2} ]
a × b = (a_{1}  a_{2}  a_{3}) [  b_{3}     0      -b_{1} ]
                              [ -b_{2}    b_{1}     0    ]

where, as is customary, we regard the vector and the 1x3 row matrix as
isomorphic; we shall in fact use their respective notations (i.e. with or
without the separating commas) interchangeably. One readily shows that the
matrix product is equivalent to the usual determinant computation of a × b.

Since a × b = -(b × a), we have also a × b = -bA, where A is the matrix formed
analogously from the components of a. (This and similar results can also be
demonstrated using matrix transposes, since the matrices being discussed here
are skew-symmetric and A^{T}, for example, can be replaced with -A.)
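These two identities are easy to confirm numerically. The following Python sketch does so for one pair of vectors; the helper names cross, gamma, and vec_mat are illustrative, not part of the paper's notation.

```python
# Numerical check that a × b = aB and a × b = -bA, where A = gamma(a), B = gamma(b).

def cross(a, b):
    """Ordinary cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def gamma(a):
    """The skew-symmetric image matrix gamma(a) used in the text."""
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def vec_mat(v, M):
    """Row vector times 3x3 matrix."""
    return tuple(sum(v[i]*M[i][j] for i in range(3)) for j in range(3))

a, b = (1, 3, -2), (2, -1, 4)
A, B = gamma(a), gamma(b)
assert cross(a, b) == vec_mat(a, B)                     # a × b = aB
assert cross(a, b) == tuple(-x for x in vec_mat(b, A))  # a × b = -bA
```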

These facts motivate the following

DEFINITION: The
function γ from
the vector space R^{3} to the set M_{3} of all 3x3 matrices
over the field of real numbers will be defined as

                             [   0      -a_{3}    a_{2} ]
γ(a_{1}, a_{2}, a_{3})  =    [  a_{3}     0      -a_{1} ]
                             [ -a_{2}    a_{1}     0    ].

Given a vector a = (a_{1}, a_{2}, a_{3}), the image matrix γ(a) for purposes of this discussion will be denoted simply A.

Immediately we may prove:

THEOREM: For any matrix A in the range of the function γ,

spectrum(A) = { 0, i||a||, -i||a|| }

where ||a|| = (a_{1}^{2} + a_{2}^{2} + a_{3}^{2})^{1/2} is the norm of the pre-image vector a.

PROOF: The characteristic polynomial of A is

                  [  -λ      -a_{3}    a_{2} ]
det(A - λI) = det [  a_{3}    -λ      -a_{1} ]
                  [ -a_{2}    a_{1}    -λ    ]

= - λ ( λ^{2} + a_{1}^{2} ) -
(-a_{3})( -a_{3}λ - a_{1}a_{2}) + a_{2}(a_{1}a_{3}
- a_{2}λ)

= - λ ( λ^{2}
+ a_{1}^{2} + a_{2}^{2} + a_{3}^{2}
)

= - λ ( λ^{2} + ||a||^{2} )

the zeros of which (i.e. the eigenvalues λ = 0 and λ = ±i||a||) give the spectrum as claimed. ▐
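The factored form of the characteristic polynomial can be spot-checked numerically: the sketch below (helper names gamma and det3 are illustrative) compares a direct 3x3 determinant of A - λI against -λ(λ^{2} + ||a||^{2}) at several sample values of λ, and also verifies det(A) = 0 directly.

```python
# Spot-check: det(A - lambda*I) = -lambda*(lambda^2 + ||a||^2) for gamma-image matrices.

def gamma(a):
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

a = (1, 3, -2)
norm_sq = sum(x*x for x in a)   # ||a||^2 = 14
A = gamma(a)
assert det3(A) == 0             # the eigenvalue 0 makes A singular

for lam in (-2.0, 0.0, 1.0, 3.5):
    AmLI = [[A[i][j] - (lam if i == j else 0) for j in range(3)] for i in range(3)]
    assert abs(det3(AmLI) - (-lam*(lam**2 + norm_sq))) < 1e-9
```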

COROLLARY: Every matrix in the range of the function γ is singular.

PROOF: As shown, the spectrum of A always includes the eigenvalue λ = 0. ▐

REMARK: It makes good
operational sense for these matrices to be singular: if, for example, such a
matrix B had an inverse, the relation a × b = aB would imply that one
could right-multiply both sides by B^{-1} to solve for a unique vector
a producing the given cross product, but such a vector is not in general unique.

The equivalence of a × b with the matrix product aB (where b is now represented by the
matrix B = γ(b)) allows the recasting of many cross product expressions in terms of
matrix products. In particular, iterated
cross products involving three vectors can be rewritten in a variety of ways,
e.g.

(a × b) × c = (aB)C = a(BC)

since the operation now is matrix multiplication, which, unlike vector cross product, is associative. Thus, oddly enough, the resulting expression is "re-associated" when compared with the original iterated cross product grouping.
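This re-association is easy to exhibit numerically. In the Python sketch below (helper names illustrative), the iterated cross product (a × b) × c agrees with the single row-vector-times-matrix product a(BC).

```python
# Re-association: (a × b) × c = (aB)C = a(BC), since matrix product is associative.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def gamma(a):
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def vec_mat(v, M):
    return tuple(sum(v[i]*M[i][j] for i in range(3)) for j in range(3))

def mat_mul(M, N):
    return [[sum(M[i][k]*N[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

a, b, c = (1, 3, -2), (2, -1, 4), (-3, 2, 1)
B, C = gamma(b), gamma(c)
lhs = cross(cross(a, b), c)       # (a × b) × c
rhs = vec_mat(a, mat_mul(B, C))   # a(BC)
assert lhs == rhs
```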

Since B replaces b in a × b = aB, and since A replaces a after a fashion in
a × b = -bA (i.e. the vectors get
replaced by matrices one at a time), and especially since a product of two
γ-image matrices right-multiplies onto one of the vectors to produce the
desired three-vector cross product, it seems reasonable to dignify such a
matrix product by defining it as a matrix-valued operation on the two
underlying vectors, a sort of generalization of the concept of cross
product. Hence:

DEFINITION: For a = (a_{1}, a_{2}, a_{3}) and b = (b_{1}, b_{2}, b_{3}),
the matrix-valued operation combining a with b to produce AB, i.e.
the function from R^{3} × R^{3} to M_{3} under which the
image of (a, b) is AB, will be denoted
Π. That is, a Π b = AB.

Thus, for example, one can write: (a × b) × c = a(b Π c).

Explicitly, the generalized product a Π b is the matrix

[   0      -a_{3}    a_{2} ] [   0      -b_{3}    b_{2} ]
[  a_{3}     0      -a_{1} ] [  b_{3}     0      -b_{1} ]
[ -a_{2}    a_{1}     0    ] [ -b_{2}    b_{1}     0    ]

  [ -a_{2}b_{2} - a_{3}b_{3}          a_{2}b_{1}                  a_{3}b_{1}          ]
= [        a_{1}b_{2}          -a_{1}b_{1} - a_{3}b_{3}           a_{3}b_{2}          ]
  [        a_{1}b_{3}                 a_{2}b_{3}           -a_{1}b_{1} - a_{2}b_{2}   ]

from which it is easy to prove:

THEOREM: For matrices
A and B in the range of the function γ, if the pre-image vectors a and b are non-zero orthogonal
vectors, then trace(AB) = trace(a Π b) = 0, and conversely.

PROOF: By inspection the trace of the matrix AB is

trace(AB) = -2(a_{1}b_{1} + a_{2}b_{2} + a_{3}b_{3}) = -2(a · b)

and the trace is zero if and only if this dot product is zero, which for non-zero vectors is the case if and only if the vectors are orthogonal. ▐
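The trace result reflects a structural fact worth checking numerically: in this convention AB equals the outer product matrix (b_{i}a_{j}) minus (a · b)I, so trace(AB) = (a · b) - 3(a · b) = -2(a · b). A minimal Python sketch (helper names illustrative):

```python
# Check: AB = (outer product b_i a_j) - (a·b)I, hence trace(AB) = -2(a·b).

def gamma(a):
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def mat_mul(M, N):
    return [[sum(M[i][k]*N[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

a, b = (1, 3, -2), (2, -1, 4)
AB = mat_mul(gamma(a), gamma(b))
expected = [[b[i]*a[j] - (dot(a, b) if i == j else 0) for j in range(3)]
            for i in range(3)]
assert AB == expected
assert sum(AB[i][i] for i in range(3)) == -2*dot(a, b)

# Orthogonal non-zero pre-images give trace zero:
u, v = (1, 0, 0), (0, 5, 0)
UV = mat_mul(gamma(u), gamma(v))
assert sum(UV[i][i] for i in range(3)) == 0
```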


It turns out that the primary interest in this generalized cross product resides in the matter of three-vector cross products. To that effect:

LEMMA: For three
vectors a, b, and c:

(a × b) × c = a(b Π c) = -b(a Π c)

and a × (b × c) = -b(c Π a) = c(b Π a).

PROOF: The first equality has already been established. Further:

(a × b) × c = (a × b)C = (-bA)C = -b(AC) = -b(a Π c).

Also a × (b × c) = -(b × c)A = -(bC)A = -b(CA) = -b(c Π a).

And finally a × (b × c) = -(b × c)A = -(-cB)A = c(BA) = c(b Π a). ▐

EXAMPLES: Let A, B, and C respectively be the matrices corresponding to the vectors (1,3,-2), (2,-1,4), and (-3,2,1).

By the customary computation using determinants,

((1,3,-2) × (2,-1,4)) × (-3,2,1) = (10,-8,-7) × (-3,2,1) = (6,11,-4).

By the results of the lemma, this is also computable as a(b Π c) = a(BC):

           [ -2    3  -12 ]
(1,3,-2)   [  4    2    8 ]  = (6,11,-4)
           [  2   -1    8 ]

as before. Likewise, by determinants,

(1,3,-2) × ((2,-1,4) × (-3,2,1)) = (1,3,-2) × (-9,-14,1) = (-25,17,13)

and by the lemma we may also compute this result as c(b Π a) = c(BA):

           [ 11   -1    4 ]
(-3,2,1)   [  6    6   12 ]  = (-25,17,13).
           [ -4    2    1 ]
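The worked examples can be replayed in a few lines of Python (helper names illustrative), confirming both lemma forms against the determinant computations.

```python
# Replay of the worked examples: the lemma forms a(BC) and c(BA) reproduce the
# iterated cross products computed by determinants.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def gamma(a):
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def vec_mat(v, M):
    return tuple(sum(v[i]*M[i][j] for i in range(3)) for j in range(3))

def mat_mul(M, N):
    return [[sum(M[i][k]*N[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

a, b, c = (1, 3, -2), (2, -1, 4), (-3, 2, 1)
A, B, C = gamma(a), gamma(b), gamma(c)

assert cross(cross(a, b), c) == vec_mat(a, mat_mul(B, C)) == (6, 11, -4)
assert cross(a, cross(b, c)) == vec_mat(c, mat_mul(B, A)) == (-25, 17, 13)
```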

One particularly intriguing application of employing a matrix operator in
three-vector iterated cross product expressions is the curl of
the curl of a vector field, a concept having applications in the theory of
electromagnetism and elsewhere.

As usual the "del" operator will be written ∇ = (∂/∂x, ∂/∂y, ∂/∂z).

Since the curl of a vector field F = (f, g, h) is formally defined as

curl F = ∇ × F = (∂h/∂y - ∂g/∂z, ∂f/∂z - ∂h/∂x, ∂g/∂x - ∂f/∂y)

and is a vector field itself, the
curl of this vector field in turn, i.e. the curl of the curl of the given
vector field F, is

∇ × (∇ × F).

The previously proven lemma provides a way of expressing
this iterated cross product, since the function γ
naturally extends to the "operator-valued" gradient operator
vector ∇. It turns out that the curl of
the curl of the given vector field can be computed by way of an
operator-valued matrix simply applied to the vector field F on the right. As one would intuitively expect, this is done
by a matrix operator M such that

                        [    0       ∂/∂z    -∂/∂y ]
∇ × F = F M,  where M = [ -∂/∂z       0       ∂/∂x ] = -γ(∇),
                        [  ∂/∂y    -∂/∂x       0   ]

so that when the matrix operator M is applied twice in succession the result is

∇ × (∇ × F) = (F M)M = F M^{2}.

To that effect:

THEOREM: The curl of
the curl of a vector field F = (f, g, h) is given by applying, to the right of the
vector field by matrix multiplication, the operator

        [ ∂^{2}/∂x^{2} - ∇^{2}       ∂^{2}/∂x∂y                ∂^{2}/∂x∂z           ]
M^{2} = [ ∂^{2}/∂x∂y                 ∂^{2}/∂y^{2} - ∇^{2}      ∂^{2}/∂y∂z           ]
        [ ∂^{2}/∂x∂z                 ∂^{2}/∂y∂z                ∂^{2}/∂z^{2} - ∇^{2} ]

where ∇^{2} = ∂^{2}/∂x^{2} + ∂^{2}/∂y^{2} + ∂^{2}/∂z^{2} denotes the Laplacian operator.

PROOF: By the lemma, taking a = b = ∇ and c = F,

∇ × (∇ × F) = F(γ(∇)γ(∇)) = F(∇ Π ∇)

and since M^{2} = (-γ(∇))^{2} = γ(∇)γ(∇), this is F M^{2},

which routinely computes out to the desired operator-valued matrix form. ▐
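The matrix identity underlying this computation can be spot-checked numerically: for any vector a, γ(a)γ(a) equals the outer product matrix (a_{i}a_{j}) minus ||a||^{2} I, which with a = ∇ is precisely M^{2} = [∂^{2}/∂x_{i}∂x_{j}] - ∇^{2}I, i.e. the classical identity curl curl F = grad(div F) - ∇^{2}F. A minimal sketch (helper names illustrative):

```python
# gamma(a)^2 = (outer product a_i a_j) - ||a||^2 I; with a = del this is M^2.

def gamma(a):
    a1, a2, a3 = a
    return [[0, -a3, a2], [a3, 0, -a1], [-a2, a1, 0]]

def mat_mul(M, N):
    return [[sum(M[i][k]*N[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

a = (1, 3, -2)
norm_sq = sum(x*x for x in a)   # ||a||^2 = 14
Asq = mat_mul(gamma(a), gamma(a))
expected = [[a[i]*a[j] - (norm_sq if i == j else 0) for j in range(3)]
            for i in range(3)]
assert Asq == expected
```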