Worksheet
Objectives
Be able to determine when matrix multiplication is defined and to perform it when it is.
Explore ways in which matrix multiplication acts like multiplication of numbers, and ways in which matrix multiplication is different.
1. Is Matrix Multiplication Commutative in General?
Consider the matrices
\begin{align*}
A&=\begin{bmatrix} 1 & 2 & 3 \end{bmatrix} & B&= \begin{bmatrix} 2 \\ -3 \\ 4 \end{bmatrix}\text{.}
\end{align*}
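If you wish to check your answers with Sage, a cell along the following lines is one possibility (a sketch, assuming the matrices defined above):
A = matrix([[1, 2, 3]])        # a 1x3 row matrix
B = matrix([[2], [-3], [4]])   # a 3x1 column matrix
print((A*B).dimensions())      # size of AB, when the product is defined
print((B*A).dimensions())      # size of BA, when the product is defined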
(a)
Is \(AB\) defined, and if so, what size is it?
(b)
Is \(BA\) defined, and if so, what size is it?
(c)
We are used to multiplication of numbers being commutative, that is, that the order in which we multiply numbers together does not matter, \(xy=yx\) for all numbers \(x\) and \(y\text{.}\) What do your answers above mean for the commutativity of matrix multiplication in general?
2. Diagonal Matrices and the Identity Matrix.
Consider the matrices
\begin{align*}
A&=\begin{bmatrix} 2 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{bmatrix} & B&= \begin{bmatrix} 1 & 3 & 0 \\ 2 & 2 & -1 \\ 3 & 1 & -2 \end{bmatrix}\text{.}
\end{align*}
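A Sage cell can again be used to perform or check the computations; here is a sketch, assuming the matrices defined above:
A = matrix([[2, 0, 0], [0, -1, 0], [0, 0, 1]])   # diagonal matrix
B = matrix([[1, 3, 0], [2, 2, -1], [3, 1, -2]])
print(A*B)   # compare the rows of AB with the rows of B
print(B*A)   # compare the columns of BA with the columns of B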
(a)
Calculate \(AB\) and \(BA\text{.}\)
(b)
Examine the rows of \(AB\) and the rows of \(B\text{.}\) What do you notice about the diagonal entries of \(A\) and the effect of multiplying \(B\) by \(A\) on the left?
(c)
Examine the columns of \(BA\) and the columns of \(B\text{.}\) What do you notice about the diagonal entries of \(A\) and the effect of multiplying \(B\) by \(A\) on the right?
(d)
The number \(1\) is called the multiplicative identity because multiplying any number \(x\) by \(1\) gives back the same number you started with, \(1\cdot x = x\cdot 1 = x\text{.}\) Given an \(n\times n\) matrix \(X\text{,}\) what is the identity matrix, that is, the matrix \(I\) such that multiplying on both the left and the right leaves \(X\) unchanged, \(I\cdot X=X\cdot I = X\text{?}\)
3. Properties of Zero and Cancellation.
Define
\begin{align*}
A&=\begin{bmatrix} 1 & 2 \\ -2 & -4 \end{bmatrix}
& B&= \begin{bmatrix} 2 & -4 \\ -1 & 2 \end{bmatrix}
& C&=\begin{bmatrix} 3 & 0 \\ 1 & 3 \end{bmatrix}
& D&= \begin{bmatrix} 1 & 2 \\ 2 & 2 \end{bmatrix}\text{.}
\end{align*}
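As before, a Sage cell can be used to perform or check the products asked for below; a sketch, assuming the matrices just defined:
A = matrix([[1, 2], [-2, -4]])
B = matrix([[2, -4], [-1, 2]])
C = matrix([[3, 0], [1, 3]])
D = matrix([[1, 2], [2, 2]])
print(A*B)   # part (a)
print(A*C)   # part (b)
print(A*D)   # part (b): compare with AC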
(a)
With numbers, we know that if \(ab=0\text{,}\) then either \(a = 0\) or \(b=0\text{.}\)
Compute \(AB\text{.}\) With matrices, if \(AB = \mathbf{0}\text{,}\) is it necessarily true that either \(A=\mathbf{0}\) or \(B=\mathbf{0}\text{?}\)
(b)
When we are dealing with numbers, we know that if \(a\neq 0\) and \(ac = ad\text{,}\) then \(c=d\text{.}\)
Compute both \(AC\) and \(AD\text{.}\) With matrices, if \(A\neq \mathbf{0}\) and \(AC = AD\text{,}\) is it necessarily true that \(C=D\text{?}\)
4. Is Matrix Multiplication Associative? Distributive?
Consider the matrices
\begin{align*}
A&=\begin{bmatrix} 1 & 2 \\ 3 & -2 \end{bmatrix}
& B&= \begin{bmatrix} 0 & 4 \\ 2 & -1 \end{bmatrix}
& C&= \begin{bmatrix} -1 & 3 \\ 4 & 3 \end{bmatrix}\text{.}
\end{align*}
Here is a Sage cell to perform (or check) the required computations.
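Such a cell might contain commands along these lines (a sketch, assuming the matrices defined above):
A = matrix([[1, 2], [3, -2]])
B = matrix([[0, 4], [2, -1]])
C = matrix([[-1, 3], [4, 3]])
print((A*B)*C)   # parts (a)-(c): compare with A*(B*C)
print(A*(B+C))   # parts (d)-(f): compare with A*B + A*C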
(a)
Compute \(AB\) and then \((AB)C\text{,}\) and record your answers.
(b)
Compute \(BC\) and then \(A(BC)\text{,}\) and record your answers.
(c)
What does this suggest about the associativity of matrix multiplication?
(d)
Compute \(B+C\) and then \(A(B+C)\text{,}\) and record your answers.
(e)
Compute \(AC\text{,}\) and then \(AB+AC\) (recall you computed \(AB\) in Part a).
(f)
What does this suggest about the distributive property of matrix multiplication?