Linear Algebra I: Homework 6

Due: Friday, March 9, 2018
  1. Answer whether the following are subspaces of the provided vector space.

    1. For an \(m \times n\) matrix \(M\) with \(m \ne n\), is the set \(\ker(M)\) a subspace of \(\mathbb{R}^n\)?

      Yes. As we discussed in class, the kernel is a subspace of the domain \(\mathbb{R}^n\): it contains \(\vec 0\) since \(M\vec 0 = \vec 0\), and if \(M\vec x = M\vec y = \vec 0\), then \(M(a\vec x + b\vec y) = aM\vec x + bM\vec y = \vec 0\).

    2. For an \(m \times n\) matrix \(M\) with \(m \ne n\), is the set \(\mathop{\text {Im}}(M)\) a subspace of \(\mathbb{R}^n\)?

      No. The image is a subspace of the codomain \(\mathbb{R}^m\), not the domain; since \(m \ne n\), it isn't even a subset of \(\mathbb{R}^n\).

    3. Is the set of rational functions \(h(x) = f(x)/g(x)\) (with \(g(x) \ne 0\)) a subspace of the vector space of all functions?

      Yes. \(a h_1(x) + b h_2(x) = \bigl(af_1(x)g_2(x) + bf_2(x)g_1(x)\bigr)/\bigl(g_1(x)g_2(x)\bigr)\) is itself a rational function: it has a numerator function and a denominator function, and the denominator \(g_1(x)g_2(x)\) is nonzero since neither factor is zero.
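
      As a quick sanity check, here is a short SymPy sketch (the two rational functions are arbitrary examples chosen for illustration) showing that a linear combination recombines into a single ratio:

      ```python
      from sympy import symbols, together, fraction

      x, a, b = symbols('x a b')

      # Two sample rational functions h1 = f1/g1 and h2 = f2/g2.
      h1 = (x + 1) / (x**2 + 1)
      h2 = (3*x - 2) / (x**4 + x**2 + 1)

      # Recombine a*h1 + b*h2 over a common denominator.
      h = together(a*h1 + b*h2)
      num, den = fraction(h)

      print(num)  # has the form a*f1*g2 + b*f2*g1
      print(den)  # g1*g2 = (x**2 + 1)*(x**4 + x**2 + 1)
      ```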

    4. For any \(3 \times 4\) matrix \(M\) is the set of solutions to the equation

      \[M\vec x = \begin{pmatrix}3\\1\\-1\end{pmatrix}\]

      a subspace of \(\mathbb R^4\)?

      No. \(\vec 0\) is not a solution, since \(M\vec 0 = \vec 0\) is not the right-hand side; so \(\vec 0\) is not in the set, and a subspace must contain \(\vec 0\).

  2. How many ways are there to write the vector

    \[\begin{pmatrix}3 \\ -1 \\ 2\end{pmatrix}\]

    as a linear combination of the vectors:

    \[\begin{pmatrix}1 \\ 1 \\ 1\end{pmatrix} \quad\text{and}\quad \begin{pmatrix}1 \\ 2 \\ -1\end{pmatrix}\]

    The number of ways is the number of solutions to the equation,

    \[\begin{pmatrix}1 & 1 \\ 1 & 2 \\ 1 & -1\end{pmatrix} \vec x = \begin{pmatrix}3\\-1\\2\end{pmatrix}.\]

    This row reduces to the equation,

    \[\begin{pmatrix}1 & 1 \\ 0 & 1 \\ 0 & -2\end{pmatrix} \vec x = \begin{pmatrix}3\\-4\\-1\end{pmatrix}.\]

    The second row forces \(x_2 = -4\), while the third forces \(-2x_2 = -1\), i.e. \(x_2 = 1/2\). The system is inconsistent: there are no solutions, so there are no ways to write the vector as a linear combination of the other two.
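
    To double-check, here is a quick SymPy computation (assuming SymPy is available) that row reduces the augmented matrix; the pivot in the last column is exactly the inconsistency found above:

    ```python
    from sympy import Matrix

    # Augmented matrix [A | b] for the system above.
    aug = Matrix([[1, 1, 3],
                  [1, 2, -1],
                  [1, -1, 2]])

    rref, pivots = aug.rref()
    print(rref)    # last row is [0, 0, 1], i.e. 0*x1 + 0*x2 = 1
    print(pivots)  # (0, 1, 2): a pivot in the augmented column means no solutions
    ```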

  3. If \(ad - bc \ne 0\), how many ways are there to write the vector \(\vec x \in \mathbb R^2\) as a linear combination of the vectors:

    \[\begin{pmatrix}a\\c\end{pmatrix} \quad\text{and}\quad \begin{pmatrix}b\\d\end{pmatrix}\]

    The number of ways is the number of solutions to the equation,

    \[\begin{pmatrix}a&b\\c&d\end{pmatrix} \vec y = \vec x.\]

    The matrix has nonzero determinant since \(ad - bc \ne 0\), so it is invertible. This means there is exactly one solution, \(\vec y = \begin{pmatrix}a&b\\c&d\end{pmatrix}^{-1}\vec x\), and hence exactly one way to write \(\vec x\) as a linear combination of the two vectors.
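
    Here is a short SymPy sketch that makes the unique solution explicit via the \(2 \times 2\) inverse formula:

    ```python
    from sympy import Matrix, symbols, simplify

    a, b, c, d, x1, x2 = symbols('a b c d x1 x2')

    A = Matrix([[a, b], [c, d]])
    x = Matrix([x1, x2])

    # The unique coefficient vector A^(-1) x, valid whenever ad - bc != 0.
    y = simplify(A.inv() * x)
    print(y)  # ((d*x1 - b*x2)/(a*d - b*c), (-c*x1 + a*x2)/(a*d - b*c))

    # Sanity check: these coefficients really reproduce x.
    assert simplify(A * y - x).is_zero_matrix
    ```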

  4. Find a set \(S\) of vectors so that

    \[\mathop{\text{span}}{(S)} = \mathop{\text{Im}}\begin{pmatrix} 1 & 2 & 2 & 3 \\ 1 & -1 & 2 & 0 \\ 0 & 1 & 0 & 1 \end{pmatrix}\]

    The image of a matrix is the span of its columns, and the problem mentions nothing about a basis, so we can just take \(S\) to be the set of columns of the matrix:

    \[S = \left\{ \begin{pmatrix}1\\1\\0\end{pmatrix}, \begin{pmatrix}2\\-1\\1\end{pmatrix}, \begin{pmatrix}2\\2\\0\end{pmatrix}, \begin{pmatrix}3\\0\\1\end{pmatrix} \right\}\]
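
    If a basis were wanted instead, SymPy's `columnspace` picks out the pivot columns (a quick sketch, assuming SymPy):

    ```python
    from sympy import Matrix

    M = Matrix([[1, 2, 2, 3],
                [1, -1, 2, 0],
                [0, 1, 0, 1]])

    # A basis for Im(M): the pivot columns of M.
    for v in M.columnspace():
        print(v.T)  # Matrix([[1, 1, 0]]) and Matrix([[2, -1, 1]])
    ```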

  5. Find a set \(S\) of vectors so that

    \[\mathop{\text{span}}{(S)} = \ker\begin{pmatrix} 1 & 2 & 2 & 3 \\ 1 & -1 & 2 & 0 \\ 0 & 1 & 0 & 1 \end{pmatrix}\]

    This is a little harder. Remember, we’re solving \(M\vec x = \vec 0\), where \(M\) is the matrix in the problem. The matrix row reduces to,

    \[\begin{pmatrix}1 & 0 & 2 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 0\end{pmatrix}.\]

    So \(z, w\) are free, \(y = -w\), and \(x = -2z - w\). When \(z = 1, w = 0\) we get the vector

    \[\begin{pmatrix}-2\\0\\1\\0\end{pmatrix},\]

    and when \(z = 0, w = 1\) we get the vector

    \[\begin{pmatrix}-1\\-1\\0\\1\end{pmatrix}.\]

    The span of these two vectors is the kernel of the matrix.
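
    SymPy's `nullspace` reproduces exactly these two vectors (a quick check, assuming SymPy):

    ```python
    from sympy import Matrix

    M = Matrix([[1, 2, 2, 3],
                [1, -1, 2, 0],
                [0, 1, 0, 1]])

    basis = M.nullspace()
    for v in basis:
        print(v.T)  # Matrix([[-2, 0, 1, 0]]) and Matrix([[-1, -1, 0, 1]])

    # Each spanning vector really is in the kernel.
    assert all((M * v).is_zero_matrix for v in basis)
    ```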

  6. Suppose that the vectors \(\vec u, \vec v, \vec w\) in \(\mathbb R^n\) are all nonzero and orthogonal to each other. Is the set \(\left\{\vec u, \vec v, \vec w\right\}\) linearly dependent? Hint: What happens if \(\vec u = a \vec v + b \vec w\)?

    We’ll look at this in two different ways; either way, the answer is no: the set is linearly independent. First, suppose \(\vec u = a \vec v + b \vec w\), as the hint suggests. Then \(\vec u \cdot \vec u = \vec u \cdot (a \vec v + b \vec w) = a(\vec u \cdot \vec v) + b(\vec u \cdot \vec w) = 0 + 0 = 0\). But \(\vec u \cdot \vec u = 0\) forces \(\vec u = \vec 0\), contradicting the assumption that the vectors are nonzero. So \(\vec u\) cannot be a linear combination of the other two, and the same reasoning shows that neither \(\vec v\) nor \(\vec w\) can be either. A set is linearly dependent exactly when some vector in it is a linear combination of the others, so the set is linearly independent.

    Here’s another way: let \(M\) be the \(n \times 3\) matrix with columns \(\vec u, \vec v, \vec w\). The matrix \(M^T M\) is the \(3 \times 3\) diagonal matrix with the squared lengths \(\|\vec u\|^2, \|\vec v\|^2, \|\vec w\|^2\) along the diagonal, since every off-diagonal entry is a dot product of two distinct, orthogonal columns. Its determinant \(\|\vec u\|^2 \|\vec v\|^2 \|\vec w\|^2\) is nonzero because the vectors are nonzero, so \(M^T M\) has an inverse \(A\). Now if \(M\vec x = \vec 0\), then \(\vec x = A M^T M \vec x = A M^T \vec 0 = \vec 0\), so the only solution of \(x_1 \vec u + x_2 \vec v + x_3 \vec w = \vec 0\) is the trivial one: the vectors are linearly independent. (Note that \(M\) itself need not be square, so it may not be invertible; \(A M^T\) is only a left inverse of \(M\).)
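
    The diagonal structure of \(M^T M\) is easy to see on a concrete example. Here is a small sketch with three pairwise-orthogonal vectors in \(\mathbb R^4\) chosen for illustration:

    ```python
    from sympy import Matrix

    # Three nonzero, pairwise orthogonal vectors in R^4.
    u = Matrix([1, 1, 0, 0])
    v = Matrix([1, -1, 2, 0])
    w = Matrix([1, -1, -1, 3])

    M = Matrix.hstack(u, v, w)  # 4x3 matrix with columns u, v, w

    G = M.T * M
    print(G)  # diag(2, 6, 12) = diag(|u|^2, |v|^2, |w|^2)

    # M^T M is invertible, so Mx = 0 forces x = 0: the columns are independent.
    assert G.det() != 0 and M.nullspace() == []
    ```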

  7. Answer whether the following sets of vectors are linearly independent.
    1. \[\left\{ \begin{pmatrix}1\\1\\0\end{pmatrix}, \begin{pmatrix}-1\\2\\-1\end{pmatrix}, \begin{pmatrix}2\\3\\-1\end{pmatrix} \right\}\]

      The determinant of the matrix with these columns is \(-2\), so the matrix is invertible and the vectors are linearly independent.
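
      The determinant is quick to verify with SymPy:

      ```python
      from sympy import Matrix

      A = Matrix([[1, -1, 2],
                  [1, 2, 3],
                  [0, -1, -1]])

      print(A.det())  # -2: nonzero, so A is invertible and its columns are independent
      ```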

    2. \[\left\{ \begin{pmatrix}1\\1\\0\end{pmatrix}, \begin{pmatrix}-1\\2\\-1\end{pmatrix}, \begin{pmatrix}0\\3\\0\end{pmatrix}, \begin{pmatrix}1\\2\\0\end{pmatrix} \right\}\]

      Any REF of the \(3 \times 4\) matrix built from these columns has at most three pivots, so at least one of the four columns must be missing a pivot. That column corresponds to a free variable, which gives a nontrivial solution to \(M\vec x = \vec 0\), so the vectors have to be linearly dependent.
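
      A quick SymPy check confirms the missing pivot and exhibits an explicit dependence:

      ```python
      from sympy import Matrix

      B = Matrix([[1, -1, 0, 1],
                  [1, 2, 3, 2],
                  [0, -1, 0, 0]])

      _, pivots = B.rref()
      print(pivots)  # (0, 1, 2): only three pivots for four columns

      # The free column gives a nontrivial linear relation among the four vectors.
      print(B.nullspace()[0].T)  # Matrix([[-1, 0, -1/3, 1]])
      ```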