Exercise 06 solutions

🔹 must know, 🔸 ideally should know, ⛑ challenging, 🔫 very challenging

Exercises from Rasmus’ notes


🔹 Exercise 1

Let \(B\) be the basis for \(P_2\) consisting of the polynomials \(p=x^2+2 x-1, q=2 x^2+2\) and \(r=x+4\). Which polynomial has coordinates \((1,2,-3)\) relative to \(B\)? Compute the coordinates of \(7 x+8\) relative to \(B\).

Solution

First of all, writing each polynomial as the coordinate vector of its coefficients with respect to \((x^2, x, 1)\), the basis vectors look as follows:

\[\begin{aligned} B &= \{ \begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix}, \begin{bmatrix} 2 \\ 0 \\ 2 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 4 \end{bmatrix} \} \end{aligned}\]

Second, we are given the coordinates \((1, 2, -3)\) of the desired vector with respect to this basis. Therefore, to recover the vector itself, we simply write:

\[\begin{aligned} v &= 1\begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix} + 2\begin{bmatrix} 2 \\ 0 \\ 2 \end{bmatrix} - 3\begin{bmatrix} 0 \\ 1 \\ 4 \end{bmatrix} = \begin{bmatrix} 5 \\ -1 \\ -9 \end{bmatrix} \end{aligned}\]

This coordinate vector corresponds to the polynomial \(5x^2 - x - 9\). Now, if we want to compute the coordinates of a new vector

\[v' = \begin{bmatrix} 0 \\ 7 \\ 8 \end{bmatrix}\]

with respect to \(B\), we just need to find a solution to the following system:

\[A = \left[\begin{array}{lll|l} 1 & 2 & 0 & 0 \\ 2 & 0 & 1 & 7 \\ -1 & 2 & 4 & 8 \end{array}\right]\]

Using Gauss-Jordan elimination or any other method that you prefer, we arrive at the following coordinates:

\[[\mathbf{v'}]_B = \left(\begin{array}{c} 2 \\ -1 \\ 3 \end{array}\right)\]
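Both parts can be spot-checked numerically. Below is a minimal sketch in plain Python; the helper `combine` and the coefficient-list representation `[x^2, x, constant]` are choices made for this example, not part of the exercise:

```python
# Represent each basis polynomial by its coefficient list [x^2, x, constant].
p = [1, 2, -1]   # x^2 + 2x - 1
q = [2, 0, 2]    # 2x^2 + 2
r = [0, 1, 4]    # x + 4

def combine(coords, basis):
    """Coefficient list of the linear combination coords[0]*b1 + coords[1]*b2 + ..."""
    return [sum(c * b[i] for c, b in zip(coords, basis)) for i in range(3)]

# Coordinates (1, 2, -3) give the polynomial 5x^2 - x - 9.
print(combine([1, 2, -3], [p, q, r]))   # [5, -1, -9]

# Coordinates (2, -1, 3) reproduce 7x + 8, i.e. coefficient list [0, 7, 8].
print(combine([2, -1, 3], [p, q, r]))   # [0, 7, 8]
```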


🔸 Exercise 2

Suppose \(B\) is a basis for a vector space \(V\), \(u \in V\) and \(c \in \mathbb{R}\). Prove that \([cu]_B = c[u]_B\).

Solution

We need to prove that the coordinates of the vector \(cu\) are identical to the coordinates of the vector \(u\) scaled by \(c\). Writing \(cu\) in terms of the basis vectors \(b_1, \dots, b_n\),

\[cu = cx_1b_1 + \dots + cx_nb_n\]

we see that the coordinates of \(cu\) must be

\[[cu]_B = \begin{bmatrix} cx_1 \\ \vdots \\ cx_n \end{bmatrix}\]

Now, based on the following expression

\[u = x_1b_1 + \dots + x_nb_n\]

we know that the coordinates of the vector \(u\) are:

\[[u]_B = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix}\]

If we scale these by \(c\) we obtain:

\[c[u]_B = \begin{bmatrix} cx_1 \\ \vdots \\ cx_n \end{bmatrix}\]

Therefore, we have proved that \([cu]_B = c[u]_B\).
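The identity can also be illustrated numerically with the basis from Exercise 1. This is only a sketch, not a substitute for the proof; the coefficient-list representation `[x^2, x, constant]` is an assumption of the example:

```python
# Basis from Exercise 1, as coefficient lists [x^2, x, constant].
basis = [[1, 2, -1], [2, 0, 2], [0, 1, 4]]

def combine(coords, basis):
    """Coefficient list of the linear combination given by coords."""
    return [sum(c * b[i] for c, b in zip(coords, basis)) for i in range(3)]

u_coords = [2, -1, 3]                        # [u]_B where u = 7x + 8
c = 5
scaled_coords = [c * x for x in u_coords]    # c[u]_B

# The vector with coordinates c[u]_B equals c times the vector with coordinates [u]_B.
assert combine(scaled_coords, basis) == [c * t for t in combine(u_coords, basis)]
```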


🔸 Exercise 5

Definition. If \(u, v\) are vectors in \(R^n\) write \(u ⊥ v\) to mean that \(u \cdot v = 0\), i.e., that \(v\) and \(u\) are orthogonal. Say a vector \(u\) is orthogonal to a subspace \(V\) of \(R^n\) if \(u ⊥ v\) for all \(v\) in \(V\).

Prove that the vector \((0, 0, 1)\) is orthogonal to the xy-plane in \(R^3\) according to the above definition.

Solution

First of all, all vectors \(v\) in the \(xy\)-plane in \(R^3\) are of the form \((x, y, 0)\). To prove that the vector \(u = (0, 0, 1)\) is orthogonal to this subspace \(V\) (the plane), we need to prove that for any vector \(v \in V\), the following holds:

\[u \cdot v = 0\]

More specifically, this means:

\[u \cdot v = (0, 0, 1) \cdot (x, y, 0) = 0\times x + 0\times y + 1\times 0 = 0\]

Therefore we can conclude that \(u\) is orthogonal to the subspace \(V\) which is the \(xy\) plane in \(R^3\).
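A quick numeric spot-check in plain Python; a few random sample points stand in for "all" vectors of the plane, and the `dot` helper is defined just for this example:

```python
import random

def dot(a, b):
    """Standard dot product of two same-length vectors."""
    return sum(x * y for x, y in zip(a, b))

u = (0, 0, 1)
# Sample a handful of vectors of the form (x, y, 0) from the xy-plane.
for _ in range(5):
    v = (random.uniform(-10, 10), random.uniform(-10, 10), 0.0)
    assert dot(u, v) == 0.0   # 0*x + 0*y + 1*0 = 0 exactly
```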


🔸 Exercise 6

Suppose \(\mathbf{v}\) is a vector in \(\mathbb{R}^n\). Prove that the set

\[V=\left\{\mathbf{u} \in \mathbb{R}^n \mid \mathbf{v} \perp \mathbf{u}\right\}\]

of vectors orthogonal to \(\mathbf{v}\) is a subspace of \(\mathbb{R}^n\). Describe the form of \(V\) in the case where \(n\) is 2 or 3 . Draw \(V\) for the case of \(\mathbf{v}=(1,1)\).

Solution

This is the type of exercise that we practiced last week, i.e., given some subset, prove that it is a subspace of a given vector space. First, we need to prove that \(V\) is non-empty. It is non-empty because it contains the zero vector \(u = \mathbf{0}\), since:

\[u \cdot v = 0 \Rightarrow u \perp v \Rightarrow u \in V\]

Second, assume we have vectors \(u_1, u_2 \in V\); then \(u_1 + u_2 \in V\), since:

\[(u_1 + u_2)\cdot v = u_1\cdot v + u_2\cdot v = 0 + 0 = 0\]

Finally, given a real scalar \(c\) and \(u \in V\), we can prove that \(cu \in V\):

\[(cu)\cdot v = c(u\cdot v) = c \cdot 0 = 0\]

which shows that the subset \(V\) is also closed under scalar multiplication. Since all of the conditions are met, we conclude that \(V\) is a subspace of \(R^n\).

In addition, if we are in \(R^3\) (and \(v \neq 0\)), then \(V\) is a plane, whereas in \(R^2\) it is a line. To draw \(V\) for \(v = (1, 1)\), note that two points determine a line. We already know one point of \(V\), namely \(u_1 = (0, 0)\). A second point could be, for instance, \(u_2 = (-1, 1)\); the resulting line is \(y = -x\).
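The three subspace properties can also be spot-checked for \(v = (1, 1)\), where \(V\) is the line \(y = -x\). A sketch with concretely chosen vectors:

```python
def dot(a, b):
    """Standard dot product of two same-length vectors."""
    return sum(x * y for x, y in zip(a, b))

v = (1, 1)
u1 = (0, 0)     # the zero vector, our first point on the line
u2 = (-1, 1)    # second point on the line y = -x
u3 = (3, -3)    # another vector on the same line

assert dot(v, u1) == 0 and dot(v, u2) == 0 and dot(v, u3) == 0  # all lie in V
assert dot(v, [a + b for a, b in zip(u2, u3)]) == 0  # closed under addition
assert dot(v, [4 * a for a in u2]) == 0              # closed under scaling
```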


⛑ Exercise 7

Suppose \(B=\left\{\mathbf{v}_1, \ldots, \mathbf{v}_n\right\}\) is a basis for the subspace \(V\) of \(\mathbb{R}^m\), and that \(\mathbf{u}\) is a vector in \(\mathbb{R}^m\). Prove that \(\mathbf{u}\) is orthogonal to \(V\) if and only if it is orthogonal to each of the vectors in \(B\). Hint: There are two directions to prove here. First prove that if \(\mathbf{u}\) is orthogonal to \(V\) then it is orthogonal to each of the vectors in \(B\). Then prove that if \(\mathbf{u}\) is orthogonal to each of the vectors in \(B\) then it is orthogonal to \(V\). This requires that you show that \(\mathbf{u} \bullet \mathbf{w}=0\) for any vector \(\mathbf{w}\) in \(V\). To do this, use the fact that one can write any such \(\mathbf{w}\) as \(\mathbf{w}=c_1 \mathbf{v}_1+\cdots+c_n \mathbf{v}_n\).

Solution

As mentioned in the hint, we essentially need to prove two things. First, we need to prove that \(u \perp V \Rightarrow u \perp B\). Since \(B\) forms a basis of \(V\), we have \(B \subseteq V\). Therefore, if \(u\) is orthogonal to all vectors in \(V\), it is in particular orthogonal to the vectors in \(B\). Next, we need to prove that if \(u\) is orthogonal to every vector in \(B\), then it is also orthogonal to \(V\); that is, we need to show that \(u \cdot w = 0\) for any \(w \in V\). We know that any such \(w\) can be written as a linear combination of the basis vectors \(B = \{v_1, v_2, \dots, v_n\}\):

\[\mathbf{w}=c_1 \mathbf{v}_1+\cdots+c_n \mathbf{v}_n\]

Thus, if we substitute this for \(w\):

\[u \cdot w = u \cdot (c_1 \mathbf{v}_1+\cdots+c_n \mathbf{v}_n)\]

Then we can use the algebraic rules of the dot product to expand, together with our assumption that \(u \cdot v = 0\) for every \(v \in B\):

\[u \cdot (c_1 \mathbf{v}_1+\cdots+c_n \mathbf{v}_n) = c_1\left(\mathbf{u} \cdot \mathbf{v}_1\right)+\cdots+c_n\left(\mathbf{u} \cdot \mathbf{v}_n\right)=0\]

Thus we have proved the converse direction as well. As a result, we have proved the following:

\(\mathbf{u}\) is orthogonal to \(V\) if and only if it is orthogonal to each of the vectors in \(B\).


⛑ Exercise 8

In the setting of Exercise 7, let \(P\) be the matrix whose columns are the vectors of the basis \(B\). Prove that \(\mathbf{u}\) is orthogonal to \(V\) if and only if \(P^T \mathbf{u}=0\). You may assume that the result of Exercise 7 has been proved and can be used here.

Solution

Let us first write what \(P^Tu\) actually corresponds to:

\[P^T \mathbf{u}=\left[\begin{array}{cccc} \mid & \mid & & \mid \\ \mathbf{v}_1 & \mathbf{v}_2 & \ldots & \mathbf{v}_n \\ \mid & \mid & & \mid \end{array}\right]^T \mathbf{u}=\left[\begin{array}{ccc} - & \mathbf{v}_1^T & - \\ - & \mathbf{v}_2^T & - \\ & \vdots & \\ - & \mathbf{v}_n^T & - \end{array}\right] \mathbf{u}=\left[\begin{array}{c} \mathbf{v}_1 \bullet \mathbf{u} \\ \vdots \\ \mathbf{v}_n \bullet \mathbf{u} \end{array}\right]\]

From Exercise 7, we know that \(\mathbf{u}\) is orthogonal to \(V\) if and only if it is orthogonal to each of the vectors in \(B\); that is, \(u \cdot v = 0\) must hold for all \(v \in B\). By the equation above, this happens precisely when \(P^Tu = 0\).
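The identity \((P^T\mathbf{u})_i = \mathbf{v}_i \bullet \mathbf{u}\) can be checked with a small example; the basis vectors and \(u\) below are assumptions chosen for illustration:

```python
def dot(a, b):
    """Standard dot product of two same-length vectors."""
    return sum(x * y for x, y in zip(a, b))

v1, v2 = (1, 2, 0), (0, 1, 0)   # columns of P, a basis of a plane in R^3
u = (0, 0, 5)                    # orthogonal to the span of v1 and v2

# Row i of P^T is v_i^T, so entry i of P^T u is exactly the dot product v_i . u.
Pt_u = [dot(v1, u), dot(v2, u)]
print(Pt_u)   # [0, 0] -- so u is orthogonal to V
```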


🔍 About the page


  • Last updated: 05/10/2022
  • Unless otherwise stated, exercises come from the book: Elementary Linear Algebra, International Metric Edition, Ron Larson
  • Purpose: This page was created as part of my preparation for teaching assistant sessions in the course Linear algebra and optimization managed by Rasmus Ejlers Møgelberg. Please note that unless otherwise stated, the solutions on this page are my own, and they should not be considered official solutions provided as part of the course.