- Show that the set of real sequences $(u_n)$ that satisfy the recurrence relation $u_{n+1}=u_n +u_{n-1}$ (for $n \ge 1$) is a real vector space (a subspace of the space of all sequences of real numbers). Find a basis, and write down the dimension of the vector space.

*Proof.*

We use the Subspace Test. The zero sequence $u_n=0$ satisfies the recurrence relation. If $u_{n+1}=u_n +u_{n-1}$ and $v_{n+1}=v_n +v_{n-1}$, then $u_{n+1}+λv_{n+1}=(u_n+λv_n)+(u_{n-1}+λv_{n-1})$, so $(u_n)+λ(v_n)$ also satisfies the recurrence. By the Subspace Test, the set of real sequences $(u_n)$ that satisfy the recurrence relation is a real vector space. A sequence satisfying the recurrence is determined by its initial values $(w_0,w_1)$. Given two solutions $(u_n)$ and $(v_n)$ whose initial values $(u_0,u_1)$ and $(v_0,v_1)$ are linearly independent in $\mathbb R^2$, any initial values $(w_0,w_1)$ can be written as $\mu(u_0,u_1)+λ(v_0,v_1)$ by solving a linear system. Since $(w_n)$ is determined by its initial values, $w_n=\mu u_n+λv_n$ for all $n$, so the two solutions span. Therefore the dimension is two. The sequences $\{\varphi^n\}$ and $\{\psi^n\}$, where $\varphi=\frac{1+\sqrt5}2$ and $\psi=\frac{1-\sqrt5}2$ are the roots of $x^2=x+1$, satisfy the recurrence relation, and their initial values $(1,\varphi)$ and $(1,\psi)$ are linearly independent, so they form a basis.

- For each of the following vector spaces and each of the specified subsets, determine whether or not the subset is a subspace. That is, in each case, either verify the conditions defining a subspace (or use the Subspace Test), or show by an example that one of the conditions does not hold.

(i)$V=\mathbb R^4$:(a)$\{(a,b,c,d)∈V:a+b=c+d\}$;

(b)$\{(a,b,c,d)∈V:a+b=1\}$;

(c)$\{(a,b,c,d)∈V:a^2=b^2\}$.

(ii)$V=\mathcal M_{n\times n}(\mathbb R)$:(a)the set of upper triangular matrices;

(b)the set of invertible matrices;

(c)the set of singular matrices ('singular' means not invertible).

*Solution.*

(i)(a)$0+0=0+0$, therefore $0_V$ satisfies the condition. For any $(a_1,b_1,c_1,d_1),(a_2,b_2,c_2,d_2)$ that satisfy the condition, we have $a_1+b_1=c_1+d_1,a_2+b_2=c_2+d_2$, so $a_1+λa_2+b_1+λb_2=c_1+λc_2+d_1+λd_2$, so $(a_1,b_1,c_1,d_1)+λ(a_2,b_2,c_2,d_2)$ satisfies the condition. By the subspace test, it is a subspace.

(b)$0+0\ne1$, therefore $0_V$ doesn't satisfy the condition, so it isn't a subspace.

(c)$1^2=1^2$ and $1^2=(-1)^2$, so $(1,1,0,0)$ and $(1,-1,0,0)$ satisfy the condition, but their sum $(1,1,0,0)+(1,-1,0,0)=(2,0,0,0)$ does not (since $2^2\ne0^2$), therefore it isn't a subspace.
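A two-line sketch in Python (with a hypothetical helper `in_set`) confirms that the set fails closure under addition:

```python
# Illustrative check: the condition a^2 = b^2 is not closed under addition.
def in_set(v):
    a, b, c, d = v
    return a * a == b * b

u = (1, 1, 0, 0)
w = (1, -1, 0, 0)
s = tuple(x + y for x, y in zip(u, w))  # the sum (2, 0, 0, 0)

print(in_set(u), in_set(w), in_set(s))  # True True False
```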

(ii)(a)$O_n$ is upper triangular. If $A$ is upper triangular then $λA$ is upper triangular for any $λ$, and the sum of two upper triangular matrices is upper triangular, therefore it is a subspace.

(b)$O_n$ is not invertible. So the set of invertible matrices is not a subspace.

(c)Consider the $n×n$ matrix $A$ with a 1 in entry $(1,1)$ and 0 in all other entries. Both $A$ and $I-A$ are singular, but their sum is $I$, which is invertible. So the set of singular matrices is not a subspace.

- Let $S$ be a finite spanning set for a vector space $V$. Let $T$ be a smallest subset of $S$ that spans $V$. Show that $T$ is linearly independent, hence a basis of $V$.

*Proof.*

If $T=\{t_1,\cdots,t_n\}$ is linearly dependent, there exist coefficients $λ_1,\cdots,λ_n$, not all zero, such that $λ_1t_1+\cdots+λ_nt_n=0$. Choose $k$ with $λ_k\ne0$; then $t_k=-\sum_{i\ne k}\frac{λ_i}{λ_k}t_i$. Using this equation to substitute for $t_k$ in any linear combination of vectors in $T$, we see that every vector in $V$ is a linear combination of vectors in $T\setminus\{t_k\}$, therefore $T\setminus\{t_k\}$ spans $V$, contradicting the minimality of $T$. Hence $T$ is linearly independent, and being a spanning set it is a basis of $V$.

- (i)Which of the following sets of vectors in $\mathbb R^3$ are linearly independent?

(a) {(1,3,0),(2,-3,4),(3,0,4)}, (b) {(1,2,3),(2,3,1),(3,1,2)}

(ii)Let $V:=\mathbb{R}^{\mathbb{R}}=\{f: \mathbb{R} \rightarrow \mathbb{R}\}$. Which of the following sets are linearly independent in $V$?

(a) $\{f,g,h\}$ where $f(x)=5 x^{2}+x+1, g(x)=2 x+3$ and $h(x)=x^2-1$.

(b) $\{p,q,r\}$ where $p(x)=\cos ^{2}x, q(x)=\cos 2 x$ and $r(x)=1$.

*Solution.*

(i) We have (1,3,0)+(2,-3,4)-(3,0,4)=(0,0,0), so (a) is linearly dependent. Writing the vectors of (b) as the rows of a matrix and reducing, $\begin{pmatrix}1&2&3\\2&3&1\\3&1&2\end{pmatrix}\rightarrow\begin{pmatrix}1&2&3\\0&-1&-5\\0&0&18\end{pmatrix}$, we get three non-zero pivots, so (b) is linearly independent.
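These conclusions can be double-checked numerically; this is a sketch, not part of the written solution, using the fact that vectors stacked as the rows of a matrix are linearly independent exactly when the matrix has full row rank:

```python
import numpy as np

# Stack each set of vectors as the rows of a matrix and compute the rank:
# full row rank means linearly independent.
A = np.array([[1, 3, 0], [2, -3, 4], [3, 0, 4]])   # set (a)
B = np.array([[1, 2, 3], [2, 3, 1], [3, 1, 2]])    # set (b)

print(np.linalg.matrix_rank(A))  # 2, so set (a) is linearly dependent
print(np.linalg.matrix_rank(B))  # 3, so set (b) is linearly independent
```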

(ii)(a)$\{f,g,h\}$ lies in the subspace spanned by $\{1,x,x^2\}$, and $\{1,x,x^2\}$ is linearly independent, so we can work with coordinates with respect to $\{1,x,x^2\}$. Writing the coefficient vectors as rows and reducing (after swapping the first and third rows), $\begin{pmatrix}5&1&1\\0&2&3\\1&0&-1\end{pmatrix}\rightarrow\begin{pmatrix}1&0&-1\\0&1&\frac32\\0&0&\frac92\end{pmatrix}$, so $\{f,g,h\}$ is linearly independent.
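The same coordinate idea can be checked numerically (a sketch, with rows ordered as $x^2$-coefficient, $x$-coefficient, constant): a non-zero determinant of the coefficient matrix confirms independence.

```python
import numpy as np

# Rows are the coefficient vectors of f, g, h in the order (x^2, x, 1):
# f = 5x^2 + x + 1, g = 2x + 3, h = x^2 - 1.
M = np.array([[5, 1, 1], [0, 2, 3], [1, 0, -1]])
print(round(np.linalg.det(M)))  # -9: nonzero, so {f, g, h} is independent
```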

(b)$2p(x)-q(x)-r(x)=2\cos^2x-\cos2x-1=0$ for all $x$, so $\{p,q,r\}$ is linearly dependent.

- (i)Let $u,v,w$ be linearly independent vectors in a vector space $V$.

(a) Show that $u+v, u-v, u-2v+w$ are also linearly independent.

(b) Are $u+v-3w, u+3v-w, v+w$ linearly independent?

(ii) Let $\left\{v_{1}, v_{2}, \ldots, v_{n}\right\}$ be a linearly independent set of $n$ vectors in a vector space $V$. Prove that each of the following sets is also linearly independent:

(a) $\left\{c_{1} v_{1}, c_{2} v_{2}, \ldots, c_{n} v_{n}\right\}$ where $c_{i} \neq 0$ for $1 \leqslant i \leqslant n$;

(b) $\left\{w_{1}, w_{2}, \ldots, w_{n}\right\}$ where $w_{i}=v_{i}+v_{1}$ for $1 \leqslant i \leqslant n$.

*Proof.*

(i)(a)Suppose there exist coefficients $λ_1,λ_2,λ_3$ such that $λ_1(u+v)+λ_2(u-v)+λ_3(u-2v+w)=0$. Then $(λ_1+λ_2+λ_3)u+(λ_1-λ_2-2λ_3)v+λ_3w=0$, so by independence $λ_1+λ_2+λ_3=λ_1-λ_2-2λ_3=λ_3=0$, giving $λ_1=λ_2=λ_3=0$. Therefore $u+v, u-v, u-2v+w$ are linearly independent.

(b)$u+v-3w-(u+3v-w)+2(v+w)=0$, so they are linearly dependent.
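Since the relation $(u+v-3w)-(u+3v-w)+2(v+w)=0$ holds identically, it can be sanity-checked with arbitrary vectors; here is a sketch using random vectors in $\mathbb R^5$:

```python
import numpy as np

# The coefficients of u, v, w in the combination below are
# 1-1+0 = 0, 1-3+2 = 0, -3+1+2 = 0, so it vanishes for ANY u, v, w.
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 5))  # three random vectors in R^5

combo = (u + v - 3*w) - (u + 3*v - w) + 2*(v + w)
print(np.allclose(combo, 0))  # True
```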

(ii)(a)Suppose there exist coefficients $λ_1,λ_2,\cdots,λ_n$ such that $λ_1c_{1} v_{1}+λ_2c_{2} v_{2}+\cdots+λ_nc_{n} v_{n}=0$. By independence, $λ_1c_{1}=λ_2c_{2}=\cdots=λ_nc_{n}=0$, and since each $c_i\ne0$, we get $λ_{1}=λ_{2}=\cdots=λ_{n}=0$, so $\left\{c_{1} v_{1}, c_{2} v_{2}, \ldots, c_{n} v_{n}\right\}$ is linearly independent.

(b)Suppose there exist coefficients $λ_1,λ_2,\cdots,λ_n$ such that $λ_1w_{1}+λ_2w_{2}+\cdots+λ_nw_{n}=0$. Since $w_1=2v_1$, this gives $(2λ_1+λ_2+\cdots+λ_n)v_1+λ_2v_{2}+\cdots+λ_nv_{n}=0$. By independence, $2λ_1+λ_2+\cdots+λ_n=λ_2=\cdots=λ_n=0$, therefore $λ_{1}=λ_{2}=\cdots=λ_n=0$, so $\left\{w_{1}, w_{2}, \ldots, w_{n}\right\}$ is linearly independent.

- (i)Let $V_{1}:=\left\{\left(x_{1}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: x_{1}+\cdots+x_{n}=0\right\}$. Show that $V_1$ is a subspace of $\mathbb{R}^{n}$ and find a basis for it.

(ii)Let $V_{2}:=\left\{\left(x_{i j}\right) \in \mathcal{M}_{n \times n}(\mathbb{R}): x_{i j}=x_{j i} \text { for all relevant }(i, j)\right\}$. Show that $V_2$ is a subspace of $\mathcal{M}_{n \times n}(\mathbb{R})$―this is the space of real symmetric matrices―and find a basis for it.

(iii)Let $V_{3}:=\left\{\left(x_{i j}\right) \in \mathcal{M}_{n \times n}(\mathbb{R}): x_{i j}=-x_{j i} \text { for all relevant }(i, j)\right\}$. Show that $V_3$ is a subspace of $\mathcal M_{n\times n}(\mathbb R)$―this is the space of real*skew-symmetric*$n\times n$ matrices―and find a basis for it.

*Solution.*

(i)$(0,\cdots,0)\in V_1$. For $\left(x_{1}, \ldots, x_{n}\right),\left(y_{1}, \ldots, y_{n}\right)\in V_1$, we have $x_1+\cdots+x_n=y_1+\cdots+y_n=0$, therefore $(x_1+λy_1)+(x_2+λy_2)+\cdots+(x_n+λy_n)=0$, therefore $\left(x_{1}, \ldots, x_{n}\right)+λ\left(y_{1}, \ldots, y_{n}\right)\in V_1$. By the Subspace Test, $V_1$ is a subspace of $\mathbb R^n$. The vectors $(1,\underbrace{0,\cdots,0}_{k},-1,0,\cdots,0)$ for $k=0,1,\cdots,n-2$ (that is, $e_1-e_j$ for $j=2,\ldots,n$) form a basis of $V_1$, so $\dim V_1=n-1$.
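A sketch for a concrete $n$ (here $n=5$): the proposed basis vectors each have coordinate sum zero and are linearly independent, consistent with $\dim V_1=n-1$.

```python
import numpy as np

n = 5
# Rows are e_1 - e_j for j = 2, ..., n.
basis = np.array([[1 if i == 0 else (-1 if i == j else 0) for i in range(n)]
                  for j in range(1, n)])

print((basis.sum(axis=1) == 0).all())  # True: every row lies in V_1
print(np.linalg.matrix_rank(basis))    # n - 1 = 4: linearly independent
```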

(ii)$O_n\in V_2$. For $(x_{ij}),(y_{ij})\in V_2$, $x_{ij}=x_{ji}$ and $y_{ij}=y_{ji}$, so $x_{ij}+λy_{ij}=x_{ji}+λy_{ji}$, so $(x_{ij})+λ(y_{ij})\in V_2$. By the Subspace Test, $V_2$ is a subspace of $\mathcal M_{n\times n}(\mathbb R)$. Let $(a_{ij})_{uv}$ be the $n×n$ matrix whose only non-zero entries are $a_{uv}=a_{vu}=1$. Then $\{(a_{ij})_{uv}\mid u=1,\cdots,n,\;v=u,\cdots,n\}$ forms a basis of $V_2$, so $\dim V_2=\frac{n(n+1)}2$.

(iii)$O_n\in V_3$. For $(x_{ij}),(y_{ij})\in V_3$, $x_{ij}=-x_{ji}$ and $y_{ij}=-y_{ji}$, so $x_{ij}+λy_{ij}=-(x_{ji}+λy_{ji})$, so $(x_{ij})+λ(y_{ij})\in V_3$. By the Subspace Test, $V_3$ is a subspace of $\mathcal M_{n\times n}(\mathbb R)$. Note that the diagonal entries of a skew-symmetric matrix are zero. Let $(a_{ij})_{uv}$ be the $n×n$ matrix whose only non-zero entries are $a_{uv}=-a_{vu}=1$. Then $\{(a_{ij})_{uv}\mid u=1,\cdots,n-1,\;v=u+1,\cdots,n\}$ forms a basis of $V_3$, so $\dim V_3=\frac{n(n-1)}2$.
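A sketch for $n=4$, taking $u<v$ (the diagonal entries of a skew-symmetric matrix must be zero): the matrices $E_{uv}-E_{vu}$ are skew-symmetric and linearly independent, and there are $n(n-1)/2$ of them.

```python
import numpy as np

n = 4
basis = []
for u in range(n):
    for v in range(u + 1, n):
        M = np.zeros((n, n))
        M[u, v], M[v, u] = 1, -1   # E_uv - E_vu
        assert (M == -M.T).all()   # skew-symmetric
        basis.append(M.flatten())

print(len(basis))                              # n(n-1)/2 = 6
print(np.linalg.matrix_rank(np.array(basis)))  # 6: linearly independent
```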

**Starter**

**S1**. Let $V$ be a vector space, and take subspaces $U,W\leqslant V$. Then prove that $U+W\leqslant V$ and $U\cap W≤V$.

__Claim__ Let $U,W≤V$. Then $U+W≤V$.

__Proof__ We use the Subspace Test.

- Since $U≤V$, we have $0_V∈U$. Similarly, $W≤V$ so $0_V∈W$. Now $0_V=0_V+0_V∈U+W$.
- Take $u_1+w_1,u_2+w_2∈U+W$, where $u_1,u_2∈ U$ and $w_1,w_2∈W$, and take $\lambda∈\mathbb F$ (where $V$ is a vector space over the field $\mathbb F$).

Then $\lambda(u_1+w_1)+(u_2 +w_2)=(\lambda u_1+u_2)+(\lambda w_1+w_2)\in U +W$, since $U$ and $W$ are both subspaces.

__Claim__ Let $U,W≤V$. Then $U\cap W≤V$.

__Proof__ We use the Subspace Test.

- Since $U≤V$, we have $0_V∈U$. Similarly, $W≤V$ so $0_V∈W$. Now $0_V∈U\cap W$.
- Take $v_1,v_2∈U\cap W$, and take $\lambda∈\mathbb F$. Then $v_1,v_2∈ U$ and $v_1,v_2∈W$. So $\lambda v_1+v_2∈U$, since $U$ is a subspace, and similarly $\lambda v_1+v_2∈W$. So $\lambda v_1+v_2∈U\cap W$.

**S2**. For each of the following, give an example or prove that no such example exists, first when $V=\mathbb{R}^{3}$ and second when $V=\mathcal{M}_{2 \times 2}(\mathbb{R})$.

(i) A set of 2 linearly independent vectors in $V$.

(ii) A set of 3 linearly independent vectors in $V$.

(iii)A set of 4 linearly independent vectors in $V$.

(iv)A spanning set of 2 vectors in $V$.

(v)A spanning set of 3 vectors in $V$.

(vi)A spanning set of 4 vectors in $V$.

For the parts where there are examples, there are many possible examples!

(a) Let $V=\mathbb{R}^{3}$.

(i)For example, {(1,0,0),(0,1,0)} is linearly independent. (But we cannot simply take any set of 2 vectors in $V$. For example, {(1,0,0),(2,0,0)} is not linearly independent.)

(ii)For example, {(1,0,0),(0,1,0),(0,0,1)} is linearly independent.

(iii)There are no sets of 4 linearly independent vectors in $V$. One way to see this is that $V$ has dimension 3: for example, we see that the set in (ii) above is a basis with 3 elements, and so any linearly independent set has size at most 3. We could also see it directly, rather than by quoting a result. Suppose that $a_1=(a_{11},a_{21},a_{31}),a_2=(a_{12},a_{22},a_{32}),a_3=(a_{13},a_{23},a_{33}),a_4=(a_{14},a_{24},a_{34})$ are four vectors in $V$.

[Secret aim: these four vectors are linearly dependent.]

Take $\lambda_1,\lambda_2,\lambda_3,\lambda_4∈\mathbb R$ such that $\lambda_1 a_1+\lambda_2a_2+\lambda_3a_3+\lambda_4a_4=0$.

[Secret aim: there is a solution with $\lambda_1,\lambda_2,\lambda_3,\lambda_4$ not all 0]

Then$$\left(\begin{array}{llll}a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34}\end{array}\right)\left(\begin{array}{l}\lambda_{1} \\ \lambda_{2} \\ \lambda_{3} \\ \lambda_{4}\end{array}\right)=\left(\begin{array}{l}0 \\ 0 \\ 0\end{array}\right)$$If we were to apply EROs to reduce the 3×4 matrix to RRE form, we would find that there is at least one free variable $\lambda_i$, because there can be at most three columns containing the leading entry of a row. But we can choose any value for a free variable, and so there is certainly a solution with $\lambda_1,\lambda_2,\lambda_3,\lambda_4$ not all 0.
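The free-variable argument can be illustrated numerically (a sketch): for a random $3\times4$ matrix, the last right-singular vector is a non-zero solution of $A\lambda=0$.

```python
import numpy as np

# Any 3x4 homogeneous system has a nontrivial solution, so any four
# vectors in R^3 (the columns of A) are linearly dependent.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))

# With full_matrices=True (the default), the rows of Vt beyond rank(A)
# span the null space of A; here rank(A) <= 3 < 4, so Vt[-1] works.
_, _, Vt = np.linalg.svd(A)
lam = Vt[-1]  # a unit vector with A @ lam ≈ 0

print(np.allclose(A @ lam, 0))  # True: a nontrivial dependence
```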

(iv)There are no spanning sets of 2 vectors in $V$. Since $V$ has dimension 3, any spanning set must contain at least 3 elements. (Or, again, we could see it directly.)

(v)For example, {(1,0,0),(0,1,0),(0,0,1)} is a spanning set.

(vi)For example, {(1,0,0),(0,1,0),(0,0,1),(1,1,1)} is a spanning set. (If we have a spanning set with 3 vectors, then we can add any fourth vector at all and still have a spanning set.)

(b) $V=\mathcal{M}_{2 \times 2}(\mathbb{R})$

(i)For example, $\{\left(\begin{array}{ll}1 & 0 \\ 0 & 0\end{array}\right),\left(\begin{array}{ll}0 & 1 \\ 0 & 0\end{array}\right)\}$ is linearly independent.

(ii) For example, $\{\left(\begin{array}{ll}1 & 0 \\ 0 & 0\end{array}\right),\left(\begin{array}{ll}0 & 1 \\ 0 & 0\end{array}\right),\left(\begin{array}{ll}0 & 0 \\ 1 & 0\end{array}\right)\}$ is linearly independent.

(iii) For example, $\{\left(\begin{array}{ll}1 & 0 \\ 0 & 0\end{array}\right),\left(\begin{array}{ll}0 & 1 \\ 0 & 0\end{array}\right),\left(\begin{array}{ll}0 & 0 \\ 1 & 0\end{array}\right),\left(\begin{array}{ll}0 & 0 \\ 0 & 1\end{array}\right)\}$ is linearly independent.

(iv) There are no spanning sets of 2 vectors in $V$. The linearly independent set in (iii) above is in fact the standard basis for $V$, which has dimension 4, so any spanning set must contain at least 4 elements.

(v)Similarly to (iv), there are no spanning sets of 3 vectors in $V$.

(vi)The example from (iii) is also a spanning set with 4 elements: it is a basis.

**S3**. Let $V$ be the set of polynomials of degree at most 2 with real coefficients. That is, $V=\left\{a_{0}+a_{1} x+a_{2} x^{2}: a_{0}, a_{1}, a_{2} \in \mathbb{R}\right\}$.

Show that this is a vector space (under the usual polynomial addition and scalar multiplication). Give a basis $B_1$ for $V$. Give another basis $B_2$ that shares exactly one element with $B_1$. Give a third basis $B_3$ that shares no elements with $B_1$ or $B_2$.

To show that $V$ is a vector space, we check the usual list of axioms, noting that $V$ is indeed closed under addition and scalar multiplication (for example, adding two polynomials of degree at most 2 does give a polynomial of degree at most 2). *I'm not going to write out the whole list of axiom checks here!* Alternatively, knowing that $\mathbb R^{\mathbb R}$ is a real vector space, it can be shown that $V$ is a subspace of $\mathbb R^{\mathbb R}$ and so is a vector space in its own right.

There are many possible bases for $V$; this is the point of the question. The standard basis, which I'm going to call $B_1$, is $B_1=\{1,x,x^2\}$. We can see immediately that this is linearly independent and that it spans $V$. An example of another basis that shares exactly one element with $B_1$ is $B_2 =\{1,x+x^2,x-x^2\}$. To show that this is a basis, since we already know that $V$ has dimension 3 and this is a set of 3 elements, it is enough to prove either that $B_2$ spans $V$ or that $B_2$ is linearly independent (we don't need to prove both). To show that $B_2$ is linearly independent: take $λ_1,λ_2,λ_3\in\mathbb R$ such that $λ_1\cdot1+λ_2(x+x^2)+λ_3(x-x^2)=0$. Comparing constant coefficients shows $λ_1=0$. Comparing coefficients of $x$ and $x^2$ gives $λ_2+λ_3=0$ and $λ_2-λ_3=0$ respectively, and solving these simultaneous equations gives $λ_2=λ_3=0$. A third basis, sharing no elements with $B_1$ or $B_2$, is $B_{3}=\left\{3-x^{2}, \pi x, 1+x+x^{2}\right\}$. Again, since this is a set with $\dim V$ elements, to prove that $B_3$ is a basis it suffices to prove that $B_3$ is linearly independent. We can do this using a similar argument to that used for $B_2$.
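As a sketch, the independence checks for $B_2$ and $B_3$ amount to non-zero determinants of their coordinate matrices with respect to $\{1,x,x^2\}$:

```python
import numpy as np

# Rows are coordinates w.r.t. {1, x, x^2}. A nonzero determinant means
# 3 independent vectors in a 3-dimensional space, hence a basis.
B2 = np.array([[1, 0, 0], [0, 1, 1], [0, 1, -1]])      # 1, x+x^2, x-x^2
B3 = np.array([[3, 0, -1], [0, np.pi, 0], [1, 1, 1]])  # 3-x^2, pi*x, 1+x+x^2

print(round(np.linalg.det(B2)))        # -2, nonzero
print(np.linalg.det(B3) != 0)          # True (the determinant is 4*pi)
```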

**Pudding**

**P1**. Let $V$ be the vector space of all real sequences. For $k\geqslant1$, let $e^{(k)}$ be the sequence where all terms are 0 except for a 1 in position $k$. (So $e^{(1)}=(1,0,0,\cdots)$ and $e^{(2)}=(0,1,0,\cdots)$, for example.) Let $S=\left\{e^{(k)}: k \geqslant 1\right\}$. Is $S$ linearly independent in $V$? Does $S$ span $V$?

Note that a linear combination always involves only finitely many terms. We see that $S$ is linearly independent in $V$. Since $S$ is an infinite set, we need to show that every finite subset of $S$ is linearly independent. But if $\lambda_{1} e^{\left(k_{1}\right)}+\cdots+\lambda_{r} e^{\left(k_{r}\right)}=0$, then we must have $\lambda_{1}=\ldots=\lambda_{r}=0$ by looking at the entries in positions $k_{1}, \ldots, k_{r}$. So $S$ is indeed linearly independent.

However, $S$ does not span $V$. For example, $V$ contains the sequence $(1, 1, 1,\cdots)$ in which every term is 1. This sequence is not a linear combination of elements of $S$, because a linear combination involves only finitely many of the $e^{(k)}$, and so has only finitely many non-zero terms.

The question of whether this vector space has a basis is beyond the scope of this course. The current Part B course on Set Theory includes a proof that if we assume the Axiom of Choice then every vector space has a basis. This argument proves that a basis exists, but does not exhibit a concrete example of such a basis!

**P2**. Let $V=\mathbb R^4$. Let $W=\{(x_1,x_2,x_3,x_4)\in V:x_1+2x_2-x_3=0\}$. Show that $W$ is a subspace of $V$. What is the dimension of $W$? Find a basis $B_W$ of $W$. Consider the standard basis $B_V$ of $V$. Is there a subset of $B_V$ that is a basis for $W$? Can you add one or more vectors to your basis $B_W$ for $W$ to obtain a basis for $V$? Can you generalise?

**Claim**: $W≤V$.

__Proof__ We use the Subspace Test.

Certainly $(0,0,0,0) \in W$.

Take $\left(x_{1}, x_{2}, x_{3}, x_{4}\right),\left(y_{1}, y_{2}, y_{3}, y_{4}\right) \in W$ and $\lambda \in \mathbb{R}$. Then $x_{1}+2 x_{2}-x_{3}=0$ and $y_{1}+2 y_{2}-y_{3}=0$. Now $\lambda\left(x_{1}, x_{2}, x_{3}, x_{4}\right)+\left(y_{1}, y_{2}, y_{3}, y_{4}\right)=\left(\lambda x_{1}+y_{1}, \lambda x_{2}+y_{2}, \lambda x_{3}+y_{3}, \lambda x_{4}+y_{4}\right)$, and $\left(\lambda x_{1}+y_{1}\right)+$$2\left(\lambda x_{2}+y_{2}\right)-\left(\lambda x_{3}+y_{3}\right)=\lambda\left(x_{1}+2 x_{2}-x_{3}\right)+\left(y_{1}+2 y_{2}-y_{3}\right)=0$. So, by the Subspace Test, $W≤V$.$\square$

Let $B_W=\{(1, 0, 1, 0), (0, 1, 2, 0),(0, 0, 0, 1)\}$. Note that all the elements of $B_W$ are in $W$. A quick check shows that $B_W$ is a linearly independent set. Also, if $(x_1,x_2,x_3,x_4)\in W$ then $x_3=x_1+2x_2$, so $(x_1,x_2,x_3,x_4)=x_1(1,0,1,0)+x_2(0,1,2,0)+x_4(0,0,0,1)$ is in the span of $B_W$. So $B_W$ is a basis for $W$ and $W$ has dimension 3.

The standard basis of $V$ is $B_V=\{(1, 0,0,0),(0, 1, 0,0),(0,0, 1, 0), (0,0,0,1)\}$. The first three of these vectors do not lie in $W$, so cannot be included in a basis of $W$. But a basis of $W$ must contain three elements. So there is no subset of $B_V$ that is a basis for $W$. We can, however, add a vector to $B_W$ to get a basis for $V$. We know that $V$ has dimension 4, so any linearly independent set of 4 elements in $V$ will be a basis. For example, $B_W\cup\{(0,0,1,0)\}=\{(1,0,1,0),(0,1,2,0),(0,0,0,1),(0,0,1,0)\}$ is a basis. (Quick check of linear independence: if $λ_1(1,0,1,0)+λ_2(0,1,2,0)+λ_3(0,0,0,1)+λ_4(0,0,1,0)=(0,0,0,0)$ then, looking at each coordinate in turn, $λ_1=0$ and $λ_2=0$ and $λ_1+2λ_2+λ_4=0$ and $λ_3=0$, so $λ_1=λ_2=λ_3=λ_4=0$.)
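A numerical sketch of these checks: the rows of $B_W$ satisfy the defining equation and have rank 3, and appending $(0,0,1,0)$ raises the rank to 4.

```python
import numpy as np

B_W = np.array([[1, 0, 1, 0], [0, 1, 2, 0], [0, 0, 0, 1]])

# Every row satisfies x1 + 2*x2 - x3 = 0, so B_W lies in W.
print((B_W[:, 0] + 2 * B_W[:, 1] - B_W[:, 2] == 0).all())  # True

print(np.linalg.matrix_rank(B_W))        # 3: a basis of the 3-dim space W
extended = np.vstack([B_W, [0, 0, 1, 0]])
print(np.linalg.matrix_rank(extended))   # 4: a basis of R^4
```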

**P3**. A 3×3 *magic square* is a 3×3 matrix with real entries, with the property that the sum of each row, each column, and each of the two main diagonals is the same. Find three examples of 3×3 magic squares. Show that the set of 3×3 magic squares forms a subspace of $\mathcal M_{3×3}(\mathbb R)$. What is its dimension?

Here are some not very exciting examples of magic squares:$\left(\begin{array}{lll}0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0\end{array}\right) \quad\left(\begin{array}{lll}1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1\end{array}\right) \quad\left(\begin{array}{lll}59 & 59 & 59 \\ 59 & 59 & 59 \\ 59 & 59 & 59\end{array}\right)$

Here are some more interesting examples of magic squares:$\left(\begin{array}{ccc}0 & 1 & -1 \\ -1 & 0 & 1 \\ 1 & -1 & 0\end{array}\right) \quad\left(\begin{array}{lll}2 & 9 & 4 \\ 7 & 5 & 3 \\ 6 & 1 & 8\end{array}\right) \quad\left(\begin{array}{ccc}2 & 11 & 2 \\ 5 & 5 & 5 \\ 8 & -1 & 8\end{array}\right)$

Let $S$ denote the set of 3×3 magic squares. We can use the Subspace Test to check that $S$ is a subspace of $\mathcal M_{3×3}(\mathbb R)$. (I'm not including the details here.)

Here are three linearly independent magic squares:$\left(\begin{array}{lll}1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1\end{array}\right) \quad\left(\begin{array}{ccc}0 & 1 & -1 \\ -1 & 0 & 1 \\ 1 & -1 & 0\end{array}\right) \quad\left(\begin{array}{ccc}-1 & 1 & 0 \\ 1 & 0 & -1 \\ 0 & -1 & 1\end{array}\right)$

It turns out that they span the space of magic squares (although showing this is slightly fiddly), and so the space is a 3-dimensional subspace of $\mathcal M_{3×3}(\mathbb R)$.

Another way to explore the dimension is to record the constraints involved in having a magic square. We see that $\left(\begin{array}{lll}a & b & c \\ d & e & f \\ g & h & i\end{array}\right)$ is a magic square if and only if a certain system of 8 simultaneous equations is satisfied. This system can be represented as the matrix equation$$\left(\begin{array}{lllllllll}1 & 1 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 1 & 1 \\ 1 & 0 & 0 & 1 & 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 & 1 & 0 & 1 & 0 & 0\end{array}\right)\left(\begin{array}{c}a \\ b \\ c \\ d \\ e \\ f \\ g \\ h \\ i\end{array}\right)=\left(\begin{array}{c}\alpha \\ \alpha \\ \alpha \\ \alpha \\ \alpha \\ \alpha \\ \alpha \\ \alpha\end{array}\right)$$where $\alpha$ is the sum of each row/column/diagonal. The first three rows tell us that the sum of each row must be $\alpha$. Rows 4, 5, 6 tell us that the sum of each column must be $\alpha$. And finally rows 7 and 8 tell us that the sum of each diagonal must be $\alpha$. We can then use EROs to reduce the corresponding augmented matrix to RRE form. If you are an ERO enthusiast, then you can do this by hand; otherwise a computer will be happy to help. The augmented matrix reduces to$$\left(\begin{array}{ccccccccc|c}1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & \frac{2}{3} \alpha \\ 0 & 1 & 0 & 0 & 0 & 0 & 0 & 1 & 0 & \frac{2}{3} \alpha \\ 0 & 0 & 1 & 0 & 0 & 0 & 0 & -1 & -1 & -\frac{1}{3} \alpha \\ 0 & 0 & 0 & 1 & 0 & 0 & 0 & -1 & -2 & -\frac{2}{3} \alpha \\ 0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 & 0 & \frac{1}{3} \alpha \\ 0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 & 2 & \frac{4}{3} \alpha \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 1 & 1 & \alpha \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\end{array}\right)$$On the left, there are 7 determined variables and 2 free variables.
We can also choose any real value of $\alpha$, which gives an extra degree of freedom. So we see that the dimension of the subspace of magic squares is 3.
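As a cross-check (a sketch; the setup mirrors the constraint matrix above), we can compute the dimension as the nullity of the homogeneous system in the ten unknowns $(a,\ldots,i,\alpha)$:

```python
import numpy as np

# Each of the 8 sum conditions becomes (row sum) - alpha = 0, so append
# a -1 column for alpha. The magic squares then correspond to the null
# space of this 8x10 matrix, of dimension 10 - rank.
rows = [
    [1, 1, 1, 0, 0, 0, 0, 0, 0],  # row sums
    [0, 0, 0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0, 0],  # column sums
    [0, 1, 0, 0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 0, 1],
    [1, 0, 0, 0, 1, 0, 0, 0, 1],  # diagonal sums
    [0, 0, 1, 0, 1, 0, 1, 0, 0],
]
A = np.hstack([np.array(rows), -np.ones((8, 1))])

print(10 - np.linalg.matrix_rank(A))  # 3: the dimension of the space
```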