Definitions

On this page we will keep a running list of important definitions (updated by you) from the course. These definitions must be memorized and you will be tested on them.


Set: A set is a collection of objects (mathematical or not) (p.9).

Vector in $\mathbb{R}^n$: A vector in $\mathbb{R}^n$ is an ordered list of n real numbers.

\begin{align} \mathbf{x} = (x_1 , x_2 , x_3 , \dotsb , x_n) \end{align}

Linear Combination: Let $\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k} \in \mathbb{R}^n$ and $c_{1}, c_{2}, ..., c_{k} \in \mathbb{R}$. If $\mathbf{v}= c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + ... + c_{k}\mathbf{v}_{k}$, we say that $\mathbf{v}$ is a linear combination of $\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}$.
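
For example, a quick check with NumPy (the vectors and scalars are illustrative):

```python
import numpy as np

# Two vectors in R^3 and two scalars (illustrative values)
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
c1, c2 = 3.0, -2.0

# The linear combination v = c1*v1 + c2*v2
v = c1 * v1 + c2 * v2
print(v)  # [ 3. -2.  8.]
```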


Span: Let $\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k} \in \mathbb{R}^n$. The $\mathit{span}$ of $\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}$ is the collection of all linear combinations of these vectors, and is denoted Span($\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}$). That is,
\begin{align} Span(\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}) = \{\mathbf{v} \in \mathbb{R}^n : \mathbf{v} = c_{1}\mathbf{v}_{1}+ c_{2}\mathbf{v}_{2}+ ...+ c_{k}\mathbf{v}_{k} \mbox{ for some scalars } c_{1}, c_{2}, ..., c_{k}\} \end{align}
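
To test whether a given vector lies in a span, one can solve for the coefficients. A minimal sketch with NumPy (the vectors here are illustrative):

```python
import numpy as np

# The spanning vectors v1, v2 are the columns of V (illustrative)
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
w = np.array([2.0, 3.0, 5.0])  # candidate vector

# Solve V c = w in the least-squares sense; w lies in the span
# exactly when the residual is (numerically) zero.
c, residual, rank, _ = np.linalg.lstsq(V, w, rcond=None)
print(c)                      # [2. 3.], so w = 2*v1 + 3*v2
print(np.allclose(V @ c, w))  # True: w is in Span(v1, v2)
```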

Dot Product: Given vectors $\mathbf{x},\mathbf{y}\in\mathbb{R}^2$, their dot product is defined as

\begin{align} \mathbf{x}\cdot\mathbf{y} = x_1 y_1 + x_2 y_2. \end{align}

More generally, given vectors $\mathbf{x},\mathbf{y}\in\mathbb{R}^n$, their dot product is defined as

\begin{align} \mathbf{x}\cdot\mathbf{y} = x_1 y_1 + x_2 y_2 + \dotsb + x_n y_n. \end{align}

Orthogonal: We say vectors $\mathbf{x}, \mathbf{y}\in\mathbb{R}^n$ are orthogonal if

\begin{align} \mathbf{x}\cdot\mathbf{y}=0. \end{align}
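
A quick NumPy check of both definitions, with vectors chosen to be orthogonal:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 0.0, -1.0])

# x . y = 1*3 + 2*0 + 3*(-1) = 0
print(np.dot(x, y))       # 0.0
print(np.dot(x, y) == 0)  # True: x and y are orthogonal
```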

Proj$_{\mathbf{y}}\mathbf{x}$: Given vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ with $\mathbf{y} \neq \mathbf{0}$, write $\mathbf{x} = \mathbf{x}^{\parallel} + \mathbf{x}^{\perp}$, where $\mathbf{x}^{\parallel}$ is parallel to $\mathbf{y}$ and $\mathbf{x}^{\perp}$ is orthogonal to $\mathbf{y}$. The vector $\mathbf{x}^{\parallel}$ is called the projection of $\mathbf{x}$ onto $\mathbf{y}$, written $\mbox{proj}_{\mathbf{y}}\mathbf{x}$, and is given by

\begin{align} \mbox{proj}_{\mathbf{y}}\mathbf{x}=\frac{\mathbf{x}\cdot\mathbf{y}}{\parallel\mathbf{y}\parallel^2}\,\mathbf{y} \end{align}
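
A short NumPy sketch of the formula (illustrative vectors):

```python
import numpy as np

x = np.array([2.0, 3.0])
y = np.array([1.0, 0.0])

# proj_y(x) = (x . y / ||y||^2) y
proj = (np.dot(x, y) / np.dot(y, y)) * y
print(proj)                 # [2. 0.]
print(np.dot(x - proj, y))  # 0.0: the remainder x - proj is orthogonal to y
```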

Hyperplane: A hyperplane through the origin in $\mathbb{R}^n$ is determined by a nonzero vector $\mathbf{a} \in \mathbb{R}^n$ that is orthogonal to all vectors on the hyperplane.
That is, the hyperplane is

\begin{align} \{\mathbf{x}\in\mathbb{R}^n : \mathbf{a}\cdot\mathbf{x}=0\} \end{align}

System of Linear Equations: A system of $m$ linear equations in the $n$ unknowns $x_1, x_2, ..., x_n$ is a collection of equations of the form

\begin{align} a_{11}x_1 + a_{12}x_2 + ... + a_{1n}x_n &= b_1 \\ a_{21}x_1 + a_{22}x_2 + ... + a_{2n}x_n &= b_2 \\ &\;\;\vdots \\ a_{m1}x_1 + a_{m2}x_2 + ... + a_{mn}x_n &= b_m \end{align}

where the coefficients $a_{ij}$ and right-hand sides $b_i$ are real numbers. In matrix form, this is $A\mathbf{x} = \mathbf{b}$.

Echelon (and Reduced Echelon) Form:
A matrix is in echelon form if:

  • The leading entries move to the right in successive rows.
  • The entries of the column below each leading entry are all 0.
  • All rows of 0's are at the bottom of the matrix.

A matrix is in reduced echelon form if, in addition to the conditions above:

  • Every leading entry is 1.
  • All the entries of the column above each leading entry are 0.

If a matrix is in echelon form, the leading entry of each nonzero row is called a pivot. Variables corresponding to columns that contain a pivot are called pivot variables, and the remaining variables are called free variables.
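
A sketch of computing the reduced echelon form with SymPy (the matrix is illustrative):

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [0, 0, 1]])

# rref() returns the reduced echelon form together with the pivot columns
R, pivot_cols = A.rref()
print(R)           # Matrix([[1, 2, 0], [0, 0, 1], [0, 0, 0]])
print(pivot_cols)  # (0, 2): pivots in columns 0 and 2; column 1 is free
```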

Consistent and Inconsistent Systems: A system of equations $A\mathbf{x} = \mathbf{b}$ is said to be consistent if it has at least one solution. A system of equations with no solution is said to be inconsistent.

Rank: The rank of a matrix $A$ is the number of nonzero rows (or the number of pivots) in any echelon form of $A$. It is usually denoted by $r$.
Consequently, the number of zero rows in echelon form is $m - r$, where $m$ is the total number of rows.
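
NumPy can compute the rank directly; reusing the illustrative matrix from the echelon-form sketch above:

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 0.0],
              [0.0, 0.0, 1.0]])

r = np.linalg.matrix_rank(A)
print(r)      # 2
print(3 - r)  # 1 zero row in any echelon form of A (m - r with m = 3)
```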

Nonsingular: An $n \times n$ matrix $A$ is called nonsingular if the rank of $A$ is $n$.

Matrix Multiplication:
Let $A$ be an $m\times n$ matrix and $B$ an $n\times p$ matrix. Their product $AB$ is an $m\times p$ matrix whose $(i,j)$ entry is

\begin{align} (AB)_{i,j}=row_i(A)\cdot col_j(B) \end{align}
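
A NumPy check of the entry formula (illustrative matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])       # 3 x 2
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])  # 2 x 3

C = A @ B                        # 3 x 3
# The (0, 2) entry is row 0 of A dotted with column 2 of B
print(C[0, 2])                   # 4.0
print(np.dot(A[0, :], B[:, 2]))  # 4.0, matching the definition
```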

Linear Transformation: A function $T: \mathbb{R}^n \to \mathbb{R}^m$ is called a $\mathit{linear}$ $\mathit{transformation}$ (or $\mathit{linear}$ $\mathit{map}$) if it satisfies the following $\mathit{linearity}$ $\mathit{properties}$:
(i) $T(\mathbf{x} + \mathbf{y}) = T(\mathbf{x}) + T(\mathbf{y})$ for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$.
(ii) $T(c\mathbf{x}) = cT(\mathbf{x})$ for all $\mathbf{x} \in \mathbb{R}^n$ and all scalars $c$.
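
Any map of the form $T(\mathbf{x}) = A\mathbf{x}$ is linear; a numerical spot-check of both properties (the values are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

def T(x):
    # The map x -> Ax
    return A @ x

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
c = 5.0

print(np.allclose(T(x + y), T(x) + T(y)))  # True: property (i)
print(np.allclose(T(c * x), c * T(x)))     # True: property (ii)
```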

Rotation Matrix: The matrix $A_\theta$ that rotates a vector in $\mathbb{R}^2$ counterclockwise through the angle $\theta$, given by:

\begin{align} A_\theta = \begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix} \end{align}
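
For example, rotating $\mathbf{e}_1$ by $90^\circ$:

```python
import numpy as np

theta = np.pi / 2  # 90 degrees counterclockwise
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])
print(A @ x)  # approximately [0. 1.]: e1 rotates to e2
```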

Left and Right Inverses: Given an $m \times n$ matrix $A$, an $n \times m$ matrix $B$ is called a right inverse of $A$ if

\begin{equation} AB = I_{m}. \end{equation}

Similarly, an $n \times m$ matrix $C$ is called a left inverse of $A$ if

\begin{equation} CA = I_{n}. \end{equation}
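
As a sketch: when $A$ has full row rank, the Moore-Penrose pseudoinverse produced by np.linalg.pinv is one right inverse (the matrix here is illustrative, and right inverses are not unique):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])       # 2 x 3 with full row rank

B = np.linalg.pinv(A)                 # 3 x 2
print(np.allclose(A @ B, np.eye(2)))  # True: AB = I_2, so B is a right inverse
```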

Invertible: An $n \times n$ matrix $A$ is invertible if there is an $n \times n$ matrix $B$ satisfying

\begin{align} AB = I_n \mbox{ and } BA = I_n. \end{align}

The matrix $B$ is usually denoted $A^{-1}$ (read "$A$ inverse").
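
A NumPy check with an invertible $2 \times 2$ matrix (illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # det(A) = 1, so A is invertible

B = np.linalg.inv(A)
print(np.allclose(A @ B, np.eye(2)))  # True
print(np.allclose(B @ A, np.eye(2)))  # True: B = A^{-1}
```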

Transpose: An $m \times n$ matrix $A$ with entries $a_{ij}$ has a transpose: the $n \times m$ matrix, denoted $A^{\top}$, whose $ij$ entry is $a_{ji}$. The properties of the transpose operation are listed below in the notes.


An example of a matrix next to its transpose is shown below:
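
\begin{align} A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}, \qquad A^{\top} = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix} \end{align}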

Symmetric: An $n \times n$ matrix $A$ is called symmetric if $A^{\top} = A$, and is called skew-symmetric if $A^{\top} = -A$.

An example of a symmetric and a skew-symmetric matrix:
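
\begin{align} S = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} = S^{\top}, \qquad K = \begin{bmatrix} 0 & 2 \\ -2 & 0 \end{bmatrix} = -K^{\top} \end{align}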

Subspace: A subset $V$ of $\mathbb{R}^n$, $V \subset \Bbb{R}^n$, is a subspace iff the following all hold.

1. $\mathbf{0} \in V$. This means that $V$ contains the zero vector.

2. For all $\mathbf{u} \in V$ and $\mathbf{v} \in V$, $\mathbf{u} + \mathbf{v} \in V$. This means that $V$ is closed under vector addition.

3. For all $\mathbf{u} \in V$ and $c \in \Bbb{R}$, $c\mathbf{u} \in V$. This means that $V$ is closed under scalar multiplication.


Nullspace: Let $A$ be an $m \times n$ matrix. The nullspace of $A$ is the set of solutions of the homogeneous system $A \mathbf{x} = \mathbf{0}$:

\begin{align} \mathbf{N}(A) = \{\mathbf{x} \in \mathbb{R}^n : A\mathbf{x} = \mathbf{0}\}. \end{align}
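
A sketch of computing a basis for the nullspace with SymPy (illustrative matrix):

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0]])

# nullspace() returns a basis for N(A), the solutions of A x = 0
for v in A.nullspace():
    print(v.T)  # Matrix([[-2, 1, 0]])
```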

Column Space: Let $A$ be an $m \times n$ matrix. The column space of $A$ is the subspace of $\mathbb{R}^m$ spanned by the columns of $A$:

\begin{equation} Col(A)=Span(col_1(A), col_2(A),...,col_n(A)). \end{equation}

Row Space: Let $A$ be an $m \times n$ matrix. The row space of $A$ is the subspace of $\mathbb{R}^n$ spanned by the rows of $A$:

\begin{equation} Row(A):=Span(row_1(A), row_2(A),...,row_m(A)). \end{equation}

Left Nullspace: Let $A$ be an $m \times n$ matrix. The left nullspace of $A$ is the set of solutions to the homogeneous system

\begin{align} N(A^{T})=\{\mathbf{x} \in \Bbb{R}^{m} : A^{T}\mathbf{x} = \mathbf{0}\} = \{\mathbf{x} \in \Bbb{R}^{m} : \mathbf{x}^{T}A = \mathbf{0}^{T}\} \end{align}

The left nullspace consists of the linear combinations of the rows of $A$ that result in the zero vector.

Linearly Independent:
Let

\begin{align} \{\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}\} \subset \Bbb{R}^{n}. \end{align}

This set of vectors is linearly independent if the only solution of

\begin{align} c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + ... + c_{k}\mathbf{v}_{k} = \mathbf{0} \end{align}

is

\begin{equation} c_{1} = c_{2} = ... = c_{k} = 0. \end{equation}

That is, the only way to produce $\mathbf{0}$ is the trivial linear combination

\begin{align} 0\mathbf{v}_{1} + ... + 0\mathbf{v}_{k}. \end{align}

Otherwise, the set is linearly dependent.
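
One practical test: stack the vectors as columns and compare the rank with the number of vectors. A NumPy sketch (illustrative vectors):

```python
import numpy as np

# The vectors v1, v2, v3 are the columns of V; they are linearly
# independent exactly when rank(V) equals the number of vectors.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

print(np.linalg.matrix_rank(V) == V.shape[1])  # False: v3 = v1 + v2
```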

Basis: Let $V\subset \mathbb{R}^n$ be a subspace and $\{\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}\} \subset V$ be a collection of vectors. We say $\{\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}\}$ is a basis of $V$ if both of the following conditions hold:

  1. $\{\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k}\}$ is a linearly independent set, and
  2. $V=Span(\mathbf{v}_{1}, \mathbf{v}_{2}, ..., \mathbf{v}_{k})$.

Dimension: The dimension of a subspace is the number of vectors in any basis of the subspace. By convention, we say that the dimension of $\{\mathbf{0}\}$ is 0.

Eigenvalue: Let $A$ be an $n\times n$ matrix. A scalar $\lambda\in\mathbb{R}$ is called an eigenvalue of $A$ if there exists a nonzero vector $\mathbf{v}$ satisfying $A\mathbf{v}=\lambda\mathbf{v}$.

Eigenvector: Let $A$ be an $n\times n$ matrix. A nonzero vector $\mathbf{v}\in\mathbb{R}^n$ is called an eigenvector of $A$ if there exists a scalar $\lambda$ satisfying $A\mathbf{v}=\lambda\mathbf{v}$.

Eigenspace: Let $A$ be an $n\times n$ matrix with eigenvalue $\lambda$. The eigenspace for $\lambda$ is given by $\mathbf{N}(A-\lambda I_n)$ where $I_n$ is the $n\times n$ identity matrix.

Characteristic Polynomial: Let $A$ be an $n\times n$ matrix. The characteristic polynomial is given by $\det(A-tI_n)$, where $t$ is a variable. We use the characteristic polynomial to find eigenvalues (by setting the polynomial equal to 0 and solving).
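
A NumPy example tying these four definitions together (illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# det(A - t I) = (2 - t)^2 - 1 = (t - 1)(t - 3), so the eigenvalues are 1 and 3
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))  # [1. 3.]

# Each column of `eigenvectors` is an eigenvector: A v = lambda v
v, lam = eigenvectors[:, 0], eigenvalues[0]
print(np.allclose(A @ v, lam * v))  # True
```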