Definitions

On this page we will keep a running list of important definitions from the course (updated by you). These definitions must be memorized, and you will be tested on them.

Set: A set is a collection of objects (mathematical or not) (p.9).

Vector: We define a vector in $\mathbb{R}^2$ to be an ordered pair of real numbers, $\mathbf{x}=(x_1,x_2)$. More generally, we define a vector in $\mathbb{R}^n$ to be an $n$-tuple of real numbers, $\mathbf{x}=(x_1,x_2,...,x_n)$.

Linear Combination: Let $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k\in\mathbb{R}^n$. We say $\mathbf{x}\in\mathbb{R}^n$ is a linear combination of $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$ if there exist $c_1, c_2,..., c_k$ in $\mathbb{R}$ so that $\mathbf{x}= c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots + c_k\mathbf{v}_k$.

Span: Given a collection of vectors $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k\in\mathbb{R}^n$, we say the span of these vectors is the set

(1)
\begin{align} \mbox{span}(\mathbf{v}_1, \mathbf{v}_2,..., \mathbf{v}_k) = \left\{\mathbf{x}\in\mathbb{R}^n \big|\ \mathbf{x}=c_1\mathbf{v}_1+c_2\mathbf{v}_2+...+c_k\mathbf{v}_k \mbox{ for some } c_1, c_2,..., c_k\in\mathbb{R}\right\} \end{align}
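As a quick numerical check (the vectors here are my own illustration, not from the course), a vector $\mathbf{x}$ lies in span$(\mathbf{v}_1, \mathbf{v}_2)$ exactly when the system $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 = \mathbf{x}$ has an exact solution, which we can test with least squares:

```python
import numpy as np

# Hand-picked vectors in R^3 (illustration only).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
x  = np.array([2.0, 3.0, 5.0])   # candidate vector

# Stack v1, v2 as columns; x is in span(v1, v2) exactly when
# the least-squares solution of V c = x has zero residual.
V = np.column_stack([v1, v2])
c, residual, rank, _ = np.linalg.lstsq(V, x, rcond=None)

print(c)                          # coefficients c1, c2
print(np.allclose(V @ c, x))      # True: x = 2*v1 + 3*v2
```

Here $\mathbf{x} = 2\mathbf{v}_1 + 3\mathbf{v}_2$, so $\mathbf{x}$ is a linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$.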

Dot Product: Let $\mathbf{x}=(x_1,x_2,...,x_n)$ and $\mathbf{y}=(y_1,y_2,...,y_n)$ be elements of $\mathbb{R}^n$. Then, $\mathbf{x} \cdot \mathbf{y}=x_1y_1+x_2y_2+...+x_ny_n$.

Orthogonal: We say vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^n$ are orthogonal if $\mathbf{x} \cdot \mathbf{y} = 0$ (p.20).
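A small sketch of both definitions (the example vectors are my own):

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([4, -5, 2])

print(np.dot(x, y))          # 1*4 + 2*(-5) + 3*2 = 0
print(np.dot(x, y) == 0)     # True: x and y are orthogonal
```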

$\mbox{proj}_\mathbf{y}\mathbf{x}$: For any $\mathbf{x} \in \mathbb{R}^n$ and $\mathbf{y} \in \mathbb{R}^n$ with $\mathbf{y}\ne \mathbf{0}$ we get the vector:

(2)
\begin{align} \mbox{proj}_\mathbf{y}\mathbf{x} = \left( \frac{\mathbf{x} \cdot \mathbf{y}}{ ||\mathbf{y}||^2}\right)\mathbf{y} \end{align}

and call this the projection of $\mathbf{x}$ onto $\mathbf{y}$.
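The formula translates directly into code; a useful sanity check (with vectors of my own choosing) is that the leftover piece $\mathbf{x} - \mbox{proj}_\mathbf{y}\mathbf{x}$ is orthogonal to $\mathbf{y}$:

```python
import numpy as np

def proj(x, y):
    """Projection of x onto a nonzero vector y: ((x.y)/||y||^2) y."""
    return (np.dot(x, y) / np.dot(y, y)) * y

x = np.array([3.0, 4.0])
y = np.array([1.0, 0.0])

p = proj(x, y)
print(p)                                   # [3. 0.]
# The residual x - p is orthogonal to y:
print(np.isclose(np.dot(x - p, y), 0.0))   # True
```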

Hyperplane: We say a hyperplane is a collection of vectors $\mathbf{x}=(x_1,x_2,...,x_n)\in\mathbb{R}^n$ satisfying: $a_1x_1+a_2x_2+\cdots +a_nx_n = c$ for a fixed nonzero $\mathbf{a}=(a_1, a_2,..., a_n)\in\mathbb{R}^n$ and $c\in\mathbb{R}$. Written slightly differently, if $\mathbf{a} \in \mathbb{R}^n$ is a nonzero vector and ${c} \in \mathbb{R}$, then the equation $\mathbf{a} \cdot \mathbf{x} = {c}$ defines a hyperplane in $\mathbb{R}^n$.
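Checking whether a point lies on a hyperplane is just one dot product (the choices of $\mathbf{a}$ and $c$ below are mine, for illustration):

```python
import numpy as np

# Hyperplane a.x = c in R^3.
a = np.array([1.0, 2.0, -1.0])
c = 4.0

on_plane  = np.array([1.0, 2.0, 1.0])   # 1 + 4 - 1 = 4
off_plane = np.array([0.0, 0.0, 0.0])   # 0 != 4

print(np.dot(a, on_plane) == c)    # True
print(np.dot(a, off_plane) == c)   # False
```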

System of Linear Equations: A system of $m$ linear equations in $n$ variables is

(3)
\begin{align} a_{11}x_1 + a_{12}x_2 + ... +a_{1n}x_n = b_1\\ a_{21}x_1 + a_{22}x_2 + ... + a_{2n}x_n = b_2\\ \vdots\hspace{1in}\\ a_{m1}x_1 + a_{m2}x_2 + ... + a_{mn}x_n = b_m \end{align}

In shorthand notation, we write $A\mathbf{x}=\mathbf{b}$.
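In the shorthand form $A\mathbf{x}=\mathbf{b}$, a system with a square nonsingular coefficient matrix can be solved directly; the $2\times 2$ system below is my own example:

```python
import numpy as np

# Example system:  x1 + 2*x2 = 5,   3*x1 - x2 = 1
A = np.array([[1.0,  2.0],
              [3.0, -1.0]])
b = np.array([5.0, 1.0])

x = np.linalg.solve(A, b)    # works when A is square and nonsingular
print(x)                     # [1. 2.]
print(np.allclose(A @ x, b)) # True: the solution satisfies every equation
```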

Echelon (and Reduced Echelon) Form:

A matrix is in echelon form if

1. The leading entry of each row lies strictly to the right of the leading entry of the row above
2. The entries in the column below each leading entry are all zero
3. All zero rows are at the bottom of the matrix

A matrix is in Reduced Echelon Form (rref) if:

1. It is in echelon form
2. Every leading entry equals one
3. All entries in the column above each leading entry equal 0.

Consistent and Inconsistent Systems: If the system of equations $A\mathbf{x}=\mathbf{b}$ has no solutions, the system is said to be inconsistent; if it has at least one solution, then it is said to be consistent.

Rank: Given any matrix, $A$, we say the rank of $A$ (written r($A$), rk($A$), or rank($A$)) is the number of non-zero rows in any echelon form of $A$.

Nonsingular: An $n\times n$ matrix, $A$, is called nonsingular if rank($A$) = $n$. $A$ is singular if rank($A$) $< n$.
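numpy computes rank numerically (via singular values rather than row reduction), but the answer agrees with counting non-zero rows in an echelon form; the matrices below are my own examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row = 2 * first row
B = np.array([[1.0, 2.0],
              [0.0, 3.0]])

print(np.linalg.matrix_rank(A))   # 1 -> rank < n, so A is singular
print(np.linalg.matrix_rank(B))   # 2 -> rank = n, so B is nonsingular
```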

Matrix Multiplication: Let $A$ be an $m\times n$ matrix and $B$ be an $n\times p$ matrix. Then $AB$ is defined by $(AB)_{ij}$ = $row_i(A)\cdot column_j(B)$.
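The entrywise definition $(AB)_{ij} = \mbox{row}_i(A)\cdot \mbox{col}_j(B)$ can be checked directly against numpy's built-in product (the matrices are my own examples):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])      # 3x2

# Build AB entry by entry from the definition: (AB)_{ij} = row_i(A) . col_j(B)
m, n = A.shape
_, p = B.shape
AB = np.zeros((m, p), dtype=int)
for i in range(m):
    for j in range(p):
        AB[i, j] = np.dot(A[i, :], B[:, j])

print(AB)                          # [[ 4  5] [10 11]]
print(np.array_equal(AB, A @ B))   # True: matches the built-in product
```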

Linear Transformation: A function $T: \mathbb{R}^n\to \mathbb{R}^m$ is called a linear transformation (or linear map) if it satisfies:

1. $T(\mathbf{x} + \mathbf{y}) = T(\mathbf{x}) + T(\mathbf{y})$ for all $\mathbf{x}, \mathbf{y}\in \mathbb{R}^n$
2. $T(c\mathbf{x})= cT(\mathbf{x})$ for all $\mathbf{x}\in \mathbb{R}^n$ and all scalars $c\in\mathbb{R}$.

These are often called the linearity properties.
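Every matrix $A$ gives a linear map $T(\mathbf{x}) = A\mathbf{x}$, and the two linearity properties can be spot-checked numerically ($A$, $\mathbf{x}$, $\mathbf{y}$, and $c$ below are arbitrary choices of mine):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: A @ v        # the linear map T(x) = A x

x = np.array([1.0, 2.0])
y = np.array([-1.0, 4.0])
c = 5.0

print(np.allclose(T(x + y), T(x) + T(y)))   # True: additivity
print(np.allclose(T(c * x), c * T(x)))      # True: homogeneity
```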

Rotation Matrix: The rotation matrix is a $2\times 2$ matrix that rotates vectors in the $xy$-plane counterclockwise about the origin through an angle $\theta$, and is given by:

(4)
\begin{align} \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix} \end{align}
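A quick sketch: rotating $\mathbf{e}_1 = (1,0)$ by $90^\circ$ should give $(0,1)$.

```python
import numpy as np

def rotation(theta):
    """2x2 counterclockwise rotation matrix through angle theta (radians)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R = rotation(np.pi / 2)                      # rotate 90 degrees
print(np.round(R @ np.array([1.0, 0.0])))    # [0. 1.]: e1 maps to e2
```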

Left and Right Inverses: If $A$ is an $m\times n$ matrix, then we say $B$ is a left inverse of $A$ if $B$ is an $n\times m$ matrix with $BA=I_n$. We say $C$ is a right inverse if $C$ is an $n\times m$ matrix with $AC=I_m$.

Invertible: An $n\times n$ matrix, $A$, is invertible if there is some $n\times n$ matrix, $B$, with $BA = I_n$. We usually denote the inverse of $A$ by $A^{-1}$.
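With a small invertible matrix of my own choosing, numpy's inverse illustrates the definition (and the fact that for square matrices a one-sided inverse works on both sides):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])    # det = -2, so A is invertible

B = np.linalg.inv(A)
print(np.allclose(B @ A, np.eye(2)))   # True: B A = I_2
print(np.allclose(A @ B, np.eye(2)))   # True: for square A, BA = I forces AB = I
```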

Transpose: If $A$ is an $m\times n$ matrix, the transpose of $A$, denoted $A^T$, is the $n\times m$ matrix defined by $(A^T)_{ij}=(A)_{ji}$. In other words, the $i$th row of $A^T$ is the $i$th column of $A$.

Symmetric: An $n\times n$ matrix, $A$, is called symmetric if $A^T=A$.
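A sketch of both definitions (my own example matrix; note that $A^TA$ is always symmetric, which is a handy way to build symmetric examples):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.T.shape)                         # (3, 2): transpose swaps rows and columns
print(np.array_equal(A.T[0], A[:, 0]))   # True: 1st row of A^T is 1st column of A

S = A.T @ A                              # A^T A is symmetric
print(np.array_equal(S.T, S))            # True
```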

Subspace: A subset $V\subseteq\mathbb{R}^n$ is called a subspace if the following all hold:

1. $\mathbf{0}\in V$.
2. if $\mathbf{x}\in V$ and $c\in\mathbb{R}$, then $c\mathbf{x}\in V$.
3. if $\mathbf{x}\in V$ and $\mathbf{y}\in V$, then $\mathbf{x}+\mathbf{y}\in V$.

Nullspace: Let $A$ be an $m\times n$ matrix. The nullspace of $A$ is the set of solutions of the homogeneous system $A\mathbf{x}=\mathbf{0}$:

(5)
\begin{align} N(A)= \left\{\mathbf{x}\in \mathbb{R}^n \mid A\mathbf{x}=\mathbf{0}\right\} \end{align}
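Membership in $N(A)$ is a one-line check: multiply and compare with $\mathbf{0}$ (the matrix and vector below are hand-picked for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # rank 1, so N(A) is 2-dimensional

x = np.array([3.0, 0.0, -1.0])      # each row gives 1*3 + 3*(-1) = 0
print(np.allclose(A @ x, 0))        # True: x is in N(A)
```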

Column Space: Let $A$ be an $m\times n$ matrix. Then the column space of $A$ is given by:

(6)
\begin{align} C(A)= Span(col_1(A),...,col_n(A))\subseteq\mathbb{R}^m \end{align}

Row Space: Let $A$ be an $m\times n$ matrix. Then the row space of $A$ is given by:

(7)
\begin{align} R(A)= Span(row_1(A),...,row_m(A))\subseteq\mathbb{R}^n \end{align}

Left Nullspace: Let $A$ be an $m\times n$ matrix. Then the left nullspace of $A$ is given by:

(8)
\begin{align} N(A^T)= \left\{\mathbf{x}\in \mathbb{R}^m \mid A^T\mathbf{x}=\mathbf{0}\right\}=\left\{\mathbf{x}\in \mathbb{R}^m \mid \mathbf{x}^T A=\mathbf{0}\right\} \end{align}

Linearly Independent: The set of vectors {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is linearly independent if and only if:

If $c_1\mathbf{v}_1+c_2\mathbf{v}_2+...+c_k\mathbf{v}_k = \mathbf{0}$ then $c_1 = c_2 = \cdots = c_k = 0$
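A practical test (using vectors of my own choosing): the vectors are independent exactly when the matrix with them as columns has rank $k$.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2                       # deliberately dependent on v1, v2

# Independent <=> rank of the column matrix equals the number of vectors.
print(np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2)       # True: independent
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) == 3)   # False: dependent
```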

Basis: Let $V\subseteq\mathbb{R}^n$ be a subspace. The set {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is a basis for $V$ if and only if:

1. $V$ = span ($\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$), and
2. {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is linearly independent.

Dimension: Let $V\subseteq\mathbb{R}^n$ be a subspace. The dimension of $V$ is the number of elements in any basis of $V$.

Nullity: The dimension of the nullspace of $A$ is often called the nullity of $A$, which is denoted null($A$).

Eigenvalue and Eigenvector: If $A$ is an $n\times n$ matrix, then a nonzero vector $\mathbf{v}$ is called an eigenvector of $A$ if $A\mathbf{v} = \lambda \mathbf{v}$ for some $\lambda \in \mathbb{R}$. In this case we call $\lambda$ an eigenvalue of $A$.

Characteristic Polynomial: Let $A$ be a square matrix. Then $p(t) = p_A(t) = \det(A - tI)$ is called the characteristic polynomial of $A$.
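Tying the last two definitions together with a small example of my own: for $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$, the characteristic polynomial is $p(t)=\det(A-tI)=(2-t)^2-1=t^2-4t+3=(t-1)(t-3)$, so the eigenvalues are $1$ and $3$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenpairs: A v = lambda v
vals, vecs = np.linalg.eig(A)
for lam, v in zip(vals, vecs.T):         # eigenvectors are the COLUMNS of vecs
    print(np.allclose(A @ v, lam * v))   # True for each pair

# The eigenvalues are the roots of p(t) = det(A - tI) = (t - 1)(t - 3).
print(np.sort(np.round(vals)))           # [1. 3.]
```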

page revision: 134, last edited: 16 Nov 2012 00:25