On this page we keep a running list of important definitions from the course (updated by you). These definitions must be memorized, and you will be tested on them.
Set: A set is a collection of objects (mathematical or not) (p.9).
Vector: We define a vector in $\mathbb{R}^2$ to be an ordered pair of real numbers, $\mathbf{x}=(x_1,x_2)$. More generally, we define a vector in $\mathbb{R}^n$ to be an $n$-tuple of real numbers, $\mathbf{x}=(x_1,x_2,...,x_n)$.
Linear Combination: Let $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k\in\mathbb{R}^n$. We say $\mathbf{x}\in\mathbb{R}^n$ is a linear combination of $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$ if there exists $c_1, c_2, c_3,..., c_k$ in $\mathbb{R}$ so that $\mathbf{x}= c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots + c_k\mathbf{v}_k$.
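As a quick illustration (the helper and sample vectors below are my own, not from the course text), a linear combination can be computed componentwise:

```python
# Hypothetical helper: form c1*v1 + ... + ck*vk for vectors stored as tuples.
def linear_combination(coeffs, vectors):
    n = len(vectors[0])
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors)) for i in range(n))

# x = 2*(1, 2) + 3*(1, 0) = (5, 4), so (5, 4) is a linear combination of the two.
x = linear_combination([2, 3], [(1, 2), (1, 0)])
print(x)  # (5, 4)
```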
Span: Given a collection of vectors $\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k\in\mathbb{R}^n$, we say the span of these vectors is the set
$$\text{span}(\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k)=\{c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots +c_k\mathbf{v}_k : c_1, c_2,..., c_k\in\mathbb{R}\}.$$
Dot Product: Let $\mathbf{x}=(x_1,x_2,...,x_n)$ and $\mathbf{y}=(y_1,y_2,...,y_n)$ be elements of $\mathbb{R}^n$. Then, $\mathbf{x} \cdot \mathbf{y}=x_1y_1+x_2y_2+...+x_ny_n$.
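A minimal sketch of the dot product formula (the function name is my own):

```python
# Dot product of two vectors in R^n, following x·y = x1*y1 + ... + xn*yn.
def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

print(dot((1, 2, 3), (4, 5, 6)))  # 1*4 + 2*5 + 3*6 = 32
```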
Orthogonal: We say vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^n$ are orthogonal if $\mathbf{x} \cdot \mathbf{y} = 0$ (p.20).
$\text{proj}_{\mathbf{y}}\mathbf{x}$: For any $\mathbf{x} \in \mathbb{R}^n$ and $\mathbf{y} \in \mathbb{R}^n$ with $\mathbf{y}\ne \mathbf{0}$ we get the vector
$$\text{proj}_{\mathbf{y}}\mathbf{x}=\frac{\mathbf{x}\cdot\mathbf{y}}{\mathbf{y}\cdot\mathbf{y}}\,\mathbf{y},$$
and call this the projection of $\mathbf{x}$ onto $\mathbf{y}$.
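A small worked example of the projection formula (helper names are my own choices):

```python
# proj_y(x) = ((x·y)/(y·y)) y, defined whenever y is nonzero.
def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def proj(x, y):
    scale = dot(x, y) / dot(y, y)
    return tuple(scale * yi for yi in y)

# Projecting (3, 4) onto (1, 0) keeps only the first coordinate.
print(proj((3, 4), (1, 0)))  # (3.0, 0.0)
```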
Hyperplane: We say a hyperplane is a collection of vectors $\mathbf{x}=(x_1,x_2,...,x_n)\in\mathbb{R}^n$ satisfying $a_1x_1+a_2x_2+\cdots +a_nx_n = c$ for a fixed nonzero $\mathbf{a}=(a_1, a_2,..., a_n)\in\mathbb{R}^n$ and $c\in\mathbb{R}$. Written slightly differently, if $\mathbf{a} \in \mathbb{R}^n$ is a nonzero vector and $c \in \mathbb{R}$, then the equation $\mathbf{a} \cdot \mathbf{x} = c$ defines a hyperplane in $\mathbb{R}^n$.
System of Linear Equations: A system of $m$ linear equations in $n$ variables is
$$\begin{aligned} a_{11}x_1+a_{12}x_2+\cdots +a_{1n}x_n &= b_1\\ a_{21}x_1+a_{22}x_2+\cdots +a_{2n}x_n &= b_2\\ &\ \ \vdots\\ a_{m1}x_1+a_{m2}x_2+\cdots +a_{mn}x_n &= b_m. \end{aligned}$$
In shorthand notation, we write $A\mathbf{x}=\mathbf{b}$.
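To see the shorthand $A\mathbf{x}=\mathbf{b}$ in action, here is a small check of a claimed solution (the matrix, right-hand side, and helper name are my own examples):

```python
# Compute Ax for a matrix stored as a list of rows, and check a claimed solution.
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 1], [1, -1]]   # the system x1 + x2 = 3, x1 - x2 = 1
b = [3, 1]
x = [2, 1]
print(matvec(A, x) == b)  # True: (2, 1) solves the system
```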
Echelon (and Reduced Echelon) Form:
A matrix is in echelon form if
- The leading entries move from left to right in successive rows
- The entries in the column below each leading entry are all zero
- All zero rows are at the bottom of the matrix
A matrix is in Reduced Echelon Form (rref) if:
- It is in echelon form
- Every leading entry equals one
- All entries in the column above each leading entry equal 0.
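The bullet points above can be sketched as a row-reduction routine (a minimal sketch; the function name, tolerance, and example matrix are my own, not the course's algorithm): pick a pivot, scale its leading entry to 1, and clear its column above and below.

```python
# Minimal reduced-echelon-form sketch for a matrix given as a list of rows.
def rref(M, eps=1e-12):
    M = [[float(v) for v in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        pr = next((r for r in range(pivot_row, rows) if abs(M[r][col]) > eps), None)
        if pr is None:
            continue  # no pivot in this column
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        lead = M[pivot_row][col]
        M[pivot_row] = [v / lead for v in M[pivot_row]]  # leading entry becomes 1
        for r in range(rows):
            if r != pivot_row:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return [[v + 0.0 for v in row] for row in M]  # normalize any -0.0

# Augmented matrix for x1 + x2 = 3, x1 - x2 = 1; rref reads off x1 = 2, x2 = 1.
print(rref([[1, 1, 3], [1, -1, 1]]))  # [[1.0, 0.0, 2.0], [0.0, 1.0, 1.0]]
```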
Consistent and Inconsistent Systems: If the system of equations $A\mathbf{x}=\mathbf{b}$ has no solutions, the system is said to be inconsistent; if it has at least one solution, then it is said to be consistent.
Rank: Given any matrix, $A$, we say the rank of $A$ (r($A$), rk($A$), rank($A$)) is the number of non-zero rows in any echelon form of $A$.
Nonsingular: An $n\times n$ matrix, $A$, is called nonsingular if rank($A$) = $n$. $A$ is singular if rank($A$) $< n$.
Matrix Multiplication: Let $A$ be an $m\times n$ matrix and $B$ be an $n\times p$ matrix. Then $AB$ is defined by $(AB)_{ij}$ = $row_i(A)\cdot column_j(B)$.
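A short sketch of the row-times-column rule (the helper name and sample matrices are my own):

```python
# (AB)_ij = row_i(A) · col_j(B); A is m×n, B is n×p, stored as lists of rows.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```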
Linear Transformation: A function $T: \mathbb{R}^n\to \mathbb{R}^m$ is called a linear transformation (or linear map) if it satisfies:
- $T(\mathbf{x} + \mathbf{y}) = T(\mathbf{x}) + T(\mathbf{y})$ for all $\mathbf{x}, \mathbf{y}\in \mathbb{R}^n$
- $T(c\mathbf{x})= cT(\mathbf{x})$ for all $\mathbf{x}\in \mathbb{R}^n$ and all scalars $c\in\mathbb{R}$.
These are often called the linearity properties.
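The two linearity properties can be spot-checked numerically; the map $T(x, y) = (x + y, 2x)$ below is my own example of a linear map, and the inputs are arbitrary samples:

```python
# A sample linear map T(x, y) = (x + y, 2x); both linearity properties
# hold for it, as these spot checks on sample inputs suggest.
def T(v):
    x, y = v
    return (x + y, 2 * x)

u, w, c = (1, 2), (3, -1), 5
u_plus_w = tuple(a + b for a, b in zip(u, w))
print(T(u_plus_w) == tuple(a + b for a, b in zip(T(u), T(w))))  # True
print(T(tuple(c * a for a in u)) == tuple(c * b for b in T(u)))  # True
```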
Rotation Matrix: The rotation matrix is the $2\times 2$ matrix that rotates vectors in the $xy$-plane counterclockwise through an angle $\theta$ about the origin, and is given by:
$$A_\theta=\begin{bmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{bmatrix}$$
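A quick sanity check of the rotation matrix (the helper name is my own): rotating $(1, 0)$ counterclockwise by $90°$ should give $(0, 1)$.

```python
import math

# Apply the rotation matrix [[cos t, -sin t], [sin t, cos t]] to a vector in R^2.
def rotate(theta, v):
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

x, y = rotate(math.pi / 2, (1.0, 0.0))
print(round(x, 10), round(y, 10))  # 0.0 1.0
```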
Left and Right Inverses: If $A$ is an $m\times n$ matrix, then we say $B$ is a left inverse of $A$ if $B$ is an $n\times m$ matrix with $BA=I_n$. We say $C$ is a right inverse if $C$ is an $n\times m$ matrix with $AC=I_m$.
Invertible: An $n\times n$ matrix, $A$, is invertible if there is some $n\times n$ matrix, $B$, with $AB = BA = I_n$. We usually denote the inverse of $A$ by $A^{-1}$.
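For $2\times 2$ matrices the inverse has a closed form (the adjugate formula, stated here without derivation; the helper name and example are my own):

```python
# Inverse of a 2×2 matrix [[a, b], [c, d]], valid when det = ad - bc ≠ 0.
def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]   # det = 1
print(inverse_2x2(A))  # [[1.0, -1.0], [-1.0, 2.0]]
```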
Transpose: If $A$ is an $m\times n$ matrix, the transpose of $A$, denoted $A^T$, is the $n\times m$ matrix defined by $(A^T)_{ij}=(A)_{ji}$. In other words, the $i$th row of $A^T$ is the $i$th column of $A$.
Symmetric: An $n\times n$ matrix, $A$, is called symmetric if $A^T=A$.
Subspace: A subset $V\subseteq\mathbb{R}^n$ is called a subspace if the following all hold:
- $\mathbf{0}\in V$.
- if $\mathbf{x}\in V$ and $c\in\mathbb{R}$, then $c\mathbf{x}\in V$.
- if $\mathbf{x}\in V$ and $\mathbf{y}\in V$, then $\mathbf{x}+\mathbf{y}\in V$.
Nullspace: Let $A$ be an $m\times n$ matrix. The nullspace of $A$ is the set of solutions of the homogeneous system $A\mathbf{x}=\mathbf{0}$:
$$N(A)=\{\mathbf{x}\in\mathbb{R}^n : A\mathbf{x}=\mathbf{0}\}.$$
Column Space: Let $A$ be an $m\times n$ matrix. Then the column space of $A$ is given by:
$$C(A)=\{A\mathbf{x} : \mathbf{x}\in\mathbb{R}^n\}\subseteq\mathbb{R}^m,$$
the span of the columns of $A$.
Row Space: Let $A$ be an $m\times n$ matrix. Then the row space of $A$ is given by:
$$R(A)=C(A^T)\subseteq\mathbb{R}^n,$$
the span of the rows of $A$.
Left Nullspace: Let $A$ be an $m\times n$ matrix. Then the left nullspace of $A$ is given by:
$$N(A^T)=\{\mathbf{y}\in\mathbb{R}^m : A^T\mathbf{y}=\mathbf{0}\}.$$
Linearly Independent: The set of vectors {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is linearly independent if and only if:
If $c_1\mathbf{v}_1+c_2\mathbf{v}_2+...+c_k\mathbf{v}_k = \mathbf{0}$ then $c_1 = c_2 = \cdots = c_k = 0$
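For two vectors in $\mathbb{R}^2$, independence is equivalent to $v_1w_2 - v_2w_1 \ne 0$ (the $2\times 2$ determinant criterion, a standard fact stated here without proof; the helper and examples are my own):

```python
# Two vectors in R^2 are linearly independent iff v1*w2 - v2*w1 ≠ 0.
def independent_2d(v, w):
    return v[0] * w[1] - v[1] * w[0] != 0

print(independent_2d((1, 2), (3, 4)))  # True:  neither is a multiple of the other
print(independent_2d((1, 2), (2, 4)))  # False: (2, 4) = 2*(1, 2)
```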
Basis: Let $V\subseteq\mathbb{R}^n$ be a subspace. The set {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is a basis for $V$ if and only if:
- $V=\text{span}(\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k)$, and
- {$\mathbf{v}_1, \mathbf{v}_2,...,\mathbf{v}_k$} is linearly independent.
Dimension: Let $V\subseteq\mathbb{R}^n$ be a subspace. The dimension of $V$ is the number of elements in any basis of $V$.
Nullity: The dimension of the nullspace of $A$ is often called the nullity of $A$, which is denoted null($A$).
Eigenvalue and Eigenvector: If $A$ is an $n\times n$ matrix, then a nonzero vector $\mathbf{v}\in\mathbb{R}^n$ is called an eigenvector of $A$ if $A\mathbf{v} = \lambda \mathbf{v}$ for some $\lambda \in \mathbb{R}$. In this case we call $\lambda$ an eigenvalue of $A$.
Characteristic Polynomial: Let $A$ be a square matrix. Then $p(t) = p_A(t) = \det(A - tI)$ is called the characteristic polynomial of $A$.
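For a $2\times 2$ matrix, $\det(A - tI)$ expands to $t^2 - (\operatorname{tr} A)\,t + \det A$, and its roots are the eigenvalues; a minimal sketch (the helper name and example matrix are my own):

```python
# For A = [[a, b], [c, d]], det(A - tI) = t^2 - (a + d) t + (ad - bc).
def char_poly_2x2(A):
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    return lambda t: t * t - trace * t + det

p = char_poly_2x2([[2, 1], [1, 2]])  # p(t) = t^2 - 4t + 3 = (t - 1)(t - 3)
print(p(1), p(3))  # 0 0 : the roots 1 and 3 are the eigenvalues
```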