Applications include:
- differential equations and dynamical systems
- Markov chains
- circuits
- social network analyses
- frequency analysis
- Google’s Page Rank algorithm
- machine learning
2/8/2020
Example: if \(v = (42, \pi, 0, -e)\) then \(\dim(v) = 4\)
Example: if \(v_1 = (4,2,1)\), \(v_2 = (0, -1, 3)\), and \(c=-3\) then \[ \begin{align} v_1 + v_2 &= (4, 1, 4) \\ cv_2 &= (0, 3, -9) \end{align} \]
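The arithmetic above is easy to verify numerically; a minimal sketch, assuming numpy is available:

```python
import numpy as np

v1 = np.array([4, 2, 1])
v2 = np.array([0, -1, 3])
c = -3

v_sum = v1 + v2   # componentwise addition: (4, 1, 4)
cv2 = c * v2      # scalar multiplication:  (0, 3, -9)
```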
Example: The vector \((4,3)\) in the Euclidean basis is written as such because \[ \begin{pmatrix} 4 \\ 3 \end{pmatrix} = 4\begin{pmatrix} 1 \\ 0 \end{pmatrix} + 3\begin{pmatrix} 0 \\ 1\end{pmatrix} \] Say we want to change basis to using \(v_1 = (1,2)\) and \(v_2 = (1.5, 0.5)\) instead. Observe, \[ \begin{pmatrix} 4 \\ 3 \end{pmatrix} = 1\begin{pmatrix} 1 \\ 2 \end{pmatrix} + 2\begin{pmatrix} 1.5 \\ 0.5 \end{pmatrix} \] so \((4,3)\) is actually written as \((1,2)\) in this new basis.
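Finding the new coordinates amounts to solving a linear system: if the columns of \(B\) are the new basis vectors, then the new coordinates \(c\) satisfy \(Bc = x\). A sketch in numpy:

```python
import numpy as np

# Columns of B are the new basis vectors v1 = (1, 2) and v2 = (1.5, 0.5).
B = np.array([[1.0, 1.5],
              [2.0, 0.5]])
x = np.array([4.0, 3.0])  # coordinates in the Euclidean basis

# Solve B c = x for the coordinates of x in the new basis.
c = np.linalg.solve(B, x)  # (1, 2), matching the example above
```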
Example: we can write the general form of a \(3\times 2\) matrix \(A\) as
\[ A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{pmatrix} . \]
Example: if \(A = \begin{pmatrix} 3 & 1 \\ -1 & 5 \end{pmatrix}\) and \(v = \begin{pmatrix} 3 \\ -2 \end{pmatrix}\) then \[ Av = 3\begin{pmatrix} 3 \\ -1 \end{pmatrix} - 2\begin{pmatrix} 1 \\ 5 \end{pmatrix} = \begin{pmatrix} 9 \\ -3 \end{pmatrix} + \begin{pmatrix} -2 \\ -10 \end{pmatrix} = \begin{pmatrix} 7 \\ -13 \end{pmatrix} \]
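The computation above views \(Av\) as a linear combination of the columns of \(A\). A sketch (assuming numpy), taking \(A\) to have columns \((3,-1)\) and \((1,5)\) and \(v = (3,-2)\) as in the arithmetic above:

```python
import numpy as np

A = np.array([[3, 1],
              [-1, 5]])
v = np.array([3, -2])

# Matrix-vector product two ways: built-in, and as a
# linear combination of A's columns weighted by v.
Av = A @ v
by_columns = v[0] * A[:, 0] + v[1] * A[:, 1]  # both give (7, -13)
```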
Example: Let \(v\) be any 2-dimensional vector and for \(0\leq \theta < 2\pi\) define \[ R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}. \] Then \(Rv\) is the vector \(v\) rotated \(\theta\) radians counterclockwise. Hence a \(90^\circ\) rotation is performed using \(R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\).
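The \(90^\circ\) case is easy to check numerically; a minimal sketch with numpy, rotating \((1,0)\) onto \((0,1)\):

```python
import numpy as np

theta = np.pi / 2  # 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
Rv = R @ v  # counterclockwise rotation: (1, 0) maps to (0, 1)
```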
For each observation \(i\), the linear model is \[ y_i = \beta_0 + \beta_1x_{i1} + \beta_2x_{i2} + \cdots + \beta_px_{ip} + \varepsilon_i \]
\[ \begin{align} y_1 &= \beta_0 + \beta_1 x_{11} + \beta_2 x_{12} + \cdots + \beta_p x_{1p} + \varepsilon_1\\ y_2 &= \beta_0 + \beta_1 x_{21} + \beta_2 x_{22} + \cdots + \beta_p x_{2p} + \varepsilon_2\\ &\hspace{2mm}\vdots\\ y_n &= \beta_0 + \beta_1 x_{n1} + \beta_2 x_{n2} + \cdots + \beta_p x_{np} + \varepsilon_n \end{align} \]
Yuk!
In matrix form this is just \(y = X\beta + \varepsilon\), and the least-squares estimate is \[ \hat \beta = (X^T X)^{-1} X^T y \]
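The normal-equations formula can be sketched directly in numpy. The simulated data below (sample size, coefficients, noise level) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 2
# Design matrix with an intercept column of ones.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + 0.01 * rng.normal(size=n)

# Least-squares estimate via the normal equations.
# (Fine for small, well-conditioned X; np.linalg.lstsq is
# numerically preferable in general.)
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
```

With this little noise, `beta_hat` recovers the true coefficients closely.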
ANOVA is just a linear model with 0s and 1s in the design matrix.
Example: Consider an ANOVA model with 3 treatment groups on 6 individuals. Taking \(\beta_0\) to be the mean response for the reference group, our model is
\[ y_{i} = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_{i}, \] where \(x_{i1}\) and \(x_{i2}\) indicate membership in the second and third groups, or, equivalently, \(y = X\beta + \varepsilon\) where
\[ X = \begin{pmatrix} 1 & 0 & 0\\ 1 & 0 & 0\\ 1 & 1 & 0\\ 1 & 1 & 0\\ 1 & 0 & 1\\ 1 & 0 & 1\\ \end{pmatrix} \]
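Fitting this ANOVA model is then the same least-squares computation as before. A sketch in numpy, with made-up responses chosen so the three group means are 5, 8, and 3:

```python
import numpy as np

# Design matrix from the example: 6 individuals, 3 groups of 2,
# with the first group as the reference.
X = np.array([[1, 0, 0],
              [1, 0, 0],
              [1, 1, 0],
              [1, 1, 0],
              [1, 0, 1],
              [1, 0, 1]], dtype=float)

# Hypothetical responses: group means are 5, 8, and 3.
y = np.array([5.0, 5.0, 8.0, 8.0, 3.0, 3.0])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta_hat = (5, 3, -2): the reference-group mean,
# then each remaining group's offset from it.
```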
The Essence of Linear Algebra series on YouTube (by 3Blue1Brown) is a great visual tutorial on linear algebra.
Setosa has more interactive webpages, including one for eigenvectors and eigenvalues.