Note 1: ETH::A&D
Deck: ETH::A&D
Note Type: Horvath Classic
GUID: w{ro)4tDv:
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Pseudoinverse of \(A = \begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \end{bmatrix}\) (note it's already in the SVD form)?
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Pseudoinverse of \(A = \begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \end{bmatrix}\) (note it's already in the SVD form)?
Already in “SVD form” with \(U = I_2\), \(V = I_3\), and \(\Sigma = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \end{pmatrix}\). The pseudoinverse is: \[A^+ = \begin{pmatrix} \frac{1}{3} & 0 \\ 0 & \frac{1}{2} \\ 0 & 0 \end{pmatrix}\] Notice:
- Shape flipped: \(A\) is \(2\times3\), so \(A^+\) is \(3\times2\)
- Nonzero values inverted: \(3 \to \frac{1}{3}\), \(2 \to \frac{1}{2}\)
- Zeros stay zero
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
Pseudoinverse of \(A = \begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \end{bmatrix}\) (note it's already in the SVD form)? |
| Back |
|
<div>Already in “SVD form” with \(U = I_2\), \(V = I_3\), and \(\Sigma = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \end{pmatrix}\). The pseudoinverse is: \[A^+ = \begin{pmatrix} \frac{1}{3} & 0 \\ 0 & \frac{1}{2} \\ 0 & 0 \end{pmatrix}\] Notice:</div><div><ul><li>Shape flipped: \(A\) is \(2\times3\), so \(A^+\) is \(3\times2\)</li><li>Nonzero values inverted: \(3 \to \frac{1}{3}\), \(2 \to \frac{1}{2}\)</li><li>Zeros stay zero</li></ul></div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
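Not part of the note, but the back of this card is easy to sanity-check numerically; a minimal NumPy sketch (`np.linalg.pinv` computes the pseudoinverse via the SVD):

```python
import numpy as np

# The card's matrix: 2x3 and already diagonal, so U = I_2, V = I_3, Sigma = A.
A = np.array([[3.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])

# Pseudoinverse: flip the shape (3x2) and invert the non-zero entries.
A_pinv = np.linalg.pinv(A)
print(A_pinv.shape)                                         # (3, 2)
print(np.allclose(A_pinv, [[1/3, 0], [0, 1/2], [0, 0]]))    # True
```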
Note 2: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: Az@K^QzYw.
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is Positive Semidefinite if and only if {{c2::\(x^\top A x \geq 0\) for all \(x \in \mathbb{R}^n\)}}.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is Positive Semidefinite if and only if {{c2::\(x^\top A x \geq 0\) for all \(x \in \mathbb{R}^n\)}}.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is {{c1::Positive Semidefinite}} if and only if {{c2::\(x^\top A x \geq 0\) for all \(x \in \mathbb{R}^n\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
Note 3: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: C0VH)T^.1n
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Using SVD we can decompose every matrix \(A \in \mathbb{R}^{n \times m}\) into \(A =\) \(U \Sigma V^\top\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Using SVD we can decompose every matrix \(A \in \mathbb{R}^{n \times m}\) into \(A =\) \(U \Sigma V^\top\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Using SVD we can decompose {{c1::every}} matrix \(A \in \mathbb{R}^{n \times m}\) into \(A =\) {{c2::\(U \Sigma V^\top\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Note 4: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: Eq^?6E3]XJ
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
The rank of a real symmetric matrix \(A\) is the number of non-zero eigenvalues (counting repetitions).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
The rank of a real symmetric matrix \(A\) is the number of non-zero eigenvalues (counting repetitions).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
The rank of a real symmetric matrix \(A\) is the number of {{c1::non-zero eigenvalues (counting repetitions)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 5: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: FX{E
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
SVD from rank-1 matrices: let \(\sigma_1, \dots, \sigma_r\) be the non-zero singular values of \(A\), \(u_1, \dots, u_r\) the corresponding left singular vectors, and \(v_1, \dots, v_r\) the corresponding right singular vectors.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
SVD from rank-1 matrices: let \(\sigma_1, \dots, \sigma_r\) be the non-zero singular values of \(A\), \(u_1, \dots, u_r\) the corresponding left singular vectors, and \(v_1, \dots, v_r\) the corresponding right singular vectors.
We have:\[ A = \sum_{k = 1}^r \sigma_k u_k v_k^\top \]
This follows directly from the compact SVD:
\[A = U_r \Sigma_r V_r^T = \begin{bmatrix} | & & | \\ \mathbf{u}_1 & \cdots & \mathbf{u}_r \\ | & & | \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_r \end{bmatrix} \begin{bmatrix} - & \mathbf{v}_1^T & - \\ & \vdots & \\ - & \mathbf{v}_r^T & - \end{bmatrix}\]
Expanding the matrix multiplication, we get:
\[A = \sigma_1 \mathbf{u}_1 \mathbf{v}_1^T + \sigma_2 \mathbf{u}_2 \mathbf{v}_2^T + \dots + \sigma_r \mathbf{u}_r \mathbf{v}_r^T = \sum_{i=1}^r \sigma_i \mathbf{u}_i \mathbf{v}_i^T\]
Each term \(\sigma_i \mathbf{u}_i \mathbf{v}_i^T\) is a rank-1 matrix because it is the outer product of two vectors, \(\mathbf{u}_i\) and \(\mathbf{v}_i\), scaled by the singular value \(\sigma_i\).
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
SVD from rank-1 matrices: let \(\sigma_1, \dots, \sigma_r\) be the non-zero singular values of \(A\), \(u_1, \dots, u_r\) the corresponding left singular vectors, and \(v_1, \dots, v_r\) the corresponding right singular vectors. |
| Back |
|
We have:\[ A = \sum_{k = 1}^r \sigma_k u_k v_k^\top \]<br>This follows directly from the compact SVD:<br><br>\[A = U_r \Sigma_r V_r^T = \begin{bmatrix} | & & | \\ \mathbf{u}_1 & \cdots & \mathbf{u}_r \\ | & & | \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_r \end{bmatrix} \begin{bmatrix} - & \mathbf{v}_1^T & - \\ & \vdots & \\ - & \mathbf{v}_r^T & - \end{bmatrix}\]<br>Expanding the matrix multiplication, we get: <br>\[A = \sigma_1 \mathbf{u}_1 \mathbf{v}_1^T + \sigma_2 \mathbf{u}_2 \mathbf{v}_2^T + \dots + \sigma_r \mathbf{u}_r \mathbf{v}_r^T = \sum_{i=1}^r \sigma_i \mathbf{u}_i \mathbf{v}_i^T\]<br>Each term \(\sigma_i \mathbf{u}_i \mathbf{v}_i^T\) is a rank-1 matrix because it is the outer product of two vectors, \(\mathbf{u}_i\) and \(\mathbf{v}_i\), scaled by the singular value \(\sigma_i\). |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
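The rank-1 expansion on this card can be checked directly (a NumPy sketch with a random matrix of my choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5))

U, s, Vt = np.linalg.svd(A)              # A = U @ diag(s) @ Vt (suitably padded)
r = int(np.sum(s > 1e-12))               # number of non-zero singular values

# The sum of rank-1 outer products sigma_k u_k v_k^T reconstructs A.
A_sum = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(r))
print(np.allclose(A, A_sum))             # True
```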
Note 6: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: Fdf#%+wdU#
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
In the SVD the diagonal elements of \(\Sigma\), \(\sigma_i = \Sigma_{ii}\), are called the singular values of \(A\) and are ordered as \(\sigma_1 \geq \dots \geq \sigma_{\min\{m, n\}}\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
In the SVD the diagonal elements of \(\Sigma\), \(\sigma_i = \Sigma_{ii}\), are called the singular values of \(A\) and are ordered as \(\sigma_1 \geq \dots \geq \sigma_{\min\{m, n\}}\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
In the SVD the diagonal elements of \(\Sigma\), \(\sigma_i = \Sigma_{ii}\) are called {{c1::the singular values}} of \(A\) and are {{c1::ordered as \(\sigma_1 \geq \dots \geq \sigma_{\min\{m, n\}}\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Note 7: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: F{P9Py-@Ws
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
The singular values are the square-root of the eigenvalues of \(A^\top A\) (or \(AA^\top\)). Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
The singular values are the square-root of the eigenvalues of \(A^\top A\) (or \(AA^\top\)). Proof Included
Note that \(A^\top A\) and \(AA^\top\) share all non-zero eigenvalues. This can be seen easily: \(A^\top A\) is symmetric, thus \(A^\top A = V \Lambda V^\top\), while from the SVD \(A^\top A = V(\Sigma^\top \Sigma) V^\top\), which implies that \(\Lambda = \Sigma^\top \Sigma\) and thus \(\lambda_i = \sigma_i^2\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
The singular values are the {{c1::square-root}} of the {{c2::eigenvalues of \(A^\top A\) (or \(AA^\top\))}}. <i>Proof Included</i> |
| Extra |
|
Note that \(A^\top A\) and \(AA^\top\) share all non-zero eigenvalues. This can be seen easily: \(A^\top A\) is symmetric, thus \(A^\top A = V \Lambda V^\top\), while from the SVD \(A^\top A = V(\Sigma^\top \Sigma) V^\top\), which implies that \(\Lambda = \Sigma^\top \Sigma\) and thus \(\lambda_i = \sigma_i^2\). |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
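The relation \(\sigma_i = \sqrt{\lambda_i}\) is easy to verify numerically (NumPy sketch; the random test matrix is my own):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

s = np.linalg.svd(A, compute_uv=False)        # singular values, descending
lam = np.linalg.eigvalsh(A.T @ A)[::-1]       # eigenvalues of A^T A, descending
lam = np.clip(lam, 0.0, None)                 # clip tiny round-off negatives

# sigma_i = sqrt(lambda_i); A^T A just carries an extra zero eigenvalue here.
print(np.allclose(s, np.sqrt(lam[:len(s)])))  # True
```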
Note 8: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: G(7.sQ=i_?
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
Proof that the Rayleigh Quotient has its maximum and minimum at the largest/smallest EWs?
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
Proof that the Rayleigh Quotient has its maximum and minimum at the largest/smallest EWs?
Proof: It is easy to see that \(R(v_{\max}) = \lambda_{\max}\) and \(R(v_{\min}) = \lambda_{\min}\). Indeed, \(R(v_{\text{max}}) = \frac{v_{\text{max}}^\top A v_{\text{max}}}{v_{\text{max}}^\top v_{\text{max}}} = \frac{v_{\text{max}}^\top (\lambda_{\text{max}} v_{\text{max}})}{v_{\text{max}}^\top v_{\text{max}}} = \lambda_{\text{max}}\). That these values are the extremes follows from expanding \(x\) in the orthonormal eigenbasis of \(A\).
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
Proof that the Rayleigh Quotient has its maximum and minimum at the largest/smallest EWs? |
| Back |
|
<br><div><b>Proof:</b> It is easy to see that \(R(v_{\max}) = \lambda_{\max}\) and \(R(v_{\min}) = \lambda_{\min}\). Indeed, \(R(v_{\text{max}}) = \frac{v_{\text{max}}^\top A v_{\text{max}}}{v_{\text{max}}^\top v_{\text{max}}} = \frac{v_{\text{max}}^\top (\lambda_{\text{max}} v_{\text{max}})}{v_{\text{max}}^\top v_{\text{max}}} = \lambda_{\text{max}}\). That these values are the extremes follows from expanding \(x\) in the orthonormal eigenbasis of \(A\).</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
Note 9: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: G9![k&wZRU
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
Given a symmetric matrix \(A \in \mathbb{R}^{n \times n}\), the Rayleigh Quotient, defined for \(x \in \mathbb{R}^n \setminus \{0\}\) as \[ R(x) = {{c1::\frac{x^\top Ax}{x^\top x} }}\]attains its
- {{c2::maximum at \(R(v_{\text{max}}) = \lambda_{\text{max}}\)}}
- {{c2::minimum at \(R(v_{\text{min}}) = \lambda_{\text{min}}\)}}
where {{c2::\(\lambda_{\text{max}}\) and \(\lambda_{\text{min}}\) are respectively the largest and smallest eigenvalues of \(A\), and \(v_{\text{max}}\) and \(v_{\text{min}}\) their associated eigenvectors}}.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
Given a symmetric matrix \(A \in \mathbb{R}^{n \times n}\), the Rayleigh Quotient, defined for \(x \in \mathbb{R}^n \setminus \{0\}\) as \[ R(x) = {{c1::\frac{x^\top Ax}{x^\top x} }}\]attains its
- {{c2::maximum at \(R(v_{\text{max}}) = \lambda_{\text{max}}\)}}
- {{c2::minimum at \(R(v_{\text{min}}) = \lambda_{\text{min}}\)}}
where {{c2::\(\lambda_{\text{max}}\) and \(\lambda_{\text{min}}\) are respectively the largest and smallest eigenvalues of \(A\), and \(v_{\text{max}}\) and \(v_{\text{min}}\) their associated eigenvectors}}.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<div>Given a symmetric matrix \(A \in \mathbb{R}^{n \times n}\), the Rayleigh Quotient, defined for \(x \in \mathbb{R}^n \setminus \{0\}\) as \[ R(x) = {{c1::\frac{x^\top Ax}{x^\top x} }}\]attains its</div><div><ul><li>{{c2::maximum at \(R(v_{\text{max}}) = \lambda_{\text{max}}\)}}</li><li>{{c2::minimum at \(R(v_{\text{min}}) = \lambda_{\text{min}}\)}}</li></ul><div>where {{c2::\(\lambda_{\text{max}}\) and \(\lambda_{\text{min}}\) are respectively the largest and smallest eigenvalues of \(A\), and \(v_{\text{max}}\) and \(v_{\text{min}}\) their associated eigenvectors}}.</div></div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::1._Rayleigh_Quotient
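The card's claim can be confirmed numerically by evaluating the quotient at the extreme eigenvectors (a NumPy sketch; the symmetrised random matrix is my example):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                       # symmetrise A

lam, V = np.linalg.eigh(A)              # eigenvalues ascending, orthonormal EVs

def rayleigh(A, x):
    return (x @ A @ x) / (x @ x)

# R attains its extremes at the extreme eigenvectors.
print(np.isclose(rayleigh(A, V[:, -1]), lam[-1]))   # True: maximum
print(np.isclose(rayleigh(A, V[:, 0]), lam[0]))     # True: minimum

# Any other vector stays within [lambda_min, lambda_max].
x = rng.standard_normal(5)
assert lam[0] - 1e-9 <= rayleigh(A, x) <= lam[-1] + 1e-9
```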
Note 10: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: G^)Gmk^Z%6
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Every matrix \(A \in \mathbb{R}^{m \times n}\) has an SVD decomposition. In other words:
Every linear transformation is diagonal when viewed in the bases of the singular vectors.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Every matrix \(A \in \mathbb{R}^{m \times n}\) has an SVD decomposition. In other words:
Every linear transformation is diagonal when viewed in the bases of the singular vectors.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<div>Every matrix \(A \in \mathbb{R}^{m \times n}\) has an SVD decomposition. In other words:</div><div>{{c1::Every linear transformation is diagonal when viewed in the bases of the singular vectors.}}</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Note 11: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: IX@x0?}8bL
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
\(A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\) is invertible but not diagonalisable since the EW \(1\) has algebraic multiplicity 2 but geometric multiplicity 1.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
\(A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\) is invertible but not diagonalisable since the EW \(1\) has algebraic multiplicity 2 but geometric multiplicity 1.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
\(A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\) is invertible but not {{c1::diagonalisable}} since {{c1::the EW \(1\) has algebraic multiplicity 2 but geometric multiplicity 1}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
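This counterexample can be checked exactly, using the rank of \(A - I\) to read off the geometric multiplicity (NumPy sketch):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.det(A))                          # 1.0 -> invertible
print(np.linalg.eigvals(A))                      # [1. 1.]: algebraic multiplicity 2

# Geometric multiplicity of lambda = 1 is dim N(A - I) = n - rank(A - I).
geo_mult = 2 - np.linalg.matrix_rank(A - np.eye(2))
print(geo_mult)                                  # 1 < 2 -> not diagonalisable
```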
Note 12: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: LvFIoMB!0)
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
By the spectral theorem, for any symmetric \(A\) we can write: \[ A = V \Lambda V^\top \]where \(\Lambda \in \mathbb{R}^{n \times n}\) is a diagonal matrix with the eigenvalues of \(A\) on its diagonal, and \(V\) an orthogonal matrix containing the eigenvectors, \(V^\top V = I\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
By the spectral theorem, for any symmetric \(A\) we can write: \[ A = V \Lambda V^\top \]where \(\Lambda \in \mathbb{R}^{n \times n}\) is a diagonal matrix with the eigenvalues of \(A\) on its diagonal, and \(V\) an orthogonal matrix containing the eigenvectors, \(V^\top V = I\).
This decomposition is called an eigen-decomposition.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
By the spectral theorem, for any symmetric \(A\) we can write: \[ A = {{c1::V \Lambda V^\top }}\]where \(\Lambda \in \mathbb{R}^{n \times n}\) is {{c2::a diagonal matrix with the eigenvalues of \(A\) on its diagonal}}, and \(V\) {{c2::an orthogonal matrix containing the eigenvectors, \(V^\top V = I\)}}. |
| Extra |
|
This decomposition is called an eigen-decomposition. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
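The eigen-decomposition on this card can be reproduced with `np.linalg.eigh`, which is the routine for symmetric/Hermitian matrices (the random symmetric example is my own):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                       # symmetrise A

lam, V = np.linalg.eigh(A)              # eigenvalues (real) and orthonormal EVs
print(np.allclose(V.T @ V, np.eye(4)))  # True: V is orthogonal
print(np.allclose(A, V @ np.diag(lam) @ V.T))   # True: A = V Lambda V^T
```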
Note 13: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: Nj:Z@^])GP
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
If we take \(A, B\) PSD (or PD) then \(A + B\) is also PSD (or PD).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
If we take \(A, B\) PSD (or PD) then \(A + B\) is also PSD (or PD).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
If we take \(A, B\) PSD (or PD) then {{c1::\(A + B\)}} is also {{c2::PSD (or PD)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
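A small numerical check of closure under addition (NumPy sketch; the Gram-matrix construction of PSD examples is mine):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))
A, B = X.T @ X, Y.T @ Y                  # Gram matrices are symmetric PSD

# The smallest eigenvalue of A + B is non-negative, so A + B is PSD.
min_eig = np.linalg.eigvalsh(A + B)[0]
print(min_eig >= -1e-10)                 # True
```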
Note 14: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: Pa)fnn7&WJ
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
An arbitrary (not necessarily symmetric) \(A \in \mathbb{R}^{n \times n}\) has rank \(n - \dim(N(A))\), i.e. \(n\) minus the geometric multiplicity of \(\lambda = 0\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
An arbitrary (not necessarily symmetric) \(A \in \mathbb{R}^{n \times n}\) has rank \(n - \dim(N(A))\), i.e. \(n\) minus the geometric multiplicity of \(\lambda = 0\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
An arbitrary (not necessarily symmetric) \(A \in \mathbb{R}^{n \times n}\) has rank {{c1::\(n - \dim(N(A))\), i.e. \(n\) minus the geometric multiplicity of \(\lambda = 0\) :: in terms of multiplicities}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 15: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: QlPn|vP;}Y
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
The Gram matrix of \(V \in \mathbb{R}^{n \times n}\) is \(G = V^\top V\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
The Gram matrix of \(V \in \mathbb{R}^{n \times n}\) is \(G = V^\top V\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
The Gram matrix of \(V \in \mathbb{R}^{n \times n}\) is {{c1::\(G = V^\top V\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
Note 16: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: d9#?3c)V#_
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda \in \mathbb{C}\) be an eigenvalue of \(A\), then {{c1::\(\lambda \in \mathbb{R}\):: property of the EW}}. Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda \in \mathbb{C}\) be an eigenvalue of \(A\), then {{c1::\(\lambda \in \mathbb{R}\):: property of the EW}}. Proof Included
Proof: Let \(v \in \mathbb{C}^n\) be an EV of \(\lambda\). Thus we have \(Av = \lambda v\). Since \(A\) is real symmetric we have \(A^* = A\). \[\begin{align} \overline{\lambda}||v||^2 &= \overline{\lambda} v^*v \\ &= (\lambda v)^*v \\ &= (Av)^*v = v^*A^*v \\ &= v^* Av \text{ (uses } A^* = A \text{) } \\ &= v^*\lambda v \\ &= \lambda ||v||^2 \end{align}\]Since \(v \neq 0\), then \(||v|| \neq 0\) and so \(\lambda = \overline{\lambda}\), thus \(\lambda \in \mathbb{R}\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda \in \mathbb{C}\) be an eigenvalue of \(A\), then {{c1::\(\lambda \in \mathbb{R}\):: property of the EW}}. <i>Proof Included</i> |
| Extra |
|
<div><strong>Proof</strong> Let \(v \in \mathbb{C}^n\) be an EV of \(\lambda\). Thus we have \(Av = \lambda v\). Since \(A\) is real symmetric we have \(A^* = A\). \[\begin{align} \overline{\lambda}||v||^2 &= \overline{\lambda} v^*v \\ &= (\lambda v)^*v \\ &= (Av)^*v = v^*A^*v \\ &= v^* Av \text{ (uses } A^* = A \text{) } \\ &= v^*\lambda v \\ &= \lambda ||v||^2 \end{align}\]Since \(v \neq 0\), then \(||v|| \neq 0\) and so \(\lambda = \overline{\lambda}\), thus \(\lambda \in \mathbb{R}\).</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 17: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: dZ)aTr>2eb
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the non-zero eigenvalues of \(A^\top A\) are the same as those of \(AA^\top\). Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the non-zero eigenvalues of \(A^\top A\) are the same as those of \(AA^\top\). Proof Included
Shared EWs: From \((A^\top A)v_k = \lambda_k v_k\) we get \(AA^\top (Av_k) = \lambda_k (Av_k)\), and thus \(Av_k\) is an EV and \(\lambda_k\) an EW of \(AA^\top\).
Orthogonality: For \(j \neq k\) we have \((Av_j)^\top (Av_k) = v_j^\top A^\top Av_k = v_j^\top \lambda_k v_k = \lambda_k v_j^\top v_k = 0\)
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<div>Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the {{c1::non-zero eigenvalues}} of {{c2::\(A^\top A\) are the same as those of \(AA^\top\)}}. <i>Proof Included</i></div> |
| Extra |
|
<b>Shared EWs:</b> From \((A^\top A)v_k = \lambda_k v_k\) we get \(AA^\top (Av_k) = \lambda_k (Av_k)\), and thus \(Av_k\) is an EV and \(\lambda_k\) an EW of \(AA^\top\).<br><br><b>Orthogonality:</b> For \(j \neq k\) we have \((Av_j)^\top (Av_k) = v_j^\top A^\top Av_k = v_j^\top \lambda_k v_k = \lambda_k v_j^\top v_k = 0\) |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
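A numerical check with a deliberately rank-deficient square matrix, so zero eigenvalues actually appear (the construction is my own example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Square but rank-deficient: A is 4x4 with rank 2.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))

lam1 = np.linalg.eigvalsh(A.T @ A)       # ascending
lam2 = np.linalg.eigvalsh(A @ A.T)       # ascending

# The spectra coincide: two zeros plus the two shared non-zero eigenvalues.
print(np.allclose(lam1, lam2))           # True
```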
Note 18: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: er&wX,XKn1
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
Every symmetric PSD matrix \(M\) is a Gram Matrix of an upper triangular matrix \(C\).
\(M = C^\top C\) is known as the Cholesky Decomposition.
Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
Every symmetric PSD matrix \(M\) is a Gram Matrix of an upper triangular matrix \(C\).
\(M = C^\top C\) is known as the Cholesky Decomposition.
Proof Included
Thus all PSD matrices are decomposable as \(C^\top C\) with \(C\) upper triangular!
Proof: Since \(M\) is symmetric PSD, we can write \(M = V \Lambda V^\top\) with \(\Lambda\) a diagonal matrix with the EWs on the diagonal.
- Since \(M\) is PSD, the eigenvalues (the diagonals) of \(\Lambda\) are \(\geq 0\) (non-negative) and thus we can build \(\Lambda^{1/2}\) by taking the square root of each diagonal entry.
- To obtain an upper triangular factor, we take the QR-decomposition (\(V\Lambda^{1/2}\) has linearly independent columns) \((V \Lambda^{1/2})^\top = QR\) with \(Q\) such that \(Q^\top Q = I\) and \(R\) upper triangular.
- We then have \(M = (V \Lambda^{1/2})(V \Lambda^{1/2})^\top = (QR)^\top (QR) = R^\top Q^\top Q R = R^\top R\)
Taking \(C = R\) we get \(M = C^\top C\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Every symmetric PSD matrix \(M\) is a {{c1::Gram Matrix of an upper triangular matrix}} \(C\).<br>\(M = {{c2::C^\top C}}\) is known as the {{c2::Cholesky Decomposition}}.<br><i>Proof Included</i> |
| Extra |
|
Thus all PSD matrices are decomposable as \(C^\top C\) with \(C\) upper triangular!<br><br><div><b>Proof:</b> Since \(M\) is symmetric PSD, we can write \(M = V \Lambda V^\top\) with \(\Lambda\) a diagonal matrix with the EWs on the diagonal. </div><div><ul><li>Since \(M\) is PSD, the eigenvalues (the diagonal entries of \(\Lambda\)) are \(\geq 0\) (non-negative) and thus we can build \(\Lambda^{1/2}\) by taking the square root of each diagonal entry.</li><li>To obtain an upper triangular factor, we take the QR-decomposition (\(V\Lambda^{1/2}\) has linearly independent columns) \((V \Lambda^{1/2})^\top = QR\) with \(Q\) such that \(Q^\top Q = I\) and \(R\) upper triangular. </li><li>We then have \(M = (V \Lambda^{1/2})(V \Lambda^{1/2})^\top = (QR)^\top (QR) = R^\top Q^\top Q R = R^\top R\) </li></ul></div><div>Taking \(C = R\) we get \(M = C^\top C\).</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
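A sketch of the decomposition with NumPy. Note that `np.linalg.cholesky` requires strict positive definiteness (the proof's QR route also covers the PSD boundary) and returns the lower-triangular factor \(L\) with \(M = LL^\top\), so \(C = L^\top\) is the upper-triangular factor; the shifted Gram matrix below is my own PD example:

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((4, 4))
M = B.T @ B + 4 * np.eye(4)        # Gram matrix + shift -> symmetric PD

# np.linalg.cholesky returns lower-triangular L with M = L @ L.T,
# so C = L.T is the upper-triangular factor with M = C^T C.
L = np.linalg.cholesky(M)
C = L.T
print(np.allclose(M, C.T @ C))             # True
print(np.allclose(C, np.triu(C)))          # True: C is upper triangular
```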
Note 19: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: fILs=r`j+*
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Spectral Theorem: Any symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has \(n\) real eigenvalues and {{c1::an orthonormal basis of \(\mathbb{R}^{n}\) consisting of its eigenvectors :: EV}}.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Spectral Theorem: Any symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has \(n\) real eigenvalues and {{c1::an orthonormal basis of \(\mathbb{R}^{n}\) consisting of its eigenvectors :: EV}}.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<b>Spectral Theorem: </b>Any symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has {{c1::\(n\) real eigenvalues :: EW}} and {{c1::an orthonormal basis of \(\mathbb{R}^{n}\) consisting of its eigenvectors :: EV}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 20: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: i8+^S+3p7v
modified
Before
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation::2._Similar_Matrices
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has a complete set of real eigenvectors if and only if \(B\) does.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation::2._Similar_Matrices
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has a complete set of real eigenvectors if and only if \(B\) does.
After
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation::2._Similar_Matrices
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has a complete set of real eigenvectors if and only if \(B\) does. Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation::2._Similar_Matrices
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has a complete set of real eigenvectors if and only if \(B\) does. Proof Included
Proof: Let \(B = S^{-1}AS\). Then \(\lambda, v\) is an EW, EV pair for \(A\) iff \(Av = \lambda v \Leftrightarrow \lambda S^{-1}v = S^{-1}Av = S^{-1}ASS^{-1}v = B(S^{-1}v)\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has {{c1::a complete set of real eigenvectors if and only if \(B\) does :: EVs}}. |
\(A \in \mathbb{R}^{n \times n}\) and \(B \in \mathbb{R}^{n \times n}\) are similar matrices. The matrix \(A\) has {{c1::a complete set of real eigenvectors if and only if \(B\) does :: EVs}}. <i>Proof Included</i> |
| Extra |
|
<div><b>Proof </b>Let \(B = S^{-1}AS\). Then \(\lambda, v\) is an EW, EV pair for \(A\) iff \(Av = \lambda v \Leftrightarrow \lambda S^{-1}v = S^{-1}Av = S^{-1}ASS^{-1}v = B(S^{-1}v)\)<b>.</b></div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation::2._Similar_Matrices
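A numerical illustration of the eigenvector mapping \(v \mapsto S^{-1}v\) from the proof (NumPy sketch; using an orthogonal \(S\) so that \(S^{-1} = S^\top\) is exact):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 3.0])                      # real eigenvalues, full EV set
S, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthogonal, so S^{-1} = S^T
B = S.T @ A @ S                                   # B = S^{-1} A S is similar to A

lam = np.linalg.eigvalsh(B)                       # B is symmetric here
print(np.allclose(lam, [1.0, 2.0, 3.0]))          # True: same eigenvalues

# An EV v of A maps to the EV S^{-1} v of B (here lambda = 1, v = e_1).
v = np.array([1.0, 0.0, 0.0])
print(np.allclose(B @ (S.T @ v), 1.0 * (S.T @ v)))  # True
```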
Note 21: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: jn6
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda_1 \neq \lambda_2 \in \mathbb{R}\) be two distinct eigenvalues of \(A\) with corresponding eigenvectors \(v_1, v_2\): \(v_1\) and \(v_2\) are orthogonal. Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda_1 \neq \lambda_2 \in \mathbb{R}\) be two distinct eigenvalues of \(A\) with corresponding eigenvectors \(v_1, v_2\): \(v_1\) and \(v_2\) are orthogonal. Proof Included
Proof: \(\lambda_1 v_1 ^\top v_2 = (Av_1)^\top v_2 = v_1^\top A ^\top v_2 = v_1^\top (Av_2) = \lambda_2 v_1^\top v_2\). Since \(\lambda_1 \neq \lambda_2\), \(v_1^\top v_2\) must be \(0\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<div>Let \(A \in \mathbb{R}^{n \times n}\) be a symmetric matrix and \(\lambda_1 \neq \lambda_2 \in \mathbb{R}\) be two {{c2::distinct}} eigenvalues of \(A\) with corresponding eigenvectors \(v_1, v_2\): \(v_1\) and \(v_2\) are {{c1::orthogonal:: property}}. <i>Proof Included</i></div> |
| Extra |
|
<div><b>Proof</b> \(\lambda_1 v_1 ^\top v_2 = (Av_1)^\top v_2 = v_1^\top A ^\top v_2 = v_1^\top (Av_2) = \lambda_2 v_1^\top v_2\). Since \(\lambda_1 \neq \lambda_2\), \(v_1^\top v_2\) must be \(0\).</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 22: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: jo;...VEEI
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
A matrix has a complete set of real eigenvectors if all its eigenvalues are real and the geometric multiplicities are the same as the algebraic multiplicities of all its eigenvalues.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
A matrix has a complete set of real eigenvectors if all its eigenvalues are real and the geometric multiplicities are the same as the algebraic multiplicities of all its eigenvalues.
Example \(I\) has eigenvalue \(1\) with geometric multiplicity \(n\) (\(\dim(N(I - 1 \cdot I)) = n\)) and algebraic multiplicity \(n\) (As the characteristic polynomial of \(I\), \(P(z) = (z - 1)(z - 1) \dots (z - 1)\) with that repeated \(n\) times).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
A matrix has a <b>complete set of real eigenvectors</b> if {{c1::all its eigenvalues are real and the geometric multiplicities are the same as the algebraic multiplicities of all its eigenvalues :: in terms of multiplicities}}. |
| Extra |
|
<div><strong>Example</strong> \(I\) has eigenvalue \(1\) with geometric multiplicity \(n\) (\(\dim(N(I - 1 \cdot I)) = n\)) and algebraic multiplicity \(n\) (since the characteristic polynomial of \(I\) is \(P(z) = (z - 1)^n\), i.e. \((z - 1)\) repeated \(n\) times).</div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
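The identity-matrix example in this note can be verified with a small NumPy helper that computes the geometric multiplicity as \(n - \text{rank}(A - \lambda I)\) (a sketch; the helper name `geometric_multiplicity` is my own):

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-10):
    """dim(N(A - lam*I)) computed as n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

n = 4
I = np.eye(n)
# For the identity, eigenvalue 1 has geometric multiplicity n,
# matching its algebraic multiplicity in P(z) = (z - 1)^n.
gm = geometric_multiplicity(I, 1.0)
```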
Note 23: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: p}G;kCHD5f
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Give an example of the compact form of the SVD for \(A \in \mathbb{R}^{4 \times 5}\) with \(\text{rank}(A) = 3\): (name the dimensions)
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Give an example of the compact form of the SVD for \(A \in \mathbb{R}^{4 \times 5}\) with \(\text{rank}(A) = 3\): (name the dimensions)
\[A = U_3 \Sigma_3 V_3^T = \begin{bmatrix} | & | & | \\ \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 \\ | & | & | \end{bmatrix} \begin{bmatrix} \sigma_1 & 0 & 0 \\ 0 & \sigma_2 & 0 \\ 0 & 0 & \sigma_3 \end{bmatrix} \begin{bmatrix} - & \mathbf{v}_1^T & - \\ - & \mathbf{v}_2^T & - \\ - & \mathbf{v}_3^T & - \end{bmatrix}\]
where \(U_3\) is \(4 \times 3\), \(\Sigma_3\) is \(3 \times 3\), and \(V_3^T\) is \(3 \times 5\).
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
Give an example of the <b>compact form</b> of the SVD for \(A \in \mathbb{R}^{4 \times 5}\) with \(\text{rank}(A) = 3\): (name the dimensions) |
| Back |
|
\[A = U_3 \Sigma_3 V_3^T = \begin{bmatrix} | & | & | \\ \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 \\ | & | & | \end{bmatrix} \begin{bmatrix} \sigma_1 & 0 & 0 \\ 0 & \sigma_2 & 0 \\ 0 & 0 & \sigma_3 \end{bmatrix} \begin{bmatrix} - & \mathbf{v}_1^T & - \\ - & \mathbf{v}_2^T & - \\ - & \mathbf{v}_3^T & - \end{bmatrix}\]<br>where \(U_3\) is \(4 \times 3\), \(\Sigma_3\) is \(3 \times 3\), and \(V_3^T\) is \(3 \times 5\). |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
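A NumPy sketch of the compact SVD for a rank-\(3\) matrix in \(\mathbb{R}^{4 \times 5}\) (the example matrix is an assumption of mine, constructed as a product of random factors to force rank \(3\)):

```python
import numpy as np

# Build a 4x5 matrix of rank 3 by multiplying a 4x3 and a 3x5 factor
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3)) @ rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = np.linalg.matrix_rank(A)

# Truncate to the first r singular triplets: the compact SVD
U3, S3, V3t = U[:, :r], np.diag(s[:r]), Vt[:r, :]
A_compact = U3 @ S3 @ V3t
```

The truncated factors have exactly the dimensions named on the card, and their product reproduces \(A\).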
Note 24: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: q+.VNuw?7g
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Pseudoinverse from the SVD: \(A = U \Sigma V^\top\)
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
Pseudoinverse from the SVD: \(A = U \Sigma V^\top\)
\(A^\dagger = V \Sigma^\dagger U^\top\), where \(\Sigma^\dagger\) is obtained from \(\Sigma\) by taking the reciprocal (\(\frac{1}{\sigma_i}\)) of each non-zero singular value, leaving the zeros in place, and transposing the matrix.
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
Pseudoinverse from the SVD: \(A = U \Sigma V^\top\) |
| Back |
|
\(A^\dagger = V \Sigma^\dagger U^\top\), where \(\Sigma^\dagger\) is obtained from \(\Sigma\) by taking the reciprocal (\(\frac{1}{\sigma_i}\)) of each non-zero singular value, leaving the zeros in place, and transposing the matrix. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
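The recipe on this card translates directly to NumPy. A minimal sketch (the function name `pinv_from_svd` is my own; `np.linalg.pinv` is used only as a cross-check), applied to the \(2 \times 3\) matrix from the earlier pseudoinverse note:

```python
import numpy as np

def pinv_from_svd(A, tol=1e-10):
    """A+ = V Sigma+ U^T: reciprocate non-zero singular values, transpose Sigma."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.array([1.0 / x if x > tol else 0.0 for x in s])
    return Vt.T @ np.diag(s_inv) @ U.T

A = np.array([[3.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])
A_pinv = pinv_from_svd(A)  # shape flips from 2x3 to 3x2
```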
Note 25: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: qn2vol8}8V
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Every symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has a real eigenvalue \(\lambda\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Every symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has a real eigenvalue \(\lambda\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Every symmetric matrix \(A \in \mathbb{R}^{n \times n}\) has {{c1::a real eigenvalue \(\lambda\):: existence}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 26: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: qr+Ln*lsd_
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Any symmetric matrix has only real eigenvalues.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Any symmetric matrix has only real eigenvalues.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Any symmetric matrix has {{c1:: only real eigenvalues:: fact about the EWs}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
Note 27: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: s(K/=VnA_Y
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
We can write \(A\) as the sum of rank \(1\) matrices: \[A = {{c2::\sum_{i = 1}^n \lambda_i v_i v_i^\top}}\]where \(v_1, \dots, v_n\) form an orthonormal basis of eigenvectors (the \(V\) in diagonalisation) and \(\lambda_1, \dots, \lambda_n\) the associated eigenvalues.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
We can write \(A\) as the sum of rank \(1\) matrices: \[A = {{c2::\sum_{i = 1}^n \lambda_i v_i v_i^\top}}\]where \(v_1, \dots, v_n\) form an orthonormal basis of eigenvectors (the \(V\) in diagonalisation) and \(\lambda_1, \dots, \lambda_n\) the associated eigenvalues.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
We can write \(A\) as the sum of {{c1::rank \(1\) matrices}}: \[A = {{c2::\sum_{i = 1}^n \lambda_i v_i v_i^\top}}\]where {{c2:: \(v_1, \dots, v_n\) form an orthonormal basis of eigenvectors (the \(V\) in diagonalisation) and \(\lambda_1, \dots, \lambda_n\) the associated eigenvalues}}.<br> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem
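The rank-\(1\) decomposition can be checked numerically; a short sketch with an arbitrary \(2 \times 2\) symmetric example of my choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues and an orthonormal matrix of eigenvector columns
lams, V = np.linalg.eigh(A)

# Rebuild A as the sum of rank-1 outer products lambda_i * v_i v_i^T
A_rebuilt = sum(lams[i] * np.outer(V[:, i], V[:, i]) for i in range(len(lams)))
```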
Note 28: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: s-`w^:1S}3
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
Given \(n\) vectors \(v_1, \dots, v_n \in \mathbb{R}^n\) we call their Gram Matrix the {{c2::\(n \times n\) matrix of inner products \(G_{ij} = v_i^\top v_j\)}}.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
Given \(n\) vectors \(v_1, \dots, v_n \in \mathbb{R}^n\) we call their Gram Matrix the {{c2::\(n \times n\) matrix of inner products \(G_{ij} = v_i^\top v_j\)}}.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Given \(n\) vectors \(v_1, \dots, v_n \in \mathbb{R}^n\) we call their {{c1::Gram Matrix}} the {{c2::\(n \times n\) matrix of inner products \(G_{ij} = v_i^\top v_j\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::3._Gram_Matrix
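As a quick illustration: with the vectors stored as columns of a matrix \(V\), the Gram matrix is simply \(V^\top V\). A sketch (the random vectors are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
V = rng.standard_normal((n, n))  # columns are the vectors v_1, ..., v_n

G = V.T @ V  # G[i, j] = v_i^T v_j
```

The resulting `G` is symmetric and positive semidefinite, consistent with the PSD note on \(A^\top A\) below.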
Note 29: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: tQtZJZ|Ls+
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the non-zero eigenvalues of \(A^\top A\) are the same as those of \(AA^\top\).
Both matrices are symmetric and PSD.
Proof Included
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the non-zero eigenvalues of \(A^\top A\) are the same as those of \(AA^\top\).
Both matrices are symmetric and PSD.
Proof Included
Proof Both \(G = A^\top A\) and \(G = AA^\top\) are PSD:
- \(x^\top G x = x^\top (A^\top A ) x = (Ax)^\top (Ax) = ||Ax||^2 \geq 0\)
- \(x^\top G x = x^\top AA^\top x = (A^\top x)^\top (A^\top x) = ||A^\top x||^2 \geq 0\)
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
<div>Given a real matrix \(A \in \mathbb{R}^{n \times n}\), the non-zero eigenvalues of \(A^\top A\) are the same as those of \(AA^\top\).</div><div>Both matrices are {{c3::<em>symmetric</em> and <i>PSD</i>}}.</div><div><i>Proof Included</i><br></div> |
| Extra |
|
<div><b>Proof</b> Both \(G = A^\top A\) and \(G = AA^\top\) are PSD:</div><div><ul><li> \(x^\top G x = x^\top (A^\top A ) x = (Ax)^\top (Ax) = ||Ax||^2 \geq 0\) </li><li>\(x^\top G x = x^\top AA^\top x = (A^\top x)^\top (A^\top x) = ||A^\top x||^2 \geq 0\)</li></ul></div><div><br></div><div></div> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
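Both claims on this card (shared non-zero eigenvalues, symmetry, PSD-ness of \(A^\top A\) and \(AA^\top\)) can be confirmed numerically for a random square example (an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

AtA = A.T @ A
AAt = A @ A.T

# Both products are symmetric, so eigvalsh applies; sort for comparison
eig_AtA = np.sort(np.linalg.eigvalsh(AtA))
eig_AAt = np.sort(np.linalg.eigvalsh(AAt))
```

For square \(A\) the two spectra agree entirely; for rectangular \(A\) only the non-zero eigenvalues would match.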
Note 30: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: u^,*N^?pR/
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is Positive Definite if and only if {{c2::\(x^\top Ax > 0\) for all \(x \in \mathbb{R}^n \setminus \{0\}\)}}.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is Positive Definite if and only if {{c2::\(x^\top Ax > 0\) for all \(x \in \mathbb{R}^n \setminus \{0\}\)}}.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
A symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is {{c1::Positive Definite}} if and only if {{c2::\(x^\top Ax > 0\) for all \(x \in \mathbb{R}^n \setminus \{0\}\)}}. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::2._Symmetric_Matrices_and_the_Spectral_Theorem::2._Positive_(Semi)definite
Note 31: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: x5dCNKqS
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Does a diagonalisable / diagonalised matrix have to be invertible?
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Does a diagonalisable / diagonalised matrix have to be invertible?
No, it can have \(0\) as an eigenvalue.
Field-by-field Comparison
| Field |
Before |
After |
| Front |
|
Does a diagonalisable / diagonalised matrix have to be invertible? |
| Back |
|
No, it can have \(0\) as an eigenvalue. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
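A one-line example supporting the answer: \(\operatorname{diag}(1, 0)\) is already diagonal (hence diagonalisable) but singular. A sketch (the specific matrix is my own illustration):

```python
import numpy as np

# diag(1, 0): trivially diagonalisable, but eigenvalue 0 makes it non-invertible
A = np.diag([1.0, 0.0])
eigenvalues = np.linalg.eigvals(A)
detA = np.linalg.det(A)  # zero, so A is not invertible
```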
Note 32: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: xFvw{LdP48
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
In the SVD:
- \(\Sigma \in \mathbb{R}^{m \times n}\) is {{c1::a diagonal matrix (in the sense that \(\Sigma_{ij} = 0\) when \(i \neq j\)) whose diagonal values are non-negative and sorted in descending order}}.
- \(U^\top U = I\) and \(V^\top V = I\) (\(U, V\) are orthogonal).
- The columns \(u_1, \dots, u_m\) of \(U\) are called the left-singular vectors of \(A\) and are orthonormal.
- The columns \(v_1, \dots, v_n\) of \(V\) are called the right-singular vectors of \(A\) and are orthonormal.
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
In the SVD:
- \(\Sigma \in \mathbb{R}^{m \times n}\) is {{c1::a diagonal matrix (in the sense that \(\Sigma_{ij} = 0\) when \(i \neq j\)) whose diagonal values are non-negative and sorted in descending order}}.
- \(U^\top U = I\) and \(V^\top V = I\) (\(U, V\) are orthogonal).
- The columns \(u_1, \dots, u_m\) of \(U\) are called the left-singular vectors of \(A\) and are orthonormal.
- The columns \(v_1, \dots, v_n\) of \(V\) are called the right-singular vectors of \(A\) and are orthonormal.
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
In the SVD:<br><ol><li>\(\Sigma \in \mathbb{R}^{m \times n}\) is {{c1::a diagonal matrix (in the sense that \(\Sigma_{ij} = 0\) when \(i \neq j\)) whose diagonal values are non-negative and sorted in descending order}}.</li><li>{{c2::\(U^\top U = I\) and \(V^\top V = I\) (\(U, V\) are orthogonal)::Property of V and U}}.</li><li>The columns \(u_1, \dots, u_m\) of \(U\) are called {{c3::the left-singular vectors of \(A\) and are orthonormal}}.</li><li>The columns \(v_1, \dots, v_n\) of \(V\) are called {{c3::the right-singular vectors of \(A\) and are orthonormal}}.</li></ol> |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::3._SVD
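The properties listed above hold for the SVD of any real matrix; a brief NumPy sketch with a random example of my choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 6))

U, s, Vt = np.linalg.svd(A)  # full SVD: U is 4x4, Vt is 6x6

# U and V are orthogonal: U^T U = I and V^T V = I (here V = Vt.T)
UtU = U.T @ U
VtV = Vt @ Vt.T

# Singular values are non-negative and sorted in descending order
descending = np.all(np.diff(s) <= 0)
```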
Note 33: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Cloze
GUID: y9BDY!a~h?
added
Previous
Note did not exist
New Note
Front
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a matrix \(A \in \mathbb{R}^{n \times n}\) and an eigenvalue \(\lambda\) of \(A\) we call the dimension \(\dim(N(A - \lambda I))\) the geometric multiplicity of \(\lambda\).
Back
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Given a matrix \(A \in \mathbb{R}^{n \times n}\) and an eigenvalue \(\lambda\) of \(A\) we call the dimension \(\dim(N(A - \lambda I))\) the geometric multiplicity of \(\lambda\).
Field-by-field Comparison
| Field |
Before |
After |
| Text |
|
Given a matrix \(A \in \mathbb{R}^{n \times n}\) and an eigenvalue \(\lambda\) of \(A\) we call {{c2::the dimension \(\dim(N(A - \lambda I))\)}} the {{c1::geometric multiplicity}} of \(\lambda\). |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Note 34: ETH::LinAlg
Deck: ETH::LinAlg
Note Type: Horvath Classic
GUID: yN#xD80(rp
modified
Before
Front
blank::1._Diagonalisation ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Give an example of a non-diagonalisable matrix that does not have a full set of eigenvectors.
Back
blank::1._Diagonalisation ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Give an example of a non-diagonalisable matrix that does not have a full set of eigenvectors.
\[ A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \]
After
Front
blank::1._Diagonalisation ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Give an example of a non-diagonalisable matrix that does not have a full set of eigenvectors but is still invertible.
Back
blank::1._Diagonalisation ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
Give an example of a non-diagonalisable matrix that does not have a full set of eigenvectors but is still invertible.
\[ A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \]
Field-by-field Comparison
| Field |
Before |
After |
| Front |
Give an example of a non-diagonalisable matrix that does <b>not</b> have a full set of eigenvectors. |
Give an example of a non-diagonalisable matrix that does <b>not</b> have a full set of eigenvectors but is still invertible. |
Tags:
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD
ETH::1._Semester::LinAlg::9._Diagonalisable_Matrices_and_the_SVD::1._Diagonalisation
blank::1._Diagonalisation
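The answer matrix can be verified numerically: it is invertible (determinant \(1\)) yet its only eigenvalue \(1\) has geometric multiplicity \(1 < 2\), so there is no full set of eigenvectors. A quick sketch:

```python
import numpy as np

# The 2x2 Jordan block with eigenvalue 1 from the card
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

detA = np.linalg.det(A)  # nonzero, so A is invertible

# Geometric multiplicity of eigenvalue 1: dim N(A - I) = 2 - rank(A - I)
gm = 2 - np.linalg.matrix_rank(A - np.eye(2))
```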