This post summarizes Professor 주재걸's lecture "Linear Algebra for Artificial Intelligence" (인공지능을 위한 선형대수).
Diagonalization
- Diagonalization transforms a given square matrix 𝐴 ∈ ℝ^(𝑛×𝑛) into a diagonal matrix via the form 𝐷 = 𝑉^(−1)𝐴𝑉.
- Here 𝑉 ∈ ℝ^(𝑛×𝑛) is an invertible matrix and 𝐷 ∈ ℝ^(𝑛×𝑛) is a diagonal matrix. This is called a diagonalization of 𝐴.
- 𝑉 is a square matrix with the same dimensions as 𝐴.
- 𝐷 = 𝑉^(−1)𝐴𝑉 ⟹ 𝑉𝐷 = 𝐴𝑉
𝐴𝑉 = 𝑉𝐷 ⟺ [𝐴𝐯1 𝐴𝐯2 ⋯ 𝐴𝐯𝑛] = [𝜆1𝐯1 𝜆2𝐯2 ⋯ 𝜆𝑛𝐯𝑛] (right-multiplying 𝑉 by the diagonal 𝐷 scales the 𝑖-th column of 𝑉 by 𝜆𝑖) ⟹ 𝐴𝐯1 = 𝜆1𝐯1, 𝐴𝐯2 = 𝜆2𝐯2, …, 𝐴𝐯𝑛 = 𝜆𝑛𝐯𝑛
∴ 𝐯1, 𝐯2, …, 𝐯𝑛 should be eigenvectors of 𝐴, and 𝜆1, 𝜆2, …, 𝜆𝑛 the corresponding eigenvalues.
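The relation 𝐴𝑉 = 𝑉𝐷 can be checked numerically. A minimal NumPy sketch (the matrix 𝐴 below is an illustrative example, not from the lecture):

```python
import numpy as np

# An example diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix V whose
# columns are the corresponding eigenvectors.
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)

# A v_i = lambda_i v_i for every column, i.e. AV = VD.
assert np.allclose(A @ V, V @ D)

# Hence D = V^(-1) A V, the diagonalization of A.
assert np.allclose(D, np.linalg.inv(V) @ A @ V)
```

Note that `np.linalg.eig` already returns the eigenvectors packed as columns, which is exactly the 𝑉 in the derivation above.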
- For 𝑉 to be invertible, 𝑉 must be a square matrix in ℝ^(𝑛×𝑛) with 𝑛 linearly independent columns.
- Recall columns of 𝑉 are eigenvectors. Hence, 𝐴 should have 𝑛 linearly independent eigenvectors.
- This is not always the case, but when it is, 𝐴 is diagonalizable.
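A standard counterexample is a shear matrix: its eigenvalue has algebraic multiplicity 2 but only one linearly independent eigenvector, so no invertible 𝑉 exists. A short NumPy sketch (the matrix is the usual textbook example, not from the lecture):

```python
import numpy as np

# Shear matrix: eigenvalue 1 repeated twice, but the eigenspace
# for lambda = 1 is only one-dimensional (spanned by [1, 0]).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, V = np.linalg.eig(A)

# The two returned eigenvector columns are (numerically) parallel,
# so V is singular: A has no n linearly independent eigenvectors
# and is therefore not diagonalizable.
print(abs(np.linalg.det(V)))  # numerically zero
```

Because det 𝑉 ≈ 0, 𝑉^(−1) does not exist and 𝐷 = 𝑉^(−1)𝐴𝑉 cannot be formed.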