Matrix Algebra
Matrix Operations - Basics
Matrix Addition & Subtraction
\((A \pm B)_{ij} = A_{ij} \pm B_{ij}\) (same dimensions required)
Matrix Multiplication
- \((AB)_{ij} = \sum_{k=1}^{n} A_{ik}B_{kj}\)
- \(A_{m \times n} \cdot B_{n \times p} = C_{m \times p}\)
- Generally \(AB \neq BA\)
Transpose
- \((A^T)_{ij} = A_{ji}\)
- \((A^T)^T = A\)
- \((AB)^T = B^T A^T\)
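A minimal NumPy sketch (not part of the original notes; the example matrices are arbitrary) checking non-commutativity and the transpose rules above:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])

# Matrix product: (AB)_ij = sum_k A_ik * B_kj
print(A @ B)                              # [[2 1], [4 3]]
print(np.allclose(A @ B, B @ A))          # False: generally AB != BA
# Transpose rules
print(np.allclose((A @ B).T, B.T @ A.T))  # True: (AB)^T = B^T A^T
print(np.allclose((A.T).T, A))            # True: (A^T)^T = A
```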
Special Matrices
Identity Matrix
\(I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}\)
\(AI = IA = A\)
Symmetric Matrix
\(A = A^T\)
Diagonal Matrix
\(A_{ij} = 0\) for \(i \neq j\)
Orthogonal Matrix
- \(AA^T = A^T A = I\)
- \(A^{-1} = A^T\)
- \(|A| = \pm 1\)
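An illustrative check (not from the original notes) using a 2×2 rotation matrix, which is a standard example of an orthogonal matrix:

```python
import numpy as np

theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrices are orthogonal

print(np.allclose(Q @ Q.T, np.eye(2)))       # True: Q Q^T = I
print(np.allclose(np.linalg.inv(Q), Q.T))    # True: Q^{-1} = Q^T
print(round(np.linalg.det(Q), 6))            # 1.0 (determinant is +/- 1)
```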
Matrix Inverse
Definition
\(A^{-1}\) exists if \(AA^{-1} = A^{-1}A = I\)
Properties
- \((A^{-1})^{-1} = A\)
- \((AB)^{-1} = B^{-1}A^{-1}\)
- \((A^T)^{-1} = (A^{-1})^T\)
- \(A^{-1}\) exists \(\Leftrightarrow\) \(\det(A) \neq 0\)
For 2×2 Matrix
\(A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\)
\(A^{-1} = \frac{1}{ad-bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\)
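A worked check of the 2×2 inverse formula as a NumPy sketch (the matrix entries are chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])   # det = 2*3 - 1*5 = 1
adj = np.array([[ 3.0, -1.0],
                [-5.0,  2.0]])           # swap diagonal entries, negate off-diagonal
A_inv = adj / np.linalg.det(A)           # (1/(ad - bc)) * [[d, -b], [-c, a]]

print(np.allclose(A_inv, np.linalg.inv(A)))      # True
print(np.allclose(A @ A_inv, np.eye(2)))         # True: A A^{-1} = I
print(np.allclose(np.linalg.inv(A.T), A_inv.T))  # True: (A^T)^{-1} = (A^{-1})^T
```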
Determinants
Properties
- \(\det(AB) = \det(A)\det(B)\)
- \(\det(A^T) = \det(A)\)
- \(\det(A^{-1}) = \frac{1}{\det(A)}\)
- If any row/column is zero, \(\det(A) = 0\)
- Swapping two rows (or columns) changes the sign of the determinant
Calculation Methods
- Cofactor expansion
- Row/Column operations
- For triangular matrix: product of diagonal elements
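A short NumPy check of the product, transpose, inverse, and triangular-matrix rules above (example matrices are arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])   # upper triangular
B = np.array([[1.0, 4.0], [2.0, 1.0]])

print(np.linalg.det(A))   # ~6.0 = 2 * 3, product of diagonal entries
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(B.T), np.linalg.det(B)))                       # True
print(np.isclose(np.linalg.det(np.linalg.inv(B)), 1 / np.linalg.det(B)))      # True
```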
Matrix Rank
Definition
Rank of matrix = Number of linearly independent rows (or columns)
Properties
- \(\text{rank}(A) \leq \min(m,n)\) for \(A_{m \times n}\)
- \(\text{rank}(A) = \text{rank}(A^T)\)
- \(\text{rank}(AB) \leq \min(\text{rank}(A), \text{rank}(B))\)
- Full rank: \(\text{rank}(A) = \min(m,n)\)
Finding Rank
- Convert to Row Echelon Form (REF)
- Count non-zero rows
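A minimal sketch of a rank computation on a made-up matrix; numpy.linalg.matrix_rank is SVD-based rather than REF-based, but it returns the same count of independent rows:

```python
import numpy as np

# Third row = first row + second row, so only 2 rows are independent
A = np.array([[1, 2, 3],
              [0, 1, 1],
              [1, 3, 4]])

print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2: rank(A) = rank(A^T)
```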
Systems of Linear Equations
System of Linear Equations
General Form
\(Ax = b\) where \(A_{m \times n}\), \(x_{n \times 1}\), \(b_{m \times 1}\)
Augmented Matrix
\([A|b] = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} & | & b_1 \\ a_{21} & a_{22} & \cdots & a_{2n} & | & b_2 \\ \vdots & \vdots & \ddots & \vdots & | & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} & | & b_m \end{pmatrix}\)
Solution Methods
- Gaussian Elimination
- Gauss-Jordan Method
- Matrix Inversion (if \(A\) is square and invertible)
- Cramer’s Rule (for square systems)
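An illustrative solve of a small square system (the coefficients are made up; np.linalg.solve uses an LU factorisation, i.e. Gaussian elimination with pivoting, under the hood):

```python
import numpy as np

# 2x + y = 5,  x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)     # preferred over computing inv(A) @ b
print(x)                      # [1. 3.]
print(np.allclose(A @ x, b))  # True
```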
Solution Types
Let \(r(A) =\) rank of \(A\), \(r(A|b) =\) rank of augmented matrix
Consistent System
\(r(A) = r(A|b)\)
- Unique Solution: \(r(A) = r(A|b) = n\)
- Infinite Solutions: \(r(A) = r(A|b) < n\)
Inconsistent System
\(r(A) \neq r(A|b)\) \(\Rightarrow\) No solution
Homogeneous System (\(Ax = 0\))
- Always consistent (trivial solution \(x = 0\))
- Non-trivial solution exists \(\Leftrightarrow\) \(r(A) < n\)
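A small sketch of the rank test for consistency, using arbitrary example systems (the helper `solution_type` is hypothetical, written here only to illustrate the rule above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # rank 1 (second row = 2 * first row)
b_consistent   = np.array([3.0, 6.0])
b_inconsistent = np.array([3.0, 7.0])

def solution_type(A, b):
    rA  = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))  # rank of [A|b]
    n   = A.shape[1]
    if rA != rAb:
        return "no solution"
    return "unique solution" if rA == n else "infinitely many solutions"

print(solution_type(A, b_consistent))    # infinitely many solutions
print(solution_type(A, b_inconsistent))  # no solution
```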
Cramer’s Rule
For Square System \(Ax = b\) with \(\det(A) \neq 0\)
\(x_i = \frac{\det(A_i)}{\det(A)}\)
where \(A_i\) is matrix \(A\) with \(i\)-th column replaced by \(b\)
Example for 2×2
\(\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} p \\ q \end{pmatrix}\)
\(x = \frac{\begin{vmatrix} p & b \\ q & d \end{vmatrix}}{\begin{vmatrix} a & b \\ c & d \end{vmatrix}}\), \(y = \frac{\begin{vmatrix} a & p \\ c & q \end{vmatrix}}{\begin{vmatrix} a & b \\ c & d \end{vmatrix}}\)
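A hedged NumPy sketch of Cramer’s rule for a 2×2 system (a teaching illustration with arbitrary numbers; for larger systems Gaussian elimination is cheaper):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

def cramer(A, b):
    detA = np.linalg.det(A)      # must be nonzero
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b             # replace i-th column with b
        x[i] = np.linalg.det(Ai) / detA
    return x

print(cramer(A, b))              # [1. 3.]  (matches np.linalg.solve(A, b))
```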
Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors
Definition
For a square matrix \(A\), a nonzero vector \(v\) is an eigenvector with eigenvalue \(\lambda\) if \(Av = \lambda v\)
Characteristic Equation
\(\det(A - \lambda I) = 0\)
Steps to Find Eigenvalues & Eigenvectors
- Solve \(\det(A - \lambda I) = 0\) for \(\lambda\)
- For each \(\lambda_i\), solve \((A - \lambda_i I)v = 0\) for \(v\)
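An illustrative check of the two steps on an arbitrary 2×2 example, with the numerical result computed by np.linalg.eig:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
# Characteristic equation: det(A - lambda I) = lambda^2 - 7*lambda + 10 = 0
# so lambda = 5 or lambda = 2

eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))                    # [2. 5.]
# Each column v of eigvecs satisfies (A - lambda I) v = 0, i.e. A v = lambda v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))     # True, True
```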
Properties of Eigenvalues
Important Properties
- Sum of eigenvalues = Trace of \(A\) = \(\sum a_{ii}\)
- Product of eigenvalues = \(\det(A)\)
- Eigenvalues of \(A^T\) = Eigenvalues of \(A\)
- Eigenvalues of \(A^{-1}\) = \(\frac{1}{\lambda_i}\) (if \(A\) invertible)
- Eigenvalues of \(A^k\) = \(\lambda_i^k\)
For Symmetric Matrices
- All eigenvalues are real
- Eigenvectors corresponding to distinct eigenvalues are orthogonal
- Always diagonalizable (in fact, orthogonally diagonalizable)
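A quick numerical check of the trace/determinant properties and the symmetric case (example matrices are arbitrary):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = np.linalg.eigvals(A)
print(np.isclose(lam.sum(), np.trace(A)))        # True: sum of eigenvalues = trace
print(np.isclose(lam.prod(), np.linalg.det(A)))  # True: product of eigenvalues = det
print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))),
                  np.sort(1 / lam)))             # True: eigenvalues of A^{-1} are 1/lambda_i

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                       # symmetric
w, V = np.linalg.eigh(S)                         # eigh: for symmetric/Hermitian matrices
print(w)                                         # [1. 3.] -- all real
print(np.allclose(V.T @ V, np.eye(2)))           # True: orthonormal eigenvectors
```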
Diagonalization
Definition
Matrix \(A\) is diagonalizable if \(A = PDP^{-1}\) where \(D\) is diagonal
Condition for Diagonalization
\(A\) is diagonalizable \(\Leftrightarrow\) \(A\) has \(n\) linearly independent eigenvectors
Diagonalization Process
- Find eigenvalues \(\lambda_1, \lambda_2, \ldots, \lambda_n\)
- Find corresponding eigenvectors \(v_1, v_2, \ldots, v_n\)
- Form \(P = [v_1 | v_2 | \cdots | v_n]\)
- Form \(D = \text{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)\)
- Then \(A = PDP^{-1}\)
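A sketch of the process above on a small symmetric example, where \(P\) can be taken orthogonal (illustrative only, using np.linalg.eigh):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, P = np.linalg.eigh(A)   # columns of P are eigenvectors
D = np.diag(lam)             # D = diag(lambda_1, ..., lambda_n)

# Reconstruct A = P D P^{-1}; for this symmetric example P^{-1} = P^T
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True
print(np.allclose(A, P @ D @ P.T))                # True
print(np.allclose(np.linalg.matrix_power(A, 3),
                  P @ np.diag(lam**3) @ P.T))     # True: A^k = P D^k P^{-1}
```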
Quick Reference Formulas
Key Formulas for GATE
- Determinant 2×2: \(ad - bc\)
- Determinant 3×3: Use cofactor expansion
- Inverse 2×2: \(\frac{1}{\det(A)} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\)
- Characteristic equation: \(\det(A - \lambda I) = 0\)
- Trace = Sum of diagonal elements = Sum of eigenvalues
Common GATE Topics
- Rank calculation using row operations
- Solution types of linear systems
- Eigenvalue problems for 2×2 and 3×3 matrices
- Properties of symmetric matrices
- Orthogonal matrices and their properties
Practice Tips
For GATE Preparation
- Master 2×2 and 3×3 determinant calculations
- Practice row operations for finding rank
- Memorize eigenvalue properties
- Focus on symmetric matrix properties
- Practice Cramer’s rule for small systems
- Understand the geometric interpretation of eigenvectors
Time-Saving Tips
- Use properties of special matrices
- For triangular matrices: eigenvalues = diagonal elements
- Check answers using trace and determinant properties
- Use elimination techniques efficiently