What's Next?

Congratulations

You have implemented 15 linear algebra concepts in Python: vectors, element-wise operations, dot products, norms, matrices, transpose, multiplication, determinants, invertibility, solving systems, eigenvalues, least squares, symbolic factoring, symbolic solving, and symbolic matrix algebra.

This is the mathematical foundation for machine learning, graphics, and scientific computing.

What to Explore Next

  • Singular Value Decomposition (SVD) — np.linalg.svd. Every matrix has an SVD. It powers image compression (JPEG), recommendation systems (Netflix), and dimensionality reduction.
  • Principal Component Analysis (PCA) — compute eigenvectors of the covariance matrix to find the directions of maximum variance. The foundation of dimensionality reduction.
  • NumPy Broadcasting — operations between arrays of different shapes. Eliminates explicit loops for most tensor operations.
  • SciPy — builds on NumPy with sparse matrices, FFT, optimization, and statistics.
  • Matplotlib — visualize vectors, transformations, eigenspaces, and regression lines.
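The SVD at the top of this list takes only a few lines to try. A minimal sketch, using an arbitrary example matrix, that decomposes, reconstructs, and builds a low-rank approximation:

```python
import numpy as np

# A small example matrix; any real matrix has an SVD.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# full_matrices=False gives the "economy" SVD:
# U is 3x2, s holds the 2 singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The factors multiply back to the original: A = U @ diag(s) @ Vt.
A_rebuilt = U @ np.diag(s) @ Vt

# Keeping only the largest singular value gives the best rank-1
# approximation of A -- the idea behind SVD image compression.
A_rank1 = s[0] * np.outer(U[:, 0], Vt[0, :])
```

The reconstruction error of the rank-1 approximation equals the discarded singular value, which is why truncating small singular values loses so little information.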

Build Something

  • Image compression — load a grayscale image as a matrix, compute its SVD, reconstruct with k singular values, and compare quality vs. size.
  • Linear regression — implement gradient descent from scratch using matrix operations on the Boston housing dataset.
  • 2D transformation visualizer — apply rotation/scaling/shear matrices to a set of points and plot before/after.
  • PageRank — model the web as an adjacency matrix and find the dominant eigenvector.
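The PageRank project above fits in a short script. A sketch using power iteration on a made-up four-page web (the link matrix and damping factor 0.85 are illustrative choices, not part of the original text):

```python
import numpy as np

# Toy web of 4 pages; links[i][j] = 1 means page i links to page j.
links = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

# Column-stochastic transition matrix: M[j, i] is the probability
# of moving from page i to page j (each column sums to 1).
M = (links / links.sum(axis=1, keepdims=True)).T

# Damping blends in random jumps so every page is reachable.
d, n = 0.85, M.shape[0]
G = d * M + (1 - d) / n * np.ones((n, n))

# Power iteration converges to the dominant eigenvector of G,
# which is the PageRank vector.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = G @ rank

# Page 2, the most linked-to page, ends up with the highest score.
```

The same dominant eigenvector could be read off `np.linalg.eig(G)`; power iteration is shown because it scales to the sparse matrices a real web graph produces.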
