SI231B: Matrix Computations

Course Description

Matrix analysis and computations are fundamental tools used widely across engineering fields, including statistics, optimization, machine learning, computer vision, systems and control, signal and image processing, communications and networking, smart grids, and many more.

SI231B Matrix Computations (Matrix Methods for Data Analysis, Machine Learning, and Signal Processing) covers topics at an advanced or research level especially for people working in the general areas of Data Analysis, Machine Learning, and Signal Processing.

This course consists of several topics. Each topic elaborates on the following three parts.

  • [Theory - Analysis] The first part focuses on basic matrix operations and properties, such as matrix inversion lemmas and the Schur complement, and the characterizations of various matrix decompositions, such as the LU decomposition, QR decomposition, eigendecomposition, Schur decomposition, symmetric decomposition, singular value decomposition, and so on.

  • [Theory - Computations and Optimization] The second part considers numerical algorithms for important matrix operations, such as subspace projections, the Kronecker product, the Hadamard product, and the vectorization operator; matrix factorizations, such as the LU decomposition, QR decomposition, and eigendecomposition; solutions to matrix problems, such as linear systems of equations, least squares, and matrix equations; and matrix optimization. Sensitivity aspects will be briefly introduced. The algorithms and numerical techniques introduced are also important for solving linear/nonlinear systems and optimization problems.

  • [Models and Applications] The third part explores fundamental and frontier matrix applications such as time series prediction for flu activity analysis, image compression, subspace analysis for multiple signal classification, signal denoising, PageRank, compressive sensing (or managing underdetermined systems of equations via sparsity), dictionary learning, image denoising, deblurring, super-resolution, tomography, MRI, sensor network localization, OFDM, MIMO communications, molecular conformation, etc. Application content will be partly adapted to student interests.
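As a small preview of the computational topics listed above, the sketch below forms the LU, QR, and singular value decompositions and solves a linear system and a least-squares problem. The use of NumPy/SciPy is an illustrative assumption; the course does not prescribe a programming language, though MATLAB/Python/R are mentioned under the integrity policy.

```python
# Illustrative preview of several course topics (not official course material),
# using NumPy/SciPy as one possible toolset.
import numpy as np
from scipy.linalg import lu, qr, svd, solve

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

# LU decomposition: A = P L U (the workhorse for solving linear systems)
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# QR decomposition: A = Q R with orthonormal Q (central to least squares)
Q, R = qr(A)
assert np.allclose(Q @ R, A)

# SVD: A = U_s diag(s) V^T (low-rank approximation, image compression, PCA)
Us, s, Vt = svd(A)
assert np.allclose(Us @ np.diag(s) @ Vt, A)

# Solving the square linear system A x = b
x = solve(A, b)
assert np.allclose(A @ x, b)

# Least squares for an overdetermined system: minimize ||C z - d||_2
C = rng.standard_normal((6, 3))
d = rng.standard_normal(6)
z, *_ = np.linalg.lstsq(C, d, rcond=None)
```

Each decomposition here is verified by reassembling the original matrix, which is also a useful sanity check when implementing these factorizations by hand in assignments.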

(Acknowledgement: thanks to Wing-Kin (Ken) Ma (CUHK) for providing material for part of this course.)

Announcements

  1. Piazza: https://piazza.com/shanghaitech.edu.cn/spring2024/si231b

Prerequisites

  1. Compulsory: Linear Algebra, Mathematical Analysis or Advanced Calculus, Probability and Statistics.

  2. Recommended Postrequisites: Mathematical and Numerical Optimization, Machine Learning.

Textbooks and Optional References

Textbooks

  1. Gene H. Golub and Charles F. Van Loan, Matrix Computations (Fourth Edition), The Johns Hopkins University Press, 2013.

  2. Roger A. Horn and Charles R. Johnson, Matrix Analysis (Second Edition), Cambridge University Press, 2012.

References

  1. David G. Luenberger and Yinyu Ye, Linear and Nonlinear Programming, Springer, 2016.

  2. Jan R. Magnus and Heinz Neudecker, Matrix Differential Calculus with Applications in Statistics and Econometrics (Third Edition), John Wiley and Sons, New York, 2007.

  3. Gilbert Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019.

  4. Lloyd N. Trefethen and David Bau III, Numerical Linear Algebra, SIAM (Society for Industrial and Applied Mathematics), 1997.

  5. Carl D. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM (Society for Industrial and Applied Mathematics), 2000.

  6. Xianda Zhang, Matrix Analysis and Applications, Cambridge University Press, 2017.

  7. Xianda Zhang, Matrix Analysis and Applications (Second Edition), Tsinghua University Press, 2013. (a ref. in Chinese)

Schedule (Subject to Change)

Regular Topics:

  1. Topic 0: Overview

  2. Topic 1: Basic Concepts

  3. Topic 2: Linear Systems and LU Decomposition

  4. Topic 3: Least Squares

  5. Topic 4: Orthogonalization and QR Decomposition

  6. Topic 5: Eigenvalues, Eigenvectors, and Eigendecomposition

  7. Topic 6: Positive Semidefinite Matrices

  8. Topic 7: Singular Values, Singular Vectors, and Singular Value Decomposition

  9. Topic Op-1: Sparse Optimization

  10. Topic Op-2: Low-Rank (Reduced-Rank) Matrix Optimization

  11. Topic 8: Review
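The eigenvalue topics in the schedule (and the PageRank application mentioned earlier) can be previewed with a minimal power-iteration sketch. This is an illustrative assumption about one algorithm likely covered, not official course material.

```python
# Minimal power-iteration sketch: approximates the dominant eigenpair of a
# square matrix. This is also the core idea behind PageRank.
# Illustrative only; not official course material.
import numpy as np

def power_iteration(A, num_iters=500):
    """Return an estimate of the dominant eigenvalue and eigenvector of A."""
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])  # normalized starting vector
    for _ in range(num_iters):
        w = A @ v
        v = w / np.linalg.norm(w)  # renormalize each step to avoid overflow
    lam = v @ A @ v  # Rayleigh quotient estimate of the eigenvalue
    return lam, v

# Small symmetric test matrix with eigenvalues 3 and 1
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_iteration(A)  # converges to the dominant eigenvalue, 3
```

Convergence of this iteration (and its refinements into the QR algorithm) is exactly the kind of analysis treated in Topic 5.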

Note: Course materials are available on Piazza.

Assessment

30% assignments, 40% mid-term exam, 30% final project.

Intended Learning Outcomes

On successful completion of the course, students will be able to:

  • Carry out various data analysis tasks over matrices by using matrix analysis and computation techniques.

  • Apply unconstrained and constrained optimization techniques on data optimization problems.

  • Use matrix computation and optimization techniques together to solve real-life problems in data analysis, machine learning, and signal processing.

Academic Integrity Policy

Group study and collaboration on problem sets are encouraged, as working together is a great way to understand new materials. Students are free to discuss the homework problems with anyone under the following conditions:

  • Students must write down their own solutions. Plagiarism is never allowed. Suspiciously similar answers, MATLAB/Python/R code, etc., found in homework will trigger a plagiarism investigation.

  • Students must list the names of their collaborators (i.e., anyone with whom the assignment was discussed).

  • Students cannot use old solution sets from other classes under any circumstances, unless the instructor grants special permission.

Students are encouraged to read the ShanghaiTech Policy on Academic Integrity.