SI231B: Matrix Computations

Course Description


Matrix analysis and computations are key fundamental tools used widely across engineering and applied fields, including statistics, optimization, machine learning, computer vision, systems and control, signal and image processing, communications and networking, smart grids, and many more.

SI231B Matrix Computations (Matrix Methods for Data Analysis, Machine Learning, and Signal Processing) covers topics at an advanced or research level, aimed especially at people working in the general areas of data analysis, machine learning, and signal processing.

This course consists of several topics, each of which is developed in three main parts.

  • [Theory - Analysis] The first part focuses on basic matrix operations and properties, such as the matrix inversion lemmas and the Schur complement, and on the characterizations of various matrix decompositions, such as the LU decomposition, QR decomposition, eigendecomposition, Schur decomposition, symmetric decomposition, singular value decomposition, and so on. (Several of these tools are illustrated in the short numerical sketch after this list.)

  • [Theory - Computations and Optimization] The second part considers numerical algorithms for important matrix operations (subspace projections, the Kronecker product, the Hadamard product, and the vectorization operator), for matrix factorizations (LU decomposition, QR decomposition, and eigendecomposition), for solutions to matrix problems (linear systems of equations, least squares, and matrix equations), and for matrix optimization. Sensitivity aspects will be briefly introduced. The algorithms and numerical techniques introduced here are also important for solving linear/nonlinear systems and optimization problems.

  • [Models and Applications] The third part explores fundamental and frontier matrix applications, such as time series prediction for flu activity analysis, image compression, subspace analysis for multiple signal classification, signal denoising, PageRank, compressive sensing (i.e., handling underdetermined systems of equations via sparsity), dictionary learning, image denoising, deblurring, super-resolution, tomography, MRI, sensor network localization, OFDM, MIMO communications, molecular conformation, etc. Application content will be adapted in part to student interests.
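
To make these three parts concrete, the following minimal NumPy sketch (an illustration only, built on arbitrary random matrices, not course material) numerically verifies the matrix inversion lemma, solves a small least-squares problem via the QR decomposition, and computes a best low-rank approximation via the SVD:

    import numpy as np

    rng = np.random.default_rng(0)
    n, k, m = 6, 2, 20

    # Matrix inversion lemma (Sherman-Morrison-Woodbury):
    #   inv(A + U C V) = inv(A) - inv(A) U inv(inv(C) + V inv(A) U) V inv(A)
    A = np.diag(rng.uniform(1.0, 2.0, n))   # diagonal base matrix: cheap to invert
    U = rng.standard_normal((n, k))
    C = np.eye(k)
    V = rng.standard_normal((k, n))
    Ainv = np.diag(1.0 / np.diag(A))
    lhs = np.linalg.inv(A + U @ C @ V)
    rhs = Ainv - Ainv @ U @ np.linalg.inv(np.linalg.inv(C) + V @ Ainv @ U) @ V @ Ainv
    print("Woodbury identity error:", np.linalg.norm(lhs - rhs))  # ~ machine precision

    # Least squares via QR: minimize ||B x - b||_2 for a tall, full-rank B.
    B = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    Q, R = np.linalg.qr(B)              # thin QR factorization, B = Q R
    x_qr = np.linalg.solve(R, Q.T @ b)  # solve the triangular system R x = Q^T b
    x_ref, *_ = np.linalg.lstsq(B, b, rcond=None)
    print("QR vs. lstsq difference:", np.linalg.norm(x_qr - x_ref))

    # Best rank-r approximation via SVD (the principle behind image compression).
    Us, s, Vt = np.linalg.svd(B, full_matrices=False)
    r = 3
    B_r = (Us[:, :r] * s[:r]) @ Vt[:r]  # Eckart-Young optimal rank-r approximation
    print("rank-3 approximation error:", np.linalg.norm(B - B_r))

In exact arithmetic the Woodbury error would be zero and the two least-squares solutions would coincide; the tiny nonzero values printed are precisely the finite-precision effects that the sensitivity discussions in the second part address.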

(Acknowledgement to Wing-Kin (Ken) Ma (CUHK) for providing material for part of this course.)

Announcements

  1. [Course Syllabus]

  2. Time: Mon/Wed 3:00pm-4:40pm, Venue: Rm. 1D-106, SIST Building.

  3. Gradescope: see the HWs.

Prerequisites

  1. Compulsory: Linear Algebra, Mathematical Analysis or Advanced Calculus, Probability and Statistics.

  2. Recommended Postrequisites: Mathematical and Numerical Optimization, Machine Learning.

Textbooks and Optional References

Textbook

  1. Gene H. Golub and Charles F. Van Loan, Matrix Computations (Fourth Edition), The Johns Hopkins University Press, 2013.

References

  1. Roger A. Horn and Charles R. Johnson, Matrix Analysis (Second Edition), Cambridge University Press, 2012.

  2. Jan R. Magnus and Heinz Neudecker, Matrix Differential Calculus with Applications in Statistics and Econometrics (Third Edition), John Wiley and Sons, 2007.

  3. Gilbert Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019.

  4. Carl D. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM (Society for Industrial and Applied Mathematics), 2000.

  5. Alan J. Laub, Matrix Analysis for Scientists & Engineers, SIAM (Society for Industrial and Applied Mathematics), 2004.

  6. Xianda Zhang, Matrix Analysis and Applications, Cambridge University Press, 2017.

  7. Xianda Zhang, Matrix Analysis and Applications (Second Edition), Tsinghua University Press, 2013. (in Chinese)

Schedule

Regular Topics:

  1. Topic 0: Overview (0.5 week)

  2. Topic 1: Basic Concepts (1.5 weeks)

  3. Topic 2: Linear Systems (1.5 weeks)

  4. Topic 3: Least Squares (1.5 weeks)

  5. Topic 4: Orthogonalization and QR Decomposition (1 week)

  6. Topic 5: Eigenvalues, Eigenvectors, and Eigendecomposition (2 weeks)

  7. Topic 6: Positive Semidefinite Matrices (1 week)

  8. Topic 7: Singular Values, Singular Vectors, and Singular Value Decomposition (1 week)

  9. Topic 8: Least Squares Revisited (1 week)

  10. Topic 9: Kronecker Product and Hadamard Product (0.5 week)

  11. Topic 10: Review (0.5 week)

  12. Guest Lecture: Solving Random Quadratic Systems of Equations

Selected Topics:

  1. S-Topic: Numerical Linear Algebra Software (acknowledgement to Stephen P. Boyd) [slides]

  2. S-Topic: Numerical Optimization Software and the Scripting Language “CVX” (acknowledgement to Stephen P. Boyd) [slides]
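
For students who want a head start on the second selected topic, below is a minimal sketch in CVXPY, a Python modeling package in the same spirit as CVX (CVX itself is a MATLAB package; the nonnegative least-squares instance here is a hypothetical toy example, not from the slides):

    import cvxpy as cp
    import numpy as np

    # Hypothetical toy problem: nonnegative least squares,
    #   minimize ||A x - b||_2^2  subject to  x >= 0.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)

    x = cp.Variable(10)
    problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
    problem.solve()
    print("optimal value:", problem.value)
    print("solution x:", x.value)

The appeal of such modeling languages is that the code mirrors the mathematical formulation almost line by line, with the numerical solver hidden behind problem.solve().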

Assessment

30% assignments, 40% mid-term exam, 30% final project.
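
In other words, with each component graded on a 0-100 scale, the final grade is the weighted sum 0.3*assignments + 0.4*midterm + 0.3*project, as in this small sketch (the scores are hypothetical):

    # Hypothetical component scores on a 0-100 scale.
    scores = {"assignments": 85.0, "midterm": 78.0, "final_project": 92.0}
    weights = {"assignments": 0.30, "midterm": 0.40, "final_project": 0.30}
    final_grade = sum(weights[c] * scores[c] for c in scores)
    print(f"final grade: {final_grade:.1f}")  # 0.3*85 + 0.4*78 + 0.3*92 = 84.3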

Academic Integrity Policy

Group study and collaboration on problem sets are encouraged, as working together is a great way to understand new material. Students are free to discuss the homework problems with anyone under the following conditions:

  • Students must write down their own solutions. Plagiarism is never allowed. Similar answers or similar MATLAB/Python/R code found in homework submissions will trigger a suspected-plagiarism investigation.

  • Students must list the names of their collaborators (i.e., anyone with whom the assignment was discussed).

  • Students cannot use old solution sets from other classes unless the instructor grants special permission.

Students are encouraged to read the ShanghaiTech Policy on Academic Integrity.