Instructor | Xiaohui Xie |
---|---|
Lectures | Virtual MW 11:00-12:20 |
Office Hours | TBD |
Course Code | 34720 |
Other Links | Piazza |

- Course description: Overview of widely used principles and methods in scientific computing, including basic concepts and computational methods in numerical linear algebra and convex optimization.
- Numerical linear algebra
- QR factorization and least squares
- Conditioning and stability
- Systems of equations
- Eigenvalues
- Graphs and graph Laplacians
- Iterative methods
- Convex optimization
- Convex sets and convex functions
- Optimality conditions and duality
- Unconstrained optimization
- Gradient descent, stochastic gradient descent, and Newton and quasi-Newton methods
- Constrained optimization
- Interior point methods
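As a small taste of two of the topics above (QR factorization and least squares), the sketch below solves an overdetermined least-squares problem via the reduced QR factorization. This is an illustrative example only, not course material; the random matrix and right-hand side are made up for the demonstration.

```python
import numpy as np

# Illustrative data (not from the course): 6 equations, 3 unknowns.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Reduced QR factorization: Q is 6x3 with orthonormal columns, R is 3x3
# upper triangular.
Q, R = np.linalg.qr(A)

# For full-rank A, minimizing ||Ax - b||_2 reduces to the triangular
# system R x = Q^T b.
x_qr = np.linalg.solve(R, Q.T @ b)

# Cross-check against NumPy's built-in least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_qr, x_ref))  # True
```

Solving via QR avoids forming the normal equations A^T A x = A^T b explicitly, which squares the condition number of the problem — one reason conditioning and stability appear alongside QR in the topic list.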
- Prerequisites
- Multivariate calculus and linear algebra
- Grading and homework policy
- Grading: based on the midterm (40%), the final exam (50%), and class participation (10%).
- Lecture schedule
- Introduction
- Review of Linear Algebra
- Analytic Geometry
- Matrix-vector multiplication
- Reading: NLA 1
- Orthogonal vectors and matrices
- Reading: NLA 2
- Vector norms, matrix norms
- Reading: NLA 3
- Singular value decomposition (SVD)
- Reading: NLA 4, 5
- QR factorization, Gram-Schmidt orthogonalization
- Reading: NLA 7, 8
- Projectors, modified Gram-Schmidt orthogonalization
- Reading: NLA 6, 8
- Householder triangularization and least squares problems
- Reading: NLA 10, 11
- Eigenvalue problems, Rayleigh quotient, inverse iteration, Rayleigh quotient iteration
- Reading: NLA 24, 25, 26, 27
- QR algorithm for eigenvalue problems
- Reading: NLA 28, 29
- Hessenberg or tridiagonal form, computing the SVD
- Reading: NLA 26, 31
- Overview of optimization problems
- Convex optimization problems
- Introduction
- Linear programming, simplex method
- Convex sets
- Reading: CO chapter 2
- Convex functions
- Reading: CO chapter 3
- Optimization problems, duality
- Reading: CO chapter 5
- Optimality conditions, KKT conditions, duality
- Reading: CO chapter 5
- Unconstrained optimization, gradient descent, Newton's method
- Reading: CO chapter 9
- Unconstrained minimization
- Stochastic gradient descent (SGD)
- Quasi-Newton methods: BFGS, L-BFGS
- Constrained optimization, interior point methods
- Reading: CO chapters 10, 11
- Equality constrained minimization
- Interior-point methods
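To illustrate the unconstrained-minimization lectures, the sketch below compares gradient descent and Newton's method on a strictly convex quadratic. This is a hypothetical example, not course material; the matrix P, vector q, step size, and iteration count are all made up for the demonstration.

```python
import numpy as np

# Illustrative problem (not from the course): minimize the strictly
# convex quadratic f(x) = 0.5 x^T P x - q^T x, whose unique minimizer
# solves the linear system P x = q.
P = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
q = np.array([1.0, -1.0])
x_star = np.linalg.solve(P, q)          # exact minimizer

def grad(x):
    # Gradient of f: nabla f(x) = P x - q.
    return P @ x - q

# Gradient descent with a fixed step size; it converges when the step
# is below 2 / lambda_max(P) (here lambda_max is about 3.62).
x = np.zeros(2)
step = 0.2
for _ in range(200):
    x = x - step * grad(x)

# Newton's method: x <- x - H^{-1} grad(x). On a quadratic the Hessian
# is the constant matrix P, so a single step lands on the minimizer.
x_newton = np.zeros(2) - np.linalg.solve(P, grad(np.zeros(2)))

print(np.allclose(x, x_star, atol=1e-6))  # True
print(np.allclose(x_newton, x_star))      # True
```

The contrast previews a theme of CO chapter 9: gradient descent converges linearly at a rate governed by the conditioning of P, while Newton's method uses second-order (Hessian) information to converge far faster near the optimum.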
- Textbooks
- Numerical Linear Algebra by Trefethen and Bau (NLA)
- Convex Optimization by Boyd and Vandenberghe (CO)
- Mathematics for Machine Learning
- Software
- Piazza
- We will be using Piazza for class discussion. The system is designed to get you help quickly and efficiently from your classmates and from me. Rather than emailing questions to me, I encourage you to post them on Piazza. If you have any problems or feedback for the developers, email team@piazza.com.
Find our class page at: https://piazza.com/uci/winter2021/cs206/home

- Academic Honesty
- Academic honesty is a requirement for passing this class. Any student who compromises the academic integrity of this course is subject to a failing grade. The work you submit must be your own. Academic dishonesty includes, but is not limited to, copying answers from another student, allowing another student to copy your answers, communicating exam answers to other students during an exam, attempting to use notes or other aids during an exam, or tampering with an exam after it has been corrected and then returning it for more credit. Any of these actions violates the UCI Policies on Academic Honesty (see link). It is your responsibility to read and understand these policies. Note that any instance of academic dishonesty will be reported to the Academic Integrity Administrative Office for disciplinary action and may be cause for a failing grade in the course.

The primary objective of this course is to introduce the mathematical foundations of machine learning. Tentative topics are listed in the course description and lecture schedule above.