Course Objective

Course Information


Topics (assigned readings and number of lectures in brackets)
Approximation errors and approximating single variable functions
Floating-point number system and error in number representation; review of derivatives; Taylor series; finding optima of single-variable functions. [Ch. 3, 4 NME; 1 lecture]
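As a quick illustration of the Taylor-series topic above, here is a minimal Python sketch (the function name `taylor_exp` is my own, not from the readings): truncating the series for e^x about 0 and measuring the truncation error.

```python
import math

def taylor_exp(x, n_terms):
    """Approximate e^x by the first n_terms of its Taylor series about 0."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

# The truncation error shrinks as more terms are kept; for x >= 0 it is
# bounded by the tail of the series starting at the first omitted term.
approx = taylor_exp(1.0, 10)
error = abs(math.exp(1.0) - approx)
```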
Finding roots of single-variable functions: bisection, secant, and Newton–Raphson methods. [Ch. 5, 6 NME; 1 lecture]
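The two bracketing and open root-finding methods named above can be sketched as follows (a minimal illustration, not the textbook's code; function names are my own):

```python
def bisection(f, a, b, tol=1e-10):
    """Bisection: repeatedly halve a bracket [a, b] with f(a)*f(b) <= 0."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m            # sign change in [a, m]: root is there
        else:
            a, fa = m, f(m)  # otherwise the root lies in [m, b]
    return (a + b) / 2

def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson: follow the tangent line to the next root estimate."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x
```

Bisection halves the error bound each iteration (linear convergence); Newton–Raphson converges quadratically near a simple root but needs the derivative and a good starting point.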
Vectors and Matrices
Vectors: review of vector notation, vector operations, linear and affine multivariable functions, complex vectors, complexity of vector computations.
Applications: vector representation of data (e.g., images, documents, time series, features); vector representation of linear and affine functions (e.g., regression, linear (Taylor) approximation of multivariable functions).
[Ch. 1, 2 VMLS; 2 lectures]
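A tiny sketch of the linear and affine functions mentioned in this row (names `inner` and `affine` are my own; vectors are plain Python lists):

```python
def inner(a, b):
    """Inner product a^T b of two vectors given as Python lists."""
    return sum(x * y for x, y in zip(a, b))

def affine(a, b):
    """Return the affine function f(x) = a^T x + b as a closure."""
    return lambda x: inner(a, x) + b

f = affine([2.0, -1.0], 3.0)  # f(x) = 2*x1 - x2 + 3
```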
Norms and distances: Euclidean norm and distance and their properties (Cauchy–Schwarz and triangle inequalities, Pythagorean theorem); statistical measures of data: average, RMS value, standard deviation; angle between vectors, correlation, and covariance; representation of hyperplanes.
Applications: single-variable linear regression, k-means clustering.
[Ch. 3, 4 VMLS; 2 lectures]
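The norm and angle computations in this row can be sketched in a few lines (illustrative only; function names are my own):

```python
import math

def norm(x):
    """Euclidean norm of a vector given as a Python list."""
    return math.sqrt(sum(v * v for v in x))

def angle(x, y):
    """Angle between two nonzero vectors via the inner product.
    Cauchy-Schwarz guarantees the cosine lies in [-1, 1]; the clamp
    below only guards against floating-point round-off."""
    c = sum(a * b for a, b in zip(x, y)) / (norm(x) * norm(y))
    return math.acos(max(-1.0, min(1.0, c)))
```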
Direct Methods for Solving Systems of Linear Equations
Solving systems of linear equations using LU decomposition. Applications: polynomial interpolation and the Vandermonde matrix; further applications of solving systems of linear equations. [Ch. 8 VMLS; 2 lectures]
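A compact sketch of solving Ax = b by LU decomposition (Doolittle factorization without pivoting, so nonzero pivots are assumed; matrices are lists of lists, and the function name is my own):

```python
def lu_solve(A, b):
    """Solve Ax = b via Doolittle LU factorization (A = LU, unit-diagonal L),
    then forward and back substitution. No pivoting: assumes nonzero pivots."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0
        for j in range(i, n):       # row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):   # column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    # forward substitution: L y = b
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    # back substitution: U x = y
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x
```

Production codes add partial pivoting for stability; this sketch keeps only the core factor-then-substitute structure.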
Matrix inverses: left and right inverses, solving systems of linear equations using matrix inverses, Gram matrix and pseudo-inverse. [Ch. 5, 11 VMLS; 2 lectures]
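The Gram-matrix route to a left pseudo-inverse, (AᵀA)⁻¹Aᵀ, can be illustrated for the special case of a tall matrix with two independent columns (the 2x2 Gram matrix is inverted by its adjugate; the restriction to two columns and the function name are mine):

```python
def left_pinv_2col(A):
    """Left pseudo-inverse (A^T A)^{-1} A^T of a tall matrix A with two
    independent columns (A is a list of rows). Illustration only."""
    # Gram matrix G = A^T A is 2x2: [[g00, g01], [g01, g11]]
    g00 = sum(r[0] * r[0] for r in A)
    g01 = sum(r[0] * r[1] for r in A)
    g11 = sum(r[1] * r[1] for r in A)
    det = g00 * g11 - g01 * g01          # nonzero iff columns independent
    inv = [[g11 / det, -g01 / det], [-g01 / det, g00 / det]]
    # pseudo-inverse = G^{-1} A^T, a 2 x m matrix
    return [[sum(inv[i][k] * row[k] for k in range(2)) for row in A]
            for i in range(2)]
```

By construction, multiplying this left inverse against A recovers the 2x2 identity.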
Orthogonality and Least-Squares Methods
Basis, orthogonality, and inner products: basis and change of basis, orthogonal bases, the Gram–Schmidt and modified Gram–Schmidt algorithms, QR decomposition of matrices, *Householder reflections.
Applications: solving systems of linear equations using QR factorization; *lower-dimensional data representation.
[Ch. 5, 10, 11 VMLS; 2 lectures]
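The modified Gram–Schmidt procedure in this row, sketched for vectors stored as Python lists (illustration only; it returns just the orthonormal Q factor, though the projection coefficients it computes are exactly the entries of R in A = QR):

```python
import math

def gram_schmidt(vectors):
    """Modified Gram-Schmidt: orthonormalize a list of independent vectors.
    Each incoming vector has its components along the existing orthonormal
    set removed one at a time, then is normalized."""
    q = []
    for v in vectors:
        w = list(v)
        for u in q:
            proj = sum(a * b for a, b in zip(u, w))   # an entry of R
            w = [wi - proj * ui for wi, ui in zip(w, u)]
        nrm = math.sqrt(sum(x * x for x in w))        # diagonal entry of R
        q.append([x / nrm for x in w])
    return q
```

Subtracting projections from the partially reduced vector (rather than the original) is what distinguishes the modified variant and makes it more robust in floating point.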
Linear least squares: solution of over-determined systems, the normal equations and the pseudo-inverse of a matrix, computing the pseudo-inverse using QR and Cholesky factorizations, solving least-squares problems using matrix–vector derivatives.
Applications: data fitting and least-squares regression, feature engineering, least-squares classification, regularized least-squares data fitting, *least-squares function approximation.
[Ch. 12–14 VMLS; Ch. 17 NME; 3 lectures]
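The simplest instance of least-squares data fitting from this row, a straight-line fit, reduces the normal equations to a 2x2 system that can be solved in closed form (a minimal sketch; the function name is my own):

```python
def lstsq_line(xs, ys):
    """Fit y ~ a + b*x by least squares: solve the 2x2 normal equations
    [[n, sx], [sx, sxx]] [a, b]^T = [sy, sxy]^T by Cramer's rule."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    a = (sy * sxx - sx * sxy) / det
    b = (n * sxy - sx * sy) / det
    return a, b
```

For larger or ill-conditioned problems the course's QR-based solvers are preferred, since forming the normal equations squares the condition number.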
Interpolation using monomial and Lagrange bases will be discussed in the lectures on linear equations. *Interpolation using other basis functions: Newton, Legendre, and Chebyshev bases; Hermite interpolation; cubic-spline interpolation. [Ch. 18 NME; 2 lectures]
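Lagrange-basis interpolation, mentioned above, evaluates the interpolating polynomial directly from the data without solving a Vandermonde system (a minimal sketch; the function name is my own):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate at x the unique polynomial of degree < len(xs) passing
    through the points (xs[i], ys[i]), written in the Lagrange basis:
    p(x) = sum_i ys[i] * l_i(x), where l_i(xs[j]) = 1 if i == j else 0."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        li = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                li *= (x - xj) / (xi - xj)
        total += yi * li
    return total
```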
Numerical Differentiation and Integration
Finite-divided-difference approximation of derivatives; trapezoidal rule; Simpson's rule. [Ch. 22, 23 NME; 1 lecture]
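The two quadrature rules in this row can be sketched in composite form (illustration only; function names are my own):

```python
def trapezoid(f, a, b, n):
    """Composite trapezoidal rule on [a, b] with n subintervals.
    Error is O(h^2) for smooth f, with h = (b - a)/n."""
    h = (b - a) / n
    s = (f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n))
    return h * s

def simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b]; n must be even.
    Interior points alternate weights 4, 2; error is O(h^4)."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3
```

Simpson's rule is exact for cubics, which is why it recovers the integral of x^2 with no refinement.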
Problem conditioning and algorithm stability. [Ch. 6, 7 (notes); 2 lectures]
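A classic stability example fitting this final topic: the textbook quadratic formula loses accuracy to subtractive cancellation when the roots differ greatly in magnitude, while an algebraically equivalent rearrangement stays accurate (a sketch; the function names and the specific example are mine):

```python
import math

def roots_naive(a, b, c):
    """Quadratic formula as usually written; suffers cancellation when
    b*b >> 4*a*c, since -b and the discriminant nearly cancel."""
    d = math.sqrt(b * b - 4 * a * c)
    return ((-b + d) / (2 * a), (-b - d) / (2 * a))

def roots_stable(a, b, c):
    """Stable variant: compute the larger-magnitude root first (no
    cancellation), then get the other from the root product c/a."""
    d = math.sqrt(b * b - 4 * a * c)
    q = -(b + math.copysign(d, b)) / 2
    return (q / a, c / q)
```

Same mathematical problem, two algorithms: the problem is well conditioned here, so the accuracy loss in the naive version is entirely an artifact of the unstable algorithm.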