
Bachelor's Programme "Software Engineering"

Applied Methods of Linear Algebra

2019/2020 academic year
Taught in English (ENG)
5 credits
Status: Elective course
When taught: 3rd year, modules 1 and 2

Instructors

Course Syllabus

Abstract

In this lecture course, we consider topics of linear algebra beyond the standard first-year course that are extremely important for applications, mostly to data analysis and machine learning, as well as to economics and statistics. We begin with inverses of rectangular matrices, that is, we discuss pseudoinverse matrices (and their connection to the linear regression model). Among other topics, we discuss iterative methods (and their use in random-walk models on graphs applied to Internet search, such as the PageRank algorithm), matrix decompositions (such as the SVD) and dimensionality reduction methods (with their connection to image compression algorithms), and the theory of matrix norms and perturbation theory (for error estimates in matrix computations). The course also includes symbolic methods for systems of algebraic equations, approximation problems, Chebyshev polynomials, matrix functions such as the matrix exponential, etc. We plan to invite external lecturers who successfully apply linear algebra in their work. Students are also invited to give their own talks on additional topics of applied or theoretical linear algebra.
Learning Objectives

  • The aim of the course is to provide both theoretical background and practical experience in solving linear algebra problems that appear in computer science, data analysis, mathematical modelling, machine learning, and economic models. The course covers topics of matrix analysis and numerical methods of linear algebra, as well as elements of functional analysis and mathematical statistics. We provide a number of useful algorithms that can be implemented and used by students; a number of these algorithms are at the core of modern machine learning and data analysis.
Expected Learning Outcomes

  • Upon completion of this course, students will be able to apply linear algebra tools to various practical problems.
Course Contents

  • Topic 1. Pseudoinverse matrix, the least squares method, and linear regression.
    The pseudoinverse (aka Moore-Penrose inverse) matrix: its definitions, main properties, and methods of computation. Full rank decomposition; calculation of the pseudoinverse using a full rank decomposition. Calculation of the pseudoinverse using the singular value decomposition (SVD). The idea of the least squares method. Solution of a linear problem by the least squares method using the pseudoinverse. Formula for the general solution of a linear system using the pseudoinverse matrix. The linear regression problem and its solution in terms of the least squares method. Examples of solving practical problems (such as weather forecasts and market price forecasts). (See the Topic 1 sketch after this list.)
  • Topic 2. Polynomial interpolation.
    The polynomial interpolation problem. Lagrange interpolation polynomial and the Vandermonde determinant. Hermite interpolation, Hermite polynomial. Uniqueness of the solution of the Hermite interpolation problem. Quadratic and cubic splines. Bézier curves. Bézier splines and their applications in design. (See the Topic 2 sketch after this list.)
  • Topic 3. Metrics and norms.
    Metric spaces. Triangles and balls in metric spaces. Finite metric spaces and graphs. Normed vector spaces, p-norms. Norms and dot products. A parallelogram criterion for a normed vector space to be a Euclidean space. Equivalence of all norms in a finite-dimensional real vector space. Minkowski's theorem about unit balls in finite-dimensional real vector spaces. Matrix norms, their compatibility with vector norms. Frobenius norm. Spectral radius as a lower bound for matrix norms. Matrix norm induced by a vector norm. Examples: the matrix norms induced by the 1-norm and the infinity-norm. Singular radius as the matrix norm induced by the Euclidean vector norm. (See the Topic 3 sketch after this list.)
  • Topic 4. Chebyshev polynomials and polynomial approximation.
    Chebyshev polynomials of the first kind. Definition by a recurrence relation and main properties; the trigonometric definition. Chebyshev polynomials of the second kind. Relations between Chebyshev polynomials of the first and second kinds. Explicit expressions for the Chebyshev polynomials. Some important norms and dot products in function spaces. Polynomials approximating the zero function. Chebyshev's theorem about polynomials minimizing the infinity-norm. Polynomials minimizing the 1-norm (Zolotarev's theorem). Orthogonal systems of polynomials. Orthogonality properties of the Chebyshev polynomials. Approximation of functions by polynomials. (See the Topic 4 sketch after this list.)
  • Topic 5. Elements of perturbation theory and evaluation of errors.
    The condition number of a matrix and its properties. Bounds for the condition number in terms of the eigenvalues. The condition number with respect to the Euclidean norm and singular values. Error bounds for approximate solutions of systems of linear equations in terms of the errors in the right-hand side and the matrix of the system. Examples of approximate solutions of linear systems and error evaluations. Approximate inverse matrix and evaluation of the approximation error. (See the Topic 5 sketch after this list.)
  • Topic 6. Iterative methods and systems of linear equations.
    The basic iteration method for systems of linear equations. Convergence, norms, and the spectral radius. Transforming the system to ensure convergence of the basic iteration method. Gauss–Seidel method. Jacobi method. (See the Topic 6 sketch after this list.)
  • Topic 7. The eigenvalue problem and Gershgorin's theorems.
    Location and perturbation of eigenvalues; the first and second Gershgorin theorems. Cyclic cells and the Frobenius normal form of a matrix. Krylov's method for finding the minimal polynomial of a matrix. Danilevsky's method. The interpolation method for finding the characteristic polynomial of a matrix. Iterative methods for eigenvalues. (See the Topic 7 sketch after this list.)
  • Topic 8. Nonnegative matrices and PageRank.
    Nonnegative and positive matrices. Irreducible nonnegative matrices and strongly connected graphs. Properties of powers and of the inverses of irreducible nonnegative matrices. The Perron-Frobenius theorem. The linear productive model, productive matrices, and the Leontief inverse. The foundations of Input-Output analysis in macroeconomics. Brief overview of some linear algebra problems appearing in modern Input-Output analysis. Random walks on a graph. The PageRank algorithm and Internet search. PageRank and social influence indices. (See the Topic 8 sketch after this list.)
  • Topic 9. Functions of matrices.
    Power series of matrices and convergence. Using the Jordan normal form to calculate an analytic function of a matrix. Calculation of the value of an analytic function of a matrix using Hermite (or Lagrange-Sylvester) polynomials. (See the Topic 9 sketch after this list.)
  • Topic 10. Low rank approximation and dimensionality reduction.
    The problem of approximating a matrix by a matrix of bounded rank. Using the SVD to approximate a matrix by a matrix of bounded rank and minimal norm in the case of a unitarily invariant norm. Examples with image compression and handwritten digit recognition. (See the Topic 10 sketch after this list.)
  • Applied Methods of Linear Algebra. An introduction.
    An introductory overview of the course; see the Abstract above for the topics covered.
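Code Sketches

The following short Python (NumPy/SciPy) sketches illustrate the techniques from the topics above. They are minimal illustrations with made-up toy data, not part of the official course materials.

Topic 1: fitting a linear regression by the least squares method via the Moore-Penrose pseudoinverse (the data matrix below is a hypothetical toy example).

    import numpy as np

    # Toy regression data: 5 observations, an intercept column plus 2 features.
    X = np.array([[1.0, 0.0, 1.0],
                  [1.0, 1.0, 2.0],
                  [1.0, 2.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [1.0, 4.0, 3.0]])
    y = np.array([1.0, 3.0, 2.0, 5.0, 9.0])

    # The pseudoinverse (computed internally via the SVD) gives the
    # least-squares solution of X w ~ y directly: w = pinv(X) @ y.
    w = np.linalg.pinv(X) @ y

    # Cross-check against NumPy's dedicated least-squares routine.
    w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
    assert np.allclose(w, w_lstsq)
    print(w)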
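
Topic 2: polynomial interpolation by solving the Vandermonde system (toy nodes and values).

    import numpy as np

    # Interpolate values f at distinct nodes x by the unique polynomial of
    # degree <= n-1: solve the Vandermonde system V c = f for coefficients c.
    x = np.array([-1.0, 0.0, 1.0, 2.0])
    f = np.array([-1.0, 0.0, 1.0, 8.0])   # values of x**3 at the nodes

    V = np.vander(x, increasing=True)      # columns 1, x, x**2, x**3
    c = np.linalg.solve(V, f)              # c should be (0, 0, 0, 1)

    p = np.polynomial.Polynomial(c)
    assert np.allclose(p(x), f)            # the interpolant hits every node
    print(p(0.5))                          # 0.125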
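
Topic 3: vector and matrix norms, with the spectral radius as a lower bound for an induced matrix norm (toy matrix).

    import numpy as np

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])
    v = np.array([3.0, -4.0])

    # Vector p-norms: 1-norm, Euclidean norm, infinity-norm.
    print(np.linalg.norm(v, 1), np.linalg.norm(v, 2), np.linalg.norm(v, np.inf))

    # Induced matrix norms: max column sum (p=1), max row sum (p=inf),
    # largest singular value (p=2); plus the Frobenius norm.
    print(np.linalg.norm(A, 1), np.linalg.norm(A, np.inf),
          np.linalg.norm(A, 2), np.linalg.norm(A, 'fro'))

    # The spectral radius bounds every induced matrix norm from below.
    rho = np.abs(np.linalg.eigvals(A)).max()
    assert rho <= np.linalg.norm(A, 2) + 1e-12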
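
Topic 4: Chebyshev polynomials of the first kind built from the recurrence T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x) and checked against the trigonometric definition T_n(cos t) = cos(nt).

    import numpy as np

    def chebyshev_T(n):
        """Coefficient arrays of T_0..T_n (lowest degree first)."""
        T = [np.array([1.0]), np.array([0.0, 1.0])]    # T_0 = 1, T_1 = x
        for _ in range(1, n):
            prev, cur = T[-2], T[-1]
            nxt = 2.0 * np.concatenate(([0.0], cur))   # 2x * T_k
            nxt[:len(prev)] -= prev                    # ... - T_{k-1}
            T.append(nxt)
        return T[:n + 1]

    t = 0.7
    for n, coeffs in enumerate(chebyshev_T(5)):
        value = np.polynomial.Polynomial(coeffs)(np.cos(t))
        assert np.isclose(value, np.cos(n * t))        # T_n(cos t) = cos(nt)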
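
Topic 5: the condition number controls how a perturbation of the right-hand side propagates to the solution (toy ill-conditioned system).

    import numpy as np

    # Perturbation bound: if A x = b and A (x + dx) = b + db, then
    #   ||dx|| / ||x||  <=  cond(A) * ||db|| / ||b||.
    A = np.array([[1.0, 1.0],
                  [1.0, 1.0001]])
    b = np.array([2.0, 2.0001])

    kappa = np.linalg.cond(A)               # spectral condition number, ~4e4
    x = np.linalg.solve(A, b)

    db = np.array([0.0, 1e-4])              # tiny perturbation of b ...
    dx = np.linalg.solve(A, b + db) - x     # ... causes a large change in x

    lhs = np.linalg.norm(dx) / np.linalg.norm(x)
    rhs = kappa * np.linalg.norm(db) / np.linalg.norm(b)
    print(kappa, lhs, rhs)
    assert lhs <= rhs + 1e-12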
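
Topic 6: the Jacobi iteration on a strictly diagonally dominant toy system; for such matrices the iteration matrix has spectral radius below 1, so the method converges.

    import numpy as np

    def jacobi(A, b, tol=1e-12, max_iter=1000):
        """Jacobi iteration: x <- D^{-1} (b - (A - D) x)."""
        d = np.diag(A)
        R = A - np.diag(d)                   # off-diagonal part of A
        x = np.zeros_like(b)
        for _ in range(max_iter):
            x_new = (b - R @ x) / d
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new
            x = x_new
        return x

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 5.0, 2.0],
                  [0.0, 2.0, 6.0]])
    b = np.array([1.0, 2.0, 3.0])
    x = jacobi(A, b)
    assert np.allclose(A @ x, b)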
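
Topic 7: the first Gershgorin theorem states that every eigenvalue lies in some disk centered at a diagonal entry with radius equal to the off-diagonal absolute row sum (toy matrix).

    import numpy as np

    A = np.array([[ 4.0,  1.0, 0.5],
                  [ 0.2, -1.0, 0.3],
                  [ 0.1,  0.4, 2.0]])

    centers = np.diag(A)
    radii = np.abs(A).sum(axis=1) - np.abs(centers)   # off-diagonal row sums

    # Each eigenvalue must lie in at least one Gershgorin disk.
    for lam in np.linalg.eigvals(A):
        assert any(abs(lam - c) <= r for c, r in zip(centers, radii))
        print(lam)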
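
Topic 8: PageRank as power iteration on the Google matrix of a tiny hypothetical link graph.

    import numpy as np

    def pagerank(adj, damping=0.85, tol=1e-12):
        """Stationary distribution of the damped random walk (random surfer).
        adj[i, j] = 1 if page i links to page j."""
        n = adj.shape[0]
        out = adj.sum(axis=1, keepdims=True)
        P = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)  # row-stochastic
        G = damping * P + (1.0 - damping) / n                     # Google matrix
        r = np.full(n, 1.0 / n)
        while True:
            r_new = r @ G                     # one step of power iteration
            if np.linalg.norm(r_new - r, 1) < tol:
                return r_new
            r = r_new

    adj = np.array([[0, 1, 1],
                    [0, 0, 1],
                    [1, 0, 0]], dtype=float)
    print(pagerank(adj))                      # ranks sum to 1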
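
Topic 9: computing an analytic function of a diagonalizable matrix via its eigendecomposition, f(A) = S diag(f(lam)) S^{-1}, checked against SciPy's matrix exponential.

    import numpy as np
    from scipy.linalg import expm

    A = np.array([[ 0.0,  1.0],
                  [-2.0, -3.0]])                  # eigenvalues -1 and -2

    lam, S = np.linalg.eig(A)                     # A = S diag(lam) S^{-1}
    expA = (S * np.exp(lam)) @ np.linalg.inv(S)   # scale columns by exp(lam)

    assert np.allclose(expA, expm(A))
    print(expA)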
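
Topic 10: best rank-k approximation by the truncated SVD (the Eckart-Young theorem); the spectral-norm error equals the first discarded singular value (random toy matrix).

    import numpy as np

    def best_rank_k(A, k):
        """Truncated SVD: the closest rank-k matrix to A in any
        unitarily invariant norm (e.g. spectral or Frobenius)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return (U[:, :k] * s[:k]) @ Vt[:k, :]

    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 4))
    A2 = best_rank_k(A, 2)

    s = np.linalg.svd(A, compute_uv=False)
    assert np.isclose(np.linalg.norm(A - A2, 2), s[2])
    print(np.linalg.matrix_rank(A2))          # 2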
Assessment Elements

  • Homework with a list of individual problems (non-blocking).
  • Homework with a list of individual problems (non-blocking).
Interim Assessment

  • Interim assessment (module 2)
    The final mark is calculated by the formula Total = min(10, (IntermediateTest + FinalExam)/2 + BonusScore), where IntermediateTest is the mark for the written test after the 1st module, FinalExam is the mark for the final exam, and BonusScore is an additional score for students who either prepare a special talk or are exceptionally active during classes. Talks are evaluated according to their scientific and practical value, the quality of the presentation, and the number of students who participated in the project.
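    A worked example with hypothetical values: IntermediateTest = 7, FinalExam = 8, and BonusScore = 1 give Total = min(10, (7 + 8)/2 + 1) = min(10, 8.5) = 8.5.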
Bibliography

Recommended Core Bibliography

  • Aleskerov, F., Ersel, H., & Piontkovski, D. (2011). Linear Algebra for Economists. Springer.

Recommended Additional Bibliography

  • Anthony, M., & Harvey, M. (2012). Linear Algebra: Concepts and Methods. Cambridge, UK: Cambridge eText. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=443759