# Orthogonality and Symmetric Matrices and the SVD

Massive Open Online Course
## Overview


### Course Description

In the first part of this course you will explore methods to compute an approximate solution to an inconsistent system of equations, that is, a system with no solution. The overall approach is to center the algorithms on the concept of distance. To this end, you will first tackle the ideas of distance and orthogonality in a vector space. You will then apply orthogonality to identify the point within a subspace that is nearest to a point outside of it. This plays a central role in understanding solutions to inconsistent systems. By taking the subspace to be the column space of a matrix, you will develop a method for producing approximate ("least-squares") solutions for inconsistent systems.
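As a concrete sketch of the idea described above (not part of the course materials), the following assumes NumPy and a small made-up system: the least-squares solution is found by projecting b orthogonally onto the column space of A, via the normal equations.

```python
import numpy as np

# Hypothetical inconsistent system A x = b: three equations, two unknowns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Least-squares solution: solve the normal equations (A^T A) x = A^T b.
# Geometrically, A @ x_hat is the orthogonal projection of b onto Col(A).
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b - A x_hat is orthogonal to every column of A.
residual = b - A @ x_hat
print(x_hat)           # the least-squares solution
print(A.T @ residual)  # ~ zero vector: residual is perpendicular to Col(A)
```

No exact solution exists here, but `x_hat` minimizes the distance between `A @ x` and `b` over all choices of `x`.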

You will then explore another application of orthogonal projections: constructing a matrix factorization widely used in practical applications of linear algebra. The remaining sections examine some of the many least-squares problems that arise in applications, including the least-squares procedure with more general polynomials and functions.
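The factorization in question is the QR decomposition listed in the course content below. A minimal sketch, assuming NumPy and an arbitrarily chosen matrix:

```python
import numpy as np

# Small illustrative matrix (values chosen arbitrarily).
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# QR factorization: Q has orthonormal columns spanning Col(A), and R is
# upper triangular. Conceptually, Q arises from applying the Gram-Schmidt
# process to the columns of A.
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))            # True: A = QR
```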

This course then turns to symmetric matrices, which arise more often in applications, in one way or another, than any other major class of matrices. You will construct the diagonalization of a symmetric matrix, which provides a basis for the remainder of the course.
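A brief sketch of the diagonalization mentioned above (an illustrative example, assuming NumPy): by the spectral theorem, a symmetric matrix can be written as S = PDPᵀ with D diagonal and P orthogonal.

```python
import numpy as np

# A symmetric matrix (example values).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric matrices: it returns real eigenvalues
# (in ascending order) and an orthogonal matrix of eigenvectors.
vals, P = np.linalg.eigh(S)

# Spectral decomposition: S = P D P^T.
D = np.diag(vals)
print(np.allclose(P @ D @ P.T, S))      # True: S is orthogonally diagonalizable
print(np.allclose(P.T @ P, np.eye(2)))  # True: the eigenvectors are orthonormal
```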

## Course Content

INNER PRODUCT, LENGTH, AND ORTHOGONALITY

ORTHOGONAL SETS

ORTHOGONAL PROJECTIONS

THE GRAM-SCHMIDT PROCESS

LEAST-SQUARES PROBLEMS

LEAST-SQUARES AND LINEAR MODELS

DIAGONALIZATION OF SYMMETRIC MATRICES

CONSTRAINED OPTIMIZATION

THE SINGULAR VALUE DECOMPOSITION

## Requirements & Materials

### Requirements

Prerequisites

Recommended

• High school algebra, geometry, and pre-calculus

Required

• Linear Equations (DL 0050M)
• Matrix Algebra (DL 0051M)
• Determinants and Eigenvalues (DL 0049M)
### Materials

Required

• Internet connection (DSL, LAN, or cable connection desirable)

### Who Should Attend

This course is designed for undergraduate students and advanced high school students who are interested in pursuing any career path or degree program that involves linear algebra, as well as industry employees seeking a better understanding of linear algebra for their career development.

### What You Will Learn

• Orthogonal projections and distances to express a vector as a linear combination of orthogonal vectors
• How to construct vector approximations using projections
• How to characterize bases for subspaces, and construct orthonormal bases
• The iterative Gram-Schmidt process
• The QR decomposition
• Orthogonal basis construction
• How to compute general solutions and least squares errors to least squares problems using the normal equations and the QR decomposition
• How to apply least-squares and multiple regression to construct a linear model from a set of data points
• A spectral decomposition of a matrix
• Quadratic forms using eigenvalues and eigenvectors
• The SVD for a rectangular matrix

### How You Will Benefit

• Apply theorems related to orthogonality and least-squares to construct mathematical models for real-world data.
• Apply the Singular Value Decomposition (SVD) to characterize the structure of a matrix and its invertibility.
• Apply theorems related to orthogonal complements, and their relationships to Row and Null space, to characterize vectors and linear systems.
• Apply eigenvalues and eigenvectors to solve optimization problems that are subject to distance and orthogonality constraints.
• Analyze mathematical statements and expressions involving linear systems and matrices. For example, to describe how well a mathematical model fits measured data.
### Taught by Experts in the Field

• Abe Kani, President