Summary of Linear Algebra

I've spent one more week studying linear algebra. I find linear algebra more abstract than calculus: most of the time I can understand what the textbook is talking about, but when confronted with a specific linear algebra problem, I often don't know how to get started. After reading through the whole textbook, though, I am on track now. Generally speaking, linear algebra concerns itself with vector spaces and the linear mappings between such spaces. A vector space is defined over a field of scalars, where a field satisfies the conditions described below:

Let K be a subset of the complex numbers C. Then K is a field if:

1. if x and y belong to K, then x + y and xy belong to K;

2. if x belongs to K, then -x belongs to K, and if x ≠ 0, then 1/x also belongs to K;

3. 0 and 1 belong to K.
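As a concrete illustration (my own, not from the textbook), the rational numbers Q satisfy all three conditions. A minimal Python sketch using the standard-library `Fraction` type checks them for sample values:

```python
from fractions import Fraction

# Two sample rationals; Fraction keeps exact rational arithmetic.
x, y = Fraction(3, 4), Fraction(-2, 5)

# 1. Closure: x + y and x * y are again rational.
assert isinstance(x + y, Fraction) and isinstance(x * y, Fraction)

# 2. Inverses: -x is rational, and 1/x is rational when x != 0.
assert isinstance(-x, Fraction)
assert x != 0 and isinstance(1 / x, Fraction)

# 3. 0 and 1 are rational.
assert Fraction(0) == 0 and Fraction(1) == 1

print(x + y, x * y, -x, 1 / x)  # → 7/20 -3/10 -3/4 4/3
```

Of course, a few checks do not prove the axioms hold for all of Q; the sketch only makes the three conditions tangible.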

The term mapping here refers to transformations that send vectors from one space to another. Recall that in high school we defined a vector as a combination of direction and length. In the traditional Cartesian coordinate system, a vector has two components, x and y; the pair (x, y) determines the vector's direction relative to the x and y axes, and we define its length as (x^2 + y^2)^(1/2). We call such vectors 2-dimensional vectors. If we allow a vector to have more than two components, we get 3-dimensional, 4-dimensional, ..., and n-dimensional vectors.

To simplify things and make notation easier, mathematicians introduced the term space for this. A space comes with a set of base vectors, and every vector in the space can be expressed as a linear combination of them. In linear algebra, this set is called a basis. One might think a basis element is a scalar, but it is not: a basis consists of vectors, and it may contain one or several of them. Basis vectors must be linearly independent, meaning that none of them can be written as a linear combination of the others.

I think these two concepts, vector spaces and linear mappings, are the essential parts of linear algebra. Other topics such as eigenvectors and eigenvalues, singular value decomposition, and the rest build on these two concepts. For machine learning, linear algebra provides a good way to handle features, optimize multivariate models, and carry out numeric calculations.
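The length formula and the linear-independence test above can be sketched in a few lines of Python. The function names `length` and `independent_2d` are my own choices for illustration; for two 2-dimensional vectors, linear independence is equivalent to their 2x2 determinant being nonzero:

```python
import math

def length(v):
    """Euclidean length of a vector: sqrt(x1^2 + ... + xn^2)."""
    return math.sqrt(sum(c * c for c in v))

def independent_2d(u, v):
    """Two 2-D vectors are linearly independent iff det([u; v]) != 0."""
    return u[0] * v[1] - u[1] * v[0] != 0

# The standard basis of the plane: every (x, y) = x*(1, 0) + y*(0, 1).
e1, e2 = (1, 0), (0, 1)
assert independent_2d(e1, e2)

print(length((3, 4)))                   # → 5.0
print(independent_2d((1, 2), (2, 4)))   # (2, 4) = 2*(1, 2), so → False
```

The pair (1, 2), (2, 4) fails the test because one is a scalar multiple of the other, which is exactly the linear dependence described above.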



