A Beginner’s Guide to Eigenvectors, PCA, Covariance and Entropy
Diego Marinho de Oliveira
Gen-AI Search, RecSys | ex-SEEK, AI Lead, Data Scientist Manager and ML Engineer Specialist
"This post introduces eigenvectors and their relationship to matrices in plain language and without a great deal of math. It builds on those ideas to explain covariance, principal component analysis, and information entropy.
The eigen in eigenvector comes from German, and it means something like “very own.” For example, in German, “mein eigenes Auto” means “my very own car.” So eigen denotes a special relationship between two things. Something particular, characteristic and definitive. This car, or this vector, is mine and not someone else’s.
Matrices, in linear algebra, are simply rectangular arrays of numbers, a collection of scalar values between brackets, like a spreadsheet. All square matrices (e.g. 2 x 2 or 3 x 3) have eigenvectors, and they have a very special relationship with them, a bit like Germans have with their cars."
Read full article at https://bit.ly/1MZFlos
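The shared excerpt stays code-free, but as a quick illustration of that "very own" relationship (this sketch is mine, not from the article), here is a minimal NumPy example: for a square matrix A, an eigenvector v is a direction that A does not rotate but only stretches by its eigenvalue λ, i.e. A v = λ v. The 2 x 2 matrix below is an arbitrary choice just for the demo.

```python
import numpy as np

# A small 2 x 2 square matrix (any square matrix works with np.linalg.eig).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Column eigenvectors[:, i] is paired with eigenvalues[i].
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Defining property: A only rescales its eigenvector (by lam), it never changes its direction.
    print("A @ v =", A @ v, "  lambda * v =", lam * v)
    assert np.allclose(A @ v, lam * v)
```

Running it prints two pairs of identical vectors, one per eigenvalue, which is exactly the A v = λ v relationship the excerpt describes in words.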
Me no playz linkedin game: unless you have a reason to connect, please, pass by, have a nice day, thank you.
Comments:

9 years ago: Great intro, thank you.

Software craftsman
9 years ago: Such a nice illustration of both the beauty and practical usability of maths! Hi Diego, isn't 9 equal to 100 in base 3, not 30 in base 3?

Engineering & Program Management Executive
9 years ago: Excellent explanation. I wish I had this when I was doing my Statistics degree in the late 70s. It would have made things so much easier to comprehend.

Director, Data Science & Artificial Intelligence (AI-CoE), Fidelity Investments || Author (Machine Learning, Deep Learning) || AWS || Deep Learning || Reinforcement Learning || Generative AI
9 years ago: Hi Diego, the link is not working for me. Is there a way to get this article? It would be really kind if you could share a soft copy at [email protected]. Thanks.