
eigenvectors

How to intuitively understand eigenvalue and eigenvector?
Eigenvectors are like the roots of a polynomial. They constrain the transformation in particular ways.

If you have a line or plane that is invariant, then there is only so much you can do to the surrounding space without breaking that constraint. So in a sense eigenvectors matter not because they themselves are fixed, but because they limit the behavior of the linear transformation. Each eigenvector is like a skewer that helps hold the linear transformation in place.
eigenvalues  eigenvectors  intuition  constraint  intuitive 
april 2018 by drmeme
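
A minimal numpy sketch of the "skewer" idea above: for an eigenvector v of a matrix A, applying A only rescales v by its eigenvalue, so the line spanned by v is held in place. The matrix here is an arbitrary example chosen for illustration, not taken from the linked answer.

```python
import numpy as np

# Arbitrary example matrix (illustration only).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Columns of V satisfy A @ V[:, i] == w[i] * V[:, i].
w, V = np.linalg.eig(A)

for i in range(len(w)):
    v = V[:, i]
    # Transforming an eigenvector only rescales it: the line through v
    # is invariant under A, which constrains what A can do elsewhere.
    print(f"eigenvalue {w[i]:.1f}: A @ v = {A @ v}, lambda * v = {w[i] * v}")
    assert np.allclose(A @ v, w[i] * v)
```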
CNN activation visualizations - Part 2 - Deep Learning Course Forums
Here are some CNN activations for Starry Night. If you've visualized activations, please post some here so others can see.

You can click an image to zoom in. You can click the zoomed-in image to zoom in once more.

In…
visualise  visualisation  cnn  imagenet  eigenvector  eigenvectors  deep-learning  fast.ai 
september 2017 by nharbour
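
One common way to get activation maps like the ones in that thread (a sketch, not the course's own code) is a forward hook on a pretrained torchvision model; the layer choice and image path below are placeholders.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Pretrained ImageNet CNN (any torchvision model works similarly).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

# Hook one intermediate layer (arbitrary choice for illustration).
model.layer2.register_forward_hook(save_activation("layer2"))

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
# "starry_night.jpg" is a placeholder path for the input image.
img = preprocess(Image.open("starry_night.jpg")).unsqueeze(0)

with torch.no_grad():
    model(img)

# Each channel of the saved feature map can be plotted as one activation image.
print(activations["layer2"].shape)
```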
Deeplearning4j - Eigenvectors, PCA, Covariance and Entropy - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM
A Beginner’s Guide to Eigenvectors, PCA, Covariance and Entropy

This post introduces eigenvectors and their relationship to matrices in plain language and without a great deal of math. It builds on those ideas to explain covariance, principal component analysis, and information entropy.
math  deeplearning  stats  pca  eigenvectors 
june 2017 by w1nt3rmut3
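
The link the post draws between eigenvectors, covariance, and PCA can be sketched in a few lines of numpy (a generic illustration, not Deeplearning4j code): the principal components are the eigenvectors of the data's covariance matrix, ordered by eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data with correlated features (stand-in for any dataset).
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.2],
                                          [0.0, 0.5]])

# Center the data and form its covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# Eigenvectors of the covariance are the principal directions; eigenvalues
# are the variances along them. eigh is used because cov is symmetric.
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort from largest to smallest variance and project onto the top component.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
projected = Xc @ eigvecs[:, :1]

print("explained variance ratio of first component:", eigvals[0] / eigvals.sum())
```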
