Geometric Algebra

Vector Product Aesthetics

The Beauty: Inner Product

Think back for a minute to your first Linear Algebra course: remember how nice inner products were to compute? Try to recall how to do it off the top of your head. If it's a bit blurry: you just take the components of the two vectors, multiply the matching ones, and add up all the results: $$\mathbf{a \cdot b} = \sum_{i=1}^{N} a_i b_i$$ Calculating it gives you a scalar that says something about the angle between the two vectors. It is as simple to do in 687D as it is in 2D. ...
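The component-wise recipe above can be sketched in a few lines of plain Python (the function name is mine for illustration; `numpy`'s `a @ b` computes the same thing):

```python
def inner_product(a, b):
    """Multiply matching components and sum the results: sum_i a_i * b_i."""
    assert len(a) == len(b), "vectors must have the same dimension"
    return sum(a_i * b_i for a_i, b_i in zip(a, b))

# Works the same in any dimension, 2D or 687D:
print(inner_product([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```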

9 November, 2023

Understanding Neural Networks

Understanding Neural nets from first principles: Brain dump

So I was reading my company's IT newsletter the other day, where one of the topics was sparse modeling (discussing this Forbes article), and it got me thinking again about some things I had been reading over the past months about trying to understand how and why (mainly) Deep Learning works.

First principles

In a way, sure, we know how it works at the microscopic level of each individual neuron (activation functions, matrix multiplications, gradient descent and all that), and we also often describe it at a high level (where we tend to greatly anthropomorphize it: "the model learned to do X because in all its examples it saw this object from the same angle, .."). But there are many questions in between where it seems we have never connected the dots, instead relying on empiricism, often crude observations, and post-hoc justifications for the choices made here: ...

1 November, 2023