top | item 45506883

dawnofdusk | 4 months ago

I really like the second part of the blogpost but starting with Gaussian elimination is a little "mysterious" for lack of a better word. It seems more logical to start with a problem ("how to solve linear equations?" "how to find intersections of lines?"), show its solution graphically, and then present the computational method or algorithm that provides this solution. Doing it backwards is a little like teaching the chain rule in calculus before drawing the geometric pictures of how derivatives are like slopes.
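The problem-first framing the comment suggests can be sketched in a few lines: find where two lines intersect by solving for the x where their y-values agree. The lines here are made up for illustration, not taken from the blog post.

```python
# Where do y = 2x + 1 and y = -x + 7 intersect?  (Hypothetical lines.)
# Setting the right-hand sides equal: 2x + 1 = -x + 7  =>  3x = 6  =>  x = 2.
def intersect(m1, b1, m2, b2):
    """Intersection of y = m1*x + b1 and y = m2*x + b2 (assumes m1 != m2)."""
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

print(intersect(2, 1, -1, 7))  # (2.0, 5.0)
```

Starting from this picture, Gaussian elimination is then "the algorithm that finds the intersection for you", which is the ordering the comment argues for.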

egonschiele | 4 months ago

Author here – I think you're probably right. I wrote the Gaussian elimination section more as a recap, because I figured most readers have seen Gaussian elimination before, and I was keen to get to the rest of it. I'd love to hear if other folks had trouble with this section. Maybe I need to slow it down and explain it better.

maybewhenthesun | 4 months ago

I actually really liked the Gaussian elimination part. It's a term you hear often, and 'demystifying' it is good imho.

Only nitpick I have is that it's a pity you use only 1s and 2s in the example with the carbs. Because of the symmetry, it's harder to see which column/row matches which part of the vector/matrix: there are only 1s and 2s, and they fit both horizontally and vertically...

Syntonicles | 4 months ago

Loved the article, and also the shoutout to Strang's lectures.

I agree about the order - Gaussian elimination should come later. I almost closed the article; glad I kept scrolling out of curiosity.

Also, I felt like I had been primed to think about nickels and pennies as variables rather than coefficients due to the color scheme, so when I got to the food section I naturally expected to see the column picture first.

When I encountered the carb/protein matrix instead, I perceived it in the form:

[A][x], where x is [milk bread].T

so I naturally perceived the matrix as a transformation and saw the food items as variables about to be "passed through" the matrix.

But another part of my brain immediately recognized the matrix as a dataset of feature vectors, [[milk].T [bread].T], yearning for y = f(W @ x).

I was never able to resolve this tension in my mind...
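The two readings described above can be made concrete. Both views compute the same product; the numbers below (carbs/protein per serving of milk and bread) are invented for illustration, not taken from the article.

```python
# A hypothetical nutrient matrix: rows = nutrients, columns = foods.
A = [[12, 30],   # carbs per serving of [milk, bread]
     [8,   4]]   # protein per serving of [milk, bread]
x = [2, 3]       # servings of milk and bread

# Row picture: each nutrient total is a dot product of a row of A with x.
row_view = [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# Column picture: the same totals are a combination of the columns
# (milk's nutrient vector scaled by 2, plus bread's scaled by 3).
cols = list(zip(*A))  # [(12, 8), (30, 4)]
col_view = [x[0] * m + x[1] * b for m, b in zip(cols[0], cols[1])]

print(row_view, col_view)  # both [114, 28]
```

The "transformation" reading and the "dataset of feature vectors" reading are the same arithmetic organized differently, which is perhaps why the tension never resolves: it isn't a contradiction, just two decompositions of one product.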

emmelaich | 4 months ago

To some, "Now we can add the two equations together to eliminate y" might need a little explanation.

The (an) answer is that since the LHS and RHS of each equation are equal, you can add the equation to, or subtract it from, another equation and preserve equality.

If I remember correctly, substitution (isolating x or y) was introduced before this technique.
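A minimal sketch of the step in question, using made-up equations (not the ones from the article): since each side of an equation equals the other, adding two equations term by term preserves equality, and here the y terms cancel.

```python
#   2x + y = 7
#   3x - y = 8
a = (2, 1, 7)    # coefficients and RHS of 2x + y = 7
b = (3, -1, 8)   # coefficients and RHS of 3x - y = 8

# Add the equations term by term: the +y and -y cancel.
s = tuple(p + q for p, q in zip(a, b))   # (5, 0, 15), i.e. 5x = 15
x = s[2] / s[0]                          # x = 3.0
y = a[2] - a[0] * x                      # back-substitute into 2x + y = 7

# Substitution (isolate y = 7 - 2x and plug it into the second
# equation) reaches the same answer by a different route.
print(x, y)  # 3.0 1.0
```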

rzz3 | 4 months ago

I hadn’t, and your article lost me there to be honest. You didn’t explain the what, why, or when behind it, and it didn’t make sense to me at all. That said, I’m abnormally horrible at math.

DwnVoteHoneyPot | 4 months ago

Your assumption worked for me... I've seen Gaussian elimination before (but not the linear algebra), which gave me an idea of what we were doing.

turingbook | 4 months ago

Do you have any plans to turn it into a full book, maybe called Grokking Linear Algebra?

barrenko | 4 months ago

Or something to the tune of "what does it mean that we can eliminate", which is still unclear to me. But a lovely article - the way you (OP) introduce the column perspective is really helpful for a novice such as myself.

+ there are many textbooks on LA. Not a lot of them introduce things in the same order or in the same manner. I think that's part of why LA is difficult to teach and difficult to comprehend; maybe there is no unique way to do it, so we kind of need all the perspectives we can get.