dawnofdusk|6 months ago

Whenever I read content like this about Big O notation I can't help but think the real solution is that computer science education should take calculus more seriously, and students/learners should not dismiss calculus as "useless" in favor of discrete math or other things that are more obviously CS related. For example, the word "asymptotic" is not used at all in this blog post. I have always thought that education, as opposed to mere communication, is not about avoiding jargon but explaining it.
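
To make the point concrete: "asymptotic" has a short, standard definition that is worth explaining rather than dodging. Saying f(n) = O(g(n)) is a statement about the ratio f/g as n grows without bound, which in the usual notation reads:

    f(n) = O(g(n)) \iff \limsup_{n \to \infty} \frac{|f(n)|}{g(n)} < \infty

One line of calculus vocabulary, and the jargon stops being jargon.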

crystal_revenge|6 months ago

> students/learners should not dismiss calculus as "useless"

This seems to be quite a bit of a strawman to me.

ML is such a major part of the field today, and requires at a minimum a fairly strong foundation in calc, linear algebra, and probability theory, that I earnestly don't believe there are that many CS students who view calculus as "useless". I mean, anyone who has written some PyTorch has used calculus in a practical setting.
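
A minimal sketch of what I mean (toy values, but the autograd calls are standard PyTorch):

    import torch

    # f(x) = x^2, so f'(x) = 2x; autograd should report f'(2) = 4
    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 2
    y.backward()    # reverse-mode differentiation, i.e. the chain rule
    print(x.grad)   # tensor(4.)

Every backward() call is the chain rule doing its job; you can't reason about why it works without at least freshman calculus.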

Now, in pure software engineering you will find a lot of people who don't know calculus, but even then I'm not sure any would decry it as useless; they would just admit they're scared to learn it.

If anything, I'm a bit more horrified at how rapidly people's understanding of things like the Theory of Computation seems to be vanishing. I've been shocked by how many CS grads I've talked to who don't really understand the relationship between regular languages and context-free grammars, or what 'non-determinism' means in the context of computability, etc.

dawnofdusk|6 months ago

> This seems to be quite a bit of a strawman to me.

Not really: if you ever listen to CS undergrads or people in non-traditional schooling (bootcamps, etc.) talk about software engineering, this opinion is essentially ubiquitous. People interested in ML are less likely to hold this exact opinion, but they will hold qualitatively identical ones ("do you really need multivariable calculus/linear algebra to do ML?"). It is precisely because people (primarily Americans) are scared to learn mathematics that they rationalize away this fear by telling themselves the necessary mathematics must not be essential, and indeed it is true that many people get away without knowing it.

marcosdumay|6 months ago

Most people from almost every specialty dismiss calculus as useless, don't even remember linear algebra is a thing, and either never learn statistics or forget everything about it. (But the reaction to probability varies a lot.)

I have no idea what's up with those disciplines, but it's an almost universal reaction to them. Unless people are very clearly and directly using them all the time, they just get dismissed.

samwho|6 months ago

Part of the problem is that a lot of people who come across big O notation have no need, interest, or time to learn calculus. I think it's reasonable for that to be the case, too.

ndriscoll|6 months ago

The thing is, this is like saying lots of mechanical engineers have no need, interest, or time to learn derivatives; they just want to get on with "forces" and "momentum" and cool stuff like "resonance". Saying you have no interest in learning limits and asymptotes but you want to know what people are talking about when they mention asymptotic analysis doesn't make sense.

If you want to know what calculus-y words mean, you're going to need to learn calculus. People use calculus-y words to quickly convey things professionally; that's why it's a "topic" for you to learn. The thing under discussion is a limit.
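
Spelled out, one standard sufficient condition (assuming the limit exists):

    \lim_{n \to \infty} \frac{f(n)}{g(n)} = c < \infty \implies f(n) = O(g(n))

If you've never worked with limits, that line is noise; if you have, Big-O is a one-liner.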

oooyay|6 months ago

I find myself in the odd position of disagreeing with you and the person you responded to.

First, software engineering doesn't just consist of Computer Science majors. We have a lot of people from accounting and physics, and people with no degree at all. Teaching this concept in CS fixes very little.

Second, and complementary to the first: asymptotic behavior derives from the lessons you learn in Calculus. You can't really understand it beyond a facade unless you have a rudimentary understanding of Calculus. If you want to put this theory to the test, ask someone with a functional understanding of Big-O to write asymptotic notation for a moderately complex function.
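
To give a hypothetical instance of that test: take f(n) = n(n+1)/2 + n log n. The limit-based argument is one line,

    \lim_{n \to \infty} \frac{n(n+1)/2 + n \log n}{n^2} = \frac{1}{2}, \qquad \text{so } f(n) = \Theta(n^2),

but in my experience someone with only a facade-level understanding of Big-O can't produce it.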

I don't have a degree, and in order to really understand asymptotics (Big-O as well as the others) I read a primer on Calculus. It doesn't take a ton of knowledge or reading, but a decent background is what will get you there. I do think we need a lot better continuing education in software, going beyond O'Reilly-style technical books, that could fill this gap.

growthwtf|6 months ago

I'm not the original commenter, but that makes a lot of sense! I had assumed there was a huge overlap, personally.

lenkite|6 months ago

Apparently Big-O notation was invented by Paul Bachmann in 1894. It's not a johnny-come-lately.