
Kill Math

199 points | noonespecial | 14 years ago | worrydream.com | reply

85 comments

[+] LesZedCB|14 years ago|reply
Something that I think is important in math is the practical and theoretical going hand-in-hand. I'm still an undergrad student, but I have really enjoyed my math education so far. I have always appreciated both the beauty of the abstraction of math, as well as the beauty of applying that abstraction to solve very real problems.

That being said, I think something might be lost when you take away the abstraction of math. I think a lot of math -and math abstraction especially- is difficult because of the way variables are understood. Since the beginning of algebra one, we learn that we can have an unknown, say 'x', and we can solve for it trivially by manipulating an equation to set 'x' equal to some value. But people always seem to get caught up in the naming idea. I remember in high school math classes, many of my peers would get so hung up as soon as the 'x' was turned into a 'y'. I felt like it was hopeless trying to explain that 'x' and 'y' are just names for something.

This same problem percolates into every bit of math. After enough time, functions in math become first-class citizens as well. That throws even more people. The exact same problem is seen in the programming world when people move from thinking of functions as subroutines to treating them as first-class citizens. And let's not even mention Haskell or other functional languages, with currying and the like.

I guess that's the problem that I see in math: the idea of an abstraction is difficult. However, I'm not totally convinced that making complex math more concrete is the best way to address that. At some point, having linked numbers like in Interface Builder loses its usefulness. So much of math is proofs. How can you do proofs when you are constantly instantiating concepts, rather than dealing with an abstraction? At some point, "there exists" needs to become "for all," which I think might be difficult in this type of mathematical environment.

Again, concrete and abstraction really go best hand-in-hand, in my opinion.

[+] brian_campbell|14 years ago|reply
This article isn't necessarily about making ALL math less abstract / challenging. That's arguably impossible. It's about making math more accessible to the masses.

Most college-educated Americans have never learned ANYTHING about differential equations, linear algebra, or discrete math... That is a serious shame, and you could argue it impacts our national productivity/potential. Most people simply will not try to learn higher-level math out of fear of failure, challenge, or whatever.

Making SOME higher-level math more intuitive is a great objective. I would much rather live in a world where more Americans ultimately understand a higher level of math (even if the path to get there was a little less challenging) than leave math education in its current state.

Analogy: how many average Americans tried to use computers with just command prompts? I would argue that it was the creation of an intuitive interface ("physical" folders where you store documents) that really started mass adoption of computers throughout the world. Is this less challenging/abstract than command prompts? Yes. Is the world vastly more productive because of it? Absolutely.

[+] forrestthewoods|14 years ago|reply
I'm not sure how I feel about this. Sometimes complicated things are just complicated. I'm not sure making pretty pictures to make things appear more simple than they are is necessarily a good thing. Abstract thinking is difficult. Breaking the problems down such that it no longer requires abstract thought somewhat defeats the purpose.
[+] icandoitbetter|14 years ago|reply
Don't delude yourself with this Protestant hard-work-is-necessary mentality. Better representations are possible, and they can save us a lot of work. It's easier to perform multiplication with Hindu-Arabic numerals than with Roman numerals. It's easier to program in Python than in assembly. Difficulty is a property of the representation, not of some underlying 'abstract thing' that is being represented.
[+] JumpCrisscross|14 years ago|reply
Reminds me of the thesis behind the Learn Python/C/etc. The Hard Way series. By abstracting away the abstraction and complexity, we may be turning out people who are very comfortable with the specific task but, due to a lack of familiarity with the primitives underlying what they're doing, are incapable of generalising it to even slightly different problems.

There is another set of "abstract symbols" we have a "freakish knack" for manipulating: the alphabet. Literacy was once, too, deemed beyond the reach of commoners.

[+] Gravityloss|14 years ago|reply
Two things:

1. Needless complexity
2. Needed complexity

1.: Think how programming in assembler requires you to mentally keep track of registers, and how higher level programming languages did away with that by enabling variable names. The resulting program is still mostly the same thing underneath. You can get the same logical results but the former is much harder.

There is a question here though: why is it easier for most humans to program in C rather than in assembler? Sure, there's more housekeeping in assembler. But I think it's mostly because there's one less layer of representation: a=5; b=10; c=a+b; vs. "load 5 into register A1", "load 10 into register A2", "calculate the sum of registers A1 and A2 and store it in register A3".
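You can even watch that extra layer from within a high-level language; Python's dis module, for instance, prints the stack-machine-style bookkeeping hiding under c = a + b. A small sketch:

```python
import dis

# One layer up: the arithmetic reads almost like the math itself.
def add():
    a = 5
    b = 10
    c = a + b
    return c

# One layer down: the interpreter's "registers" become visible.
# Prints LOAD_CONST / STORE_FAST / BINARY_OP-style instructions
# (exact opcode names vary by Python version) -- the same kind of
# housekeeping the assembler programmer does by hand.
dis.dis(add)
```

The logical result is identical either way; only the number of things you must juggle in your head differs.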

So here in "less layered math", instead of saying "let alpha represent the angle, a the closer cathete, b the further cathete, and c the hypotenuse; then sin(alpha) = b/c", you instead say: sin is the ratio of the further cathete to the hypotenuse.

Again, a large portion of people can get the former way of explaining sin, perhaps when they see it all at once on the blackboard or book and go back and forth... (what was b again?) but the latter way of explaining does away with the whole exercise of variables. (Note that I picked a subject of convention, not something that can be deduced from more basic principles, I think those again can be taught somewhat differently.)

The problem is, many math teachers love variables and think how beautiful that whole symphony of x's and y's is. But for the average school student, math is just one subject among others. They just want the essentials, with the least amount of extra crap and layers on top.

Lots of caveats here. I think people need to learn basic algebra, but it makes me sad that so many people can't use it for problems they are trying to solve. I think these would make interesting psychology research subjects. It's a very important field.

2.: Yet you can't go beyond some point in simplification.

[+] Geee|14 years ago|reply
Have you seen Bret's talk about his principle? http://vimeo.com/36579366 It's not really about simplification of things.

The principle is basically "to see what's going on", or "to have immediate connection with what you are doing". Basically when you design something, you have to try to think or simulate in your head what's going on. Bret wants to remove that barrier so you don't have to think.

There's some great examples in the video where the principle is applied to graphics, game design, electronics, programming etc.

[+] DanBC|14 years ago|reply
I am hopeless at math. This is something that I am ashamed of. But there are people who seem to be proud of their ignorance; they're shocked if you haven't read any Shakespeare but happily admit they can't do percentages.

Will those people be helped by the author's approach? I don't know, but I don't think so. These people will see a number and throw their hands up, saying "Oh, maths! I can't do sudoku, how do you expect me to do this!" This attitude is not quite as prevalent in the UK as it used to be, but it's still there. See, for example, the number of esoteric arts programmes on the BBC compared to the number of advanced science programmes. (I'm not aware of any science programming that would be beyond an enthusiastic 14-year-old. I do know of hours of arts programming that is unashamedly elitist. Elitist is fine, but it'd be nice to have some balance.)

What is needed is better maths education. (I finished school many years ago; maybe things are different now.) Math is not blindly mashing numbers and symbols and hoping for the best. Maths includes a large element of careful thinking, exploring the problem, listing the known information, listing what you want to find.

New techniques for using math would help reduce the gender inequality in math results too.

Put the normal "more research needed" caveats around this, but: There's some suggestion that girls use inefficient techniques and "just struggle through", they manage to get correct results and so they don't get extra help. Boys tend to just stop when it gets too laborious, and thus they get taught new better techniques.

(http://news.bbc.co.uk/1/hi/education/4587466.stm)

[+] tikhonj|14 years ago|reply
This is an interesting idea, and I think his final analogy summed it up perfectly: symbolic math is like a command line. But I reject his assumption that a command line is a bad interface. And that mirrors my thoughts on this post: for the layperson, a GUI may indeed be better than a command line; for a professional (a programmer) the reverse is true.

A command line lets you combine and recombine different programs and easily do many things its creators never dreamed of. Symbolic manipulation and algebra are just like that!

Let's imagine you know how to differentiate viscerally; you understand what a tangent line looks like and how to plot that and you care not for silly equations like d/dx x^3 = 3x^2. Naturally, you understand basic arithmetic in the same way and not as mechanical manipulations on symbols. You are perfectly well equipped to deal with real problems, and perhaps find it easier than shuffling symbols around. But you would never come up with a way to get the derivative of, say, an algebraic data type[1][2]. (For the curious, this is how you can define a zipper on a type.)

[1]: http://blog.lab49.com/archives/3011 [2]: http://strictlypositive.org/diff.pdf

And that's the problem really: mechanical manipulation of symbols lets you divorce the mathematical idea from the underlying "reality". It lets you generalize patterns with complete disregard for what they mean. And, somehow, this produces indubitably useful, nontrivial results. That's the real magic of math, and that's the magic that you don't get in your high school courses. (I think linear algebra is the first subject like this, but I'm just learning it now.)

[+] LesZedCB|14 years ago|reply
Also, I think a point could be made that if you were to teach a 'layperson' how to use a command line -given they were willing to learn it and not pull the "I'm too dumb, this is for you computer people" card- they would quickly discover that there are few things they do daily that they couldn't do quicker with it.
[+] platz|14 years ago|reply
I think Bret may be targeting laypeople from what I've seen, so he will always come down on the side of enabling growth for larger numbers of people, as opposed to an 'elite' few (comparatively).
[+] noonespecial|14 years ago|reply
FTA: "By comparison, consider literacy. The ability to receive thoughts from a person who is not at the same place or time is a similarly great power. The dramatic social consequences of the rise of literacy are well known."

I'm thinking it might be a bit on the difficult side to express the relationships written equations make easy if all you've got is an animated graph full of dials or a "scrubbing calculator".

Written math is wonderfully dense and meaningful and allows the transmission of a very particular kind of knowledge with the simplest of mediums.

[+] firefoxman1|14 years ago|reply
That's a good point. Sort of like when I first discovered regular expressions, I thought "Wow, I can express a whole sentence worth of very specific commands in a few characters." The same goes for Math. I don't really like Math, but I certainly won't deny its power and ability to be expressed in such a compact form.
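For example, something like this (a made-up pattern and number, purely to show the density):

```python
import re

# One line of regex stands in for a paragraph of instructions:
# "an optional open paren, three digits, an optional close paren,
# then an optional separator, three digits, another optional
# separator, and four digits" -- i.e. a US-style phone number.
phone = re.compile(r"\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}")

print(bool(phone.search("Call me at (555) 867-5309 tomorrow.")))
```

A whole sentence of very specific commands, compressed into a few characters.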
[+] moxiemk1|14 years ago|reply
I'm worried that presenting math with visible, tangible intuition dominant will do more to silo the knowledge inside the ivory tower.

N-variable calculus isn't something that you can draw a picture of. Lots of (very practical) Linear Algebra is derived from the pure "Linear Transformation" interpretation rather than the "Matrix" one. Stochastic anything (which seems to be all the rage these days) does not come with helpful pictures.

If we train people to do math in a non-abstract way, they won't be as easily able to grapple with the real problems, which are only approachable in the abstract.

[+] enjalot|14 years ago|reply
I disagree, at least that pictures can't help with higher dimensional math. Well, regular pictures may not help so much but interaction really can. A lot of my understanding of higher level math depends on my programming ability to "touch" some of the abstract concepts.

When learning something new you need to be able to associate the new concept with things you already understand, and visuals can give you an intuitive sense of how things are behaving (especially in time).

I don't think the visuals could necessarily take the place of training to use the abstract symbols or lines of code any time soon, but I think they are a really necessary step in the right direction to get more people to approach math.

[+] aspensmonster|14 years ago|reply
One question: How were those demonstrations programmed? I have a feeling they were developed by "manipulating abstract symbols."

Math doesn't need a new interface, any more than a POSIX compliant shell does. Math is a beautiful and expressive language. Its strength is in communicating difficult, abstract concepts in a language that is manipulable. This manipulation is of the utmost importance in understanding the relationships between different concepts and even in developing new ones. Removing the single strongest aspect --the "abstract symbols"-- of mathematics is to neuter it.

[+] maxerickson|14 years ago|reply
I think part of his goal is to help people that do not yet understand the abstraction have a way to manipulate a system involving it.

Then play can lead to understanding, in a way that is unlikely for 2x+1=5.

[+] Tycho|14 years ago|reply
I get the sense that there's two levels of understanding for most mathematics: a superficial level where you just have to manipulate symbols and be comfortable applying rules, and a deeper level where you grasp the underlying relationships and some fundamental 'why' of the problem. I used to think that if you were bad at the first, then the second would be an even tougher challenge. But I'm starting to suspect that actually people who are good at the first often just don't care about the second. And in fact it might be the very need for things to 'make sense' to you on a deeper level that is causing discomfort and difficulty at the superficial level. Some people lack curiosity about the deeper relationships, and perversely this might actually help them get along at the superficial level.

For instance in finance, why is Macaulay duration(1) approximately equal to effective duration? One measures time in years, the other measures sensitivity in percentage form. How the heck do they come out at approximately the same number? For some reason I can't find any literature that's in a rush to explain this relationship... even though the two measures share the name 'duration', so presumably there's some intuitive understanding to be reached.

(1) when the yield is expressed continuously compounding, at least

[+] Jach|14 years ago|reply
I had some initial thoughts when I first saw this, I don't think they've changed much since. It'll be interesting to see where it ends up, for sure. I wrote again about this page after reading the somewhat recent Dijkstra lecture about the radical novelty that is programming (and other topics). Here's a slightly modified copy/paste, I'll warn that it kind of wanders after "Things Other People Have Said" so you're invited to stop reading at that point.

While I sympathize with the opening, because many neat things have been made/discovered without the person having any formal math knowledge like what the "U"-looking symbol in an equation stands for:

>"The power to understand and predict the quantities of the world should not be restricted to those with a freakish knack for manipulating abstract symbols."

I heavily disagree with the conclusion:

>"They are unusable in the same way that the UNIX command line is unusable for the vast majority of people. There have been many proposals for how the general public can make more powerful use of computers, but nobody is suggesting we should teach everyone to use the command line. The good proposals are the opposite of that -- design better interfaces, more accessible applications, higher-level abstractions. Represent things visually and tangibly.

>And so it should be with math. Mathematics, as currently practiced, is a command line. We need a better interface."

I think the notion that they're unusable by the vast majority of people because of something fundamental about people is false. At some point in time reading and writing didn't exist, and even as recently as 500 years ago the vast majority of the population neither read nor wrote. Along came public education, and that proportion flipped insanely fast, such that the vast majority are now capable of reading and writing (regardless of how good they are at it). Reading and writing were just as much radical novelties as computing; just because something is a radical novelty doesn't mean most humans can't eventually be proficient at it.

I think we should teach everyone to use the command line. Well, not Windows' CMD.EXE, but bash preferably in a gnome-terminal. Here is a short quote expressing why I think this is a good idea:

>Linux supports the notion of a command line or a shell for the same reason that only children read books with only pictures in them. Language, be it English or something else, is the only tool flexible enough to accomplish a sufficiently broad range of tasks. -- Bill Garrett

I think we should teach everyone how to interact with the computer in the most general way -- which means programming. Which means commanding the computer. Describing the inverse square law in terms of pictures and intuition isn't going to make it any more of a tool than some other method; people are still going to think of it as something one is taught. The only way to make it seem like a tool is to use it as a tool, and this means programming for some purpose. Maybe a physics simulation. And the beauty of tools is why the software world has exploded with utility despite Dijkstra's depression at programmers' inability to program extremely well. The beauty of tools is that they can be used without understanding the tool, just what the tool is useful for.

The "Things Other People Have Said" at the end is more interesting than the essay.

I wonder what Dijkstra would think of it. My two best guesses are "This is just a continuation of infantilizing everything" and "We did kill math, with programming." I think a lot of people's difficulties with symbolic manipulation are due to the symbols not having immediately interpreted meaning. Dijkstra seems to recommend fixing this by drilling symbol manipulation of the predicate calculus with uninterpreted symbols like "black" and "white". My own approach, which I have been using more and more, is to just use longer variable names in my math, whether written or typed. It really seems like this simple step can be a tremendous aid in understanding what's going on.

Over the past couple of years I've realized just how important sheer memorization can be as I see almost everyone around me struggle with basic calculus facts, which means they struggle with application of the facts. The latest example from a few weeks ago in a junior level stats course with calc 2 prerequisite (which many people take 2 or 3 times here apparently) was when apparently no one but me recognized (or bothered to speak up after 20 seconds) that (1+1/x)^x limited to infinity is the definition of e, which we then immediately used with (1 - const/x)^x = e^(-const). (Which is immediately related to the continual compound interest formula that changes (1+r/n)^nt to e^rt as n->infty which everyone should have seen multiple times in algebra 2 / pre calc when learning about logarithms! The two most common examples are money and radioactive decay.)
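For what it's worth, those limits are easy to sanity-check numerically; a quick Python sketch (the large n and the constants are arbitrary):

```python
import math

n = 1_000_000  # stand-in for "n -> infinity"

# (1 + 1/n)^n -> e
assert abs((1 + 1/n) ** n - math.e) < 1e-3

# (1 - c/n)^n -> e^(-c), here with c = 3
c = 3
assert abs((1 - c/n) ** n - math.exp(-c)) < 1e-3

# Continuous compounding: (1 + r/n)^(n*t) -> e^(r*t)
r, t = 0.05, 10
assert abs((1 + r/n) ** (n * t) - math.exp(r * t)) < 1e-3

print("all three limits check out numerically")
```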

Sure, that's a memorized fact for me; it's not necessarily intuitive apart from the 1/x being related to logs, which are related to e, but come on. In my calc 3 course that I didn't get to waive out of, where it was just me and two others, I was the only one who passed, and when the teacher would try to make the class more interactive by asking us to complete the step instead of giving a pure lecture, the other two were seemingly incapable of remembering even the most basic derivative/integral pairs like D[sin(x)]=cos(x). A lot of 'education' is students cramming for and regurgitating on a test, where the teacher hopes (often in vain) that the cramming resulted in long-term memorization. I do this too, cram and forget, but maybe I just do it less frequently or for less important subjects, or my brain's wired to memorize things more easily than average (but I disfavor hypotheses that suppose unexplained gaps in human ability as largely static). My high school teacher's pre-calc and calc courses had frequent pure memorization tests, such that there wasn't a need to cram elsewhere because that iterative cramming was enough to get long-term memorization (with cache warmups every so often; I did have to breeze through a calculus book in preparation for a waiver exam last year).

A person can only hold so many things in their active, conscious memory at once (some people think it's only 7 plus or minus 2, but that seems like too weird a definition of 'thing' that results in so little; human brains are not CPUs with only registers r0 to r6), so when you start looking at a math proof with x's and y's and r's and greek letters and other uninterpreted single-character symbols all over the place, it's incredibly easy to get lost. If you start in the middle, and your brain hasn't memorized that "x means this, y means this" for any value of "this", you have to do a conscious lookup and dereference the symbol to some meaning even if the actual math itself is clear. My control systems book uses C and R for output/input (I already forgot which is which), but it's so embedded in my brain that Y is typically output and X is input that I use those instead, as does the teacher. I agree with Dijkstra that practice at manipulating uninterpreted symbols makes it a bit easier, but there are quickly diminishing returns, and ever since I started programming and saw how PHP uses the $var syntax to denote variables I've been thinking "Why hasn't this caught on in math? It makes so many things much clearer!" But it's not so much the "$" in $var (or @var and other sigils in Perl) as the "var". Saving your brain a dereference step is pretty useful.

Single-character symbols (and multi-character symbols) that not only hide tons of meaning through a dereference, which is naturally how language works, but also suggest a meaning themselves, are stupid and promote natural diseased human thinking. My poster child here is global warming. Every winter you'll hear the same comments: "When's global warming going to hit here?" The poor choice of "global warming" as a variable that points to a bigger theory has cost it heavily in PR, because humans look only at the phrase instead of what it points to. It's so bad that people's everyday experiences, which they use to form a subconscious prior probability for global warming as a theory, would have to contain "it's really hot all the time and everywhere I've been!" -- so they look at "global warming" and dismiss any evidence for/against it based purely on the name and how their experience seems to contradict the name. It's like a computer thinking that a pointer address that happens to correspond to 0xADD means it should invoke addition, when it should figure out what 0xADD points to instead, which could be anything.

Another use of long names is with Bayes' Theorem in probability. It is only through memorization of the uninterpreted symbols A, B, C themselves that I remember prob(A | B, C) = prob(A | C) * prob(B | A, C) / prob(B | C). But it never made intuitive sense to me, and I never bothered memorizing it until it was expressed as prob(hypothesis | data, bg_info) = prob(hypothesis | bg_info) * prob(data | hypothesis, bg_info) / prob(data | bg_info). (Sometimes data is replaced by model.) The notion that it relates reason with reality, that it encapsulates the process of learning and scientific reasoning, elevates the equation to the status of a tool instead of something you cram for and regurgitate on a test. An immediately useful application of the tool is using Naive Bayes to filter spam.
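A minimal sketch of that long-name version in Python (every probability here is a made-up number, purely for illustration):

```python
def posterior(prior_hypothesis, prob_data_given_hypothesis, prob_data):
    """prob(hypothesis | data) =
       prob(hypothesis) * prob(data | hypothesis) / prob(data)"""
    return prior_hypothesis * prob_data_given_hypothesis / prob_data

# Toy spam question: how likely is a message spam, given it
# contains the word "viagra"? (Invented numbers.)
prob_spam = 0.4                # prior: fraction of mail that is spam
prob_word_given_spam = 0.05    # the word appears in 5% of spam
prob_word_given_ham = 0.001    # ...and in 0.1% of legitimate mail

# Marginal: prob(word) summed over both hypotheses.
prob_word = (prob_word_given_spam * prob_spam
             + prob_word_given_ham * (1 - prob_spam))

print(posterior(prob_spam, prob_word_given_spam, prob_word))  # ~0.97
```

With the descriptive names, each factor dereferences to a meaning on sight; with A, B, C you do the lookup in your head instead.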

[+] jacobolus|14 years ago|reply
> (1+1/x)^x limited to infinity is the definition of e

This shouldn’t be “memorized” as a “fact” though. As you point out, it’s the very definition of e. Which is to say, to really understand what the exponential function means implies building up a mental model about lim [n→∞] (1 + x/n)^n and its behavior, interacting with it, connecting it to other functions, seeing what happens when you combine it with other ideas: trying to integrate/differentiate it; noticing how it reacts to fourier transform; relating it to rotations, areas, imaginary numbers; writing it as a taylor expansion or continued fraction, or with a functional equation, or as the solution to a differential equation. Connecting it to derived functions such as e.g. the normal distribution, or sine, or hyperbolic sine. Generalizing it to an operation on complex numbers, or quaternions, or matrices. Thinking about what exponentiation in general really means. Coming up with algorithms for computing e’s decimal expansion or proving that e is irrational and transcendental. Solving problems with it, like to start with, continuously compounded interest (&c. &c. &c.).

No one who had really learned about exponentials in a deep way would easily forget that this is the definition of e, and that has nothing to do with lists of facts or rote memorization.

> Single-character symbols (and multiple character symbols) that not only hide tons of meaning through a dereference, which is naturally how language works, that also suggest a meaning themselves, are stupid and promote natural diseased human thinking.

I think you should try writing out some very difficult, complex math proofs before you make this assertion. Things are bad enough when we pack ideas down. Start expanding them in the middle of a computation, and the steps become almost impossible to see or reason about.

The whole point of assigning things short simple names is that it gives a clear analogy (to past experience with the conventions) that provides a shortcut to anticipating how particular types of structures and operations behave. Cramming a matrix or a multivariate function or an operator down into a single symbol helps us to treat that symbol as a black box for the purpose of the proof or problem, which helps us bring our existing mathematical tools and understandings to bear. Sometimes we run afoul of false impressions, as when we apply intuition about the real numbers to matrices in ways that don't quite apply, or intuition about metric spaces to general topological spaces, &c. But this is, I think, an unavoidable cost, and it's why we strive to be careful and precise in making mathematical arguments.

[+] jabkobob|14 years ago|reply
I must disagree with the command-line part. I don't think that the command line is fundamentally more powerful than any other interface.

Why is the command line powerful? Because it offers a large number of utilities that are highly configurable and that can be linked easily.

But you could have just the same expressiveness if the interface was eg. a circuit diagram, where you connect configurable commands with lines.

You know why the command line uses text input? Because it is the simplest to implement. The only people who need to know how to use the command line are people who need to use software where it doesn't pay off to make a more intuitive interface.

[+] abecedarius|14 years ago|reply
I remember Bayes as P(a|b) p(b) = p(a,b) = p(b|a) p(a). It's hardly the only way it's linked up in my personal mind-map, of course, but it's how I first memorized it.

I only skimmed this comment, I'm afraid.

[+] mbq|14 years ago|reply
This is so wrong... The point of math is that ideas can be generalized and re-used in previously unexpected places -- your ideas remind me of the Middle Ages, when people learned mnemonic poems to remember the "law of proportion", treating it like a precious magic gizmo, perfectly sure they wouldn't be able to recreate it once forgotten.
[+] Riesling|14 years ago|reply
Have you read the article? One of his arguments is that math, due to its interface, is not about ideas but about abstract symbols and "symbol-pushing tricks". For example, many people know how to use the chain rule when they differentiate a function without having even the slightest grasp of what is going on. How is this any different from "learning a mnemonic poem to remember the law of proportion"?
[+] shadowmint|14 years ago|reply
I can't help but think this is completely missing the point.

TED has had people cover this a number of times, but the problem with math isn't that the _syntax is too hard to parse_, it's that the _problems in math class are stupid_.

"The bucket is depth X, diameter D and full of water. When you open the tap, how long will it take to drain if the water drains out at rate Z?"

I've never had to solve a problem like this in my entire life (and if I were in a job where I _did_ have to, the complexity of a real-life situation would mean I would still have to learn domain-specific tools to solve it (e.g. how big is the air intake? Does that limit the rate of flow? etc.)).

Tangible problems in the real world require mathematical models (often probabilistic models) to solve them.

How do you take a real world problem, break it down into bits, and then use the mathematical tools available to solve them?

By creating a model, then guessing the rules that govern that model, then comparing the model to reality and refining the rules; and when you can't figure out the rules, that's when it's time to whip out the textbook and say: well, guess what, someone has had that problem before, and this is how they solved it.
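A toy sketch of that loop in Python, reusing the bucket from above (the "observed" levels and the linear drain rule are both invented): guess a rule, compare it to reality, refine the parameter.

```python
# Invented data: water level (litres) observed each minute while draining.
observed = [10.0, 8.0, 6.1, 4.0, 2.1]

def model(initial_level, drain_rate, minutes):
    """Guessed rule: the level drops by a constant rate each minute."""
    return [max(initial_level - drain_rate * t, 0.0) for t in range(minutes)]

def misfit(rate):
    """Compare the model to reality: sum of squared errors."""
    predicted = model(10.0, rate, len(observed))
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

# Refine: try candidate rates, keep the one that best matches reality.
best_rate = min((r / 10 for r in range(5, 40)), key=misfit)
print(best_rate)  # close to 2.0 litres/minute
```

The point isn't the answer; it's that the whole exercise is model, compare, refine -- and only when the guessed rule fails do you reach for the textbook.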

Teaching kids how to create models and reach out into the mathematical library available to them when they need it would be vastly more helpful than trying to create more abstract alternative ways of understanding obscure math concepts that will never be relevant to them.

I've never been more frustrated than I was the other night when I was at a party and a boilermaker (who incidentally earns 3x what I do. damn mining boom) was telling me about all the cool math he's learnt since he started his job. It's all geometry and rate-of-flow differentials, and he said "why did I have to learn matrices at school? total waste of time. they should have been teaching us useful things"

[+] erichocean|14 years ago|reply
I find the syntax of math very hard to parse (and I write and use parsers as part of my job).

For people who "get" math, I agree it's very difficult to understand how another, obviously intelligent person can find math difficult to understand and use – especially someone who has put in substantial, sustained effort and has zero problems with the command line, programming, algorithms, parsers and just computer science in general. It doesn't make any sense.

Except they do exist (I'm living proof).

Frankly, I'm baffled by the problem. I've read numerous books on math, taken tons of courses, and spent quite a bit of time trying to figure out why math, as a tool, is out of reach to me.

Sure, I can apply the "rules" at a purely syntactic level, but a good example of where math-like thinking is needed as a tool is with a language like Haskell, which I also find completely opposed to how I think about and see the world. I wish I knew why, because I'd like to use it. :(

So while I think the presentation is at least part of the problem, there's something about mathematics that needs to "click" before you can really make use of it, and that hasn't happened for me yet, despite literally _decades_ of trying.

If anyone has pointers for books, I'll be checking back here for comments.

[+] TeMPOraL|14 years ago|reply
> "why did I have to learn matrices at school? total waste of time.(...)"

Unless you're doing game programming as a hobby, or encounter problems that can be modeled as a set of equations, which can then be easily solved with a few matrix tricks, giving you lots of fun and $$$.
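As a minimal sketch of those "matrix tricks": two unknowns and two equations can be written as a 2x2 coefficient matrix and solved by Cramer's rule. The system here is invented for illustration (pure Python, no libraries).

```python
# Solve the system
#   2x + y = 5
#    x - y = 1
# as a matrix equation [[a, b], [c, d]] @ (x, y) = (e, f),
# using Cramer's rule on the 2x2 determinant.
a, b, c, d = 2.0, 1.0, 1.0, -1.0   # coefficient matrix, row by row
e, f = 5.0, 1.0                    # right-hand side

det = a * d - b * c                # must be nonzero for a unique solution
x = (e * d - b * f) / det
y = (a * f - e * c) / det
print(x, y)                        # x = 2.0, y = 1.0
```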

But I do agree with your sentiment; even though math can be fun on its own, it's easier (and probably better) to get people interested in it by showing how to use it to solve complex problems. After all, math is just formalized, applied rational thinking.

[+] im3w1l|14 years ago|reply
The problem with what you write is that it is very hard, if not impossible, to understand these highly complex probabilistic models if you have not first practiced with simpler ones.
[+] pfortuny|14 years ago|reply
OK so it takes years to learn Japanese but it is the only way to truly enjoy haikus...

The language of modern maths is what has enabled the physics at the LHC. You cannot get the latter without the former, hard as you may try.

[+] nwatson|14 years ago|reply
It will be hard to come up with an accessible general framework to express general problem situations without resorting to either (a) an explosion of templates for solving the N-most-common kinds of problems (the browsing through which will be worse than learning the math and the programming); (b) a set of elaborate but limiting special-purpose programs (much like iPhone apps that might help, for example, a metal welder to choose stock feed rate, gas mix/flow, and electric current depending on metal type and thickness to be joined; or mortgage calculators).

Worse yet, it doesn't take much for interesting problems to get into the land of over- or under-constraint. In an under-constrained problem there are a multitude of solutions forming their own K-dimensional space, and then one likely should apply some secondary optimization criteria to determine the best solution. In over-constrained problems there is no exact solution, but again one needs to impose some optimization to get something "close enough" to the desired criteria, according to some norm. Explaining the need for optimization, letting the user explore the solution space, and having them express these optimization criteria will be tough.
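The over-constrained case above can be made concrete with a small sketch: three equations in two unknowns (fitting a line y = m*x + c to three non-collinear points) has no exact solution, so we impose a least-squares criterion and solve the resulting 2x2 normal equations by hand. The points are invented for illustration.

```python
# Over-constrained system: fit y = m*x + c to three points that
# don't lie exactly on any line. Least squares picks the (m, c)
# minimizing the sum of squared residuals, via the normal equations.
pts = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.1)]

n = len(pts)
sx = sum(x for x, _ in pts)
sy = sum(y for _, y in pts)
sxx = sum(x * x for x, _ in pts)
sxy = sum(x * y for x, y in pts)

det = n * sxx - sx * sx          # nonzero when the x's aren't all equal
m = (n * sxy - sx * sy) / det
c = (sxx * sy - sx * sxy) / det
print(round(m, 2), round(c, 2))  # slope ≈ 1.0, intercept ≈ 0.03
```

The under-constrained case is the mirror image: infinitely many exact solutions, so a secondary criterion (e.g. smallest-norm solution) has to pick one.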

A Google search for "geometric constraint solver" leads to this paper: http://www.cs.purdue.edu/homes/cmh/distribution/papers/Const... -- what appears to be a lot of work just in helping people/machines solve geometric problems (though surely more widely applicable when taken in the abstract). Coming up with even more general "hands-on" explorers/solvers will be a lot of work.

[+] dubya|14 years ago|reply
For the more mathematically inclined, there's a really nice little book, "The Computer as Crucible" by Borwein and Devlin, about experimental math. It makes a really convincing case, that I think couldn't have been made 15 years ago, that computers really have something essential to contribute to mathematics now. I think Borwein has written a number of articles about the same, and Devlin had a column in the AMS Notices about computer math.
[+] ypcx|14 years ago|reply
Tell me if I'm wrong, but I have this theory that the whole of math could be expressed on top of a single basic operation: addition.

For example 2 * 3 is (2 + 2 + 2).

  Or sin(7) is (7 * (1 + -0.1666666664*(7*7) + 0.0083333315*(7*7)*(7*7)
  + -0.0001984090*(7*7)*(7*7)*(7*7) + 0.0000027526*(7*7)*(7*7)*(7*7)*(7*7)
  + -0.0000000239*(7*7)*(7*7)*(7*7)*(7*7)*(7*7)))
(the multiply operator is left in for brevity, but can easily be replaced by addition)
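That last reduction step can be sketched directly; a toy illustration only, since it handles non-negative integer multipliers, while the fractional coefficients in the sine polynomial would need a further encoding:

```python
# Multiplication rewritten as repeated addition, for integer n >= 0.
# This is the mechanical reduction the comment describes; fractional
# factors are where pure repeated addition stops being enough.
def mul(a, n):
    """a * n via repeated addition."""
    total = 0
    for _ in range(n):
        total += a
    return total

print(mul(2, 3))  # 6, i.e. 2 + 2 + 2
print(mul(7, 7))  # 49, as in the (7*7) terms above
```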

If my assumption is true, then Math is really just an arcane and archaic form of a computer-like language.

And at the same time, it's possible that the operations behind math's various symbols, however arcane, can be understood much better than we understand them from classic texts (especially with the use of modern tech, animations, etc.). But those who learnt them haven't spent enough effort creating more visual explanations, from which others, and they themselves, could understand those operations better, and then build on top of them and even evolve math further. (I know that I don't know what I don't know, but I have a tendency to assume that I know a lot more than I really do, and thus I don't explore the known any deeper.)

[+] thristian|14 years ago|reply
It sounds like you're thinking of (or working towards) Peano arithmetic¹, which (to oversimplify) basically starts with "0", "1", "=" and "+", and works from there. Whether that's "enough" rather depends on what you're trying to achieve, though. There are subfields of mathematics that don't necessarily involve numbers at all, like geometry and topology; it would be difficult to reduce them to addition.

¹: http://en.wikipedia.org/wiki/Peano_axioms
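A minimal sketch of that Peano-style buildup: addition defined by recursion on a successor operation, multiplication by recursion on addition (plain ints stand in for the formal numerals, purely for readability):

```python
# Peano-style definitions: everything is built from 0 and successor.
def succ(n):
    return n + 1

def add(a, b):
    """a + 0 = a;  a + S(b) = S(a + b)"""
    return a if b == 0 else succ(add(a, b - 1))

def mul(a, b):
    """a * 0 = 0;  a * S(b) = (a * b) + a"""
    return 0 if b == 0 else add(mul(a, b - 1), a)

print(add(2, 3), mul(2, 3))  # 5 6
```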

[+] yequalsx|14 years ago|reply
Addition is a function and so functions would be a more basic concept. But functions are sets and sets are a more basic concept. Thus we should study sets. Everything comes from sets. Replace addition with sets and your first sentence would be closer to the truth.

I strongly disagree with your last paragraph though.

[+] super_mario|14 years ago|reply
So you think the whole of math is just numerical computation? You think there is nothing more to math than computing values of certain classes of functions? This is what you get when your only exposure to so-called higher math is calculus and perhaps a few courses on differential equations, with an emphasis on numerical methods.

There are areas of math that don't fit this simplistic view, things that don't have "analytic" in their name: topology, non-Euclidean geometry, mathematical logic, etc. Algebra in general is all about studying the structure that ensues when you have a bunch of objects and operations on them with certain properties. This structure doesn't depend on the nature of the objects or operations; the rotation group, for example. Then there are things like measure theory, fractals, and non-commutative operator algebras, which are useful for applications in quantum mechanics. All in all, a huge chunk of math is not about computation; bigger, in fact, than the part that is, especially computation with real numbers.

[+] jpdoctor|14 years ago|reply
> When most people speak of Math, what they have in mind is more its mechanism than its essence. This "Math" consists of assigning meaning to a set of symbols, blindly shuffling around these symbols according to arcane rules, and then interpreting a meaning from the shuffled result. The process is not unlike casting lots.

Wow, sounds like someone doesn't really understand math.

[+] deadsy|14 years ago|reply
Bret Victor is very accomplished. I'm pretty sure he understands math. Personally speaking, if Bret Victor had a different opinion about something than me, then I'd pay careful attention, because I'm probably missing something.
[+] lukev|14 years ago|reply
That's precisely his point. The "UI" for math obfuscates what it's actually about. Math is full of purity and simplicity, but the notation and algorithms we start out teaching kids are anything but.
[+] gmichnikov|14 years ago|reply
This is interesting, I especially like the Scrubbing Calculator. While I am not sure how it would handle math above Algebra 1, I think this could be great for building intuition in Pre-Algebra and Algebra 1, which are the most important years of math education, in my opinion.

I tutor some 11-14 year-olds who are studying these sorts of things, and it's incredible how many of them can look at something like 100/0.08 and have absolutely NO idea what neighborhood the answer should be in. (For example, they might accidentally multiply, get 8, and never think twice that the answer couldn't possibly be 8 because it has to be above 100.) Something like this might be really helpful in building intuition about the relationships between numbers.
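The sanity check described above is a one-liner; the key bit of intuition is that dividing by a number between 0 and 1 makes the result *larger*:

```python
# Estimation check for 100 / 0.08: dividing by a number between 0
# and 1 must give something bigger than 100, so an answer of 8
# can't be right (that's what you get by multiplying instead).
print(100 / 0.08)   # 1250.0
print(100 * 0.08)   # 8.0
```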