top | item 46134591


pathikrit|2 months ago

I love math but the symbology and notations get in my way. 2 ideas:

1. Can we reinvent notation and symbology? No superscripts or subscripts or greek letters and weird symbols? Just functions with input and output? Verifiable by type systems AND human readable

2. Also, make the symbology hyperlinked, i.e. if it uses a theorem or axiom that's not in the paper, hyperlink to its proof, and so on.

discuss


youoy|2 months ago

Notation and symbology come out of a minmax optimisation: minimizing complexity while maximizing reach. As with every local critical point, it is probably not the only state we could have ended up at.

For example, for your point 1: we could probably start there, but once you get familiar with the notation you don't want to keep writing a huge list of parameters, so you would probably come up with a higher-level data structure, more abstract, to write as an input. And then the next generation would complain that the data structure is too abstract and takes too much effort to be communicated to someone new to the field, because they did not live first-hand the problem that made you come up with that solution.

And for your point 2: where do you draw the line with your hyperlinks? If you mention the real plane, do you reference the construction of the real numbers? And dimension? If you reason by contradiction, do you reference the axioms of logic? If you say "let {x_n} be a converging sequence", do you reference convergence, natural numbers and sets? Or just convergence? It's not that simple, so we came up with a minmax solution, which is what everybody does now.

Having said this, there are a lot of articles and books that are not easy to understand. But that is probably more an issue of them being written by someone who is bad at communicating than of the notation.

sfpotter|2 months ago

Go ahead. Write a math paper with your proposed new notation with hyperlinks and submit it to a journal somewhere.

hodgehog11|2 months ago

(1) I always tell my students that if they don't understand why things are done a certain way, they should try to do it in the way most natural to them and then iterate to improve it. Eventually they will settle on something very similar to the most common practice.

(2) Higher-level proofs are using so many ideas simultaneously that doing this would be tantamount to writing Lean code from scratch: painful.

vjk800|2 months ago

1. I work in finance and here people sometimes write math using words as variable names. I can tell you it gets extremely cumbersome to do any significant amount of formula manipulation or writing with this notation. Keep in mind that pen and paper are still pretty much universally used in actual mathematical work and writing full words takes a lot of time compared to single Greek letters.

A large part of math notation exists to compress the writing so that you can actually fit a full equation in your field of vision.

Also, something like what you want already exists; see e.g. Lean: https://lean-lang.org/doc/reference/latest/. It is used to write math so that proofs can be machine-checked. No one wants to use it for actually studying math or manually proving theorems, because it looks horrible compared to conventional mathematics notation (once you are used to the conventional notation).
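For a taste of the tradeoff, here is a trivial identity stated and proved in Lean 4 (a sketch; it leans on the core-library lemma `Nat.add_comm`):

```lean
-- "For all naturals a and b, a + b = b + a": machine-checkable,
-- but already wordier than the blackboard version, and real proofs
-- get far more verbose than this one-liner.
example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```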

zwnow|2 months ago

I'd love getting rid of all the weird symbols in favor of clear text functions or whatever. As someone who never learnt all the weird symbols, it's really what's preventing me from getting into math again... It is just not intuitive.

Jensson|2 months ago

Those are used because they make things easier. If you wrote everything out, basically nobody would manage to learn math; that is how it used to be, and then everything got shortened and suddenly average people could learn calculus.

gjulianm|2 months ago

I'm not sure that symbols are the thing actually keeping you away. Clear text functions might not be as clear, as it will be harder to scan and it will still contain names that you might not be familiar with. Those "weird symbols" are not there because people liked to make weird symbols. No one likes them, it's just that it makes things easier to understand.

kragen|2 months ago

Probably not. The conventional math notation has three major advantages over the "[n]o superscripts or subscripts or [G]reek letters and weird symbols" you're proposing:

1. It's more human-readable. The superscripts and subscripts and weird symbols permit preattentive processing of formula structures, accelerating pattern recognition.

2. It's familiar. Novel math notations face the same problem as alternative English orthographies like Shavian (https://en.wikipedia.org/wiki/Shavian_alphabet) in that, however logical they may be, the audience they'd need to appeal to consists of people who have spent 50 years restructuring their brains into specialized machines to process the conventional notation. Aim t3mpted te rait qe r3st ev q1s c0m3nt 1n mai on alterned1v i6gl1c orx2grefi http://canonical.org/~kragen/alphanumerenglish bet ai qi6k ail rez1st qe t3mpt8cen because, even though it's a much better way to spell English, nobody would understand it.

3. It's optimized for rewriting a formula many times. When you write a computer program, you only write it once, so there isn't a great burden in using a notation like (eq (deriv x (pow e y)) (mul (pow e y) (deriv x y)) 1), which takes 54 characters to say what the conventional math notation¹ says in 16 characters³. But, when you're performing algebraic transformations of a formula, you're writing the same formula over and over again in different forms, sometimes only slightly transformed; the line before that one said (eq (deriv x (pow e y)) (deriv x x) 1), for example². For this purpose, brevity is essential, and as we know from information theory, brevity is proportional to the logarithm of the number of different weird symbols you use.

We could certainly improve conventional math notation, and in fact mathematicians invent new notation all the time in order to do so, but the direction you're suggesting would not be an improvement.

People do make this suggestion all the time. I think it's prompted by this experience where they have always found math difficult, they've always found math notation difficult, and they infer that the former is because of the latter. This inference, although reasonable, is incorrect. Math is inherently difficult, as far as anybody knows (an observation famously attributed to Euclid) and the difficult notation actually makes it easier. Undergraduates routinely perform mental feats that defied Archimedes because of it.

______

¹ \frac d{dx}e^y = e^y\frac{dy}{dx} = 1

² \frac d{dx}e^y = \frac d{dx}x = 1

³ See https://nbviewer.org/url/canonical.org/~kragen/sw/dev3/logar... for a cleaned-up version of the context where I wrote this equation down on paper the other day.

zozbot234|2 months ago

> ... It's optimized for rewriting a formula many times.

It's not just "rewriting" arbitrarily either, but rewriting according to well-known rules of expression manipulation such as associativity, commutativity, distributivity of various operations, the properties of equality and order relations, etc. It's precisely when you have such strong identifiable properties that you tend to resort to operator-like notation in any formalism (including a programming language) - not least because that's where a notion of "rewriting some expression" will be at its most effective.

(This is generally true in reverse too; it's why e.g. text-like operators such as fadd() and fmul() are far better suited to the actual low-level properties of floating-point computation than FORTRAN-like symbolic expressions, which are sometimes overly misleading.)
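To make the floating-point point concrete, a minimal Python sketch of how infix notation can mislead (assuming IEEE 754 doubles, which is what CPython uses):

```python
# Infix "+" suggests the clean algebraic laws of the reals,
# but floating-point addition is not even associative.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # groups like fadd(fadd(a, b), c)
right = a + (b + c)  # groups like fadd(a, fadd(b, c))

print(left)           # 0.6000000000000001
print(right)          # 0.6
print(left == right)  # False
```

Spelling the operations as `fadd(fadd(a, b), c)` makes the evaluation order, and hence the rounding, explicit, which is the sense in which the operator-like notation is "overly misleading" here.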

bmacho|2 months ago

AsciiMath makes easy equations read easy.

1 and 2 would be

  1) d/dx e^y = e^y dy/dx = 1
  2) d/dx e^y = d/dx x = 1
edit: edited, first got them wrong

xigoi|2 months ago

What’s wrong with Greek letters? Would the number π be any easier to understand if it was written differently?