top | item 43790143


jamesrom | 10 months ago

It's easy to think of notation as being like shell expansion: that all you're doing is replacing expressions with other expressions.

But it goes much deeper than that. My professor once explained how many great discoveries come paired with new notation. That new notation signifies "here's a new way to think about this problem". And many unsolved problems today will give way to powerful notation.


veqq|10 months ago

> paired with new notation

The DSL/language driven approach first creates a notation fitting the problem space directly, then worries about implementing the notation. It's truly empowering. But this is the lisp way. The APL (or Clojure) way is about making your base types truly useful, 100 functions on 1 data structure instead of 10 on 10. So instead of creating a DSL in APL, you design and layout your data very carefully and then everything just falls into place, a bit backwards from the first impression.
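The "100 functions on 1 data structure" idea can be sketched in TypeScript: instead of wrapping your domain in bespoke classes (or a DSL), you keep the data as plain arrays of plain objects, so every generic collection function already applies to it. The sales data below is purely illustrative, not from the thread.

```typescript
// Plain data: one ordinary structure, no custom wrapper types.
type Sale = { region: string; amount: number };

const sales: Sale[] = [
  { region: "east", amount: 100 },
  { region: "west", amount: 250 },
  { region: "east", amount: 50 },
];

// Because the data is a plain array, the whole generic toolbox
// (filter, map, reduce, sort, ...) works on it without adapters.
const eastTotal = sales
  .filter(s => s.region === "east")  // select a subset
  .map(s => s.amount)                // project one field
  .reduce((a, b) => a + b, 0);       // aggregate

console.log(eastTotal); // 150
```

The design choice is the point: careful data layout up front means the "implementation" is mostly composing functions that already exist, rather than building notation first and an interpreter for it second.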

xelxebar|10 months ago

You stole the words from my mouth!

One of the issues DSLs give me is that the process of using them invariably obsoletes their utility. That is, the process of writing an implementation seems to be synonymous with the process of learning what DSL your problem really needs.

If you can manage to fluidly update your DSL design along the way, it might work, but in my experience the premature assumptions of initial designs end up getting baked in to so much code that it's really painful to migrate.

APL, on the other hand, I have found extremely amenable to updates and rewrites. Even just psychologically, it feels far more sensible to rewrite a couple of lines of code than a couple of hundred, and in practice I find the language very well suited to quickly exploring a problem domain with code sketches.

smikhanov|10 months ago

> APL (or Clojure) way is about making your base types truly useful, 100 functions on 1 data structure instead of 10 on 10

If this is indeed so simple and so obvious, why didn't other languages follow this way?

peralmq|10 months ago

Good point. Notation matters in how we explore ideas.

Reminds me of Richard Feynman. He started inventing his own math notation as a teenager while learning trigonometry. He didn’t like how sine and cosine were written, so he made up his own symbols to simplify the formulas and reduce clutter. Just to make it all more intuitive for him.

And he never stopped. Later, he invented entirely new ways to think about physics tied to how he expressed himself, like Feynman diagrams (https://en.wikipedia.org/wiki/Feynman_diagram) and slash notation (https://en.wikipedia.org/wiki/Feynman_slash_notation).

nonrandomstring|10 months ago

> Notation matters in how we explore ideas.

Indeed, historically. But are we not moving into a society where thought is unwelcome? We build tools to hide underlying notation and structure, not because it affords abstraction but because it's "efficient". Is there not a tragedy afoot, by which technology, at its peak, nullifies its own foundations? Those of us who can do mental formalism, mathematics, code, etc. will, I doubt, have any place in a future society that values only superficial convenience and the appearance of correctness, and shuns as "slow old throwbacks" those who reason symbolically, "the hard way" (without AI).

(cue a dozen comments on how "AI actually helps" and amplifies symbolic human thought processes)

agumonkey|10 months ago

There's something to be said about economy of thought and ergonomics. On a smaller scale, when CoffeeScript popped up, it radically altered how I wrote JavaScript, because of the lambda shorthand and all the syntactic conveniences. It made the code easier to think about, read, and rewrite.
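The shorthand in question is small but real: CoffeeScript's `->` lambdas (later echoed by JavaScript's arrow functions) shave off the `function`/`return` ceremony. A minimal before-and-after sketch:

```typescript
// Pre-arrow-function JavaScript style: the function keyword and an
// explicit return for even the smallest lambda.
const squaresOld = [1, 2, 3].map(function (x) { return x * x; });

// Arrow-function shorthand (the economy CoffeeScript's "->" pioneered):
const squares = [1, 2, 3].map(x => x * x);

console.log(squares); // [1, 4, 9]
```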

Same goes for SML/Haskell and Lisps (at least for me).
