top | item 37414624

God writes Haskell

115 points | Ivoah | 2 years ago | hookrace.net

121 comments

[+] woolion|2 years ago|reply
With respect to the theological view of the question; this is always painful to me.— I am bewildered.— I had no intention to write atheistically. But I own that I cannot see, as plainly as others do, & as I should wish to do, evidence of design & beneficence on all sides of us. There seems to me too much misery in the Haskell world. I cannot persuade myself that a beneficent & omnipotent God would have designedly created the Haskell with recursive data structure yet leave the maximal size of tuples to arbitrary implementation details, have a restrictive typing system that would not include dependent types, or force the hand of the programmer to use the bang operator to manage memory manually to avoid consequences of laziness. Not believing this, I see no necessity in the belief that the Haskell was expressly designed.
[+] bckr|2 years ago|reply
I feel most deeply that the whole subject is too profound for the human intellect. A dog might as well speculate on the mind of Newton.
[+] bicx|2 years ago|reply
Those seemingly imperfect attributes were placed there by the devil (a Perl dev) to cause fear and doubt.
[+] misja111|2 years ago|reply
I would like to add that just like in our universe, in Haskell entropy only ever increases and never goes down. I.E., add more functionality to your program and the complexity will go up. This is true for all programming languages of course, but in my limited experience larger Haskell programs seem to become exponentially more complex.
[+] slacknatcher12|2 years ago|reply
this is an interesting observation. I have various explanations for why that might be the case.

in a commercial setting there is more pressure to deliver code than to review it. combine that with the lack of a benevolent dictator for life (one who has been on the project for its whole life cycle, not just recently), and there is no one with the power to actually say no to changes.

language geeks are novelty seekers. they will use every feature of their language. stronger languages have more features to abuse. so in a commercial setting you will have all the features being used without much thought, and no architectural design that tells you no, we shouldn't do that.

that's also why I think projects with benevolent dictators for life in the open source world don't fall into these paths even though they use languages that are stronger.

you could restrict yourself to languages that have only one way to be used, like python as it was originally. or use a language with little abstraction power. but you will suffer in other ways, as abstraction power is genuinely useful. accidental complexity has a way of getting in, by expressive means or by social means.

[+] whateveracct|2 years ago|reply
The key is to not write large Haskell programs. Write many Haskell libraries and compose them. This scales infinitely and is an excellent way to build software. But it's also hard to do and takes good & deliberate technical leadership if you want to get 50+ engineers doing it.
[+] riku_iki|2 years ago|reply
I imagine there can be some language, which can automatically modify programs, e.g. you added some conditions, and compiled removed/disabled/moved to obsolete package all unnecessary branches which became unreachable or simplified logic.
[+] slacknatcher12|2 years ago|reply
not to argue your major point, but more entropy makes things more uniform/less complex. think of the heat death of the universe.
[+] tromp|2 years ago|reply
> Haskell beginners often use lists instead of arrays. You can’t do random access in a linked list, but only access the first element and then the rest of the list. The real world also doesn’t allow you random access, you are limited by the speed of light and have to go from one location to the next.

You don't need arrays for random access though. Haskell trees give you access to 2^n leaves within depth n, which also exceeds physical limitations like the speed of light.
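A minimal sketch of the tree point (names and structure are mine, not from the comment): a complete binary tree reaches any of its 2^n leaves by following n branches from the root, so lookup cost grows with the depth, not the number of elements.

```haskell
-- A complete binary tree: 2^n leaves reachable within depth n.
data Tree a = Leaf a | Node (Tree a) (Tree a)

-- Build a tree of depth n holding the leaf indices base .. base + 2^n - 1.
build :: Int -> Int -> Tree Int
build 0 base = Leaf base
build n base = Node (build (n - 1) base)
                    (build (n - 1) (base + 2 ^ (n - 1)))

-- Look up leaf i by descending from the root: n steps, not 2^n.
index :: Int -> Tree a -> Int -> a
index _ (Leaf x)   _ = x
index n (Node l r) i
  | i < half  = index (n - 1) l i
  | otherwise = index (n - 1) r (i - half)
  where half = 2 ^ (n - 1)
```

For example, `index 10 (build 10 0) 513` finds leaf 513 among 1024 leaves in ten comparisons.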

[+] fatfingerd|2 years ago|reply
But Quantum entanglement seems to be traveling in God's channel, how does that happen if God is using linked lists?
[+] rowanG077|2 years ago|reply
This for sure doesn't violate the speed of light. I can also access more than 2^n locations within n time in the real world.
[+] gnull|2 years ago|reply
What does it have to do with the speed of light?
[+] galfarragem|2 years ago|reply
I would bet that He writes Lisp. Ultimately everything is the same. Particles. Above it: particles made of particles.
[+] jexp|2 years ago|reply
God wrote in Lisp, Bob Kanefsky performed by Julia Ecklar. My favorite song.

https://www.prometheus-music.com/audio/eternalflame.mp3

Refrain (full lyrics): http://www.songworm.com/lyrics/songworm-parody/EternalFlame....

For God wrote in Lisp code

When he filled the leaves with green.

The fractal flowers and recursive roots:

The most lovely hack I’ve seen.

And when I ponder snowflakes, never finding two the same,

I know God likes a language with its own four-letter name.

[+] sph|2 years ago|reply
The next ground-breaking achievement in quantum physics will be the discovery of the two smallest particles everything derives from: a left paren and a right paren.
[+] 1letterunixname|2 years ago|reply
That's only used in the Old Testament.

Hell, Genesis was written in assembly for the 8008 in only 520 bits but it took a damn sandpile of support chips to bootstrap.

God uses Elixir with Rustler now because of concurrency and lazy evaluation gotchas of Haskell. (STM didn't cut it.)

Meanwhile, the devil imposes a standard of FORTRAN77 and COBOL62 with a spaghetti mess of uncommented code containing GOTOs, "god" functions, and meaningless identifiers.

[+] kevinlu1248|2 years ago|reply
This makes me believe simulation theory even more, tbh. Quantum mechanics exists to fuse operations, altogether making simulating our universe less computationally expensive.
[+] lessaligned|2 years ago|reply
There's an even deeper way to think about it: if you actually want to parallelize the simulation of multiple scenarios, or if you're running something that needs to compute something in >4d, "quantum mechanics + parallel universes" might be the computationally optimal way to do it!

...we don't think about it this way often because we'd be thinking about computational problems so huge that we'd be like the quarks inside the atoms inside the transistors inside planet-sized clusters spanning galaxies to even fathom computing it ...and it's not necessarily a feel-good perspective :)

I mean, even the speed-of-light limit and general relativity seem like optimizations you'd do in order to better parallelize something you need to compute on some unfathomable "hardware" in some baseline reality that might not have the same constraints...

...and to finish the coffee-high rant: if you want FTL you probably can't get it "inside" because it would break the simulation; you'd need to "get out" ...or more like "get plucked out" by some-thing/god :P (ergo, when we see alien artifacts, UFOs etc. that seem to have done FTL... we kind of need to start assuming MORE than _their_ existence and just them being 'more advanced' than us)

[+] eigenket|2 years ago|reply
People write this sort of thing a lot, and I don't really understand it. Simulating quantum systems is dramatically (formally speaking exponentially) more expensive than simulating classical ones (at least as far as our current understanding of complexity theory goes). If you're going to simulate a universe, and you want to cheap-out on computer power, then you should simulate a classical one.
[+] candiodari|2 years ago|reply
Let's face facts here: God just fell asleep on the keyboard, and by a staggering coincidence, or perhaps a weird shape of the head, the first 4 letters he typed were P, E, R, L.

He's still sleeping.

[+] silisili|2 years ago|reply
bless $thisComment;
[+] fluoridation|2 years ago|reply
>The real world also doesn’t allow you random access, you are limited by the speed of light and have to go from one location to the next.

"Random access" doesn't mean that accessing an item always takes the same time regardless of the size of the collection, it means that, if the size of the collection doesn't change, access times are uniform independently of which particular item is accessed.

For example, one might conceive of a storage device shaped like a sphere the size of the solar system, where an item is read by shining a laser onto the surface of the sphere and measuring how the laser is scattered on its way back. Such a device would be random access, even though it's impossible to grow the collection, and even though a collection with twice the radius and four times the storage size would have four times the latency.
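In Haskell terms this distinction from the quoted article is concrete (a small sketch, with illustrative values): list indexing with `(!!)` walks i cons cells, while `Data.Array` indexing with `(!)` takes uniform time regardless of which element you ask for.

```haskell
import Data.Array

-- A linked list: reaching element i means following i links, O(i).
xs :: [Int]
xs = [0 .. 999999]

-- An immutable array over the same values: (!) is uniform-time
-- for every index, which is what "random access" actually means.
arr :: Array Int Int
arr = listArray (0, 999999) xs

-- xs !! 999999 traverses a million cells; arr ! 999999 does not.
```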

[+] nurettin|2 years ago|reply
This kind of thinking happens when you are a strong expert in a field, but your frontal lobes stop receiving enough blood. When this happens, something as simple as lazy evaluation becomes the key to the universe.
[+] lessaligned|2 years ago|reply
...it's useful to (over)generalize sometimes to get more explanatory power for things.

I mean, it probably says nothing useful about programming, but going the other way around, thinking of "uncollapsed" wave functions as lazy evaluation could be useful. I'm not up to date on theoretical physics, but I think there might be something like that in Deutsch's constructor theory.

In programming I'd prefer a language that makes it syntactically/visually obvious what's lazy and what's not and lets you pick (e.g. like Rust does with &mut), with some sigil maybe, but that's probably low-prio for many language designers nowadays...

EDIT+: and you could say you practically get this already in mainstream languages... lazy-vals are just functions and it's probably good enough or better for most programmers to have them distinct/explicit.
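Haskell does already offer a per-binding sigil of roughly this kind, via the BangPatterns extension. A small sketch (function names are mine): the bang marks which accumulator is forced eagerly instead of piling up thunks.

```haskell
{-# LANGUAGE BangPatterns #-}

-- Lazy accumulator: acc grows as a chain of (+) thunks
-- that is only collapsed at the very end.
sumLazy :: [Int] -> Int
sumLazy = go 0
  where
    go acc []       = acc
    go acc (x : xs) = go (acc + x) xs

-- Strict accumulator: the bang is the visible "sigil"
-- forcing acc at every step, so no thunk chain builds up.
sumStrict :: [Int] -> Int
sumStrict = go 0
  where
    go !acc []       = acc
    go !acc (x : xs) = go (acc + x) xs
```

Both compute the same sum; the bang only changes when the additions happen, which is exactly the kind of explicit lazy/strict choice the comment asks for.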

[+] archibaldJ|2 years ago|reply
I want to ask God how to make my stack build process faster... even with the optimization flag turned off it still takes quite some time on my 2.6 GHz 6-Core Intel Core i7... (or is it because I'm on a Mac? Does it build faster on Linux?)
[+] Xeamek|2 years ago|reply
So uh... When are we rewriting this in Rust?
[+] eggy|2 years ago|reply
No, in SPARK2014, a subset of Ada, that is everything Rust is trying to be ;)
[+] PartiallyTyped|2 years ago|reply
> Consider the wave-particle duality in quantum mechanics. Every particle behaves as a wave, as long as you haven’t interacted with it. Thanks to Haskell’s lazy evaluation values are also only evaluated once they are accessed (interacted with particles), and stay unevaluated thunks (waves) in the meantime.

Lazy evaluation is a beautiful thing, and in many ways, it is the solution to self-reference.

Hofstadter in "I Am a Strange Loop" and Gödel, Escher, Bach talks about this. Well, he talks about many things, but he also talks about how Gödel numbers can map to proofs that are self-referential, and relates that to humans: how, out of very basic building blocks, self-reference and therefore consciousness can arise once enough representational power exists.

He posits that humans, while self-referential, don't fall into infinite strange loops because they can assign the abstraction of "self" onto an "object" and evaluate only as needed. In essence, the "self" is lazily evaluated.
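The lazy-evaluation reading of self-reference has a classic concrete form in Haskell: a value defined in terms of itself, where each element stays an unevaluated thunk until something demands it.

```haskell
-- A self-referential definition that only works because of laziness:
-- fibs refers to itself, and each element is a thunk until demanded.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
```

`take 10 fibs` forces only the first ten thunks, yielding `[0,1,1,2,3,5,8,13,21,34]`; the infinite self-reference never loops because nothing ever demands all of it.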

[+] taneq|2 years ago|reply
If God wrote any variant of lisp he’d have mentioned it repeatedly by now.
[+] mrkeen|2 years ago|reply
Dumb, but it's fun hearing it from the other side.

If you ever make a "Haskell is bad because it doesn't use state but the real world uses state" argument, this is what you sound like.

[+] stared|2 years ago|reply
This view on wave-particle duality and the quantum measurement is a (very) leaky abstraction. It is a process, governed by decoherence - for a nice overview, see e.g. "Decoherence, einselection, and the quantum origins of the classical" by Zurek (https://arxiv.org/abs/quant-ph/0105127).
[+] nice_byte|2 years ago|reply
forget grasping at straws, this is grasping at bose-einstein condensate :)
[+] sssilver|2 years ago|reply
Haven’t we actually established that God DOES in fact play dice, that Niels Bohr was right, and that Einstein was wrong?
[+] eigenket|2 years ago|reply
No, in fact I would say almost the exact opposite. Einstein's famous quote was expressing his distaste for the "Copenhagen interpretation" of quantum mechanics. Among people who seriously think about interpretations of quantum mechanics, many (but not all) think that there are serious flaws with the Copenhagen interpretation.
[+] mbfg|2 years ago|reply
Interestingly, I don't think we know that the speed of light remains constant, nor could we devise a test to determine it.
[+] fluoridation|2 years ago|reply
Currently the speed of light is constant by definition of the meter. If we were to find certain cases where the speed of light appears to be different from c, that would be interpreted as compression or expansion of spacetime. For example, universal expansion can be reinterpreted as light being faster in the past.

I'm wondering whether it could be reinterpreted as time having a variable rate.