(no title)
boredguy8 | 9 years ago
"Go is famously a more complex game than chess, with its larger board, longer games, and many more pieces. Google’s DeepMind artificial intelligence team likes to say that there are more possible Go boards than atoms in the known universe, but that vastly understates the computational problem. There are about 10^170 board positions in Go, and only 10^80 atoms in the universe. That means that if there were as many parallel universes as there are atoms in our universe (!), then the total number of atoms in all those universes combined would be close to the possibilities on a single Go board."
http://www.slate.com/articles/technology/technology/2016/03/...
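The quoted ~10^170 figure can be sanity-checked with a quick sketch: each of the 361 board points is empty, black, or white, so 3^361 is a naive upper bound on positions (legal positions are a subset; the 3-state bound is my assumption, not the article's).

```python
from math import log10

# Naive upper bound on Go positions: each of the 19x19 points
# is empty, black, or white, giving 3**361 configurations.
points = 19 * 19                              # 361 intersections
upper_bound_digits = int(points * log10(3))   # log10(3**361)
print(upper_bound_digits)                     # ~172, consistent with ~10^170
```

The bound overcounts (it includes illegal positions with captured stones left on the board), which is why the quoted legal-position count, about 2×10^170, is slightly smaller.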
sametmax|9 years ago
In Go, the number of items (the pieces) is very small; it's the number of combinations that is huge. In the universe, the number of combinations of positions of all the atoms is, well, wonderful.
deepnet|9 years ago
Compared with a googolplex (10^(10^100)), the entire Everettian metaverse is negligible: 10^(10^100) − (10^80)^2 × (average quarks per atom) × leptons (10^200) × dark multiplier (10^2) ≈ 1 googolplex.
Has anyone ever used a googolplex for anything?
[For ≈ read approximately]
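The claim can be checked by comparing base-10 exponents, since a googolplex has 10^100 digits and cannot be stored directly. The exponents below follow the comment's rough estimates (assumptions, not measured values).

```python
# Compare log10 values rather than the numbers themselves:
# a googolplex has far too many digits to materialize.
googolplex_exp = 10**100            # log10 of a googolplex
metaverse_exp = 80 * 2 + 200 + 2    # log10 of (10^80)^2 * leptons(10^200) * dark(10^2)

# googolplex / metaverse = 10**(googolplex_exp - metaverse_exp), an exponent
# gap so vast that subtracting the metaverse leaves the googolplex unchanged
# to any representable precision.
print(googolplex_exp - metaverse_exp > 10**99)  # True
```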
Retric|9 years ago
https://en.wikipedia.org/wiki/Ackermann_function
randommodnar|9 years ago
The real comparison would be the number of pieces on a Go board (19x19 = 361) against the number of atoms in the universe, and then the number of possible board positions in Go against the number of possible arrangements of atoms in the universe. In that case, I think the universe wins.
IsaacL|9 years ago
However, other problems have even larger state spaces. Imagine writing an AI which read project Euler problem descriptions (in English) and output working code (in some given programming language). Keep outputs limited to 100-line scripts, max 80 characters per line.
There are roughly 100 usable characters in ASCII, so the space of possible 100-line programs is roughly:
(10^2)^(80 * 100) = 10^16000.
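That arithmetic can be verified exactly with big integers (a sketch; the 100-character alphabet is the comment's own assumption):

```python
# Count of 100-line, 80-chars-per-line programs over a 100-character alphabet.
alphabet = 100                     # assumed usable ASCII characters
chars = 80 * 100                   # total characters in one program
num_programs = alphabet ** chars   # exact: (10**2)**8000 = 10**16000
print(len(str(num_programs)) - 1)  # decimal exponent: 16000
```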
You could simplify this by having the AI work with predefined tokens rather than individual characters, but it's still a vast amount of combinations. Then consider 1000-line or 10000-line programs, and you see how high a mountain AI still has to climb. Humans are able to "compress" this state space via conceptual reasoning, which is much more complex than the "pattern recognition" many deep learning researchers are chasing.
(See "Introduction to Objectivist Epistemology" for more on how humans think in concepts - I'm planning to write more at some point on how this book shows where the practical limits of AI lie).
tossaway1|9 years ago
Close?? Wouldn't it still be roughly 10 billion times smaller...?
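The "roughly 10 billion times smaller" figure checks out under the quoted numbers: 10^80 parallel universes of 10^80 atoms each give 10^160 atoms total, against ~10^170 Go positions (a sketch using the article's round figures).

```python
# One universe's worth of parallel universes, each with 10**80 atoms:
atoms_all_universes = (10**80) ** 2     # 10**160 atoms total
go_positions = 10**170                  # figure quoted in the article
ratio = go_positions // atoms_all_universes
print(ratio == 10**10)                  # True: a factor of ten billion
```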
vidarh|9 years ago