taywrobel | 2 years ago

You may be interested in what we’re working on at Symbolica AI.

We’re using formal logic in the form of abstract rewrite systems over a causal graph to perform geometric deep learning. In theory it should be able to learn the same topological structure of data that neural networks do, but using entirely discrete operations and without the random walk inherent to stochastic gradient descent.

Current experiments are really promising, and assuming the growth curve continues as we scale up, you should be able to train a GPT-4-scale LLM in a few weeks on commodity hardware (we're currently using a desktop with four 4090s), and to do both inference and continual fine-tuning/online learning on-device.

KRAKRISMOTT|2 years ago

> We’re using formal logic in the form of abstract rewrite systems over a causal graph to perform geometric deep learning. In theory it should be able to learn the same topological structure of data that neural networks do, but using entirely discrete operations and without the random walk inherent to stochastic gradient descent.

Abstract rewriting like a computer algebra system's (e.g. Wolfram's) term-rewriting equation-simplification method?

taywrobel|2 years ago

Heavily influenced by Wolfram's work on metamathematics and the physics project, insofar as we use a rewrite system to uncover an emergent topology; we're just using it to uncover the topology of certain data (assuming the manifold hypothesis is correct), rather than the topology of fundamental physics, as he did.
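To make the CAS analogy concrete, here's a toy term-rewriting simplifier (a minimal sketch; the tuple representation and rules are illustrative, not Symbolica's actual system):

```python
# Toy term rewriting in the computer-algebra style. Terms are nested
# tuples: ("+", x, y) means x + y. Rules are (pattern, action) pairs.

def rewrite_once(term, rules):
    """Apply the first matching rule at the root, else recurse into subterms."""
    for pattern, action in rules:
        if pattern(term):
            return action(term), True
    if isinstance(term, tuple):
        op, *args = term
        for i, arg in enumerate(args):
            new, changed = rewrite_once(arg, rules)
            if changed:
                return (op, *args[:i], new, *args[i + 1:]), True
    return term, False

def simplify(term, rules, max_steps=100):
    """Rewrite until a normal form (fixed point) is reached."""
    for _ in range(max_steps):
        term, changed = rewrite_once(term, rules)
        if not changed:
            break
    return term

# Rules: x + 0 -> x, x * 1 -> x, x * 0 -> 0
RULES = [
    (lambda t: isinstance(t, tuple) and t[0] == "+" and t[2] == 0,
     lambda t: t[1]),
    (lambda t: isinstance(t, tuple) and t[0] == "*" and t[2] == 1,
     lambda t: t[1]),
    (lambda t: isinstance(t, tuple) and t[0] == "*" and t[2] == 0,
     lambda t: 0),
]

expr = ("+", ("*", "x", 1), 0)   # (x * 1) + 0
print(simplify(expr, RULES))     # -> x
```

Everything here is a discrete, deterministic graph/tree transformation; there's no gradient or random walk anywhere, which is the point of the comparison above.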

pawelduda|2 years ago

Sounds cool, but what are the drawbacks?

taywrobel|2 years ago

Biggest drawback is that since the structure is all discrete, it is inherently weak at modeling statistical distributions. For example, it'll likely never best a neural network at stock market prediction or medical data extrapolation.

However, for things that are discrete and/or causal in nature, we expect it to outperform deep learning by a wide margin. We're focused on language to start, but want to eventually target planning and controls problems as well, such as self-driving and robotics.

Another drawback is that the algorithm as it stands today is based on a subgraph isomorphism search, which is hard: not hard as in tricky to get right, like Paxos or other complex algorithms, but NP-hard, so very difficult to scale. We have some fantastic PhDs working with us who focus on optimizing subgraph isomorphism search, and category theorists working to formalize which constraints we can relax without affecting the learning mechanism of the rewrite system, so we're confident it's achievable, but the time horizon is currently unknown.
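To make the scaling concern concrete, here's a naive backtracking subgraph-isomorphism search (a minimal sketch, not Symbolica's actual algorithm; real solvers prune far more aggressively). In the worst case it tries every injective mapping of pattern nodes onto graph nodes, which is where the NP-hardness bites:

```python
# Naive backtracking subgraph-isomorphism search.
# Graphs are dicts: node -> set of neighbor nodes (undirected).

def subgraph_isomorphism(pattern, graph):
    """Return a mapping of pattern nodes onto graph nodes that preserves
    all pattern edges, or None if no such mapping exists."""
    p_nodes = list(pattern)

    def extend(mapping):
        if len(mapping) == len(p_nodes):
            return dict(mapping)
        p = p_nodes[len(mapping)]          # next pattern node to place
        for g in graph:
            if g in mapping.values():
                continue                   # mapping must be injective
            # every already-mapped neighbor of p must map to a neighbor of g
            if all(mapping[q] in graph[g] for q in pattern[p] if q in mapping):
                mapping[p] = g
                result = extend(mapping)
                if result:
                    return result
                del mapping[p]             # backtrack
        return None

    return extend({})

# Find a triangle pattern inside a 4-node graph
pattern = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
graph = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2}}
print(subgraph_isomorphism(pattern, graph))  # -> {'a': 1, 'b': 2, 'c': 3}
```

The branching factor is (up to) the full node count of the host graph at every level of the recursion, so without strong pruning the search blows up exponentially in the pattern size.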

k__|2 years ago

It doesn't exist at scale yet.

paulsutter|2 years ago

Especially interested in learning directly on geometries. Please keep us updated and share results!

taywrobel|2 years ago

Would definitely recommend Bronstein et al.'s work on geometric deep learning! https://geometricdeeplearning.com

That's effectively the right-hand side of the bridge we're building between formal logic and deep learning. So far their work has been viewed mainly as descriptive, helping to understand neural networks better, but as their abstract calls out: "it gives a constructive procedure to incorporate prior physical knowledge into neural architectures and provides a principled way to build future architectures yet to be invented". That's us (we hope)!

arthurcolle|2 years ago

I would like to subscribe to your newsletter; we'd be super interested in this at Brainchain AI.

Drop me a link at (my first name) @ brainchain dot AI if you'd like to chat, I'd love to hear more about what you're working on!

dmarchand90|2 years ago

Really cool stuff! Do you have any recommendations of where we could learn more?
