> Furthermore, if we view transformers as Hopf algebras, one can bring convolutional models, diffusion models and transformers under a single umbrella.
I was so certain this was discussing Transformers, like the action figures, and I have never been so confused looking at both a link and the comments section on HN before. Especially considering: https://github.com/xamat/TransformerCatalog/blob/main/02-01.... I'm just going to keep scrolling now :'D
When I was younger I would often encounter mentions of electrical transformers, and be quite disappointed when it wasn't related to the toys or the series. Even in my 40s I still have a bit of disappointment about it...
From the bottom of the page in question: "Figure 5: You can access the original table at https://docs.google.com/spreadsheets/d/1ltyrAB6BL29cOv2fSpNQnnq2vbX8UrHl47d7FkIf6t4 for easier browsing across the different model features."
adamnemecek|3 years ago
https://arxiv.org/abs/2302.01834v1
The learning mechanism of transformer models was poorly understood; however, it turns out that a transformer is like a circuit with feedback.
I argue that autodiff can be replaced with what the paper calls Hopf coherence.
Furthermore, if we view transformers as Hopf algebras, one can bring convolutional models, diffusion models and transformers under a single umbrella.
I'm working on a next-gen, Hopf-algebra-based machine learning framework.
Join my discord if you want to discuss this further https://discord.gg/mr9TAhpyBW
erichocean|3 years ago
Have you written any more about this?
visarga|3 years ago
I don't think this claim is factual. There are people who have played with this idea, but it is not part of ChatGPT.
sva_|3 years ago
> Extension:It can be seen as a generalization of BERT and GPT in that it combines ideas from both in the encoder and decoder
I believe this is an error? That text is from BART. And a space is missing after the colon.
DerSaidin|3 years ago
Are pages even needed anymore?
peresthe|3 years ago
Yet of the six comments here, two are complaining about missing models and three more are arguing about the typesetting of the figures.
mdp2021|3 years ago
...Portable Document Format was /born/ to display vector graphics (i.e., you just zoom in)... The error in the page was to embed a raster image of the text!
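To make the vector-vs-raster point concrete, here is a minimal sketch of my own (not from the thread; matplotlib, the toy table contents, and the output file names are all assumptions): the same figure exported as a PDF keeps text as scalable glyphs, while a PNG bakes it into a fixed pixel grid that blurs when you zoom.

```python
# Minimal sketch: export the same table as vector (PDF) and raster (PNG).
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(4, 2))
ax.axis("off")
# A toy stand-in for a model catalog table (contents are illustrative).
ax.table(cellText=[["Model", "Year"], ["BERT", "2018"], ["GPT-3", "2020"]],
         loc="center")

fig.savefig("catalog.pdf")          # vector: text stays sharp at any zoom level
fig.savefig("catalog.png", dpi=72)  # raster: fixed pixel grid, blurs on zoom
```

Embedding the PDF (or an SVG) instead of a low-DPI screenshot is what the comment above is asking for.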
nighthawk454|3 years ago
https://arxiv.org/format/2302.07730