SamFold's comments

SamFold | 5 months ago | on: Alibaba's new AI chip: Key specifications comparable to H20

There’s a difference between raw numbers on paper and actual real world differences when training frontier models.

There’s a reason no frontier lab is using AMD hardware for training: the raw benchmarks for a single chip on a single operation type don’t translate into performance during an actual full training run.

SamFold | 2 years ago | on: .NET 8

What would you say is the current best-in-class toolchain for .NET (assuming I'm running on an ARM Mac)?

SamFold | 2 years ago | on: Switching to Elixir

I wouldn't go as far as saying it's a "toy" for them, I know some engineers there and someone who joined recently who spent significant time learning Elixir before joining.

SamFold | 2 years ago | on: Rats have an imagination, new research suggests

I think it's somewhat of a stretch to say our "base model" had billions of years of evolution. Billions of years ago, mammals didn't exist and the only things around were more like plankton or algae and had nothing like a "base model" we could say we somehow inherited. The earliest ape-like creatures appeared around 10M years ago.

The first mammals appeared around 225M years ago, so you could potentially argue that our "base model" first started evolving around then, but I still think it's something of a stretch to compare that kind of "training" to the way we train modern neural networks. The "base model" at that time was simply survival: eat, reproduce, plus enough circuitry to manage basic biological functions.

We are essentially running the entire volume of human knowledge through a neural network over billions of iterations, and the model itself has 175 billion plus parameters. Neither humans nor any of our evolutionary ancestors ever received this kind of "training"; it's simply not comparable at all. Our mammalian ancestors were exposed to "basic" natural environments; they were not "pushed" into artificial situations to learn tool use or language.

If we take the appearance of the first apes (10M years ago), assume the average ape or hominid lived 30-40 years, and estimate the average generation length at 20 years (which is roughly accurate according to the latest research), then since the first apes there have been about 500'000 generations of apes and humans (around 12'000 generations for humans alone).
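The generation arithmetic above can be sanity-checked with a quick sketch (the 10M-year and 20-year figures are the rough assumptions stated in this comment, not established facts):

```python
# Rough generation-count check, using the figures assumed above.
GENERATION_YEARS = 20       # assumed average generation length
APE_YEARS = 10_000_000      # assumed time since the first apes
HUMAN_GENERATIONS = 12_000  # generations attributed to humans alone

ape_generations = APE_YEARS // GENERATION_YEARS
human_years = HUMAN_GENERATIONS * GENERATION_YEARS

print(ape_generations)  # 500000 generations of apes and humans
print(human_years)      # 240000 years implied by the 12'000-generation figure
```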

So now compare how we train our models: GPT-3 has 175B parameters and went through billions of training iterations (for GPT-4 we don't know). And again, the training is extremely focused and specific, feeding the entire human-generated corpus of language, mathematics, logic, etc. into it, and we get something that does pretty well at human language.

Humans have a "base model", as you put it, which really hasn't been "trained" for that many generations and has mostly been exposed to external stimuli at random, in an ad-hoc, unfocused way; no single individual has ever been exposed to even a fraction of a fraction of a percent of the stimuli a GPT model sees. So something different is going on with our brains, and I think the two can't really be compared at all: the mechanisms, the numbers, and crucially the results do not match up in the slightest.

SamFold | 2 years ago | on: Researchers analysed novels to reveal six story types

I think my preference is Northrop Frye’s analysis in “Anatomy of Criticism”; his categories of “mythic”, “romantic”, “high mimetic”, “low mimetic”, and “ironic” are particularly useful for analyzing the history of literature, from mythic legends and epic poetry up to modern literature and fantasy.

Although this analysis isn’t so much about general plot structure as about characters, particularly the main protagonist and their relationship to other characters and to the environment of the novel.

SamFold | 2 years ago | on: The OpenAI Keynote

I tried this and it seemed to break ChatGPT, it blurted out something which made no sense and then offered to regenerate it. How is it supposed to work?

SamFold | 2 years ago | on: Sam Bankman-Fried is a feature, not a bug

Also, in the original stories, Robin actually robbed from the Church and gave to the gentry. He was very friendly with the "right sort" of noblemen; he just really didn't like rich priests. So we (in our modern age) have kind of perverted the idea of Robin Hood to fit our times, but he never really was some kind of hero for the poor.

See here: https://en.wikipedia.org/wiki/A_Gest_of_Robyn_Hode#Summary

SamFold | 2 years ago | on: Telling GPT-4 you're scared or under pressure improves performance

One interesting point here is that humans can learn to do things without language at all. If you raised a human baby and never exposed it to any language, it would still learn skills and behaviors. So while intelligence and reasoning in humans do still seem to be linked to language, it’s not as simple as all knowledge and reasoning being encoded in language.

Whereas (obviously) ChatGPT is completely based on language and can’t do anything without language or anything that isn’t derived directly from language.

SamFold | 2 years ago | on: The Beatles Release 'Now and Then'

So apparently I'm in the HN minority (judging by the current comments): I absolutely love this single, and in fact hearing John's voice and those lyrics almost immediately made me cry. The song made me think about the passing of time, my own aging, and the people I've lost along the way.

I think it's a beautiful song, and watching the "Making Of" video made it more impactful for me, making me think of those four close friends, with only two of them left now... "Now and Then"

SamFold | 2 years ago | on: Researchers find a rare compound, plumbonacrite, in first layer of Mona Lisa

But there are many paintings I know of precisely because they are famous, and I didn’t fall in love with them.

The argument seems to be that I fell in love with the Mona Lisa as a result of it being famous, versus some other quality it has which I find appealing.

But, for example, Van Gogh’s Sunflowers is also incredibly famous, and I don’t find that painting appealing at all. For me, it’s boring. Ditto for many other famous paintings. What I’m saying is that I’m not so sure it’s logical to say: “oh, this thing is just popular/well loved because it’s famous, there’s no other reason”

SamFold | 2 years ago | on: Researchers find a rare compound, plumbonacrite, in first layer of Mona Lisa

How can you possibly argue the Mona Lisa isn’t inherently interesting or moving? It wouldn’t be the most famous painting in the world if it wasn’t interesting or moving.

And no, it’s not because it was stolen. Most people don’t even know it was stolen. I loved the painting and it moved me from the moment I saw it as a child.

SamFold | 2 years ago | on: Researchers find a rare compound, plumbonacrite, in first layer of Mona Lisa

It wouldn’t be as famous as it is if it didn’t have something remarkable about it. Art is about creating emotion in people and whether you think it has good “aesthetics” or not, the Mona Lisa captures people’s imaginations almost more than any other painting in the world.

For me there is a mystery and a secret in the Mona Lisa which doesn’t exist in other paintings that might show great “skill”, and in the end, art is about emotional impact, not what some critic thinks is “skillful”.

SamFold | 2 years ago | on: Researchers find a rare compound, plumbonacrite, in first layer of Mona Lisa

If it didn’t have some other appeal the theft would never have led to the worldwide appeal of the painting. The theft may have kickstarted the recognition but this painting has something special about it, the woman, her expression, there’s a mystery about it.

I loved the painting before I ever knew it was stolen.

I think it’s fair to say there is something about this painting that isn’t in many other works or most other works, and it’s a human and psychological quality more than some special painting technique.
