AIPedant | 5 months ago
The most depressing thing about AI summers is watching tech people cynically try to define intelligence downwards to excuse failures in current AI.
entropyneur|5 months ago
He made it because he predicted it would have effects enjoyable to him. Without knowing David Lynch personally, I can assume he made it because he predicted other people would like it. Of course, it might have been some other goal. But unless he was completely unlike anyone I've ever met, it's safe to assume that before he started he had a picture of a world with Mulholland Drive in it that was somehow better than the world without it. He might or might not have been aware of this, though.
Anyway, that's too much analysis of Mr. Lynch. The implicit question is how soon an AI will be able to make a movie that you, AIPedant, will enjoy as much as you enjoyed Mulholland Drive. And I maintain that how similar AI is to human intelligence, or how much "true understanding" it has, is completely irrelevant to answering that question.
whilenot-dev|5 months ago
As it stands, AI is a tool and requires artists/individuals to initiate a process. How many AI-made artifacts do you know of that enjoy the same cultural relevance as their human-made counterparts? Novels, music, movies, shows, games... anything?
You're arguing that the type of film camera plays some part in the significant identity that makes Mulholland Drive a work of art, and I'd disagree. While artists/individuals might gain cultural recognition, the tool on its own rarely will. A tool of choice can be an inspiration for a work and gain a certain significance (e.g. the Honda CB77 Super Hawk[0]), but people always seem to look for the human individual behind any process, as it is generally accepted that a complete body of work tells a different story than any one artifact ever can.
Marcel Duchamp's readymades[1] (and the mere choice of the artist) drove this cultural shift more than a century ago, and I see similarities in economic and scientific efforts as well. Apple isn't Apple without the influence of a "Steve Jobs" or a "Jony Ive" - people are interested in the individuals behind companies and institutions, while at the same time tending to underestimate the number of individuals it takes to make any work an artifact - but that's a different topic.
If some future form of AI transcends into a sentient entity that is no longer a plain tool, I'd guess (in stark contrast to popular perception) we'll all lose interest rather quickly.
[0]: https://en.wikipedia.org/wiki/Honda_CB77#Zen_and_the_Art_of_...
[1]: https://en.wikipedia.org/wiki/Fountain_(Duchamp)
gilleain|5 months ago
I mean ... he is David Lynch.
We seem to be defining "predicted" to mean "any vision or idea I have of the future". Hopefully film directors have _some_ idea of what their film should look like, but that seems distinct from what they expect it will end up being.
koonsolo|5 months ago
It's clear that humans consider humans as intelligent. Is a monkey intelligent? A dolphin? A crow? An ant?
So I ask you, what is the lowest form of intelligence to you?
(I'm also a huge David Lynch fan by the way :D)
mcv|5 months ago
Originally they thought: chess takes intelligence, so if computers can play chess, they must be intelligent. Eventually they could, and later even better than humans, but chess turned out to be a very narrow slice of intelligence.
Struggling to define what we mean by intelligence has always been part of AI research - at least until researchers stopped worrying about intelligence and started focusing on more well-defined tasks like chess, translation, image recognition, driving, etc.
I don't know if we'll ever reach AGI, but on the way we'll discover a lot more about what we mean by intelligence.
AIPedant|5 months ago
I don't know what "the lowest form of intelligence" is; nobody has a clue what cognition means in lampreys and hagfish.
throwawayqqq11|5 months ago
A prediction is just a reaction to a present state, which fits the simplest definition of intelligence: the ability to (sense and) react to something. I like this definition, rather than "being able to predict", because it's more generic.
The more sophisticated (and directed) the reaction is, the more intelligent the system must be. Following this logic, even a traffic light is intelligent - at least more intelligent than a simple rock.
From that perspective, the question of why a creator produced a piece of art becomes unimportant for determining intelligence, since the simple fact that they did is already a sign of intelligence.
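The sense-and-react definition above can be sketched as a toy comparison (my own illustration, not from the thread): a rock has no input-dependent behaviour, while a traffic light maps a sensed state to a directed response.

```python
def rock(state: int):
    """A rock senses nothing: its 'reaction' is the same for every input."""
    return None

def traffic_light(state: int) -> str:
    """A traffic light reacts to a sensed input (e.g. a timer tick)
    with a directed, input-dependent response: cycling its phases."""
    phases = ["green", "yellow", "red"]
    return phases[state % len(phases)]

# The traffic light's output varies purposefully with its input, while the
# rock's does not - making it "more intelligent" under this definition.
print(rock(0))           # None
print(traffic_light(0))  # green
print(traffic_light(4))  # yellow
```

By this measure, "intelligence" becomes a matter of degree in how differentiated and directed a system's responses are, rather than a binary property.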
MrScruff|5 months ago
Reductive arguments may not give us an immediate path to reproducing these emergent phenomena in artificial brains, but emergent phenomena are, by definition, impossible to predict - I don't think anyone predicted the current behaviours of LLMs, for example.
keeda|5 months ago
It also happens to be a leading theory in neuroscience: https://news.ycombinator.com/item?id=45058056