I just don't know if there's any point in fighting this fight. We either decide to ride the bandwagon and get our money, or we decide to just wait this out and wait for the bros to realize it's just ML dressed up all over again. It's just embarrassing to watch the hype cycle play out with the same suckers over and over.
Let me explain this hype to folks:
1. People suck at googling
2. People suck at information literacy - i.e., the abstract ability to consume many sources and discern a sort of perceived truth from what they commonly support (read through threads here for an example). And yes, this is inherently nuanced - so much so that, again, I'm not sure how to properly describe it.
3. People love being told what to do/think. (Look at every influencer/podcast, including the ilk that is/was popular among the HN crowd.)
4. "Take down Google", for many folks, is implicitly translated into "Microsoft can noticeably cut into Google's ad market revenue by making a better AI-powered search".
There are so, so, so many inferences, assumptions, pitfalls in #4 that I simply don't know how to explain it other than to just laugh and shrug and keep my head down on real work.
EDIT:
More and more it's becoming quite clear. Some folks are principled, care about information and education and society, and understand the risks of misinformation at scale. And some folks see a way to get rich, or to get some personal utility, and can't or don't give a rat's ass about the rest. Literally, I just saw a thread where someone pointed out clear CURRENT harm caused by ChatGPT-powered products, and the response was "IDC, it helped me, yolo".
I really just... want to jump ahead to a robot shooting me, so I don't have to live through our ignorant enabling of silicon-valley-driven killer robots, because I swear to god that's where the ignorant lot is driving us.