item 35818578 (no title)
kyaghmour | 2 years ago
Google's moat is its data set. Imagine training a generative LLM on the entire set of YouTube training videos. No one else has this.
RecycledEle|2 years ago The entire set of YouTube training videos needs to be re-transcribed before they are useful for training LLMs.
dopeboy|2 years ago This is the glaring omission in this piece. Google knows _so much_ about me. Is it not reasonable to assume powerful LLM + personal data = personally tuned LLM?