Agreed, but maybe the step change there is refactoring the data model, not continuing to author "hairy" SQL via LLM. That's all fine until it breaks, and then you end up having to mend the nastiness back into compliance the ol' fashioned way.
You could definitely load your billions of records with millions of relationships into memory, then denormalize, restructure, and rewrite the data (flawlessly) far more cheaply, computationally, than running a large LLM on all that hardware.
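To make the point concrete, here's a minimal sketch of that kind of in-memory denormalization pass. The two-table schema (users, orders) and all the field names are made up for illustration; the idea is just that folding related attributes into each row once is a plain deterministic transform, no LLM required.

```python
# Hypothetical normalized data: a users lookup table and an orders list.
users = {
    1: {"name": "Ann", "region": "EU"},
    2: {"name": "Bob", "region": "US"},
}
orders = [
    {"order_id": 10, "user_id": 1, "total": 25.0},
    {"order_id": 11, "user_id": 2, "total": 40.0},
    {"order_id": 12, "user_id": 1, "total": 15.0},
]

def denormalize(users, orders):
    # Fold each order's user attributes into the order row itself,
    # so downstream reads never need the join (or the hairy SQL) again.
    flat = []
    for o in orders:
        u = users[o["user_id"]]
        flat.append({**o, "user_name": u["name"], "region": u["region"]})
    return flat

flat = denormalize(users, orders)
```

At real scale you'd stream this in partitions rather than hold everything in one dict, but the per-row cost is a hash lookup and a copy, which is exactly the "computationally cheap" contrast with LLM inference.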
fijiaarone|4 months ago