item 44834134 (no title)

asboans | 6 months ago
It would be fun to train an LLM with a knowledge cutoff of 1900 or something.

twh270 | 6 months ago
Someone tried this, I saw it in one of the Reddit AI subs. They were training a local model on whatever they could find that was written before $cutoffDate. Found the GitHub: https://github.com/haykgrigo3/TimeCapsuleLLM

ph4evers | 6 months ago
That's been done to see if it could extrapolate and predict the future. Can't find the link right now to the paper.

creativeSlumber | 6 months ago
This one? "Mind the Gap: Assessing Temporal Generalization in Neural Language Models" https://arxiv.org/abs/2102.01951

yanis_t | 6 months ago
Not sure we have enough data for any pre-internet date.

artursapek | 6 months ago
That would be hysterical.
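The TimeCapsuleLLM approach described in the thread amounts to filtering a training corpus by publication date before pretraining. A minimal sketch of that filtering step, assuming document records carry a `year` field (the record format and field names here are illustrative, not taken from the linked repo):

```python
# Hypothetical sketch of the pre-cutoff corpus filter discussed above:
# keep only documents published strictly before the cutoff year.
CUTOFF_YEAR = 1900

def filter_by_cutoff(docs, cutoff=CUTOFF_YEAR):
    """Return only documents with a known publication year before `cutoff`.

    Undated documents are dropped conservatively, since a document of
    unknown date could leak post-cutoff knowledge into the model.
    """
    return [d for d in docs if d.get("year") is not None and d["year"] < cutoff]

corpus = [
    {"title": "On the Origin of Species", "year": 1859, "text": "..."},
    {"title": "Relativity: The Special and General Theory", "year": 1916, "text": "..."},
    {"title": "Undated fragment", "year": None, "text": "..."},
]

training_docs = filter_by_cutoff(corpus)
```

The hard part in practice is not this filter but the metadata: dating digitized texts reliably, and excluding later editions or OCR'd reprints whose front matter postdates the original work.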