x_may | 1 year ago
I also believe that the $200 subscription they offer is just them letting the TTC run for longer before forcing it to answer.
If what you say is true, though, I agree that there is huge headroom for TTC to improve results, if the Hugging Face experiments on 1B/3B models are anything to go by.
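To make the budget idea concrete, here is a toy sketch in the spirit of the Hugging Face setup: spend the compute budget on extra samples and majority-vote the final answer. sample_answer is a hypothetical stand-in for one sampled model completion; none of this is OpenAI's actual mechanism.

    # Toy sketch only: scale test-time compute by drawing more
    # candidate answers and majority-voting (the strategy the
    # Hugging Face 1B/3B experiments build on). sample_answer is
    # a hypothetical stand-in for a single model completion.
    from collections import Counter
    from typing import Callable

    def majority_vote_answer(prompt: str,
                             sample_answer: Callable[[str], str],
                             budget: int = 16) -> str:
        # A bigger budget = more test-time compute before answering.
        votes = Counter(sample_answer(prompt) for _ in range(budget))
        return votes.most_common(1)[0][0]

    if __name__ == "__main__":
        import random
        # Noisy toy "model" that is right ~60% of the time; voting
        # over 16 samples makes the majority answer far more reliable.
        toy = lambda p: random.choices(["42", "41"], [0.6, 0.4])[0]
        print(majority_vote_answer("What is 6 * 7?", toy))

On that reading, a pricier tier would just be a larger budget.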
ankit219 | 1 year ago
> huge headroom for TTC to improve results ...1B/3B models
Absolutely. How this gets productized remains to be seen. I have high hopes for MCTS combined with Iterative Preference Learning (rough sketch below), though it is harder to implement, and I'm not sure whether OpenAI has done that. DeepMind's results are unbelievably good [1].
[1]: https://arxiv.org/pdf/2405.00451v2
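For anyone curious what the MCTS half looks like, a minimal sketch of tree search over reasoning steps. propose_steps and score_leaf are hypothetical stand-ins for a step-proposing LLM and a value/reward model; this is not DeepMind's implementation, just the generic algorithm:

    # Toy MCTS over partial chains of thought. propose_steps and
    # score_leaf are hypothetical stand-ins for an LLM and a
    # value/reward model respectively.
    import math
    import random
    from dataclasses import dataclass, field
    from typing import Callable, List, Optional

    @dataclass
    class Node:
        state: str                      # partial reasoning trace so far
        parent: Optional["Node"] = None
        children: List["Node"] = field(default_factory=list)
        visits: int = 0
        value: float = 0.0              # running sum of rollout scores

    def uct(node: Node, c: float = 1.4) -> float:
        if node.visits == 0:
            return float("inf")         # try unvisited children first
        exploit = node.value / node.visits
        explore = c * math.sqrt(math.log(node.parent.visits) / node.visits)
        return exploit + explore

    def mcts(root_state: str,
             propose_steps: Callable[[str], List[str]],
             score_leaf: Callable[[str], float],
             iters: int = 100,
             max_depth: int = 5) -> str:
        root = Node(root_state)
        for _ in range(iters):
            node = root
            # 1. Selection: walk down by UCT until a leaf.
            while node.children:
                node = max(node.children, key=uct)
            # 2. Expansion: add candidate next reasoning steps.
            if node.visits > 0 and node.state.count("\n") < max_depth:
                for step in propose_steps(node.state):
                    node.children.append(Node(node.state + "\n" + step,
                                              parent=node))
                if node.children:
                    node = random.choice(node.children)
            # 3. Evaluation: score the (partial) reasoning trace.
            reward = score_leaf(node.state)
            # 4. Backpropagation: update statistics up to the root.
            while node:
                node.visits += 1
                node.value += reward
                node = node.parent
        best = max(root.children, key=lambda n: n.visits) if root.children else root
        return best.state

As I understand [1], the tree statistics are then turned into step-level preference pairs for DPO-style training, which is the "iterative preference learning" half; the sketch above only covers the search.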
HarHarVeryFunny | 1 year ago
https://huggingface.co/spaces/HuggingFaceH4/blogpost-scaling...