NaiveBayesian's comments
NaiveBayesian | 1 year ago | on: AI models collapse when trained on recursively generated data
So this means the sequence of μₙ performs a kind of random walk that can stray arbitrarily far from 0, and almost surely will, eventually.
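A minimal simulation of that drift, under the simplifying assumption that each generation fits a one-dimensional Gaussian to n samples drawn from the previous generation's fit (the parameter values and variable names here are illustrative, not from the paper):

```python
import random
import statistics

random.seed(0)
mu, sigma, n = 0.0, 1.0, 50          # generation-0 model and per-generation sample size
means = [mu]
for generation in range(200):
    # draw n samples from the current model, then refit mean and stddev on them
    samples = [random.gauss(mu, sigma) for _ in range(n)]
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    means.append(mu)
# the fitted mean wanders away from 0 across generations,
# while the fitted sigma tends to shrink (variance collapse)
```

Each refit adds zero-mean noise of scale roughly σ/√n to μ, which is exactly the random-walk structure described above.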
NaiveBayesian | 2 years ago | on: ERNIE, China's ChatGPT, cracks under pressure
ELMo: https://arxiv.org/abs/1802.05365
BERT: https://arxiv.org/abs/1810.04805
ERNIE: https://arxiv.org/abs/1904.09223v1
Big Bird: https://arxiv.org/abs/2007.14062
NaiveBayesian | 2 years ago | on: Google Maps Testing New Apple Maps-Inspired Map Style
[1] https://support.google.com/business/answer/7690269?hl=en
NaiveBayesian | 2 years ago | on: From Python to Elixir Machine Learning
The current workarounds to make this happen in Python are quite ugly, IMHO. PyTorch, for example, spawns multiple Python processes and pushes data between them through shared memory, which incurs quite some overhead. TensorFlow, on the other hand, requires you to stick to its tensor DSL so that your code can run inside its graph engine. If native concurrency were available, data loading would be much more straightforward to implement without such hacks.
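The multi-process shape of that workaround can be sketched in plain Python (a minimal illustration only: the worker function and batch contents are made up, and real PyTorch DataLoaders move tensors through shared memory rather than pickling them through a queue as done here):

```python
import multiprocessing as mp

def load_batch(rank, queue):
    # stand-in for expensive decoding/augmentation done in a worker process
    batch = [float(rank * 4 + j) for j in range(4)]
    queue.put((rank, batch))  # serialized and copied back to the parent: the overhead

ctx = mp.get_context("fork")  # fork start method: POSIX only
queue = ctx.Queue()
workers = [ctx.Process(target=load_batch, args=(r, queue)) for r in range(2)]
for w in workers:
    w.start()
results = dict(queue.get() for _ in workers)  # collect one batch per worker
for w in workers:
    w.join()
```

The point of the sketch is that every batch crosses a process boundary, which is the copying overhead the comment refers to; with native in-process concurrency the worker could hand the batch over directly.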