20 years ago, I was working on a consumer device that did indexing and searching of books. The indexer had about 1 MB of RAM available and had to work in the background on a very slow, single-core CPU without the user noticing any slowdown. A lot of the optimization work involved trying to get algorithmic complexity and memory use closer to a function of the number of distinct words in a book than of the total number of words. Typical novels have on the order of 10K distinct words and 100K total words.

If you're indexing numbers, which we did, this book shows little difference between total and distinct word counts because it contains so many distinct numbers. It ended up being a regular stress test to make sure our approach to capping memory use was working. But because it constantly triggered that memory cap, it took far longer to index than more typical books, including many that were much larger.
nereye|26 days ago
The Croatia flag in particular took quite a while to trace/draw (by hand).