holygoat | 9 years ago
Then an hour of writing good tests.
Then lots of manual and automated testing on four or five platforms, and fixing the weird issues you get on Windows XP SP2, or only on 64-bit Linux running in a VM, or whatever.
Then making sure you don't regress startup performance (which you probably will unless you have a really, really slow disk).
Then implementing rock-solid migration so you can land this in Nightly through to Beta.
Then a rollback/backout plan, because you might find a bug, and you don't want users to lose their session between Beta 2 and Beta 3.
Large-scale software engineering is mostly not about writing code.
acqq | 9 years ago
No, for example, LZ4 is unbelievably fast:
https://github.com/Cyan4973/lz4
almost 2 GB per second in decompression!
I've just tried compressing a backupXX.session file (the biggest I could find, around 2 MB) and it compressed to 70% of the original size -- probably not enough of a saving to justify implementing compression. I suspect the reason is that the file contains too many base64-encoded image files, which can't be compressed much further.
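A quick way to see why base64-encoded images resist compression: the sketch below compares the ratio for repetitive JSON-like text against base64-encoded random bytes. It uses the stdlib zlib as a stand-in, since LZ4 bindings are a third-party package; the exact ratios differ from LZ4's, but the contrast is the same.

```python
import base64
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Return compressed size as a fraction of the original size."""
    return len(zlib.compress(data)) / len(data)

# Repetitive JSON-like text compresses extremely well...
text = b'{"tabs": [{"url": "https://example.com"}]}' * 1000

# ...but base64 of incompressible data (like embedded favicons) carries
# only 6 bits of entropy per byte, so it can't shrink much below 75%.
images = base64.b64encode(os.urandom(100_000))

print(f"text:   {compression_ratio(text):.0%}")
print(f"base64: {compression_ratio(images):.0%}")
```

This is why a 2 MB session file dominated by embedded favicons only drops to ~70%: the base64 payloads set a hard floor on the ratio.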
So the first step toward sane session files could be to stop storing the favicons (and perhaps other images) there? I still believe somebody should analyze the big files carefully to get the most relevant facts. For a start, somebody should write a tool that extracts all the pictures from the .session file (easy in Python or Perl, for example), just so we know what's inside.
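Such a tool could be a few lines of Python. This sketch assumes the images are embedded as base64 `data:` URIs inside the session JSON text; the `DATA_URI` pattern and `extract_images` helper are illustrative, not anything from Firefox's code.

```python
import base64
import re

# Hypothetical pattern for base64 data: URIs embedded in the session text.
DATA_URI = re.compile(rb'data:image/(png|jpeg|gif|x-icon);base64,([A-Za-z0-9+/=]+)')

def extract_images(session_bytes: bytes):
    """Yield (extension, raw_bytes) for every embedded base64 image."""
    for kind, payload in DATA_URI.findall(session_bytes):
        try:
            yield kind.decode(), base64.b64decode(payload)
        except ValueError:
            continue  # skip malformed base64 payloads
```

Summing the sizes of the yielded blobs would show how much of the file is image data versus actual session state.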
bsdetector | 9 years ago
Add the code that's able to load compressed session backups and leave it in for a couple versions.
Once enough versions have passed, enable the code that writes compressed session backups.
It's really not that hard to do unless you want to enable it now.
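A minimal sketch of that rollout order, using gzip as a stand-in for LZ4 (whose Python bindings are third-party): the loader sniffs a magic prefix and handles both formats from the first release, while the writer stays behind a flag until older versions have aged out. The flag name and magic constant are illustrative.

```python
import gzip
import json

MAGIC = b"\x1f\x8b"  # gzip magic bytes; an LZ4 frame has its own magic number

# Phase 1: ship a reader that understands both formats.
def load_session(raw: bytes) -> dict:
    if raw.startswith(MAGIC):
        raw = gzip.decompress(raw)
    return json.loads(raw)

# Phase 2: flip this only after reader-capable versions have shipped,
# so a rollback to an older build can still read users' sessions.
WRITE_COMPRESSED = False

def save_session(session: dict) -> bytes:
    raw = json.dumps(session).encode()
    return gzip.compress(raw) if WRITE_COMPRESSED else raw
```

Because plain files keep being written during phase 1, a backout between betas costs nothing: every build in the wild can still read every backup on disk.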