item 44038789

kbouck | 9 months ago

If a heap dump is a copy of all the bytes in a process's memory, then wouldn't "thousands of heap dumps" likely add up to more than 410GB?

napkin math:

  410GB / 1000 dumps = 410MB per dump

  410GB / 2000 dumps = 205MB per dump
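That napkin math can be sanity-checked in a few lines (the dump counts are the same illustrative guesses as above, not figures from the leak):

```python
# Average size per dump if a 410 GB archive covers N heap dumps.
TOTAL_GB = 410

def avg_dump_size_mb(num_dumps: int) -> float:
    """Average dump size in MB, assuming 1 GB = 1000 MB."""
    return TOTAL_GB * 1000 / num_dumps

print(avg_dump_size_mb(1000))  # 410.0 MB per dump
print(avg_dump_size_mb(2000))  # 205.0 MB per dump
```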

discuss


diggan | 9 months ago

Might be filtered somewhat, like extracting all the ASCII text and compiling that into the dump, rather than keeping the raw dump files.

Edit: reading the description of the dump again, that seems to be exactly what they did:

> Some of the archived data includes plaintext messages while other portions only include metadata, including sender and recipient information, timestamps, and group names. To facilitate research, Distributed Denial of Secrets has extracted the text from the original heap dumps.

https://ddosecrets.com/article/telemessage
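Extracting text from raw heap dumps is typically done by scanning for runs of printable characters, as the Unix `strings` utility does. A minimal Python sketch of that idea (the minimum run length and the sample blob are illustrative assumptions, not details from the archive):

```python
import re

def extract_strings(data: bytes, min_len: int = 4) -> list[str]:
    """Return runs of printable ASCII of at least min_len bytes,
    similar in spirit to the Unix `strings` utility."""
    pattern = re.compile(rb"[\x20-\x7e]{%d,}" % min_len)
    return [m.group().decode("ascii") for m in pattern.finditer(data)]

# Hypothetical binary blob with embedded text fragments:
blob = b"\x00\x01hello world\x00\xffsender=alice\x00ab\x00"
print(extract_strings(blob))  # ['hello world', 'sender=alice']
```

Run over a multi-hundred-megabyte heap dump, this kind of pass keeps plaintext message fragments and metadata strings while discarding the bulk of the binary object graph, which would explain how thousands of dumps compress into 410GB of extracted data.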

coolcase | 9 months ago

Kubernetes pods?