gbletr42 | 2 years ago
I can't reproduce this. These are the commands I used, with both the v0.1 release and the master tree freshly compiled in an Ubuntu Docker container.
./bef -c -i Makefile -o Makefile.bef
dd if=/dev/zero of=Makefile.bef bs=1 oseek=300 count=1
./bef -d -i Makefile.bef -o Makefile2
cmp Makefile Makefile2 || echo "failed!"
edit: oh, I see, you 'removed a character'. Depending on which character you removed or corrupted, you either hit the issue described above with inserted noise, except this time with information removed, or you corrupted the header by damaging the magic number, the hash type, or the hash itself. The command line utility automatically truncates the output before calling the deconstruct function. The header is sadly the biggest single point of failure in the tool/format, which is why I introduced the --raw flag for those who don't want it.
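To illustrate why the header is a single point of failure, here's a toy format sketch in Python (this is NOT bef's actual on-disk layout; the magic, field widths, and checksum scheme are all made up for illustration). Flip one header byte and the whole file is rejected, even though the payload is intact:

```python
# Toy header demo: a magic number plus a checksum over the header fields.
# One corrupted header byte makes the reader reject the entire file.
import hashlib
import struct

MAGIC = b"TOYF"  # hypothetical 4-byte magic, not bef's

def make_file(payload: bytes) -> bytes:
    header = MAGIC + struct.pack(">I", len(payload))  # magic + length
    digest = hashlib.sha256(header).digest()[:8]      # header checksum
    return header + digest + payload

def read_file(blob: bytes) -> bytes:
    header, digest, payload = blob[:8], blob[8:16], blob[16:]
    if header[:4] != MAGIC:
        raise ValueError("bad magic")
    if hashlib.sha256(header).digest()[:8] != digest:
        raise ValueError("header checksum mismatch")
    return payload[: struct.unpack(">I", header[4:])[0]]

blob = bytearray(make_file(b"hello world"))
assert read_file(bytes(blob)) == b"hello world"
blob[0] ^= 0xFF  # corrupt one byte of the magic
try:
    read_file(bytes(blob))
except ValueError as e:
    print("rejected:", e)  # rejected: bad magic
```

A --raw-style mode sidesteps exactly this: with no header to trust, there is nothing the reader refuses to proceed without.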
doubloon | 2 years ago
dd if=/dev/urandom of=x bs=1M count=2
./bef -c -i x -o x.bef
dd if=/dev/zero of=x.bef bs=1 seek=20000 count=100 conv=notrunc
./bef -d -i x.bef -o x2
sha256sum x x2
I also tried a count of 1000 and even 10000 and it still worked!!!! Pretty awesome
gbletr42 | 2 years ago
Mind you, bursts greater than a certain size, around 8000 bytes with the defaults, are a game of probability as to what damage they'll do. In most cases it'll work out like it did for you, but at some unlucky offsets a 10000-byte corruption with default settings could do something like this:
1 byte corrupts fragment x of block n, 4096 bytes decimate fragment x of block n+1, 4096 bytes decimate fragment x of block n+2, and 1807 bytes corrupt fragment x+1 of block n.
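That breakdown can be sketched numerically. The toy model below assumes 4096-byte fragments interleaved round-robin across 3 consecutive blocks with 1 parity fragment per block; those numbers are my reading of the comment's example, not bef's actual specification, and the header offset is ignored:

```python
# Map a contiguous corruption burst onto the (block, fragment) slots
# it touches, under the assumed round-robin interleaving:
# [frag x, blk n][frag x, blk n+1][frag x, blk n+2][frag x+1, blk n]...
from collections import defaultdict

FRAG = 4096    # assumed fragment size
DEPTH = 3      # assumed interleave depth (blocks per round-robin row)
PARITY = 1     # assumed parity fragments per block (the default)

def burst_hits(offset: int, length: int) -> dict[int, set[int]]:
    """Return {block: {fragment indices damaged}} for one burst."""
    first = offset // FRAG
    last = (offset + length - 1) // FRAG
    hits = defaultdict(set)
    for slot in range(first, last + 1):
        block, fragment = slot % DEPTH, slot // DEPTH
        hits[block].add(fragment)
    return hits

# The unlucky 10000-byte burst from above: it starts on the last byte
# of fragment 0 of block 0 and ends 1807 bytes into fragment 1 of
# block 0, clobbering two fragments of the same block.
for block, frags in sorted(burst_hits(offset=4095, length=10000).items()):
    ok = len(frags) <= PARITY
    print(f"block {block}: fragments {sorted(frags)} -> "
          f"{'recoverable' if ok else 'LOST'}")
# block 0: fragments [0, 1] -> LOST
# block 1: fragments [0] -> recoverable
# block 2: fragments [0] -> recoverable
```

Under these assumptions, two fragments of the same block sit 2 x 4096 = 8192 bytes apart, so any burst of 8193 bytes or less can touch at most one fragment per block, which matches the "around 8000 bytes" safety figure.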
The defaults only have 1 parity fragment, so the fact that two fragments of block n were corrupted means it can't be faithfully reconstructed. In the average case you should be fine, but it's something I should probably communicate more effectively in the README (I only hint at it with the 8K per 192K comment).
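The single-parity limit can be shown with a minimal XOR-parity sketch (XOR, not bef's actual coding scheme): one parity fragment can rebuild exactly one lost fragment per block, so losing two leaves one equation with two unknowns:

```python
# Single-parity erasure coding in miniature: parity = XOR of all data
# fragments, so any ONE missing fragment is the XOR of the survivors.
def xor_parity(fragments: list[bytes]) -> bytes:
    parity = bytes(len(fragments[0]))
    for f in fragments:
        parity = bytes(a ^ b for a, b in zip(parity, f))
    return parity

data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_parity(data)

# One erasure: XOR the survivors (including parity) to rebuild it.
rebuilt = xor_parity([data[0], data[2], parity])
assert rebuilt == data[1]
print("rebuilt one lost fragment:", rebuilt)  # rebuilt one lost fragment: b'BBBB'

# Two erasures: the same single XOR equation now has two unknowns,
# so the block is unrecoverable -- the situation described above.
```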