top | item 16192809

kripke | 8 years ago

This is interesting, and thanks for trying to clean up the protocol, but isn't this an orthogonal issue? The problems you describe stem from using an undocumented protocol, which is independent of whether you use a custom serialization format or build on an existing one.

Building upon an existing, well-documented, and relatively sane serialization format (protobuf, Cap'n Proto, MessagePack, JSON, heck even bencode for all I care) is usually a good thing, and so is decoupling the messages from the details of an implementation's internals. Language- and framework-internal serializers (such as Python's pickle or, apparently, Qt's serializer) tend to make it harder to achieve both goals.
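To make the coupling point concrete, here is a minimal Python sketch (the `Message` class is hypothetical, just for illustration): pickle embeds the module path and class name of the implementation in the byte stream, while a schema-style encoding such as JSON carries only the declared fields, so any reader that knows the field names can decode it.

```python
import json
import pickle

# Hypothetical internal message type; any application class behaves the same.
class Message:
    def __init__(self, user, text):
        self.user = user
        self.text = text

msg = Message("kripke", "hello")

# pickle writes the class name into the payload, coupling the wire
# format to this particular implementation's internals.
blob = pickle.dumps(msg)
assert b"Message" in blob  # the internal class name leaks onto the wire

# Encoding only the declared fields decouples the message from the class:
# renaming or reimplementing Message does not break old payloads.
wire = json.dumps({"user": msg.user, "text": msg.text})
decoded = json.loads(wire)
assert decoded == {"user": "kripke", "text": "hello"}
```

The same decoupling argument applies to protobuf or Cap'n Proto, where the schema file, not the application code, defines the wire format.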

kuschku | 8 years ago

> Building upon an existing, well-documented, and relatively sane serialization format

The problem with that is that whatever format seems well-documented and relatively sane today might become an obscure, unknown protocol 10 years down the road.

kentonv | 8 years ago

FWIW, Protobuf has now been open source for a decade and has been used for basically everything inside Google since about the turn of the century. Protobuf predates JSON, and I would wager that, worldwide, much more data is stored in Protobuf format and many more cycles are spent parsing Protobuf format than JSON. For Protobuf to die out, Google itself would have to die, as would quite a few other companies that heavily rely on it. It doesn't seem likely to happen any time soon.

I unfortunately am not in a position to make such strong statements about Cap'n Proto. However, implementations exist in C++, Java, JavaScript, Rust, Go, Python, and a bunch of other languages, so it should at least be much easier to deal with than Qt serialization.

(Disclosure: I'm the author of Cap'n Proto and of the first open source release of Protobuf.)