top | item 37855674

w23j | 2 years ago

Can you name some of these reasons, or give me a link? Honest question!


alpaca128 | 2 years ago

One reason would be massively reduced syntax overhead and better readability. I've seen plenty of XML files where XML syntax makes up more than 50% of the file's content, and trying to read the actual content is tedious. Now JSON isn't ideal either - technically you could get rid of all commas, colons, and the quotes around most keys - but I sure prefer `{"foo": "some \"stuff\""}` over something like `<foo><![CDATA[some <stuff>]]></foo>`
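To make the escaping difference concrete, here is a small sketch using Python's standard `json` and `xml.etree.ElementTree` modules (the data values are made up for illustration):

```python
import json
import xml.etree.ElementTree as ET

# JSON only needs a backslash escape for quotes inside a string.
print(json.dumps({"foo": 'some "stuff"'}))  # {"foo": "some \"stuff\""}

# XML must escape angle brackets in text (or wrap it in CDATA);
# ElementTree applies entity escaping automatically when serializing.
elem = ET.Element("foo")
elem.text = "some <stuff>"
print(ET.tostring(elem, encoding="unicode"))  # <foo>some &lt;stuff&gt;</foo>
```

Either way the payload round-trips, but the JSON form stays closer to what a human would type by hand.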

w23j | 2 years ago

I agree, I would prefer JSON (or YAML) for configuration files, for example. That is, for stuff that humans actually read. I was thinking about using JSON/XML as a data exchange format between computers, because the context of this discussion has revolved around things like JSON/XML Schema, JSONPath/XPath, and SOAP/OpenAPI. There is a strong trend toward replacing XML with JSON as the data format for machine-to-machine communication, and that is what confuses me.

tgv | 2 years ago

XML is too unwieldy for human consumption. Editing it is error-prone, and those schema-directed editors are even worse, because everything requires clicking and clicking and clicking.

For machine-to-machine communication it's very well suited, but most data is simple enough, and the XML libraries I've used tended to be, let's say, over-engineered, while there are no hoops to jump through when you want to parse JSON.
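The "no hoops" point can be sketched with Python's standard library; the order payload below is a made-up example:

```python
import json
import xml.etree.ElementTree as ET

# Parsing JSON is a one-liner that yields plain dicts and lists,
# with numbers already typed.
order = json.loads('{"id": 7, "items": ["ale", "mead"]}')
print(order["items"][0])  # ale

# Parsing the equivalent XML also works, but you navigate a tree of
# Element objects and convert types yourself.
root = ET.fromstring("<order><id>7</id><item>ale</item><item>mead</item></order>")
print(root.find("item").text)      # ale
print(int(root.find("id").text))   # 7
```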

And one thing I always disliked about XML was the CDATA section: it makes the message even harder to read, and it's not like you're going to use that binary data unparsed/unchecked.

XML just tried to formalize data transfer and description prematurely, which made it rigid and not even sufficiently powerful. I must say that XSLT and XPath were great additions, though.
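For a taste of why XPath is a great addition: Python's `xml.etree.ElementTree` supports a limited XPath subset, which is already enough for selection and attribute filtering. The library document here is invented for illustration:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<library>"
    "<book year='1999'><title>XSLT Basics</title></book>"
    "<book year='2004'><title>XPath in Practice</title></book>"
    "</library>"
)

# Select every title anywhere in the tree, then filter by attribute.
titles = [t.text for t in doc.findall(".//title")]
recent = doc.find(".//book[@year='2004']/title").text
print(titles)  # ['XSLT Basics', 'XPath in Practice']
print(recent)  # XPath in Practice
```

Full XPath 1.0 (as in `lxml` or XSLT processors) goes much further, with axes, functions, and predicates on position.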

eviks | 2 years ago

It's unreadable