Removing XSLT from browsers was long overdue and I'm saying that as ex-maintainer of libxslt who probably triggered (not caused) this removal. What's more interesting is that Chromium plans to switch to a Rust-based XML parser. Currently, they seem to favor xml-rs which only implements a subset of XML. So apparently, Google is willing to remove standards-compliant XML support as well. This is a lot more concerning.
xmcp123|3 months ago
Having flashbacks of “<!--[if IE 6]> <script src="fix-ie6.js"></script> <![endif]-->”
granzymes|3 months ago
Not sure how you got from that to “Google is ignoring standards”.
jillesvangurp|3 months ago
I'm not so sure that's problematic. Browsers probably just aren't a great platform for doing a lot of XML processing at this point.
Preserving the half-implemented frozen state of the early 2000s really doesn't serve anyone except those maintaining legacy applications from that era. I can see why they are pulling out complex C++ code related to all this.
It's the natural conclusion of XHTML being sidelined in favor of HTML 5 about 15-20 years ago. The whole web service bubble, bloated namespace processing, and all the other complexity that came with that just has a lot of gnarly libraries associated with it. The world kind of has moved on since then.
From a security point of view it's probably a good idea to reduce the attack surface a bit by moving to a Rust-based implementation. What use cases remain for XML parsing in a browser if XSLT support is removed? I guess some parsing from JavaScript. In which case you could argue that the usual solution in the JS world of using polyfills and e.g. wasm libraries might provide a valid/good-enough alternative or migration path.
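The polyfill migration path sketched above could look something like this: detect whether the native `XSLTProcessor` global still exists and, if not, fall back to a separately loaded implementation. This is only an illustrative sketch; `chooseXsltEngine` is a hypothetical helper, not a real API, and any WASM polyfill would have to be supplied by the page itself.

```javascript
// Sketch only: decide between the native XSLTProcessor and a fallback.
// "chooseXsltEngine" is hypothetical; no real polyfill package is assumed.
function chooseXsltEngine(globalObj) {
  if (typeof globalObj.XSLTProcessor === "function") {
    // Browser still ships native XSLT support.
    return "native";
  }
  // Here a page could lazy-load e.g. a libxslt build compiled to
  // WebAssembly before doing any transforms.
  return "polyfill";
}

// In a browser with XSLT support this returns "native";
// in an environment without it (e.g. Node.js) it returns "polyfill".
```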
fithisux|3 months ago
Also, it is not complexity if XSLT lives in a third-party library with a well-defined interface.
The problem is control. They gain control in two ways: they get more involved in the XML code base, and the bad actors run in the JS sandbox.
That is why we have standards though. To relinquish control through interoperability.
svieira|3 months ago
> Google is willing to remove standards-compliant XML support as well.
> They're the same picture.
To spell it out, "if it's inconvenient, it goes", is something that the _owner_ does. The culture of the web was "the owners are those who run the web sites, the servants are the software that provides an entry point to the web (read or publish or both)". This kind of "well, it's dashed inconvenient to maintain a WASM layer for a dependency that is not safe to vendor any more as a C dependency" is not the kind of servant-oriented mentality that made the web great, not just as a platform to build on, but as a platform to emulate.
Aurornis|3 months ago
This is an attempt to rewrite history.
Early browsers like NCSA Mosaic were never even released as Open Source Software.
Netscape Navigator made headlines by offering a free version for academic or non-profit use, but they wanted to charge as much as $99 (in 1995 dollars!) for the browser.
Microsoft got in trouble for bundling a web browser with their operating system.
The current world where we have true open source browser options like Chromium is probably closer to a true open web than what some people have retconned the early days of the web as being.
zetafunction|3 months ago
Disclaimer: I work on Chrome and have occasionally dabbled in libxml2/libxslt in the past, but I'm not directly involved in any of the current work.
zzo38computer|3 months ago
(However, I also think that generally you should not require too many features, if it can be avoided, whether those features are JavaScripts, TLS, WebAssembly, CSS, and XSLT. However, they can be useful in many circumstances despite that.)
dietr1ch|3 months ago
Which seems to be a sane decision given the XML language allows for data blow-ups[^0]. I'm not sure what specific subset of XML `xml-rs` implements, but to me it seems insane to fully implement XML because of this.
[^0]: https://en.wikipedia.org/wiki/Billion_laughs_attack
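The footnoted attack is worth spelling out: a handful of nested entity definitions expand exponentially when a parser resolves them eagerly. A shortened version of the classic payload (the full attack uses around ten levels of nesting):

```xml
<?xml version="1.0"?>
<!DOCTYPE lolz [
  <!ENTITY lol "lol">
  <!ENTITY lol2 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
  <!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
]>
<!-- Each level multiplies the expansion by 10; ten levels yield 10^9
     copies of "lol", gigabytes of text from a document under a kilobyte. -->
<lolz>&lol3;</lolz>
```

Parsers that refuse to expand entities (or cap total expansion) sidestep this entirely, which is one reason a deliberately limited subset can be a defensible choice.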
fithisux|3 months ago
Are XML technologies better or safer? Probably. However practice sets the standards. Is it a good thing? It remains to be seen.
Personally I am not satisfied with the "Web" experience. I find it unsafe, privacy-disrespecting, slow, and non-standards-compliant.
cptskippy|3 months ago
What in particular do you find objectionable about this implementation? It's only claiming to be an XML parser, it isn't claiming to validate against a DTD or Schema.
The XML standard is very complex and broad; I would be surprised if anyone has implemented it in its entirety beyond a company like Microsoft or Oracle. Even then I would question it.
At the end of the day, much of XML is hard if not impossible to use or maintain. A lot of it was defined without much thought given to practicality, and most developers will never have to deal with a lot of its eccentricities.