brainbag | 2 months ago

With context, this article is more interesting than the title might imply.

> The Sanitizer API is a proposed new browser API to bring a safe and easy-to-use capability to sanitize HTML into the web platform [and] is currently being incubated in the Sanitizer API WICG, with the goal of bringing this to the WHATWG.

This would replace the need to sanitize user-entered content with libraries like DOMPurify, by building the capability directly into the browser.

The proposed specification has additional information: https://github.com/WICG/sanitizer-api/

crote|2 months ago

Yeah, I was expecting something closer to "because that's what people Google for".

A big part of designing a security-related API is making it really easy and obvious to do the secure thing, and hide the insecure stuff behind a giant "here be dragons" sign. You want people to accidentally do the right thing, so you call your secure and insecure functions "setHTML" and "setUnsafeHTML" instead of "setSanitizedHTML" and "setHTML".

cess11|2 months ago

get_magic_quotes_gpc() and mysql_real_escape_string() had quite a bit to teach in this area.

_jzlw|2 months ago

The author really needs to start with that. They say "the API that we are building" and assume I know who they are and what they're working on, all the way until the very bottom. I just assumed it's some open source library.

> HTML parsing is not stable and a line of HTML being parsed and serialized and parsed again may turn into something rather different

Are there any examples where the first approach (sanitize to string and set inner html) is actually dangerous? Because it's pretty much the only thing you can do when sanitizing server-side, which we do a lot.

Edit: I also wonder how one would add for example rel="nofollow noreferrer" to links using this. Some sanitizers have a "post process node" visitor function for this purpose (it already has to traverse the dom tree anyway).
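For illustration, that "post process node" step can be sketched server-side with Python's stdlib parser. The names `LinkRelRewriter` and `add_rel` are made up for the sketch; a real pipeline would run this over an already-sanitized tree, and this version ignores self-closing tags and comments for brevity:

```python
# Hedged sketch: a post-processing pass that forces
# rel="nofollow noreferrer" onto every <a> while re-serializing.
from html import escape
from html.parser import HTMLParser

class LinkRelRewriter(HTMLParser):
    def __init__(self):
        # Keep character references as-is so we re-emit them verbatim.
        super().__init__(convert_charrefs=False)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            # Replace any author-supplied rel with our own.
            attrs = [(k, v) for k, v in attrs if k != "rel"]
            attrs.append(("rel", "nofollow noreferrer"))
        rendered = "".join(
            f' {k}="{escape(v or "", quote=True)}"' for k, v in attrs)
        self.out.append(f"<{tag}{rendered}>")

    def handle_endtag(self, tag):
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

    def handle_entityref(self, name):
        self.out.append(f"&{name};")

    def handle_charref(self, name):
        self.out.append(f"&#{name};")

def add_rel(html):
    p = LinkRelRewriter()
    p.feed(html)
    p.close()
    return "".join(p.out)

print(add_rel('<p><a href="https://example.com">x</a></p>'))
# -> <p><a href="https://example.com" rel="nofollow noreferrer">x</a></p>
```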

crote|2 months ago

> Are there any examples where the first approach (sanitize to string and set inner html) is actually dangerous?

The article links to [0], which has some examples of instances in which HTML parsing is context-sensitive. The exact same string being put into a <div> might be totally fine, while putting it inside a <style> results in XSS.

[0]: https://www.sonarsource.com/blog/mxss-the-vulnerability-hidi...
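The context sensitivity is easy to demonstrate even with Python's stdlib parser, which mirrors the relevant HTML rule here: `<style>` and `<script>` contents are raw text. The same characters that form an element inside a `<div>` are inert text inside a `<style>`, so re-serializing a sanitized fragment into a different context can change what it means. A small sketch:

```python
# Sketch: identical characters parse differently depending on context.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []   # start tags seen
        self.text = []   # raw text chunks seen

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

    def handle_data(self, data):
        self.text.append(data)

def parse(html):
    p = TagCollector()
    p.feed(html)
    p.close()
    return p

in_div = parse("<div><b>hi</b></div>")
in_style = parse("<style><b>hi</b></style>")

print(in_div.tags)    # ['div', 'b']  -- <b> became an element
print(in_style.tags)  # ['style']     -- <b> stayed raw text
```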

tobr|2 months ago

> They say "the API that we are building" and assume I know who they are and what they're working on, all the way until the very bottom.

This is a common and rather tiresome critique of all kinds of blog posts. I think it is fair to assume the reader has a bit of contextual awareness when you publish on your personal blog. Yes, you were linked to it from a place without that context, but it’s readily available on the page, not a secret.

LegionMammal978|2 months ago

They had a link in their post [0]: it seems like most of the examples are with HTML elements with wacky contextual parsing semantics such as <svg> or <noscript>. Their recommendation for server-side sanitization is "don't, lol", and they don't offer much advice regarding it.

Personally, my recommendation in most cases would be "maintain a strict list of common elements/attributes to allow in the serialized form, and don't put anything weird in that list: if a serialize-parse roundtrip has the remote possibility of breaking something, then you're allowing too much". Also, "if you want to mutate something, then do it in the object tree, not in the serialized version".

[0] https://www.sonarsource.com/blog/mxss-the-vulnerability-hidi...
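That strict-allowlist approach can be sketched with Python's stdlib parser. The `ALLOWED` table and the naive `javascript:` URL check below are illustrative only; production sanitizers (DOMPurify, for instance) use much larger vetted lists and proper URL parsing:

```python
# Hedged sketch of an allowlist sanitizer: keep only known-boring
# elements/attributes, escape everything else.
from html import escape
from html.parser import HTMLParser

# element -> attributes permitted on it; everything else is dropped
ALLOWED = {"p": set(), "b": set(), "i": set(), "a": {"href"}}

class AllowlistSanitizer(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []
        self.skip = 0  # depth inside dropped raw-text elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1       # drop the element AND its contents
            return
        if tag not in ALLOWED:
            return               # drop unknown elements, keep their text
        kept = [(k, v) for k, v in attrs
                if k in ALLOWED[tag] and v is not None
                and not v.strip().lower().startswith("javascript:")]
        rendered = "".join(
            f' {k}="{escape(v, quote=True)}"' for k, v in kept)
        self.out.append(f"<{tag}{rendered}>")

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = max(0, self.skip - 1)
        elif tag in ALLOWED:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip:
            self.out.append(escape(data))

def sanitize(html):
    s = AllowlistSanitizer()
    s.feed(html)
    s.close()
    return "".join(s.out)

print(sanitize('<p onclick="x">hi <script>alert(1)</script><b>ok</b></p>'))
# -> <p>hi <b>ok</b></p>
```

Note how this follows the "don't put anything weird in the list" advice: every element in `ALLOWED` serializes and reparses to exactly the same thing.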

rebane2001|2 months ago

> Because it's pretty much the only thing you can do when sanitizing server-side

I'd suggest not sanitizing user-provided HTML on the server. It's totally fine if you're stripping or escaping all markup, but it gets a little sketchy when you want to keep certain elements and attributes.

masklinn|2 months ago

> Are there any examples where the first approach (sanitize to string and set inner html) is actually dangerous?

The term to look for is “mutation xss” (or mxss).