item 44490206

resurrectedcyb | 7 months ago

Is there any tool for Rust that does profiling that detects what part of compilation time is caused by what? Like, a tool that reports:

- Parsing: x ms

- Type checking: y ms

- LLVM IR generation: z ms

And have there been any statistics done on that across open-source projects, like mean, median, percentiles and so on?

I am asking because which part of compilation is costly likely varies a lot from project to project, making it harder to analyse. I am also curious how many projects fall under "edge cases": whether it is 1%, 0.1%, 0.01%, and so on.

steveklabnik|7 months ago

The post my original comment is on discusses doing this at length.

> And have there been any statistics done on that across open-source projects, like mean, median, percentiles and so on?

I am not aware of any. But in all the posts on this topic over the years, codegen always ends up being half the time. It’s why cargo check is built the way it is, and why it’s always faster than a full build. If non-codegen factors were significant with any regularity, you’d be seeing reports of check being super slow compared to build.
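A quick way to see this gap on any given project is to time the two commands side by side. A sketch, assuming an existing Cargo project; the comments describe what each command does, and actual numbers will vary by project:

```shell
# Front-end only: parsing, type checking, borrow checking. No codegen.
cargo clean && time cargo check

# The same analysis plus LLVM codegen and linking.
cargo clean && time cargo build

# On stable, Cargo can also emit a per-crate HTML timing report.
cargo build --timings
```

If `check` and `build` come out close on a particular project, that project is one of the unusual cases where non-codegen work dominates.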

3836293648|7 months ago

Yes, there's -Ztime-passes, but it's nightly-only (or usable on stable with the RUSTC_BOOTSTRAP=1 env var set).
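This gives roughly the per-phase breakdown asked for above. A sketch of the invocations, assuming a nightly toolchain is installed; the bootstrap variant is the workaround for stable mentioned above:

```shell
# Per-pass wall-clock breakdown printed by rustc itself (nightly).
cargo +nightly rustc -- -Ztime-passes

# The same on a stable toolchain, opting into unstable flags.
RUSTC_BOOTSTRAP=1 cargo rustc -- -Ztime-passes

# Finer-grained profiling that is integrated with the query system
# (also nightly): writes .mm_profdata files that the measureme
# project's `summarize` tool can turn into a report.
cargo +nightly rustc -- -Zself-profile
```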

steveklabnik|7 months ago

My understanding is that -Ztime-passes isn't very accurate anymore, as it's not integrated into the query system well, or something.