brianolson|8 months ago
- Yay!
"The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%)"
- Boo
"Green Tea is available as an experiment at tip-of-tree and is planned to be available as an opt-in experiment in Go 1.25"
I definitely know some application code that spends 30% of CPU time in GC that needs to try this.
Imustaskforhelp|8 months ago
Regarding "The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%)":
Let the golang developers "cook"; I'm pretty sure they're going to do what's right for the language.
"The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%). Given the magnitude and inconsistency of the regression, these benchmarks appear to be rather insensitive to this change. One hypothesis is that the occasional regression may be due to an out-of-date PGO profile, but remains to be investigated."
So it's going to be investigated, and an explanation of why it occurs (and a fix) would presumably land before you or I use it in 1.26, since they're saying that's when it would most likely ship (if I remember correctly?). So there's no need to boo, I guess.
Great job from the golang team.
silisili|8 months ago
Well, I don't love that reported performance regressions are handwaved away as not being the new GC's fault, but rather the code doing something wrong or abnormal.
Will wait for more real-world cases showing substantial improvements, but existing (and possibly bad) code exists, and it shouldn't be blamed for regressions.
zozbot234|8 months ago
I didn't see anyone "handwaving away" performance regressions in the thread. The closest was a special case where a Golang program was auto-tuning caching decisions based on heap size metrics, and the improved metrics with the new GC led to excessive caching and an apparent regression. That's hardly the common case!
(In general though, if you take the authors' concerns about the increased future impact of memory bandwidth and memory non-locality seriously, the obvious answer is: don't use GC in the first place, except when you really, really can't avoid it. And even then, try to keep your object graphs as small and compact as possible wrt. memory use; don't have a single "tracing" phase that ends up scanning all sorts of unrelated stuff together. Of course this is unhelpful if you need to work w/ existing codebases, but it's good to keep in mind for new greenfield projects!)
rurban|8 months ago
Mark & sweep is only really useful for external references, but Go doesn't have many of those, far fewer than Lisp.
Mawr|8 months ago
Holy buzzwords. There's no such thing as "the" GC design, just like there's no "the" car engine design that's suitable for every vehicle in existence. The right GC design is one that fits the language it's designed for.
Therefore, if you have reason to believe those qualities are a good fit for a Go GC, it'd be great if you could go into detail as to why, instead of just throwing out buzzwords left and right.
I'm not a GC expert, but as far as I know, compaction isn't needed because, due to various Go-isms, there's not much fragmentation happening. There's no reason to constrain a design for a feature that won't give much benefit.
zozbot234|8 months ago
Golang GC is mostly concurrent, not stop-the-world. There's a tiny STW pause at the end of the 'mark' phase that could in principle be avoided, but it's not a huge issue wrt. performance.
rastignack|8 months ago