Why haven't places updated already? It's not that much work to update. Where I work we always go to the new LTS version as soon as it's supported by Gradle.
The bigger the project, the more painful the upgrade. Package systems are convenient to avoid reinventing the wheel, until you have to upgrade any piece of it. Then you're stuck trying to figure out which versions of each package go together.
If Package A won't run on JDK 17 your entire project is stuck on JDK 11. If Package B is upgraded but has conflicts with Package A, you have to dig through old versions until you find one that works -- and you don't get upgrades.
The more games somebody has played with reflection, undocumented features, deprecations, etc. the more likely you are to have a conflict. And since package managers encourage you to depend on somebody else's code, you end up depending on everybody else's code.
The smaller and greener the project is the more likely it is you can just pull the latest versions and be happy about it. A project that was written when Java 8 was current, and continued to develop, is going to be a nightmare.
"Oh look, I need to upgrade mockito and Spring. Oh, now I upgraded Spring I need to update the spring JPA plugin. Oh now I upgraded that I need to upgrade Hibernate. Oh now I need to upgrade the library built on it that that team over there maintains. Oh, they're not interested." etc. etc.
1) dependencies need to be upgraded. for example, not all versions of Gradle support all Java versions. So you need to upgrade Gradle to upgrade Java.
2) other things are deemed to have higher priority.
3) people are satisfied with existing features and don't want to spend energy to upgrade to something that doesn't provide immediate value.
4) folks aren't educated on what the benefit of switching would be so why would it be prioritized? This is a case of "they don't know what they don't know".
I work on a team using Java 8 daily. It's fine. It's missing things I wish it had (null handling in switch statements, for example), but I don't care about that so much that I'm going to go through the pain of upgrading 7-9 services in the monorepo, plus their dependencies, and then test them all just to be on a new version of Java.
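For context on that missing feature: null handling in switch arrived with pattern matching for switch in Java 21. A minimal sketch of what Java 8's switch can't express (assuming a Java 21+ runtime):

```java
// Minimal sketch (requires Java 21+): pattern matching for switch
// allows an explicit `case null`, which Java 8's switch cannot express
// (a null selector there throws NullPointerException instead).
public class SwitchNullDemo {
    static String describe(Object o) {
        return switch (o) {
            case null      -> "nothing";
            case Integer i -> "int " + i;
            case String s  -> "string " + s;
            default        -> "something else";
        };
    }

    public static void main(String[] args) {
        System.out.println(describe(null)); // nothing
        System.out.println(describe(42));   // int 42
    }
}
```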
1) is garbage. Since Gradle 6 you can run on Java 22 EA with no issues. Use toolchains, as they say in their docs.
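For reference, the toolchain approach mentioned decouples the Java version that compiles and runs your code from the one Gradle itself runs on. A minimal sketch in the Kotlin DSL (the version number is illustrative):

```kotlin
// build.gradle.kts -- Gradle resolves (and can auto-download) a JDK of
// this version for compiling, testing, and running your code,
// independently of the JDK that Gradle itself is launched with.
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(21)
    }
}
```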
2) no shit. What business user in their right mind is ever prioritising upgrading their language version? It's not up to them to push the upgrade. It's up to you.
3) of course they are. People don't desire what they don't know exists. Invest in people who are actually interested in improving their software.
4) The Java team has been pushing all the new features heavily via Twitter / YouTube / InfoQ / Hacker News / other OpenJDK providers for every single Java version across the six-month release cycle. If your devs / your team don't know about it, then maybe, again, you're not encouraging people to want to improve on what they have, or to take an interest in the tech they work in.
I mean, that is fine; do I give a shit what Java version I'm using for my take-home salary? No... but I enjoy using the newest, most interesting and useful tools. And you'd best believe those people are more attractive to other companies than you, working on some 15-year-old Java 8 tech.
You’re just delaying and making the upgrade worse when the time comes. It’s much easier to upgrade now to 11 and then 17 and then 21 rather than try to upgrade from 8 to 27 when 8 is finally EOL.
Whether you perceive there to be no immediate benefit (hint: there is, Java 8 is an antiquated runtime) or not, delaying upgrading until Java 8 EOL is a way larger risk than upgrading now.
This is a niche case, but I spent months trying to upgrade one of our services from one LTS version to the next (I forget which). We encountered a weird bug where services running on the latest JRE would mysteriously corrupt fields when deserializing thrift messages, but only after running for a little while.
After an enormously unpleasant debugging cycle, we realized that the JIT compiler was incorrectly eliminating a call to System::arraycopy, which meant that some fields were left uninitialized. But only when JIT-compiled; non-optimized code ran fine.
This left us with four possible upgrade paths:
* Upgrade thrift to a newer version and hope that JIT compilation works well on it. But this is a nightmare since A) thrift is no longer supported, and B) new versions of thrift are not backwards compatible so you have to bump a lot of dependent libraries and update code for a bunch of API changes (in a LARGE number of services in our monorepo...). With no guarantee that the new version would fix the problem.
* File a bug report and wait for a minor version fix to address the issue.
* Skip this LTS release and hope the JIT bug is fixed in the next one.
* Disable JIT compilation for the offending functions and hope the performance hit is negligible.
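The last of those options is possible in HotSpot without code changes via the `-XX:CompileCommand` flag. A hedged sketch (the class and method names here are hypothetical, not from the original post):

```shell
# Exclude one method from JIT compilation; it stays interpreted.
# com.example.ThriftCodec::readStruct is a made-up name for illustration.
java -XX:CompileCommand='exclude,com.example.ThriftCodec::readStruct' -jar service.jar
```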
I ultimately left the company before the fix was made, but I think we were leaning towards the last option (hopefully filing a bug report, too...).
There's no way this is the normal reason companies don't bump JRE versions as soon as they come out, but it's happened at least once. :-)
In general there's probably some decent (if misguided) bias towards "things are working fine on the current version, why risk some unexpected issues if we upgrade?"
I encountered a weird bug with deserializing JSON in a JRuby app during an OpenJDK upgrade - it would sporadically throw a parse error for no apparent reason. I was upgrading to OpenJDK 15, but another user experienced the same regression with an LTS upgrade from 8 to 11.
The end result of my own investigation led to this quite satisfying thread on hotspot-compiler-dev, in which an engineer starts with my minimal reproduction of the problem and posts a workaround within 24 hours: https://mail.openjdk.org/pipermail/hotspot-compiler-dev/2021...
There's also a tip there: try a fastdebug build and see if you can convert it into an assertion failure you can look up.
For example, my team owns a dozen services and they have hundreds of direct and transitive dependencies. Of those, maybe a dozen or two need work to support the new version, but that's a dozen different teams that have to put the work on their roadmap and prioritize it. When the entitlement is 'devs want to use shiny feature X with hard-to-quantify productivity benefit' it's difficult to prioritize. When there's an efficiency benefit then things move fast, because a 10% efficiency improvement means 10% lower server costs and that's easy math.
The services I work on pump the entire business revenue through from start to finish. A few nice-to-haves for devs aren't anywhere close in the risk calculation if something breaks.
jfengel|2 years ago
Macha|2 years ago
brnt|2 years ago
Brystephor|2 years ago
belfthrow|2 years ago
vips7L|2 years ago
krzyk|2 years ago
We ditched Spock because of Groovy, and never looked back. Now on JDK 21, previously on 20.
defatigable|2 years ago
Freaky|2 years ago
dihrbtk|2 years ago
kevan|2 years ago
yCombLinks|2 years ago
jabradoodle|2 years ago
coldtea|2 years ago
dingi|2 years ago
1. Use Maven
2. Use BOMs to manage related dependencies
3. No Lombok
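On point 2: a BOM import pins a mutually compatible set of versions in one place, so individual dependencies need no `<version>` of their own. A minimal Maven sketch (the Spring Boot BOM and its version here are illustrative):

```xml
<!-- pom.xml: importing a BOM into dependencyManagement lets all
     dependencies it covers inherit coherent, tested versions. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-dependencies</artifactId>
      <version>3.2.5</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```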