top | item 38331329

KenoFischer | 2 years ago

I don't really know what kind of rebuttal you're looking for, but I will link my HN comments from when this was first posted for some thoughts: https://news.ycombinator.com/item?id=31396861#31398796. As I said in the linked post, I'm quite skeptical of the business of trying to assess the relative bugginess of programming in different systems, because that assessment has strong dependencies on what you consider core vs. packages and what exactly you're trying to do.

However, bugs in general suck and we've been thinking a fair bit about what additional tooling the language could provide to help people avoid the classes of bugs that Yuri encountered in the post.

The biggest class of problems in the blog post is that it's pretty clear that `@inbounds` (and I will extend this to `@assume_effects`, even though that wasn't around when Yuri wrote his post) is problematic, because it's too hard (arguably impossible) to write correctly. My proposal for what to do instead is at https://github.com/JuliaLang/julia/pull/50641.
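To illustrate the hazard (my own minimal sketch, not what the linked PR proposes): `@inbounds` promises the compiler that every index is valid, but in generic code that promise rests on assumptions about the array type that the author cannot see. The hypothetical `mysum` below is the classic mistake; `mysum_safe` makes the same promise true by construction.

```julia
# Incorrect use of @inbounds: assumes 1-based, contiguous indexing,
# which is not guaranteed for every AbstractArray (e.g. an OffsetArray).
function mysum(xs::AbstractArray)
    s = zero(eltype(xs))
    for i in 1:length(xs)      # assumes indices are 1:length(xs)!
        @inbounds s += xs[i]   # undefined behavior if that assumption fails
    end
    return s
end

# Correct use: iterate over the array's own index set, so every index
# handed to @inbounds is in bounds by construction.
function mysum_safe(xs::AbstractArray)
    s = zero(eltype(xs))
    for i in eachindex(xs)
        @inbounds s += xs[i]
    end
    return s
end
```

Both functions agree on ordinary arrays; the difference only shows up on arrays with custom axes, which is exactly why the bug is so easy to ship.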

Another common theme is that while Julia is great at composition, it's not clear what's expected to work and what isn't, because the interfaces are informal and not checked. This is a hard design problem, because it's quite close to the reasons why Julia works well. My current thoughts on that are here: https://github.com/Keno/InterfaceSpecs.jl but there's other proposals also.

lheck|2 years ago

> Another common theme is that while Julia is great at composition, it's not clear what's expected to work and what isn't, because the interfaces are informal and not checked.

This is THE huge issue when combined with the other variables:

- you often have hundreds of dependencies that interlock

- you often have dependencies that work with multiple packages, and they will often pull them all into your tree (there are sometimes "bridge packages" written, but that is O(n^2) in the number of packages)

- many maintainers do not care about good versioning practices, because they are not enforced, and it is not their problem when they break a bajillion other packages

- many people writing Julia code are not software developers. This is great! But they usually don't look at their test coverage, and often a package's CI always fails and is simply ignored.

If Julia only allowed multiple dispatch as the link between those packages, the situation would look a little better. But often packages also talk through simple property accesses like `X.someproperty`. I've seen situations where maintainers added and removed these properties and broke their own other libraries. Better "enforcement" of these kinds of things (whatever that would look like) would be a huge improvement for the time sink that is maintaining a Julia package.
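A minimal sketch of that failure mode, with hypothetical names: downstream code that reaches into `X.someproperty` is coupled to a field name the maintainer never promised, while an accessor function gives the maintainer one place to keep stable (or deprecate).

```julia
# Hypothetical upstream type; the field name `u` is an internal detail.
struct Solution
    u::Vector{Float64}
end

# Downstream code doing `sol.u` breaks the moment the field is renamed.
# An accessor function is an interface the maintainer can keep stable;
# a `getproperty` overload can also keep an old property name working
# as a deprecation path.
values_of(sol::Solution) = sol.u

sol = Solution([1.0, 2.0])
values_of(sol)   # stable entry point instead of sol.u
```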

Sukera|2 years ago

> But often packages also talk through simple property accesses like `X.someproperty`. I've seen situations where maintainers added and removed these properties and broke their own other libraries. Better "enforcement" of these kinds of things (whatever that would look like) would be a huge improvement for the time sink that is maintaining a Julia package.

I think this is a cultural issue to a large degree. Even with better enforcement, it's still a question of coding discipline to actually not break something. If it were easier to anticipate what sort of breakage a change could entail (say, by making it clear and documented _by the core language_ what is considered a breaking change, and then actually not doing even "technically breaking" changes, using deprecations instead), this COULD change.

That being said, this requires a very different development workflow than what is currently practiced in the main repos of the language, so there's quite an uphill battle ahead (though such issues happen there all the time, so solving it there would actually help the most, ironically).

grandinj|2 years ago

That sounds like Julia needs some kind of test suite that lives alongside the major libraries and verifies that things that interact do so in a reasonable manner, e.g. that things implementing addition obey the rules of addition. With a little reflection and some automation, such a tool could be taught to test new Julia libraries and catch problems early.

i.e. what I am talking about is effectively informally encoding protocols/interfaces and suchlike in a test-suite, rather than seeking to make it a part of the compiler/language.

Not as sexy, no doubt, but flexible and with a short time to get feedback.
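A hedged sketch of what such a law-checking suite could look like, in plain Julia with hypothetical names: given example values of any type claiming to implement addition, check the algebraic rules hold, with no compiler support needed.

```julia
# Hypothetical ecosystem-level "laws" checker: verify that `+` on the
# given example values behaves like addition. Returns true on success,
# throws a descriptive error on the first violation.
function check_addition_laws(examples)
    for a in examples, b in examples
        a + b == b + a   || error("addition is not commutative for $a, $b")
        a + zero(a) == a || error("zero is not an additive identity for $a")
        for c in examples
            (a + b) + c == a + (b + c) ||
                error("addition is not associative for $a, $b, $c")
        end
    end
    return true
end

# Running it against a new library's type would catch interface
# violations early; integers and vectors both pass:
check_addition_laws([1, 2, 3])
check_addition_laws([[1, 2], [3, 4]])
```

(Floating-point addition would fail the associativity law, which is itself a useful thing for such a suite to surface explicitly rather than silently assume away.)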

ChrisRackauckas|2 years ago

This is precisely what we did with the SciML ecosystem. You can see a blog post from a year back: https://sciml.ai/news/2022/10/08/error_messages/. There's a lot of high-level interface checking that goes on now through a trait system to ensure that the pieces are solvable before hitting solver code, with high-level error messages thrown back based on those checks.

Over the last year we've been growing the domain of these checks as well, and have seen a decrease in bug reports as a result.
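The general shape of that approach (a simplified sketch with made-up names, not SciML's actual implementation) is to ask a trait function whether a problem/algorithm pair is compatible *before* dispatching into any solver code, and throw a readable error if not:

```julia
# Hypothetical types standing in for a real problem/solver hierarchy.
abstract type AbstractProblem end
struct StiffProblem <: AbstractProblem end
struct NonStiffProblem <: AbstractProblem end

struct ExplicitSolver end   # only suitable for non-stiff problems

# Trait: can this algorithm handle this kind of problem?
is_compatible(::NonStiffProblem, ::ExplicitSolver) = true
is_compatible(::AbstractProblem, ::ExplicitSolver) = false

function solve(prob::AbstractProblem, alg)
    # Check the trait up front so the user gets a high-level error
    # instead of a cryptic MethodError from deep inside solver code.
    is_compatible(prob, alg) ||
        throw(ArgumentError("$(typeof(alg)) cannot handle $(typeof(prob)); " *
                            "choose an implicit solver for stiff problems"))
    return :solved   # stand-in for the real solver call
end
```

The payoff is that an incompatible combination fails at the interface boundary with an actionable message, rather than partway through a computation.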