I have such mixed feelings about dynamically typed languages. I've designed a whole pile of hobby programming languages, and dynamically typed languages are at least an order of magnitude simpler to design and for users to learn and start using.
At the same time, they inevitably seem to lead to user stories like this where a user really does know exactly what types they're working with and wants the language to know that too (for performance or correctness), and they end up jumping through all sorts of insane hoops to get the optimizer to do exactly what they want.
I think Common Lisp got it exactly right. Strongly typed dynamic language where you can optionally specify types for the extra performance/correctness if need be (especially with SBCL).
Honestly, I think weak typing is more of an issue than dynamic typing and people cry for static types when they suffer mostly from the former.
Dynamic typing is great because it allows you to have extremely complex types for basically free. It allows for insane expressiveness. It also makes prototyping much easier and does not force you into over-specifying your types early on. In a dynamic language, most of your types are by default the most general type that would work, while static types force you to use very specific types (especially when the language lacks structural typing).
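A tiny illustration of that default generality, sketched in Python (`summarize` is a hypothetical function): one annotation-free definition covers lists, strings, and tuples alike, where a static language without structural typing would need a generic or an interface.

```python
# One untyped function, many "types" for free: anything that
# supports len() and indexing works, with no declarations.
def summarize(items):
    return f"{len(items)} items, first is {items[0]}"

print(summarize([1, 2, 3]))   # 3 items, first is 1
print(summarize("abc"))       # 3 items, first is a
print(summarize((3.5, 7)))    # 2 items, first is 3.5
```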
If you want to allow just half the expressiveness of dynamic languages in your static language, you will quickly run into huge complexity: dependent types, long compile times, cryptic error messages, and whatnot.
Generally, I think gradual typing is rising in popularity for good reason. It allows for quick prototyping while still letting you drill down on your types when you want to. Best of both worlds.
I like dynamic languages too. But I don't like the idea of "optimization", and I would be super interested in a dynamic language that didn't attempt to divorce performance from correctness. The worst part about jumping through insane hoops to enchant the optimizer is that it can all go wrong with the tiniest change--a flag here, a different usage pattern there, a new version, etc., and suddenly your program doesn't do what you need it to, as though an operation taking 1000x longer is just a matter of degree.
I'm happy with dynamic languages for almost everything I do and generally do not want to sacrifice flexibility, which is the price to pay for a static type system. However, certain parts of a program become more crystalline over time, whether for performance or correctness reasons, and being able to express those parts using a static type system makes a lot of sense. PreScheme [0] is an example of a statically typed Scheme that composes with the host Scheme. I'd like to see more work in this direction as Scheme already works well as a multi-paradigm language.
As a long-time Python and JavaScript user, I've come to the conclusion that dynamic typing is just not a good idea for anything beyond exploratory or very small projects.
The problem is that you invariably have to think about types. If you mistakenly pass a string to a function expecting an integer, you better hope that that is properly handled, otherwise you risk having type errors at runtime, or worse—no errors, and silent data corruption. That function also needs to be very explicit about this, but often the only way to do that is via documentation, which is often not good enough, or flat out wrong. All of this amounts to a lot of risk and operational burden.
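A small Python sketch of that failure mode (`repeat_discount` is a hypothetical function): passing a string where an integer is expected can produce no error at all, just wrong data.

```python
# Hypothetical: a function that assumes a numeric price but is
# handed a string. Instead of failing, Python happily "succeeds".
def repeat_discount(price, times):
    # Intended: multiply a numeric price by an integer count.
    return price * times

print(repeat_discount(10, 3))    # 30, as intended
print(repeat_discount("10", 3))  # "101010": no error, silently corrupt data
```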
Python's answer has historically been duck typing, which doesn't guarantee correctness, so it's not a solution; more recently it is addressing this with gradual typing, which has its own issues and limitations. Chief among them: if specifying types is optional, most programmers will not bother, or will just default to `Any` to silence the type checker. For JS, we had to invent entirely new languages that compile to it, and we've reached the point where nobody sane would be caught working with plain JS in 2024.
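The escape hatch looks like this in Python's gradual typing (`parse_port` and `parse_port_loose` are hypothetical examples): a precise annotation lets a checker such as mypy catch the mistake before the program runs, while `Any` silences it.

```python
from typing import Any

# With a precise annotation, a checker such as mypy rejects bad
# calls before the program runs.
def parse_port(value: str) -> int:
    return int(value)

# Annotating with Any silences the checker, so the mistake is
# only discovered at runtime.
def parse_port_loose(value: Any) -> int:
    return int(value)

print(parse_port("8080"))     # 8080
# parse_port(8080)            # mypy would flag this: int is not str

try:
    parse_port_loose([1, 2])  # the checker is happy; Python is not
except TypeError as e:
    print("runtime error:", e)
```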
Static typing, in turn, gives you a compile time safety net. It avoids a whole host of runtime issues, reduces the amount of exhaustive and mechanical tests you need to write, while also serving as explicit documentation. Code is easier to reason about and maintain, especially on large projects. You do lose some of the expressiveness and succinctness of dynamic typing, but what you gain with static typing is far more helpful than these minor benefits.
I bounced off learning Haskell a few times (infinite regress to downloading category theory textbooks), but almost the moment I saw its type inference, I immediately wanted that in lisp.
Roc is looking pretty nice (especially once your editor can paste in the inferred type like they want to do), but I still think there’s an empty space for an imperative language where type inference makes it feel as untyped (or at least as unceremonious) as (pre-annotation) Python
Solid blog overall and I think it is pitched at the right level of granularity for the topic. However, if I were offering criticism, the place where more detail would be super interesting is the 'Please Inline' section. In particular, I would really be interested in a slightly more detailed description of the optimizer's algorithm for inlining. I think the "define-inlinable" macro is a great example of macro usage, but it is clearly a way to circumvent the inliner's apparent shortcomings. I would like to understand what heuristic the optimizer is using, to see if there is a sensible default function construction style/idiom that is more appealing to the optimizer for inlining. However, I am reminded of the inlining discussion in Chandler Carruth's talk (Understanding Compiler Optimization) from a few years ago, where he discusses how obscure and seemingly arbitrary inlining heuristics and their optimization passes are in practice. [1]
A walkthrough of Guile's optimization passes and the inlining heuristics would be great. I've been meaning to do a "part two" here but you know how these things go.
Andy's blog is on another level. He's also leading the Hoot project to compile Guile to WebAssembly and I work with him on that and try to absorb whatever compiler knowledge I can while doing so.
These sorts of optimizations can and should be handled by a (sufficiently smart (tm)) compiler.
Common Lisp/SBCL is usually sufficiently smart. I know not everyone likes Common Lisp, but at least I would have tested it with something more performant than Guile, like Chicken Scheme (my favorite!), Chez Scheme, etc.
I like Guile and its purpose as a universal scripting language. However, its performance issues are well known, even compared to other scripting-first languages (Lua, Perl, Python, etc.).
I think that is why this blog is particularly interesting to me. As one of the other comments here said, it is nice to see a detailed analysis of optimizing code where the first step is not to rewrite in a language with a presumed better performance baseline. I also think some props are due for continuing to work within Guile's somewhat spartan tooling to do the optimization work, instead of switching to a language that may have better tooling for the task.
Not to take away from the general comparisons between various Lisp flavors and between various scripting languages (an activity I engage in quite often), but your lead-off line is more prescriptive than I find advisable. I don't think a blanket statement that optimization of the runtime behavior of code "should" only be done via a compiler holds up. Some devs enjoy the work, and others have varied reasons for doing performance-sensitive work in a given language/environment. At the end of the day, doing optimization is a valid use of developer effort and time if that developer judges it so.
Part of the problem is that raw Scheme is spectacularly underspecified.
It also doesn't help that Schemes like Guile are also interactive. The domain of an interactive language and a "compiled" language are quite different.
Given the entirety of the program made available to the compiler all at once, there are high-level derivations that can happen, notably through flow analysis, to let the compiler make better decisions. But do that in an interactive environment, where the rug can be pulled out from under any of the assumptions the compiler made, and things get messy quite quickly.
One interesting tidbit for Java is that the developers of the compiler advocate "idiomatic" Java. Simply, write Java like Java, and don't try to trick the compiler. Let the compiler developers trick the compiler.
That's evident here in this article when they wrote the function that tests the types of the parameters. Raw Scheme, naturally, doesn't allow you to specify parameter types, not the way you can in Common Lisp, for example. Either Guile does not have a specific extension to support this, or the compiler simply looks for this type-checking pattern at the top of a function to make optimization determinations. On the one hand, it could be more succinct with a specialized facility, but on the other, this is "just Scheme".
So, in effect by checking for the type of the variable, they're implicitly declaring the type of the variable for the "sufficiently smart" compiler to make better decisions.
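A loose Python analogue of the same idea (`magnitude` is a hypothetical function): a static checker such as mypy treats an isinstance test as an implicit type declaration for the code that follows, much as the compiler described above can specialize on an explicit type check.

```python
def magnitude(x: object) -> float:
    # Before the check, x could be anything.
    if isinstance(x, (int, float)):
        # Inside this branch a type checker "narrows" x to int | float,
        # analogous to a compiler specializing code after an explicit
        # runtime type test.
        return abs(float(x))
    raise TypeError("expected a number")

print(magnitude(-3))   # 3.0
print(magnitude(2.5))  # 2.5
```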
The counter example is the "define-inline" construct, and thus not standard Scheme (though readily replaced by a "no-op" macro if one was interested in porting the code).
In my experience, Guile these days is often faster than Chicken, even compiled Chicken (the csi interpreter is dog slow and not suitable for anything but interactive exploration). Chicken is just not a fast implementation.
Plus, Guile comes with a more comprehensive standard library; on the other hand, Chicken's package manager and available packages do make up for that.
Guile has come a long way in the past decade or so! I think your info is quite out of date. Guile's compiler performs a number of state of the art optimizations. It compiles to bytecode so native code compilers like Chez win the race, naturally. The JIT is pretty good, though! Native compilation is on the roadmap for Guile, but maybe somewhat surprisingly we're getting AOT compilation to WebAssembly first. https://spritely.institute/hoot/
I switched from Guile to SBCL because I really like having things such as (declare (inline my-function)) and (declare (type Double-Float x y z)). Now if only it had case-lambda, named let, and a better deftype which can specify members of classes and/or structs.
The Guile source->source optimizer is such a nice tool for seeing what is going on, especially when writing macros. I really recommend Kent Dybvig's "The Macro Writer's Bill of Rights" to see how useful it can be.
The monomorphic vs. polymorphic argument is an interesting one. I think that you could explicitly get unboxing if you used something like CLOS-style multimethods to dispatch based on the type, so that (add <float> <float>) would dispatch to the function that uses fadd on those operands. I never realized that you could use this kind of functionality (multimethods or free monad interpreters) to write in-code optimizations that are conveniently abstracted away in actual code usage.
Edit: nevermind, that's also dynamic dispatch. You'd have to add static dispatch via macros or some external transpilation step.
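For what it's worth, the dynamic-dispatch version being ruled out here is easy to sketch in Python with functools.singledispatch (single dispatch rather than true multimethods; `double` is a hypothetical example function):

```python
from functools import singledispatch

# Dispatch on the runtime type of the first argument -- the dynamic
# dispatch the comment above notes is not the same as static selection.
@singledispatch
def double(x):
    raise TypeError(f"unsupported type: {type(x).__name__}")

@double.register
def _(x: int) -> int:
    return x + x          # the integer-specialized path

@double.register
def _(x: float) -> float:
    return x + x          # a float-specific path (conceptually, fadd)

print(double(2))    # 4
print(double(2.5))  # 5.0
```

The lookup still happens at runtime on each call, which is exactly why it cannot deliver the static unboxing the edit above is after.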
The full numeric tower sounds like a great idea. But in retrospect you almost never want uint32 silently converting to bignum, or ending up with a low precision float.
I would say the opposite: In a high-level language for "everyday programming", as opposed to systems programming or high-performance programming, arbitrary precision signed integers are the right choice. They let you do math on things like a large file size in bytes, or a nanosecond-precision timestamp, without having to think about integer widths. You only need to think about "is this an integer or is this floating-point", which takes less mental effort than using a language like C with its large selection of integer types.
I love the numeric tower most of the time. Not having to worry about integer overflow bugs is great. I like that I can express the fraction 1/3 exactly rather than approximately with a float. It's only in the cases of very sensitive code that I have to worry about the details of how numbers are represented at runtime.
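Python's numbers illustrate the same trade-off: its integers are arbitrary precision throughout, and the standard library's `Fraction` type gives exact rationals. A small illustrative sketch:

```python
from fractions import Fraction

# Integers grow into bignums transparently; no overflow to think about.
file_size = 2 ** 100          # fine, even though it exceeds 64 bits
print(file_size * 3)

# Exact rationals: 1/3 stays 1/3 instead of a rounded float.
third = Fraction(1, 3)
print(third * 3)              # 1, exactly

# Floats, by contrast, accumulate rounding error.
print(0.1 + 0.2)              # 0.30000000000000004
```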
That's funny, my impression was the opposite: that you'd almost always want a fixed-width integer to promote to bignum when it transcends the limit. It's a lot more sensible than adding a bunch of integers together and ending up with one which is smaller than any of the ones in the input.
As soon as I saw the title, I thought of the Street Fighter character, but this was actually an interesting read about a programming language I had never heard of before.
The slogan I've proposed for the language is "Guile goes with everything." Because Guile was designed from the outset to run embedded or standalone, and to transpile other extension languages to Scheme or a Scheme-compatible representation, I think that's fitting. See: https://knowyourmeme.com/memes/guiles-theme-goes-with-everyt...
You probably shouldn’t do those things. The point of a high level language is to not have to think about such details. If you can’t get the performance you need, you should use a different tool, instead of trying to circumvent implicit limitations.
[0] https://prescheme.org/
[1] https://youtu.be/FnGCDLhaxKU?si=J3MhvJ-BmX5SG2N6&t=1550
Andy Wingo is the maintainer and I get a kick out of everything he posts.
On SICP, Guile badly needs a module for the picture language from the book (and srfi-203 + srfi-216).
Has anyone had a positive experience?