This is what people mean when they say Haskell is "opinionated."
Haskell shepherds you into separating out IO code from library code to such an extent that literally any function that has an IO action taints the value returned from that function, causing it to be an IO value, and trying to pass that IO value into another function makes the return type of that function IO, too. Parametric polymorphism is the default, too, so it also shepherds you into writing general purpose code. Haskell is full of these little decisions where it just won't let you do something because it's not "correct" code, and they kind of don't care if that makes coding in it a fight against the compiler.
Rust took that philosophy and applied it to pointers. Every value has a lifetime and an owner, which makes it quite hard to do things that aren't memory safe.
Both Rust and Haskell wrap values that can fail in little boxes, and to get them out you have to check which type of value it is, and in C# there's nothing stopping you from returning null and not telling anyone that you can return null, and just assuming people will check for null all the time. Haskell has a philosophy of "make invalid code unrepresentable." The concept of a value being in a box, rather than null being a possible value makes it impossible to use that value without getting it out.
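A minimal Rust sketch of the "value in a box" idea (`find_user` is an invented stand-in for any fallible lookup):

```rust
// Option<T> is the "little box": the caller cannot touch the u32
// without first checking which variant it got.
fn find_user(name: &str) -> Option<u32> {
    if name == "bob" { Some(1) } else { None }
}

fn main() {
    // The compiler forces us to handle both cases to get the value out.
    match find_user("bob") {
        Some(id) => println!("found id {}", id),
        None => println!("no such user"),
    }
}
```

A plain `find_user("bob") + 1` would not even compile, which is exactly the "make invalid code unrepresentable" property described above.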
People who write Go love that concurrency is easy and Go fmt has enforced a single canonical style. Building these sorts of things into the language goes a long way in getting them adopted and becoming the norm.
I think we saw a rise of the easy, anything-goes, screw-performance scripting languages. I think the next fashion seems to be in enforcing "correct" coding style. They all have their place.
I'm very much in agreement with your assessment here. If you rely on programmers to "do the right thing", some people will break those rules, and systems will suffer varying levels of quality decay as a result. Language-level enforcement of key concepts prevents folks from making bad decisions--or at least makes it harder--in whatever areas those concepts apply. Clojure is another good example: it provides concurrency primitives that let you avoid all the major pitfalls you typically see with multithreaded Java programs. As such, most Clojure code does concurrency the "right way", and in 10+ years of using it, I've never seen a deadlock.
>I think we saw a rise of the easy, anything-goes, screw-performance scripting languages. I think the next fashion seems to be in enforcing "correct" coding style. They all have their place.
Having freedom to do what you want is great. Even if you shoot yourself in the foot, you learn your lesson and become a better developer. But as you work with increasing numbers of people, many making the same mistakes you have made, and especially as you end up having to fix their mistakes, you begin to look for a tool that takes away their ability to shoot themselves in the foot. That was one appeal of Rust when I was learning it. It is a pain to fight the compiler over memory, especially coming from a garbage collected background, but it both protected me from myself and protected me from others. At a certain point, at least on large enough group projects, the benefits of that protection outweigh the costs.
> in C# there's nothing stopping you from returning null and not telling anyone that you can return null, and just assuming people will check for null all the time
Haskell is not opinionated. All in all, it's probably easier (but misses much of the point of Haskell) to just write "IO" and "do/return" on every function in your program than to use IO in a disciplined way.
Haskell even supports this with special do-syntax (and the fortuitously-named "return") to make monadic code look more imperative!
Paying that IO/do/return syntax tax (sin-tax? syn-tax?) is still cheaper than the signature/return boilerplate in competing compiled languages like C and Java. Haskell invites you to avoid that syn-tax by writing principled IO.
One of the major complaints about Haskell is that it is so expressive and powerful that there are so many incompatible ways of architecting modules (see the incompatible implementations of Monad Transformers / Effects, Lens, etc.).
I wonder how much of an uptick Adacore has seen with people using Ada and especially Spark in projects lately. Ada has a different niche than Haskell and Rust, but they're obsessed with software quality and provability. I've only played with Ada, but really liked the code that came out of it. If only they could make strings less painful to deal with.
I'd be very curious if, in the same way that "TypeScript is a superset of JavaScript", there could be a superset of TypeScript that encouraged one to annotate when they're performing IO operations (reading from/writing to the DOM, the network, workers, storage, etc) and if you consumed a function that had said annotation it would further encourage you to annotate that function as well. Something like:
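A sketch with hypothetical names (`persistUsername` is invented, and a `Map` stands in for `localStorage` so the snippet is self-contained):

```typescript
// Stand-in for the browser's localStorage, so this runs anywhere.
const storage = new Map<string, string>();

// The kind of function the imagined checker would flag: it performs a
// storage write, but nothing in its type signature says so.
function persistUsername(username: string): void {
  storage.set("username", username); // imagine: localStorage.setItem(...)
}

persistUsername("bob");
console.log(storage.get("username")); // bob
```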
^ compiler complains that persistUsername writes to localStorage but does not have the *localStorage IO annotation. While I feel like TypeScript does a great job of checking arguments and return values, there's still a whole class of issues that can crop up from things like unexpected DOM manipulation that would be useful to detect.
> Haskell is full of these little decisions where it just won't let you do something because it's not "correct" code, and they kind of don't care if that makes coding in it a fight against the compiler.
They care more about predictability and compositionality than about a novice's struggles. Professional programmers should prioritise those things. Certainly you can choose not to care about them for personal projects.
That said, Haskell could of course use plenty of ergonomic improvements, but the ones you describe are not among them.
> I think we saw a rise of the easy, anything-goes, screw-performance scripting languages. I think the next fashion seems to be in enforcing "correct" coding style. They all have their place.
This is the sane way of looking at it.
We noticed that a lot of tasks were not worth the trouble of ensuring correctness, and so dynamic languages took over.
But then a lot of systems scaled to a point where complexity was hard to manage. And those systems had huge economic impact, which made perf and correctness valuable again, especially since they now had a lower price of entry.
I do a lot of Python, usually with dynamic types and IO mixed in everywhere. It works surprisingly well for a ton of cases, and can scale quite far. But I recently wanted to build a plugin system that included a scriptable scenario of input collection, chained up to a rendering of some sort. This required completely disconnecting the scenario logic - controlled by the 3rd-party dev writing it - from the source of the input, and making the API contract very strict.
It was quite a pleasant experience, seeing that Python was capable of all that. Type hints work well now, and coroutines are exactly made for that use case. You can make your entire lib sans-I/O using coroutines as contracts. The automatic state saving and step-by-step execution is not inelegant.
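A rough sketch of that pattern (all names invented): the scenario logic is a plain generator that yields prompts and receives answers, never doing any I/O itself, so the host can drive it from a CLI, a GUI, or a test.

```python
def scenario():
    """Third-party scenario logic: pure, sans-I/O."""
    name = yield "What is your name?"
    color = yield f"Hi {name}, favorite color?"
    return f"{name} likes {color}"

def run_with(answers):
    """One possible host: feeds canned answers to the scenario."""
    gen = scenario()
    prompt = next(gen)                  # advance to the first prompt
    result = None
    try:
        for answer in answers:
            prompt = gen.send(answer)   # hand an answer in, get the next prompt
    except StopIteration as stop:
        result = stop.value             # the scenario's final return value
    return result

print(run_with(["Ada", "green"]))  # Ada likes green
```

The generator suspends at each `yield`, which is the "automatic state saving and step-by-step execution" mentioned above.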
But you can feel that it's been added on top of the original language design. It's not seamless. It's not shepherding you at all, you need to have discipline, and a deep understanding of the concepts. Which is the opposite of how it feels for other more core features of Python: well integrated, inviting you to do the right thing.
I'm hoping the technology will advance enough that we eventually get one language that can navigate both sides of the spectrum. Giving you the ease of Python/Ruby for scripting, data analysis, and agility on medium projects, but letting you transition progressively to Haskell/Rust safety nets and perf: strict typing + memory safety without GC + good I/O hygiene. Something that can scale up, and scale down.
Right now we always have to choose. I've looked at Swift, Go, V, Zig, Nim, various Lisps and JVM/.NET-based products. They always have those sweet spots, but also those blind spots. Which of course people who love the language often don't see (I know some people reading this comment will want to chime in with their favorite as a candidate; don't bother).
Now you could argue that we can't have it all: choose the right tool for the right job. But I disagree. I think we will eventually have it all. IT is a young field, and we are just at the beginning of what we can do.
Maybe as a transition, we will have a low-level language that also includes a high-level language runtime. Like a Rust platform with a Python implementation, or what V does with v-script. It won't be the perfect solution, but I'd certainly use something like that.
Instead of a reaction to scripting languages, or maybe in addition to, I think the current trends of shepherding languages are reacting to the flexibility of C and, even more so C++. C++ in particular is such a mind-boggling huge language. It presents so many choices that designing anything new involves searching a massive solution space. A task better left to experts.
Newbies (speaking from experience) need a framework to lean on. Something that provides a starting point for solving problems. Opinionated languages provide that out of the box.
> Both Rust and Haskell wrap values that can fail in little boxes, and to get them out you have to check which type of value it is, and in C# there's nothing stopping you from returning null and not telling anyone that you can return null, and just assuming people will check for null all the time.
F# is the .NET citizen that does the equivalent of the Rust or Haskell stuff.
Either you use an option type (https://fsharpforfunandprofit.com/posts/the-option-type/) which is an easy way of making a function that says 'user says to find a record with the name of Bob, and you will either get a return type of Some record(id:1,name:bob), or you will get a return type of None'
let GetThisRecord(name) =
    let result = SomeDatabaseLookup(name) // look it up once, not twice
    if result.IsSome then
        Some(record(result.Value)) // not idiomatic (a match would be) but works
    else
        None
Or you use the Success/Failure type (see railway oriented programming)
The Haskell situation sounds like generally a good thing, but I am not sure I would like it very much if this also applies to logging... It does not sound like great fun to have to change the signature of a function when it needs to log something, and then change it again if it no longer needs to.
> literally any function that has an IO action taints the value returned from that function, causing it to be an IO value, and trying to pass that IO value into another function makes the return type of that function IO, too. Parametric polymorphism is the default, too, so it also shepherds you into writing general purpose code. Haskell is full of these little decisions where it just won't let you do something because it's not "correct" code, and they kind of don't care if that makes coding in it a fight against the compiler.
From a Haskell perspective, and a correctness perspective (the same goes for Rust with its pointer tracking), all this makes sense. It's very helpful for correctness.
Yet, the IO monad "virality" reminds me of Java checked exceptions. Checked exceptions mean every function type signature includes the set of exceptions that function might throw.
When that was introduced, it was thought to be a good idea because it's part of the type-safety of Java and will ensure programmers write code that deals with exceptions correctly, one way or another.
But some years later, people started to argue that listing exceptions in the type signature is causing more software engineering problems than it solves (and C# designers took the decision to not include checked exceptions). Googling "checked exceptions harmful" yields plenty of essays on this.
For checked exceptions, there are people arguing both sides of it. Yet they are pretty much all fans of static typing for the rest of the language; it isn't an argument between people who favour static vs. dynamic typing.
So why are checked exceptions considered harmful by some? On the face of it, there's an argument against verbosity. But the deeper one is about software engineering. What I call "type brittleness".
When you have a large codebase, beautifully and carefully annotated with exact, detailed checked-exception signatures, then one day you have to add a trivial little something to one little function that might throw an exception not already in that function's signature... You may have to go through the large codebase, updating signatures on hundreds or thousands of functions which use the first little function indirectly.
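A minimal sketch of that ripple effect (all names invented):

```java
import java.io.IOException;

// Checked-exception "virality": once the leaf function declares
// IOException, every caller must either declare it too or handle it.
class Virality {
    // The "trivial little something": this leaf now throws a checked exception.
    static String readConfig() throws IOException {
        throw new IOException("config missing");
    }

    // Forced to change its signature just because readConfig changed.
    static String startup() throws IOException {
        return readConfig();
    }

    // The only way to stop the propagation: a catch block.
    static String safeStart() {
        try {
            return startup();
        } catch (IOException e) {
            return "caught: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(safeStart()); // caught: config missing
    }
}
```

In a real codebase the `startup`-style signature change repeats at every level between the leaf and wherever the exception is finally handled.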
And that's if you have the source. When you have libraries you can't change, you have to wrap and unwrap exceptions all over the place to allow them to propagate via libraries which call back into your own code. Sometimes there is no exception type explicitly allowed by the libraries, so you wrap and unwrap using Java's RuntimeException, the one which all functions allow.
The "viral effect" of so much effort for sometimes tiny changes is a brittleness issue. It leads people to resort to "catch and discard all" try-blocks, to confine the virality. Sometimes it's "temporary", but you know how it is with temporary things. Sometimes it isn't temporary, because the programmer can't find another clean way to do it while not modifying things they shouldn't or can't.
"Rust makes it quite hard to do things" generally as a result of that decision. Even just syntactically it's a large overhead. It does force you to explicitly manage lifetimes at every place in your code. Which is a good example of the wrong implementation of the wrong objective.
I agree with your assessment 100%. Does anyone else out there get frustrated with "bare hands" conventions? That's where you have to manually follow a verbose convention or write things like glue manually, when the compiler/runtime could do more of the heavy lifting automatically for us.
For example, say we want to hide low-level threading primitives due to their danger. So we implement a channel system like Go. But we run into a problem where copying data is expensive, so the compiler/runtime has an elaborate mechanism to pass everything by reference and verify that two threads don't try to write the same data. I'm glossing over details here, but basically we end up with Rust.
But what if we questioned our initial assumptions and borrowed techniques from other languages? So we decide to pass everything by value and use a mechanism like copy-on-write (COW) so that mutable data isn't actually copied until it's changed. Now we end up with something more like Clojure and state begins to look more like git under the hood. But novices can just be told that piping data between threads is a free operation unless it's mutated.
To me, the second approach has numerous advantages. I can't prove it mathematically, but my instincts and experience tell me that both approaches can be made to have nearly identical performance. So on a very basic level, I don't quite understand why Rust is a thing. And I look at tons of languages today and sense those fundamental code smells that nobody seems to talk about like boxing, not automatically converting for to foreach to higher level functions (by statically tracing side effects), making us manually write prototypes/headers, etc etc etc.
I really feel that if we could gather all of the best aspects of every language (for example the "having the system on hand" convenience of PHP, the vector processing of MATLAB, the "automagically convert this code to SIMD to run on the GPU" of Julia <- do I have this right?), then we could design a language that satisfies every instinct we have as developers (so that we almost don't need a manual) while at the same time giving us the formalism and performance of the more advanced languages like Haskell. What I'm trying to say is that I think that safe functional programming could be made to look nearly identical to Javascript, or even some of the spoken-language attempts like HyperTalk.
The handwaving around the bare hands stuff is what tires me out as a coder today because fundamentally I just don't view it as necessary. I really believe that there is always a better way, and that we can evolve towards that.
This is my main issue with C++. For a while my job was to get game engine codebases running, integrate tools and move on. So I saw a lot of big C++ codebases. Nearly every one had the same bad behaviors. Tons of globals. Configuring build options from code. Header mazes that made it clear people didn't actually know what code their classes needed.
I then worked for a while developing a fairly fresh C++ code base. The programmers I worked with were very willing to write maintainable code and follow a standard, and it was still really damn hard to keep up things like header hygiene.
When I go back to the language I can't believe how much time I spend dealing with minor issues that stem from the bad habits it builds. For years I would refuse to say any language was good or bad; I always insisted you use the right tool for the job. And there are some features of C++ that, when you need them, mean you have to use that language or maybe C in its place. But the shortcomings are unrelated to those use cases; the language's issues largely seem to come from a focus on backward compatibility. And so even used in its right application it seems incredibly flawed. I pretty much believe it's a bad language now.
Disclaimer: I learned to program with C++, I understand its power and for years I loved the language. I also understand there are situations where despite its shortcomings it is the right choice.
Why are globals considered bad? I'm seriously asking. I, too, have been told hundreds of times over the course of my career, and I never questioned it. I want to question it now, because I've never understood why people work SO HARD to remove and avoid globals. I seriously doubt that the time and effort I've seen spent on removing and avoiding globals has been time well spent. And I'm quite sure that the effort spent on that is not comparable to the amount of problems prevented by not having globals. There's just no way globals can be dangerous enough to justify the size of globals-cleansing efforts I've seen.
Game development often has a very large global state, and game problems are often inherently global state manipulation problems; you need globals in order to even have the game in many cases.
I write or deal with a lot of C, unfortunately. I try very hard to not have global variables, and to minimize sharing between threads. When I pick up a C codebase, one of the first things I do is build it and inspect the object files to see what globals exist. The same can be done in C++, and should be. Use inheritance sparingly. Don't use exceptions if at all possible. Use modern C++ as much as possible, and borrow ideas from Haskell/Rust as much as possible. I'm thinking of https://stackoverflow.com/questions/9692630/implementing-has...
Game development is a rapid prototyping adventure, fuelled by the fact that what you are producing is ultimately a form of art. Architectures are based on abstraction, and abstraction is ultimately mindful ignorance; in this case, of specific requirements or specific goals which are going to change because you are creating art. You are going to find out as you continue to develop that technical debt builds, because the changing requirements create conflicting workflows, which is why you get spam in the header. It's a lot faster to prototype something through duplication, cobbling, or refactoring, and then later on use automation to remove the chunks of code that are not used and reduce line count by creating utility functions, because at that point part of the project is set in stone and the project is going in one direction. Things will gyrate back and forth between messy and clean, and hopefully you have the budget to refactor to clean before you ship, as modders don't like dirty game code.
Games are a simulacrum of reality, and reality doesn't say properties of two different objects can never, ever interact with each other; that's why you have the abuse of global variables to store state, and also why there's a rich speedrunning community using all sorts of hacks in games to speed up their playtime due to unforeseen edge cases. If you build a model of reality, you're going to be doing R&D learning how it interacts with itself, just like we do today!
Nobody wants to play a game with a static workflow.
What stands out is how apologetic you are for pointing out that a language might be worse (gasp!) than another language. When did this "all languages are roughly equal, and if you say anything else you're a zealot" ideology get so widely entrenched in our industry?
> I also understand there are situations where despite its shortcomings it is the right choice.
Would you say the reasons for choosing it are not inherent to the language itself, but due to things like experience of the team, availability of libraries/ecosystem, and the need for mature/fast compilers?
That's a great metaphor for language smells! Some more anecdotes:
- Python shepherds you into using list comprehensions, even when it's almost always premature optimization and much harder to read than a loop. As a language smell that's not bad, it's just the worst I could think of in a post-v2 world. Luckily there's `black`, `flake8`, `isort` and `mypy`.
- Bash shepherds you into using `ls` and backticks, useless `cat`s, treating all text as ASCII, and premature portability in the form of "POSIX-ish". Luckily `shellcheck` takes care of several common issues.
- Lisp shepherds you into building your own language.
There's also tool shepherding:
- IDEA shepherds you into refactoring all the time, since it's the only IDE which does this anywhere near reliably enough. (At least in Java. In other languages renaming something with a common name is almost guaranteed to also rename unrelated stuff.)
- Firefox shepherds you into using privacy extensions.
- Chrome shepherds you into using Google extensions.
- Android shepherds you into installing shedloads of apps you hardly ever use.
- *nixes other than Mac OS shepherd you into using the shell and distrusting software by default.
- Windows and Mac OS shepherd you into using GUIs for everything and trusting software by default.
It's related to, but definitely not the same as, linguistic relativism. Programming "language" might be a bit of a misnomer, because it creates a false equivalency to natural language. Just as different subfields of mathematics were created to solve different problems, so too were different programming "languages" inspired by different subfields and their notations. With that view, it's unsurprising that some ways of doing things highlight certain methods of solving problems and obscure or impede others.
Working in a Java/Kotlin environment, everyone always handles all null cases when working in Kotlin, but they are frequently overlooked in the Java applications. Many of the Java apps compensate with more levels of catch-all exception handlers targeting unexpected NPEs. The only time we get NPEs in Kotlin is when Kotlin allows them because of the Kotlin/Java interop problem.
Working with Javascript/Typescript, we need to rely on linters to enforce safe practices in Javascript.
Something I like about Rust is it shepherds you toward fast-running programs and away from null pointer errors.
Something I like about Go is it shepherds you to write code any other Go programmer can follow easily.
Something I dislike about C# is that it has the tools to let you write very, very fast code, but shepherds you toward non-devirtualized interfaces over heap-allocated classes tied together with LINQ overhead.
> Something I like about Go is it shepherds you to write code any other Go programmer can follow easily
Sure, the syntax and indentation levels are all the same, but those aren't really the difficult parts of programming. The difficulty comes from abstractions, indirections and other abstract things that Go, just as any language, lets you do however you want.
There are of course codebases in Go where the indirections make no sense and are hard to follow, just as in any language.
What Go shepherds you into is writing really verbose code (well, compared to most languages except Java, I guess), where everything is explicit unless hidden by indirection. This is both a blessing and a curse.
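For instance, the explicitness looks like this (a minimal invented example):

```go
package main

import (
	"fmt"
	"strconv"
)

// Every fallible step is spelled out at the call site:
// verbose, but nothing is hidden.
func parseAndDouble(s string) (int, error) {
	n, err := strconv.Atoi(s)
	if err != nil {
		return 0, fmt.Errorf("parsing %q: %w", s, err)
	}
	return n * 2, nil
}

func main() {
	v, err := parseAndDouble("21")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(v) // 42
}
```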
To allocate things on the stack you have to either use only value types or use unsafe code. Which is fine for small performance-critical sections, but will introduce bugs and hinder productivity if used across large code bases.
I spend a fair amount of time in C# and don’t think about performance a lot unless it’s obvious, O(N^2) type of stuff. I’m always trying to level up so I would appreciate some tips.
What tooling are you referring to that will make C# really fast?
Also, what are you referring to with non-devirtualized interfaces vs heap classes with LINQ?
It is the language plus the community. And not just the language.
As an example, there is nothing about Ruby that makes it more or less prone to monkey-patching than many other dynamic languages. But once a certain number of popular frameworks did that, there was no getting away from that. (Rails even has a convention around where you put your monkey patches.)
> there is nothing about Ruby that makes it more or less prone to monkey-patching than many other dynamic languages.
Python disallows making changes to fundamental types like `int` and `list`. It’s not possible for a Python framework to support something like Rails’ `2.days.ago`.
Interestingly, I don’t think this was an explicit decision made when designing Python - it’s just a side effect of the built-in types being written in C rather than in Python itself.
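A quick illustration (the exact error message varies by Python version):

```python
# Trying the Rails-style patch on a built-in type simply fails:
try:
    int.days = property(lambda self: self * 86_400)
except TypeError as err:
    print("rejected:", err)

# A pure-Python class, by contrast, accepts the same patch happily.
class Duration(int):
    pass

Duration.days = property(lambda self: self * 86_400)
print(Duration(2).days)  # 172800
```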
For anyone (like me) who doesn't know what monkey patching is, wikipedia says it is "dynamic modifications of a class or module at runtime, motivated by the intent to patch existing third-party code as a workaround to a bug or feature which does not act as desired"
I like the concept and I particularly like the way I feel nim shepherds me:
* I very rarely need to come up with a name for a function or other identifier. The correct name can be reused for multiple use cases thanks to the type system and proc overloading.
* to spend a little time designing the interface before jumping in the code
* but also to think what I really need to accomplish and get to it instead of building a grandiose architecture
* to have consistent apis
* to steer away from OOP
* to rely on minimal dependencies and to be kind of minimal in general
* to use the correct tool for the problem (macro are not easy to write and that’s good otherwise you will abuse them. Instead they are great to use)
* to build maintainable code
* ...
I would be interested in what other nimmers think are good examples of shepherding.
One might also think of bad shepherding in nim, although nothing comes to mind at the moment.
When I program in C++, there are lots of things one must consider. Should this be const, public/private, virtual, should I create a class, should I first create an abstract base class, should I create a factory, should I implement the PIMPL idiom, should this be a template function. The list of concerns is nearly endless. When I write in Python I tend to mainly think about solving my problem. In C++ I will naturally think more about performance and in Python that concern comes only if something seems slow. I make no claims about which is better, just that the language definitely affects me and the approach I take.
This also can change over time. For example, 15 years ago PHP shepherded you to include every file you were using explicitly, making it hard to reason about a given project if you weren’t the creator.
A big effort ensued to change that — class autoloading became the standard, and a large community arose around that standard.
Similarly, JavaScript shepherded you towards some bad practices that the community has now found ample remedies for.
Yes! Another example of this is C# and F#. Both build on the same .NET base, and both are Turing-complete languages with elements of OOP and FP, but wow, the code that those communities write is completely different. They each steer you in different directions.
Title reminds me of "guide you to the pit of success" (ie, a slippery slope w a positive ending), which IIRC I first encountered in a post by Zeit cofounder G Rauch, writing about NextJS.
" Yet, in practice, many Perl scripts do XML (and HTML) manipulation with regexes, which is brittle and "wrong" for lack of a better term. This is a clear case of shepherding. Text manipulation in Perl is easy. Importing, calling and using an XML parser is not."
Importing, calling, and using an XML parser is straightforward in Perl. The same for HTML: think about what you want to process, what you want to ignore, and write your callbacks accordingly.
Perl coding was a significant portion of my career, and much of that involved web services, involving XML. I've never come across code that attempted to parse XML with regexes in Perl. It was always done with easy-to-import-and-call XML DOM or SAX parsers.
PhilippGille | 6 years ago
You mean there was nothing?
https://docs.microsoft.com/en-us/dotnet/csharp/nullable-refe...
gowld | 6 years ago
Paying that IO/do/return syntax tax (sin-tax? syn-tax?) is still cheaper than the signature/return boilerplate in competing compiled languages like C and Java. Haskell invites you to avoid that syn-tax by writing principled IO.
One of the major complaints about Haskell is that it is so expressive and powerful that there are many incompatible ways of architecting modules (see the incompatibilities between implementations of Monad Transformers / Effects, Lens, etc.).
Rails is perhaps the original "opinionated" system. https://guides.rubyonrails.org/getting_started.html
7thaccount | 6 years ago
seangrogg | 6 years ago
naasking | 6 years ago
They care more about predictability and compositionality than about a novice's struggles. Professional programmers should prioritise those things. Of course, you can choose not to care about them for personal projects.
That said, Haskell could of course use plenty of ergonomic improvements, but the ones you describe are not among them.
BiteCode_dev | 6 years ago
This is the sane way of looking at it.
We noticed that a lot of tasks were not worth the trouble of ensuring correctness, and so dynamic languages took over.
But then a lot of systems scaled to a point where complexity was hard to manage. And those systems had huge economic impact. That made performance and correctness valuable again, especially since they now had a lower price of entry.
I do a lot of Python, usually with dynamic types and IO mixed in everywhere. It works surprisingly well for a ton of cases, and can scale quite far. But I recently wanted to build a system providing a plugin mechanism that included a scriptable scenario of input collection, chained up to a rendering of some sort. This required completely disconnecting the logic of the scenario - controlled by the 3rd-party dev writing it - from the source of the input, and making the API contract very strict.
It was quite a pleasant experience, seeing that Python was capable of all that. Type hints work well now, and coroutines are exactly made for that use case. You can make your entire lib sans-I/O using coroutines as contracts. The automatic state saving and step-by-step execution is not inelegant.
But you can feel that it's been added on top of the original language design. It's not seamless. It's not shepherding you at all, you need to have discipline, and a deep understanding of the concepts. Which is the opposite of how it feels for other more core features of Python: well integrated, inviting you to do the right thing.
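The sans-I/O contract described above can be sketched with a plain generator; the names (`scenario`, `run`) are invented for illustration:

```python
def scenario():
    """Scenario logic: pure question/answer flow, no I/O of its own."""
    name = yield "What is your name?"
    color = yield f"Hi {name}, favourite colour?"
    return f"{name} likes {color}"

def run(gen, answers):
    """Driver: owns all the I/O. Here the 'I/O' is a canned list of answers,
    but it could just as well be stdin, a GUI, or a network socket."""
    prompt = next(gen)              # advance to the first question
    for answer in answers:
        print(prompt)               # the only place output happens
        try:
            prompt = gen.send(answer)
        except StopIteration as done:
            return done.value

print(run(scenario(), ["Ada", "blue"]))  # Ada likes blue
```

Because the scenario never performs I/O itself, the same logic can be driven by tests, a CLI, or a renderer without changes, which is the whole point of the pattern.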
I'm hoping the technology will advance enough that we eventually get one language that can navigate both sides of the spectrum. Giving you the ease of Python/Ruby scripting, data analysis, and agility for medium projects, but letting you transition progressively to Haskell/Rust safety nets and performance: strict typing, memory safety without GC, and good I/O hygiene. Something that can scale up, and scale down.
Right now we always have to choose. I've looked at Swift, Go, V, Zig, Nim, various Lisps, and JVM/.NET-based products. They all have their sweet spots, but also their blind spots. Which, of course, people who love the language often don't see (I know some people reading this comment will want to chime in with their favorite as a candidate; don't bother).
Now you could argue that we can't have it all: choose the right tool for the right job. But I disagree. I think we will eventually have it all. IT is a young field, and we are just at the beginning of what we can do.
Maybe as a transition, we will have a low-level language that also includes a high-level language runtime. Like a Rust platform with a Python implementation, or what V does with v-script. It won't be the perfect solution, but I'd certainly use something like that.
intrepidhero | 6 years ago
Newbies (speaking from experience) need a framework to lean on. Something that provides a starting point for solving problems. Opinionated languages provide that out of the box.
myu701 | 6 years ago
F# is the .NET citizen that does the equivalent of the Rust or Haskell stuff.
Either you use an option type (https://fsharpforfunandprofit.com/posts/the-option-type/), which is an easy way of making a function that says 'the user asks to find a record with the name Bob, and you will either get a return value of Some record(id:1, name:bob), or you will get None'.
Or you use the Success/Failure type (see railway-oriented programming).
cjfd | 6 years ago
jlokier | 6 years ago
From a Haskell perspective, and a correctness perspective, and also Rust with its pointer tracking, all this makes sense. It's very helpful for correctness.
Yet, the IO monad "virality" reminds me of Java checked exceptions. Checked exceptions mean every function type signature includes the set of exceptions that function might throw.
When that was introduced, it was thought to be a good idea because it's part of the type-safety of Java and will ensure programmers write code that deals with exceptions correctly, one way or another.
But some years later, people started to argue that listing exceptions in the type signature is causing more software engineering problems than it solves (and C# designers took the decision to not include checked exceptions). Googling "checked exceptions harmful" yields plenty of essays on this.
For checked exceptions, there are people arguing both sides of it. Yet they are pretty much all fans of static typing for the rest of the language; it isn't an argument between people who favour static vs. dynamic typing.
So why are checked exceptions considered harmful by some? On the face of it, there's an argument against verbosity. But the deeper one is about software engineering. What I call "type brittleness".
When you have a large codebase, beautifully and carefully annotated with exact, detailed checked-exception signatures, then one day you have to add a trivial little something to one little function that might throw an exception not already in that function's signature... You may have to go through the large codebase, updating signatures on hundreds or thousands of functions which use the first little function indirectly.
And that's if you have the source. When you have libraries you can't change, you have to wrap and unwrap exceptions all over the place to allow them to propagate via libraries which call back into your own code. Sometimes there is no exception type explicitly allowed by the libraries, so you wrap and unwrap using Java's RuntimeException, the one which all functions allow.
The "viral effect" of so much effort for sometimes tiny changes is a brittleness issue. It leads people to resort to "catch and discard all" try-blocks to confine the virality. Sometimes it's "temporary", but you know how it is with temporary things. Sometimes it isn't temporary, because the programmer can't find another clean way to do it without modifying things they shouldn't or can't.
yarrel | 6 years ago
zackmorris | 6 years ago
For example, say we want to hide low-level threading primitives due to their danger. So we implement a channel system like Go. But we run into a problem where copying data is expensive, so the compiler/runtime has an elaborate mechanism to pass everything by reference and verify that two threads don't try to write the same data. I'm glossing over details here, but basically we end up with Rust.
But what if we questioned our initial assumptions and borrowed techniques from other languages? So we decide to pass everything by value and use a mechanism like copy-on-write (COW) so that mutable data isn't actually copied until it's changed. Now we end up with something more like Clojure and state begins to look more like git under the hood. But novices can just be told that piping data between threads is a free operation unless it's mutated.
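That persistent-value idea can be sketched in a few lines of Python. Note this toy copies the whole dict on each update; it does not do Clojure-style structural sharing, it only shows the semantics:

```python
from types import MappingProxyType

def update(state, key, value):
    """'Mutation' builds a new version; the old one is untouched, so any
    other thread holding a reference to it can never see a change."""
    nxt = dict(state)
    nxt[key] = value
    return MappingProxyType(nxt)  # read-only view: no in-place writes

v1 = MappingProxyType({"hp": 100, "x": 0})
v2 = update(v1, "x", 5)

# Both versions coexist, a bit like commits in git:
print(v1["x"], v2["x"])  # 0 5
```

A real implementation would share unchanged structure between versions so the copy is cheap, which is what makes the "piping data between threads is free unless mutated" story plausible.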
To me, the second approach has numerous advantages. I can't prove it mathematically, but my instincts and experience tell me that both approaches can be made to have nearly identical performance. So on a very basic level, I don't quite understand why Rust is a thing. And I look at tons of languages today and sense those fundamental code smells that nobody seems to talk about: boxing, not automatically converting `for` to `foreach` to higher-level functions (by statically tracing side effects), making us manually write prototypes/headers, etc.
I really feel that if we could gather all of the best aspects of every language (for example the "having the system on hand" convenience of PHP, the vector processing of MATLAB, the "automagically convert this code to SIMD to run on the GPU" of Julia <- do I have this right?), then we could design a language that satisfies every instinct we have as developers (so that we almost don't need a manual) while at the same time giving us the formalism and performance of the more advanced languages like Haskell. What I'm trying to say is that I think that safe functional programming could be made to look nearly identical to Javascript, or even some of the spoken-language attempts like HyperTalk.
The handwaving around the bare hands stuff is what tires me out as a coder today because fundamentally I just don't view it as necessary. I really believe that there is always a better way, and that we can evolve towards that.
DubiousPusher | 6 years ago
I then worked for a while developing a fairly fresh C++ code base. The programmers I worked with were very willing to write maintainable code and follow a standard, and it was still really damn hard to keep up things like header hygiene.
When I go back to the language I can't believe how much time I spend dealing with minor issues that stem from the bad habits it builds. For years I refused to say any language was good or bad; I always insisted you use the right tool for the job. And there are some features of C++ where, when you need them, you have to use that language (or maybe C) in its place. But those strengths are unrelated to the language's issues, which largely seem to come from a focus on backward compatibility. And so even used in its right application it seems incredibly flawed. And I pretty much believe it's a bad language now.
Disclaimer: I learned to program with C++, I understand its power and for years I loved the language. I also understand there are situations where despite its shortcomings it is the right choice.
naikrovek | 6 years ago
Game development often has a very large global state, and game problems are often inherently global state manipulation problems; you need globals in order to even have the game in many cases.
cryptonector | 6 years ago
TheBobinator | 6 years ago
Games are a simulacrum of reality, and reality doesn't say properties of two different objects can never, ever interact with each other; that's why you have the abuse of global variables to store state, and also why there's a rich speedrunning community using all sorts of hacks in games to speed up their playtime due to unforeseen edge cases. If you build a model of reality, you're going to be doing R&D learning how it interacts with itself, just like we do today!
Nobody wants to play a game with a static workflow.
joelfolksy | 6 years ago
cies | 6 years ago
Would you say the reasons for choosing it are not inherent to the language itself but come down to things like experience of the team, availability of libraries/ecosystem, and the need for mature/fast compilers?
l0b0 | 6 years ago
- Python shepherds you into using list comprehensions, even when it's almost always premature optimization and much harder to read than a loop. As a language smell that's not bad, it's just the worst I could think of in a post-v2 world. Luckily there's `black`, `flake8`, `isort` and `mypy`.
- Bash shepherds you into using `ls` and backticks, useless `cat`s, treating all text as ASCII, and premature portability in the form of "POSIX-ish". Luckily `shellcheck` takes care of several common issues.
- Lisp shepherds you into building your own language.
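For the Python point in the list above, the two forms side by side (a trivial, made-up example):

```python
words = ["shepherd", "nudge", "steer"]

# The comprehension Python nudges you toward:
lengths = [len(w) for w in words if w.startswith("s")]

# The equivalent loop, which can be easier to read once the logic grows:
lengths_loop = []
for w in words:
    if w.startswith("s"):
        lengths_loop.append(len(w))

assert lengths == lengths_loop == [8, 5]
```

At this size the comprehension is fine; the readability complaint mostly bites once the expression, condition, or nesting grows.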
There's also tool shepherding:
- IDEA shepherds you into refactoring all the time, since it's the only IDE which does this anywhere near reliably enough. (At least in Java. In other languages renaming something with a common name is almost guaranteed to also rename unrelated stuff.)
- Firefox shepherds you into using privacy extensions.
- Chrome shepherds you into using Google extensions.
- Android shepherds you into installing shedloads of apps you hardly ever use.
- *nixes other than Mac OS shepherd you into using the shell and distrusting software by default.
- Windows and Mac OS shepherd you into using GUIs for everything and trusting software by default.
savolai | 6 years ago
This sounds like less active guidance than nudging or shepherding. Creating affordances is still an active design choice though.
https://en.wikipedia.org/wiki/Affordance
http://johnnyholland.org/2010/04/perceived-affordances-and-d...
fatso784 | 6 years ago
It's related to, but definitely not the same as, linguistic relativism. Programming "language" might be a bit of a misnomer, because it creates a false equivalency to natural language. Just as different subfields of mathematics were created to solve different problems, so too were different programming "languages" inspired by different subfields and their notations. With that view, it's unsurprising that some ways of doing things highlight certain methods of solving problems and obscure or impede others.
fileyfood500 | 6 years ago
Working with JavaScript/TypeScript, we need to rely on linters to enforce safe practices.
gameswithgo | 6 years ago
something i like about go is it shepherds you to write code any other go programmer can follow easily
something i dislike about c# is it has the tools to let you write very very fast code but shepherds you to use non-devirtualized interfaces over heap-allocated classes tied together with linq overhead.
capableweb | 6 years ago
Sure, the syntax and indentation levels are all the same, but that's not really the difficult part of programming. The difficulty comes from abstractions, indirections and other abstract things that Go, just as any language, lets you do however you want.
There are of course codebases made in Go where the indirections makes no sense and are hard to follow, just as in any language.
What Go shepherds you into is writing really verbose code (well, compared to most languages except Java, I guess), where everything is explicit unless hidden by indirection. This is both a blessing and a curse.
pjmlp | 6 years ago
DeathArrow | 6 years ago
elamje | 6 years ago
What tooling are you referring to that will make C# really fast?
Also, what are you referring to with non-devirtualized interfaces vs heap classes with LINQ?
btilly | 6 years ago
As an example, there is nothing about Ruby that makes it more or less prone to monkey-patching than many other dynamic languages. But once a certain number of popular frameworks did that, there was no getting away from that. (Rails even has a convention around where you put your monkey patches.)
pansa2 | 6 years ago
Python disallows making changes to fundamental types like `int` and `list`. It’s not possible for a Python framework to support something like Rails’ `2.days.ago`.
Interestingly, I don’t think this was an explicit decision made when designing Python - it’s just a side effect of the built-in types being written in C rather than in Python itself.
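That restriction is easy to demonstrate; the Rails-style `days` helper below is a made-up example for illustration:

```python
# Trying to reopen the built-in int type, Rails-style, fails in Python:
try:
    int.days = property(lambda self: self * 86400)
except TypeError as err:
    print("refused:", err)  # attributes of built-in types are read-only

# A subclass of your own is fine, though:
class Days(int):
    @property
    def days(self):
        return self * 86400  # seconds, as a stand-in for a real duration

assert Days(2).days == 172800
```

Ruby, by contrast, lets ActiveSupport reopen `Integer` directly, which is exactly how `2.days.ago` works.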
chipperyman573 | 6 years ago
https://en.wikipedia.org/wiki/Monkey_patch
joelbluminator | 6 years ago
I really like 2.days.ago
There were zero times where I wished it wouldn't do that.
But to each his own.
unknown | 6 years ago
[deleted]
pietroppeter | 6 years ago
* I very rarely need to come up with a name for a function or other identifier. The correct name can be reused for multiple use cases thanks to the type system and proc overloading.
* To spend a little time designing the interface before jumping into the code.
* But also to think about what I really need to accomplish and get to it, instead of building a grandiose architecture.
* To have consistent APIs.
* To steer away from OOP.
* To rely on minimal dependencies, and to be kind of minimal in general.
* To use the correct tool for the problem (macros are not easy to write, and that's good, otherwise you would abuse them. Instead they are great to use).
* To build maintainable code.
* ...
I would be interested in what other nimmers think is good shepherding.
One might also think of bad shepherding in Nim, although nothing comes to mind at the moment.
fsloth | 6 years ago
You might enjoy his Meson build system [0], which has a full manual [1]. It's already used in a bunch of high-visibility projects [2].
[0] https://mesonbuild.com/
[1] https://meson-manual.com/
[2] https://mesonbuild.com/Users.html
jbritton | 6 years ago
muglug | 6 years ago
A big effort ensued to change that — class autoloading became the standard, and a large community arose around that standard.
Similarly, JavaScript shepherded you towards some bad practices that the community has now found ample remedies for.
rienbdj | 6 years ago
drewm1980 | 6 years ago
C, C++, and Rust shepherd the programmer toward an array-of-structures memory layout (though it's usually not great for vectorization).
JoeCamel | 6 years ago
Random_ernest | 6 years ago
He calls this restriction by paradigms.
LoveMortuus | 6 years ago
My guess would be that it is and that shepherding is what we talk about when we say that we can learn from every aspect of life.
If what I wrote is correct, then shepherding is the teacher of reality. But I guess it's on us to decide when we've learned enough and move on.
I was at first wanting to ask if shepherding is present in video games as well, but then I realized what shepherding could actually be.
SHEPHERDING; The part of an aspect that _can_ teach you something.
_can_, because it's up to you to decide if you'll learn anything.
Is shepherding always negative or can it be positive?
Also, if we would always strive to fix what shepherding teaches us, would that mean that in infinite amount of time we would reach perfection?
And, last question I swear, is shepherding subjective or objective. Or is it both?
kkdaemas | 6 years ago
chrisweekly | 6 years ago
cafard | 6 years ago
Importing, calling, and using an XML parser is straightforward in Perl. The same for HTML: think about what you want to process, what you want to ignore, and write your callbacks accordingly.
fmakunbound | 6 years ago
Perl coding was a significant portion of my career, and much of that involved web services, involving XML. I've never come across code that attempted to parse XML with regexes in Perl. It was always done with easy-to-import-and-call XML DOM or SAX parsers.
djyde | 6 years ago
collyw | 6 years ago