throwaway89988 | 2 years ago
Scala failed hard because it is a kitchen sink of every half-baked feature someone wanted to write a PhD thesis about. The interdependency graphs of the standard library are an excellent example of a totally insane design, and the Scala data structures have been at least an order of magnitude slower than the JVM-native ones. Don't get me started about the tooling, which is too slow for any real-world project. (The only people I regularly see using Emacs are Scala developers, because opening a project in an IDE like IDEA could take up to 30 min on a high-end workstation.)
Wirth valued clean design, speed and simplicity. Odersky wants to compete with C++ for complexity. As the article stated, for Wirth a feature had to pay for itself in terms of complexity, speed and usability. If someone can demonstrate how Scala's features pay for themselves, I would appreciate a pointer.
Finally... generics in Java are a shit show, thanks to type erasure. Fair enough, Odersky was probably forced into this implementation by backwards-compatibility constraints; still, nothing to be proud of.
kagakuninja | 2 years ago
I've used Scala professionally for the last 8 years. The tooling is fine, assuming you avoid exotic libraries like Shapeless. Compile times are fast enough that I don't think about them much, using IntelliJ incremental compilation. A full recompile of a microservice might take 30 seconds; whatever it is, it's not a big deal IMO.
> opening projects in an IDE like IDEA could take up to 30 min on high end workstations
This is insanely wrong. Maybe you are operating on experiences from 10 years ago? I have a new M2 MacBook Pro, and opening projects is quite fast. The first time you do it, it will resolve the SBT dependencies and index files. That can be done in the background, and is probably less than a minute for a typical project. Even my previous laptop could open projects quickly despite being 3 years old.
> the Scala data structures have been at least an order of magnitude slower than the JVM native ones
They are slower, but not an order of magnitude slower. Perhaps you are remembering the infamous email from the Yammer CTO that got leaked, but that was a long time ago, and the compiler and libraries have improved greatly since then.
But yes, in performance critical code, you can just switch to an imperative style and use Java collections.
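For example, a hot loop in that style (made-up method, purely illustrative) is just a `while` loop over a `java.util.ArrayList` instead of a functional pipeline over Scala collections:

```scala
import java.util.ArrayList

object HotLoop {
  // Imperative style with a Java collection for a performance-critical path:
  // preallocated capacity, a while loop, and no intermediate collections.
  def squares(n: Int): ArrayList[Int] = {
    val out = new ArrayList[Int](n)
    var i = 0
    while (i < n) {
      out.add(i * i)
      i += 1
    }
    out
  }

  def main(args: Array[String]): Unit = {
    println(squares(5)) // prints [0, 1, 4, 9, 16]
  }
}
```

The point is that Scala doesn't force the functional style on you; you can drop down to plain Java-shaped code in the few places a profiler says it matters.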
I don't have time to explain the benefits of Scala's features, but I'll just point out that, other than implicits, many of Scala's ML-inspired features have made their way into modern languages, including Java, C#, Rust and Swift. Scala didn't invent those ideas, but it repackaged them in a novel way.
throwaway89988 | 2 years ago
Concerning your type erasure example: the id function obviously doesn't need to know anything about types. In the real world, types have traits/interfaces/expected attributes etc., and type erasure prevents the compiler from verifying this when using binary dependencies, which is obviously _not_ what one wants in a statically compiled language.
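A minimal sketch of what erasure throws away (illustrative names, not from the thread): at runtime the JVM only sees the raw `List`, so the element type in a pattern can't actually be checked.

```scala
object ErasureDemo {
  // Returns true iff the runtime check "is xs a List[String]?" succeeds.
  // Because of JVM type erasure, the String part is eliminated and cannot
  // be checked, so this returns true for ANY List (the compiler emits an
  // "unchecked" warning here for exactly that reason).
  def looksLikeStrings(xs: Any): Boolean = xs match {
    case _: List[String] => true
    case _               => false
  }

  def main(args: Array[String]): Unit = {
    println(looksLikeStrings(List("a", "b"))) // true, as expected
    println(looksLikeStrings(List(1, 2, 3)))  // also true: erasure at work
  }
}
```

So the type check the source code appears to express silently degrades to a check against the erased type, which is the kind of hole being complained about.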
dragosiulian | 2 years ago
Having been a PhD student in his lab between 2005-2010, I can attest that's not true. Language design was definitely Martin's prerogative. As a PhD student, a lot of time went into implementation efforts, and while several features did end up in PhD theses, the last word on what goes in the language was Martin's.
Scala was ultimately a research project, with ambitions to succeed outside of academia. As such, some ideas worked better than others (pattern matching and case classes vs specialization). As it became clear that the language was picking up in the industry (led by Twitter around 2008), many of these experiments moved into compiler plugins, macro libraries or research forks.
> Sorry, but Odersky is nothing like Wirth.
I haven't known Wirth to venture into such broad statements, but given what's been written about the way he led his lab, I would say there were many similarities: a strong bias toward solid implementation work, the (bootstrapping) compiler as a litmus test for features and performance work. And yes, a focus on simplicity.
Unfortunately, simplicity is often confused with familiarity. What's simpler: having statements and expressions as separate concepts that don't mix well, or only expressions? Most people coming from C and Java have internalized the dichotomy and find it (or at least, back in the day, found it) "complex" to think of every expression as having a type ('void' vs 'Unit'). The same goes for the split between primitive and reference types vs a unified type hierarchy.
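A quick illustration of that unification (made-up snippet): in Scala there are no statements, only expressions, so `if/else` yields a value directly, and even a side-effecting call like `println` is an expression of type `Unit` rather than a `void` non-value.

```scala
object ExprDemo {
  // if/else is an expression with a value, not a statement:
  // no temporary var + assignment dance as in C or old Java.
  def grade(score: Int): String = if (score >= 60) "pass" else "fail"

  def main(args: Array[String]): Unit = {
    val g = grade(87)
    // Side-effecting calls are expressions too; their type is Unit, not void,
    // so they compose uniformly with everything else.
    val logged: Unit = println(g)
    assert(g == "pass")
  }
}
```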
This is not to say Scala doesn't have its warts, and implicits (in particular, implicit resolution rules) combined with macros could lead to a lot of pain. Hopefully there are lessons learned there and Scala 3 is better.
> Scala failed hard
Far from obvious. Databricks alone probably has upwards of 10 MLOC of Scala and seems to be doing very well (https://www.lihaoyi.com/post/TheDeathofHypeWhatsNextforScala...). There are plenty of other examples.
Could Scala have been more successful? Undoubtedly. But it's far from a "hard failure". New languages have adopted many Scala features, so nowadays Scala is believed to pay for itself only when using pure-FP libraries. That's very unfortunate if you ask me, since I believe there's a pragmatic sweet spot around the style best illustrated by Haoyi Li's ecosystem of libraries.
throwaway89988 | 2 years ago
Let me elaborate: Scala failed hard in the sense that it was IMHO a far superior language compared to Java around 2010 (! JDK 7/8 times) and is basically dead now for new projects (unless there are some die-hard Scala fanatics on a team, and even they are moving on to greener pastures); also see how Kotlin is succeeding everywhere at the moment.
What I totally don't get is that Scala's failure was _not_ surprising at all; there were a lot of kind people giving constructive feedback on why Scala was failing in industry (example: https://gist.github.com/alexo/1406271):
- Slow compile times
- No binary compatibility, even between minor updates
- Every feature under the sun was stuffed into Scala, making it impossible to hand projects over to 'industry programmers' without extensive training
- Tooling support (like IDEs) was extremely lacking/slow/bad
- Not to speak of the community infights about the right way(TM) to approach a problem
Personal experience from me: Scala was too slow/cumbersome to use, with subpar tooling. And I consider myself part of the target group: in love with FP but forced to deploy on the JVM. Beyond my own experience, I saw teams of Scala developers fail to materialize any significant benefit over 'dumber' programming languages in real-world projects, not even speaking of handing Scala projects over to non-academic, non-'elite' teams.
I like some ideas in Scala 3, and IMHO it is sad that Kotlin (which is IMHO just syntactic sugar over Java) gets so much attention, but in the end Scala had plenty of years to fix its problems, and its failure comes as no surprise because there was plenty of feedback. Are there still some Scala projects around? Yes, mostly Scala 2, because, surprise, libraries still don't have binary compatibility etc. For Scala 3 I have seen neither industry adoption nor any enthusiasm from a wider community.