Switched to D 4 years ago, and have never looked back. I wager that you can sit a C++/Java/C# veteran down and say: write some D code, here's the manual, have fun. Within a few hours they will be comfortable with the language and be fairly competent D programmers. There's very little FUD surrounding switching to yet another language with D.
D's only issue is that it does not have general adoption, which I'm willing to assert is only because it's not at the forefront of the cool kids' language of the week. Which is a good thing. New does not always mean improved. D has a historical nod to languages of the past: it tries to improve on the strengths of C/C++, smooth out the rough edges, and adopt more modern programming concepts. Especially with trying to be ABI compatible, it's a passing of the torch from the old guard to the new.
Regardless of your thoughts on D, my opinion is I'm sold on it; it's here to stay. In 10 years D will still be in use, whereas the fad languages will just be footnotes in computer science history: nice experiments that brought in new ideas, but just too far out on the fringes, limiting themselves to the "thing/fad" of that language.
I think a lot of languages that get popular have to have a thing. Like a thing they do well, and hopefully change the world a little...
Python had math and has lots more maths now.
R also has math and started to kill Python, but now Python has Tensorflow and PyTorch.
Scala has Spark.
Java has Tomcat, and everything that followed, which is probably 20% of the world's mass by volume.
Go has Docker.
Ruby has/had a railroad or something?
JS has, well, f... I don't know where to start here.
> D's only issue is that it does not have general adoption, which I'm willing to assert is only because it's not on the forefront of the cool kids language of the week. Which is a good thing.
The reference compiler not being open source makes a language not cool with the cool kids. (Not that I'm blaming anyone).
I'm not sure that it's "going to stay", as you say, but at the very least it's been hugely influential on C++ (if constexpr, anyone?), and even for just that it is and has been valuable for everyone working in C++ land.
Interesting. I prefer to write in high-level strongly dynamically typed garbage-collected languages when I can.
But of course I can't always do so and get the performance I want. My approach is generally to prototype in languages like Python, but implement in C89. See: https://www.youtube.com/watch?v=WNTOpl30MIQ . That's tens of thousands of times faster than our initial prototype, and...
We profiled all sorts of things in order to make it that fast: right down to cache misses, branch misprediction, and how concurrency on the CPU interacts with I/O.
I think you answered yourself why D is not more popular with the "cool kids". C++/Java/C# are overengineered, verbose, horrible languages to use. Java is a bastard of C++, and C# is the better looking bastard.
Yes we use them, and even like them in a weird Stockholm Syndrome way, but they're not fun to work with. We use them because we need them, not because we enjoy them.
Disclaimer: the personal pronoun "we" as used here means the author plus zero or more people.
Walter, thank you so much for finally doing this! I am so happy that Symantec finally listened. It must have been really frustrating to have to wait so long for this to happen. I have really been enjoying D and I love all the innovation in it. I'm really looking forward to seeing the reference compiler packaged for free operating systems.
Honestly, since I'm slightly psychotic about these things, this is kind of huge to me. Part of the reason I never learned D was that the compiler was partly proprietary.
Now I have no excuse to avoid learning the language, and that should be fun.
Does that mean that the backend can be rewritten in D at some point? Speaking of which, there could be a standard D intermediate language. (Or is there one? I've never looked at the glue code between the frontend and the backend(s).)
Interesting change! Before, people had a choice between the proprietary Digital Mars D (dmd) compiler, or the GCC-based GDC compiler. And apparently, since the last time I looked, also the "LDC" compiler that used the already-open dmd frontend but replaced the proprietary backend with LLVM.
I wonder how releasing the dmd backend as Open Source will change the balance between the various compilers, and what people will favor going forward?
Please don't get me wrong, as I don't want to start a flame war here, but why do they call D a "systems programming language" when it uses a GC? Or is it optional? I'm just reading through the docs. They do have a command-line option to disable the GC, but anyway, this GC thing is, IMHO, a no-go when it comes to systems programming. It reminds me of Go, which also started out as a "systems programming language" but later switched to the more realistic "networking stack".
Systems programming languages with GC have existed since the late 60s, with ALGOL 68RS being one of the first.
Since then a few remarkable ones were Mesa/Cedar, Modula-2+, Modula-3, Oberon(-2), Active Oberon, Sing#, System C#.
The reasons why most of them haven't won the hearts of the industry so far weren't only technical, but also political.
For example, Modula-3 research died the moment Compaq bought DEC's research labs, and more recently System C# died when MSR disbanded the Midori research group.
If you want to learn how a full workstation OS can be written in a GC-enabled systems programming language, check out the Project Oberon book.
Here is the revised 2013 version, the original being from 1992: https://people.inf.ethz.ch/wirth/ProjectOberon/index.html
D's garbage collector is both written in D and entirely optional; that alone should qualify it as a systems language.
The GC can be disabled with @nogc; the command-line flags are only for when you want to disable it for the whole program, or for warnings about where the allocations happen.
https://godbolt.org/g/IQ0O06
The GC is only relevant if you allocate using the GC, because that is the only time the GC can run. If you use @nogc on your functions, you are guaranteed not to have GC allocations. You can use D as a better C with no GC but the other good features. You can even avoid the D runtime completely if you want.
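As a quick sketch of how that guarantee looks (my own example, not from the thread): a function marked @nogc is statically checked, so any GC allocation inside it is rejected at compile time.

```d
// A @nogc nothrow function: the compiler rejects any GC allocation inside it.
import core.stdc.stdio : printf;

@nogc nothrow
int sumSquares(const(int)[] xs)
{
    int total = 0;
    foreach (x; xs)
        total += x * x;
    // int[] tmp = [1, 2, 3]; // would not compile: array literal may GC-allocate
    return total;
}

void main() @nogc nothrow
{
    static immutable int[3] data = [1, 2, 3];
    printf("%d\n", sumSquares(data[])); // prints 14
}
```

Since @nogc is transitive, main being @nogc forces everything it calls to be @nogc as well, which is why the C printf is used here instead of writeln.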
This was something that always rubbed me the wrong way about the language, and it was an impediment to adoption for me (for D, but also Shen and a few others). In this era, there is no excuse for a closed-source reference compiler (I couldn't care less if it's not a reference compiler; I just won't use it). I'm surprised it took this long to do this; it seems like D has lost most of its relevance by now, relevance it could have kept with a little more adoption. I wonder if it can recover.
This is exactly my thought. I was really excited about D1 back when there was no Go/Rust/Swift/... I called D the C++ that should have been. I recommended D to almost everyone. They liked it. Some of them even wrote non-trivial programs in D. However, all of them went back to C/C++ during the painful D1-to-D2 transition, amidst unnecessary clashes. It was a mess IMHO, and it deeply hurt D's adoption. D2 managed to reach a consensus among the different parties in the end, but by that point Go had stabilized and Rust was starting to look promising, with more exciting modern features. D is not the shiny, unique language any more. D had a chance to gain popularity, but it has lost the momentum. It is a pity.
Something I always thought was cool about dlang is that you can talk to the creator of the programming language on the forums. I don't write much D code as of now, but I still visit the forums every day for the focused technical discussions. Anyway, congrats on the big news!
I think one of the very good features of D is that it provides a lot of infrastructure that helps with lots of little generic things -- things that are not really needed in the language, but help reduce the amount of code you write or simplify common actions you take. It seems like Walter (and all the other contributors) have distilled their experience writing programs into creating a language that helps a lot with some of these things.
Except for perhaps Lisp languages, almost no language makes compile-time computing and code-generation so easy. This allows for some really powerful language features that can be designed as libraries, and puts this power in the hands of "regular developers" rather than only in the hands of template wizards.
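A minimal illustration of what that looks like (my own sketch, not from the thread): ordinary functions run at compile time via CTFE, and string mixins turn compile-time strings into code.

```d
import std.conv : to;

// An ordinary function; D evaluates it at compile time when needed (CTFE).
int factorial(int n) { return n <= 1 ? 1 : n * factorial(n - 1); }

enum fact10 = factorial(10);        // computed entirely during compilation
static assert(fact10 == 3_628_800); // checked by the compiler, no run-time cost

// Code generation: build a declaration as a string, then mix it in.
enum decl = "int answer = " ~ to!string(6 * 7) ~ ";";
mixin(decl); // declares: int answer = 42;

void main()
{
    assert(answer == 42);
}
```

No separate macro language is involved: the same functions and strings you use at run time drive the compile-time machinery, which is what makes library-level language features approachable.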
If you want to work with C code, it is an excellent choice. I use it for numerical computing. It's an easy language to learn, no need to worry about memory management if you don't want/need to, generally good syntax, nice compile time features. Overall the best "better C" in my opinion.
Weka.io develops a software-defined distributed storage system in D with latencies well below a millisecond (over Ethernet). They use the GC in some parts, but avoid it on the hot path as much as possible, as it is a simple stop-the-world mark-and-sweep collector.
All in all, the GC isn't as much of an issue for high-performance applications as is sometimes claimed, but all the recent work towards reducing the dependency on it has of course been done for a reason – in performance-critical code, you often don't want any allocations at all (GC or not), and the large jitter due to collections can be a problem in some situations.
Is there support for BigFloat in D/Phobos or any auxiliary library? I was playing around with D sequences and wrote a D program that calculates the Fibonacci sequence (iteratively) with overflow detection that upgrades reals to BigInts. I wanted to also use Binet's formula, which requires sqrt(5), but it only works up to n=96 or so due to floating-point precision loss.
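For the integer side of this, a sketch (my own, not the poster's program): computing in std.bigint.BigInt from the start sidesteps the overflow detection entirely, at some cost in speed.

```d
// Iterative Fibonacci in arbitrary precision: never overflows, so no
// detection/upgrade logic is needed.
import std.bigint : BigInt;

BigInt fib(int n)
{
    BigInt a = 0, b = 1;
    foreach (_; 0 .. n)
    {
        auto next = a + b;
        a = b;
        b = next;
    }
    return a;
}

void main()
{
    assert(fib(10) == 55);
    // Well past the n = 96 or so limit of a real-based Binet's formula:
    assert(fib(100) == BigInt("354224848179261915075"));
}
```

Phobos itself only covers arbitrary-precision integers here; for arbitrary-precision floats you would, as far as I know, need a third-party library or a binding to something like MPFR.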
No matter what people say about the applicability of D, it has a very bright future in science. GIS, compilers, and math environments will all benefit from being translated to D instead of C++. D is my language of choice for this stuff.
It's really surprising that to this day there are languages in use whose reference implementation is closed source. All the optimization and collaboration made possible when it's open is invaluable.
Have there been any new books out there to learn D? I have one that still references the Collection Wars (Phobos vs. Native). Once I saw that, I put the book back on the shelf and stuck with Java.
coding123 | 9 years ago:
Does D have a thing?
jordigh | 9 years ago:
Thanks again, this news makes me very happy!
WalterBright | 9 years ago:
Here it is: https://github.com/dlang/dmd/pull/6680
asimpletune | 9 years ago:
P.S. I think of a systems language as one that runs directly on the machine, e.g. Swift, C, Go. They operate at the "system" level.
grok2 | 9 years ago:
Stuff that I particularly like:
- Assert/Enforce: http://ddili.org/ders/d.en/assert.html
- Unit testing: http://ddili.org/ders/d.en/unit_testing.html
- Contract programming: http://ddili.org/ders/d.en/contracts.html and http://ddili.org/ders/d.en/invariant.html
- Scope (love this): http://ddili.org/ders/d.en/scope.html
And the other usuals -- templates, mixins, etc.
BTW, I mostly use D as a better C than as a better C++...
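The scope guards mentioned above are worth a quick sketch (my own, assuming nothing beyond Phobos): cleanup registered with scope(exit)/scope(success)/scope(failure) runs when the enclosing scope ends, in reverse order of declaration.

```d
import std.array : appender;

// Returns the order in which things ran; guards fire at scope exit,
// in reverse order of declaration.
string runWithGuards()
{
    auto log = appender!string();
    {
        log.put("open;");
        scope(exit)    log.put("close;");    // always runs
        scope(success) log.put("commit;");   // only on normal exit
        scope(failure) log.put("rollback;"); // only if an exception escapes
        log.put("work;");
    }
    return log.data;
}

void main()
{
    assert(runWithGuards() == "open;work;commit;close;");
}
```

This keeps cleanup next to the acquisition it undoes, instead of in a distant finally block.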
amelius | 9 years ago:
(If I'm going to learn a new systems programming language, which one should I pick?)
jacquesm | 9 years ago:
Congratulations Walter, now let's see D take over the world.
Samathy | 9 years ago:
Hopefully a fully FOSS compiler will bring it right into the mainstream.