LabVIEW beats the snot out of anything else for building, say, custom spectrometer control and data acquisition systems. I know, because I had to do this. I've seen mooks try to do the same thing in C++ or whatever, and laughed scornfully at the result.
What's so great about LabVIEW for spectroscopy? I ask because, as part of my MS thesis, I built a Fourier transform spectrometer. After messing around for a few days with some LabVIEW code someone else had written for a similar task, I wrote the control code in C in a couple of hours, including the time it took to learn the (deprecated) LabWindows C API. I have no doubt that LabVIEW has improved since I did my thesis (almost ten years ago), but it's not obvious, a priori, that graphical programming should be better for controls.
It's not that LabVIEW is great for spectroscopy. It's that it is great for building things to control large, complex scientific apparatus (mine was an X-ray Fourier spectrometer; the X-ray beamline was also controlled in this way), manage streaming data from dozens of sensors, do the analysis, and rapidly change things around if you stick new hardware in it or need to run things differently. There was C and DSP programming involved on a VxWorks VME crate where things needed to go fast or talk to unsupported hardware, but controlling the whole mechanism was done via LabVIEW. There was no other sane way to manage the complexity.
Oh no, graphical programming languages again. Slightly OT, but I think you made the right choice. I'd like to refer you to a series of posts detailing the practical issues from a few days ago: http://news.ycombinator.com/item?id=4496494

LabVIEW probably has a library to talk to the spectrometer. Apart from that, LabVIEW has very little advantage.
He had me until "Clojure is popular because Paul Graham is an excellent writer." Did he just confuse Paul Graham for Rich Hickey? I don't get that point otherwise. pg is a lisp guy but I don't see the Clojure connection.
Anyway his overall point seems to be that Clojure isn't the only programming language, or the best one for every case. Well, ok. No one worth taking the time to talk to would ever make that assertion, so it seems like wasted bluster.
I've talked to a lot of people about how they got into Clojure, and a fair number credit Paul Graham's essays for selling them on the idea of using a Lisp. I think there's a pretty clear causal link.
Clojure is my favorite language, and I love Rich Hickey, but I don't think I would have sought out something like Clojure if not for PG's essays.
There was a tweet from Nathan Marz a while back that I felt summed it up nicely: "Paul Graham set the bait, Rich Hickey reeled me in". Maybe you're right that I think it's more common than it is because it's true for me, but maybe you think it's less common than it is because it's false for you?

Pg -> promoting Lisps. Clojure -> a Lisp. Pg -> influential. Clojure -> got traction on HN and startup crowd.
There is also the matter that “programming” is an overly broad word, kinda like “martial arts.” A guy like “Uncle Bob” who spends his time doing OO whatevers has very little to do with what I do. It’s sort of like comparing a guy who does Tai Chi to a guy who does Cornish Wrestling; both martial arts, but, they’re different. My world is made of matrices and floating point numbers. His ain’t.
This. The attitude the OP describes is very common on HN as well, and can't be attributed to Uncle Bob only: the idea that programming means web, cloud, and databases. Other influential writers, like Jeff Atwood, make exactly the same mistake[1]. He can't imagine anyone doing any other martial art than Tai Chi.

A bit sad, really.

[1] http://www.codinghorror.com/blog/2009/08/all-programming-is-...
This is common on HN for a reason. The typical Y Combinator startup and the average HN reader do web programming for a living. This is HN's target group and there is nothing wrong with that.
Jeff Atwood has a point that desktop apps are dead. I would not build another business app on the desktop. Users are now used to being able to access their data and applications from anywhere, and without having to install any software.
There must be other forums dedicated to scientific computing and similar topics. These topics do find a place in HN from time to time. However, they are not the norm.
I agree with you that it is a very limited view, but the OP does state that "Most people who consider themselves programmers are employed effectively selling underpants on the internet using LAMP."
I was slightly offended when I read this - I'm 'only' a web developer, but I don't believe I'm any more or less useful or talented than anyone who, say, programs kernel drivers.
I'm sure the people who program kernel drivers would disagree with me, on the other hand.
It's a topic which has bugged me for an age now: am I a real programmer? Am I actually making any difference to the world? And the answer to both of those, perhaps, is no, but on the other hand, with the exception of Tim Berners-Lee, Linus Torvalds, and Dennis Ritchie, all of us are pretty much interchangeable, i.e. you could remove us from the team and replace us pretty easily with someone else.
This isn't a "rant on Clojure". It's a rant, all right, and Clojure is prominently featured, but the rant is about "Uncle Bob" Martin's assertion that Clojure is or should be "the last programming language". Which I take to be hyperbole, not meant to be taken literally. But then I haven't watched the video.
Clojure actually gets quite a decent nod here, which, given Locklin's overall snarky tone, is saying something.
I /have/ watched the Skills Matter video, months ago. What I took away from the talk wasn't that Clojure is the last programming language, but that:
- Programming language innovation is rather cyclical.
- Frameworks can be re-implemented; language ecosystems and frameworks are very important, but one should not exclude a language because of its lack of a framework feature (I think he was picking out .NET and Java here, suggesting that enterprises who select C# or Java because of the frameworks are missing the point)
- Polymorphism is achievable in Clojure, so many popular programming techniques are achievable.
I thought the talk was thought-provoking and interesting. Whilst it certainly advocated Clojure, I took the conclusion with a pinch of salt. The entire talk is tinged with humour. It's supposed to spark debate and help push people from their comfort zones, IMHO.
I don't get all the anger. Most popular writers are opinionated, and talk to their "crowd". If Uncle Bob (or Joel, or Jeff, or DHH) had to preface all discussion with "does not apply to people who do scientific programming, make games, or Albanians" we'd probably end up missing most of their points. It seems you understand the writer's context, so be happy that the answers for your neck of the coding woods are different.
Similarly, many posts around here are targeted at people in (at least) the US, often California, and usually SF or the Valley. I don't post "Noooo! It doesn't apply to me because I live in South Africa!" every time, because I understand what the context is. Something doesn't have to be universally true for it to be true in its context, and often useful for us outside of it.
In any case, thanks for the alternate opinion - it's good to be reminded that there are universes outside of ours.
That's true. But once Bob Martin posits that "Clojure could be the last programming language," he's crossing pretty much all programming domains. The original post is right in taking him to task for that statement.
If you had asked me about Uncle Bob and spending an hour with him, I could have spared you most of it. He is very opinionated, and often really quite wrong.
FWIW, I don't think he has the slightest clue what leiningen is. It is hardly "basically a shell script which Does Things". It is written entirely in Clojure. All the shell script does is tie it together and bootstrap things. I'm fine with criticism, but not blatantly inaccurate information.
Oddly, the bit you call out as critical of Clojure, and use to say he has no clue what he's talking about, is the bit where he says that a tool written in Clojure "works brilliantly" and is an example of how Clojure is more useful to people than Common Lisp.
Read in context, I took that "shell script" statement as trying to describe Lein from the user's perspective-- which would make sense, given that his point there is its utility.
For what it's worth, I've never used Lein, and had no presuppositions about it. When I read the paragraph, I carried away the idea that Lein was written in Clojure and acts like a well-done command-line tool.
Object Orientation doesn't imply "single dispatch / message passing" style of Object Orientation. Common Lisp's CLOS and Clojure's multimethods and protocols are also object oriented, albeit in a different style.
Of course CLOS is far more powerful than what Clojure has, but I wouldn't be so quick to discard OO in Clojure.
OO doesn't imply mutability: even in Java it's encouraged (per Bloch) to create "functional objects" -- objects that create other objects instead of mutating internal state.
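For what it's worth, the "functional objects" idea can be sketched in a few lines. This is a minimal illustration, not from any library; the `Money` class and its methods are invented for the example:

```java
// A "functional object" in Bloch's sense: operations return a new
// instance rather than mutating internal state. Money is an
// illustrative class, not part of any library.
public final class Money {
    private final long cents;

    public Money(long cents) {
        this.cents = cents;
    }

    // No mutation: 'this' and 'other' are untouched; a fresh Money comes back.
    public Money plus(Money other) {
        return new Money(this.cents + other.cents);
    }

    public long cents() {
        return cents;
    }

    public static void main(String[] args) {
        Money a = new Money(100);
        Money b = a.plus(new Money(50));
        System.out.println(a.cents()); // prints 100 -- the original is unchanged
        System.out.println(b.cents()); // prints 150
    }
}
```

Because every field is final, the object can be shared freely across threads, which is exactly the property that makes this style feel at home next to Clojure's immutable data structures.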
It's always difficult to pinpoint the essence of a paradigm that doesn't have a formal definition. But I disagree with you. In my opinion, message passing is central to OO thinking. It's not a style. It is the essence if there is one at all.
Sending a message to one particular object isn't just a method dispatch mechanism. It also defines "self" and hence what data can be accessed without breaking encapsulation - another principle of OO.
Now, I'm not saying Clojure doesn't support OO. It does. What I'm saying is that whenever you go beyond thinking in terms of passing a message to one individual object you are using features that are not object oriented on a conceptual level. You could even do that in Smalltalk and it still wouldn't be OO.
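To make the single-dispatch point concrete, here is a minimal Java sketch (the class names are invented for illustration): the method is chosen by the runtime type of the receiver, the one object "receiving the message", while argument types are resolved statically.

```java
// Single dispatch: Java selects the method by the runtime type of the
// receiver ("self") only. The classes here are invented for illustration.
public class DispatchDemo {
    static class Shape {
        String collide(Shape other) { return "shape/shape"; }
    }

    static class Circle extends Shape {
        @Override
        String collide(Shape other) { return "circle/shape"; }
    }

    public static void main(String[] args) {
        Shape a = new Circle();
        Shape b = new Circle();
        // Dispatch happens on a's runtime type (Circle); b is just an
        // argument, statically typed as Shape. Only one object receives
        // the message. A Clojure multimethod, by contrast, can dispatch
        // on a function of all of its arguments.
        System.out.println(a.collide(b)); // prints circle/shape
    }
}
```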
This article is OK when defending the "right tool for the right job" statement in the middle of the present Clojure hype that is going on, but this perfectly coherent discussion just gets swallowed by the author's apparent underground syndrome.
> Common Lisp native ASDF is probably very well designed, but it is practically useless to anyone who isn’t already an ASDF guru. ASDF should be taken out back and shot.
This sounds like the best you can say is "it's probably very well designed for the wrong purpose"; and I'm not sure that can be usefully distinguished from "poorly designed".
(I have dabbled with both CL and Clojure, but never used either ASDF or Lein.)
> R* and kd-trees are preposterously slow on the JVM compared to the old libANN C++ library, or naive kd-tree implementations. Factors of 100k to 1E6. I may be wrong, but I’m guessing trees confuse the bejeepers out of the JVM
Is somebody here knowledgeable enough to comment further on this? Could this be due to excessive allocations and indirections? (It was also one of Bjarne's objections to Java: composition of classes is always by reference.)
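I can't vouch for the article's 100k-1e6 factor, but the layout point is easy to show: in a Java tree every node and every point is a separate heap object reached through references, whereas a C++ node could hold its point by value in contiguous storage. A toy sketch (not a real kd-tree, and not a benchmark):

```java
// Toy tree sketch showing the reference-heavy layout the JVM forces.
// Each Node and each double[] is its own heap object, so a lookup
// chases pointers through scattered memory; in C++ the point could sit
// by value inside the node. Not a real kd-tree and not a benchmark.
public class KdSketch {
    static final class Node {
        final double[] point; // reference to a separate heap object
        Node left, right;     // two more references to chase

        Node(double[] p) { point = p; }
    }

    // Naive insert on the first coordinate only, enough to build a tree.
    static Node insert(Node root, double[] p) {
        if (root == null) return new Node(p);
        if (p[0] < root.point[0]) root.left = insert(root.left, p);
        else root.right = insert(root.right, p);
        return root;
    }

    // Every level of depth costs at least one dependent pointer load.
    static int depth(Node n) {
        return n == null ? 0 : 1 + Math.max(depth(n.left), depth(n.right));
    }

    public static void main(String[] args) {
        Node root = null;
        for (double x : new double[] {5, 3, 8, 1, 4}) {
            root = insert(root, new double[] {x});
        }
        System.out.println(depth(root)); // prints 3
    }
}
```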
APL was actually the first real programming language I learned, 35 years ago! It was very cool and mind-bending, and has hugely influenced the way I think today. APL has also been hugely influential on the world, but after I learned Lisp, I'd never go back to APL. Though I do wish there were an APL embedded DSL for Lisp.
I wasn't aware that anyone still uses APL for anything real. Back when I used it, it didn't even have arrays of strings. Though you could make a twelve-dimensional array of characters instead, for whatever that might be worth.
After Prolog ("what do you mean it just figures it out?"), APL made me think about how I approached solving problems. It looks like something out of a wizard's grimoire in a fantasy novel and forces you to think in groups instead of iterations. It would be damn fun to write shaders in it as opposed to stilted C.
Yikes, one minute you're reading about the latest obsession of "Uncle Bob", next you're on a right-wing site defending the Nazis' view of modern art. There has to be some kind of mix between a "Bacon number" and Godwin's law…
By the way, was it Bob Martin who did the TDD sudoku solver? I'm always mixing up my XP evangelists.
And regarding last languages, I'm always reminded of the transputer/4GL/Prolog hype of ages past. Although one might argue that Lisp itself is probably a good candidate - but given the wide variety of existing and possible languages that could theoretically be called by that name, this isn't saying a lot.
I like this guy's style. He is right: Clojure won't be the last programming language (and it's my favorite language); the last, hundred-year programming language will be an artificial intelligence.
Are people really paying that much attention to the languages being used? Most of my problems are finding and getting the right libraries to work with my apps. Learning a new API or new libraries takes up more time than worrying about language issues.
Clojure isn't even a Lisp; it is a bunch of misconceptions with Lisp-like syntax.

An attempt to make a so-called Lisp by breaking the code-is-data concept, or by using some YAML-like notation instead of s-expressions with annotations (to describe a representation where we need only a structure), without proper recursion, is something as far from Lisp as, say, Python.

There is more here: http://karma-engineering.com/lab/blog

This is idiomatic Lisp:
(define (keep pred l)
(cond ((null? l) '())
((pred (car l))
(cons (car l) (keep pred (cdr l))))
(else (keep pred (cdr l)))))
This is nonsense:
(defn keep
"Returns a lazy sequence of the non-nil results of (f item). Note, this means false return values will be included. f must be free of side-effects."
{:added "1.2"
:static true}
([f coll]
(lazy-seq
(when-let [s (seq coll)]
(if (chunked-seq? s)
(let [c (chunk-first s)
size (count c)
b (chunk-buffer size)]
(dotimes [i size]
(let [x (f (.nth c i))]
(when-not (nil? x)
(chunk-append b x))))
(chunk-cons (chunk b) (keep f (chunk-rest s))))
(let [x (f (first s))]
(if (nil? x)
(keep f (rest s))
(cons x (keep f (rest s))))))))))
When one breaks the underlying ideas, he ruins the spell.