I used to think so, but I have a function that gets called about a billion times a day as new data comes in and takes about 0.01 seconds to evaluate (optimization with nlopt). I tried coding it in C (30% speed improvement), Python (twice as slow), and Julia (about the same speed). The reason is that the call has 5 parameters that operate on a vector of length 50 to return a value to minimize. It turns out R is pretty good at such vector calculations.
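(For context, here's a minimal sketch of the call pattern described above — a 5-parameter objective reduced over a length-50 vector. It's in Python/NumPy with scipy standing in for nlopt, and every name in it is hypothetical, not the poster's actual code:)

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical incoming data: one observation vector of length 50.
data = np.linspace(0.0, 1.0, 50)

def objective(params, x):
    """5-parameter objective evaluated over a length-50 vector.

    Fully vectorized: no explicit loop over x. This is the shape of
    computation the poster says R handles well.
    """
    a, b, c, d, e = params
    residual = a * np.exp(-b * x) + c * x**2 + d * x + e
    return float(np.sum(residual**2))

x0 = np.ones(5)  # starting guess for the 5 parameters
result = minimize(objective, x0, args=(data,), method="Nelder-Mead")
```

The point is that when each evaluation is a handful of whole-vector operations like this, the interpreter overhead per call is small relative to the vectorized math, so R, NumPy, and C all end up in the same ballpark.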
n4r9|2 years ago
If so, it looks like you're interfacing from R to high-performing code written in C. Isn't that exactly what OP was describing?
wodenokoto|2 years ago
Like, R is “what if we made a Lisp-inspired version of Python built around numpy and pandas, and then reversed time”.
BrandonS113|2 years ago
And data.table in R is faster (and, I think, nicer to write) than DataFrames in Julia. And since data.table feeds my optimization, R still wins.