top | item 37841489

BrandonS113 | 2 years ago

I used to think so, but I have a function that gets called about a billion times each day as new data comes in and takes about 0.01 seconds to evaluate (optimization with nlopt). I tried coding it in C (30% speed improvement), Python (twice as slow), and Julia (about the same speed). The reason is that the call has 5 parameters that operate on a vector of length 50 to return a value to minimize. It turns out R is pretty good at such vector calculations.
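The shape of that objective can be sketched in NumPy terms (the function, parameter values, and data below are hypothetical stand-ins; the commenter's actual model is not shown): five parameters combined with a fixed length-50 vector, where the whole per-call cost is vectorized arithmetic, which is the kind of operation R also dispatches to optimized compiled code.

```python
import math
import numpy as np

# Hypothetical stand-in for the objective described above:
# 5 parameters acting on a fixed length-50 data vector,
# returning a single scalar for the optimizer to minimize.
rng = np.random.default_rng(0)
data = rng.normal(size=50)  # the length-50 input vector

def objective(params, x):
    """Scalar loss from 5 parameters and a length-50 vector.

    All the work is whole-vector arithmetic; in R the equivalent
    expression would likewise run in compiled vector routines.
    """
    a, b, c, d, e = params
    model = a * np.exp(-b * x) + c * x**2 + d * x + e
    return float(np.sum((model - x) ** 2))

# The same loss as an element-by-element loop, to show what the
# vectorized form replaces (this is the slow path in interpreters).
def objective_loop(params, x):
    a, b, c, d, e = params
    total = 0.0
    for xi in x:
        mi = a * math.exp(-b * xi) + c * xi**2 + d * xi + e
        total += (mi - xi) ** 2
    return total
```

In R the analogous one-liner, e.g. `sum((a*exp(-b*x) + c*x^2 + d*x + e - x)^2)`, also executes almost entirely in compiled code, which is consistent with the pure-R objective holding its own against C here.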

n4r9 | 2 years ago

Is this what you mean by nlopt? https://github.com/stevengj/nlopt

If so, it looks like you're interfacing from R to high-performing code written in C. Isn't that exactly what OP was describing?

BrandonS113 | 2 years ago

No, the function it calls is pure R, and that is where the code spends all its time.

wodenokoto | 2 years ago

I am willing to concede, but also willing to argue that vector and data frame manipulations in R are calls to optimized code.

Like, R is "what if we made a lisp-inspired version of Python built around numpy and pandas, and then reversed time"

BrandonS113 | 2 years ago

I think that is exactly what is happening. Most of my code is much, much faster in Julia, and the code is nicer. But R has its moments. Which is good, since this particular app has 3K lines and I do not want to port it to Julia.

And data.table in R is faster (and I think nicer to write) than DataFrames in Julia. And since data.table feeds my optimization, R still wins.