Pardon my shooting from the hip here, but IMO if you're using R for something radically different than statistical analysis and data visualization, you're probably better off using a tool/language that's more purpose-suited.
Unlike many other languages, R has a native, built-in tabular data structure. So when your data have tabular structure, R is by far the best glue for building pipelines between external libraries. If the data fits in RAM, it literally never has to leave the data.table object throughout the whole pipeline (including all the cleaning and transformations).
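A minimal sketch of what "never leaving the data.table" looks like, assuming the data.table package is installed; the column names and values here are invented:

```r
library(data.table)

dt <- data.table(id = 1:6,
                 group = rep(c("a", "b"), 3),
                 value = c(10, NA, 30, 40, 50, 60))

dt[is.na(value), value := 0]            # cleaning: modified by reference, no copy
dt[, scaled := value / max(value)]      # transformation: new column in place
summary_dt <- dt[, .(total = sum(value)), by = group]  # aggregation at the end
```

The `:=` operator updates columns by reference, so each step mutates the same object rather than allocating intermediate copies.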
Tyler is actually using R for exactly what R and its predecessor S were designed to do from the beginning. You can read more about its history by googling John Chambers, who helped develop S at Bell Labs.
Another strength of R, albeit via the tidyverse packages, is data manipulation. It excels at taking some misfit, human-friendly pile of Excel sheets with implied relations and turning it into rectangular data sets that can actually be analysed. Sure, there are faster alternatives if you are doing big data, but if you need versatility, R is a good friend.
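A small sketch of that kind of reshaping, assuming the tidyverse packages (tibble, tidyr, dplyr) are installed; the "spreadsheet" layout here is invented:

```r
library(tibble)
library(tidyr)
library(dplyr)

# A human-friendly wide layout: one column per year
wide <- tribble(
  ~region, ~`2022`, ~`2023`,
  "north",     10,      12,
  "south",      7,       9
)

# Rectangular, analysable form: one row per region-year observation
tidy <- wide |>
  pivot_longer(-region, names_to = "year", values_to = "sales") |>
  mutate(year = as.integer(year))
```

`pivot_longer` is the usual workhorse for turning implied relations (here, years encoded in column headers) into explicit variables.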
setgree|1 year ago
> As someone who basically uses R as a nice LISP-y scripting language to orchestrate calling low-level compiled code from other languages
When I read this, I think, would `bash` or something equally portable/universally installed work?
R is a beautiful thing when limited to its core uses. But in my experience, the more we build away from those core uses, the more brittleness we introduce. I wish the Posit team would focus on the core R experience, resolve some of the hundreds of open issues on its core packages in a timely way [0][1], and just generally play to R's strengths.
[0] https://github.com/rstudio/rmarkdown/issues
[1] https://github.com/tidyverse/ggplot2/issues
vhhn|1 year ago
The only meaningful alternative I see is Python with maybe Polars or DuckDB.
jasonpbecker|1 year ago
> As someone who basically uses R as a nice LISP-y scripting language to orchestrate calling low-level compiled code from other languages
Except... this is exactly what R was created to do, with a focus on mathematical/statistical libraries written in things like FORTRAN.
R is great as a glue language for these purposes if the point of calling that low-level compiled code is largely to work with data, especially data that is not so large or computationally intensive that it needs to be distributed across hardware.
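A sketch of that glue role using R's `.C` foreign-function interface; the file and routine names are invented, and the C source would be compiled with `R CMD SHLIB` first:

```r
# my_scale.c (compiled via: R CMD SHLIB my_scale.c):
#   void my_scale(double *x, int *n, double *factor) {
#       for (int i = 0; i < *n; i++) x[i] *= *factor;
#   }

dyn.load("my_scale.so")   # extension is platform-dependent (.so/.dll)

x <- as.double(1:5)
out <- .C("my_scale",
          x = x,
          n = as.integer(5L),
          factor = as.double(2))
out$x   # the scaled vector, back as an ordinary R vector
```

The `.C` interface (and its more flexible sibling `.Call`) is how much of base R's statistics is implemented: thin R wrappers orchestrating compiled Fortran and C routines.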
mdaniel|1 year ago
https://github.com/posit-dev/positron (Elastic V2)