Better yet, wait for the official announcement once binaries are compiled and posted for download. I believe there will be a blog post to go along with it, too.
Yes, there'll be an official announcement in a little bit. Tagging is just the first step (we haven't even added the release metadata to the tag on GitHub yet).
This could either be brilliant or a total nightmare:
"Support for arrays with indexing starting at values different from 1. The array types are expected to be defined in packages, but now Julia provides an API for writing generic algorithms for arbitrary indexing schemes (#16260)."
Originally I thought total nightmare but now I'm not sure.
Many people think this was added to support 0-offset arrays, but that's really only a side effect. This feature was requested by people who work with rather odd arrays (e.g. diagonal slices through high-dimensional data). I have mentioned in previous threads on the subject that Julia's 1-indexed arrays don't matter much in practice, because the generic APIs hide that fact most of the time. This just adds the last bit of code to make that completely true, plus documentation and cleanup. For those seeking more information, I'd recommend Tim Holy's JuliaCon keynote on the subject: https://www.youtube.com/watch?v=fl0g9tHeghA
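A minimal sketch of what "generic algorithms for arbitrary indexing schemes" looks like in practice (the function name here is my own, not from the release notes): loop over `eachindex(a)` instead of hard-coding `1:length(a)`, and the same code works for any index range.

```julia
# Summation written against the generic API: no assumption that
# indices run 1:length(a), so offset-indexed array types work too.
function safesum(a)
    s = zero(eltype(a))
    for i in eachindex(a)   # yields whatever indices `a` actually has
        s += a[i]
    end
    return s
end

safesum([1.0, 2.0, 3.0])    # 6.0, and unchanged for a package-defined offset array
```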
I used Julia a bit in the same time span, and I was frustrated with the breakage, but I also saw a lot of things that clearly needed breaking changes. I'm glad they're happening.
Until now, [a, b] meant "concatenate a and b if they're arrays, make a 2-element array otherwise". This struck me as both annoyingly inconsistent and as a failure to think recursively.
Now it's always a 2-element array, possibly an array of arrays. And that's a huge change in syntax, and it's for the better. I look at many things in the 0.5 release notes and they're fixes to specific pain points I had. This has me paying attention to the language again.
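Concretely, sketched against the post-0.5 semantics described above:

```julia
a = [1, 2]
b = [3, 4]

v = [a, b]        # always a 2-element Vector{Vector{Int}} now
w = vcat(a, b)    # concatenation is spelled explicitly: [1, 2, 3, 4]
u = [a; b]        # the `;` form also concatenates
```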
Arguably, it is a better numpy based on multimethods, but it is far from being as polished as Python 3.5+. On the contrary, it suffers from kitchen-sink syndrome: a process of continuously adding stuff, instead of the continuous clarification and refinement that characterizes the Python 3 language.
To you it is a better numpy/python, to me it is a faster/cleaner/more parallelizable version of R, to some people it is a cleaner/faster version of Matlab.
This could be seen as a "kitchen sink" approach but it also is very useful when you want to get things done. My gripes about Julia are really minor considering how young and ambitious the project is.
* Higher order functions now specialize on (and possibly even inline!) passed functions
* Anonymous functions are now fast, too
* Fused broadcasting can avoid intermediate allocations and only make one pass through the array
* User-extensible bounds checks allow custom array types to opt-in to skipping bounds checking, enabling SIMD-ification of some for loops
That said, compilation times may take a bit longer due to the LLVM upgrade… but this resulted in an even stronger push towards better performance in many other areas.
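A rough sketch of the last two bullets on current Julia (the function name and loop are my own, not from the release notes; in 0.5 itself only dotted function calls fused, with dotted operators following later):

```julia
x = rand(1000)
y = similar(x)
y .= sin.(x) .+ 1.0        # fused: one pass over x, no temporary arrays

# Opting out of bounds checks so the loop can SIMD-vectorize:
function inbounds_sum(v)
    s = zero(eltype(v))
    @inbounds @simd for i in eachindex(v)
        s += v[i]
    end
    return s
end
```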
A feature request: Anaconda partners with Intel and includes MKL by default. Could you do the same for Julia and ship MKL by default, at least in version 1.0, so no manual work is needed?
If you have a specific operation where MKL is noticeably faster than OpenBLAS, report it. OpenBLAS can always be improved if they have specific workloads to target.
I haven't thus far been interested in Julia. Unless you're into high-level math, it didn't seem to provide much value, and it did weird stuff with arrays, and wasn't Lua, which gets a free pass for being amazingly well designed in all other respects (arguably, it was well designed in that one as well, but it makes all the array math a pain).
I don't know, maybe it's great. Maybe I should reconsider. But then again, it's strongly typed, which isn't usually my sort of thing. (I'm not working with a team, and my programs haven't devolved into chaos yet, so with no empirical data either way, I'll take my favorite.)
I'm not sure what you mean by "high-level math". It isn't a symbolic language like Mathematica for mathematics research but something along the lines of R for data analysis. The strong typing isn't just there for discipline, but efficiency in execution.
I use Julia. I work in Computational Logistics but some of the code I write is personal.
This week I have used Julia for running SQL against SQLite, manipulating dictionaries, raytracing, statistical analysis, engineering problems, simplex optimization, processing data from Excel, k-means clustering.
I'm pretty sure none of these would be classed as high-level math exactly, though some math is involved.
Anyway, it's not as strongly typed for writing code as you seem to think:
function fred(a, b)
    return a * b    # dispatches on the runtime types of a and b
end
would work to multiply numeric arguments or concatenate string arguments, but it would throw an error on mixed string/numeric arguments.
It also supports duck typing:
function jule(a, b)
    return a.value * b.value   # only requires that both arguments have a `value` field
end
will work no matter the types of a and b, so long as they have the required fields.
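To illustrate with two hypothetical types of my own (using current `struct` syntax; the older `type`/`immutable` keywords of that era worked the same way):

```julia
function jule(a, b)
    return a.value * b.value
end

struct Box
    value::Int
end

struct Label
    value::String
end

jule(Box(3), Box(4))            # 12
jule(Label("ab"), Label("cd"))  # "abcd" (string * is concatenation)
# jule(Box(3), Label("x"))      # MethodError: no method matching *(::Int64, ::String)
```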
tomrod | 9 years ago
Edit: this is a game changer for me:
> Support for multi-threading. Loops with independent iterations can be easily parallelized with the Threads.@threads macro.
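A minimal sketch of that macro in use (the function is my own example, assuming the loop iterations really are independent):

```julia
using Base.Threads

function parallel_square!(out, x)
    @threads for i in eachindex(x)   # iterations are independent, so safe to split across threads
        out[i] = x[i]^2
    end
    return out
end

parallel_square!(zeros(4), [1.0, 2.0, 3.0, 4.0])   # [1.0, 4.0, 9.0, 16.0]
```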
data_hope | 9 years ago
I think the varying indices are a great feature for porting old code over from other programming languages.
sndean | 9 years ago
I really like/liked Julia, but some things breaking between 0.2 and 0.4 made me use it a bit less.
dschiptsov | 9 years ago
But there is a lot of enthusiasm.