Personally, Matlab drives me absolutely up the wall when it comes to ANYTHING other than flipping big matrices around. As a domain-specific tool for linear algebra, I certainly prefer it over R, but as a general-purpose tool it makes me want to pull my own teeth out.
It's just not designed to make good pipeline tools that are maintainable, easily tested, and easily refactored, and it never was. Its abilities to handle things like "easy and sensible string and path manipulations" are ... rudimentary at best, and a weird pastiche of C, Fortran, Java, and whatever other language was faddish when that feature was added.
I have more or less one real annoyance with the technical content of Matlab (one-indexing I can live with):
    size([1])
    ans = [1 1]
which is flatly wrong, and numpy gets it right:
    In [1]: np.array([1]).shape
    Out[1]: (1,)
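A minimal NumPy sketch of the distinction: a one-element vector is genuinely one-dimensional, and you only get a (1, 1) shape if you explicitly ask for a 2-D array.

```python
import numpy as np

v = np.array([1])        # a genuinely 1-D array with one element
print(v.shape)           # (1,)

m = np.array([[1]])      # an explicitly 2-D, 1x1 array
print(m.shape)           # (1, 1)
```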
Python is actually a general-purpose language with a mature scientific stack, and I feel more secure in my numerical computations there because I can have a robust test suite and command-line entry points into my code, which increase my confidence that my code is doing what I think it should and make it easy to use.
Packaging is more or less a coin-flip. Python packaging is a giant faff; matlab packaging is nonexistent (you have to vendor every dependency yourself) and expensive (your users have to shell out for the toolboxes you use, and/or have the MCR installed and can't edit your code).
I'll maintain Matlab when I have to, but I don't enjoy it very much.
One behavior I like about Matlab is that functions are functions. Arguments are passed by value and there's no way a function can modify its arguments (well, except perhaps objects, which are less common).
Matlab will try to optimize and avoid a copy if the function does not modify the argument. Matlab is also (sometimes) smart enough so that A=f(A) will modify A in place instead of making a copy.
This is what I expect from a math oriented language. Maintain the illusion of referential transparency but optimize under the hood if possible.
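For contrast, a minimal NumPy sketch (with hypothetical helper names): Python passes array references, so a function can mutate its argument in place, which is exactly the behavior a plain MATLAB function rules out.

```python
import numpy as np

def scale_in_place(a, factor):
    # Mutates the caller's array through the shared reference --
    # something a plain (non-handle) MATLAB function cannot do.
    a *= factor

def scale_pure(a, factor):
    # MATLAB-style value semantics: return a new array, leave the input alone.
    return a * factor

x = np.array([1.0, 2.0, 3.0])
scale_in_place(x, 2.0)
print(x)                      # [2. 4. 6.] -- the caller's array changed

y = np.array([1.0, 2.0, 3.0])
z = scale_pure(y, 2.0)
print(y)                      # [1. 2. 3.] -- unchanged
```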
Also Matlab has a reasonable JIT compiler. And a good debugger.
I no longer use Matlab but it is a very productive environment for scientific computing (simulations, exploration).
Yeah, I was more than a little surprised to see his perspective - I thought for sure he was going to rave about Julia, pick on Python a little, and skewer Matlab. My graduate thesis advisor forced me to use Matlab to do all of my master’s thesis work, and I found it to be about the most frustrating environment possible, at least for any actual programming. I’m a little shocked to read that anybody uses Matlab outside of a pure academic environment… I guess I just don’t work on sophisticated enough projects?
> (...) ANYTHING other than flipping big matrices around.
but... what else is there, in life? Flipping big matrices around is nearly everything I do, and the python stuff seems too cumbersome for me to bother.
Matlab has its quirks, but I've never come across a better IDE for debugging 'scientific' code/scripts. Seeing current values by hovering over variables, the ability to pause and execute some 'testing' code, or overwrite things and carry on, being able to easily edit arrays/matrices in an excel-like table, having matrix arithmetic that doesn't look like shit when written as code... these are things that I find very useful. I use it in my field (structural engineering) for those reasons alone, even if it might be slower or make certain tasks more difficult to accomplish.
I've used quite a bit of R and Python and I've never touched Matlab. Similarly to your comment - Python has nothing that comes even close to RStudio for working with data. Jupyter, Spyder, PyCharm, VSCode/Atom with data science extensions - none of them are as good.
You should check out Spyder, included with the Anaconda Python distribution. It includes most of what makes MATLAB a productive IDE. In particular, you can inspect various types of variables (including Pandas tables in a nice grid view) and watch how their contents change as the code executes.
I definitely understand this point. Another alternative is Python (with numpy, scipy and matplotlib) using the Pycharm IDE. It has many of the features you're talking about.
MATLAB is adored in academia for a number of reasons. It is easy to make readable small scripts for in-class examples. The debugging feature/IDE is easy to navigate. The school pays for the licenses; there is no overhead work to compile or download packages (unless you want to do something 'fancy').
I took a ML course that was taught in Python. All my Numerical Analysis and Modeling courses relied on MATLAB for examples and homework. I (as a programmer outside of just the math world) picked Julia for research. Now I do much more theoretical research, as I did not enjoy mixing coding and mathematics.
A fellow student, developing PDE solvers in FORTRAN, was told by a mentor to get it working in MATLAB first and then move on to faster languages.
The biggest gripe I have with Matlab is that it teaches absolutely horrendous programming habits. Most of the graduate students who only used/learned Matlab in their studies (I'm in an engineering field) program everything into one big script, which they copy around and change a couple of parameters in. Now obviously you can do things properly in Matlab; it's just that Matlab's structure encourages you not to (who thought that one function per file and no namespaces were a good idea?!). Students who used Python, on the other hand, typically have much better programming habits (not necessarily good ones). I think because of its roots as a programming language first, you get much more exposed to programming paradigms when you learn it.
The other thing I found weird was the complaint about matrix objects being deprecated. After the addition of the '@' operator it is really the same as Matlab, except that Matlab's default is a matrix, while for numpy it's an array. As a side note, the author complains about '@'; what about Matlab's stupid decision to use the most easily overlooked ASCII character for distinguishing between matrix and elementwise operations? In my experience, almost every time a Matlab calculation returns weird or garbage results, the bug hunt is a game of "find the missing '.'".
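For anyone unfamiliar with the distinction being discussed, a small NumPy sketch: '*' is elementwise (MATLAB's '.*'), while '@' is the matrix product (MATLAB's '*').

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

elementwise = A * B    # like MATLAB's A .* B  -> [[5, 12], [21, 32]]
matrix_prod = A @ B    # like MATLAB's A * B   -> [[19, 22], [43, 50]]
```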
Mathematica allows non-ASCII variable names. Odd the author didn't mention that, since Mathematica is pretty common in academia.
I learned Mathematica and MATLAB in my math and science courses (Physics) and was going to learn Java before I dropped Computer Science. Interesting that I could probably replace all those with Julia now.
I also wondered about that (I still remember a friend of mine, after he started using Python 3: "I can use greek letters for variables now"). It seems like Python needs more experience than porting a numerics textbook...
Besides that: how do you actually input a significant number of Greek letters without consulting a layout picture or Wikipedia?
Matlab sells its onerously expensive licenses by marketing itself as having unbeatable numerics performance. This is mostly a farce.
The vast majority of Matlab's vaunted numerics performance comes from using MKL instead of OpenBLAS. However, Intel now distributes MKL free of charge, meaning you can easily build NumPy on top of it. NumPy+MKL will compile down to virtually identical assembly as Matlab.
There's very little reason in this day and age to pay Mathworks such an insane licensing fee.
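If you want to check which backend your own NumPy build is using, `np.show_config()` prints the BLAS/LAPACK configuration (look for "mkl" or "openblas" in the output):

```python
import numpy as np

# Prints the BLAS/LAPACK libraries this NumPy build is linked against.
np.show_config()
```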
> Matlab sells its onerously expensive licenses by marketing itself as having unbeatable numerics performance.
Not at all. At Lockheed (probably one of MathWorks' biggest customers) the big use case is the toolboxes. Scientific/engineering packages like signal processing, radar, phased array, and embedded/VHDL are second to none and are used daily. Some sites also used the Simulink side a lot more for modeling & simulation.
For what it's worth, OpenBLAS is not significantly slower than MKL, at least for level-3 linear algebra on x86, AVX2 and below. I don't know whether it supports SKX well now, but otherwise BLIS does. (Also, OpenBLAS and BLIS are infinitely faster than MKL on POWER and ARM, where MKL isn't available.) I don't understand "compiling down" -- the linear algebra performance is determined by the library, which is, or could be, the same in each case.
[I agree about the marketing aspects and the huge waste of university resources buying into that rather than funding development of free software.]
The choice of language usually comes down to the packages. In any of the three aforementioned languages one can easily and quickly manipulate matrices short of an unwillingness to learn. Julia is nice because it's fast with native code. Python is nice because of Scipy. Matlab is nice because it decides how to spend your money without cause.
I'm an AI researcher / practitioner. For me code accompanying papers is very useful and usually this code is in Python. Occasionally it's Matlab but let's be honest, who cares about those papers :). I'd love to use Julia but the package support just isn't there. Ironically people like me are supposed to be writing this code but with a demanding job and a family it's not likely I will be improving their DataFrame effort anytime soon.
Anyway the MAIN reason I use open source software is because if it isn't working correctly I simply fix the code myself. This isn't possible in the proprietary world. Why would you trust your research or production work with code you can't see and edit?
There's been a lot of talk about documentation. Docs are secondary sources, like WIRED, read the code if you're serious about being correct. Even (especially) hired hands make mistakes and fail to write good tests.
This article reminded me of the fictional Simpsons headline "Old Man Yells at Cloud". It's funny, and he may have a point, but it has no relevance.
Matlab is a calculator. It is a really nice calculator for some things, but it's definitely a calculator with programming language features bolted on. I strongly believe that Matlab is unsuitable for writing most software. It is nonetheless extremely popular in some engineering fields for write-only scripts.
I'll never forget when I took a controls class and we were given the option to use Python on our own, or Matlab with guidance and support from the professor and TAs. I chose Python since my background was slightly different than that of most of the students, but almost everyone else chose Matlab. It was highly amusing to watch the whole lab suffer for days because no one understood Matlab's semantics. (The base framework had been written by some expert who had retired and made extensive use of both handle and value classes. Problem was, no one still involved with the class knew what the difference was.)
Meanwhile, I spent about 6 seconds longer writing out np.dot a few times.
Matlab is good for math. Most software (even math-heavy stuff, EM simulations, etc.) has little math (in terms of source, not execution time).
"MATLAB is the BMW sedan of the scientific computing world. It’s expensive, and that’s before you start talking about accessories (toolboxes). You’re paying for a rock-solid, smooth performance and service..."
I do scientific programming in both Python and Matlab. The two things that to me are major benefits of Matlab are the ease of setting up your installation, and the documentation. The Matlab documentation is amazing when compared to the numpy/scipy documentation, and is almost reason alone for a beginner to use Matlab. FWIW, the Mathematica documentation is also fantastic.
Where is the discussion of R? You talk of scientific computation and don't speak of R? That's an oversight, given that the majority of the scientists I know have used R. There's also Stata, which economists love, and which can do some things in my workflow much quicker than R. There is also a huge contingent of analysts that uses SAS, especially in healthcare and finance.
The car analogies in the blogpost are not particularly useful... Why do people feel the need to dumb down a topic with off-the-wall analogies? Talking about Julia like it's Tesla is laughable. Tesla is a huge innovator, Julia is another tool that does similar things to the other tools. The apt analogy for Julia would be a new ICE company, not a new EV company.
I've a lot of experience with both MATLAB and Python. I find Python to be the better designed language, with fewer rough edges. But MATLAB still promotes and encourages the most productive approach to programming, one that NumPy just falls short of.
For exploratory work MATLAB and the MATLAB IDE are pretty good and shorter scripts are fine. However once you start trying to write actual programs and pass 1-2 kLOC or so, MATLAB just gets more and more painful.
The author does not mention modern Fortran, which does have array operations, like Matlab, Python/NumPy, and Julia. I wonder if its lack of a REPL is the main reason why.
I suspect doing all this in the context of a numerical analysis textbook has contributed to the author's perspective. One nice thing about Python is that you can build and prototype applications and services around it. This is important, as many applications need numerical analysis. Could you imagine writing your entire application in Matlab?
I find it quite ridiculous that a discussion about the matlab language in the context of free software projects does not even mention the excellent interpreter Octave. I say "ridiculous" to be generous; in reality it seems mostly bad faith.
> One function per disk file in a flat namespace was refreshingly simple for a small project, but a headache for a large one.
This is an absolutely horrible bonkers limitation for a “small project”. It’s “refreshing” like someone constantly dumping buckets of ice water over your head. Being able to define functions in the repl, export multiple functions from a single file, etc. are things I pretty much can’t live without in an interactive programming language. Needing to make every tiny utility function into its own file adds SO MUCH FRICTION to basic prototyping workflows.
The result is that when working with Matlab I try to make as many functions as possible into lambdas, e.g. square = @(x) x*x. But these are very limited in practice, and the workarounds to make a function work as a lambda often compromise readability, performance, correctness, and functionality.
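For contrast, in Python a small helper can be a plain named function defined right next to its call site -- no separate file and no lambda contortions:

```python
def square(x):
    # An ordinary named function, defined wherever it's needed.
    return x * x

squares = [square(i) for i in range(5)]
print(squares)   # [0, 1, 4, 9, 16]
```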
+ + +
Is an occasional V.conj().T @ D**3 @ V really that much worse than V' * D^3 * V?
I find that in even small programs, syntax complexity and clarity is dominated by logic flow rather than matrix multiplication. My Python programs end up dramatically easier to read than my Matlab programs, including the almost-pure-numerics parts of them.
If concise built-in syntax for numerical operations were our highest ideal, we’d all be using APL.
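For concreteness, here is the NumPy spelling of that expression, with a made-up symmetric matrix (the values are arbitrary, chosen only to illustrate the syntax):

```python
import numpy as np

# A symmetric matrix and its eigendecomposition: H = V diag(d) V^T
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])
d, V = np.linalg.eigh(H)

# NumPy spelling of V' * D^3 * V, with D the diagonal eigenvalue matrix.
# Since D is diagonal, the elementwise power D**3 equals the matrix cube.
D = np.diag(d)
result = V.conj().T @ D**3 @ V
```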
> exists a matrix class, and yet its use is discouraged and will be deprecated
I think Driscoll doesn't understand what's going on here. There is plenty of discussion of the problems with the matrix class for anyone who searches. The matrix class dates from a time when there was no @ operator, so * was used for matrix multiplication instead of elementwise multiplication. This made it inconsistent with everything else and artificially limiting in obnoxious ways. Now that Python has an @ operator, it is no longer useful or necessary. This has nothing to do with matrices being important or not.
> Matplotlib package is an amazing piece of work, and for a while it looked better than MATLAB, but I find it quite lacking in 3D still
YMMV, and maybe I’m spoiled by D3, Vega, Altair, ggplot2, etc., but I really don’t like Matlab’s plotting tools or Matplotlib. They are inflexible and full of arcane details, and produce mediocre output. We should aspire to better plotting than those in all of our environments.
+ + +
> The big feature of multiple dispatch makes some things a lot easier and clearer than object orientation does.
This is partly because Matlab’s version of “object orientation” is a horrendously broken pile of trash.
> Partly that’s my relative inexperience and the kinds of tasks I do, but it’s also partly because MathWorks has done an incredible job automatically optimizing code.
I’ve poked around in several large Matlab projects, and there are huge performance problems everywhere. In particular any project using Matlab’s version of object orientation ends up incurring huge amounts of overhead.
+ + +
Overall my impression is that Julia's (and Matlab's) language choices are driven by people who want to type their math paper directly into a program with as little thought and as few changes as possible.
For folks with decades of experience reading and writing math papers, this is fair enough I guess.
For many people from a software background it seems like a poor choice of priorities.
Programs written by researchers are often unintelligible without the accompanying paper (and sometimes with, depending on the paper). Full of 1-letter variable names defined off-screen somewhere with no comment explaining what it stands for, weird API inconsistencies, lack of structure, hacks that worked on one set of inputs for a demo but don’t handle edge cases, ....
My position is that despite all of Julia's problems (many of the worst being internal), I think it's the best choice of the three. And at least as far as the move from Matlab to Julia goes, that's just a win for all of science.
> This is partly because Matlab’s version of “object orientation” is a horrendously broken pile of trash.
Sure, but multimethods are also objectively better than single-dispatch object systems. And per Graham's web of power, this is easily demonstrated by the fact that what a multimethod can do in one line, a single-dispatch system requires n^m lines of pattern code to match in expressiveness (n being the number of objects in the hierarchy, m being the number of arguments in the call), meta-programming features aside.
The fact that Julia can use multi-methods with type inference to improve the compilation of typeless methods to the level of native compiled program speeds is just gravy.
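As a toy illustration of the idea (the class names here are hypothetical), multiple dispatch can be simulated in Python with a table keyed on the types of *all* arguments -- the boilerplate this sketch needs is roughly what a single-dispatch system forces you to write by hand:

```python
# Dispatch table keyed by the types of every argument, not just the first.
_impls = {}

def register(*types):
    def wrap(fn):
        _impls[types] = fn
        return fn
    return wrap

def collide(a, b):
    # Select the implementation based on both argument types.
    return _impls[type(a), type(b)](a, b)

class Asteroid: pass
class Ship: pass

@register(Asteroid, Ship)
def _(a, b):
    return "ship destroyed"

@register(Ship, Asteroid)
def _(a, b):
    return "asteroid mined"

@register(Ship, Ship)
def _(a, b):
    return "ships bounce"

print(collide(Asteroid(), Ship()))   # ship destroyed
print(collide(Ship(), Ship()))       # ships bounce
```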
> I’ve poked around in several large Matlab projects, and there are huge performance problems everywhere.
A problem Python can share, with the worst offenses requiring rewriting the given chunk of code in a different language, then writing bindings for it and managing its compilation.
And one that Julia works very hard to avoid, through its gradual type system (often allowing a programmer to add a type annotation to a single line of code and have the entire project see 10x performance gains) and multimethods (allowing programmers to optimize specific edge cases without intruding on the formula code).
> Overall my impression is that Julia (and Matlab’s) language choices are driven by people who want to directly type their math paper into a program with as little thought and as few changes as possible.
Yes, but at least Julia does so in a way that allows professional programmers to easily work with and maintain it. Between multimethods, optional typing, and choose-your-own-starting-array-index, a programmer can modify an existing Julia code base without the language itself being the problem.
Julia could certainly do better here (they are often hostile to attempts to improve the software engineering story around Julia), but they at least have a path forward for a programmer attempting to maintain or optimize scientific computing code in a sane way. Something that neither Python nor Matlab can really claim.
> And there’s zero-indexing (as opposed to indexes that start at 1)
What? That's exactly the part I dislike about both Matlab and Julia! The number of "+1"s and "-1"s you need in Matlab indices when subdividing matrices into multiple equal-sized parts is horrible.
Weird, I thought mathematicians would know better what makes sense and what doesn't. Starting with an offset of 1 is clearly what does not make sense; you start at distance 0 from your starting point.
In mathematics, a matrix doesn't have an 'offset' or a 'starting point.' I think that perceiving matrices in those terms is an artifact of thinking of matrices as sitting in an address space, where the elements of your matrix are part of a larger span that can contain other data.
A matrix is a collection of elements and nothing more. Indexing starts at 1 because that's the first element in your matrix, and numbering the first element 1 makes sense. Numbering from 0 makes sense when it represents an offset into something, but a matrix isn't that. 0-based indexing just always feels to me like letting the implementation details leak out. (I don't feel this way when an array actually represents a chunk of memory, rather than a math object.)
The proliferation of +1 and -1 depends on the application. Some work better with 1-indexed, some work better with 0-indexed. Personally I get annoyed at having to use `len-1` too often when working with 0-indexed arrays. This is why some languages like Julia (and FORTRAN apparently?) let you choose your index-base, which... has trade-offs.
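The subdividing case mentioned above is where 0-based, half-open slices work out cleanly; a minimal NumPy sketch:

```python
import numpy as np

x = np.arange(12)   # 12 elements, to be split into k equal parts
k = 3
n = len(x) // k

# With 0-based, half-open ranges the part boundaries need no +1/-1 fixups:
parts = [x[i * n : (i + 1) * n] for i in range(k)]
```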
> thought mathematicians would know better what makes sense and what not
I'm no mathematician, but when I studied linear algebra, I remember that indexing starts from 1 for matrices. I think MATLAB just follows that convention; it's "matrix laboratory" after all.
What kills me about MATLAB is that its CLI still doesn't have any terminal shortcuts beyond Ctrl+C. No Ctrl+D, no Ctrl+W, no Ctrl+left arrow or Ctrl+right arrow.
It's also a gigantic memory hog and all the plots are copy-on-write, and the memory usually isn't deallocated afterwards.
Specifically the author's mention of extended character support in Julia for math symbols, as well as the emphasis on matrix support, makes me wonder why APL didn't maintain popularity among the academic crowd.
Well...the reason is APL can't do scientific computing well.
It doesn't by default have scientific libraries and builtins for solving equations. It has a fast interpreter, but scientific computing often needs much faster.
It was also late getting off the mainframe.
I really like APL, but it doesn't have the horsepower for my needs.
http://www.math.udel.edu/~driscoll/SC/ or on github https://github.com/tobydriscoll/sc-toolbox
But I can’t say I agree with much of this.
> One function per disk file in a flat namespace was refreshingly simple for a small project, but a headache for a large one.
This is an absolutely horrible bonkers limitation for a “small project”. It’s “refreshing” like someone constantly dumping buckets of ice water over your head. Being able to define functions in the repl, export multiple functions from a single file, etc. are things I pretty much can’t live without in an interactive programming language. Needing to make every tiny utility function into its own file adds SO MUCH FRICTION to basic prototyping workflows.
The result is that when working with Matlab I try to make as many functions as possible into lambdas, e.g. square = @(x) x∗x. But these are very limited in practice, and the workarounds to make a function work as a lambda often compromise readability, performance, correctness, and functionality.
+ + +
Is an occasional V.conj().T @ D∗∗3 @ V really that much worse than V' ∗ D^3 ∗ V?
I find that in even small programs, syntax complexity and clarity is dominated by logic flow rather than matrix multiplication. My Python programs end up dramatically easier to read than my Matlab programs, including the almost-pure-numerics parts of them.
If concise built-in syntax for numerical operations were our highest ideal, we’d all be using APL.
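For concreteness, here's the NumPy spelling of that expression (a toy example; here D is diagonal, so the elementwise D**3 happens to coincide with the matrix power D^3):

```python
import numpy as np

# Hypothetical inputs: a random complex V and a diagonal D.
rng = np.random.default_rng(0)
V = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
D = np.diag([1.0, 2.0, 3.0])

# MATLAB's  V' * D^3 * V  becomes:
result = V.conj().T @ D**3 @ V

# For a diagonal D, elementwise ** equals the true matrix power.
assert np.allclose(D**3, np.linalg.matrix_power(D, 3))
```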
> exists a matrix class, and yet its use is discouraged and will be deprecated
I think Driscoll doesn’t understand what’s going on here. There is plenty of discussion for someone who searches about the problems with the matrix class. The matrix class dated from a time when there was no @ operator, so ∗ was used for matrix multiplication instead of elementwise multiplication. This made it inconsistent with everything else and artificially limiting in obnoxious ways. Now that Python has an @ operator it is no longer useful or necessary. This has nothing to do with matrices being important or not.
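A minimal illustration of the inconsistency the @ operator resolved:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# With ndarrays, * is elementwise and @ (Python 3.5+) is matrix multiply:
elementwise = A * B   # [[ 5, 12], [21, 32]]
matmul = A @ B        # [[19, 22], [43, 50]]

# The legacy np.matrix class instead overloaded * to mean matrix
# multiplication, inconsistent with every other array type -- the
# inconsistency that @ made unnecessary.
```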
> Matplotlib package is an amazing piece of work, and for a while it looked better than MATLAB, but I find it quite lacking in 3D still
YMMV, and maybe I’m spoiled by D3, Vega, Altair, ggplot2, etc., but I really don’t like Matlab’s plotting tools or Matplotlib. They are inflexible and full of arcane details, and produce mediocre output. We should aspire to better plotting than those in all of our environments.
+ + +
> The big feature of multiple dispatch makes some things a lot easier and clearer than object orientation does.
This is partly because Matlab’s version of “object orientation” is a horrendously broken pile of trash.
> Partly that’s my relative inexperience and the kinds of tasks I do, but it’s also partly because MathWorks has done an incredible job automatically optimizing code.
I’ve poked around in several large Matlab projects, and there are huge performance problems everywhere. In particular any project using Matlab’s version of object orientation ends up incurring huge amounts of overhead.
+ + +
Overall my impression is that Julia (and Matlab’s) language choices are driven by people who want to directly type their math paper into a program with as little thought and as few changes as possible.
For folks with decades of experience reading and writing math papers, this is fair enough I guess.
For many people from a software background it seems like a poor choice of priorities.
Programs written by researchers are often unintelligible without the accompanying paper (and sometimes with, depending on the paper). Full of 1-letter variable names defined off-screen somewhere with no comment explaining what it stands for, weird API inconsistencies, lack of structure, hacks that worked on one set of inputs for a demo but don’t handle edge cases, ....
[+] [-] SolarNet|6 years ago|reply
> This is partly because Matlab’s version of “object orientation” is a horrendously broken pile of trash.
Sure, but multi-methods are also objectively better than single-dispatch object systems. Per Graham's argument about language power, this is easily demonstrated: what a multimethod can do in one line takes a single-dispatch system on the order of n^m lines of pattern code to match in expressiveness (n being the number of types in the hierarchy, m the number of arguments in the call), metaprogramming features aside.
The fact that Julia can use multi-methods with type inference to compile untyped methods down to near-native speeds is just gravy.
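The combinatorial point can be sketched in plain Python with a toy dispatch table (illustrative only; the classes and table are made up, not a real multimethod library):

```python
class Asteroid: ...
class Ship: ...

# Toy multiple dispatch: one table entry per (type, type) pair.
_COLLIDE = {
    (Asteroid, Asteroid): lambda a, b: "bounce",
    (Asteroid, Ship):     lambda a, b: "explode",
    (Ship, Asteroid):     lambda a, b: "explode",
    (Ship, Ship):         lambda a, b: "dock",
}

def collide(a, b):
    return _COLLIDE[type(a), type(b)](a, b)

# Single dispatch would instead push isinstance chains into each class's
# method, and the number of cases grows like types ** arguments.
print(collide(Ship(), Asteroid()))  # explode
```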
> I’ve poked around in several large Matlab projects, and there are huge performance problems everywhere.
A problem Python shares, with the worst offenders requiring you to rewrite the offending chunk of code in a different language, write bindings for it, and manage its compilation.
And one that Julia works very hard to avoid, through its gradual type system (often letting a programmer add a type to a single line of code and get 10x performance gains across the entire project) and multimethods (letting programmers optimize specific edge cases without intruding on the formula code).
> Overall my impression is that Julia (and Matlab’s) language choices are driven by people who want to directly type their math paper into a program with as little thought and as few changes as possible.
Yes, but at least Julia does so in a way that lets professional programmers easily work with and maintain it. Between multi-methods, optional typing, and choose-your-own-starting-array-index, a programmer can modify an existing Julia code base without the language itself being the problem.
While Julia could certainly do better here (its maintainers are often hostile to attempts to improve the software engineering story around Julia), it at least offers a path forward for a programmer attempting to maintain or optimize their scientific computing code in a sane way. That's something neither Python nor Matlab can really claim.
[+] [-] Aardwolf|6 years ago|reply
What? That's exactly the part I dislike about both Matlab and Julia! The number of "+ 1"s and "- 1"s you need in Matlab indices when subdividing matrices into multiple equal-sized parts is horrible.
Weird, I thought mathematicians would know better what makes sense and what doesn't. Starting with an offset of 1 is clearly what does not make sense: you start at distance 0 from your starting point.
[+] [-] lmkg|6 years ago|reply
A matrix is a collection of elements and nothing more. Indexing starts at 1 because that's the first element in your matrix, and numbering the first element 1 makes sense. Numbering from 0 makes sense when it represents an offset into something, but a matrix isn't that. 0-based indexing just always feels to me like letting the implementation details leak out. (I don't feel this way when an array actually represents a chunk of memory, rather than a math object.)
The proliferation of +1 and -1 depends on the application. Some work better with 1-indexed, some work better with 0-indexed. Personally I get annoyed at having to use `len-1` too often when working with 0-indexed arrays. This is why some languages like Julia (and FORTRAN apparently?) let you choose your index-base, which... has trade-offs.
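The half-open, 0-based convention does make even splits compose cleanly (a small Python sketch):

```python
# Splitting a sequence into two equal halves with 0-based,
# half-open slices: no +1/-1 adjustments anywhere.
a = list(range(8))
mid = len(a) // 2
left, right = a[:mid], a[mid:]

# The 1-based, inclusive-ends equivalent would be a(1:n/2) and
# a(n/2+1:n) -- this is where the +1 creeps in.
```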
[+] [-] tejtm|6 years ago|reply
[1]https://docs.julialang.org/en/latest/devdocs/offset-arrays/
[2]https://julialang.org/blog/2017/04/offset-arrays
[+] [-] fireattack|6 years ago|reply
I'm no mathematician, but when I studied linear algebra, I remember matrix indices starting from 1. I think MATLAB just follows that convention; it's "matrix laboratory," after all.
[+] [-] 0-_-0|6 years ago|reply
[+] [-] inamberclad|6 years ago|reply
It's also a gigantic memory hog: all the plots are copy-on-write, and the memory usually isn't deallocated afterwards.
[+] [-] Lowkeyloki|6 years ago|reply
[+] [-] 6thaccount2|6 years ago|reply
It doesn't have scientific libraries and built-ins for solving equations by default. It has a fast interpreter, but scientific computing often needs something much faster.
It was also late getting off the mainframe.
I really like APL, but it doesn't have the horsepower for my needs.