Hey. I am currently writing a blog post about my experience with Elixir. I aim to write about things I liked, but I also want to describe stuff that I just hated or disliked, because I've noticed that people always try to praise the new tech they are using but rarely point out the bad things. And the truth is that learning a new technology is often a very time- and money-consuming process.
outsidetheparty | 7 years ago
* Bonus points for tools whose authors have made the effort to explicitly compare it with competitor tools, particularly ones that acknowledge points where the competitor might have the advantage. ("Our new tech is better than existing old tech in every possible way" gets the side-eye from me; "Our new tech is better than existing old tech for these particular purposes, but old tech may still be more appropriate for these other purposes" goes a tremendous way towards confirming that the new tech has a real reason to exist.)
* Is there documentation? Is it any good? This is a really low bar, but far too many new tools have no documentation at all ("just check out the source code") or have minimal, incomplete, or tautological docs ("bar.foo(): executes the foo method of bar"). A message board or IRC channel is nice, but not a substitute.
* How big is the API surface? Does it need to be that big? I tend to avoid tools where there are six different ways to do the same thing -- looking at you, Angular -- it suggests the developers are unfocused or in disagreement, and makes it harder to find support or documentation on any particular issue. Same thing if the API has undergone major breaking changes or paradigm shifts between versions (looking at you, Angular...)
* What does the tag look like on Stack Overflow? This serves as a good indicator of whether the tech is too new or obscure to bother with, what the common pain points are, the average skill/knowledge level of its users, and whether help will be available if I get stuck while using it.
* Is there a relatively simple way to try it out? I'm much more likely to experiment with something where I can clone a repo and get going with simple but nontrivial example code; if I have to reconfigure half the settings on my machine just to get a hello world, I'm not going to bother.
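The Stack Overflow check above is easy to script. A minimal Python sketch: the Stack Exchange API does expose a `tags/{tag}/info` endpoint on `api.stackexchange.com`, but the payload below is a made-up sample in its documented shape rather than a live response, so the question count is purely illustrative:

```python
from urllib.parse import quote

API = "https://api.stackexchange.com/2.3"

def tag_info_url(tag: str) -> str:
    """Build the Stack Exchange API URL for basic tag statistics.

    A live check would fetch this URL with urllib.request.urlopen
    and decode the (gzip-compressed) JSON body.
    """
    return f"{API}/tags/{quote(tag)}/info?site=stackoverflow"

def summarize_tag(payload: dict) -> str:
    """Turn a tags/info JSON payload into a one-line adoption signal."""
    item = payload["items"][0]
    return f"{item['name']}: {item['count']} questions tagged"

# Hypothetical payload shaped like the API's tag object; the count
# here is invented for the example.
sample = {"items": [{"name": "elixir", "count": 9500}]}
print(summarize_tag(sample))
```

Raw question count is a crude signal, but it answers "too new or obscure to bother with?" in one line.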
chiefalchemist | 7 years ago
Maybe?
jasode | 7 years ago
I definitely understand your reason for prioritizing documentation but I wanted to point out that "documentation" actually accomplishes the opposite of the OP's concerns about warts/disadvantages/gotchas/etc:
>, but also I want to describe stuff that I just hated or disliked. Because I've noticed that people always try to praise the new tech they are using, but rarely point out bad things.
As we know, documentation is typically written by people who are generally positive about the programming language or technology. Hoping for documentation authors to point out the flaws is like asking a mother to list the reasons her son is defective and girls shouldn't marry him.
To echo the specific issue the OP mentioned, my [totally unrealistic] dream documentation would be written by a crusty skeptical person that was an expert in the technology but became disillusioned by it. They'd point out all the flaws and pathological (but realistic) use cases where the technology fails or is inappropriate.
Since official documentation doesn't include the contrarian viewpoint, the newbies trying to evaluate new technology have to synthesize the "negatives" from other sources. E.g., I've typed phrases like "rust sucks" into Google search.
It's the contrarian writing that helps us learn the limitations and tradeoffs of the technology. Do I search on "rust sucks" because I think Rust is bad?!? No. I search on that because that's the negative phrase that others might have used -- and I want to read their criticisms of Rust. By all means, read the official and blessed documentation of the technology, but be aware that it's a very biased viewpoint.
EDIT: added "[totally unrealistic]" to prevent misunderstanding of my point
itwy | 7 years ago
Or you just imagined your ideal choosing process?
Or perhaps you use what your employer's previous employee chose to use.
nikanj | 7 years ago
I have built this standard up after a few mistakes, where we chose a tool only to realize that pretty much everyone working on it was an extremo ultimate rockstar ninja, better known as a first-year CS dropout with a god complex.
NB: I'm not saying all first-year dropouts are bad, or that having a degree is in any way mandatory in this field. All I'm saying is that when our site is down, I'd like to have a few people with 1980s MIT Electrical Engineering degrees on my team. The kind of people who know what a process is, how TCP passes packets, et cetera.
ukutaht | 7 years ago
Elixir is great for a chat application. At the same time it's probably not the right choice for a Machine Learning project.
Of course there are a number of absolutes that I look for regardless of the context:
* How good is the documentation?
* Is it actively maintained?
* Is it mature?
* etc.
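To make the "great for a chat application" point concrete: the shape that fits is many senders fanning messages out to per-subscriber mailboxes. This toy sketch is in Python with queues purely so it runs anywhere; in Elixir each mailbox would be a lightweight process, and every name here is hypothetical:

```python
import queue
import threading

class ChatRoom:
    """Toy fan-out chat room: each subscriber gets its own mailbox queue."""

    def __init__(self):
        self._lock = threading.Lock()
        self._mailboxes = []

    def join(self) -> queue.Queue:
        """Register a subscriber and return its private mailbox."""
        box = queue.Queue()
        with self._lock:
            self._mailboxes.append(box)
        return box

    def say(self, sender: str, text: str) -> None:
        """Deliver a message to every current subscriber's mailbox."""
        with self._lock:
            boxes = list(self._mailboxes)
        for box in boxes:
            box.put(f"{sender}: {text}")

room = ChatRoom()
alice, bob = room.join(), room.join()
room.say("alice", "hello")
print(bob.get())   # alice: hello
print(alice.get())  # alice: hello
```

The same fan-out pattern gets awkward at scale with shared locks, which is exactly the part Elixir's isolated processes and message passing are designed to make trivial.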
lostcolony | 7 years ago
Relatedly, while I wouldn't use either for many types of ML... I wouldn't use Python either. The only benefit to using Python in this space is that it has libraries bound atop C/C++ implementations. Erlang/Elixir doesn't, that I know of, but that doesn't prevent it from being done if someone wanted to. In terms of actually building something from the ground up... well, it depends. It isn't fast... but it does make the concurrency part easy to model ( https://www.amazon.com/Handbook-Neuroevolution-Through-Erlan... for instance), and for learning/prototyping that might be what you're prioritizing. Certainly, Erlang/Elixir have seen use in high-speed trading and ad-bidding platforms, things known for needing low latency.
analog31 | 7 years ago
And now this might seem weird, but I have had good results following tech trends that attracted the widespread interest of hobbyists. These tend to be things that are easier to obtain, install, and use, and that have communities built up around them. The hobbyists will tend to weed out the things that are just too painful to use.
Examples over the years include Turbo Pascal, Visual Basic, PIC microcontrollers, Arduino, and Python.
Some of these are proprietary technologies, of course, but their vendors didn't abuse us too much (until Visual Basic went overboard with .NET, whereupon I switched to Python). I got a good solid decade or more out of each of these things.
Of equal or greater importance to the use of these tools is what I can learn. I don't mind throwing a vendor a few bucks if I will use their tool to expand my knowledge, and if that knowledge is applicable to a broader range of things. For instance through Python and the community of developers, I've learned new disciplines that have made me a better programmer in any language.
kenha | 7 years ago
If it’s for a task, I’d say:
1. Really spend some time to understand what you're trying to solve.
2. (If applicable) What pain point are you experiencing with the existing solution?
3. Find tech (old or new) that might be a good solution, and understand the limitations/trade-offs you'll be making by choosing this tool.
Then, make a decision.
If it’s for learning, it ultimately comes down to what you’re trying to get out of the learning experience, and whether it fits your goal. (It is totally OK to learn a new tech just because it sounds cool -- learning more about something cool is a type of goal as well.)
scarface74 | 7 years ago
My time is limited for learning any new to me technology. If it doesn’t either lead to me making more money in the future or remaining competitive, I don’t learn it.
jwr | 7 years ago
2. Is it elegant? In the general sense: things that are elegant are often also good designs.
3. Do people use it? This is a hard one, and stems from 25+ years of experience. No matter how beautiful or impressive the new technology is, there will be problems, and I am too old to iron them out myself. I've got things to do. So, tough as it may sound, these days I will not start using things that have not seen reasonable adoption. This does not mean I won't read about them, just not use them. And my criteria for "reasonable adoption" are not "everybody and their dog uses it", I just need to see a user base. Also, the threshold is higher for databases. Practical examples: Clojure and ClojureScript are fine, Datomic is not. RethinkDB just fell below the threshold and I have to migrate; FoundationDB is barely getting to the threshold.
Taking Elixir as an example, it fulfills all three criteria. I know what it is, I've seen it in action, I read about it, and I keep it in the back of my mind as a tool I might want to use when needed.
yaseer | 7 years ago
The most important is an analysis of what use-cases Technology X is good for, and why. Every technical decision is a list of pros and cons. If it fits the use-case perfectly, that is the most important factor.
After that the most important factor I consider is community and momentum.
It's possible a technology is immature and lacks good documentation. BUT - if it has a rapidly growing community and momentum, these 'cons' will disappear rapidly.
tabtab | 7 years ago
1. Is it road-tested for a few years at least? Unless you are in R&D or a high-risk start-up, don't volunteer to be a guinea pig. 5 years is my rule of thumb.
2. Are there successes in similar organizations? One size does NOT fit all. Make sure it's useful in your particular organization in terms of domain (subject matter), culture, and company size.
3. Do the benefits over-emphasize a few narrow factors while ignoring others? There's rarely a free lunch; most decisions are balancing various trade-offs. There are probably down-sides that vendors or fans don't want you to know about or failed to notice due to enthusiasm bias.
4. Does it over-extrapolate current trends? For example, just because more UIs are going mobile does not mean every business is throwing out their mice and big monitors. You may be limiting yourself by trying to make your UIs both mouse-friendly and finger-friendly even though most actual business users will be on a desktop. It's not always good to keep up with the Tech Kardashians; they are not always rational or timely.
5. Does it require a big learning curve or lots of prerequisites? If the new technology turns out to be mostly a fad instead of a real improvement, a long learning curve or expensive investments will drain your time and budget. Look for incremental improvements first.
6. Vague buzzwords or promises: Lack of specifics and realistic examples is a sign you are being had.
7. Experimenting is fine & recommended, but don't do it on production projects. If possible, introduce it to production gradually.
harel | 7 years ago
I also acknowledge that one day, I might wake up to a stinking pile of tech-crap, but even then at least I know that at the time it felt right.
For example, many years ago, in a decade far far away (the 90s), I went with ColdFusion as a tech stack. Back then it was that, Perl, or maybe Tcl. ColdFusion felt right because it allowed very rapid prototyping with a clear syntax and batteries included. There was nothing like it. Fast forward a few years and that tech was so smelly it made everyone nauseous, but I knew at the time it made perfect sense, and by then other options had presented themselves.
outsidetheparty | 7 years ago
I mean, the whole time I knew it was a doomed language, far too philosophically pure to be practical, and the signs that XML was a mass hysteria were there from the beginning -- but it was exactly the tool I needed at the time I needed it.
stared | 7 years ago
* are there any tutorials / code examples? (to see if I like its API/philosophy AND there is a way to pick it up)
* are there any practical, working projects out there (used by companies, etc)? (otherwise it may be not useful for bigger projects)
* is it in active development? (otherwise there is a risk that it will cease to be useful)
* how does it match up against other tools? (e.g. maybe it is easy to pick up, but so are all the other frameworks)
Also, I did write some comparisons. Vide: https://deepsense.ai/keras-or-pytorch/ (got popular here).
jshowa3 | 7 years ago
I look for a book. Books are usually miles better than technical blogs, and you can find a ton of information about a subject addressed in one place, so there's the convenience factor (not that you couldn't create some program that indexes your bookmarks, but you'd still have the problem of providing meaningful titles and organization for your bookmarks).
I also find that books are generally peer-reviewed, especially textbooks, so the BS is kept to a minimum. Makes for a more boring read, but it's more accurate.
That being said, I like language analysis posts. People tend to bring up a lot of things I haven't thought of, and there was one done on C a while back on HN that looked very good: https://eev.ee/blog/2016/12/01/lets-stop-copying-c/ (tbh, I skimmed the article because I didn't have a lot of time to read it at the moment).
ibash | 7 years ago
This applies to everything open source, including languages. If you open up the codebase and go “wtf” that’s a problem. If you open up the codebase and go “I don’t understand this, but it looks clean and with some effort I could understand this” that’s gold.
This is especially true for libraries, you get a sense for the right size and right amount of complexity in dependencies.
thisisit | 7 years ago
One, usability over existing systems. It is a difficult one to actually answer, but I find people talking about new technologies all the time, and when you ask them -- okay, what can we do with this that the older technology couldn't? -- mostly I hear murmurs or barely justifiable answers. But if the explanation is sound, I go for number two.
Two, does the person talking about it have significant exposure to the problem space? Mostly you will see people talking about how X is great while never having had to work with the nuances of the older tech Y.
Now this process has some bias built-in. As a supporter of older tech Y it is entirely possible to never find a reasonable explanation. The only way around it is to talk to as many people as you can.
mschaef | 7 years ago
Rationale is that in a world with limited time, I'd rather focus on solving problems external to the technology itself. If I have something in my toolbox that will work, then it's generally the easiest thing to use, rather than learning something new. There's less of an immediate learning curve and fewer of the issues associated with being an early adopter.
Where this changes is in situations where either the investment to learn a new technology is low enough, or the potential for return is high enough, that the learning might be expected to produce a high ROI.
So what that means practically is that I'm looking for things where I either have an immediate commercial need to know it or a strong feeling it's likely to be useful in a way that none of my existing toolset will fulfill.
From the perspective of something like a programming language, this is part of the reason I've tended to like Lisp-family languages as an adjunct to C-family languages. They're different enough that they're more likely to be complementary to each other and it's likely to be easier to make choices about what code goes in which language.
teunispeters | 7 years ago
And of course, compare with the existing and known. Very rarely does an old or new tech measure up in such a comparison... unless it offers something not present elsewhere.
And of course, bug reports (*). If there's a long queue of bug reports, I won't touch it until a significant number are addressed (when I worked in commercial Drupal dev, this is how we picked modules to use). If there are no bug reports, or there's any kind of online reputation for ignoring or denying bug reports, it gets avoided; that is not unusual in projects with particularly fragile egos in charge, and such projects are therefore unreliable. (*) Bug reports include feature requests, documentation requests, and similar as well.
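That triage instinct reduces to a quick heuristic one could script: compare open reports against addressed ones. A hedged sketch -- the counts would come from whatever tracker you use, and the 0.5 cutoff below is an arbitrary illustration, not a recommendation:

```python
def backlog_looks_healthy(open_issues: int, closed_issues: int,
                          max_open_ratio: float = 0.5) -> bool:
    """Rough signal: is a reasonable share of reports being addressed?

    open_issues / closed_issues would come from a tracker's API or a
    manual count; the default threshold is purely illustrative.
    """
    total = open_issues + closed_issues
    if total == 0:
        return False  # no reports at all is itself a warning sign
    return open_issues / total <= max_open_ratio

print(backlog_looks_healthy(40, 160))  # True: only 20% of reports open
print(backlog_looks_healthy(300, 50))  # False: the backlog dominates
```

Treating "zero reports" as unhealthy matches the comment's point that no bug reports at all is as much a red flag as an ignored queue.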
Maultasche | 7 years ago
I'm also learning Elixir and writing about it at https://inquisitivedeveloper.com/. In the first couple of posts, I talk about what Elixir is really good at and what it isn't good at. I'm generally positive about it but I do grumble when I encounter something I feel could use improvement or doesn't make sense to me.
For example, Elixir is great for concurrent and scalable software. I'd use it to build a web service or game server, but it is unsuitable for a game client, physics simulations, or OS development. It's just not low-level enough for those purposes.
Sugar-coating it just sets your reader up for disappointment further down the line, when they hit the limitations and realize it's not suitable for their needs.
I also keep a general awareness of what technologies are out there and what they're good for. I've never used Redis, for example, but I know what it's good for. The same applied to RabbitMQ. I knew what it was good at even though I didn't use it. That lasted until I encountered a situation where it would be useful and I ended up introducing it into my organization.