Ask HN: What new/experimental technology should I learn for the next 10 years?
43 points | twolf910616 | 5 years ago
In 2010, if you were in the right places and looking at the right trends (Theano, ImageNet, CUDA, the explosion of available datasets), you could reasonably predict that machine learning would be a good investment. (And, ironically, it's probably one of the worst things to get into right now due to a potential bullwhip effect.)
Now I fully understand that hindsight is 20/20, and that any new and untested technology will be highly uncertain. But if you're working on something exciting, could you make a pitch as to why your specific field would take off in the next 10 years and is a good time to invest in right now? Rust? WASM? Maybe even robotics?
(For my own prediction: I think application-specific integrated circuits (ASICs) are super interesting to look into for the next 10 years. The slowing of Moore's Law and Dennard scaling, and the ever-increasing focus on specialized hardware, all point to ASICs becoming more interesting. Personally I know very little about this field, but it feels like the age of homogeneous computing is over and the next 10 years could be an exciting time to be able to go `full-stack`.)
vincent-manis|5 years ago|reply
I was once told, around 1980, by a highly respected IT person, that C had no commercial application. A few years later, I was told by a college's industry advisory committee that Unix would not be widely used.
Arthur C Clarke's First Law: “When a distinguished but elderly scientist states that something is possible, they are almost certainly right. When they state that something is impossible, they are very probably wrong.” (rephrased to remove sexism)
Alan Kay: “The best way to predict the future is to invent it.”
bromonkey|5 years ago|reply
Don't. Go learn past technologies instead. People leaving school for tech in 10 years likely won't have stood up servers or configured services. We'll be lucky if they've used anything that isn't a web GUI on a large cloud operation. Red Hat has already signaled it is going in this direction by tossing out its EX300 exam in favor of the EX294. I believe we will continue seeing this trend going forward. If I had to guess, there's going to be a growing need for, and shrinking supply of, competent sysadmins over the next 10 years.
OneFunFellow|5 years ago|reply
I have looked into mobile development in years past and I always got the impression that it was a hot mess. An ever-changing everything, undocumented everything, tons of languages and frameworks to choose from. I was immediately turned off by how complex, immature, and ever-changing it all was.
To me, the most important selling point of Ionic is that I can write code once (Ionic UI Framework + Angular or React or Vue), then run it through Capacitor and it spits out a NATIVE mobile app that runs in a WebView (not a PWA, but they support PWAs).
I don't have to know anything about Android or SwiftUI. If I want to access native features (such as camera or location) I simply use a Capacitor plugin. Again, zero native code, the plugin handles it. There are plugins for things like storage, clipboard, file system, haptics, and more.
If you were ever turned off by the complexity of mobile app development, take a look at Ionic. If you know HTML, CSS, and a JS framework, and can learn their (simple) framework, you can write a fully functional app without knowing anything about native coding.
[1] https://ionicframework.com/ // https://github.com/ionic-team/ionic-framework
[2] https://capacitorjs.com/
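For what it's worth, the plugin-call pattern described above can be sketched in a few lines. This is a hedged sketch, not official Ionic code: `copyText` is a hypothetical wrapper name, and it assumes the `@capacitor/clipboard` package (whose documented call is `Clipboard.write({ string })`). The plugin object is passed in as a parameter so the logic can be exercised off-device.

```javascript
// Hypothetical wrapper around a Capacitor plugin call (a sketch, not
// official Ionic/Capacitor code). Assumes @capacitor/clipboard, whose
// documented API is Clipboard.write({ string }). Taking the plugin as a
// parameter keeps the function testable without a device or emulator.
async function copyText(clipboard, text) {
  await clipboard.write({ string: text }); // the plugin delegates to native code
  return { copied: true, length: text.length };
}

// In a real app you would import the plugin and call:
//   import { Clipboard } from '@capacitor/clipboard';
//   await copyText(Clipboard, 'hello');
```

The point of the indirection is exactly the comment's pitch: the web code never touches Android or iOS APIs directly; the plugin does.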
GoldenMonkey|5 years ago|reply
Shipped mobile banking, digital wallets and crypto wallets to 21 million customers, as architect and lead.
And yet, here we are in 2020... still there is this fantasy about the xyz framework flavor of the year... write once... run everywhere...
A magical silver bullet to save time and development costs.
Because who wants to learn the native language for the mobile platform?
And every single write once, run anywhere framework leads to its own tradeoffs.
Let's explore each.
React Native - if you enjoy debugging, this is your platform of choice. Have an obscure nodejs open-source library you depend on? That is being deprecated? Have fun maintaining and upgrading it yourself. Or how about 2000 security warnings you need to resolve... yeah, you'll code once... and debug everywhere.
Like Ionic? - you won't when iOS decides to upgrade the OS and your app suddenly looks like it is a decade out of date. Good luck upgrading the look and feel... rewrite on the next flavor of the platform.
Like Xamarin because you like C#? Have fun learning Microsoft's unique mobile paradigm, as well as needing to understand how the iOS and Android paradigms work. And debugging across 3 platforms... trying to isolate whether the memory leak is on the Microsoft or iOS side of things.
The reality is this. There are tradeoffs. And you are really only trading one pain for another pain.
Learning the native tools and languages is only one kind of pain. And the least painful.
dezmou|5 years ago|reply
HTML applications also let you do great things like WebGL and WASM.
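To make that concrete, here is a minimal sketch of running WASM from plain JavaScript. The byte array below is a hand-assembled module exporting a single `add` function; in practice the bytes would come from a `.wasm` file compiled from Rust, C, and the like.

```javascript
// Minimal hand-assembled WebAssembly module exporting add(a, b) -> a + b.
// Normally these bytes come from a compiler; they are inlined here so the
// sketch is self-contained and runs in any browser or in Node.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // body: local.get 0; local.get 1; i32.add
]);
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // prints 5
```

The synchronous `WebAssembly.Module`/`Instance` constructors are fine for tiny modules like this; larger ones should use the async `WebAssembly.instantiate`.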
gostsamo|5 years ago|reply
What I expect in ten years is that IoT will take off, but this time it will have to be invisible to the user. An AI solution will observe, train on, and anticipate the needs of the user, and will try to provide what is needed in a just-in-time manner. This will happen in both private and public spaces, so brushing up on some privacy and anti-tracking techniques might prove useful.
tpetry|5 years ago|reply
Jump on the next hype topic any FANG company is talking about. Every time, you get a lot of developers jumping blindly into it even if they don't need it, and you earn a lot of money by being one of the first in, and again when you move people away from it because it's not a silver bullet for everyone.
Think about big data and map-reduce 10+ years ago: everybody was using it, and many with workloads under 100 GB that you could really throw onto one beefy machine. Take NoSQL databases and everyone rebuilding joins and transactional semantics until switching back to a traditional database. Or the hype around microservices, which is fading more and more.
There are not that many inventions which really last, and nobody is able to predict them very well. But being an early adopter of some FANG hype gets you somewhere.
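The "one beefy machine" point above is easy to illustrate: a map-reduce-style word count is just an in-memory aggregation once the data fits on a single box. A minimal sketch (`wordCount` is an illustrative name, not from any framework):

```javascript
// A "map-reduce" word count on one machine: for workloads well under the
// ~100 GB mark, a plain in-memory aggregation often beats a cluster.
function wordCount(lines) {
  const counts = new Map();
  for (const line of lines) {
    // "map" phase: split each line into lowercase tokens
    for (const word of line.toLowerCase().split(/\s+/).filter(Boolean)) {
      // "reduce" phase: accumulate a count per token
      counts.set(word, (counts.get(word) || 0) + 1);
    }
  }
  return counts;
}

wordCount(['the cat sat', 'the cat']).get('the'); // 2
```

No cluster, no shuffle, no job scheduler; the same shape of computation, minus the operational overhead.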
huhnmonster|5 years ago|reply
Would you agree or did I just look at the wrong Github accounts?
danielscrubs|5 years ago|reply
I hope we will see more proof assistants connected to more mainstream languages, but I wouldn't hold my breath for it. :)
giantg2|5 years ago|reply
I doubt I'll be any good or do anything with that knowledge though.