Hopefully 5 years is long enough for us to stop and take stock of the inevitable destination of the browser: an OS in a VM.
With any luck we might finally realise that we're not dealing with "documents" anymore, that HTML, JS, and CSS do not a good UI framework make, and that JS simply isn't good enough for what we need it to do and how we need it to do it.
ECMAScript 7 will be pretty much ready to implement and the standards committee will realise that they've run out of syntax for ECMAScript 8, forcing a rethink. Fingers crossed a neckbeard will arrive and point out that a Lisp would be ideal both for defining the "DOM" and for providing a base language for other languages to target, which can just (ok, "just") serialize their ASTs and pass them in.
Seriously: if, in 5 years, we're still perfecting the dominant user-facing runtime by tweaking an arbitrary XML spec, a language with only one kind of number, and a styling language so inept it almost requires auto-generation tools, well I might just kill myself.
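For what it's worth, the "one kind of number" jab is concrete: every JavaScript number is an IEEE-754 double, so a quick sketch (runnable in any JS engine) shows both the float rounding and the integer precision ceiling:

```javascript
// JavaScript's single Number type is an IEEE-754 double.
// Decimal fractions don't round-trip exactly:
console.log(0.1 + 0.2);            // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);    // false

// And there are no true integers: above 2^53, whole numbers lose precision.
console.log(Math.pow(2, 53));      // 9007199254740992
console.log(Math.pow(2, 53) + 1);  // 9007199254740992 (the +1 is silently lost)
```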
After doing web development for a little while, ending up loathing it, and then checking out iOS and enjoying it so far, I couldn't hope for this outcome more. I never realized how crippled browser-based web applications and the ancient, shitty tools used to create them are until I developed on a web-connected platform that uses a systems language directly for its applications. Pretty much anything you can do with a computer, you can do in an iOS/Android app, and it can still connect to the web, which lets you create rich experiences far more easily.

To make any modern web application for a browser, you have to use garbage like opinionated JS frameworks and a myriad of other crutches and hacky solutions, because the frontend tools are incredibly limited and were never designed for what is demanded of them today. Browsers also don't easily support connection protocols other than HTTP, which leads to a whole other host of hacky backend solutions for so-called "real-time" applications, when you could easily use something like UDP were you not limited by a browser.

The only downside to the iOS/Android platforms is that they are governed by the Apple/Google dictatorships (which is somewhat understandable, because it would be a disaster if anyone could easily distribute applications capable of executing malicious code on a user's device), and it is completely up to them what is allowed and what isn't, in contrast to the open nature of the web. I don't know how this could be remedied, given the undeniable trend of more and more consumer-facing web activity happening in mobile apps rather than browsers. Perhaps something like a completely FOSS device OS with a decentralized marketplace, where anyone with a server can host their application, similar to how it works with jailbroken/rooted phones?

This still wouldn't solve the problem of malicious code being executed on a user's device, but if you think about it, that isn't much different from some idiot downloading "free cursors" from a malware site and giving themselves a virus, which happens all the time today.
Five years is nowhere near long enough for the entire platform to be replaced with something completely different. Unless, of course, you mean the "OS" and VM that is already in our browsers; but it seems apparent you don't want that OS and VM, you seem to want something very, very different (and different in ways that nobody is really working on, as far as I know).
I'm less critical of JavaScript, but I've also got a much more conservative expectation of how much will change in any five-year period. Your estimate assumes more dramatic and rapid change to the web than at any point in its history (except maybe the initial introduction of the graphical browser, or the release of JavaScript itself), and an ecosystem with a few billion participants does not and cannot change as fast as one with a few thousand. And, of course, for something to be the way we all do things in five years, somebody would have to be doing it that way today on a small scale.
Please don't kill yourself, despite the fact that your dream definitely is not going to happen the way you've envisioned it. (In its place will be a gradual, but measurable, improvement on nearly every front in the systems we currently work with. It'll probably all turn out alright.)
I'm deathly afraid of that ever becoming true. An OS in a VM by default? Isn't that the end of all things nice and native? I know that's pretty much the state of affairs in the browser already but I do worry about how that would affect other parts.
Will system language become even less popular and "apps" even more popular? Will we completely live in the cloud by default? What will happen to our data in the long run?
What more functionality exactly are you trying to achieve with this "OS in a VM" concept you are referring to?
Would you say iPhone/Android/WP apps satisfy what you consider an "OS in a VM", or come close to it? The web and the iPhone app environment aren't that different anymore.
I think this is a ridiculous thing to say. Don't think of the web as a dev platform; think of it as OpenGL or any other graphics library. To create a game you use a graphics card that provides a primitive set of instructions, but those instructions become far more powerful with the help of libraries. The web is the same: it provides a primitive set of functions that can be expressed in a common way. They are still usable on their own, but much more useful as part of frameworks.
Web technologies are to UI what OpenGL is to gaming.
Same place we were five years ago: nobody cares what API you're programming to as long as your apps are good.
No, seriously. The iPhone SDK came out in 2008. Google Maps came out in 2005, ushering in the modern era of web apps. Mac OS X came out in 2001 and .NET came out in 2002, representing the major desktop platforms we know today.
The web as a platform is no different than any other platform in that it's just another platform with its own strengths and weaknesses. It just moves a whole lot slower than the rest, which is why we're still asking this question after this many years.
APIs and platforms come and go. Developers always have had and always will have choices about which ones to pick for developing their apps against. These choices have some impact on how easy it is to build various types of apps, but at the end of the day, the only thing that matters is how well your app serves the needs of those using it.
A related question might be: where will mobile apps be in 5 years?
Hopefully, the answer is that web standards will evolve to allow HTML+JS to do things currently only possible in mobile apps. This would mirror the original transition of desktop apps to web apps, and would have positive implications for the internet by bringing mobile users back to webpages (though I wouldn't count on it being done in 5 years).
No, I don't like yapping at my phone or PC, but my 6-year-old loves FireTV's voice search feature and uses Google voice search all the time on her iPad, even though she knows how to spell the words. As precision improves, I see my wife using it more too, for searches and dictation.
More support for device-/screen-hopping (state-in-the-cloud, continued-commerce)
Stick-computers
Affordable devices optimized for streaming screens/desktops/apps hosted in the cloud.
Hardware upgrades only in data centers.
Consumers investing instead in high bandwidth/low latency connections.
Web 3.0 == forgetting (ignoring/hiding)
More apps featuring some form of 'forgetting' (hiding older, unused, unimportant, noisy data).
More apps assisting in the reverse: new mechanisms to improve recall/precision of 'forgotten' information by 'priming' our searches with user input (sounds, locations, colors, images) and/or interactive feedback (hot/cold, before/after, similar/different, binary search)
Not just with voice control: I can imagine something like Siri being able to respond to requests like, "Please show me all of my videos taken while I was in Hawaii in 2014."
This also comes back to hands-free for driving, etc. I don't understand why a Tesla D can drive itself but we're still using blinkers and manual actions to change lanes...
I don't think things will change a great deal in 5 years. I would love to be wrong, but the progress of web standards and browser technology is painfully slow. The things at the leading edge of development are amazing, but there is an enormous gap between the first browser supporting something and being able to assume that most users will have support. There is also a very slow progression due to the rather annoying notion that everything must have a 'use case'. WebGL came along, and once it got into browsers the need for Typed Arrays became more evident; once people started making games and so forth, they wanted fullscreen options; once things were fullscreen, people wanted to use mouse movement rather than mouse pointer position so that FPS games could work properly. Each advance exposes a new deficiency, which gets fixed, but it takes another iteration of browser development and release.
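To make that churn concrete, here is a hypothetical sketch of the feature-detection dance this era forced on developers: each new capability (fullscreen, pointer lock) shipped behind vendor prefixes first, so code had to probe for whichever name the browser happened to expose. The helper name and the mocked element below are illustrative, not from any particular library:

```javascript
// Hypothetical helper: find whichever (possibly vendor-prefixed) method
// a browser actually exposes for a capability like fullscreen.
function findPrefixedMethod(obj, candidates) {
  for (var i = 0; i < candidates.length; i++) {
    if (typeof obj[candidates[i]] === 'function') return candidates[i];
  }
  return null; // capability not available in this browser at all
}

// Mocked canvas-like element standing in for a 2014-era WebKit browser:
var el = { webkitRequestFullscreen: function () {} };

findPrefixedMethod(el, [
  'requestFullscreen',        // the eventual standard name
  'webkitRequestFullscreen',  // Chrome/Safari
  'mozRequestFullScreen'      // Firefox (note the different capitalisation)
]);
// → 'webkitRequestFullscreen'
```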
I think the change is coming, it's just that the timeframe of 5 years is just too short.
I started a project a few years ago aiming to do Desktop style things using HTML+CSS+JS. When I started I was just mucking around and I expected something better to come along and I'd abandon the project, but I'm still plodding along ( https://www.youtube.com/watch?v=7namj7iy16Y ).
Along the way I came across a simple idea as a test of whether or not web apps are up to the job of desktop work: completely replace the functionality of notepad.exe (or the OS-specific equivalent) with a web app. The text-editing part is easily managed, but the test is not about what can be done, it is about what can't. What about opening all .txt files with it, editing and then saving the result to the desktop, or hacking on startup scripts when nothing else is available?
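The asymmetry the notepad test exposes can be sketched. In a browser of this era, "saving" means constructing a Blob and offering it as a download; the app never learns a path, can't reopen the same file, and can't overwrite it in place. The function name here is illustrative:

```javascript
// Hypothetical sketch: the closest a classic web app gets to "Save As".
// It can hand the user bytes, but gets no path back and can't save in place.
function makeTextDownload(text, filename) {
  var blob = new Blob([text], { type: 'text/plain' });
  // In a real browser you would then do:
  //   var a = document.createElement('a');
  //   a.href = URL.createObjectURL(blob);
  //   a.download = filename;
  //   a.click();
  return { blob: blob, filename: filename };
}

makeTextDownload('hello world', 'notes.txt'); // one-way: app -> user, never back
```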
Much of the behaviour of web apps assumes that "it's on the cloud". That is a deal breaker for a lot of people, especially since Snowden.
We're getting there (or at least somewhere) but we're not doing it terribly quickly.
Data is replicated, no item exists only in one place.
Data is shared, shareable and in formats that can be operated on both by the owner and by the user, the visitor, the renter, the spectator of the items in question.
Data is encrypted, watermarked, tagged, associated to immutable provenance histories.
Data is ephemeral, guaranteed to be forgotten, subject to erasure, intended to be unrecoverable once used.
The network is reliable. The network is unavailable. The network is on the side of the people. The network will stab you in the back when you least expect it.
Some programmers will insist on writing javascript to control medical devices. Some programmers will insist that casual games running in sandboxed environments be provably correct.
Someone will cobble together an AI and set it to rewriting enterprise Java apps; when interviewed after the disaster, the AI will claim that humanity "had it coming".
A software bug will start a small war. And a EULA clause will end another one.
You will be completely transparent and documented for everyone else to see. You will not have sufficient permissions to find out anything about anyone with any power over you.
Most applications will incorporate blockchain technology; but bitcoin will be the subject of jokes and nostalgic pop culture trivia questions.
It will be the best of times ( for some ), and the worst of times ( for others ).
Here's something a little out of left field, but worth considering: By adding support for Android apps, Google Chrome has the potential to completely dominate desktop and mobile development.
Using a simple hack [1] it is already possible to write a single Android app that runs on (a) Android tablets; (b) Mac computers; (c) Windows computers; (d) Chrome OS computers; (e) Nexus Player set-top boxes. If Google were to put its full weight behind this strategy then I think it would quickly become the default platform for desktop development and thereby destroy the Windows empire. It might even mean Android apps being developed for what would normally be the domain of web apps.
I don't remember which book it was (maybe "The Inmates Are Running the Asylum"), but there was an example of a dancing bear: it is amusing that the bear can dance at all, but it is by no means a good dance.

Also, how many users constantly switch between different platforms? If all I see is a lousy app on my platform of choice, I could not care less about its ability to run on other platforms.
Developers will view REST APIs with the same dismissive disdain reserved today for SOAP/COM/CORBA. In 5 years' time there'll be something much cooler, probably based on Git. (Git will live forever.)
The hot in-demand skill will be writing tight C code for asm.js-based single-page holographic apps.
We will commute in flying cars and poverty will be history.
Or maybe not - who knows? The best way to predict the future is to invent it. Nobody has a crystal ball, so it's really up to you.
> The best way to predict the future is to invent it. Nobody has a crystal ball, so it's really up to you.
In a sense you're right, it usually takes a few people doing something different, and if it's better, others will follow suit. However, by following current trends and looking at how past trends played out, I'm sure we can have a reasonable idea of what will decline and what will gain favor.
As the OP already noted, we can see a move towards web apps. Based on that, one might conclude that we'll all be using web browsers as our OS, and Chrome OS is a perfect example of that. However, we should also notice that demand for desktop apps didn't suddenly die out as web apps gained popularity. Desktop apps are still in demand, so the previous conclusion is too hasty. One conjecture I want to make based on current trends is that developers will start developing more for mobile platforms (phones and tablets) rather than the desktop, because that is where consumer demand and use are moving.
In both cases we can use current trends to strengthen or weaken hypotheses. It's not like the future happens in a crystal ball either, yesterday was another day that contributed to the future.
>>As the browser becomes home to more and more software that traditionally would be desktop only
Sorry I don't see that. All of my desktop apps are still desktop apps. On Windows I still use Office, VS, Adobe CS, Amira.
I don't see this happening any time soon due to legacy codebase as well as performance and ease of coding issues. For example, with great difficulty one could port Photoshop, perhaps using clever off-screen WebGL buffers to do the heavy computation, but why would you do that?
There are plenty of Photoshop-inspired apps that run in the browser already. None of them are on par with Photoshop obviously, and maybe none will be in 5 years but I don't think it's completely unrealistic.
There are a bunch of obvious advantages to webapps like no installation, settings are kept no matter which computer you are working from, works everywhere, etc. They may not outweigh the disadvantages in your view but it does seem to be the trend in the world.
The dream of "convergence" will reverse starting with Windows 10. Web and mobile will tend towards HTML5/JS; desktop applications such as browsers, games, IDEs, media production, CAD, ... will remain heavy C++ (or possibly Java/.NET) applications. In 5 years we will see no dramatic change in the landscape. People will realize that even moderately complex applications should never be browser based, and ideas such as browser-based word processors or drawing programs will be remembered as a fad.

Rust will ease the pain of heavy desktop development somewhat. C and C++ need to retire, but that won't happen overnight. It will take 30 years, not 5.

Web dev will be helped by ES6, or even better, AtScript/TypeScript. JS becomes the assembly of the web, and hopefully no one will have to actually write today's JS directly.
I see everything moving to a ChromeOS-like model, with HTML+JS and WebGL for graphics-heavy things. If not those technologies in particular, something equivalent.
Essentially the consumer OS will be reduced to being a platform for a browser (I use "browser" loosely here -- I consider iOS to be an app browser in this sense). The OS will disappear from visibility.
To make this work you'll need a lot more standardization on APIs and data formats (because in the future consumers won't touch "files" anymore). I also think that's coming.
Finally, I see the server/client sides being even more divorced from each other. Clients will become dumber (single-purpose) and servers will become smarter (multi-purpose), and both will be much more connected.
I've thought for a while that certain web technologies that Google is developing or promoting (e.g. WebGL, NaCl/PNaCl, V8, the Chrome app store, Chromebooks, etc.) have implied a vision of web apps being just as capable as desktop apps, perhaps even in categories such as video games that need raw access to the CPU/GPU. However, progress towards this vision seems to be going much more slowly than I would have expected, and desktop apps will no doubt continue to fill certain needs and niches even if a fully realized web platform is achieved.
Of course, mobile apps (and the consumer electronics revolution that has allowed many people to consider their phone as a primary computing device) are also competing in this space.
Well, reading the comments was interesting. A lot of "we need a faster horse" thinking and, sadly, a lot of "we don't care about the users" thinking too. By "we don't care about the users" I mean thinking purely from the developer's perspective, not from the "make it harder for the database, easier for the user" point of view. (And you can substitute "developer" for "database".)

For those hoping for the rise of the web, I'd suggest a few points to consider. The most important: what advantages does this kind of approach have compared to native, considered from the user's perspective?

Run the same app everywhere? OK, but the typical user does not switch platforms that often on the same kind of device. Switching between desktop and mobile is another matter, but there is an alternative to running the same thing everywhere, e.g. Apple's Continuity.

Ease of updates: just go to the same URL and you get the latest version. Well, OK, but what if I don't like the new version? How do I keep the old one? And how big an advantage is that (again, from the user's perspective) compared to app stores and auto-updating, which are already there?

I am afraid that five years from now we will see the same things we saw five years ago: half-baked features thrown into browsers to keep them "on par" with native APIs, and an endless search for THE framework. The trouble is that while native platforms have those frameworks at their foundation, web tech basically requires them to be bolted on…
I think about this a lot; part of my job is designing mobile apps for Ubuntu Touch that scale to tablet, desktop, and TV using the same code base (but different layouts). We are working towards convergence, and the native Qt/QML apps we build for mobile can already run on the Ubuntu desktop. I know other desktop/mobile platforms are also working towards the convergence goal; it will be interesting to see how it plays out. I can see Android desktop apps not being too far away, for instance.
5 years goes past pretty quickly; if I think of desktop or web apps 5 years ago, not that much has changed.
I have a lot of faith in browser engines continuing to make great progress, and the web as a platform has more developers and momentum than anything else, so I expect to see a lot of interesting things coming out as web-based SaaS software.
If I think of some of the more complex desktop apps I use (things like Photoshop, After Effects, 3DS Max, Logic Pro, Davinci Resolve) I don't think those will be replaced in 5 years by apps running solely on web technologies. I think something like Qt offers a better fit for these types of apps if cross-platform is a goal.
Having said that, apps of that complexity are a small fraction of what people actually use - as web closes in on the quality of user experience of native apps I think more and more traditional desktop and mobile apps will be built with web technologies.
And there will probably be some curve balls in terms of how people interact with computing devices like IoT devices and Oculus Rift/Magic Leap type wearables with completely new interfaces.
What is happening now, and will be even more prominent in the next 5 years, is that people will use web technologies like HTML, CSS and JavaScript to write one codebase that is transferable to any platform, browser or native.

Nowadays with Node.js you can convert your web app into an Android, iOS, Mac, or Windows app, etc. You have access to system hardware, and from the user's perspective it is no different from any native app, even though it is built with web technologies.

It is quite amazing: HTML, CSS and JavaScript standards are becoming a universal platform for developers. We are writing apps for the HTML5 architecture instead of for specific processor architectures. The browser is a compile target and JavaScript is becoming like bytecode.

I see a future in which we develop multiple new ways of coding apps with web technologies, completely different from the way we do now. We will have various languages and methodologies to do it with, but in the end it will all compile down to code a browser rendering engine can understand, and we can use this rendering engine to run the app outside the browser.

So basically I am saying that we can code apps in various ways that compile down to web technologies and use that single codebase to port to any platform, and the user experience will be indistinguishable from native apps coded directly for the platform.
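The "JavaScript as bytecode" idea can be shown with a toy: a tiny expression AST compiled to a JavaScript source string and then evaluated. Any language that can emit such strings can target the browser. The AST shape here is invented purely for illustration:

```javascript
// Toy compiler: a minimal expression AST -> JavaScript source string.
function compile(ast) {
  switch (ast.op) {
    case 'num': return String(ast.value);
    case 'add': return '(' + compile(ast.left) + ' + ' + compile(ast.right) + ')';
    case 'mul': return '(' + compile(ast.left) + ' * ' + compile(ast.right) + ')';
  }
}

// 3 * (2 + 4), expressed as an AST a hypothetical front-end language might emit:
var js = compile({
  op: 'mul',
  left:  { op: 'num', value: 3 },
  right: { op: 'add', left: { op: 'num', value: 2 }, right: { op: 'num', value: 4 } }
});
// js === "(3 * (2 + 4))", and eval(js) === 18
```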
IMHO, predicting the survival of software depends primarily on the following points:
a) Current Usage
b) Current set of users
c) Availability of internet
d) Kind of exposure to apps
Current Usage:
Current desktop apps are used by the majority for purposes such as documents, presentations, image editing, and watching videos. If these purposes are served fully, and flexibly enough to be adopted, then web apps will start dominating.
Current set of users:
If we go by the numbers, the majority of current desktop app users fall into the category of people who may not be aware that a browser could replace all their needs. It will take 5 years for web apps to expand, become popular enough, and reach this majority before desktop apps can go extinct.
Availability of internet:
Currently, desktop apps are predominantly used offline, since no internet connection is required, and desktop apps will survive in places where there is no connectivity. Web apps, which require an internet connection for every request and response, cannot reach places that lack one. Web apps will need to provide an offline environment similar to desktop apps before they can dominate. On the other hand, if internet connectivity reaches everyone the way the sun's light reaches every corner of the planet, web apps will burn down desktop apps.
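For reference, the offline mechanism browsers offered around this time was the HTML5 Application Cache (since deprecated in favour of Service Workers). A minimal, hypothetical manifest for a web app that should keep working offline looked like this:

```
CACHE MANIFEST
# v1 - files the browser should store for offline use (hypothetical app)

CACHE:
index.html
app.js
style.css

NETWORK:
*
```

The page opts in via `<html manifest="app.appcache">`; the listed files are then served from the local cache even with no connection.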
Kind of exposure to apps:
Currently, the majority of users sit in a cubicle and do work that is more suited to desktop apps. If the majority of users become more mobile, for example if all the personal PC boxes are replaced by tablets/iPads/smartphones, then web apps (and mobile apps) will need to replace desktop apps.
Apps will have won, as the browser still tries to catch up with the richness of desktop environments.

Everyone will be back to native application development, with REST/socket protocols to communicate with microservices on distributed servers.

HTML will be used just as a multimedia document format, a means of providing an interactive reading experience, while a group will still be pushing for HTML 6 as the real thing.

Oh, how I'd like to see sanity return to the web and this identity crisis end. I am sick of the constant search for the holy grail of a silver-bullet framework. Guys, how about admitting that HTML, CSS and JS are not really the best tools for the job of building apps?
It's an exciting future for web apps; I'm more interested to know who's going to take us there. It will require significant collaboration between the browser makers and the operating systems, and while sometimes they are the same people, the major players have a conflict of interest.

When the iPhone launched without an app store or SDK, Apple pushed for web apps. Yet [this article](https://developer.apple.com/library/safari/referencelibrary/...) hasn't been updated in over five years, and Apple has very little incentive to do so now. The Chrome web app store is a promising concept, but I'm not convinced it has great traction.
My fear is that as web apps gain access to native app features, standardization will go out the window as each operating system (or maybe browser at that point) competes for market share.
I hope we trend towards open, public APIs that make which interface you build or use just a personal choice that others don't have to live by.
One example is HN: they've recently released an API, and when it is full-featured, apps and alternative sites built on the API will be able to completely replace the HN website for anyone who wants them.
It's a platonic ideal, but the internet runs on ads, and that ruins everything.
An interface you don't control is an interface you can't advertise on. The vast majority of players will never open their data in a manner that hurts their ability to sell sponsorship, and this problem is entrenched in the fabric of the internet.
HN is a very, very rare exception in that it never was and never will be dependent on ad funding.
I really hate language snobbery.
Here's one without a GUI: http://bellard.org/jslinux/
What more functionality exactly are you trying to achieve with this "OS in a VM" concept you are referring to?
Would you say Iphone/Android/WP apps satisfy what you consider a "OS in a VM?" Or close to it? The web and and iPhone app environment arent that different anymore.
[+] [-] passfree|11 years ago|reply
Web technologies is for UI what opengl is for gaming.
[+] [-] ayrx|11 years ago|reply
[+] [-] puls|11 years ago|reply
No, seriously. The iPhone SDK came out in 2008. Google Maps came out in 2005, ushering in the modern era of web apps. Mac OS X came out in 2001 and .NET came out in 2002, representing the major desktop platforms we know today.
The web as a platform is no different than any other platform in that it's just another platform with its own strengths and weaknesses. It just moves a whole lot slower than the rest, which is why we're still asking this question after this many years.
APIs and platforms come and go. Developers always have had and always will have choices about which ones to pick for developing their apps against. These choices have some impact on how easy it is to build various types of apps, but at the end of the day, the only thing that matters is how well your app serves the needs of those using it.
[+] [-] hillis|11 years ago|reply
Hopefully, the answer is that web standards will evolve to allow HTML+JS to do things currently only possible in mobile apps. This would mirror the original transition of desktop apps to web apps, and would have positive implications for the internet by bringing mobile users back to webpages (though I wouldn't count on it being done in 5 years).
[+] [-] jordanpg|11 years ago|reply
[+] [-] notlisted|11 years ago|reply
No I don't like yapping at my phone or PC, but my 6yo kid loves FireTV's voice search feature and uses google voice search all the time on her iPad, even if she knows how to spell the words. As precision improves, I see my wife is using it more too, for searches and dictation.
More support for device-/screen-hopping (state-in-the-cloud, continued-commerce)
Stick-computers
Affordable devices optimized for streaming screens/desktops/apps hosted in the cloud. Hardware upgrades only in data centers. Consumers investing instead in high bandwidth/low latency connections.
Web 3.0 == forgetting (ignoring/hiding)
More apps featuring some form of 'forgetting' (hiding older, unused, unimportant, noisy data).
More apps assisting in the reverse: new mechanisms to improve recall/precision of 'forgotten' information by 'priming' our searches with user input (sounds, locations, colors, images) and/or interactive feedback (hot/cold, before/after, similar/different, binary search)
[+] [-] gravity13|11 years ago|reply
[+] [-] aosmith|11 years ago|reply
[+] [-] Lerc|11 years ago|reply
I think the change is coming, it's just that the timeframe of 5 years is just too short.
I started a project a few years ago aiming to do Desktop style things using HTML+CSS+JS. When I started I was just mucking around and I expected something better to come along and I'd abandon the project, but I'm still plodding along ( https://www.youtube.com/watch?v=7namj7iy16Y ).
Along the way I came across a simple idea as a test of whether or not WebApps are up to the job of desktop work: completely replace the functionality of notepad.exe (or the OS-specific equivalent) with a WebApp. The text-editing part is easily managed, but the test is not about what can be done but what can't. What about opening all .txt files with it, editing and then saving the result to the desktop, or hacking on startup scripts when nothing else is available?
Much of the behaviour of WebApps assumes that "It's on the Cloud". That is a deal-breaker for a lot of people. Especially since Snowden.
We're getting there (or at least somewhere) but we're not doing it terribly quickly.
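The gap in the notepad test above is concrete: a browser text editor can read a file the user explicitly picks and offer the edited text back as a download, but it cannot register itself as the handler for .txt files or write to an arbitrary desktop path. A minimal sketch of the parts that are possible (the element ids are hypothetical; the DOM wiring is skipped outside a browser):

```javascript
// Sketch of the "notepad replacement" test: what a webapp CAN do.
// It can read a user-chosen file and offer the result as a download,
// but it cannot open arbitrary files or save to a chosen desktop path.

// Pure helper, usable anywhere: make sure saved text ends with a newline.
function ensureTrailingNewline(text) {
  return text.endsWith("\n") ? text : text + "\n";
}

// Browser-only wiring (element ids "open", "editor", "save" are invented).
if (typeof document !== "undefined") {
  document.getElementById("open").addEventListener("change", (ev) => {
    const reader = new FileReader();
    reader.onload = () => {
      document.getElementById("editor").value = reader.result;
    };
    reader.readAsText(ev.target.files[0]); // only works on user-picked files
  });

  document.getElementById("save").addEventListener("click", () => {
    const text = ensureTrailingNewline(document.getElementById("editor").value);
    const url = URL.createObjectURL(new Blob([text], { type: "text/plain" }));
    const a = document.createElement("a");
    a.href = url;
    a.download = "notes.txt"; // lands in Downloads, not a path of our choosing
    a.click();
  });
}
```

Everything else on Lerc's list (file associations, startup scripts) has no equivalent hook at all, which is the point of the test.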
olefoo|11 years ago
Data is replicated, no item exists only in one place.
Data is shared, shareable and in formats that can be operated on both by the owner and by the user, the visitor, the renter, the spectator of the items in question.
Data is encrypted, watermarked, tagged, associated to immutable provenance histories.
Data is ephemeral, guaranteed to be forgotten, subject to erasure, intended to be unrecoverable once used.
The network is reliable. The network is unavailable. The network is on the side of the people. The network will stab you in the back when you least expect it.
Some programmers will insist on writing javascript to control medical devices. Some programmers will insist that casual games running in sandboxed environments be provably correct.
Someone will cobble together an AI and set it to rewriting enterprise java apps, when interviewed after the disaster the AI will claim that humanity "had it coming".
A software bug will start a small war. And a EULA clause will end another one.
You will be completely transparent and documented for everyone else to see. You will not have sufficient permissions to find out anything about anyone with any power over you.
Most applications will incorporate blockchain technology; but bitcoin will be the subject of jokes and nostalgic pop culture trivia questions.
It will be the best of times ( for some ), and the worst of times ( for others ).
MarkMc|11 years ago
Using a simple hack [1] it is already possible to write a single Android app that runs on (a) Android tablets; (b) Mac computers; (c) Windows computers; (d) Chrome OS computers; (e) Nexus Player set-top boxes. If Google were to put its full weight behind this strategy then I think it would quickly become the default platform for desktop development and thereby destroy the Windows empire. It might even mean Android apps being developed for what would normally be the domain of web apps.
[1] http://lifehacker.com/how-to-run-android-apps-inside-chrome-...
rimantas|11 years ago
Also, how many users constantly switch between different platforms? If all I see is a lousy app on my platform of choice, I couldn't care less about its capability to run on other platforms.
fineline|11 years ago
The hot in-demand skill will be writing tight C code for asm.js-based single-page holographic apps.
We will commute in flying cars and poverty will be history.
Or maybe not - who knows? The best way to predict the future is to invent it. Nobody has a crystal ball, so it's really up to you.
cooper12|11 years ago
In a sense you're right: it usually takes a few people doing something different, and if it's better, others will follow suit. However, by following current trends and looking at how past trends played out, I'm sure we can have a reasonable idea of what will decline and what will gain favor.
As the OP already noted, we can see a move towards web apps. Based on that, one might conclude that we'll all be using web browsers as our OS, and Chrome OS is a perfect example of that. However, we should also notice that demand for desktop apps didn't suddenly die out as web apps gained popularity. Desktop apps are still in demand, so the previous conclusion is too hasty. One conjecture I want to make based on current trends is that developers will start developing more for mobile platforms (phones and tablets) rather than the desktop, because this is where consumer demand and use are moving.
In both cases we can use current trends to strengthen or weaken hypotheses. It's not like the future happens in a crystal ball either; yesterday was another day that contributed to the future.
frozenport|11 years ago
Sorry, I don't see that. All of my desktop apps are still desktop apps. On Windows I still use Office, VS, Adobe CS, and Amira.
I don't see this happening any time soon due to legacy codebases as well as performance and ease-of-coding issues. For example, one could with great difficulty port Photoshop, perhaps using clever off-screen WebGL buffers to do the heavy computation, but why would you do that?
kristiandupont|11 years ago
There are a bunch of obvious advantages to webapps: no installation, settings that are kept no matter which computer you are working from, availability everywhere, etc. They may not outweigh the disadvantages in your view, but it does seem to be the trend in the world.
alkonaut|11 years ago
Rust will ease the pain of heavy desktop development somewhat. C and C++ need to retire, but that won't happen overnight. It will take 30 years, not 5.
Web dev will be helped by ES6, or even better, AtScript/TypeScript. JS becomes the assembly of the web, and hopefully no one has to actually write today's JS directly.
variables|11 years ago
Essentially the consumer OS will be reduced to being a platform for a browser (I use "browser" loosely here -- I consider iOS to be an app browser in this sense). The OS will disappear from visibility.
To make this work you'll need a lot more standardization on APIs and data formats (because in the future consumers won't touch "files" anymore). I also think that's coming.
Finally, I see the server/client sides being even more divorced from each other. Clients will become dumber (single-purpose) and servers will become smarter (multi-purpose), and both will be much more connected.
simmons|11 years ago
Of course, mobile apps (and the consumer electronics revolution that has allowed many people to consider their phone as a primary computing device) are also competing in this space.
rimantas|11 years ago
Ease of updates: just go to the same URL and you get the new version. Well, ok, but what if I don't like the new version? How do I keep the old one? And how big an advantage is that (again, from the user's perspective) compared to app stores and auto-updating, which are already there?
I am afraid that five years from now we will see the same thing we saw five years ago: half-baked features thrown into browsers to keep them "on par" with native APIs, and an endless search for THE framework. The trouble is that while native platforms have those frameworks at their foundation, web tech basically requires them to be bolted on…
dharma1|11 years ago
5 years goes past pretty quickly; if I think of desktop or web apps 5 years ago, not that much has changed.
I have a lot of faith in browser engines continuing to make great progress, and the web as a platform has more developers and momentum than anything else, so I expect to see a lot of interesting things coming out as web-based SaaS software.
If I think of some of the more complex desktop apps I use (things like Photoshop, After Effects, 3DS Max, Logic Pro, Davinci Resolve) I don't think those will be replaced in 5 years by apps running solely on web technologies. I think something like Qt offers a better fit for these types of apps if cross-platform is a goal.
Having said that, apps of that complexity are a small fraction of what people actually use - as web closes in on the quality of user experience of native apps I think more and more traditional desktop and mobile apps will be built with web technologies.
And there will probably be some curve balls in terms of how people interact with computing devices like IoT devices and Oculus Rift/Magic Leap type wearables with completely new interfaces.
sferoze|11 years ago
Nowadays, with Node.js, you can convert your web app to an Android, iOS, Mac, or Windows app. You have access to system hardware, and from the user's perspective it is no different from any native app, even though it is built with web technologies.
It is quite amazing: HTML, CSS, and JavaScript standards are becoming a universal platform for developers. We are writing apps for the HTML5 architecture rather than for specific processor architectures. The browser is a compile target, and JavaScript is becoming like bytecode.
I see a future in which we develop multiple new ways of coding apps using web technologies, completely different from the way we do now. We will have various languages and methodologies to do it with, but in the end it will all compile down to code a browser rendering engine can understand. And we can use this rendering engine to run the app outside the browser.
So basically I am saying we can code apps in various ways that compile down to web technologies and use this single codebase to port to any platform. And the user experience will be indistinguishable from native apps coded directly for the platform.
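The "compile target" idea above can be shown in miniature: any language that can serialize an expression tree can emit JavaScript source and hand it to the engine to run. A toy sketch (the AST shape here is invented purely for illustration):

```javascript
// Toy illustration of "JS as bytecode": compile a tiny expression AST
// down to JavaScript source text, then let the engine execute it.
// The node shapes ("num", "add", "mul") are invented for this sketch.

function compile(node) {
  switch (node.type) {
    case "num":
      return String(node.value);
    case "add":
      return `(${compile(node.left)} + ${compile(node.right)})`;
    case "mul":
      return `(${compile(node.left)} * ${compile(node.right)})`;
    default:
      throw new Error(`unknown node type: ${node.type}`);
  }
}

// (2 + 3) * 4, expressed as a serialized tree a front-end language
// might hand over...
const ast = {
  type: "mul",
  left: {
    type: "add",
    left: { type: "num", value: 2 },
    right: { type: "num", value: 3 },
  },
  right: { type: "num", value: 4 },
};

// ...compiled to JS text, then run by the JS engine itself.
const js = compile(ast);
const result = new Function(`return ${js};`)();
console.log(js, "=", result); // ((2 + 3) * 4) = 20
```

Real compile-to-JS toolchains (CoffeeScript, TypeScript, Emscripten) are elaborations of this same loop: front-end syntax in, JavaScript text out, browser engine as the "CPU".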
gansai|11 years ago
Current usage: desktop apps today are used by the majority for purposes such as documents, presentations, image editing, and watching videos. If web apps serve these purposes fully and flexibly enough, they will start dominating.
Current set of users: if we go by the numbers, the majority of current desktop-app users may not be aware that a browser could replace all their needs. It will take 5 years for web apps to expand, become popular enough, and reach this majority before desktop apps go extinct.
Availability of internet: desktop apps are predominantly used offline, since no internet connection is required, and wherever connectivity is absent, desktop apps will survive. Web apps, which require internet connectivity for every request and response, will not reach such places. Web apps will need to provide an offline environment similar to desktop apps before they can dominate. Otherwise, if internet connections become as universal as sunlight, web apps will burn down desktop apps.
Kind of exposure to apps: currently, the majority of users sit in a cubicle and perform work, which is more suited to desktop apps. If users become more mobile (for example, if all the personal PC boxes become tablets, iPads, or smartphones), then web apps (mobile apps) will need to replace desktop apps.
pjmlp|11 years ago
Everyone will be back to native application development, with REST/socket protocols to communicate with microservices on distributed servers.
HTML will be used just as a multimedia document format, a means of providing an interactive reading experience, while a group pushes for HTML 6 as the real thing.
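The architecture sketched above (thin native clients speaking REST to small services) reduces each service to little more than a request router over a data store. A minimal sketch, with routes and payloads invented for illustration:

```javascript
// Minimal sketch of one microservice's request router, as in the
// "native clients + REST microservices" picture. The /users route and
// payloads are invented; a real service would sit behind an HTTP
// server and serve JSON to native apps on any platform.

const users = { "42": { id: "42", name: "Ada" } }; // stand-in data store

function route(method, path) {
  const userMatch = path.match(/^\/users\/(\w+)$/);
  if (method === "GET" && userMatch) {
    const user = users[userMatch[1]];
    return user
      ? { status: 200, body: JSON.stringify(user) }       // found
      : { status: 404, body: JSON.stringify({ error: "not found" }) };
  }
  return { status: 405, body: JSON.stringify({ error: "method not allowed" }) };
}

console.log(route("GET", "/users/42")); // 200 with Ada's record
console.log(route("GET", "/users/7"));  // 404
```

Because the whole contract is "verb + path in, status + JSON out", the same service backs an iOS app, an Android app, and a desktop client without caring which one is calling.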
ivyirwin|11 years ago
When the iPhone launched without an app store or SDK, Apple pushed for web apps. Yet [this article](https://developer.apple.com/library/safari/referencelibrary/...) hasn't been updated in over five years, and Apple has very little incentive to do so now. The Chrome web app store is a promising concept, but I'm not convinced it has great traction.
My fear is that as web apps gain access to native app features, standardization will go out the window as each operating system (or maybe browser, at that point) competes for market share.
benologist|11 years ago
One example is HN: they've recently released an API, and when that is full-featured, apps and alternative sites built on the API will be able to completely replace the HN website for anyone who wants them.
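The HN API mentioned above serves items as plain JSON over HTTP (the URL scheme below follows the public v0 API on Firebase). A client sketch, with the sample payload hardcoded so it runs without network access:

```javascript
// Sketch of a client for the HN API: an alternative frontend only
// needs to fetch item JSON and render it. The sample payload below is
// hardcoded (shaped like a real "story" item) so no network is needed.

function itemUrl(id) {
  return `https://hacker-news.firebaseio.com/v0/item/${id}.json`;
}

// Render one story line the way an alternative HN site might.
function renderItem(item) {
  return `${item.title} (${item.score} points by ${item.by})`;
}

// Sample response in the shape a real item fetch would return.
const sample = { id: 1, type: "story", by: "pg", score: 57, title: "Y Combinator" };

console.log(itemUrl(sample.id));
console.log(renderItem(sample));
```

A real replacement site would swap the hardcoded sample for an HTTP GET of `itemUrl(id)`, which is exactly the kind of full-website substitution the comment describes.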
variables|11 years ago
An interface you don't control is an interface you can't advertise on. The vast majority of players will never open their data in a manner that hurts their ability to sell sponsorship, and this problem is entrenched in the fabric of the internet.
HN is a very, very rare exception in that it never was and never will be dependent on ad funding.