top | item 31827387

Deno raises $21M

608 points | 0xedb | 3 years ago | deno.com | reply

382 comments

[+] nickjj|3 years ago|reply
From the post:

> For example, it is integrated with GitHub in such a way that on every push it will provision a new server running specifically that code, deployed to the edge, worldwide, and persisted permanently. Want to access the code your app was running a month ago at commit f7c5e19? It will be served up instantly at a moment's notice. It costs you nothing to have that bit of JavaScript responding to requests indefinitely.

These things sound great and almost a dream come true, but how realistic is it for most web apps? As soon as your application uses a SQL database and you have database migrations, you're out of luck: a commit from 2 months ago might expect a different database schema than the current version. And while it's common to migrate in backwards-compatible ways, the backwards compatibility is usually only temporary, lasting until you finish migrating from A to B.

Long story short, this sounds cool, but in practice it's really only applicable to static sites, or to dynamic sites where you plan to keep your database and code base backwards compatible from day 1 to the current day (which I've never seen done in any app developed over my last ~20 years of freelancing for many different companies). The post mentions "The open source Deno runtime shows how clean and productive a modern, batteries-included, programming environment can be", so it sounds like they expect you'll be running database-backed apps, not only static sites.

[+] meibo|3 years ago|reply
I hope that app isn't actually running on a server somewhere for each commit indefinitely, and that it's more so intended for serverless setups, like the Netlify edge functions they mentioned...

We could do with some more consciousness about energy & compute resources in this industry; as decoupled as it may be from the real world, clicking the button to deploy an EC2 instance somewhere does use real power and contributes to hardware wear.

[+] nilsbunger|3 years ago|reply
I'd spawn a new DB for that instance too. It can still go a long way to give you a look at how your application worked in the past.
[+] irrational|3 years ago|reply
There are ways to version your database schema. Then you just need to make sure your code version is tied to your database version.
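The version-tying idea above can be sketched in a few lines of TypeScript (all names here are hypothetical, not from any real migration library): each build of the code declares the schema version it expects, and startup either runs pending migrations in order or refuses to boot on a mismatch.

```typescript
// Sketch of tying code versions to schema versions (hypothetical names).
// The database records its current schema version; each migration bumps it.

interface Db { schemaVersion: number }                  // stand-in for a real connection

type Migration = { toVersion: number; up: (db: Db) => void };

const EXPECTED_SCHEMA_VERSION = 3;                      // what this build of the code expects

function migrateTo(db: Db, migrations: Migration[], target: number): void {
  const pending = migrations
    .filter((m) => m.toVersion > db.schemaVersion && m.toVersion <= target)
    .sort((a, b) => a.toVersion - b.toVersion);
  for (const m of pending) {
    m.up(db);
    db.schemaVersion = m.toVersion;                     // record each step so restarts are safe
  }
}

function assertSchema(db: Db): void {
  if (db.schemaVersion !== EXPECTED_SCHEMA_VERSION) {
    throw new Error(
      `code expects schema v${EXPECTED_SCHEMA_VERSION}, db is at v${db.schemaVersion}`,
    );
  }
}
```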
[+] lmm|3 years ago|reply
> The post mentions "The open source Deno runtime shows how clean and productive a modern, batteries-included, programming environment can be" so it sounds like they expect you'll be running database backed apps and not only static sites.

It suggests you'll be running a datastore, not necessarily an SQL database (the most overrated technology in existence IMO, especially for web apps where essentially none of its strong points are relevant). Storing old data as-is and migrating on read is definitely doable, and you can keep backwards compatibility to day 1 that way relatively easily. I worked on a system much like the one from "An oral history of Bank Python" that did exactly that, and had been doing so on a large scale for around a decade. Having a better-integrated datastore that can present multiple views of the same data is another way to achieve that, if you want to keep the migration out of the "application" code.
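The migrate-on-read approach described above can be sketched like this (TypeScript; field names and version steps are made up for illustration): each stored record carries the schema version it was written under, and upgrade functions are applied lazily when it is loaded, so old data never has to be rewritten in place.

```typescript
// Migrate-on-read sketch (all names hypothetical): records are stored as-is
// and upgraded step by step when read, preserving day-1 compatibility.

type Rec = { v: number; [key: string]: unknown };

// One upgrade function per version step, applied in order on read.
const upgrades: Record<number, (r: Rec) => Rec> = {
  1: (r) => ({ ...r, fullName: r.name, v: 2 }),    // v1 -> v2: rename `name`
  2: (r) => ({ ...r, tags: r.tags ?? [], v: 3 }),  // v2 -> v3: add `tags` default
};

const CURRENT_VERSION = 3;

function readRecord(raw: Rec): Rec {
  let rec = raw;
  while (rec.v < CURRENT_VERSION) rec = upgrades[rec.v](rec);
  return rec;
}
```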

[+] potamic|3 years ago|reply
Interesting to see how this will play out. It's an ambitious goal to consolidate the client-side and server-side JavaScript ecosystems, which are quite fragmented today. On the other hand, this may only increase fragmentation further by introducing another target to develop for (wait for transpilers that can automagically convert between Deno and Node code). I will always look at JavaScript as this problem kid that cannot get its shit together in life, perpetually chasing the romance of utopia.
[+] jitl|3 years ago|reply
There is already a tool for building an npm package from a Deno package, called dnt (Deno to Node). It's maintained by some Deno team members. Here's a blog post they wrote about porting "Oak" (a Deno HTTP server framework) to Node: https://deno.com/blog/dnt-oak

The nice thing about Deno, conceptually, is that it's much more similar to the browser platform than Node is. It uses ESM only, has things like `fetch` built in by default, and generally follows browser standards around interfaces like Request, Response, etc. Instead of needing a complicated build process to make Node code work in the browser, now we have a complicated build process to make Deno/browser code work on Node. ¯\_(ツ)_/¯
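To make the "browser standards" point concrete, here is a handler written only against web-standard APIs (Request, Response, URL) — a sketch, not code from the article — that could run unchanged on Deno, Cloudflare Workers, or in a service worker; only the line that wires it up differs per runtime.

```typescript
// A request handler using only web-standard APIs, so it is portable across
// runtimes that implement the browser interfaces (Deno, Workers, browsers).
function handler(req: Request): Response {
  const name = new URL(req.url).searchParams.get("name") ?? "world";
  return new Response(JSON.stringify({ greeting: `hello, ${name}` }), {
    headers: { "content-type": "application/json" },
  });
}

// Deno wiring (one line):    Deno.serve(handler);
// Workers wiring (roughly):  export default { fetch: handler };
```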

[+] ascorbic|3 years ago|reply
Deno code is really close to browser code. If you have ESM code that runs in the browser and doesn't need access to the DOM then it's a good bet it'll run on Deno, and vice versa. They use web standards for most things, and anything proprietary is put on the Deno global object. For standards that need adapting to work outside the browser, they're working with Cloudflare and others on WinterCG, which is defining a common baseline for these non-browser runtimes.
[+] Cthulhu_|3 years ago|reply
I do think Deno is a step in the right direction, by reducing the need for transpiler steps (e.g. TS to JS) and embracing JS standards instead of building their own (which is what NodeJS did, but this was in a time when JS had no standard for dependencies or per-file isolation).
[+] danielvaughn|3 years ago|reply
It's what I kind of love about it, though - the JS ecosystem is Neverland.
[+] rising-sky|3 years ago|reply
It's a battle of ideals: you could have a high-entropy ecosystem that's constantly evolving and perhaps "appears" unstable, or an ecosystem that's "gotten its shit together" and probably trends toward stagnation and apathy.
[+] vcryan|3 years ago|reply
Deno overstates the problem it is intended to solve because its founders needed to justify achieving funding by being developer-famous.

Let's say a company were to adopt this tech over Node, well, it seems like it would be slightly better, but probably not much of a game-changer.

I'll leave it to y'all to talk about what tech is truly interesting as I don't want to seem ideological/biased, I just don't see how Deno is particularly notable.

[+] Androider|3 years ago|reply
It's usually not enough to be a bit better than what you're trying to displace, you have to be 10X better than the status quo to have any real impact. For example, SVN tried to be a better CVS, while Git came out of left field and destroyed the competition by being 10X better.

In that respect, Deno reminds me of the once-hyped Meteor.js. Meteor.js also thought that funding could be the answer, but it wasn't. They're both clever and great for demos, but not sufficiently so to overcome the sheer inertia of Node+npm et al. When something truly 10X better arrives, it will be quite apparent. Just like how React spawned a new generation of frameworks: nothing has unseated it yet because all the competition is React-like and not 10X better.

[+] erikpukinskis|3 years ago|reply
> I just don't see how Deno is particularly notable.

I think the place they can really sell me is:

- source maps

- debugging

It’s an absolute PITA to get those two things working across a JavaScript stack. There are so many runtime contexts to deal with:

1) the browser

2) your API server

3) your frontend test env

4) your API server test env

5) browser and backend for your end to end/integration tests

6) pre-packaged code from secondary repos you are importing into all of the above

7) All of the above in CI

8) Special production builds of much of the above

It’s truly a nightmare. I’ve been trying to set up a fresh JavaScript full stack and it’s so much work. Every step of the way I need to take days off to do deep research into how to set this stuff up.

And it starts to make sense why no company I’ve ever worked at had all of that stuff working. You just deal with wrong stack traces and use console.log instead of a debugger in the places where it doesn’t work.

If Deno can provide all of those runtimes in an integrated way, with debugging and source maps working automatically that’s a total game changer.

And I’m honestly not sure who else could do it. Maybe like Next/Nuxt and all them? But do those projects handle build/packaging across multiple repos? I don’t think so…

Deno can nail that because they own packaging, and they can just skip the whole build/sourcemap step entirely and just distribute .ts files.

[+] ignoramous|3 years ago|reply
TFA goes: Try Deno Deploy - you will be surprised at the speed and simplicity.

Deno Deploy (and Cloudflare Workers) is a big deal. At least for a small tech shop like ours, I've come to find it useful for >50% of the solutions we have to implement. Its simplicity and cost-effectiveness remind me of S3 back when it launched: 5 APIs and pay-what-you-use billing. Sure, right now Deno Deploy's capabilities are limited, but there's nothing stopping them from building a platform around it as they go along, and now they've got $21M reasons to keep at it.

I see parallels to Zeit/Vercel's meteoric rise (no pun intended) in Deno.

[+] alephnan|3 years ago|reply
> its founders needed to justify achieving funding by being developer-famous

I agree here, but

> Let's say a company were to adopt this tech over Node, well, it seems like it would be slightly better, but probably not much of a game-changer.

If I'm starting a new project, Deno will be compelling if it means zero configuration with sufficiently sane defaults. It's like Rails vs Ruby. With Node.js, you can pick, choose, and configure things like TypeScript, but then you have to manage configuration files for TypeScript, linting, and all these things not relevant to the app code.

[+] pier25|3 years ago|reply
Maybe not the tech, but consider the vision.

Deno is working on the runtime, the cloud infrastructure, the DX, and a framework (Fresh). Nobody else is doing all of that, AFAIK. I think this is where the value lies.

[+] intothemild|3 years ago|reply
> its founders needed to justify achieving funding by being developer-famous.

If I remember my Javascript-of-the-week-drama correctly, didn't Deno become a thing because a couple node/npm devs were upset that there was someone in the core node/npm team who did a bad thing?

Am I remembering this right?

[+] jaredcwhite|3 years ago|reply
The rise of JavaScript-only clouds intended as "de facto" solutions for web development scares the hell out of me. Monocultures are dangerous. At least in the early Node era, it sat alongside all the other flavors of server infra out there, with a near-infinite variety of languages, platforms, and OSes. Now we're being told that the future of web development is…Deno? That's it? One tool? One language? One platform?

Not the web I intend to build and participate in, I can tell you that right now.

[+] spencerchubb|3 years ago|reply
Deno is only possible because V8 is so hyper-optimized. If you think about it, V8 has to compile/interpret and then run JavaScript code on the fly after downloading it in the browser. That's how good V8 is. That puts JavaScript in a unique position to enable something like Deno.

So if there was a compiler/interpreter for another language that was close to being as good as V8, then something like this could exist for other languages.

Also, it looks like Wasm works on Deno, so that opens the door to some other languages.

[+] madeofpalk|3 years ago|reply
Granted, I don't pay a lot of attention to it, but I thought Docker(files) ended up being the de facto standard, not JavaScript?

I'm sure Deno will tell you the future is Deno. You don't have to believe them.

[+] ricardobeat|3 years ago|reply
It may not be the case right now, but I'm pretty sure their platform will eventually run anything that compiles to WASM. This is just the first step.
[+] butterfly771|3 years ago|reply
History has demonstrated first-mover advantage. Many unreasonable things in the history of human evolution have been preserved because they did not affect survival. The best may not be the most widely used; ecosystems that are large enough find it easier to survive to the end. The future belongs to the survivors, and JS is like that.
[+] Cthulhu_|3 years ago|reply
It's less of a problem, I think, than you think it is. In the enterprise world, Java is / was a similar de-facto standard because they paid good money for Java enterprise servers, Oracle databases, trainings, frameworks, the works.

I'd rather work in a monoculture than a cowboy pick-whatever-you-want shop; it's not a dichotomy, sure, but I'm wary of the latter for business continuity. You run into scaling and talent acquisition issues. Bus factor. Etc. If you as a company can say "We need a Deno developer" instead of "We need a full-stack Javascript/NodeJS/Scala/Java/Go/Rust/Erlang developer" (just to name a random array of languages), you can hire and train a lot better.

[+] TazeTSchnitzel|3 years ago|reply
> Early in cloud computing, virtual machines were the compute abstraction […]

This is funny to me because serverless sounds to me like the return of PHP (etc) shared hosting. What's old is new again?

[+] vlunkr|3 years ago|reply
This question is going to make me sound like a jerk, but why do you want to write your back-end in JS? Deno looks like a great improvement over node.js, but I don't feel compelled to use it. It seems like people jumped to node based on some performance promises that didn't really pay off (IMO). And since then, we have newer options like Rust, Go, and Elixir as performant back-end options, and even older choices like Ruby and Python have continued to improve.

Seems like the standard arguments would be that developers already know JS, and that you can share code with the browser. I don't find these highly compelling.

EDIT: I haven't learned TypeScript yet; based on the replies, it seems like that could be a good reason to choose it. It seems like a nice middle ground between typical scripting and compiled languages.

[+] password4321|3 years ago|reply
+ People get paid, hopefully a few even get big $.

- Investors want unicorn returns.

Good luck!

[+] caust1c|3 years ago|reply
The color changing on this page gives me a migraine:

https://deno.com/deploy

I like Deno in principle, but I'd love to see how Slack, GitHub, and Netlify are using it.

[+] eyberg|3 years ago|reply
I don't know if marketing was involved with using the term 'isolate' or not but if they are isolates as described by companies such as Cloudflare and Google, it might help to speak a bit more about the actual implementation at the infrastructure level.

Isolates are a really interesting approach to the inherent lack of threads in scripting languages, as most of them are inherently single-thread/single-process. If you have a 2000-line Ruby class named 'Dog', you can easily overwrite it with the number 42. This is awesome on one hand, but it makes scaling the VM through threads too difficult: threads would share the heap, and then you have to put mutexes on everything, removing any performance gain you would normally have gotten.

Instead, the end user has to pre-fork a ton of app VMs, each with its own large memory consumption, its own network connections, etc., and stick them behind a load balancer, which is not ideal compared to their compiled, statically typed cousins. Frankly, I just don't see the future allowing these languages as we continuously march towards larger and larger core-count systems.

I'd personally like to see more of the scripting languages adopt this construct, as it addresses a really hard problem these types of languages have to deal with and makes scaling them a lot easier. To this note: if you are working on this in any of the scripting languages, please let me know, because it's something I'd like to help push forward.

Having said that, they should never be considered as a multi-tenant (N customers) isolation 'security' barrier. They are there for scaling not security.
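The "anything can be overwritten" property holds in JavaScript too, which is part of why isolates fit it so well. A TypeScript sketch of the Ruby `Dog` example from the comment above:

```typescript
// JS/TS analogue of the Ruby example: a class is just a mutable binding,
// and nothing stops later code from replacing it wholesale, so the VM
// cannot assume a class's shape is stable the way a static language can.
let Dog: unknown = class {
  bark(): string {
    return "woof";
  }
};

Dog = 42; // perfectly legal; any cached assumption about Dog is now invalid
```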

[+] kirankp89|3 years ago|reply
I’ve never been interested in JS as I mostly work in C++ but the ease of install/use of Deno over Node made me actually want to try it. It’s nice to have access to a bunch of web tech with a single binary. Very excited to see where this project goes!
[+] ebingdom|3 years ago|reply
> Cold start (time to first response, ms) O(100) O(1000) O(10000)

Ugh, that's not how big O notation works.

[+] dan-robertson|3 years ago|reply
On the other hand, it is how big O is often used informally in speech. I would probably have written e.g. ~100 instead, but I easily understood what they meant.
[+] fwg|3 years ago|reply
Yes it is.

Big-O means that, given an arbitrary function of some complexity, it is bounded from above by this other function, i.e. that other function is always larger than our arbitrary function.

f(n) \in O(n^2) means n^2 (ignoring constant factors) is eventually always larger than f(n). If you have no polynomial terms in your O(g), then you only state the constant factor, as in O(1).

So saying Cold_start(service) \in O(100 ms) is effectively saying the cold start will always be below 100 ms (up to a constant factor). It makes sense not to say they are all O(1), although strictly they are, as the interesting bit is the difference in magnitude of the constants.
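For reference, the formal definition both comments are circling, with C the absorbed constant:

```latex
f \in O(g) \iff \exists\, C > 0,\ n_0 : \; |f(n)| \le C\,|g(n)| \quad \text{for all } n \ge n_0
```

Since constants are absorbed into C, O(100), O(1000), and O(10000) are formally the same class, which is why the table's usage only works as informal "order of magnitude" shorthand.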

[+] mmastrac|3 years ago|reply
I played with an early version of Deno a few years back and it was already way more comfortable to use than node. It's a real counterexample to second-system syndrome.

The only reason I didn't continue was a lack of ARM support.

[+] blobbers|3 years ago|reply
It will be interesting to see if Deno can provide the level of productivity improvement node.js delivered (was supposed to deliver? continues to deliver?).

It's one thing for it to claim supremacy over Node, but can it attract the TJ Holowaychuks of the world and truly generate a full ecosystem?

[+] chrismsimpson|3 years ago|reply
Interesting raise. I leave the front marketing page of Deno Deploy open in my browser for a few moments and I get the ubiquitous "This webpage is using significant energy. Closing it may improve the responsiveness of your Mac."

Says it all about the state of the JavaScript ecosystem really.

[+] jhgg|3 years ago|reply
Deno deploy seems cool and all, but I haven't seen any great rationale for using their service over say Cloudflare Workers.
[+] kjksf|3 years ago|reply
As someone who has dabbled with both Workers and Deno Deploy: Deno Deploy actually works.

I was very excited about the idea of Workers, but their tooling is (or at least was when I last tried it) abysmal, buggy, and hard to understand. To this day I don't know how to set up a basic dev vs. production split in wrangler.toml. Local debugging was buggy for naked domains (I even found a GitHub issue for it that languished for months with no bugfix or clear explanation/workaround) and very slow.

Great idea killed by poor dev tooling.

Deno Deploy is the opposite of that: deploys are instant and it's obvious how to deploy. You can develop and test locally.

Cloudflare released Wrangler v2 (which dropped Rust for Node), and maybe it's better now, but the one experience I had with Wrangler v2 was trying to deploy a small static website (Pages), and it failed due to their backend throwing 50x errors.

[+] tick_tock_tick|3 years ago|reply
Yeah, unless I'm missing something, isn't this just a standalone company offering roughly what Workers offers?
[+] mattlondon|3 years ago|reply
You could say the same about AWS, Azure, and GCP.

Competition is good.

[+] pier25|3 years ago|reply
Workers are not meant as a generalist runtime.
[+] Zababa|3 years ago|reply
> Cold start (time to first response, ms) O(100) O(1000) O(10000)

I think ~100 ~1000 ~10000 would be clearer than using big O notation, since this has nothing to do with the asymptotic growth of functions.

[+] pcj-github|3 years ago|reply
The comments are a bit negative. I for one think they are onto something here. Surprised it's only $21M; I would have expected them, in these market conditions, to beef up more for the next 2-3 years.
[+] sntran|3 years ago|reply
I find Deno very interesting. It's written in Rust (by Mozilla), executing TypeScript (by Microsoft) on the V8 engine (by Google), and its name is a sorted version of "node".

That aside, I have been very productive with Deno. Web standards are going in the right direction, and Deno makes using them easy. The Request/Response model with streams makes a lot of sense and provides lots of ways to optimize.

I understand performance is not the best compared to Elixir or Rust, but the ability to quickly download Deno, run a web server, import modules through URLs, start hacking and testing, and then bundle into a cross-platform executable is a life-saver. No installation step, no build tool in between.
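The Request/Response-with-streams point can be made concrete with a small sketch (web-standard APIs only; the handler itself is made up): the body is produced chunk by chunk through a ReadableStream, so large or slow payloads never have to be buffered in full before responding.

```typescript
// Build a streamed Response from web-standard pieces (ReadableStream,
// TextEncoder). Each chunk is enqueued as bytes; the consumer can start
// reading before the whole body exists.
function streamHandler(): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of ["hello", ", ", "stream"]) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(body, { headers: { "content-type": "text/plain" } });
}
```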

[+] pseudosavant|3 years ago|reply
As a Deno fan I was surprised to learn that the free tier for Deno Deploy includes 100k requests per day and 100GB of bandwidth monthly. I know I'll be trying it out now.
[+] LAC-Tech|3 years ago|reply
Does this mean we can finally get a REPL where a file can be loaded, modified, then reloaded, without having to restart the whole thing?

Seriously, my biggest pet peeve with both Deno and Node.js. In every other REPL I've used this is basic functionality. When I talk about this to JS people they look at me like I'm from Mars.
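There is a partial workaround, though not a true reload: both Node and Deno key the ES module cache on the full URL, so importing the same file with a fresh query string re-evaluates it (old copies stay cached, so any state in them leaks). The scratch-module file name below is made up for the demonstration.

```typescript
// Re-evaluate a module by busting the ESM cache with a query string.
// Works the same way interactively in a Deno or Node REPL.
import { unlinkSync, writeFileSync } from "node:fs";
import { pathToFileURL } from "node:url";

const path = "./_scratch_mod.mjs"; // hypothetical scratch module

writeFileSync(path, "export const answer = 1;");
const v1 = await import(`${pathToFileURL(path)}?v=1`);

writeFileSync(path, "export const answer = 2;");       // "edit" the file
const v2 = await import(`${pathToFileURL(path)}?v=2`); // fresh evaluation, not cached

unlinkSync(path); // clean up the scratch file
```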

[+] alexwebb2|3 years ago|reply
JS modules can produce side effects on load. Does that present an obstacle to that kind of REPL pattern?