Reports of Deno's Demise Have Been Greatly Exaggerated
The whole selling point for me was that Deno was Node without the bullshit and baggage, but they dropped that and basically just turned it into Node with built-in TypeScript support and a few other minor things, like the permissions system.
Similar story with bun.sh - node backwards compatibility (although not using V8).
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
What's the point? If you're in love with static types, but have to do JavaScript because you're targeting the browser, I kind of understand why you'd go for TypeScript. But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript which is a "Compile-to-JS" language? You control the stack, make another choice.
People like to sneer at TypeScript, but let's be honest: people like to sneer at anything that's popular enough. The fact is that no language that I enjoy better than TypeScript (which is already not a very long list) stands any chance of adoption in an average workplace.
It also interops nicely with F#, so you can write a pure functional library in F# and call it from a C# program in the "functional core, imperative shell" style.
It has an incredibly solid runtime, and a good type system that balances letting you do what you want with not being overly fussy.
That doesn't mean there's anything wrong with it and I've often thought to give it another shot, but it's not a viable option right now for me because it's been too hard to get started.
but every time I've tried to figure out how to install it I get lost in the morass of nearly-identical names for totally different platforms and forget which one I'm supposed to be installing on Linux these days.
I realize Microsoft is terrible at naming things, but for .NET/C# it's really not that hard these days. If you want to use the new, cross platform .NET on Linux then just install .NET 8 or 9.
New versions come out every year, with the version number increasing by one each time. Even-numbered versions are LTS; odd-numbered releases are only supported for about a year. This naming scheme for the cross-platform version of .NET has been used since .NET 5, almost 5 years ago; it's really not too complicated.
Any legacy .NET projects are made with .NET Framework 4.x (4.8.1 is the latest). So if it's 4.x, or called .NET Framework instead of just .NET, it's referring to the old one.
.NET Core is no longer used as a name, and hasn't been since .NET Core 3.1. They skipped .NET Core 4 completely (to avoid confusion with the old one, but I think they caused confusion with this decision instead) and dropped the "Core" for .NET 5. Some people will still call .NET 5+ ".NET Core" (including several of my coworkers), which I'm sure doesn't help matters.
Mono isn't 100% completely dead yet, but you'll have little if any reason to use it (directly). I think the Mono Common Language Runtime is still used behind the scenes by the newer .NET when publishing on platforms that don't allow JIT (like iOS). They've started adding AOT compilation options in the newest versions of .NET so I expect Mono will be dropped completely at some point. Unless you want to run C# on platforms like BSD or Solaris or other exotic options that Mono supports but the newer .NET doesn't.
I might feel differently if I worked with a large number of people who I didn't trust, but on small to medium teams composed of very senior people using Go feels like coding with one hand tied behind my back.
Personally I'd choose Clojure for anything I have a choice with.
And then you would have to solve the problem of how to communicate with the client.
Not sure why you'd think that'd be a difficult thing?
You aren't suggesting to handwrite an HTTP API client, right? You would have to set up either OpenAPI which is a mess, or GraphQL which is also a mess. LMK if you have a better solution.
What exactly is the problem you're encountering when trying to interact with HTTP APIs?
The problem is that, unlike when using server actions, when using HTTP APIs, there is nothing that automatically generates bindings.
If you really want to use OpenAPI for whatever reason
No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
But it's not like using OpenAPI or GraphQL are the only two options
What are other options?
when using HTTP APIs, there is nothing that automatically generates bindings.
Is that really the biggest problem you face when programming? How many endpoints do you have? Even with projects with ~30 endpoints, it doesn't seem problematic to me, but maybe people regularly work on projects with 100s of endpoints, then it kind of makes sense. But I'm not sure that's typical enough.
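To make the "handwriting a client isn't a big deal" side concrete, here is a minimal sketch of a hand-written typed client in TypeScript with no OpenAPI or codegen involved. The `User` shape and `/users/:id` route are hypothetical, and the fetch function is injected so the example is testable without a network:

```typescript
// A hypothetical shared response type for one endpoint.
interface User {
  id: number;
  name: string;
}

// Narrow structural type for whatever fetch implementation is injected,
// so the client works with global fetch or a test stub alike.
type FetchLike = (url: string) => Promise<{
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}>;

// One small generic helper covers every JSON GET endpoint:
async function getJson<T>(fetchFn: FetchLike, url: string): Promise<T> {
  const res = await fetchFn(url);
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return (await res.json()) as T;
}

// Each endpoint binding is then a one-liner:
const getUser = (fetchFn: FetchLike, id: number) =>
  getJson<User>(fetchFn, `/users/${id}`);
```

With a couple dozen endpoints this stays a page of one-liners, which is roughly the scale the comment above is describing.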
No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
People do a lot of stuff for a lot of nonsense reasons; that doesn't mean it's the best way to approach things. The JS/TS ecosystem seems especially prone to cargo culting, too.
IDK what you mean by "deal with OpenAPI"; OpenAPI is a spec, not a technology like GraphQL.
In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
"server actions" seems to be a NextJS thing, not a JS thing?
It's a JS framework thing. Every mainstream JS framework has server actions or equivalent.
Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.
It is way more convenient to write a server action and be able to immediately use it in a client component than having to write an HTTP endpoint in a separate back-end project, and then regenerate your client via OpenAPI, or whatever else you use.
IDK what you mean by "deal with OpenAPI"
I mean dealing with tooling to generate an HTTP client from OpenAPI schema.
In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
Wrong
Every mainstream JS framework has server actions or equivalent
No. "Server actions" are a React concept, it has little to do with backend technology (the backend still speaks HTTP). This concept is completely irrelevant to most big frameworks like Express, NestJS, Koa, Hapi. Next is barely a "backend" framework: it's rather a React server framework that has a few basic backend functionalities.
is this really an honest question?
Yes, there is nothing that works better than server actions. None of what you listed really makes sense to me. I have never had any runtime performance problems with TypeScript, and wasn't JavaScript the most popular programming language in the world (the talent pool argument)?
vibe coding
"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering. It was a tiny experiment that somehow people thought was a suggested way of developing software, which obviously it isn't. Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering.
Well I guess making Next.js apps isn't really "serious engineering"
Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
I do read the code but I barely write any code by hand.
Well I guess making Next.js apps isn't really "serious engineering"
Where did I say that?
I do read the code but I barely write any code by hand.
Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now. The description "vibe coding" is explicitly about "programming" with a LLM without reading or writing any code at all, for any purpose. If you read the code, you're not really vibe coding as originally described by Karpathy.
Where did I say that?
You replied to a comment that says "Yep, Next.js has the best support for vibe coding."
Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now.
You can stop arguing that if one glances at the code one is no longer vibe coding; in practice, by looking at the code, or even the LLM's thoughts, you can catch things you don't want early.
At this stage, I don't think anyone needs to try to persuade anyone why JavaScript and TypeScript are the Lingua Franca of software engineering.
Performant, expressive, amazing tooling (not including node/npm), natively cross-platform.
An absolute joy to code with. Why would anyone want to use anything else for general purpose coding?
In my mind there are two alternative approaches in the current ecosystem: C++, where absolute 100% maximal performance is the overriding primary objective and be damned with the consequences; then for everything else, just use TypeScript.
anyone why JavaScript and typescript are the Lingua Franca of software engineering.
I mean, it obviously isn't, although for web development I'd probably agree with you. But regardless, zealots who hold opinions like this, that there is "one language to rule them all", are why discussing with TS peeps is so annoying. In your world, there is either C or TypeScript, but for the rest of us, we tend to use different languages depending on what problem we're solving, as they all have different strengths and drawbacks. If you cannot see any drawbacks with TypeScript, it's probably not because there aren't any, but because you're currently blind to them.
Why would anyone want to use anything else for general purpose coding?
Because you've tasted the world of having a REPL connected to your editor where you can edit running code live and execute forms. Just one example. There are so many languages available out there when you control your stack. I understand using JavaScript for frontend web stuff, because you have no other options, and I personally have nothing against JavaScript itself. But for the love of god, realize there is a world out there outside of your bubble, and some of those languages have benefits too.
https://survey.stackoverflow.co/2024/technology#most-popular... ... JS is the most popular language in the world, per Stack Overflow.
You may as well call binary (i.e. 1s and 0s) the Lingua Franca in that case.
Let's not get started on C build chains (especially cross-compiling) ... CMake vs. Cygwin vs. MSVC vs. whatever else these days, with hacky, brittle ifdef conditionals everywhere just to make it work. Chaos! JavaScript just runs on pretty much any modern computer you can sit down at or put in your pocket, and even on more exotic things that don't have it installed by default you are about 10 seconds away from installing the official package and you are off and running.
Something as simple as a map with two integers as a key, or case-insensitive string keys, requires jumping through hoops. Even Go and Python can do this.
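A quick sketch of the hoop-jumping being described: a JS/TS `Map` compares object and array keys by reference (SameValueZero), so a `[number, number]` key literal never matches, and the usual workaround is encoding the pair into a primitive key:

```typescript
// Naive attempt: use a tuple as a Map key.
const m = new Map<[number, number], string>();
m.set([1, 2], "hit");
console.log(m.get([1, 2])); // undefined -- a fresh array is a different reference

// The common workaround: encode the pair into a primitive (string) key.
const m2 = new Map<string, string>();
const key = (a: number, b: number) => `${a},${b}`;
m2.set(key(1, 2), "hit");
console.log(m2.get(key(1, 2))); // "hit"
```

The same encoding trick is needed for case-insensitive string keys (normalize with `toLowerCase()` before every `set`/`get`), whereas Go's comparable struct keys and Python's tuple keys handle this directly.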
why limit yourself to TypeScript which is a "Compile-to-JS" language? You control the stack, make another choice.
Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js. We want an ROI, we just don't want node/npm.
Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js.
Right, makes sense. It also makes sense that most of those learnings are transferable; it's not like TypeScript is the only language with types. So your design/architecture skills can be used elsewhere too. Locking yourself into the ecosystem of one language, then asking other runtimes to adhere to your preference, sounds like a sure way of getting disappointed, instead of being pragmatic and flexible enough to choose the right tool for the problem.
Also, when I was writing a frontend and backend both in TS, I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
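For readers unfamiliar with the pattern: `typescript-is` derives the runtime check from the type at compile time, but the shape of the setup looks roughly like the following sketch, with the guard written by hand here to keep the example self-contained (the payload type and handler are hypothetical):

```typescript
// Shared between front-end and back-end, e.g. imported from a common module.
interface CreateUserPayload {
  name: string;
  age: number;
}

// Server-side runtime validation. With typescript-is this guard is
// generated from the type; here it is hand-rolled for illustration.
function isCreateUserPayload(x: unknown): x is CreateUserPayload {
  return (
    typeof x === "object" &&
    x !== null &&
    typeof (x as { name?: unknown }).name === "string" &&
    typeof (x as { age?: unknown }).age === "number"
  );
}

// A handler can now narrow an untrusted request body to the shared type:
function handle(body: unknown): string {
  if (!isCreateUserPayload(body)) return "400 Bad Request";
  return `created ${body.name}`; // body is CreateUserPayload here
}
```

The point being made above is that the front-end request code and the back-end validation both reference the one `CreateUserPayload` definition, so they cannot drift apart silently.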
I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right? And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
It's not like TypeScript gives you some inherent benefit that makes it easier to convert to JavaScript, besides the fact that it's literally a "Compile-to-JS" language.
But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right?
JavaScript does make it easier to target both the web browser and Node.js, sure. But TypeScript also has a fairly mature type system and ecosystem (flaws in `tsc` itself notwithstanding). Not to say that no novel approaches are worth exploring, though; I just haven't seen one that rivals my TS experience yet.
And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
That depends. In many other programming languages (such as ones that compile to WASM) it's also possible to have common code shared between server and client, but it's usually pretty inconvenient to actually get the code running in both environments. It's also possible to have a common interface definition and generate types for server and client from that definition, but that's still more complicated.
Anyway I don't fault anyone for being disappointed that Deno fell into the Node.js compatibility trap. Pure TypeScript without that particular cruft is also something I was excited about. I also was excited to see what looked like some fair innovation (like their import mechanism and their sandboxing) but I don't know how that'll continue if Node.js compatibility ends up being too much of a time sink.
I don't have very strong opinions because I've never really used Deno and I probably won't even bother at this point, but I definitely would not agree that this is just a problem of needing to use another programming language instead.
Go is unsound (Go slices...)
What are you referring to here? Go's current slice implementation could only possibly introduce unsoundness via data races. Even there, I think you might actually need to muck around with an interface value rather than a slice to demonstrate unsoundness of the type system (rather than just memory corruption or unsafe memory access).
Typescript's type system is just deliberately unsound by design (for defensible reasons). It's a whole nother level of unsoundness in comparison to Go.
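One well-known instance of that deliberate unsoundness is array covariance: TypeScript accepts the following without complaint, and the resulting lie only surfaces at runtime.

```typescript
interface Animal { name: string }
interface Dog extends Animal { bark(): void }

const dogs: Dog[] = [];
const animals: Animal[] = dogs; // allowed: Dog[] is assignable to Animal[]
animals.push({ name: "cat" });  // also allowed: it's a valid Animal

// dogs[0] now has static type Dog, but the runtime value has no bark
// method, so the next line would throw a TypeError if uncommented:
// dogs[0].bark();
```

This is a documented trade-off (soundness was sacrificed for ergonomics and JS compatibility), which is what "unsound by design, for defensible reasons" refers to.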
What are you referring to here?
In this case my use of "unsound" is somewhat stretched to "particularly easy to make subtle mistakes as a programmer":
https://www.reddit.com/r/golang/comments/1h5ws8e/comment/m09...
But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript
Why not use it? What high level programming language would you suggest instead with the same level of performance and ecosystem support.
The way you make a scripting language fast is by getting the hell out of it and into C or C++ as fast as possible, and PHP's library ecosystem embraces that harder than just about any other scripting language, is the reason (I think).
[EDIT] My point is mainly that Node's performance isn't really that impressive, in the field of "languages one might use to write server-side Web code". It beats absolute slugs like Python and Ruby handily, but past that... eh. And really, in its speed category you'd probably do just as well using either of those and paying a little more attention to things like network calls and database access patterns.
Just one benchmark I could find https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
If you want the developer experience of using something that’s not Node, you can still get it from Deno.
But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
The answer is obvious in the programming language case: for those who do not want async, the addition of async/await begins to poison the ecosystem. Now they have a growing list of libraries that they cannot use if they want to avoid async, so the effort involved in picking a library goes up and the odds get increasingly high that they're locked out of some of the key tools in the ecosystem because new libraries without async become harder and harder to find.
For those who really hate colored functions, the addition of async is the removal of a feature: colorless functions are replaced with colored functions.
The same can be said of NPM compatibility. Sure, I can try to avoid it and stick to Deno imports and inspect each library that I use for NPM dependencies. But it gets harder and harder as time goes on, because a key feature of Deno has been removed: it's no longer an ecosystem reset.
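The coloring mechanism in the programming-language case can be sketched in a few lines: once a dependency goes async, a synchronous caller can only ever hold the `Promise`, so the color spreads up the call stack (the function names here are illustrative).

```typescript
// Stand-in for a library function that went async.
async function fetchConfig(): Promise<string> {
  return "config";
}

// A synchronous caller cannot get the value out -- only the Promise:
function syncCaller(): string {
  const p = fetchConfig();
  // p is Promise<string>, not string; there is no way to unwrap it here
  // without blocking primitives JS doesn't have.
  return typeof p; // "object"
}

// So the color spreads: the caller must itself become async.
async function asyncCaller(): Promise<string> {
  return await fetchConfig();
}
```

The analogy in the comment above is that an npm dependency "colors" a Deno library the same way, except nothing in the type system flags it.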
If you're gonna argue that fragmentation is a problem in the node ecosystem (which I agree with), you can't convince me that a plethora of approaches to asynchronous code is preferable to async/await and promises.
1) The original essay that coined this term was looking at it from a language design perspective. The argument is a fair one if that design question is up for debate, but that isn't the case for Javascript.
(In general, I do agree that "you don't have to use it" is not a strong argument.)
The problem is that if you think statically, you can say "oh, just use the 'clean' subset". But the world is not static. If you think dynamically, you can see the full Node ecosystem as a fairly powerful attractor; why would I write a deno library that only works on deno when I can write a node library that works on both? Well, if I'm writing in the Node ecosystem, why not use the whole thing?
This is a general effect; it is very hard for people to set up long-term ecosystems that are "too close" to existing ecosystems. Generally the new thing will either get pulled in (as in this case) or ignored (as in the many cases of 'hey Typescript seems to have worked pretty well, let me do a Typescript-like gloss of this other language', which generally just get ignored). There are successes, like Typescript (JS is in general a special case because being the only language in the browser for so long it was both a language and a compile target; most other attempts to "Typescriptify" a language flounder on the fact that few if any other languages have that situation), Elixir (managed to avoid just being absorbed by Erlang, IMHO kind of a similar situation where the 'base language' for the ecosystem was not really the best), and the occasional Lisp variant that bubbles up (though like Clojure, usually with a story about where it can be run), but in general this is very hard to pull off, harder in some ways than simply starting a brand new language ecosystem, which is itself no picnic.
Also, promises already color functions just like callbacks do. Async/await just changes the syntax by which that coloring is expressed. The real problem people have with async is that they prefer green threads as a solution to concurrency, not that they don't like the syntax.
Of course, in (browser-compatible) Javascript, some things can not be done synchronously, but that's not up for debate.
In Rust, you can call async functions from normal ones by spawning them on the executor. The .await syntax isn't as painful as dealing with callbacks and closures in JavaScript. Plus, if you call an async function incorrectly, Rust's compiler will catch it and give you a clear error message, unlike JavaScript, where bad function calls could lead to unpredictable behavior. The premises of the article don't apply, so Rust's async/await doesn't introduce the same "colored function" issues.
(See also https://without.boats/blog/let-futures-be-futures/ )
JavaScript itself has come a long way towards making coloring less painful. TypeScript+ESLint solves the weird unpredictable behavior issues with JS and async/await solves the syntax issue. Promises in general give well-defined semantics to calling an async function from a sync function. But all that only undoes some of the arguments about function coloring, not all of them. Fundamentally the same question applies: do you make async-ness part of the type system or do you instead build a system like green threads that doesn't put it in the type system?
I happen to think that coloring functions according to their async-ness is actually the right move (with the right ergonomic improvements), but plenty of people don't agree with me there even with all the ergonomic improvements Rust and TypeScript have made to the model.
It’s software you are mostly using to build end user software. Gauge it by how well it works and serves the end user.
This is why AI is good. It will just force this SWE vintage to solve the problem at hand instead of over engineering or holding on to some traditionalist views of the language.
So it reminds me more of trying to avoid CGO in Go or avoid “unsafe” in Rust.
It would be worse if Node-specific types started appearing as function parameters or return types, but that seems fairly rare even in npm libraries, so it seems easy to avoid.
If a Deno library uses a Node API or an npm library somewhere, it can be entirely encapsulated, so you might not even notice until you see it in a stack trace. That doesn’t seem very intrusive?
Node API yes, NPM library no. If you add a dependency on a library that uses NPM you now depend on an entire web of transitive NPM dependencies, with all of the problems that entails. People don't dislike NPM because it's aesthetically displeasing—you can't just abstract away the problems with NPM. The ecosystem causes real problems with real software, and Deno initially recognized those real problems and set out to reset the ecosystem.
The only way in which NPM-compat is different than colored functions is that there's no static compiler feature telling you when you've added a dependency on a bunch of NPM libraries.
Though, it is nicer if it’s on jsr.io because you’ll see Typescript source code in the debugger.
There’s nothing about starting over that prevents ending up with a whole new rat’s nest of dependencies, if you’re not careful.
At least that's the theory. To be honest I don't see Deno's value add. The runtime is like... I mean node works fine at this point? And the capabilities system is both too pedantic and too simplistic, so it's not actually useful.
I don't understand the value add of Bun much either. "Native" Typescript support but at the end of the day I need a bundler that does more than what these tools tend to offer.
Now if one of these basically had "esbuild but built in"....
Although now that Node itself has basic TS support with type-stripping, this substantially improves matters. But that's a fairly recent thing; both Deno and Bun predate it by a long time.
Also Bun has a built-in bundler? I'm not sure how it compares with esbuild tho.
And for build scripts, it's not a usable approach because those are the ones that run the bundler. Now, one can argue that you shouldn't really need static typing for build scripts... and yet I have found valid bugs when switching them from JS to TS in the past.
Anyway, Node is rapidly improving its TS integration story, which is probably translating to less interest in Deno (and Bun). Although in the case of Bun, the appeal of having a single coherent tool that covers building, testing, and publishing is very enticing after spending a few years in the Node ecosystem duct-taping together various third-party solutions to all of these.
I think my complaint is less that these runtimes do nothing, and more that I would expect them to do _even more_. But they're not trying to make things much better.
I’m looking forward to whatever they’re going to do instead of KV, which I tried and is too limited, even for a KV store. (64k values are too small.) Something like Cloudflare’s Durable Objects might be nice.
You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
If enough people find Deno useful enough to skip some old libraries, maintainers are "forced", even though Deno is not forcing anyone. If they do not adapt, then someone will just create a new library with better practices. In either case there is pressure for better JS/TS evolution.
Why do you treat adding a feature (npm compatibility) like you’re losing something?
Because you are losing something: a better ecosystem. Standardizing around… standards is a good thing. When I dive into the dependencies of any given Node app it's a mess of transpiled code, sometimes minified even; there's no guarantee what API it'll be using and whether it'll run outside Node (is it using the W3C streams API or the Node streams API?). But inertia is a powerful force. People will just use what's there if they can. So the better ecosystem never gets built.
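The W3C-vs-Node streams question raised above is a good concrete example of the split: the "same" stream of chunks has two incompatible shapes depending on which API a library targets, and consumers need different code paths for each. A minimal sketch, runnable on recent Node:

```typescript
import { Readable } from "node:stream";
import { ReadableStream } from "node:stream/web";

// Node-style stream (EventEmitter-based, .pipe(), async-iterable):
const nodeStream = Readable.from(["a", "b", "c"]);

// W3C/WHATWG-style stream (the browser standard, consumed via a reader):
const webStream = new ReadableStream<string>({
  start(controller) {
    for (const chunk of ["a", "b", "c"]) controller.enqueue(chunk);
    controller.close();
  },
});

// Consuming each takes a different code path:
async function readNode(s: Readable): Promise<string[]> {
  const out: string[] = [];
  for await (const chunk of s) out.push(String(chunk));
  return out;
}

async function readWeb(s: ReadableStream<string>): Promise<string[]> {
  const out: string[] = [];
  const reader = s.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value !== undefined) out.push(value);
  }
  return out;
}
```

Node does ship `Readable.toWeb`/`Readable.fromWeb` adapters, but a library still has to pick one shape for its public API, which is exactly the fragmentation being lamented.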
But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
By that logic we never build anything newer or better. Python 3 is better than Python 2 and sets the language up for a better future. Transitioning between the two was absolutely torturous and if they just prioritised popularity it would never have happened.
Why do you treat adding a feature (npm compatibility) like you’re losing something?
Because they are losing something.
All the time and money they are investing into Node compat could have been used towards a Deno first-party ecosystem. It's not like they have hundreds of millions to spare. Deno is a small company with limited resources.
People kept complaining that they couldn't use Deno with NPM packages, so Deno ended up focusing on providing faster horses.
In practice, it would stay a small group of hobbyists responsible for doing weird things like maintaining ports of npm libraries that deno users want, like official SDKs.
It’s a grim end state if you’re trying to be more than that. And it seems to only appeal to people with a weird vendetta against Node/NPM, which is also a crappy user base to cater to.
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
I'm sure you can find other projects that are going to fail, but why do you want to?
Node has lots of problems (I am basing this statement on the fact that it's a major tech project). None of them are sufficient to prevent it from being extremely widely used.
To fix those problems in a product that will be used, it is not sufficient to provide something sort of like Node but without those problems. You either have to:
1. Provide a tool that requires a major migration, but has some incredible upside. This can attract greenfield projects, and sometimes edge out the existing tool.
2. Provide a tool with minimal migration cost, and without the same problems. Maybe this tool can replace the existing one. Ideally there will be other carrots (performance, reliability, ease of use). Such a tool can get people to migrate if there are enough carrots, and the migration cost is low enough.
Deno was a classic example of messing this up. It's not #1 or #2, it has the worst of both worlds. The upside was that it did things "the right way", and the downside was that you couldn't run most code that worked on Node. This is the kind of approach that only attracts zealots and hobbyists.
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
Cloudflare Workers' workerd comes to mind, but it's fundamentally a different thing.
https://github.com/cloudflare/workerd
It's not meant to be a generalist backend runtime and it provides almost zero batteries.
The post is a good illustration of why that matters. Very little of it is about Deno itself, instead it’s mostly about the paid-for services Deno Inc offers. They have to prioritise and chase that because their investors want to see financial growth.
It’s the same reason they abandoned the idea of Deno being a fresh start in the JS ecosystem (an idea I loved) and instead started patching in Node compatibility layers. They had to reduce friction, not add to it. But for me that compromised the reason for using it in the first place.
Node has many flaws. And it’s boring. But its existence doesn’t depend upon the whims of random investors who might decide to pull the plug at any moment. So I’m just going to stick with it.
I started working with JS/TS just before Deno 2 came out and having, essentially, full node (and TypeScript) compatibility was the primary reason I switched to it. It is all just so simple in comparison to node.
But I agree about the VC funding: it certainly gives cause for concern about Deno's direction and longevity. But what other option is there, really? Hopefully what was said in this post, that the reduction of Deno Deploy locations is a function of use rather than economics, is true.
Could it be that they added node compatibility because people wanted node compatibility
I imagine that’s exactly the reason! But they outlined their reasoning for a clean break pretty well in their 1.0 announcement post[1] and they haven’t, to my knowledge, posted a follow up “here’s why we were wrong about all that” post.
All of which is to say I understand the business reasons why they did it, but to me it compromises the original technical promise of Deno. A rebooted, sensible JS ecosystem was the reason I was interested in Deno originally. I use Node every day and I'm mostly happy with it, but whenever I need to dive into a dependency to see what's going on, it's a five-layer-deep rat's nest of transpiled and sometimes even minified code using different module formats and platform-specific APIs. I'd love to be done with all that.
Sometimes it pays to be bold when you’re challenging an entrenched incumbent. Any non-Node JS platform has to pitch "don't use the status quo, take a risk, use me" and absent the original benefit I don’t see a good argument to use Deno, especially when factoring in the risk of VC-driven priorities. I’m not saying everyone has to agree with me on that but it’s my personal perspective.
Deno still has a permissions model that is very different and far more opt-in than Node. This post makes a case for thinking of Deno's deep, native OpenTelemetry support as something very new and different from Node's approach, and clearly important to the future of application deployment and observability.
Deno is still very interesting in its technical promise, especially compared to Node, even with a larger compatibility layer.
I and others are lamenting that the compatibility layer removed incentive to help create a new JS ecosystem that isn’t layers of garbage piled on top of each other. That new ecosystem is what I wanted and Deno is no longer the path to it. If that makes me a “dunce”, so be it.
Even for Node projects I've started searching JSR first and comparing JSR scores of libraries. If a project advertises first-class JSR support now, I pay more attention.
The compatibility layer works in both directions here. JSR is a fascinating, growing home of a TypeScript-first, typedocs-everywhere, ESM-native ecosystem. Most of those packages install just as well into Node as into Deno (though they are easier to use in Deno, by far). I want TypeScript-first. I want ESM-native. I have a registry now that gives me some indication of whether, and how much, a package author cares about those things.
The compatibility layer didn't remove the incentive for Deno (the Company) to help create a JS ecosystem that isn't just layers of garbage dependencies piled into trenchcoats, the compatibility layer encouraged Deno to expand their horizons and make Node better too by providing a Registry like JSR. I think JSR is a better way to promote a new, fresher JS ecosystem (for all) than restricting what sorts of packages are compatible with the Deno runtime. It's got gamified test scores! It's got provenance data and supply chain transparency!
If they hadn't added the compatibility layer to Deno too many people would have chicken-and-egg excuses that all the packages they need are in npm and never even tried Deno. With the compatibility layer in place Deno gets to shine in its own place of what it can do better than Node, then suggest you try JSR in all your hobby projects, then get you using JSR even in Production on Node packages until it is even easier to convince your day job to switch to Deno for better performance and increased security and deep OpenTelemetry support… Deno's still helping a lot to create a new JS ecosystem with fewer layers of garbage, compatibility or not with the old layers of garbage.
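For what it's worth, consuming a JSR package from either runtime is a one-liner (the package name below is just an illustrative placeholder):

```shell
# In a Deno project:
deno add jsr:@scope/example

# In a Node project, via the jsr shim on npm:
npx jsr add @scope/example
```

That two-way street is exactly why the registry, rather than the runtime, feels like the real wedge for a fresher ecosystem.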
Given that you still use Node, you might want to try Deno 2 out... It'll likely solve a lot of your headaches.
"But what other option is there, really?"
Not taking VC funding, having slow organic growth, making a good product, and having pride in your work?
Like, maybe I'm missing something, but why does the end goal always have to be VC funding and acquisition? Is it too much to ask to stay independent and just make something you take pride in, and enjoy the craft over many years of a successful, but not self-cannibalizing, business?
I dunno man, I just keep seeing every smaller business's end goal be to get acquired or turn into a money-pumping SaaS, and it's just depressing. Let's make good things, enjoy delivering a product that people like, and spread good. Keep yourself - and your employees, if you have them - making a good living, and be happy?
Also, are you aware that Deno is being built by literally the creator of Node? This isn't some get-rich-quick scheme - it's something that he deeply wants to see exist. He's also leading the charge against Oracle (a genuine parasite) over the copyright/trademark of JavaScript.
If investors pushed for it as well, then they were just being sensible...
Not really.
The biggest issue with Node is the dependence on the fragile NPM ecosystem. Strategically, fixing this is the thing that would distinguish Deno and make it more valuable.
And Node is already adding TS and other features that were initially the reason to leave for Deno.
You have to decide where to go, and apparently not being a niche product was one of the reasons - that's fine, but now they have to live with at least 2+ unhappy (ex?) users.
I really like coding in TypeScript and think that most of people's irritation with JavaScript isn't actually related to the language so much as the ecosystem of NPM. The exponentially growing web of dependencies and the constant churn of deprecations are exhausting, detracting from a core language that is now pretty solid.
Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors. And again, that's totally fair—resetting an ecosystem is risky and probably wouldn't have yielded the return they needed! But giving up on that was giving up on what made Deno different and interesting. If I'm going to use NPM anyway why not stick with Node while I'm at it?
IMO their logic still holds up. Dahl had a whole talk about the mistakes made with Node:
The original promise of Deno was a consistent ecosystem. Absent that it doesn’t matter to me all that much how great Deno is within itself, the case for using it simply isn’t compelling enough. These days the newer, standards-compliant Node APIs are pretty good too!
It may sometimes be difficult to find such an option, but that was always going to be the case without Node compatibility.
Now, in theory at least, you have the option of sticking with Deno native dependencies, and an escape hatch when none are available.
That seems like the most pragmatic solution to the ideology vs adoption dilemma.
Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors
Maybe it was too risky for users? The people with the most appetite for a new start and a new way of doing things are people who are suffering from their existing investment in Node. Making a halfway jump to a new platform with no path to completing the migration would leave their customers running on two platforms indefinitely. It's the worst-case outcome for a migration, going halfway and getting stuck with your feet in two canoes.
By supporting Node, Deno lets customers switch to something new and better and bring their legacy baggage along as well.
it’s mostly about the paid-for services Deno Inc offers.
In a way I think that's a good thing. Their plan for making money is to provide those services. That goal is enhanced by Deno being healthy. I would be more concerned if Deno was the product they were wanting to sell.
As long as Deno itself is FOSS, then I think I'm ok with it.
And it’s boring.
This is a feature. Once upon a time, Node was the new hotness that all the cutting edge hackers were excited to play around with, and needed a hard sell to management. It has since graduated to IBM status - i.e. "no one ever got fired for...". And thank god for that. It's the most mature possible ecosystem choice at this point for the niche it fills, and we are able to build rock solid maintainable systems with it (and hire people who know it deeply). That didn't come cheaply or easily (IO.js drama anyone?), and anything that wants to take its place will need to make it through the same process.
But there's something fundamental holding me back from investing my time in Deno, or Bun for that matter: they're both VC funded.
For me it is the lack of support for musl. Perhaps there is a connection between inattention to certain details and being VC-funded.
At some point, rather than coming up with native solutions to those pain points, they retreated and started leaning on backwards compatibility as a workaround.
Today, Deno feels more complex than Node does because it contains both approaches. And now there are lots of edge cases where a Node package ought to work, but doesn’t because of one unimplemented API or option or a bug that exists only in Deno. My favorite testing framework, AVA, still isn’t supported.
I used to just ignore the npm compatibility layer and target Deno itself, but that's become more cumbersome to do over time. For example, run `deno run --help` and look at how many command line options and env vars there are. It's exploded in the past few years. A lot of that is for npm interoperability. For me, it's just a lot of noise.
The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter. Yet they don’t seem to want to do that.
I really want Deno to succeed, if for no other reason than because it’s pushing Node to do things that they should’ve done years ago, such as adding a permission system. I just don’t think the current vision for Deno is very coherent or consistent with its original purpose.
My favorite testing framework, AVA, still isn’t supported.
Have you checked recently? The docs (https://docs.deno.com/runtime/fundamentals/testing/) specifically mention AVA as being supported. Then again, I'd assume that most devs using Deno just use the built-in `deno test` instead of a third-party testing framework.
The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter.
Again, have you checked recently? According to the docs this is supported: "Deno's built-in linter, `deno lint`, supports recommended set of rules from ESLint to provide comprehensive feedback on your code. (...) You can specify custom rules, plugins, and settings to tailor the linting process to your needs." (https://docs.deno.com/runtime/fundamentals/linting_and_forma...)
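For reference, that configuration lives in `deno.json`; a sketch of what selecting rule sets looks like (the excluded rule name is just an example drawn from the docs):

```jsonc
// deno.json - lint configuration sketch
{
  "lint": {
    "rules": {
      "tags": ["recommended"],
      "exclude": ["no-explicit-any"]
    }
  }
}
```

It's not a drop-in ESLint config, but it covers the common case of picking a rule set and carving out exceptions.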
I've been using Deno for 6 years now. And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
I feel similarly. The standard configurations (e.g. tsconfig, linting, formatting) and bolts-included tooling (test, lint, fmt, etc.) are what make Deno so great for developers.
I've started using Deno in my spare time for various projects - and it just _feels_ more productive. I go from idea to testing TypeScript in minutes - which never happened in Node land.
The standard configurations (e.g. tsconfig, linting, formatting) and bolts-included tooling (test, lint, fmt, etc.) are what make Deno so great for developers.
And that's great for greenfield projects - although there's competition with Biome and Vite / Vitest for a lot of those - but the vast majority of Node use today is existing projects, and at least at one point Deno (and Bun, maybe others) were marketed (I think?) as a drop-in replacement for NodeJS. But maybe I'm misremembering.
Their purpose is to be a single toolchain & system for JS, and they’re solving a lot of problems in that area. No need to configure huge number of tools for a reasonable professional, production setup — even built in Otel! And if that’s their goal, why would they spend time on complete eslint compatibility? That’s the old world.
BTW, I don't use deno and haven't been following any news whatsoever so this is simply a shitty statement from an outsider. It is interesting that I tested deno a couple of times but kept using node until bun came around and I basically switched to bun. I can't say why exactly.
There’s been some criticism lately about Deno - about Deploy, KV, Fresh, and our momentum in general.
It seems like they never replied to the criticism of their momentum (something I haven't seen myself - what would the argument even be?). Was that intentional, or just missed?
Some of that criticism is valid.
Would have been great to also outline what criticism is/was valid, and how they're aiming to solve those things. Sure, maybe a bit "shoot yourself in the foot", but personally I really prefer companies that are upfront about the drawbacks, and it makes it more likely I'll choose them. Migadu is a great example of this, where they have a pro/con page where they are upfront about the drawbacks of using Migadu (https://migadu.com/procon/). Just the existence of that page is probably ~20% of why I chose Migadu in the first place.
Since the release of Deno 2 last October - barely over six months ago! - Deno adoption has more than doubled according to our monthly active user metrics.
The obvious question is: doubled, but compared to what? And what are they measuring? They’re not disclosing any real metrics on adoption.
I think what happened is that people were giving them the benefit of the doubt because they were new and you could imagine huge growth. The disappointment is by comparison to vague hopes and dreams.
[0] https://deno.com/blog/changes#a-large-and-growing-ecosystem
While some people whine about the Node.js compat, I'd assume it's the main point that kept Deno on life-support in the long run.
Bun did it right from the start and it seems people love it. Being quite a bit faster than Node.js (even with the compat APIs) and Deno obviously helps too. If they keep that going, they'd enter Go level of performance.
Yes, preferably safer than C ever was, but that isn't what a 21st century systems language is supposed to be.
This is really a case where Rust will shine compared to Zig.
I guess we’ll see soon enough what Deploy will become since that's "imminent".
KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.
Fresh is being refactored with an alpha in "late Q3 2025 (likely September)". It was a fairly basic framework to begin with. The no-compilation/build-step approach was the only interesting idea, and that's going away.
The runtime is actively developed but I find this statement amusing:
We’re not chasing feature parity with other runtimes.
The release notes on Node/NPM compatibility would suggest otherwise.
KV is dead
Yeah this is a terrible move. Companies aren't relying on KV precisely because it's in beta not because it was a bad idea. I use Cloudflare Workers KV a lot and I'm not interested in durable objects. I was really interested in Deno KV until now.
Plus the optics of announcing a product and abandoning it are not good. Ryan is a great technical guy but these decisions don't look good from a strategic perspective.
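For anyone who never tried it, this is roughly what the (still-unstable) KV API looks like, which is what makes its limbo status sting - a sketch assuming the `--unstable-kv` flag:

```typescript
// Open the default local KV store (run with: deno run --unstable-kv)
const kv = await Deno.openKv();

// Keys are hierarchical tuples; values are structured-clonable objects.
await kv.set(["users", "alice"], { name: "Alice", plan: "free" });

const entry = await kv.get<{ name: string; plan: string }>(["users", "alice"]);
console.log(entry.value?.name); // "Alice" if the set succeeded
```

A zero-setup, globally replicated KV baked into the runtime was a genuinely differentiated pitch; "it's in beta, don't build on it" was the only thing holding adoption back.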
Over time, unless the team building the thing is entirely tone deaf, I'd expect each individual tool to improve as demanded/necessary. Not only that, but knowing that those tools are being thought about as parts of a whole is deeply comforting (I trust them more than standalone tools as interdependency headaches have likely been solved).
One of the biggest headaches in JS is the tendency for tool builders to just eschew responsibility in favor of sending their community on a goose chase. I commend the Deno folks for taking this approach. We should have more, not less of this attitude in the ecosystem.
Most developers weren’t deploying simple stateless functions. They were building full-stack apps: apps that talk to a database
Honestly that seemed really obvious from the start - it's hard to think of many use cases where this isn't the case. Glad they realised anyway.
it's hard to think of many use cases where this isn't the case
Especially in 2021 (iirc) the lambda and edge function architectures were being pushed hard, so it's understandable they went in this direction at the time. But like with all things, eventually things balance out and you realise most software is still "old fashioned".
So the quote was made when Twain was around 60 years old, and he died roughly 1/4 of that time later.
Deno was released in 2018, and it has now quoted the statement 7 years later. I guess the next 2 years are going to be interesting?
It's not rare, so kinda.
The fact that a project addresses these rumors at all means that they've noticed a trend and are worried about it.
Just like meta isn't publishing articles how react isn't going anywhere - they know it won't, despite the countless articles claiming otherwise.
What this kind of statement actually means is basically "we're not secure, but we can't admit to it, as that would cement it." Which, funnily enough, applied to Twain too, as he did indeed suffer from the illness people were gossiping about. It was just a lot less imminently dangerous than the rumors claimed.
Someone mentioned to me "Deno-style event loops" / "Deno-style main loops". I asked what that is but they were gone. I've tried to look it up, to no avail.
I do quite a bit of work on low level event loops. I'm continually interested in how different projects are doing it and what ideas and tricks they come up with. It bugs me to no end that I can't find anything on what this "Deno style loop" is supposed to be.
Anyone know what's meant / have a pointer or two?
It’s like they looked at what Vercel did with introducing a deployment platform after their initial NextJS work and wanted to follow suit.
Can someone help me understand what was lost here? Is there no longer a way to use Deno without using the Node ecosystem?
But especially in the AI/LLM era it's even _more_ important to use what's most popular, because the LLM will know more about it and can pull information from vastly more resources, including its most important "source" of all: the model weights.
I will say that I was disappointed when they added NPM into the project, I understand why they did it but I would have preferred they not do it.
With that said all of my blogs and client sites are all being happily built in lume with deno right now (hosted on cloudflare) and they have been great for years now. I am still very happy for having made that change.
One of the biggest questions we’ve been hearing is about Deno Deploy — specifically, the reduction in available regions. While we understand the optics of this scaling back, it isn’t for the reasons often feared or accused. Rather, the reality is: most applications don’t need to run everywhere. They need to be fast, close to their data, easy to debug, and compliant with local regulations. We are optimizing for that.
Why does this sound very odd? I chose not to use Deno Deploy because the region was not close enough, and it would have just made everything slower than using other means. (Because there are many options to host data closer to the overall end users, and some regulation also happens at the country level.)
Business-wise, they could turn their deploy system into a resource for the browser base: for instance an app store, flash compute/rendering, or agent hosting services.
I wonder if this is true in general for most people on serverless these days. If so, whether this is what the original intention of this movement and whether these people just don't want to deal with docker/k8s.
That covers a massive proportion of the companies that don’t need or want massive scale.