A lot of people seem confused about this acquisition because they think of Bun as a Node.js-compatible bundler/runtime and just compare it to Deno or npm. But I think it's a really smart move if you consider where Bun has been pushing lately: a kind of cloud-native, self-contained runtime (S3 API, SQL, streaming, etc.). For an agent like Claude Code this trajectory is really interesting, because you are creating a runtime where your agent can work inside cloud services as fluently as it currently does with a local filesystem. Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases.
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
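For example, instead of pulling a whole dataset into the context window, the agent might emit and run something like this (a minimal sketch; the file name and data shape are made up):

```ts
// Hypothetical sketch: the model generates this script instead of reading
// the raw data into its context. Only the short summary goes back to it.
type Row = { region: string; revenue: number };
const rows: Row[] = await Bun.file("sales.json").json();

// Aggregate in code; the context only ever sees the tiny result object.
const byRegion = new Map<string, number>();
for (const { region, revenue } of rows) {
  byRegion.set(region, (byRegion.get(region) ?? 0) + revenue);
}
console.log(Object.fromEntries(byRegion));
```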
I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.
May I ask, what is this obsession with targeting the browser? I've also noticed a hatred of k8s here, and while I truly understand it, I'd take the complication of managing infrastructure over frontend fads any day.
Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
JS has the fastest, most robust, and most widely deployed sandboxing engines (V8, followed closely by JavaScriptCore, which is what Bun uses). It also has TypeScript, which pairs well with agentic coding loops and compiles to the aforementioned JavaScript, which can run pretty much anywhere.
Note that "sandboxing" in this case is strictly runtime sandboxing - it's basically like having a separate process per event loop (as if you ran separate Node processes). It does not sandbox the machine context in which it runs (i.e. it's not VM-level containment).
When you say runtime sandboxing, are you referring to JavaScript agents? I haven't worked all that much with JavaScript execution environments outside of the browser so I'm not sure about what sandboxing mechanics are available.
Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.
It's interesting to see the difference in how the two treat the module. It feels similar to a realm, which makes me lean, by default, toward not trusting it for untrusted code execution.
It looks like Bun also supports ShadowRealms, which from my understanding were more intended for sandboxing (although I have no idea how resources are shared between a host environment and a ShadowRealm, and how that might differ from the node VM module).
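For reference, the API shape from the TC39 proposal looks roughly like this (a sketch of the proposal itself, not of Bun's specific behavior):

```ts
// ShadowRealm per the TC39 proposal: only primitives and callables cross
// the boundary, so realm code can't hand you live host objects.
declare const ShadowRealm: new () => { evaluate(src: string): any };

const realm = new ShadowRealm();
const double = realm.evaluate(`(x) => x * 2`); // callables get wrapped
console.log(double(21)); // 42 — primitives cross fine; objects would throw
```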
The reference docs are auto-generated from Node's TypeScript types. node:vm is better than using the same global object to run untrusted code, but it's not really a sandbox.
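A minimal illustration of why it's not a sandbox (this is the well-known escape pattern: the context object lives in the host realm, so its prototype chain leads back to the host Function constructor):

```ts
import vm from "node:vm";

// The "sandbox" object is created in the host realm, so code running in
// the context can climb its prototype chain up to the host Function
// constructor and evaluate code with full host access.
const sandbox = vm.createContext({});
const hostProcess = vm.runInContext(
  `this.constructor.constructor("return process")()`,
  sandbox
);
console.log(hostProcess.pid); // the host's real process object
```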
> It also has TypeScript which pairs well with agentic coding loops
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how syntactically close TypeScript and C# are, C#'s speed advantage over JS (among many other things) should have made C# the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
Typescript is probably generally a good LLM language because
- static types
- tons and tons of training data
Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
> C#'s speed advantage over JS among many other things would make C# the main language
Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go because of its fast compiler and better performance.
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
This is one of those, "in theory, there's no difference between theory and practice. In practice, there is" issues.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in production-grade developer experience and the quality of packages on offer in PyPI and npm versus Maven and NuGet.
That's not a fair comparison. In your example, you're talking about the average of developers in a language. In this situation, it's specific developers choosing between languages. Having the developers you already have choose language A or B makes no difference to their code quality (assuming they're proficient with both)
> In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do.
Nonsense. The average Java/C# developer is an enterprise code monkey who barely knows anything outside of their grotesque codebase.
> production-grade developer experience
Please, Maven and Gradle are crimes against humanity. There's a special place reserved for Gradle creators in hell for sure.
The "production-grade" developers should ditch their piece of shit, ancient "tooling" and just copy uv/go/dart/rust tooling.
>Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases
100%. Even more robust if paired with an overlay network that provides identity-based S3 access (rather than IP address/network-based). Otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind a VPN/Direct Connect.
Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the firewall ACL, or may not be on the VPN/WAN.
Could also be a way to expand the customer base for Claude Code from coding assistant to vibe coding, a la Replit creating a hosted app. CC working more closely with Bun could make all that happen much faster:
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
That's a really cool use case and seems super helpful. Working cloud-native is a chore sometimes, having to fiddle with internal APIs and ACL/permissions issues.
The writeup makes it sound like an acquihire, especially the "what changes" part.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
This matches some previous comments around LLMs driving adoption of programming languages or frameworks. If you ask Claude to write a web app, why not have it use your own framework, that it was trained on, by default?
Currently Claude etc. can interact with services (including AWS) via MCPs.
What the user you're replying to is saying is that the Bun acquisition looks silly if you see Bun as just a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
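From memory of Bun's docs, the built-in S3 support looks roughly like this (treat it as a sketch and check the current API; the bucket and key names are made up):

```ts
import { s3 } from "bun";

// Credentials come from the standard AWS env vars; no aws-sdk dependency.
const file = s3.file("reports/q3.json"); // lazy handle, like Bun.file()

const data = await file.json();                                     // GET
await s3.file("reports/q3-copy.json").write(JSON.stringify(data));  // PUT
```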
As a commandline end user who prefers to retrieve data from the www as text-only, I see deno and bun as potential replacements (for me, not necessarily for anyone else) for the so-called "modern" browser in those rare cases where I need to interpret Javascript^1
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run Javascripts from around the web. At least with a commandline web-capable JS runtime monstrosity the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
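A crude sketch of the idea (the URL is just an example): fetch the page yourself, and nothing executes unless you explicitly choose to evaluate it:

```ts
// Scripts arrive as inert text; strip them and the markup, keep the words.
const res = await fetch("https://example.com/article");
const html = await res.text();

const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, "") // drop scripts entirely
  .replace(/<[^>]+>/g, " ")                   // strip remaining tags
  .replace(/\s+/g, " ")
  .trim();
console.log(text);
```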
1. One can already see an approach something like this being used in the case of
> At the time of writing, Bun's monthly downloads grew 25% last month (October, 2025), passing 7.2 million monthly downloads. We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
> They didn't have to join, which means they got a solid valuation.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
Thanks, I scrolled past that in the announcement page.
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
I don't like all of the decisions they made for the runtime, or some of the way they communicate over social media/company culture, but I do admire how well-run the operation seems to have been from the outside. They've done a lot with (relatively) little, which is refreshing in our industry. I don't doubt they had a long runway either.
> They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI.
I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Given the worries about LLM focused companies reaching profitability I have concerns that Bun's runway will be hijacked... I'd hate for them to go down with the ship when the bubble pops.
I'm sort of surprised to see that you used Claude Code so much. I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc. And I know Bun started with an extreme attention to detail around performance.
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
I am not your target with this question (I don't write Zig) but there is a spectrum of LLM usage for coding. It is possible to use LLMs extensively but almost never ship LLM generated code, except for tiny trivial functions. One can use them for ideation, quick research, or prototypes/starting places, and then build on that. That is how I use them, anyway
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
> Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious in a way to the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
I'll give you a basic example where it saved me a ton of time to vibe code instead of doing it myself, and I believe it would hold true for anyone.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonable at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for the things it's particularly well suited to.
Here's another one. An SM-24 geophone + Raspberry Pi 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using Flask I need a web viewer + conversion of the geophone velocity figures to displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW's RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful.
> However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
I find this comment interesting: the parent comment didn't suggest any past association, but this one seemingly uses the project reference as a pivot point to do various outgroup counter-signaling / neg Bun?
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
More people should have such a healthy approach, not only to LLMs but to life in general. Same reason I partake less and less in online discourse: it's so tribal and filled with anger that it's just not worth it to contribute anymore. Learning how to be in the middle did wonders for me as a programmer and, I think, as a person as well.
Isn't that still "acqui-hiring" according to common usage of the term?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company are the main reason to the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
Acquihiring usually means that the product the team are working on will be ended and the team members will be set to work on other aspects of the existing company.
That is part of the definition given in the first paragraph of the Wikipedia article, but I think it’s a blurry line when the acquired company is essentially synonymous with a single open source project and the buyer wants the team of experts to continue developing that open source project.
I've never personally used Bun. I use node.js I guess. What makes Bun fundamentally better at AI than, say, bundling a node.js app that can run anywhere?
If the answer is performance, how does Bun achieve things quicker than Node?
On Bun's website, the runtime section features HTTP, networking, storage -- all very web-focused. Any plans to start expanding into native ML support? (e.g. GPUs, RDMA-type networking, cluster management, NFS)
Probably not. When we add new APIs in Bun, we generally base the interface off of popular existing packages. The bar is very high for a runtime to include libraries because the expectation is to support those APIs ~forever. And I can’t think of popular existing JS libraries for these things.
You said elsewhere that there were many suitors. What is the single most important thing about Anthropic that leads you to believe they will be dominant in the coming years?
No idea about his feelings but believing that they will be dominant wouldn't have to be the reason he chose them. I could easily imagine that someone would decide based on (1) they offered enough money and (2) values alignment.
How much of your day-to-day is spent contributing code to the Bun codebase and do you expect it to decrease as Anthropic assigns more people to work on Bun?
I contributed to Bun once, for SQLite. I have a question about the licensing.
Will each contributor continue to retain their copyright, or will a CLA be introduced?
With Bun's existing OSS license and contribution model, all contributors retain their copyright and Bun retains the license to use those contributions. An acquisition of this kind cannot change the terms under which prior contributions were made without explicit agreement from all contributors. If Bun did switch to a CLA in the future, just like with any OSS project, that would only impact future contributions made after that CLA went into effect and it depends entirely on the terms established in that hypothetical CLA.
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
Wondering to what degree this was done to support Anthropic’s web crawler. Would assume that having a whole JS runtime rather than just a HTTP client could be rather useful. Just hypothesising here, no clue what they use for their crawler.
I wonder if this is a sign of AI companies trying to pivot?
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to tell, because the old owner has no power to enforce it, and the acquirer never actually said it, so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kind of is one.
I think Deno's management have been somewhat distracted by their ongoing lawsuits with Oracle over the release of the Javascript trademark.
I started out with Deno, and when I discovered Bun, I pivoted. Personally I don't need the Node.js/npm compatibility. Wish there was a Bun-lite freed of the backward compatibility.
Anthropic has been trying to win the developer marketshare, and has been quite successful with Claude Code. While I understand the argument that this acquisition is to protect their usage in CC or even just to acquire the team, I do hope that part of their goal is to use this to strengthen their brand. Being good stewards of open source projects is a huge part of how positively I view a company.
I’ll be honest, while I have my doubts about the match of interests and cohesion between an AI company and a JS runtime company I have to say this is the single best acquisition announcement blog post I’ve seen in 20 years or so.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
But how is another company that is also VC backed and losing money providing stability for Bun?
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders seem to have started out with a passion project, got funding, built something they were excited about, and then exited, than yet another AI company from non-technical founders, built with the sole purpose of getting funding and then exiting.
If that was genuinely happening here - Anthropic were selling inference for less than the power and data center costs needed to serve those tokens - it would indeed be a very bad sign for their health.
Those are estimates. Notice they didn’t assume 0% or a million %. They chose numbers that are a plausible approximation of the true unknown values, also known as an estimate.
This is a pretty silly thing to say. Investment banks suffer zero reputational damage when their analysts get this sort of thing wrong. They don't even have to care about accuracy because there will never be a way to check this number, even if anyone wanted to go back and rate their assumptions, which also never happens.
I've seen a bunch of other estimates/claims of a 50-60% margin for Anthropic on serving. This was just the first credible-looking link I found that I could drop into this discussion.
They had pretty drastic price cuts on Opus 4.5. It's possible they're now selling inference at a loss to gain market share, or at least that their margins are much lower. Dario claims that all their previous models were profitable (even after accounting for research costs), but it's unclear that there's a path to keeping their previous margins and expanding revenue as fast or faster than their costs (each model has been substantially more expensive than the previous model).
It wouldn't surprise me if they found ways to reduce the cost of serving Opus 4.5. All of the model vendors have been consistently finding new optimizations over the last few years.
I've been wondering about this generally... Are the per-request API prices I'm paying at a profit or a loss? My billing would suggest they are not making a profit on the monthly fees (unless there are a bunch of enterprise accounts in group deals not being used, I am one of those I think)
But those AI/ML researchers, aka LLM optimization staff, are not cheap. Their salaries have skyrocketed, and some are being fought over like top-tier soccer stars and actors/actresses.
The leaders of Anthropic, OpenAI and DeepMind all hope to create models that are much more powerful than the ones they have now.
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised $40 billion in April) is probably going toward this ambition - basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help reduce the cost of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
> their investors might still take a bath if the very-ambitious aspect of their operations do not bear fruit
Not really. If the technology stalls where it is, AI still have a sizable chunk of the dollars previously paid to coders, transcribers, translators and the like.
The bet, (I would have thought) obviously, is that AI will be a huge part of humanity’s future, and that Anthropic will be able to get a big piece of that pie.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
You are saying that you can raise $7B of debt at a double-digit interest rate. I am doubtful. While $7B is not a big number, the Madoff scam was only ~$70B in total, over many years.
I am fairly skeptical about many AI companies, but as someone else pointed out, Anthropic has 10x'ed their revenue in each of the past 3 years: 100m -> 1b -> 10b. While past performance is no predictor of future results, their product is solid, and to me it looks like they have found PMF.
Often it happens that VCs buy out companies from a friend's fund because the selling fund wants to show performance to their investors until "the big one", or to move cash from one wealthy pocket to another.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is almost public domain (MIT) and can be forked without paying a single dollar.
It's questionable to claim that the technology is the real reason this was bought.
It's an acquihire. If Anthropic is already spending significant resources to improve Bun internally, or sees that it will have to, it makes a lot of sense. No nefarious undertones required.
An analogous example off the top of my head is Shopify hired Rafael Franca to work on Rails full-time.
If it was an acquihire, still a lot less slimy than just offering the employees they care about a large compensation package and leaving the company behind as a husk like Amazon, Google and Microsoft have done recently.
From the acquirer’s perspective, you’re right. (Bonus: it diminishes your own employees’ ability to leave and fundraise to compete with you.)
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
> And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
> Every employee is a flight risk if you don't pay them a competitive salary
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
> aren’t hired away with promises of a salary but instead large signing bonuses
Now you're being nitpicky.
Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.
> aren’t typically hired individually but as teams.
So?
VC bros seem to forget the labor market is also a free market as soon as it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt.
Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
> Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary
These aren't the same things, and nobody negotiating an acquisition or acquihire converts in this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
As a broader online comment: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
You want those people specifically. To get them, you need to hire them for a lot more money than you pay your current folks. That causes a lot of resentment with folks and messes up things like salary bands, etc.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
I left out the part that the acquirers' motivation was not to save money or to be slimy. It was the only way to get around overzealous government regulators making it harder to acquire companies.
The real risk is not that Anthropic will run out of money, but that they will change their strategy to something that isn't Bun-based, and supporting Bun won't make sense for them any more.
I admit, it is a good acquisition announcement. I can't remember the last acquisition announcement whose promises were kept for more than 1-2 years. Leadership changes, priorities shift…
One thing I like about this, beyond it meaning Bun will be funded, is that Anthropic is a registered public benefit corporation. While this doesn't mean Anthropic can't fuck over the users of Bun, it at least puts up some roadblocks. The path of least resistance here should be to improve Bun for users, not to monetize it to the point where it's no longer valuable.
As someone who has been using Deno for the last few years: is there anything that Bun does better? Bun uses a different engine (JSC), which is less battle-tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?). The last time I checked Bun's source code, it was... quite messy and spaghetti-like, plus Zig doesn't really offer many safety features, so it's not that hard to write incorrect code. Zig does force some safety with ReleaseSafe, IIRC, but it's still not the same as even modern C++, let alone Rust.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.
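You can actually see the hard-linking from a script if you're curious (the path is hypothetical - any file from an installed package works; note that on macOS Bun defaults to copy-on-write clones rather than hard links, so this check is really a Linux thing):

```ts
import { statSync } from "node:fs";

// After `bun install` with the hardlink backend, files in node_modules
// have a link count > 1: they're hard links into Bun's global cache.
const { nlink } = statSync("node_modules/react/package.json");
console.log(`link count: ${nlink}`); // > 1 means shared with the cache
```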
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask it to for whatever compatibility reason. There are plugins to things like esbuild to use the Deno resolver and not need a node_modules at all (if you aren't also using the Deno-provided bundler for whatever reason such as it disappeared for a couple versions and is still marked "experimental").
Speaking as a victim of the larger pre-Shai-Hulud attack: unfortunately, the install-script validation wouldn't have protected you. Also, if you already have an infected package on the whitelist, a new infection in its install script will still affect you.
I decided to stick with Node in general. I don't see any compelling reason to change it.
Faster installs and less disk space due to hardlinks? Not really all that important to me. npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
Search for pointer exceptions or core dumps in Bun's GitHub issues and you'll see why people (should) use Deno over Bun, if only because Rust is a far safer language than Zig.
This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state. Whether it be kernel exception, pointer exception, or Rust's panic! - these things exist.
The reason you see so many GitHub issues about it is that that's where the development is. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs. The World is getting old. Rust isn't any "safer" when Deno can panic too.
Don't make a false equivalence, how many times does one get a panic from Deno versus a segmentation fault in Bun? It's not a similar number, and it's simply wrong to say that both are just as unsafe when that's plainly untrue.
The only time I got a segfault in Bun was when I used bun:ffi to wrap glfw and wgpu-native so I could run three.js on the desktop. Ironically, the segfault was in wgpu. Which is Rust. But to be fair, it was because the glfw surface had dirty flags for OpenGL and didn't have the Vulkan extensions. So anyone would have faulted.
> This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
I agree. Pointing at Github issues is a strange metric to me. If we want to use that as a canary then you shouldn't use Deno (2.4k open issues) or Bun (4.5k open issues) at all.
I haven't verified this, but I would be willing to bet that most of Bun's issues here have more to do with interfacing with JavaScriptCore through the C FFI than with Zig itself. This is as much a problem in Rust as it is in Zig. In fact, it has been argued that writing unsafe Zig is safer than writing unsafe Rust: https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
As someone who has researched the internals of Deno and Bun, your unverified vibe thoughts are flat out wrong. Bun is newer and buggier and that's just the way things go sometimes. You'll get over it.
I really want to like Deno and will likely try it again, but last time I did it was just a bit of a pain anytime I wanted to use something built for npm (which is most packages out there), whereas bun didn't have that problem.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Looking at Bun's website (the comparison table under "What's different about Bun?") and what people have said here, the only significant benefit of Bun over Node.js seems to be that it's more batteries-included - a bigger standard library, more tools, some convenience features like compiling JSX and stripping TypeScript types on-the-fly, etc.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
My team has been using it in prod for about a year now. There were some minor bugs in the runtime's implementation of buffers in 1.22 (?), but that was about the only issue we ran into.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
I tried several times to port Node projects to Deno. Each time compatibility had "improved" but I still didn't have a working build after a few days of effort.
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
Which makes sense given that a big impetus for Deno's existence was the creator of Node/Deno (Ryan Dahl) wanting to correct things he viewed as design mistakes in Node.
> Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?).
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
I’ve been using Deno too. Although npm support has improved and it’s fine for me, I think Deno has more of a “rewrite the world” philosophy. For example, they created their own package registry [1] and their own web framework [2]. Bun seems much more focused on preexisting JavaScript projects.
It's interesting that people have directly opposite opinions on whether Deno or Bun are meant to be used with the existing ecosystem - https://news.ycombinator.com/item?id=46125049
I don’t think these are mutually exclusive takes. Bun is essentially taking Node and giving it a standard library and standard tooling. But you can still use regular node packages if you want. Whereas Deno def leaned into the clean break for a while
As far as I know, modern Node compat in Deno is also quite great - I just import packages via 'npm:package' and they work, even install scripts work. Although I remember that in the past Deno's Node compat was worse, yes.
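For example, these days it really is just this (a minimal example; the package and version are arbitrary):

```ts
// Deno npm specifier: no package.json or node_modules required.
import chalk from "npm:chalk@5";
console.log(chalk.green("pulled straight from npm"));
```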
Is JSC less tested? I thought it was used in Safari, which has some market share.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
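For context, this is the kind of code that motivated the choice (a minimal illustration; JavaScriptCore is the only major engine that implements ES2015 proper tail calls, which apply only in strict mode - modules are strict by default):

```ts
function count(n: number, acc: number): number {
  if (n === 0) return acc;
  return count(n - 1, acc + 1); // tail position: the frame can be reused
}

// Fine under Bun/JSC thanks to tail calls; RangeError (stack overflow)
// on Node/V8 unless the recursion is transformed into a loop.
console.log(count(1_000_000, 0));
```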
It just works. Whatever JavaScript/TypeScript file or dependencies I throw at it, it will run it without needing to figure out CJS or ESM, tsconfig, etc.
Same. I had a little library I wrote to wrap IndexedDB, and Deno wouldn't even compile it because it referenced those browser APIs. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, Bun didn't fail to compile.
Between that and the Discord, I have gotten the distinct impression that Deno is for "server JavaScript" first, rather than just "JavaScript" first. Which is understandable, but not very catering to me, a frontend-first dev.
Even for server ~~java~~typescript, I almost always reach for Bun nowadays. Used to be because of typestripping, which node now has too, but it's very convenient to write a quick script, import libraries and not have to worry about what format they are in.
I find comments like this fascinating, because you're implicitly evaluating a counterfactual where Bun was built with Rust (or some other "interesting" language). Maybe Bun would be better if it were built in Rust. But maybe it would have been slower (either at runtime or development speed) and not gotten far enough along to be acquired by one of the hottest companies in the world. There's no way to know. Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?
We can imagine them making Bun an internal tool, pushing roadmap items that fit their internal products, whatever, but that doesn't answer how they earn back the money spent on the acquisition.
Profit in those products has to justify now having their own compiler team for a JavaScript runtime.
Don't engage with this guy, he shows up in every one of these threads to pattern match back to his heyday without considering any of the nuance of what is actually different this time.
Twice as fast at executing JavaScript? There's absolutely zero chance this is true. A JavaScript engine that's twice as fast as V8 in general doesn't exist. There may be 5 or 10 percent difference, but nothing really meaningful.
Keep in mind that it's not just a matter of comparing the JS engines. The runtime built around the engine can have a far greater impact on performance than the choice of V8 vs. JSC vs. anything else. In microbenchmarks, Bun routinely outperforms Node.js and Deno on most tasks by a wide margin.
You might want to revise what you consider to be "absolutely zero chance". Bun has an insanely fast startup time, so it definitely can be true for small workloads. A classic example of this was on Bun's website for a while[1] - it was "Running 266 React SSR tests faster than Jest can print its version number".
Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
I always figured Bun was the "enterprise software" choice, where you'd want to use Bun tools and libraries for everything and not need to bring in much from the broader NPM library ecosystem.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
Yes, both can pull in open source libraries and I can't imagine either dropping that ability. Though they do seem to have different eagerness and competency on Node compatibility and Bun seems better on that front.
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
They tried to realign package management with web standards and tools that browsers can share (URLs and importmaps and "cache, don't install"). They didn't offer compatibility with existing package managers (notably and notoriously npm) until late in that game and took multiple swings at URL-based package repositories (deno.land/x/ and JSR), with JSR eventually realizing it needed stronger npm compatibility.
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
Is it just me, or is npm not really that slow? Sure, it's not a speed demon, but I rarely need to run npm install anyway, so it's not a bottleneck for me.
For deploy, usually running the attached terraform script takes more time.
So while a speed increase is welcome, I don't feel it gives me much of a boost.
I've been using Bun since 2022, just to be trendy for recruitment (it worked, and still works, despite it almost being 2026).
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun's wasn't one of them.
I've seen a few of these seemingly random acquisitions lately, and I congratulate the companies and individuals that are acquired during this gold rush, but it definitely feels awkwardly artificial.
Quote from the CEO of Anthropic in March 2025:
"I think we'll be there in three to six months where AI is writing 90% of the code and then in 12 months we may be in a world where AI is writing essentially all of the code"
I think this wound up being close enough to true, it's just that it actually says less than what people assumed at the time.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.
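To make "ad-hoc benchmarking script" concrete, here's the kind of throwaway thing I mean (the workload is made up for illustration; it gets run once and deleted, never committed):

```ts
// Quick-and-dirty micro-benchmark: time a hot loop and print the result.
const N = 1_000_000;
const t0 = performance.now();
let sum = 0;
for (let i = 0; i < N; i++) sum += Math.sqrt(i);
const ms = performance.now() - t0;
console.log(`loop: ${ms.toFixed(1)} ms (sum=${sum.toExponential(3)})`);
```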
At an individual level, I think it is for some people. Opus/Sonnet 4.5 can tackle pretty much any ticket I throw at it on a system I've worked on for nearly a decade. Struggles quite a bit with design, but I'm shit at that anyway.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing mind you and there's a lot of ceremony in tweaking workflows, much like it had been for editors. It's not much different than instructing juniors.
From the article, Claude Code is being used extensively to develop Bun already.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
> You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
"Wasting" is doing a lot of work in that sentence.
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
Let me refer you back to the GP, where the CEO of Anthropic says AI will be writing most code in 12 months. I think the parent comment you replied to was being somewhat facetious.
Maybe he was correct in the extremely literal sense of AI producing more new lines of code than humans, because AI is no doubt very good at producing huge volumes of Stuff very quickly, but how much of that Stuff actually justifies its existence is another question entirely.
Why do people always stop this quote at the breath? The rest of it says that he still thinks they need tech employees.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
I actually like Claude Code, but that was always a risky thing to say (actually I recall him saying their software is 90% AI-produced), considering their CLI tool is literally infested with bugs. (Or at least it was last time I used it heavily. Maybe they've improved it since.)
Is this why everyone only seems to know the first half of Dario's quote? The guy in that video is commenting on a 40 second clip from twitter, not the original interview.
I'm curious what people think of quotes like these. Obviously it makes an explicit, falsifiable prediction. That prediction is false. There are so many reasons why someone could predict that it would be false. Is it just optimistic marketing speech, or do they really believe it themselves?
Everybody knows that marketing speech is optimistic. Which means if you give realistic estimates, then people are going to assume those are also optimistic.
What languages and frameworks? What is the domain space you're operating in? I use Cursor to help with some tasks, but mainly only use the autocomplete. It's great; no complaints. I just don't ever see being able to turn over anywhere close to 90% with the stuff we work on.
Hah. It can’t be “I need to spend more time to figure out how to use these tools better.” It is always “I’m just smarter than other people and have a higher standard.”
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
The tools produce mediocre code, usually "working" in the most technical sense of the word, and most developers are pretty shit at writing code that doesn't suck (myself included).
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
I don't remember saying I worked with nextjs, shadcn, clerk (I don't even know what that one is), vercel or even JS/TS so I'm not sure how you can be right but I should know better than to feed the trolls.
I suspect you do not know how to use AI for writing code. No offence intended - it is a journey for everyone.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and a workflow (working with the agent from a plan, rather than simply prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
Always the same answer: it's the user's fault, never the AI being blown out of proportion. Tell me, where are all those great, amazing applications that were coded 95-100% by AI? Where is all the great progress, the great new algorithms, the great new innovations?
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
Thanks for this! I've been looking for a good guide to an LLM based workflow, but the modern style of YouTube coding videos really grates on me. I think I might even like this :D
This one is a bit old now so a number of things have changed (I mostly use Claude Code now, Dynamic context (Skills) etc...) but here's a brief TLDR I did early this year https://www.youtube.com/watch?v=dDSLw-6vR4o
1. I didn't say it was a best example, I replied to a comment asking me to "Post a repo" - I posted a repo. 2. Straw man argument. I was asked for a repo, I posted a repo and clearly you didn't look at the code as it's not an "AI code generator".
1. I didn’t ask for a repo.
2. Still wasn’t me. Maybe an AI agent can help you check usernames?
3. Sorry, a plugin for an AI code generator, which is even worse of an example.
How much time do you think you saved versus writing it yourself if you factored in the time you spent setting up your AI tooling, writing prompts, contexts etc?
It boils down to: we didn't have full conviction that over the long run we would prove superior to Node.js; however, an AI company burning a lot of cash has invested in us by basing their toolchain on us, so they had no option but to acqui-hire us.
I don't really see how Bun fits as an acquisition for an AI company. This seems more like "we have tons of capital and we want to buy something great" than "Bun is essential to our core business model".
If Anthropic wants to own code development in the future, owning the full platform (including the runtime) makes sense.
Programming languages are all a balance between performance/etc. and being easy for a human to interact with. This balance is going to shift as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
Even outside of code development, Anthropic seems to be leaning strongly into code interpreters over native tool calling for advancing agentic LLM abilities (e.g. their "skills" approach). Given that those necessitate a runtime of sorts, owning a runtime like Bun that lets them integrate that functionality seamlessly into their products makes this acquisition seem far from the worst idea.
They will own it, and then what? Will Claude Code end every response with "by the way, did you know that you can switch to bun for 21.37x faster builds?"
TypeScript is the most popular programming language on the most popular software hosting platform, though; owning the best runtime for it seems like it would fit Pareto's rule well enough.
I think there's a potential argument to be made that Anthropic isn't trying to make it easier to write TS code, but rather that their goal is a level higher, where the average person wouldn't even know what "language" is running it (in the same way most TS devs don't need to care about the many layers their TS code is compiled through).
It's a JS runtime, not specifically servers though? They essentially can bundle Claude Code with this, instead of ever relying on someone installing NodeJS and then running npm install.
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating / pulling JS projects, but it seems more likely Claude Code and bun will have a separate project where they integrate both: Claude Code will take full advantage of Bun itself, and Bun will focus on tight coupling to ensure Claude Code runs optimally.
Sure, but Bun was funded by VCs and needed to figure out how to monetize, what Anthropic did is ensure it is maintained and now they have fresh talent to improve Claude Code.
"Server" here I used loosely - it obviously runs on any machine (e.g. if you wanted to deploy an application with it as the runtime). But it's not useful for web dev itself, which was my point.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
It doesn't make sense, and you definitely didn't say why it'd make sense... but enough people are happy enough to see the Bun team reach an exit (especially one that doesn't kill Bun) that I think the narrative that it makes sense will win out.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
Claude Code running on Bun is an obvious justification, but Bun's features (high-performance runtime, fast starts, native TS) are also important for training and inference. For instance, in inference you develop a logical model in code that maps to a reasoning sequence, then execute the code to validate and refine the model, then use this to inform further reasoning. Bun, which is highly integrated and highly focused on performance, is an ideal fit for this. Having Bun in house means you can use the feedback from all of that automation-driven execution of Bun to drive improvements to its core.
Every time I see people mention things like this in node vs bun or deno conversations I wonder if they even tried them.
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
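For comparison, the end-to-end Node SEA flow looks roughly like this (a sketch assembled from the Node docs; the flags and the postject fuse constant are Node's and may change between versions, and macOS additionally requires re-signing the binary):

    echo '{ "main": "start.js", "output": "sea-prep.blob" }' > sea-config.json
    node --experimental-sea-config sea-config.json
    cp "$(command -v node)" myprogram
    npx postject myprogram NODE_SEA_BLOB sea-prep.blob \
      --sentinel-fuse NODE_SEA_FUSE_fce680ab2cc467b6e072b8b5df1996b2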
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets, declare imports, or do anything other than run this command line. I don't need to survey the various bundlers for one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
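The pattern in question is something like this (a reconstruction, not my actual code; paths are hypothetical):

    // bun build --compile discovers these specifiers statically and
    // embeds the matching JSON files into the executable.
    const locale = process.env.LANG?.slice(0, 2) ?? "en";
    const strings = await import(`./lang/${locale}.json`);
    const defaults = await import("./config/defaults.json");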
Node is death by a thousand cuts compared to the various experiences offered by Bun.
Node adds quite the startup latency over Bun too and is just not too pleasant for making CLI scripts.
They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.
There's no reason to run agents on expensive AI platforms or on GPUs when you can have the AI create an agent in JS that runs with very high performance and perfect repeatability on far less expensive CPUs.
At the very least, there must be some part of the agent's tasks that can be run in JS, such as calling REST APIs, fetching web results, parsing CSV into a table, etc.
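As a sketch of what that looks like (hypothetical helper; a real agent harness would expose this as a tool call), the point being that only the filtered rows ever reach the model's context:

    // Fetch a CSV, filter it in plain TS, and return only what the
    // model needs, instead of pasting the whole file into context.
    async function topRows(url: string, minScore: number, limit = 10) {
      const text = await (await fetch(url)).text();
      const [header, ...lines] = text.trim().split("\n");
      const cols = header.split(",");
      const scoreIdx = cols.indexOf("score");
      return lines
        .map((l) => l.split(","))  // naive split; real CSV needs a proper parser
        .filter((row) => Number(row[scoreIdx]) >= minScore)
        .slice(0, limit)
        .map((row) => Object.fromEntries(cols.map((c, i) => [c, row[i]])));
    }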
I use Claude Code CLI daily - it's genuinely changed how I work. The $1B number sounds crazy but honestly tracks with how good the tool is. Curious how Bun integration will show up in practice beyond the native installer.
Curious about the deal value/price — any clues whether it was just enough to make existing investors whole (so, say, up to $30M), or are we talking some multiple? But if it's a multiple, even 2x sounds a bit crazy.
One option is that the current Bun shareholders didn't see a profitable future and didn't much care whether they were made whole; a return of the remaining cash was adequate.
Another option is that this was an equity deal, where Bun shareholders believe there is still a large multiple's worth of potential upside in the current Anthropic valuation.
I don't get it either - bun being the foundation of tons of AI tools seems like the best possible outcome, so what were they hoping for when they raised the money? Or is this just an admission of "hey, that was silly, we need to land this however we can"? Or do they share major investors, and therefore this is just a consolidation? (Edit: indeed, KP did invest $100M in Anthropic this year. I'm also confused - the article states Bun raised $26M, but the KP seed round was $7M; did they do the A too, but unannounced? Notably, the seed was summer 2022 and ChatGPT was Nov 30, so the world is different now. Did the hypothesis change?)
It's more honest than the Replicate answer, but I think that if you can't raise the next round and you get distracted by the shiny AI thing, this is inevitably the path taken by many teams. There is absolutely nothing wrong with that. There was an exuberant time when all the OSS things were getting funded, and now all the AI things get funded. For many engineer-founders, it's a better fit to go build deep technical stuff inside a bigger company. If I had that chance I would probably have taken it too. Good luck to the Bun team!
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e. single-file based) and compiling to a binary with no dependencies whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All great for AI to recover during its iterations of generating something useful.
Same reason AIs also use Python, and DBMSes readily offer JS or Python UDFs: interpreted languages take no build time and are more portable. JS is also very popular.
Sadly, this will be the trend moving forward. JS is perceived as a good language, and LLMs are meant to make it even easier to write. It is not about the merits of a language; it's about which languages LLMs are "good" at.
AIs are good at JS because there is a ton of JS code available publicly without usage restrictions: the JS code published to be executed in your browser. Most JS code attached to web pages has no explicit license, but the implicit license is that anyone can download it and run it. Same for HTML and CSS. So using that public code to train models is a no-brainer.
You could make a better argument for Go (compiles to native for multiple targets; zero actual dependencies, with no need for a runtime platform or virtual machine on the target).
Has CC always used Bun? When I tried it out many months ago, it was an npm install, not a bun install, in their instructions (although I did use bun install myself). Just odd that if they were using bun, the installation wasn't specifically a "bun install" (I suppose they were trying to keep it vanilla for the npm masses?)
Congratulations to the team. Knowing some of the folks on the Bun team, I cannot say I am surprised. They are the top 0.001% of engineers, writing code out of love. I'm hugely bullish on Anthropic; this is a great first acquisition.
Bun has completely changed my outlook on the JS ecosystem. Prior to Bun, there was little focus on performance. Now the entire space rallies around it.
> Prior to Bun, there was little focus on performance.
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
> Vibe-coded projects get bought by vibe-coded companies
this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
> Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
> Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
My mistake, I was thinking of the wider ecosystem, not the runtime, i.e. formatters, bundlers, and linters like Biome, oxc, etc. being written in Rust or other compiled languages. That's where I saw the biggest speedup, because their developers decided to write them in a compiled language instead of JS on a JS runtime, where you'll inherently be limited even with a JIT.
Machine code, yes (along with SpiderMonkey, JSC, and Nashorn); the timeframe around 2005-2010 saw the introduction of JIT'ed JS runtimes. Back then, however, JS was firmly single-threaded; it was only with the introduction of SharedArrayBuffer that JS really started to receive multithreading features (outside of SharedArrayBuffer and other shareable/sendable types, a runtime could opt to run stuff like WebWorkers/WebAudioWorkers in separate processes).
Early Node f.ex. had a multi-process setup built in, Node initially was about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is that TypeScript helps so damn much once projects get a tad larger, but usage with Node hot-reloading was kinda slow: multiple seconds from saving a file until your application reloads. Even mainline Node nowadays has direct .ts file loading and type erasing to quicken the workflow.
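For reference, that direct .ts loading is the type-stripping feature (flag per the Node docs; newer Node releases enable it by default):

    node --experimental-strip-types app.ts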
Anyone know how much Anthropic paid for Bun? I assume it was at least $26M, so Bun could break even and pay back its own investors, but I didn't see a number in the announcements from Anthropic or Bun.
The standard argument here is that the maintainers of the core technology are likely to do a better job of hosting it because they have deeper understanding of how it all works.
Hosting is a commodity. Runtimes are too. In this case, the strategy is to make a better runtime, attract developers, and eventually give them a super easy way to run their project in the cloud, e.g. bun deploy, which is a reserved no-op command. I really like Bun's DX.
I mean if you're getting X number of users per day and you don't need to pay for bandwidth or anything, there's gotta be SOME way to monetize down the line.
Whether your userbase or the current CEO likes it or not.
No, but faced with either a loss or a modest return, they'll take the modest return (unless it's more beneficial not to, come tax season). Unicorns are called unicorns for a reason.
This decision is honestly very confusing to me as a constant user of Claude Code (I have 3 of them open at the moment.)
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The Keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
Ink (and modern alternatives) is probably the best TUI toolkit. If you want to write a UI that's genuinely good, you need e.g. HTML, or some way to express divs and flexbox. There isn't really another way to build professional-grade UIs; I love immediate-mode UI for games, but the breadth of features handled by the browser UI ecosystem is astonishing. It is a genuinely hard problem.
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
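For anyone who hasn't used Ink: it's React components rendered to the terminal, so you really do write flexbox. A minimal sketch (the component is made up; the Ink APIs shown are the documented ones):

    import React from "react";
    import { render, Box, Text } from "ink";

    // Boxes are flexbox containers; Text maps to styled terminal output.
    const Status = () => (
      <Box flexDirection="column" borderStyle="round" padding={1}>
        <Text bold>my-cli</Text>
        <Box justifyContent="space-between">
          <Text color="green">ready</Text>
          <Text dimColor>ctrl+c to quit</Text>
        </Box>
      </Box>
    );

    render(<Status />);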
Ink seems to be the root cause of a major issue with the Claude Code CLI where it flickers horribly when it needs to repeatedly clear the screen and redraw.
The idea that you need or want HTML or CSS to write a TUI is missing the entire point of what made TUIs great in the first place. They were great precisely because they were clean, fast, simple, focused -- and didn’t require an entire web stack to draw colored boxes.
I'm not so sure about that. I've written some nontrivial TUIs in my time, the largest one being [1], and as the project got more complicated I did find myself often thinking "It sure would be nice if I could somehow just write this stuff with CSS instead of tiny state machines and control codes for coloration". There's no reason these languages couldn't compile down to a TUI as lean as hand-coloring everything yourself.
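For the contrast: the hand-rolled version means juggling raw ANSI escape codes yourself (standard SGR codes shown; this styling sketch is mine, not from the linked project):

    // Hand-rolled terminal coloring via ANSI SGR escape codes:
    // 1 = bold, 31 = red foreground, 0 = reset.
    const red = (s: string) => `\x1b[31m${s}\x1b[0m`;
    const bold = (s: string) => `\x1b[1m${s}\x1b[0m`;
    console.log(bold(red("error:")) + " something broke");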
Yes, for simple projects, absolutely. But when you're shipping something as widely adopted as CC, I disagree. At the end of the day, you're making a UI. It happens to be rendered via the terminal. You still need accessibility, consistent layouts, easy integration with your backend services, inputs, forms, and so on. If you don't need that stuff, there are lots of other, simpler options. But if you do, your other options begin to resemble a half baked, bug filled reimplementation of the web. So just use the web.
“Port it to a different language”? To a language that's more out of distribution? Bad devex. Store data as an unreadable binary file? Bad devex.
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
I have to admit this was my first thought, too. I'm pretty obsessed with Claude Code, but the actual app is so incredibly poorly engineered for something that doesn't even do that much.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
So many comments speculating about the reasoning here, yet none about the very obvious one: it's not the stability of the infrastructure, it's the future direction of a product like Claude Code. They need to know how to keep their optimisation machine fitting developers' needs the best way possible (for good or for worse).
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
This reads more like Anthropic wanted to hire Jarred, and Jarred wants to work with AI rather than build a SaaS product around bun. I doubt it has anything to do with what is best for bun the project. Considering bun always seemed to value performance above all else, the only real way for them to keep pursuing that value would be to move into actual JS engine design. This seems like a good pivot for Jarred personally, and likely a loss for bun.
It doesn't read like that to me at all. This reads to me like Anthropic realizing that they have $1bn in annual revenue from Claude Code that's dependent on Bun, and acquiring Bun is a great and comparatively cheap way to remove any risk from that dependency.
I haven't had any issue moving projects between node, bun, and deno for years. I don't agree that the risk of bun failing as a company affects Anthropic at all. Bun has a permissive license that Anthropic could fork from, Anthropic likely knew that Oven had a long runway and wasn't in immediate danger, and switching to a new JS CLI tool is not the huge lift most people think it is in 2025. Why pay for something you're already getting for free, can expect to keep getting for free for at least four years, and could buy for less if it fails later?
This argument doesn’t make much sense to me. Claude Code, like any product, presumably has dozens of external dependencies. What’s so special about Bun specifically that motivated an acquisition?
A dependency that forms the foundation of your build process, distribution mechanisms, and management of other dependencies is a materially different risk than a dependency that, say, colorizes terminal output.
I’m doubtful that alone motivated an acquisition, it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.
> MIT code, let Bun continue develop it, once project is abandoned hire the developers.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
If they found themselves pushing PRs to bun that got ignored and they wanted to speed up priority on things they needed, if the acq was cheap enough, this is the way to do it.
I'm also curious if Anthropic was worried about the funding situation for Bun. The easiest way to allay any concerns about longevity is to just acquire them outright.
It's not easy to "just" fork a huge project like Bun. You'll need to commit several devs to it, and they'll have to have Zig and JSC experience, a hard combo to hire for. In many ways, this is an acquihire.
Nah, it reads like the normal logic behind the consulting model for open source monetization, except that Bun was able to make it work with just one customer. Good for them, though it comes with some risks, especially when structured as an acquisition.
So Anthropic sees its CLI (in TypeScript) as the first-class product, and is maybe planning to expand Claude Code with more JS-based agents / ecosystem? Especially since owning the runtime gives a lot of control over developer experience.
uv is very forkable - dual-licensed under Apache and MIT, high quality codebase, it's Rust rather than Python but the Python community has an increasing amount of Rust experience these days.
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem from a relatively young VC-backed company.
Honestly, given the constant rollercoaster of version management and building tools for Python the move to something else would be expected rather than surprising.
uv seems like a great tool, but I remember thinking the same about pipenv, too.
uv is a revolution in every possible positive sense of the word in the Python world, and I've been here since 1.5. It is imperative that bitter old-timers like us try it. I did, and the only regret I've got is that I didn't do it sooner.
Honestly, that is an understatement. `uv run` has transformed how I use Python, since 99% of the time I don't need to set up or manage an environment and dependencies. I have tons of one-off Python scripts (with their dependencies in PEP 723 metadata at the top of the file) that just work with `uv run`.
I get how it might not be as useful in a production deployment, where the system/container will be set up just for that Python service, but for less structured use cases, `uv` is a silver bullet.
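Concretely, the PEP 723 header looks like this (toy script; uv reads the block and provisions the dependencies on `uv run script.py`):

    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests
    print(requests.get("https://example.com").status_code)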
Considering that 1) Bun is written in Zig, 2) Zig has a strict no-AI policy [1], and 3) Bun has joined Claude, it seems that Bun and Zig are increasingly culturally apart.
You're reading a code of conduct for contributing to the Zig project. I don't think everything there is guidance for everything written in Zig; e.g. "English is encouraged" is something one might not want for a project written in Zig by native French speakers, and I don't think that's something Zig would want to suggest to them. I read the AI part as much more motivated by the asymmetries of open-source project contribution than as any statement about the language itself. Fly-by AI contributions are bad because they make particularly poor use of maintainer time. Similar to the rule on proposing language changes, which can suck up lots of reading/thinking/discussion time. When you have people regularly working together (e.g. those people at Anthropic working on Bun), the incentives are different, because there is a higher cost to wasting your colleague's time.
Nothing I found says anything about Zig folks being inherently against AI. It just looks like they don’t want to deal with “AI Slop” in contributions to their project, which is very understandable.
What matters: it's staying open source and MIT licensed. I sincerely hope it stays that way. Congrats to the Bun team on making a great tool and getting the recognition they deserve.
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)
Nothing gives you long term stability in tech. You have to constantly work at staying stable, and it isn't always up to anything the company is in control of, no matter what ownership they have.
Godspeed. Seems like a good pairing. Bun is sort of the only part of the JS ecosystem I like, and Code has become such an important tool for my work, that I think good things will come out of this match. Go Bundler as well.
I'm curious what the acquisition price was. Bun said they've raised $26 million, so I'm assuming the price tag has to be a lot higher than that for investors to agree to an acquisition.
Wouldn’t it make more sense to write the same functionality using a more performant, no-gc language? Aren’t competitors praised for their CLIs being faster for that reason?
With AI tooling, we are in the era where rapid iteration on product matters more than optimal runtime performance. Given that, implementing your AI tooling in a language that maximizes engineer productivity makes sense, and I believe GC does that.
JS/TS has a fundamental advantage, because there is more open source JS/TS than any other language, so LLMs training on JS/TS have more to work with. Combine that with having the largest developer community, which means you have more people using LLMs to write JS/TS than any other language, and people use it more because it works better, then the advantage compounds as you retrain on usage data.
One would expect that "AI tooling" is there for rapid iteration and one can use it with performant languages. We already had "rapid iteration" with GC languages.
If "AI tooling" makes developers more productive regardless of language, then it's still more productive to use a more productive language. If JS is more productive than C++, then "N% more productive JS" is still more productive than "N% more productive C++", for all positive N.
but they are a company that burns billions every year in losses and this seems like a pretty random acquisition.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
I'm wondering if Bun would be a good embedded runtime for Claude to think in. If it does sandboxing, or if they can add sandboxing, then they can standardize on a language and runtime for Claude Code and Claude Desktop and bake it into training like they do with other agentic things like tool calls. It'd be too risky to do unless they owned the runtime.
In the post they try to address the following question:
"If I bet my work project or company's tech stack on Bun, will it still be around in five or ten years?"
and the thing is, we don't know if Anthropic itself will be around in five to ten years.
It seems the default is node (despite the project docs saying to use bun and all example script documentation using bun). It will use bun if told, but there’s definitely nothing saying to use node and it uses that anyway.
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
> So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
...Did you miss the part where Bun used Claude to generate that PR?:)
Sounds like the goal is to bundle Bun with Claude Code insanely tightly, to the point where it doesn't matter if you have nodejs installed locally, while also optimizing key things for Claude Code's Bun runtime as needed. It's a brilliant acquisition, and bun stays open source, which allows it to continue to grow, to Anthropic's benefit and everyone else's.
I just ln bun to npm, npx, and node. This has the added benefit of letting ts_ls and various other tools work without requiring me to have both node and bun installed locally.
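i.e. something like the following (the commenter's trick reconstructed as a sketch; paths are illustrative, and how completely bun emulates each CLI when invoked under these names is their claim, not something I've verified):

    ln -s "$(command -v bun)" ~/.local/bin/node
    ln -s "$(command -v bun)" ~/.local/bin/npm
    ln -s "$(command -v bun)" ~/.local/bin/npx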
Congrats Jarred and team! You have saved humanity many hours already, and I'm sure with Anthropic's backing, you will spare us many more. Farewell would-be headaches from Node & NPM tooling and waiting for builds and tests and package updates. Exciting times ahead!
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
Ha, Physics majors get the same talk about law school. It's just the selection bias of selecting for people willing to make hard pivots filtering out the under-achieving, go-with-the-flow types.
I really think this was part of the pitch deck for bun's funding: that a bigger company would acquire it for the technology. The only reason an AI company, or any company for that matter, would acquire it would be to:
Not saying it's 100%; the REPL is still missing, but all of node's API is available in the sense that it's ABI-compatible (or will be in the very near term).
If they keep it MIT licensed, then if/when things come crashing down, I think it's reasonable to expect Bun would continue on in some form, even if development slows without paid contributors.
I’ve never understood the security utility of the Deno flags. What practical attack would they protect you from? Supply chain seems to be the idea, but how many npm packages do people use that neither:
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
Interesting. Looking through a strategic lens, I feel like this is related to the $1,000 free credit for Claude Code Web (I used a few hundred). What the heck are they aiming for? CodeAct? (https://arxiv.org/abs/2402.01030)
Can anyone provide some color around this: "I started porting esbuild's JSX & TypeScript transpiler from Go to Zig"? Hypothetical benefits include a single language for development, better interoperability with C and C++, no garbage collection, and better performance. What turned out to be realized and relevant here? Please, no speculation or language flame wars.
Hahaha congratulations. This is amazing. The most unlikely outcome for a devtools team. Fascinating stuff.
This is promising for Astral et al., whom I really like but have worried about in terms of sustainability. It also points to the value of being as close to the user as possible.
I don't know for sure, but it's definitely the first tool of that value to have a persistent strobing (scroll position) bug so bad that passersby ask me if I'm okay when they see it.
Man, I had never even put words to that problem but you are right that it is beyond annoying. It seems to me like it worsens the longer the Claude instance has run - I don't seem to see it early in the session.
Yeah, issues have been open on GitHub for months. I've tried shortening my scrollback history and using other emulators but it doesn't seem to make a difference. It's pretty frustrating for a paid tool.
It doesn't make a lot of sense that they'll compare Microsoft 365 Copilot with Claude Code, though? Like it is a legit CLI tool but we should ignore it because it shares the name with something else?
Terraform gets to $600mm if you squint really hard and make stuff up. Kubectl, though. Whatever you want to say about kubernetes complexity, it does get a bunch of money run through it. We could also look at aws-cli, gcloud, and az; if we assign the cloud budgets that get run through there, I'm sure it's in the hundreds of millions. Then there's git. Across the whole ecosystem, there's probably a cool couple billion floating through there. gh is probably much smaller. Other tools like docker and ansible come to mind, though those are not quite as popular. CC only hits $1B ARR if you squint really hard in the first place, so in this handwavy realm, I'd say aws-cli comes first, then kubectl, then git, with maybe docker and terraform in the mix as well. Nonetheless, Claude is a really awesome CLI tool that I use most days.
Good luck. I'm always worried about stuff like this, because it has happened so many times and the product eventually got worse. At the same time, I understand how much effort went into building something like Bun, and people need to fund their lives somehow, so there's that.
Congrats. This is the first time I remember reading a genuine, authentic story about a sale. Much preferred over “this is about continuing the mission until my earn-out is complete.”
Look, if a terminal emulator can raise $67 million by riding the AI hype wave, then a JavaScript runtime can do the same. Nobody ever said that AI investments and acquisitions have to make any sense.
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.
Nothing against Anthropic but with the circular financing, all the debt, OpenAI's spending and over-valuations "AI" is the riskier bet than Bun and hosting.
Yeah that’s the main part that puzzled me, super happy for the team that they got a successful exit, but I wouldn’t really consider Anthropic’s situation to be stable…
Yeah, no reader of tech news will take an acquisition of a company with four years of runway as anything but decreasing the odds their product will still be around (and useful to the same audience…) in four years. Even without being tied to a company with lots of exposure to a probable bubble.
How so? Presumably Jarred got a nice enough payout that if Anthropic failed, he would not need to work. At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
> At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
Is there any historical precedent of someone doing that?
Opus 4.5 is not living in a vacuum. It's the most expensive of the models for coders, and there is Gemini 3 Pro, with many discounts, and DeepSeek 3.2, which is 50x cheaper and not far behind.
> I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.
It does matter. The public ultimately determines how much they get in funding if at all.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. That didn't prevent the dot-com crash.
There are lots of scenarios this can play out, e.g. Anthropic fails to raise a certain round because money dried up. OpenAI buys Anthropic but decides they don't need Bun and closes out the project.
https://www.anthropic.com/engineering/advanced-tool-use
I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?
Can't you just do it as a library and train/instruct the LLM to prefer using that library?
Kind of a tangent, but I used to think static types were a must-have for LLM-generated code. But the most magical and impressively awesome thing I've seen for LLM code generation is "calva backseat driver", a VSCode extension that lets Copilot evaluate Clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go, because of its fast compiler and better performance.
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in production-grade developer experience and in the quality of packages on offer on PyPI and npm versus Maven and NuGet.
Nonsense. The average Java/C# dev is an enterprise code monkey who barely knows anything outside of their grotesque codebase.
> production-grade developer experience
Please, Maven and Gradle are crimes against humanity. There's a special place reserved for Gradle creators in hell for sure.
The "production-grade" developers should ditch their piece of shit, ancient "tooling" and just copy uv/go/dart/rust tooling.
100%. Even more robust if paired with an overlay network that provides identity-based S3 access (rather than IP-address/network-based). Otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind a VPN/Direct Connect.
Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the firewall ACL, or may not be on the VPN/WAN.
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...
What the user you're replying to is saying is that the Bun acquisition looks silly if you see Bun as just a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
0: https://bun.com/docs/runtime/s3
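Per the linked docs, the bindings look roughly like this (a sketch; check the page above for the exact API, and bucket/credentials come from the environment here):

    import { s3 } from "bun";

    // Lazy reference: no network request until you read or write it.
    const report = s3.file("reports/latest.json");
    const data = await report.json();
    await s3.file("reports/summary.txt").write("ok");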
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run JavaScript from around the web. At least with a command-line web-capable JS runtime monstrosity, the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control).
1. One can already see an approach something like this being used in the case of
https://github.com/yt-dlp/yt-dlp/wiki/EJS
where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
They weren't acquired and paid just to keep building tooling as before while completely ignoring monetization until the end of time.
Happy to answer any questions
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious in a way to the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
putting everyone using the generated outputs into a sort of unofficial grey market: even when using first-party tools. Which is weird.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonably good at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe-coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for the things it's particularly well suited to.
Here's another one. An SM-24 Geophone + Raspberry PI 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using flask I need a web viewer + conversion on the geophone velocity figures for displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
[0] https://filepilot.tech
[1] https://terminal.click
[2] https://handmadecities.com/news/summer-update-2025/
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
This sounds so cringe. We are talking about computer code here lol
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company is the main reason for the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
If the answer is performance, how does Bun achieve things quicker than Node?
Do you think Anthropic might request you implement private APIs?
I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?
Thanks
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
Congratulations.
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to say because they have no power to enforce it, and they didn't actually say it so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kinda is.
Will this make it more or less likely for people to use Bun vs Deno?
And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I started out with Deno, and when I discovered Bun, I pivoted. Personally I don't need the NodeJS/NPM compatibility. I wish there were a Bun-lite freed of the backward compatibility.
I use Hono, Zod, and Drizzle which AFAIK don't need Node compat.
IIRC I've only used Node compat once to delete a folder recursively with rm.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders started with a passion project, got funding, built something they were excited about, and then exited, than yet another AI company from non-technical founders, built for the sole purpose of raising funding and exiting.
I don't think they're doing that.
Estimates I've seen have their inference margin at ~60% - there's one from Morgan Stanley in this article, for example: https://www.businessinsider.com/amazon-anthropic-billions-cl...
Not estimate, assumption.
The best one is from the Information, but they're behind a paywall so not useful to link to. https://www.theinformation.com/articles/anthropic-projects-7...
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
Not really. If the technology stalls where it is, AI still takes a sizable chunk of the dollars previously paid to coders, transcribers, translators and the like.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
Incorrect - that was the fraudulent NAV.
An estimate for true cash inflow that was lost is about $20 billion (which is still an enormous number!)
Anthropic's unit margins are fine; many Lovable-like businesses' are not.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is all but public domain (MIT) and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
An analogous example off the top of my head: Shopify hiring Rafael Franca to work on Rails full-time.
You have no responsibility for an unrelated company's operations; if that was important to them they could have paid their talent more.
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.
> aren’t typically hired individually but as teams.
So? VC bros seem to forget the labor market is also a free market as soon it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
These aren't the same things, and nobody negotiating an acquisition or acquihire converts in this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
As a broader comment on online discourse: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
Reminds me of when Tron, the crypto company, bought BitTorrent.
If Bun embraces the sweet spot around edge computing, modern JS/TS and AI services, I think their future ahead looks bright.
Bun seems more alive than Deno, FWIW.
Does that mean anything at all?
OpenAI is a public benefit corporation.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
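If you want to reproduce the comparison yourself, here's roughly what I do (the cache-clearing commands are real; the numbers will vary by project):

    rm -rf node_modules && npm cache clean --force && time npm install
    rm -rf node_modules && bun pm cache rm && time bun install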
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Faster install and less disk space due to hardlinks? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
The reason why you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great, and we don't have to choose sides. Deno has its use case. Bun has its own. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs The World is getting old. Rust isn't any "safer" when Deno can panic too.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
1: https://go.dev/ref/mem
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store rules supposedly the only JS engine allowed anywhere on iOS.
It's more "mature" than V8 in the sense of predating it. (V8 was not a fork of it and was started from scratch; it was designed to replace JSC in the Blink fork of WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
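For the curious: proper tail calls are a strict-mode ES2015 feature that JavaScriptCore (Bun's engine) implements and V8 (Node's) does not. A minimal illustration:

    "use strict";
    function countdown(n) {
      if (n === 0) return "done";
      return countdown(n - 1); // tail position: JSC reuses the stack frame
    }
    // prints "done" on Bun (JSC); throws RangeError (stack overflow) on Node (V8)
    console.log(countdown(1_000_000));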
I haven't had that experience with deno (or node)
Between that and the Discord, I have gotten the distinct impression that Deno is for "server JavaScript" first, rather than just "JavaScript" first. Which is understandable, but doesn't cater much to me, a frontend-first dev.
The tools that the language offers to handle use-after-free are hardly any different from using Purify or Insure++ back in 2000.
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
Anthropic chose to use Bun to build their tooling.
Profit in those products has to justify now having their own compiler team for a JavaScript runtime.
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
Here are the Bun APIs:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
https://docs.deno.com/api/deno/
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
For deploys, running the attached Terraform script usually takes more time anyway.
So while a speed increase is welcome, I don't feel it gives me much of a boost.
Telling prospective employees that "if you're not ready to work 60-hour weeks, then what the fuck are you doing here?", for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them.
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed till the end of 2024, and as you can see, the fix was only accidentally merged in, then tolerated. It was promptly broken by a Bun breaking change:
https://github.com/aws/aws-cdk/issues/33464
but don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production
you can use SST to deploy cloud resources on AWS and any cloud, and that package works with bun
Elaborate? I believe Zig's donors don't get any influence or decision-making power.
Investors must be happy because Bun never had to find out how to become profitable.
except this sense:
> Investors must be happy because Bun never had to find out how to become profitable.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
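For a flavor of those throwaway scripts, a hypothetical example (nothing from my actual codebase):

    // ad-hoc benchmark: JSON round-trip vs structuredClone for copying a config blob
    const obj = { items: Array.from({ length: 10_000 }, (_, i) => ({ id: i, ok: true })) };

    console.time("JSON round-trip");
    for (let i = 0; i < 1000; i++) JSON.parse(JSON.stringify(obj));
    console.timeEnd("JSON round-trip");

    console.time("structuredClone");
    for (let i = 0; i < 1000; i++) structuredClone(obj);
    console.timeEnd("structuredClone");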
Put another way, at least in my workflow and at my workplace, the volume of code has increased. Most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, penetration is uneven: AI helps more with tasks that are well represented in the training set (webapps, data science, Linux admin...) than with, e.g., issues arising from quirky internal architecture, Rust, etc.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing, mind you, and there's a lot of ceremony in tweaking workflows, much like there has been for editors. It's not much different from instructing juniors.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
Making the decisions and implementing the decisions are complementary; one of these is being commoditised.
And, in fact, decimated.
Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn
I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784
You can see my site here, if you'd like: https://chipscompo.com/
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and workflow (working with the agent from a plan rather than simply prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
My best writing on this topic is still this though (which doesn't include a video): https://simonwillison.net/2025/Mar/11/using-llms-for-code/
Programming languages all are a balance between performance/etc and making it easy for a human to interact with. This balance is going to shit as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
Acquisition of Apple Swift division incoming?
https://github.blog/news-insights/octoverse/octoverse-a-new-...
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use Claude Code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I thought they could use it to speed up generating/pulling JS projects, but it seems more likely that Claude Code and Bun will have a separate project where they integrate both: Claude Code taking full advantage of Bun itself, and Bun focusing on tight coupling to ensure Claude Code runs optimally.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
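To make that concrete, the Node SEA route needs a separate config file along these lines (paths are hypothetical; the main/output/assets field names come from Node's docs), plus the extra steps of bundling to CJS and injecting the blob into a copied node binary:

    {
      "main": "dist/bundle.js",
      "output": "sea-prep.blob",
      "assets": {
        "lang-en.json": "assets/lang-en.json",
        "default-config.json": "assets/default-config.json"
      }
    }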
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
it detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets, declare imports, or do anything other than run this command line. I don't need to survey the various bundlers and find one that works fine with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
Node is death by a thousand cuts compared to the various experiences offered by Bun.
Node adds quite the startup latency over Bun too and is just not too pleasant for making CLI scripts.
Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?
I'm not sure if Joyent have any significant role in Node.js maintenance any more.
regardless, it's certainly not MS.
Claude Code is a $1B+ cash machine, and Anthropic directly uses Bun for it.
Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.
Makes sense, but this is just another day in San Francisco of a $0 revenue startup being bought out.
Anything is greater than 0
That perspective following “in two-three years” makes me shudder, honestly.
At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.
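A hypothetical sketch of such an agent task (endpoint and column names invented for illustration; the CSV parsing is naive):

    // fetch a CSV from a REST endpoint and keep only the rows worth putting in context
    const res = await fetch("https://example.com/api/orders.csv"); // hypothetical endpoint
    const [header, ...rows] = (await res.text()).trim().split("\n");
    const cols = header.split(","); // naive split; fine for a sketch, ignores quoted fields
    const table = rows.map((line) => {
      const cells = line.split(",");
      return Object.fromEntries(cols.map((c, i) => [c, cells[i]]));
    });
    console.log(table.filter((row) => row.status === "failed"));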
Being able to create an agent in any language to run on any hardware has always been possible hasn't it?
Another option is that this was an equity deal where Bun shareholders believe there is still a large multiple worth of potential upside in the current Anthropic valuation.
Plus many other scenarios.
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e., single file-based) and compilation to a binary with no dependencies whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All great for AI to recover during its iterations of generating something useful.
Genuinely perplexed.
I dislike it also.
Like I’ve said: NativeAOT
https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...
Congrats to Jarred and the team!
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
This is so far from the truth. Bun, Zig, and uWebSockets are passion projects run by individuals with deep systems programming expertise. The furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
Don't read a snarky comment so literally ;)
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
Early Node, for example, had a multi-process setup built in; Node was initially about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is because TypeScript helps so damn much once projects get a tad larger, but hot-reloading with Node was kinda slow: multiple seconds from saving a file until your application reloads. Even mainline Node nowadays has direct .ts file loading and type erasing to quicken the workflow.
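Concretely, both of these now run a TypeScript file directly (the Node flag shipped as experimental around Node 22.6 and, if I recall correctly, is enabled by default in newer releases):

    node --experimental-strip-types app.ts
    bun app.ts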
The reality is that the insane "JS ecosystem" will rally around whatever is the latest hotness.
In the article they write about the early days
Why do investors invest in people who build something that they give away for free? Why invest in a company that has the additional burden of developing Bun; why not in a company that does only the hosting?
There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...
Whether your userbase or the current CEO likes it or not.
Acquisition seems like a large overhead and maybe a slight pivot to me.
Moreover, now they can make investments to turn it into an even more efficient and secure runtime for model workspaces.
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The Keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
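To be concrete, even staying in JS land, the SQLite half of this is a few lines with Bun's built-in driver. A minimal sketch with a hypothetical schema:

    import { Database } from "bun:sqlite";

    // appending a message becomes an O(1) insert instead of rewriting one huge JSON file
    const db = new Database("session.db");
    db.run("CREATE TABLE IF NOT EXISTS messages (id INTEGER PRIMARY KEY, role TEXT, content TEXT)");

    const insert = db.prepare("INSERT INTO messages (role, content) VALUES (?, ?)");
    insert.run("user", "hello");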
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
I don't know why it's even necessary for this.
https://github.com/atxtechbro/test-ink-flickering
Issue on Claude Code GitHub:
https://github.com/anthropics/claude-code/issues/769
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
How was Go involved there before Zig?
The first hints of what became Bun were when Jarred experimented with porting that to Zig.
I'm doubtful that alone motivated an acquisition; it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.
If they don't want to maintain it: a GitHub fork with more motivated people.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem come from a relatively young VC-backed company.
uv seems like a great tool, but I remember thinking the same about Pipenv, too.
I get how it might not be as useful in a production deployment where the system/container will be set up just for that Python service, but for less structured use cases, `uv` is a silver bullet.
#2, if you don't like uv, you can switch to something else.
uv probably has the least moat around it of anything. Truly a meritocracy: people use it because it's good, not because they're stuck with it.
Python is doing great, other than still taking baby steps toward having a JIT in CPython.
That's like saying GCC and NodeJS are culturally apart, as if that has significant bearing on either?
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(For those who are disappointed: this is why you stick with Node. Deno and Bun are both VC-funded projects; there's only one way that goes. The only question is the timeline.)
Sure. But everything is relative. For instance, Node has much more likelihood of long term stability than Bun, given its ownership.
Given how many more dependencies you need to build/maintain a Node app, your Bun application has a better chance of long term stability.
With Node almost everything is third party (db driver, S3, router, etc) and the vast majority of NPM deps have dozens if not hundreds of deps.
Feels like maybe AI companies are starting to feel the questions on their capital spending? They wanna show that this is a responsible acquisition.
Put the Bun folks directly on that please and nothing else.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
Because now the Bun team don't have to redirect their resources to implementing a sustainable business model.
No they don't.
IOW look where the puck is going.
https://github.com/oven-sh/bun/pull/24578
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
> that the Bun Claude bot created a PR for about 3 weeks ago
The PR with bad code that's also been ignored was made by the bot that Bun made, and brags about in their acquisition post.
...Did you miss the part where Bun used Claude to generate that PR?:)
1. User krig reports an issue against the Bun repo: https://github.com/oven-sh/bun/issues/24548
2. Bun's own automated "bunbot" filed a PR with a potential fix: https://github.com/oven-sh/bun/pull/24578
3. taylordotfish (not an employee of Bun as far as I can tell, but quite an active contributor to their repo) left a code review pointing out many flaws: https://github.com/oven-sh/bun/pull/24578#pullrequestreview-...
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
That's 100% what happened to Bun. It's useful (like really useful) and now they're getting rewarded
1. acquire talent.
2. control the future roadmap of bun.
I think it's really 1.
...but hey, things are different during a bubble.
This will make sure Bun is around for many, many, years to come. Thanks Anthropic.
Why Bun?
Easy to set up and go. bun run <something.ts>
Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)
TypeScript supported. (No need for tsc; Bun can transpile for you)
Binary builds. (single executables for easy deployment)
Full Node.js Support. (The whole API)
Full NPM Support. (All the packages)
Native modules. (90% and getting better thanks to Zig's interop)
S3 File / SQL Builtin. (Blazingly Fast!)
You should try it. Yes, others do these things too, but we're talking about Bun.
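A tiny sketch touching a few of those built-ins (the S3 key is hypothetical; the S3 client reads credentials from the usual environment variables):

    import { Database } from "bun:sqlite";

    const db = new Database(":memory:");
    db.run("CREATE TABLE hits (at TEXT)");
    const hit = db.prepare("INSERT INTO hits (at) VALUES (?)");

    Bun.serve({
      port: 3000,
      async fetch(req) {
        hit.run(new Date().toISOString()); // built-in SQLite
        const report = Bun.s3.file("reports/latest.txt"); // built-in S3 client, hypothetical key
        return new Response(await report.text());
      },
    });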
And even in packages with full support, you can find many GitHub issues where Bun behaves differently, which leads to some bugs.
Well, until the bubble bursts and Anthropic fizzles out or gets acquired themselves.
* Get run by devs with filesystem permissions
* Get bundled into production
And apparently the submission's source is the only org, as far as I can tell, that anticipated this: https://www.theinformation.com/articles/anthropic-advanced-t...
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
[^1]: https://github.com/sst/opencode [^2]: https://github.com/sst/opentui
This is promising for Astral et al., whom I really like but have worried about sustainability-wise. It also points to the importance of being as close to the user as possible.
Prior to that GitHub Copilot was either the VS Code IDE integration or the various AI features that popped up around the GitHub.com site itself.
To some degree, having “opinionated views on tech stacks” is unavoidable in LLMs, but this seems like it moves us toward a horrible future.
Imagine if claude (or gemini) let you as a business pay to “prefer” certain tech in generated code?
Its google ads all over again.
The thing is, if they own Bun, and they want people to use Bun, how can they justify not preferring it on the server side?
…and once one team does it… game on!
It just seems like a sucky future, that is now going to be unavoidable.
https://www.youtube.com/watch?v=6hEiUq8jWIg
And when this bubble pops, down goes Bun.
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun seeing out that runway than of the current AI hype still being around.
Nothing against Anthropic, but with the circular financing, all the debt, OpenAI's spending, and the over-valuations, "AI" is the riskier bet compared to Bun and hosting.
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
Is there any historical precedent of someone doing that?
The effective demand for Opus 4.5 is bottomless; the models will only get better.
People will always want a code model as good as we have now, let alone better.
Bun securing default status in the best coding model is a win-win-win
It does matter. The public ultimately determines how much funding they get, if any.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. That didn't stop the dot-com crash.
There are lots of scenarios this can play out, e.g. Anthropic fails to raise a certain round because money dried up. OpenAI buys Anthropic but decides they don't need Bun and closes out the project.
Why?
[1] www.bunn.com
Regards.
> We’re hiring engineers.
Careers page:
> Sorry, no job openings at the moment.
https://www.anthropic.com/jobs?team=4050633008
It’s wild what happens when a generation of programmers doesn’t know anything except webdev. How far from grace we have fallen.
That's quite a bit harder if your tool is built using a compiled language like Go.
Thank you for showing exactly why acquisitions like this will continue to happen.
If you don't support tools like Bun, don't be surprised to see them raise money from VCs and get bought out by large companies.