https://www.anthropic.com/engineering/advanced-tool-use
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
I see Bun as Skynet: if it can run anywhere, the AI can run anywhere.
Java is possibly the safest bet on the future: it's open source both in spec and in the most common implementation (OpenJDK), and it's so widely used that multiple FAANG companies are critically dependent on Java working; any one of them alone could continue development of the platform were anything to happen.
Besides, Oracle has been a surprisingly good steward of the language.
That's all that's needed to create a sense of caution for would-be adopters.
Oracle lost the lawsuit, and I do agree with the decision in that APIs should be freely replicable, but let's not pretend that Google was some saintly good guy here fighting the good fight; they were just cheap and aggressively capitalistic.
Legally the case was about copying declaring code from a proprietary product, not an open source one.
These famous cases actually set the legal tone for the entire world.
The implications of a judgment in favor of Oracle were staggering. Any codebase that is extensively dependent on a proprietary API is legally locked into using that company's proprietary implementation for as long as that company asserts copyright on its API. Anyone who implements the same API to offer a drop-in alternative to the proprietary product is infringing -- even if someone privately reimplements the API without distributing the reimplementation.
Which immediately implicates WINE (Windows), Mono (.NET), ReactOS (Windows), Darling (macOS/Darwin), GNUStep (Cocoa/OpenStep), Anbox (Android), Ruffle (flash), GNU Octave (MATLAB), Mesa 3D (Direct3D), ZLUDA (CUDA), and DXVK (Direct3D 9/10/11), to name a few of the most popular...
Reimplementing an interface != stealing code.
It has a similar basis in fact.
To this day Android Java is not fully compatible with Java proper, and Kotlin became Google's version of C#.
Plus last time I checked Oracle lost that lawsuit.
This is only true for older .NET Framework applications.
<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders
EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!
So how did it work back in the day, people would just submit text and it would get upvoted? I always assumed like half of them were just made up.
As far as provenance goes, I assume a lot of them were made up too, but this one was real.
If anyone ever requested/used an eggdrop(?) bot from #farmbots or #wildbots on quakenet then thanks to you too; that was certainly one of the next steps down the path I took. A (probably very injectable) PHP blog and a bunch of TCL scripts powering bots, man I wish I could review that code now.
To everyone else: I acknowledge that this post is not adding value, but if you were one of the lucky 1/10000 you would understand that I have no choice.
seems to work. relies on that individual quote being indexed, and google SERPs feeling like returning full results at the moment, of course. when the latter fails, I've found success with site: queries on Bing (of all places.)
you could also run java with js if you are brave enough https://kreijstal.github.io/java-tools/
But the majority of projects have been on a newer JDK than 8 for quite some years now.
Now if you complain about slice handling, I'm with you.
Java is the running joke of verbosity, and you are too if you seriously argue that it's not.
Evidence is easy: think of a problem and ask an LLM to generate idiomatic examples (leveraging Java streams, functional decomposition, etc.) in Go and Java, with error handling. You will find that more often than not, the Java line count is far smaller.
Idiomatic, Modern Java is written quite differently. Today, Go has a lot of arcane, noisy, complex code too. Ex: many, many k8s Go projects.
But I do know that you mean stuff like AbstractFactoryFactory; you do realize there is zero need to write anything like that, and that you can (and people do) write bad code in any language?
Also, Java's ecosystem is unparalleled: top 3 in size, and depending on the domain it usually has the best packages (e.g. typical backend-related functionality). It has stellar performance, a huge developer base, best-in-class IDE support, and even LLMs understand it exceptionally well (given how widely represented it is in the training corpus, plus it has a decent type system), if that's your thing.
For a typical backend system, you really have to have a good reason to choose something else at this point.
People end up choosing something that has batteries included so they can focus on solving business problems. A programmer can use Spring Boot superficially without understanding how it works. Really, there is no magic there: it's a few core concepts, namely annotations, bytecode enhancement, and dynamic proxies. Maybe I'm missing one or two. Everything else is built on top of this.
This is regardless of language/ecosystem. If I do not understand the fundamental concepts, I will never be successful in that ecosystem.
But that's like saying that a bicycle is better than a car, because the first is simpler to understand. (At the same time, nothing prevents you from assembling a bicycle in Java if that's indeed what you need. But for general long distance travel you are better off starting with a car frame, aren't you?)
Why wouldn't I choose Java?
I left a place using Java to run edge apps and the footprint was a major issue.
For example, if I have a struct `PageEntity` with a field `Id`, and I am iterating over a slice of such IDs, I would prefer using `pid` instead of `pageEntityId` as the variable name. But Java APIs and conventions tend to use these longer names, so I find it takes more thinking to remember the different names instead of quickly seeing the behavior of code at a glance.
Java also tends to have a lot of inheritance which results in these long glued-together names and makes it harder to follow program flow because behaviors get introduced in multiple different places (i.e., it has the opposite of locality of behavior).
But those are just my opinions and experiences! I know many people love Java, and it is a versatile and powerful language.
By contrast `bun install` is about as good as it gets.
Please give me Java tools over C, C++, JavaScript or Python ones, any day of the week.
Only .NET and Rust compare equally in quality of DX.
AI tools value simplicity?!?
Check the Python dependency management chaos: what is the proposal this month, and from which AI startup doing Python tools in Rust?
How many mass security incidents have there been with npm just the last few weeks?
If the build script being a DSL is the issue, they're even experimenting with declarative Gradle scripts [0], which is going to be nice for people used to something like Maven.
The problem with Gradle is that it never had a clear philosophy to begin with. It's trying to be everything to everybody, changes best practices every year and has enough features that the project at hand could entirely be built out of Gradle scripts itself.
And oh, it still requires an update to run every time a new JDK is released, even though the JDK is the most backward-compatible thing ever written.
Gradle is better from this perspective, and hopefully with its "kotlinization" we will see some stability, which was the biggest issue it had before.
In any case, what are the proposed benefits of the "kotlinization"? I tried it about a year ago but realized that it's just a syntax-level wrapper around the same old DSL underneath. In the end, I still viewed it as an ill-described DSL with a massive learning curve outside of happy paths.
> The challenge
> Traditional tool calling creates two fundamental problems as workflows become more complex:
> Context pollution from intermediate results: When Claude analyzes a 10MB log file for error patterns, the entire file enters its context window, even though Claude only needs a summary of error frequencies. When fetching customer data across multiple tables, every record accumulates in context regardless of relevance. These intermediate results consume massive token budgets and can push important information out of the context window entirely.
> Inference overhead and manual synthesis: Each tool call requires a full model inference pass. After receiving results, Claude must "eyeball" the data to extract relevant information, reason about how pieces fit together, and decide what to do next—all through natural language processing. A five tool workflow means five inference passes plus Claude parsing each result, comparing values, and synthesizing conclusions. This is both slow and error-prone.
Basically, instead of Claude trying to, e.g., process data by using inference from its own context, it offloads the work to a program it writes specifically for the task. Up until today we've seen Claude running user-written programs. This new paradigm gives it the freedom to create a program it finds suitable for the task, run it (within the confines of a sandbox), and retrieve the result it needs.
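To make that concrete, here's a minimal sketch of the kind of throwaway script the model could generate and run in the sandbox for the 10MB-log example quoted above. This is my own illustration (the file name and log format are assumed), not Anthropic's actual tooling:

    import { readFileSync } from "node:fs";

    // Reduce a huge log to an error-frequency summary, entirely outside
    // the model's context window.
    const lines = readFileSync("app.log", "utf8").split("\n");
    const counts = new Map<string, number>();
    for (const line of lines) {
      const match = line.match(/ERROR\s+(\w+)/); // assumed log format
      if (match) counts.set(match[1], (counts.get(match[1]) ?? 0) + 1);
    }
    // Only this small summary re-enters the context:
    console.log(JSON.stringify(Object.fromEntries(counts)));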
Super premature optimization. It'll hallucinate which lines it needs to read; it'll continuously miss critical context in favor of trimming tokens.
Luckily we can now hook and force the agent to read full files at least once.
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
Clever!
To be honest, that sounds more like a pitch for deno than for bun, especially the “paranoidly sandboxed” part.
Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.
It looks like Bun also supports Shadow Realms which from my understanding was more intended for sandboxing (although I have no idea how resources are shared between a host environment and Shadow Realms, and how that might potentially differ from the node VM module).
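Rough shape of the two APIs, for comparison (node:vm is the standard module; ShadowRealm is the TC39 proposal, so it may need a type shim in TypeScript, and whether Bun's implementation matches is something I'd verify first):

    import vm from "node:vm";

    // node:vm: sandboxed code runs against a context object you supply, but
    // it shares the host's engine internals, hence Node's security warning.
    const context = vm.createContext({ x: 2 });
    console.log(vm.runInContext("x * 2", context)); // 4

    // ShadowRealm: a fully separate global environment. Only primitives and
    // callables can cross the boundary, so host objects can't leak through.
    declare const ShadowRealm: any; // not yet in TS's standard lib types
    const realm = new ShadowRealm();
    console.log(realm.evaluate("1 + 2")); // 3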
I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?
There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI
My personal experience and anecdotal evidence is in line with this hypothesis. Using the likes of Microsoft's own Copilot with small simple greenfield TypeScript 5 projects results in surprisingly poor results the minute you start leaning heavily on type safety and idiomatic techniques such as branded types.
> There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI
There are also flat earth conferences.
You declare your schema with a good TS ORM, then use something like tRPC to get type inference from your schemas in your route handlers and your front end.
You get an enforced single source of truth that keeps the AI on track with a very small amount of code compared to something like Java.
This really only applies to full stack SAAS apps though.
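A rough sketch of the pattern (the `Page` schema and route name are hypothetical; the zod and tRPC calls are those libraries' APIs as I remember them):

    import { initTRPC } from "@trpc/server";
    import { z } from "zod";

    const t = initTRPC.create();

    // One schema definition; route input/output types are inferred from it.
    const Page = z.object({ id: z.string(), title: z.string() });

    export const appRouter = t.router({
      pageById: t.procedure
        .input(Page.pick({ id: true }))
        .query(({ input }): z.infer<typeof Page> => {
          // hypothetical lookup; a real app would call the ORM here
          return { id: input.id, title: "hello" };
        }),
    });

    // The front end imports only this type, so client, server, and any
    // AI-generated code are all checked against the same source of truth.
    export type AppRouter = typeof appRouter;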
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how close TypeScript and C# are syntactically, plus C#'s speed advantage over JS among many other things, C# should have become the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
Makes me wonder what a theoretical “best possible language for vibe coding” would look like
Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go because of its fast compiler and better performance.
That's not true; if anything, C# is faster and also compiles fast enough.
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in the production-grade developer experience and the quality of packages on offer in PyPI and npm versus Maven and NuGet.
I've seen codebases of varying quality in nearly every language, "enterprise" and otherwise. I've worked at a C# shop and it was no better or worse than the java/kotlin/typescript ones I've worked at.
You can blame the "average" developer in a language for "not caring", but more likely than not you're just observing the friction imposed by older packaging systems. Modern languages are usually coupled with package managers that make it trivial to publish language artifacts to package hubs, whereas Gradle, for example, is its own brand of hell just to get your code to build.
They do so because a boss told them "that's the way we deal with correctness and performance around here".
The fact that their boss made that one decision for them does not somehow transmit the values behind that one decision.
> Nonsense. Average Java/C# is an enterprise monkey who barely knows outside of their grotesque codebase.
Netflix is Java. Amazon is mostly Java. Some of the biggest open source projects in the world are Java. Unity and Godot both use C# for scripting. I don't know where you're getting the impression that Java and C# are somehow only for "enterprise monkey who barely knows outside of their grotesque codebase".
You can add Meta, Google and Palantir to your list and it won't change that the average Java dev is from the Eastern hemisphere and knows nothing about Java outside of JavaEE/Spring.
See how generalizations work?
A typical backend developer using C#/Java is likely solving more complicated problems and has all the concerns of an enterprise system to worry about and maintain.
Dismissing a dev or a system because it is enterprisey is a weak argument to make against a language. A language being used a lot in enterprises to carry the weight of the business is a sign the language is actually great and reliable enough.
All of them explicitly don’t have to care about performance from the start because of VMs + GC, only when scale starts to matter you start to optimize.
The tooling argument is especially funny to me, given how shit the tooling ecosystem is. Sure, it is ol' reliable, but the average Java dev is so stuck in their ways that they've never even tried to venture out of their Java cave to see what's out there.
IntelliJ consuming ALL of the RAM, literally as much as it can get its hands on. Gradle taking what's left, rebuilds taking minutes to complete or requiring an elaborate setup to have proper hot reload. Both TS and Python have far more expressive and powerful type systems than even modern Java. "Production grade tooling" my ass.
Funny to see Java shmucks looking down at JS/Python folks, as if Java at the time wasn't picked for literally the same reasons as Python/JS are nowadays.
what makes you think so?
I believe strong typing is very very useful for human coding,
I'm not convinced it's so "very very" for agents.
I've noticed LLMs just slap on "as any" to solve compile errors in TypeScript code; maybe this is common in the training data. I frequently have to call this out in code review. In many cases it wasn't even a necessary assertion, but it has now turned a variable into "any", which can cause downstream or future problems.
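A contrived example of the failure mode:

    type User = { name: string };

    // The LLM "fix": compiles, but the assertion is redundant here
    // (JSON.parse already returns any) and the value stays unchecked.
    const user = JSON.parse('{"name":"Ada"}') as any;
    console.log(user.nmae.toUpperCase()); // typo type-checks; throws at runtime

    // The cheap honest fix: annotate the expected shape, and the same
    // typo becomes a compile error instead.
    const checked: User = JSON.parse('{"name":"Ada"}');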
I tell the LLM to include typing on any new code.
The agent is running the test harness and checking results.
Otherwise they’d be building these types of things in Rust.
I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?
Can't you just do it as a library and train/instruct the LLM to prefer using that library?
The whole point every CEO with a toe in the AI pool can't stop bleating on about is that software engineering is dead and replaced by AI.
Bun is MIT licensed; they could take Bun free of charge and use their PhD-level software engineer god machine to iterate on it.
The acquisition makes more sense. A few observations:
- No acquisition amount was announced. That indicates some kind of share swap where the investors exchange shares in one company for shares in another. Presumably the founder now has some shares in Anthropic and a nice salary and vesting structure that will keep him on board for a while.
- The main investor was Kleiner Perkins. They are also an investor in Anthropic. 100M in the last round, apparently.
Everything else is a loosely buzzword-compatible thingy for Anthropic's AI coding thingy and some fresh talent for their team. All good. But it's beside the point. This was an investor bailout. They put quite a bit of money into Bun with exactly zero remaining chance of it turning into the next unicorn. Whatever flaky plan there once might have been for revenue that caused them to invest clearly wasn't happening. So they liquidated their investment through an acquihire via one of their other investments.
Kind of shocking how easy it was to raise that kind of money with essentially no plan whatsoever for revenue. Where I live (Berlin), you get laughed away by investors (in a quite smug way typically) unless you have a solid plan for making them money. This wouldn't survive initial contact with due diligence. Apparently money still grows on trees in Silicon Valley.
I like Bun and have used it but from where I'm sitting there was no unicorn lurking there, ever.
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...
100%. even more robust if paired with an overlay network which provides identity based s3 access (rather than ip address/network based). else server may not have access to s3/cloud resource, at least for many enterprises with s3 behind vpn/direct connect.
ditto for cases when want agent/client side to hit s3 directly, bypassing the server, and agent/client may not have permitted IP in FW ACL, or be on vpn/wan.
I mean would it have made sense to acquire golang if it were on sale?
What the user you're replying to is saying is that the Bun acquisition looks silly if you view Bun as a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
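For a flavor of what that binding work looks like, here's roughly how Bun's built-in S3 support reads (reconstructed from memory of the docs, so treat the exact names as approximate; the key is hypothetical and credentials come from the usual AWS environment variables):

    import { s3 } from "bun";

    // Read an object straight from the bucket, no aws-sdk dependency.
    const report = s3.file("reports/latest.json"); // hypothetical key
    const data = await report.json();

    // Write a derived result back.
    await s3.write("reports/latest.size.txt", String(JSON.stringify(data).length));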
bun ≠ front end development tool
hasn't been like that for many years
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run Javascripts from around the web. At least with a commandline web-capable JS runtime monstrosity the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
1. One can already see an approach something like this being used in the case of
https://github.com/yt-dlp/yt-dlp/wiki/EJS
where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)
What about people who do not own a TV?
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
> They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI. I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Sounds like "monetizing Bun is a distraction, so we're letting a deep-pocketed buyer finance Bun moving forward".
And kept their fraudulent name.
I hope they won't be as douchey.
It's not like they were acquired and are getting paid just to build tooling as before while completely ignoring monetization until the end of time.
I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that, Anthropic sucks: their apps and web are complete crap, borderline unusable, and the models are just meh. I get it, CC's head probably got a power play here given that his department is carrying the company, and his secret sauce, according to marketing from Oven, was Bun. In fact VSCode's Claude backend is distributed as a bun-compiled binary exe, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.
Anthropic urgently needs, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.
Anthropic are on track to reach $9BN in annualised revenue by the end of the year, and the six-month-old Claude Code already accounts for $1BN of that.
like when a political leader says they have full faith in one of their ministers, you know said minister will be gone by next week.
> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.
Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.
(I’m half joking, that’s awesome for Zig!)
- a javascript developer
Video games are of course a different story.
True, but where is the fun in that?
Now the real question is: does the game load significantly better now, or does the performance still suck? In which case it might be more an excessive case of yak-shaving. And if yes, when can we expect the release?
Builders build. Sometimes it's not about picking the right tools for the job or starting with the right choices. Sometimes it's just about building.
To use your road analogy - sometimes people just go for a drive. Sometimes those people end up right where they are supposed to be.
Isn’t every success story really an example of survivorship bias?
No, survivorship bias (in this context) means to wrongly see a minor subgroup as the majority. But the successful subgroup is not always a minority, or falsely labelled.
Nobody made DOOM in Excel because they thought it made a good engine.
Thanks for assuming I “read” about bundlers somewhere, though. I’ve been using (and configuring) them since they existed.
It's obvious why he didn't write the game in x86 assembly. It's also obvious why he didn't burn the game to CD-ROM and ship it to toy stores in big box format. Instead he developed it for the web, saving money and shortening the iteration time. The same question could be asked about next.js and especially about taking the time to develop Bun rather than just scrapping next.js for his game and going about his day. It's excellent for him that he did go this route of course, but in my opinion it was a strange path towards building this product.
I also wonder how many people who sing the praises of an HTML file with a script tag hosted by Nginx or whatever have ever built a significant website that way. There’s a lot of frustrating things about the modern JS landscape, but I promise you the end results were not better nor was it easier back before bundlers and React.
Happy to answer any questions
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious, in a way, the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
You get a feel for how much direction they need after working with them for a while, and tooling and accessible documentation are really important for quality.
Then you give them a task and review the results. In (backend/systems) programming it's pretty binary whether a solution works or not, it's not a matter of taste but something you can just validate with hard data.
I've done so many tiny/small/medium sized utilities for myself in the last year it's crazy[0]. A good bunch of them are 95-100% vibecoded, meaning I was just the "project manager" instructing what features I want and letting the agent(s) make it work.
I think I have a pretty good feel for the main agentic systems and what they can do in the context of what I do so I know what to tell them and how - each has its own distinct way of working and using the wrong one for the wrong job is either stupid, frustrating or just a waste of time.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonable at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for things it's particularly well suited to.
Here's another one. An SM-24 geophone + Raspberry Pi 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using Flask I need a web viewer + conversion of the geophone velocity figures to displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW's RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
putting everyone using the generated outputs into a sort of unofficial grey market: even when using first-party tools. Which is weird.
Nobody knows WHO has the copyright but it's been decided in courts that AI definitely doesn't own it.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
> had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types
Folks at Bun are "Zig people" for obvious reasons, and a link was made with Handmade software. This happened multiple times before with Bun specifically, so my response is not a "pivot" of any kind. I've highlighted and contrasted our differences to prevent further associations inside a viral HN thread. That's not unreasonable.
I also explicitly congratulated them for the acquisition.
I'm willing to wager that 99.99% of readers do not associate "Handmade" with the org you're associated with, and that most didn't know it existed until this comment. So yes, "really", without OP replying, it's understandable that the poster you're replying to inferred it had nothing to do with you.
I took your comment as referring to people who like to manually unroll their loops rather than members of some particular community, so GP's nitpicking on specifics of that interpretation looked out of place.
Thank you! I appreciated this clarifying write-up.
This sounds so cringe. We are talking about computer code here lol
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
For example: what’s in the middle for programming?
For me, 0 is writing 0s and 1s. For others, 0 is making the NAND gates.
And 100 is AI LLM vibe coding.
So 50/middle would be what exactly? It all depends.
Same for anything really. Some people I know keep saying "not 8, not 80" to mean the middle.
Like what’s in the middle for amount of coding per day? 12 h? 8h? 2h?
What’s middle for making money? 50k, 500k, 500m?
What’s the middle for taking cyanide ? 1g? 1kg?
What about water? What about food? What about anything?
As you can see, it's all relative, and whoever says it is trying to push their narrative as the "middle", aka correct, while whoever does more or less is "wrong".
You see how this "in the middle" concept makes no sense?
Something like that. What do you think?
For an average person, maybe it means 8 hours.
For a dev, maybe 16 hours.
For a farmer, 2 hours.
Who’s right?
> So 50/middle would be what exactly? It all depends.
Using LLMs to explore code bases, doing deep research, asking it to do code reviews or find bugs, but not pushing code that LLMs have authored might be one example in the middle.
Cyanide has an LD50 (50% chance of death) in the 1-2 mg/kg range when taken orally. So the middle for taking cyanide is probably 1.5 mg/kg: 90 mg for someone weighing 60 kg.
Sadly the middle ground in other topics is less easy to define!
Bottom is 0? Or is it 1mg? Or what?
Top is a dose that kills a human? A dose that kills an elephant? One that makes you feel dizzy?
What’s too much? What’s being in the middle?
You see, this "in the middle" logic is stupid. The point is that being in the middle, or moderate, is not always "good".
Also, it's hard to define. Tbh only non-logical people throw around words like "in the middle", etc.
One of my favorite things is describing a bug to an LLM and asking it to find possible causes. It's helped track something down many times, even if I ultimately coded the fix.
Bun genuinely made me doubt my understanding of what good software engineering is. Just take a look at their code, here are a few examples:
- this hand-rolled JS parser of 24k dense, memory-unsafe lines: https://github.com/oven-sh/bun/blob/c42539b0bf5c067e3d085646... (this is a version from quite a while ago to exclude LLM impact)
- hand-rolled re-implementation of S3 directory listing that includes "parsing" XML via hard-coded substrings https://github.com/oven-sh/bun/blob/main/src/s3/list_objects...
- MIME parsing https://github.com/oven-sh/bun/blob/main/src/http/MimeType.z...
It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.
And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.
I can also see why it's a very good fit for LLM-heavy workflows.
[1] https://codeberg.org/ziglang/zig/src/commit/be9649f4ea5a32fd...
Do you think Anthropic might request you implement private APIs?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company are the main reason to the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:
If the answer is performance, how does Bun achieve things quicker than Node?
I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?
Thanks
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
No high-level self-updater API is planned right now, but yes for at least the low-level parts needed to make a good one.
asking the real questions
Congratulations.
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to say because they have no power to enforce it, and they didn't actually say it so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kinda is.
Are you confusing practice with rehearsal?
Huh, this feels very odd to read; buying a company outright is definitely not the only way to push Bun to be excellent. Contributions from their developers, becoming a sponsor, donating through other means, buying "consulting services" or similar, or even forking it and keeping it up to date would all also be steps toward keeping Bun excellent.
This is vendoring a dependency on steroids, and the first moment the interests of the community are not aligned with what Anthropic needs, it will be interesting to see how this unfolds. History has taught us that this will end up with the claims in the blog post not holding much weight.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit where the founders seem to have started out with a passion project, got funding, built something out they were excited about and then exit than yet another AI company by non technical founders who were built with the sole purpose of getting funding and then exit.
I don't think they're doing that.
Estimates I've seen have their inference margin at ~60% - there's one from Morgan Stanley in this article, for example: https://www.businessinsider.com/amazon-anthropic-billions-cl...
Not estimate, assumption.
The best one is from the Information, but they're behind a paywall so not useful to link to. https://www.theinformation.com/articles/anthropic-projects-7...
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
Not really. If the technology stalls where it is, AI still gets a sizable chunk of the dollars previously paid to coders, transcribers, translators and the like.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
Devs can write at a very fast rate with AI.
You still need to check it or at least be aware it's a translation. The problem of extra puns remains.
I don't speak any language, I deny everything.
Incorrect - that was the fraudulent NAV.
An estimate for true cash inflow that was lost is about $20 billion (which is still an enormous number!)
anthropic's unit margins are fine, many lovable-like businesses are not.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is almost public domain (MIT) and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
An analogous example off the top of my head is Shopify hired Rafael Franca to work on Rails full-time.
You have no responsibility for an unrelated company's operations; if that was important to them they could have paid their talent more.
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.
> aren’t typically hired individually but as teams.
So? VC bros seem to forget the labor market is also a free market as soon as it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
These aren't the same things, and nobody negotiating an acquisition or acquihire converts in this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
As a broader online comment: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
Reminds me of when Tron, the crypto company, bought BitTorrent.
If Bun embraces the sweet spot around edge computing, modern JS/TS and AI services, I think their future ahead looks bright.
Bun seems more alive than Deno, FWIW.
Does that mean anything at all?
OpenAI is a public benefit corporation.
Elaborate? I believe Zig's donors don't get any influence and decision making power.
Will this make it more or less likely for people to use Bun vs Deno?
And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
People who like Bun for what it is are probably still going to, and same goes for Deno.
That being said I don't see how Anthropic is really adding long term stability to Bun.
I started out with Deno and when I discovered Bun, I pivoted. Personally I don't need the NodeJS/NPM compatibility. Wish there was a Bun-lite freed of the backward compatibility.
I use Hono, Zod, and Drizzle which AFAIK don't need Node compat.
IIRC I've only used Node compat once to delete a folder recursively with rm.
The number of people at big corps who care about their lawsuit, and would switch their IT guidelines from Node to Deno due to such heroic efforts?
Zero.
I'm not sure it will make much of a difference in the short term.
Those who were drawn to Bun by hype and/or concerns around speed will continue to use Bun.
For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.
I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor since most JS speed concerns come from downloading dependencies (which is a one-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).
----------------------------
The two largest problems I see in JS are:
1. Terrible security practices
2. A lack of a standard library which pushes people into dependency hell
Deno fixes both of those problems with a proper permission model and a standard library.
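For example (a minimal sketch): run a server with `deno run --allow-net=api.example.com server.ts` and every other resource stays denied. Programs can even introspect what they were granted:

    // Deno's permission API; with no --allow-net flag the state is
    // "prompt" (ask interactively) or "denied" rather than "granted".
    const net = await Deno.permissions.query({ name: "net" });
    console.log(net.state); // "granted" | "prompt" | "denied"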
----------------------------
> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I think any predictions between 1-10 years are going to be a little too chaotic. It all depends on how the AI bubble goes away.
But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.
Bun has a better standard library than Deno. You get a DB driver, S3 client, etc., which on Deno are all third-party deps. It was the main reason I got interested in Bun. The speed is nice though during dev. Everything feels instant.
Aren't you hosting Bun on a 3rd party host though?
Wouldn't you want to use the 3rd party's own dependency to connect to the 3rd party's services?
---
I wasn't sure how Bun handled its DB driver so I just checked the docs. Looks like they try to have a single API for multiple databases by writing queries in template strings.
I can see the appeal of a single API, but using template strings looks like a bad way to do it.
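For context, the API in question looks roughly like this (from memory of Bun's docs, so details may differ; the interpolation becomes a bound query parameter rather than string concatenation):

    import { sql } from "bun";

    const id = 42;
    // Tagged template: ${id} is sent as a parameter, not spliced into the SQL.
    const rows = await sql`SELECT * FROM pages WHERE id = ${id}`;
    console.log(rows);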
What happens when someone adds a keyword or function into a query that exists in one database engine but not another? Or even in one version of a database but not another? Or even in the same database engine and same version but with different settings?
It's a terrible debugging experience trying to resolve those kinds of issues when you're depending on a database that someone else has set up.
You need a different driver for each database engine, or you can have a unified api if you use an ORM since it can translate or shim your query into the SQL supported by your database engine.
It fades away as a direct to developer tool.
This is a good thing for Deno.
Not yet; similar concerns were addressed by Dahl 6mo ago: https://deno.com/blog/greatly-exaggerated / https://archive.vn/L6His
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Faster install and less disk space due to hardlink? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
The reason you see so many GitHub issues about it is that that's where the development happens. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs The World is getting old. Rust isn't any "safer" when Deno can panic too.
We have yet to witness a segfault. Admittedly it's a bunch of microservices and not many requests/s (around 5k avg).
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
Curious about safety here: Are kernel / cross-thread resources (ex: a mutex/futex/fd) released on unwind (assuming the stack being unwound acquired those)?
All modern OSes behave this way. When your process starts and is assigned an address space, you get an area. It can balloon but it starts somewhere. When the process ends, that area is reclaimed.
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also it has a nice standard library. Not a huge win because that stuff is also doable in Deno, but again, it's just a bit less painful.
It feels like a very unfocused project; every few months they introduce a new feature that's meant to replace an entire class of tools, while the original promise of a "drop-in Node replacement" was never fulfilled (or at least, that was the vibe for the past years when I was still paying attention).
We'll see how this works out but speed + lack of scope never works out well in the long term.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
I haven't had that experience with deno (or node)
Between that and the discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very catering to me, a frontend-first dev.
1. https://github.com/denoland/deno/issues?q=is%3Aissue%20state... 2. https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...
Node.js is a no-brainer for anyone shipping a TS/JS backend. I'd rather deal with poor DX and slightly worse performance than risk fighting runtime related issues on deployment.
Linux needs to be a first-class citizen for any runtime/language toolchain.
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them.
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed till the end of 2024 and, as you can see, it was only accidentally merged in but tolerated. It was promptly broken by a Bun breaking change
https://github.com/aws/aws-cdk/issues/33464
but don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production
you can use SST to deploy cloud resources on AWS and any cloud, and that package works with bun
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
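For anyone wondering what that hinges on: proper tail calls are in the ES2015 spec (strict mode only), but to my knowledge JavaScriptCore, and therefore Bun, is the only major engine that ships them. A tiny illustration:

    "use strict"; // proper tail calls are specified for strict mode only

    // The recursive call is in tail position, so an engine with PTC (JSC)
    // reuses the stack frame; V8/Node overflows for large n instead.
    function sum(n: number, acc: number = 0): number {
      if (n === 0) return acc;
      return sum(n - 1, acc + n);
    }

    console.log(sum(1_000_000)); // works on Bun/JSC; RangeError on Node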
johnfn@mac ~ % time deno eval 'console.log("hello world")'
hello world
deno eval 'console.log("hello world")' 0.04s user 0.02s system 87% cpu 0.074 total
johnfn@mac ~ % time bun -e 'console.log("hello world")'
hello world
bun -e 'console.log("hello world")' 0.01s user 0.00s system 84% cpu 0.013 total
That's roughly 5.7x as fast, about 470% faster (not 560%). Yes, it's a microbenchmark. But you said "absolutely zero chance", not "a very small chance".

Lol, yeah, this person is running a performance test on postgres, and attributing the times to JS frameworks.
The tools that the language offers to handle use-after-free are hardly any different from using Purify or Insure++ back in 2000.
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
Anthropic chose to use Bun to build their tooling.
Profit in those products has to justify now having their own compiler team for a JavaScript runtime.
Something about moral and philosophical flexibility.
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite having no factual basis.
I don't mean to disparage you in particular, this is like the 1000th time I've seen this.
More broadly, I think the observation tends to get repeated because C and Zig share a certain elegance and simplicity (even if C's elegance has dated). C++ is many things, but it's hardly elegant or simple.
I don't think anyone denies that Zig can be a C++ replacement, but that's hardly unusual, so can many other languages (Rust, Swift, etc). What's noteworthy here is that Zig is almost unique in having the potential to be a genuine C replacement. To its (and your) great credit, I might add.
>> At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
@GP: This is not a great take. All four languages are oriented around manual memory management. C++ inherits all of the footguns of C, whereas Zig and Rust try to sand off the rough edges.
Manual memory management is and will always remain necessary. The only reason people writing JS scripts don't need to worry about managing their memory is that someone has already done that work for them.
But completely agree. It's a perfect replacement for C++, and I would say the natural spiritual successor to C.
I gave up using Rust for new projects after seeing the limitations for the kind of software I like to write and have been using Zig instead as it gives me the freedom I need without all the over-complication that languages like C++ and Rust bring to the table.
I think people should first experiment and see for themselves, and only then comment, as I see a lot of misinformation and beliefs based more on marketing than reality.
Thank you very much for your wonderful work!
A lot of stuff related to older languages is lost in the sands of time, but the same thing isn’t true for current ones.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
Here are the Bun APIs:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
For deployment, running the attached Terraform script usually takes more time.
So while a speed increase is welcome, I don't feel it gives me that much of a boost.
Telling prospective employees that if you're not ready to work 60-hour weeks, then "what the fuck are you doing here?", for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors get their original money returned. I understand VC has the expectation that 99 out of 100 of investments will net them no money. But what happens in the cases where money is made, it just isn’t profitable for the VC firm.
What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?
Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?
Additionally, depending on the round, they also have multiples: 2x, for example, means they get at least 2x their investment back before anyone else gets anything.
Investors must be happy because Bun never had to find out how to become profitable.
except this sense:
> Investors must be happy because Bun never had to find out how to become profitable.
Maybe an easier first step would be to open source Claude Code...?
Codex has the opposite issue. It has an open client, which is relatively pointless, because it will accept one system prompt and one prompt only.
In the article they write about the early days:
> We raised a $7 million seed round
Why do investors invest in people who build something that they give away for free? Why invest in a company that has the additional burden of developing Bun, and not in a company that does only the hosting?
There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...
Whether your userbase or the current CEO likes it or not.
(It was reverted after the situation gained visibility, as is tradition.)
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
"Making the Decisions" and "Implementing the Decisions" are complementary; one of these is being commoditised.
And, in fact, decimated.
Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.
It's amazing!
My boss has dubbed it "programming at the speed of thought" which I'm sure he's picked up from somewhere. I've seen other people say that.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI; a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, penetration is uneven: AI helps more with tasks that are well represented in the training set (webapps, data science, Linux admin...) than with, e.g., issues arising from quirky internal architecture, Rust, etc.
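An example of the kind of throwaway benchmarking script I mean (everything here is illustrative):

  // rough timing of two ways to build a string; printed, eyeballed, deleted
  function bench(label: string, fn: () => void, iters = 100_000): void {
    const start = performance.now();
    for (let i = 0; i < iters; i++) fn();
    console.log(label, (performance.now() - start).toFixed(1), "ms");
  }

  bench("concat", () => { let s = ""; for (let i = 0; i < 100; i++) s += i; });
  bench("join", () => { const a: number[] = []; for (let i = 0; i < 100; i++) a.push(i); a.join(""); });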
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing, mind you, and there's a lot of ceremony in tweaking workflows, much like there was for editors. It's not much different from instructing juniors.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn
I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784
You can see my site here, if you'd like: https://chipscompo.com/
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and a workflow (working with the agent from a plan rather than simply prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
My best writing on this topic is still this though (which doesn't include a video): https://simonwillison.net/2025/Mar/11/using-llms-for-code/
I was honestly baffled how fast Claude knocked this out.
tbf, Google has long been an AI corp. The Big Labs are trying to get into other products/businesses just like Google did.
Another option is that this was an equity deal where Bun shareholders believe there is still a large multiple worth of potential upside in the current Anthropic valuation.
Plus many other scenarios.
On the opposite end of the spectrum, it's just that Claude and Bun are great technologies that joined forces.
Programming languages are all a balance between performance (etc.) and making it easy for a human to interact with. This balance is going to shift as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being held hostage by VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
"the full platform"
there are more languages than ts though?
Acquisition of Apple Swift division incoming?
https://github.blog/news-insights/octoverse/octoverse-a-new-...
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating / pulling JS projects, but it seems more likely that Claude Code and Bun will have a separate project where they integrate both: Claude Code taking full advantage of Bun itself, and Bun focusing on tight coupling to ensure Claude Code runs optimally.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
Claude Code is a 1B+ cash machine and Anthropic directly uses Bun for it.
Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.
Makes sense, but this is just another day in San Francisco: a $0-revenue startup being bought out.
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets or imports, or do anything other than run this command line. I don't need to survey the various bundlers and find one that works fine with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
Node is death by a thousand cuts compared to the various experiences offered by Bun.
Node adds quite the startup latency over Bun too and is just not too pleasant for making CLI scripts.
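For context, a sketch of what such an entry point can look like (the paths and keys are made up, not the parent's actual project; the embedding behaviour is as the parent describes):

  // bin/start.ts
  import config from "../assets/default-config.json"; // static JSON import, embedded at compile time

  async function loadLocale(locale: string) {
    // dynamic import with a computed path; per the parent comment, the
    // bundler detects this and embeds the matching JSON files too
    const messages = await import(`../assets/lang/${locale}.json`);
    return messages.default;
  }

  const messages = await loadLocale(process.env.LANG?.slice(0, 2) ?? "en");
  console.log(`${config.appName}: ${messages.greeting}`);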
Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.
Yeah.. you do not know what you are talking about.
I run my test suites on both node and bun because I prefer to maintain compatibility with node and it works just fine. I use some of bun's APIs, like stringWidth and have a polyfill module that detects if it's not bun and loads an alternative library through a dynamic import.
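Roughly like this (the module layout and the fallback package are illustrative; Bun.stringWidth itself is a real Bun API):

  // string-width-shim.ts
  let stringWidth: (s: string) => number;

  const bun = (globalThis as any).Bun;
  if (bun?.stringWidth) {
    // running on Bun: use the fast native implementation
    stringWidth = (s) => bun.stringWidth(s);
  } else {
    // running on Node: lazily load an equivalent npm package instead
    const mod = await import("string-width");
    stringWidth = mod.default;
  }

  export default stringWidth;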
Not NPM compatible? Bun is literally a drop in for NPM, in fact you could use it solely as a replacement for NPM and only use node for actual execution if that's your thing. It's much faster to run bun install than npm install and it uses less disk space if you have many projects.
The developer experience on bun is so much better it isn't funny. It remains prudent, I would agree, to not depend too heavily on it, but Bun is many things: a runtime, a set of additional APIs, a replacement for the dogslow npm, a bundler etc. Pick the parts that are easily discarded if you fear Bun's gonna go away, and when using their APIs for their better performance write polyfills.
>Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.
Or maybe you should take a cold, hard look in the mirror first and think twice before opening your mouth, because the more you write, the more your ignorance shows.
Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?
I'm not sure if Joyent have any significant role in Node.js maintenance any more.
regardless, it's certainly not MS.
Anything is greater than 0
I did end up fixing Node.js compatibility later but it was extra work. Felt like they just created busy-work. Node.js maintainers should stop deprecating perfectly good features and complicating their modules.
- Anthropic can't write its TUI app in anything else more suitable than Javascript
- They feel the need to buy a portion of their software supply chain
That perspective following “in two-three years” makes me shudder, honestly.
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e. single-file-based) and compiling to a binary with no dependencies whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All great for AI to recover during its iterations of generating something useful.
Genuinely perplexed.
Like I’ve said: NativeAOT
https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...
On Linux only with CGO_ENABLED=0, and good luck using any non-web-related third-party module with CGO disabled.
I dislike it also...
Might also be a context window thing. Idk how much boilerplate C# has, but others like Java spam it.
You can also run C# code very quickly, and you have the option (but not the need) to AOT-compile it. I would say the only real edge JS has is the ability to run natively in the browser. It was built for that purpose, and in my opinion, that is where it should have stayed.
This is promising for Astral et al, whom I really like but have worried about in terms of sustainability. It does point to the value of being as close to the user as possible.
Running code is absolutely going to happen for coding assistants.
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The Keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
I don't know why it's even necessary for this.
https://github.com/atxtechbro/test-ink-flickering
Issue on Claude Code GitHub:
The web is the half-baked, bug-filled reimplementation of the native application stack. We figured these problems out a long time ago, and the constant wheel re-invention from folks who only know and refuse to use anything but the web is getting old. I'm tired of everything you produce being slow, buggy, and consuming gigabytes of RAM.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
Moreover, now they can make investments to turn it into an even more efficient and secure runtime for model workspaces.
Except Node's author already wrote its replacement: Deno.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
Because now the Bun team don't have to redirect their resources to implementing a sustainable business model.
No they don't.
https://www.wheresyoured.at/why-everybody-is-losing-money-on...
> The Information earlier reported on some of the financial figures for both companies.
> The documents show that OpenAI expects to burn $9 billion after generating $13 billion in sales this year, while Anthropic expects to burn almost $3 billion on $4.2 billion in sales—roughly 70% of revenue for both.
https://archive.is/e7pg9 / https://www.wsj.com/tech/ai/openai-anthropic-profitability-e... (paywall)
https://fortune.com/2025/11/12/openai-cash-burn-rate-annual-...
Do you really think Anthropic's annual expenses are in the single-digit billions? Or that OpenAI's annual expenses are less than $9 billion?
> people latch on to them because it confirms their own ideals
I think this applies universally, even to yourself, no? You're so dead set on believing Anthropic is not losing billions that you're debating semantics and borderline insulting my reading skills.
IOW look where the puck is going.
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
This just isn't the hard part of the product.
Like if I was building a Claude Code competitor and I acquired bun, I wouldn't feel like I had an advantage because I could get more support with like fs.read?
Agentic systems are no longer just model-driven — they are increasingly runtime-driven through tools, sandboxes, and orchestration layers. When agents are writing code, executing tests, calling external tools, and validating outcomes, the reliability of the entire system becomes a function of the runtime and environment, not just the model.
Controlling the runtime gives Anthropic vertical control across:
- Packaging and distribution
- Execution semantics
- Performance characteristics
- Security surface area
- Operational stability
Of course, owning a JavaScript runtime does not magically make runtime deterministic. The dominant sources of variability still sit lower in the stack: OS behaviour, native dependencies, filesystem semantics, scheduling, hardware differences, and network effects. Runtime ownership is necessary, but not sufficient, for true runtime determinism.
What this does signal is a strategic shift in where value is being captured.
Model capability is no longer the only critical layer.
The execution plane is becoming just as strategic; you can see this in GPT-5's dynamic model routing as well as ecosystem-wide tool calling and MCP adoption, all aimed at increasing LLM determinism through traditional software stacks.
Vertical integration here looks far less like a tooling move and far more like a control-plane strategy: consolidate the stack, reduce dependency entropy, and shrink the surface area where behaviour can diverge.
Intelligence may live in the model, but reliability, scalability, and trust are increasingly properties of the system that executes it.
I get how it might not be as useful in a production deployment where the system/container will be setup just for that Python service, but for less structured use-cases, `uv` is a silver bullet.
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem come from a relatively young VC-backed company.
So, yeah, uv is nice, but for me didn't fundamentally change that much.
uv seems like a great tool, but I remember thinking the same about pipenv, too.
#2, if you don't like uv, you can switch to something else.
uv probably has the least moat around it of anything. Truly a meritocracy: people use it because it's good, not because they're stuck with it.
Python is doing great, other than still doing baby steps into having a JIT in CPython.
At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.
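A toy example of that kind of task (the URL is a placeholder, and the CSV parsing is deliberately naive):

  // fetch a CSV over HTTP and turn it into an array of row objects
  async function fetchCsv(url: string): Promise<Record<string, string>[]> {
    const text = await (await fetch(url)).text();
    const [header, ...rows] = text.trim().split("\n").map((line) => line.split(","));
    return rows.map((cells) =>
      Object.fromEntries(header.map((name, i) => [name, cells[i] ?? ""]))
    );
  }

  const table = await fetchCsv("https://example.com/data.csv");
  console.table(table.slice(0, 5)); // quick sanity check of the first rows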
Being able to create an agent in any language to run on any hardware has always been possible hasn't it?
[1] https://ziglang.org/code-of-conduct/#strict-no-llm-no-ai-pol...
That's like saying GCC and NodeJS are culturally apart, as if that has significant bearing on either?
> going to replace all software developers with AI
No?
> building a product that is supposed to make software cost 0 right
No?
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)
Sure. But everything is relative. For instance, Node has much more likelihood of long term stability than Bun, given its ownership.
Given how many more dependencies you need to build/maintain a Node app, your Bun application has a better chance of long term stability.
With Node almost everything is third party (db driver, S3, router, etc) and the vast majority of NPM deps have dozens if not hundreds of deps.
How was Go involved there before Zig?
The first hints of what became Bun came when Jarred experimented with porting that to Zig.
I'm doubtful that alone motivated an acquisition; it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.
If they don't want to maintain it: GitHub fork with more motivated people.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.
Nothing against Anthropic but with the circular financing, all the debt, OpenAI's spending and over-valuations "AI" is the riskier bet than Bun and hosting.
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
Is there any historical precedent of someone doing that?
Monty forking MySQL after the Oracle acquisition is the only real example that comes to mind. Oracle being Oracle resulted in OpenOffice getting forked, too, but that was mostly driven by Canonical.
The effective demand for Opus 4.5 is bottomless; the models will only get better.
People will always want a code model as good as we have now, let alone better.
Bun securing default status in the best coding model is a win-win-win
It does matter. The public ultimately determines how much they get in funding if at all.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. Doesn't mean Dotcom didn't crash.
There are lots of scenarios in which this can play out badly, e.g. Anthropic fails to raise a certain round because the money dried up, or OpenAI buys Anthropic but decides they don't need Bun and closes out the project.
npm install -g @anthropic-ai/claude-code
I thought claude code just used Nodejs? I didn't realise the recommended install used a different runtime.

curl -fsSL https://claude.ai/install.sh | bash

That install script gives you a single binary which is created using Bun.

Feels like maybe AI companies are starting to feel the questions on their capital spending? They wanna show that this is a responsible acquisition.
Put the Bun folks directly on that please and nothing else.
Acquisition seems like a large overhead and maybe a slight pivot to me.
Congrats to Jarred and the team!
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
Don't read a snarky comment so literally ;)
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
The key architectural difference is that Node.js implements the HTTP stack and other low-level libraries in JavaScript, which gives it the memory-safety guarantees provided by the V8 runtime, while Bun/uWebSockets implement them in Zig/C++. For Node.js, which is focused on enterprise adoption, the lower-performance JS approach better aligns with that audience's security profile.
Early Node, for example, had a multi-process setup built in; Node was initially about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is that TypeScript helps so damn much once a project gets a tad larger, but using it with Node hot-reloading was kinda slow: multiple seconds from saving a file until your application reloads. Even mainline Node nowadays has direct .ts file loading with type stripping to quicken the workflow.
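If I remember right, on recent Node that looks something like this (the flag was needed around Node 22; newer versions enable type stripping by default):

  // app.ts, runnable directly with: node --experimental-strip-types app.ts
  const greet = (name: string): string => `hello, ${name}`;
  console.log(greet("world"));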
The reality is that the insane "JS ecosystem" will rally around whatever is the latest hotness.
v8 is one of the most advanced JIT runtimes in the world. A lot of people have spent a lot of time focusing on its performance.
Prior to that GitHub Copilot was either the VS Code IDE integration or the various AI features that popped up around the GitHub.com site itself.
That said, I can easily imagine that the developer experience is superior for Zig. More fun.
At this point however I don't really like the way zig is going. So I won't really advocate for zig at this point, but I can say when I first started using it, it was fun and I could just write code that worked.
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
That's 100% what happened to Bun. It's useful (like really useful) and now they're getting rewarded
1. acquire talent.
2. control the future roadmap of bun.
i think it's really 1.
...but hey, things are different during a bubble.
It would be ironic if deno ended up having more builtin AI bloat despite Bun being the one owned by an AI company. But I wouldn't be surprised.
And apparently the submission's source, the only org I can tell that anticipated this: https://www.theinformation.com/articles/anthropic-advanced-t...
https://github.com/oven-sh/bun/pull/24578
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code the bot had generated, and to me (who, to be fair, is not familiar with the Bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
> that the Bun Claude bot created a PR for about 3 weeks ago
The PR with bad code that's also been ignored was made by the bot that Bun made, and brags about in their acquisition post.
...Did you miss the part where Bun used Claude to generate that PR?:)
1. User krig reports an issue against the Bun repo: https://github.com/oven-sh/bun/issues/24548
2. Bun's own automated "bunbot" filed a PR with a potential fix: https://github.com/oven-sh/bun/pull/24578
3. taylordotfish (not an employee of Bun as far as I can tell, but quite an active contributor to their repo) left a code review pointing out many flaws: https://github.com/oven-sh/bun/pull/24578#pullrequestreview-...
This will make sure Bun is around for many, many, years to come. Thanks Anthropic.
Why Bun?
Easy to set up and go. bun run <something.ts>
Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)
Typescript Supported. (No need to tsc, bun can transpile for you)
Binary builds. (single executables for easy deployment)
Full Node.js Support. (The whole API)
Full NPM Support. (All the packages)
Native modules. (90% and getting better thanks to Zig's interop)
S3 File / SQL Builtin. (Blazingly Fast!)
You should try it. Yes, others do these things too, but we're talking about Bun.
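A minimal taste of the built-ins (save as demo.ts and run with `bun run demo.ts`):

  import { Database } from "bun:sqlite"; // SQLite ships with the runtime, no npm install

  const db = new Database(":memory:");
  db.run("CREATE TABLE greetings (text TEXT)");
  db.query("INSERT INTO greetings (text) VALUES (?)").run("hello from bun:sqlite");

  const row = db.query("SELECT text FROM greetings").get() as { text: string };
  console.log(row.text);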
And even in packages with full support you can find many GitHub issues where Bun behaves differently, which leads to some bugs.
Well, until the bubble bursts and Anthropic fizzles out or gets acquired themselves.
* Get run by devs with filesystem permissions
* Get bundled into production
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
[^1]: https://github.com/sst/opencode [^2]: https://github.com/sst/opentui
To some degree, having "opinionated views on tech stacks" is unavoidable in LLMs, but this seems like it moves us toward a horrible future.
Imagine if Claude (or Gemini) let you, as a business, pay to "prefer" certain tech in generated code?
It's Google Ads all over again.
The thing is, if they own bun, and they want people to use bun, how can they justify not preferencing it on the server side?
…and once one team does it… game on!
It just seems like a sucky future, that is now going to be unavoidable.
Or perhaps they have in their agreement that Anthropic's own rules don't apply to them?
Serious question, in all this madness
and when this bubble pops, down goes Bun
What? Oh, lots of slop code?
[1] www.bunn.com
Why?
Regards.
> We’re hiring engineers.
Careers page:
> Sorry, no job openings at the moment.
/s
Thank you for showing exactly why acquisitions like this will continue to happen.
If you don't support tools like Bun, don't be surprised to see them raise money from VCs and get bought out by large companies.
It’s wild what happens when a generation of programmers doesn’t know anything except webdev. How far from grace we have fallen.
That's quite a bit harder if your tool is built using a compiled language like Go.