> There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
People are free to do so, and probably do it because Homebrew is slow. Alternatives are often not a bad thing.
I have zero issues with people vibe coding alternative Homebrew frontends; it's good for the ecosystem for there to be more experimentation.
What I take objection to is when one or more of these happen:
- incorrect compatibility claims are made (e.g. if you're not running Ruby, no post-install blocks in formulae are gonna work)
- synthetic benchmarks are used to demonstrate speed (e.g. running `brew reinstall openssl` in a loop is not a terribly representative case; a cold `brew upgrade` of >10 packages would be). To be clear, I'm sure most of these projects are faster than Homebrew in fair benchmarks too!
- incorrect claims about why Homebrew is slow are made (e.g. "we do concurrent downloads and Homebrew doesn't": true a year ago, not true since 5.0.0 in November 2025)
- it's pitched as a "replacement for Homebrew" rather than "an alternative frontend for Homebrew" when it's entirely reliant on our infrastructure, maintainers, update process, API, etc.
Even on the above: of course people are free to do whatever they want! It's just that at least some of the above hinders rather than helps the ecosystem, and makes it harder rather than easier for us as a wider open source ecosystem to solve the problem "Homebrew is slow" (which, to be clear, it is in many cases).
And to be fair, when I was on a 4.x version, 90% of the time I was on the happy path; my "being slow" issue was when download speeds got really bad, sometimes caused by my ISP, so on my end.
As others mentioned, Homebrew is a great piece of software. Thank you, and not only you, but everyone who maintains it.
Python has powered Linux package management to reasonable results for a long time. Ironically, Python itself has tricky platform constraints that ended up being best solved by uv's excellent Rust solver. For Homebrew I would personally not stress over a Rust frontend, but if it keeps some of the FUD out then maybe it's worth it!
Alternatives are always good but IMO brew is just not something I interact with all that much and to me it's "good enough". It works and does what I expect, although to be fair maybe I'm on the happy path <shrug>.
Exactly. I’ve been using MacPorts for ages and I love it.
/me ducks.
I didn't know about the pending, official Rust frontend! That's very interesting.
It's like yum vs apt in the Linux world. APT (C++) is fast and yum (Python) was slow. Both work fine, but yum would just add a few seconds, or a minute, of little frustrations multiple times a day. It adds up. They finally fixed it with dnf (C++) and now yum is deprecated.
Glad to hear a Rust rewrite is coming to Homebrew soon.
It was mostly precipitated by containers: when they came in, I was honestly shocked at how fast apk installs packages on Alpine compared to my Ubuntu boxes (using apt).
For example pacman does not need to validate the system for partial upgrades because those are unsupported on Arch and if the system is borked then it’s yours to fix.
* it’s purpose built for mega-sized monorepo models like Google (the same company that created it)
* it’s not at all beginner friendly; it’s a complex mishmash of three separate constructs in their own right (build files, workspace setup, Starlark), which makes it slow to ramp new engineers on.
* even simple projects require a ton of setup
* requires dedicated remote cache to be performant, which is also not trivial to configure
* requires deep bazel knowledge to troubleshoot through its verbose unclear error logs.
Because of all that, it’s extremely painful to use for anything small/medium in scale.
Anyway the python program would call into libsolv which is implemented in C.
dnf5 is much faster, but the authors of the program credit the algorithmic changes rather than the fact that it is written in C++.
dnf < 5 still performed similarly to yum (and it was also implemented in Python).
I'm perhaps not properly understanding your comment. If the algorithmic changes were responsible for the improved speed, why did the Python version of dnf perform similarly to yum?
Because how often are you running it where it's anything but an opportunity to take a little breather in your day? And I do mean little; the speedups being touted here are seconds.
I have the same response to the obsession with boot times, how often are you booting your machine where it is actually impacting anything? How often are you installing packages?
Do you have the same revulsion for time spent going to the bathroom? Or getting a glass of water? Or basically everything in life that isn't instantaneous?
I can’t say that’s the only reason it’s slow of course. I’m on the “I don’t use it often enough for it to be a problem at all” side of the fence.
I think how to marry the Ruby formulas and a Rust frontend is something the Homebrew devs can figure out and I'm interested to see where it goes, but I don't really care whether Ruby "goes away" from Homebrew in the end or not. It's a lovely language, so if they can keep it for their DSL but improve client performance I think that's great.
When you say "Rust frontend", is the vision that Homebrew's frontend would eventually transition to being a pure Rust project — no end-user install of portable-ruby and so forth?
If so (ignore everything below if not):
I can see how that would work for most "boring" formulae: formula JSON gets pre-baked at formula publish time; Rust frontend pulls it; discovers formula is installable via bottle; pulls bottle; never needs to execute any Ruby.
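That happy path can be sketched roughly like this. A minimal sketch in Python, assuming a JSON shape that loosely mirrors Homebrew's formula API; the field names and values here are illustrative, not an exact schema:

```python
# Hypothetical sketch of the "boring formula" happy path: pick a bottle for
# the current platform from pre-baked formula JSON, with no Ruby involved.

def select_bottle(formula_json: dict, platform_tag: str):
    """Return the bottle entry for platform_tag, or None if no bottle fits."""
    files = (
        formula_json.get("bottle", {})
        .get("stable", {})
        .get("files", {})
    )
    # "all" bottles are platform-independent (e.g. pure scripts)
    return files.get(platform_tag) or files.get("all")

# Made-up example formula JSON for illustration
formula = {
    "name": "jq",
    "bottle": {"stable": {"files": {
        "arm64_sequoia": {"url": "https://example.invalid/jq.arm64.tar.gz",
                          "sha256": "aa" * 32},
        "x86_64_linux":  {"url": "https://example.invalid/jq.linux.tar.gz",
                          "sha256": "bb" * 32},
    }}},
}

bottle = select_bottle(formula, "arm64_sequoia")
assert bottle is not None and "arm64" in bottle["url"]
# No bottle for this platform: the frontend would have to fall back to Ruby
assert select_bottle(formula, "ppc_tiger") is None
```

When `select_bottle` returns None, you're in exactly the edge cases asked about below: build-from-source, `post_install`, and so on.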
But what happens in the edge-cases there — formulae with no bottles, Ruby `post_install` blocks, and so forth? (And also, how is local formula development done?)
Is the ultimate aim of this effort to build and embed a tiny "Formula Ruby DSL" interpreter into the Rust frontend, one that supports just enough of Ruby's syntax and semantics to execute the code that appears in practice in the bodies of real formulae's methods/blocks? (I personally think that would be pretty tractable, but I imagine you might disagree.)
(Just kidding, thank you for creating homebrew and your continued work on it!)
Why? I think I am seriously starting to contract a case of FOMO. I feel like Rust is rapidly gaining territory every day. I mean, that's fine and all, I suppose. I have never used it, so I have no real opinions on the language.
However, how is this effort different from uv vs PyPI? Why is this a bad thing?
This is literally what "compatible" means; how else did you expect them to frame it?
Where can I read more on this effort?
The real compatibility test isn't "runs all Homebrew formulae" — it's "runs the 15-20 formulae each developer actually uses." A tool that handles those correctly and fails clearly on edge cases is more useful in practice than a technically complete implementation that's slower.
What's missing from this thread is any data on that surface area, not more benchmark numbers.
You cannot really be compatible with this unless you run the Ruby, as the install scripts can perform arbitrary computation.
In reality, most recipes contain a simple declarative config, but nothing stops you from writing arbitrary Ruby in there.
Hence, to achieve total compatibility, one would need to run Ruby.
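To make the distinction concrete, here is a toy illustration (not Homebrew's actual approach, and the markers are deliberately naive): a frontend could extract the declarative fields from a formula with simple pattern matching and refuse anything that looks like it needs a real Ruby interpreter.

```python
import re

# Toy parser for the declarative subset of a formula. Anything beyond
# simple `field "value"` lines is treated as "needs real Ruby".
DECLARATIVE = re.compile(r'^\s*(url|sha256|homepage|license)\s+"([^"]+)"\s*$')
NEEDS_RUBY = ("def install", "post_install", "system ")

def parse_formula(source: str) -> dict:
    fields = {}
    for line in source.splitlines():
        m = DECLARATIVE.match(line)
        if m:
            fields[m.group(1)] = m.group(2)
        elif any(marker in line for marker in NEEDS_RUBY):
            raise ValueError("formula needs real Ruby: " + line.strip())
    return fields

# A purely declarative (made-up) formula parses fine...
simple = '''
class Hello < Formula
  homepage "https://example.invalid/hello"
  url "https://example.invalid/hello-1.0.tar.gz"
  sha256 "cafebabe"
end
'''
assert parse_formula(simple)["sha256"] == "cafebabe"

# ...but a post_install block forces a bailout.
tricky = 'post_install do\n  system "true"\nend'
try:
    parse_formula(tricky)
    raised = False
except ValueError:
    raised = True
assert raised
```

The sketch shows the trade-off: the declarative subset is easy, but "mostly compatible" is the best you can claim without actually executing Ruby.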
That said, it's also been a while since I've really had any huge complaints about brew's speed. I use Linux on my personal machines, and the difference in experience between my preferred Linux distro's package manager and brew used to be laughable. To their credit, nowadays brew largely feels "good enough", so I honestly wouldn't even argue for porting from Ruby based on performance needs at this point.

I suspect part of the motivation might be around concerns about relying on the runtime to be available. Brew's use of Ruby comes from a time when it was more typical for people to rely on the versions of Python and Ruby that were shipped with macOS, but nowadays a lot of people are probably more likely to use tooling from brew itself to manage those, and making everything native avoids the need to bootstrap from an existing runtime.
I would agree with you that Ruby itself is probably not the bottleneck (except maybe for depsolving, since that's CPU-bound).
> nanobrew
> The fastest macOS package manager. Written in Zig.
> 3.5ms warm install time
> 7,000x faster than Homebrew · faster than echo
It presents itself as an alternative to Homebrew.
You won't have a situation where one person uses yarn and another uses pnpm on the same project, though.
I definitely have thought something along those lines (mostly when I go to install a small tool, and get hit with 20 minutes of auto-updates first).
Pretty sure I also will not be adopting this particular solution, however
I agree it’s annoying, but I haven’t turned it off, because it’s only annoying when I haven’t been keeping my computer (brew packages) up to date normally (i.e., it’s my own fault).
On the average day, I get maybe two or three package upgrades from it. Sometimes, they're packages that I'm extremely grateful to have updates for immediately (like rust-analyzer), and other times they're things that I don't use very often (or don't use directly), so I wouldn't likely remember to upgrade them at all if I didn't make a habit of it.
> Most of us want to wait a little while for the bugs to be worked out of fresh releases. And hey, if everything is working today... why would I want to risk potential breaking changes?
I felt like I was pretty clear in my original comment that I didn't know whether other people upgraded as often as me or not. That being said, it does sound like you've been having an experience you're unhappy with, and I'm not, so I'm not sure why you're so confident that the way I'm using it is weird. It's very possible that it would not end up being something you or others are happy with, but it's more weird to me that you think this is such a huge deal when it seems like the most obvious way in the world to use a package manager to me.
This is not something that's solved by updating less frequently, though. It would be solved by a 'minimum age' setting, but the brew maintainers aren't planning on implementing that, with arguably valid reasoning: https://github.com/Homebrew/brew/issues/21421
Minimum age solves a related problem: it gives maintainers some margin of time in which to discover vulnerabilities and yank the affected versions.
However, minimum age also delays you getting bug fixes (since those also need to age out).
In an ideal world one would probably be able to configure a minimum-age-or-subsequent-patch-count rule. i.e. don't adopt new major/minor package versions until either 1 month has elapsed, or a minimum of 2 patch versions have been released for that version.
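A hypothetical sketch of that rule in Python; nothing like this exists in brew today, and the thresholds are just the example numbers from above:

```python
from datetime import date, timedelta

def should_adopt(released: date, today: date, later_patches: int,
                 min_age: timedelta = timedelta(days=30),
                 min_patches: int = 2) -> bool:
    """Adopt a new major/minor version once it has aged out, OR once enough
    follow-up patch releases suggest the early bugs have been shaken out."""
    return (today - released) >= min_age or later_patches >= min_patches

# Aged out: 35 days old, no patches needed
assert should_adopt(date(2025, 1, 1), date(2025, 2, 5), later_patches=0)
# Young but patched twice already
assert should_adopt(date(2025, 1, 1), date(2025, 1, 5), later_patches=2)
# Young and only one patch: hold off
assert not should_adopt(date(2025, 1, 1), date(2025, 1, 5), later_patches=1)
```

The "or" is what distinguishes this from a plain minimum age: a busy upstream that ships quick fixes lets you adopt sooner, so bug fixes aren't uniformly delayed.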
It constantly blows my mind how insanely long it takes just to do a few simple things on the fastest hardware I've ever owned in my life.
Never had any issues.
Edit: no, it won't...
However, this is a vibe-coded app, with around 30 commits per day, which I don't let install packages on my machine.
https://github.com/asdf-vm/asdf/issues/290#issuecomment-2365...
Yea, I know. It's open source. They can do what they want. Still sucks.
I don’t think it’s reasonable to expect an open source project to support everything
I get it - it’s a different beast with very different ideas behind it, but MacPorts is BSD-solid, and that’s a lot.
MacPorts has some level of support for PowerPC, but anything that isn't in the most recent ~3-4 releases is likely to be cut off from any number of packages at useful versions. (There's substantial work done to support Rust on much older versions of macOS, but there are also versions above which Rust has cut off older macOS versions.)
I believe that there's a recommended stream for when you need older versions support, but it's definitely a secondary target from what I've been reading on the MLs.
That makes no sense, then. A power user may still want to run an older OS version for a reason. Take the training wheels off and then it'll be a power user tool.
No doubt there are edge cases like that, but I don't fault a project for not catering to the < 1% of users who would fall into that bucket, and who would probably be the ones causing trickier support cases. These users could maybe also just install the software without Homebrew; it's not like Homebrew is the only way to install software.
There's also https://github.com/dortania/OpenCore-Legacy-Patcher for the adventurous.
Also, the writing is on the wall: Ultimately, Homebrew will be ARM-only, once Apple's legacy support becomes ARM-only. At which point it's game-over for Intel Macs.
Homebrew solves the "availability of software" problem in the Mac ecosystem, but it does not solve the "Need to stay on the new hardware treadmill" problem.
After installing, 'nb list' and thus e.g. 'nb outdated' will yield the empty list! I have absolutely no use for a competing Homebrew installation that is mostly compatible...
Btw, I noted this:
> Zerobrew is experimental. We recommend running it alongside Homebrew rather than as a replacement, and do not recommend purging homebrew and replacing it with zerobrew unless you are absolutely sure about the implications of doing so.
So I guess it's fine to run this alongside Homebrew, and they don't conflict.
It appears that Nanobrew is not.
I care about the light-weight efficiency of these new native code variants much more when I want to use brew on some little Linux container or VM or CI, than I do for my macOS development machine.
>Immediately get an error saying the install path is too long and needs to be fixed as /opt/zerobrew/prefix is too many bytes.
Yeah gonna need some work.
Same. Whatever happens, the new version should support Brewfile.
My mistake was that when I upgraded from my 2017 iMac (Intel processor) to an Apple silicon Mac at the start of 2024 and migrated via Time Machine, I did not do anything extra specifically for Homebrew. I just assumed that as things got updated via the normal periodic Homebrew updates I run, it would start grabbing the Apple silicon binaries for binary things it installed.
It turns out that is wrong. They made Apple silicon Homebrew kind of independent of Intel Homebrew: Intel Homebrew uses /usr/local and Apple silicon Homebrew uses /opt/homebrew. This allows having both native and Intel Homebrew installed at the same time if you need both.
The correct way to migrate from an Intel Mac to an Apple silicon Mac is to install Apple silicon Homebrew on the new Mac, and then install all the packages you want. Intel Homebrew works fine on Apple silicon Macs so you can use the Intel Homebrew that migrated via Time Machine to make the package list to use with Apple silicon Homebrew (or you can make it on the old Mac).
I only noticed this because I was trying to build something from source using some libraries that were installed via Homebrew and running into problems. An LLM was helping figure this out, and it was telling me I might have to manually symlink those libraries from where they were in /opt/homebrew to where the build process expected to find them, and I didn't have a /opt/homebrew. The libraries were somewhere in /usr/local. I then noticed those libraries were not for Apple silicon, checked other things installed via Homebrew and saw nothing was for Apple silicon, and realized I had the wrong Homebrew.
Since the first three have no dependency on D, a better approach would be to install them in parallel while D is still downloading.
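The scheduling idea can be sketched with a toy dependency graph: packages A, B, and C have no dependency on D, so they proceed while D is still downloading, and only E blocks on D. Package names and timings here are made up for illustration.

```python
import concurrent.futures as cf
import threading
import time

deps = {"A": [], "B": [], "C": [], "D": [], "E": ["D"]}  # only E waits on D
download_time = {"A": 0.01, "B": 0.01, "C": 0.01, "D": 0.05, "E": 0.01}

installed = []  # append order ~ install order

def fetch_and_install(pkg, done_events):
    time.sleep(download_time[pkg])   # simulate the download
    for dep in deps[pkg]:
        done_events[dep].wait()      # block only on *our own* deps
    installed.append(pkg)            # simulate the install step
    done_events[pkg].set()           # unblock anyone waiting on us

done_events = {p: threading.Event() for p in deps}
with cf.ThreadPoolExecutor() as ex:
    futures = [ex.submit(fetch_and_install, p, done_events) for p in deps]
    for f in futures:
        f.result()  # re-raise any worker exception

assert installed.index("D") < installed.index("E")  # E installed after D
assert set(installed) == set(deps)                  # everything got installed
```

A, B, and C never wait on D's event, so D's slow download doesn't hold them up; only E serializes behind it, which is the behavior the comment above is asking for.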
nb info --cask codex-app
nb: formula '--cask' not found
nb: formula 'codex-app' not found
I already have a Brewfile in my dotfiles stored in git, but wanted a way to set up all the little things on my Mac like trackpad settings, dock settings, file associations, etc. nix-darwin is the obvious solution.
Gave the task to ChatGPT and it came back saying it's a good way to get started, but then offered a middle-ground of an idempotent script to set things up. So I investigated the latter, and after a couple of minutes I now have a setupmac() function in my .bash_profile (yeah I use bash) which mostly consists of a bunch of 'defaults' commands and a few other things, and now continue with brew for managing software and setupmac() to setup everything else, and of course manually manage my dotfiles for ghostty/nvim.
I wish I had this earlier, because I just set myself up on 3 different Macs in the last week or so. I'm also glad I don't need to learn a new language and tooling for something pretty simple. Everything is a bit disjointed and not as automated as a proper nix setup and doesn't have that fidelity that nix has, but it's straight-forward, compact in that it sits in my brain easily, and easy to execute.
Do they use some kind of Ruby parser to parse formulae?
[0]: https://github.com/Homebrew/homebrew-core/blob/26-tahoe/Form...
Tried the same package with brew. Worked like a charm.
Uninstalled nanobrew.
It's terrible
There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
Homebrew is working on an official Rust frontend that will actually have full compatibility. Hopefully this will help share effort across the wider ecosystem.