IMO, this has been their assumption for years, and it actually turned me off when I tried getting used to Mac circa 2006-2007. Coming from Windows at the time, I just couldn't get over a weird anxiety that my application window wasn't maximized, because it didn't look like it completely snapped into the screen corners.
Now, using 34-inch ultrawide monitors almost exclusively, I never maximize anything... it'd be unusable.
Browsers only ever get snapped to the left or right half of the screen for me, too.
Which is something macOS should really improve on, though; the UX there is pretty bad compared to Windows and Linux.
I maximize windows of graphics and video editors.
All the rest I'd prefer to just summon as-needed and then dismiss without navigating away from the windows I care about.
sway/niri want me to tile every window into some top-level spot.
Took me a while to admit it, but the usual Windows/macOS/DE "stacking" method is what I want + a few hotkeys to arrange the few windows I care about.
[1]: https://github.com/esjeon/krohnkite [2]: https://github.com/paulmcauley/klassy
Hover over the green button in the top left of the window. I recently found out about that menu for moving a window between screens, which is also an option it has. (I also just found the same options in the Window menu, if you prefer that. I don't; those options take an extra level of hovering to get to.)
Apple then made things go full screen, but in a special full screen mode, so macOS worked more like the iPad.
By the time they added a way to maximize windows the way Windows does, the idea of maximizing an app had largely worked its way out of my workflow. It was always too much trouble, and I find very few apps where it provides much benefit. Web browsers, for example, often end up with a lot of useless whitespace on the sides of the page, so they work better as a smaller window on a widescreen display. In an IDE, it really depends on what's being worked on and whether text wrapping is something I want. Ideally lines wouldn't get so long that this is a problem.
With the way macOS manages windows, I often find it easiest to have my windows mostly overlapped with various corners poking out, so I can move between app windows in one click. The alternative is bringing every window of an app to the front (with the Dock or cmd+tab), or using Mission Control for everything, neither of which feels efficient.
I could install some 3rd party window management utility, I suppose, but in the long run, it felt easier to just figure out a workflow that works on the stock OS, so I can use any system without going through a setup process to customize everything. It’s the same reason I never seriously got into alternative keyboard layouts.
Full Screen Mode was their answer to maximize, going back many years now (10.7).
Except Safari, which just fills the window out vertically to the full screen height. Kinda weird to make an exception like that, but I don't hate it, because I generally use Safari for reading, and keeping the browser narrow stops lines of text from getting too long when the website's styling doesn't cap them itself.
When I use the Window menu, Zoom replicates what double-clicking the top title bar does, while Fill maximizes the window. This holds true with the behavior you describe in Safari as well.
It just seems like a lot of apps treat Zoom and Fill the same now (I tried Calendar, Notes, TextEdit, and NetNewsWire), which adds to the confusion.
I've never found a setup with multiple desktops (or similar) that lets me switch between the apps I'm using faster than "editor slightly more to the left, browser slightly more to the right, …" and just clicking on a border I know brings that app to the front. I'm sure many think I'm crazy. That's ok. :)
That said, I generally hate the new OSX UI. Every non-interactive UI element just became larger and wastes space I should be able to utilize. Likewise, it made some operations insanely frustrating (here's looking at you, corner drag resize!).
The assumption is that the window should be the size of the content of the document inside.
It turns out that this approach works well for many applications, especially the ones the Mac was designed for in the 80s and 90s. And it's horrid for modern "pro" applications.
I haven’t maximized a window in years. They look ridiculous like that. Especially web pages with their max width set so the content is 1/4 the screen and 3/4 whitespace.
If I ever accidentally full screen a window, and it’s not in night mode, I am instantly blinded by a wall of mostly white empty background!
I frequently use macOS on a projector, it doesn't quite fill my wall floor to ceiling but it comes close. I don't use full screen often, but I do it occasionally as a focusing strategy, and it's fine.
You're shining a bright light on a wall, which you are looking at.
With a monitor you are shining a bright light at your face, while staring directly at the lightbulb!
If you're using a monitor in the dark the way you use a projector, you should turn the backlight down. If you're using it in a well lit room, the brighter backlight should have less of an effect.
It’s probably a me problem, but I’m going to open stuff and then leave it scattered around all day. It’s fine.
I don’t use more than a couple of virtual desktops either. Just one for current tasks and one for background apps.
My actual biggest pet peeve with this setup is the vast number of web sites that deliberately choose to limit their content to a tiny column centered horizontally in my browser, with 10cm of wasted whitespace on each side.
Somewhat relatedly, we use Windows at work, and it drives me crazy when I hop on a computer after someone's been using it and they have every single thing maximized, even Windows Explorer, on 27" monitors. A maximized browser, I get... I don't do it myself but I understand how it can be useful, but maximizing Windows Explorer is just insane to me, and yet a lot of my coworkers do it.
I sometimes maximize something - other than video calls: those are always full-size - on the laptop screen, but otherwise not at all.
I can see how a full-screen IDE makes sense, but I don't use one, so I always want a couple of terminal sessions running alongside my editor.
There are vanishingly few contexts in which I find full-screen helpful. Not criticizing anyone else, or recommending my way of working, but it's what works for me.
[0] I would like better support for desktop management: naming and shortcutting, particularly. Years ago I tried some (I think it was Alfred, or a predecessor) add-on that promised that, but it was super flaky. Does anything exist that works well?
I think there's a conflict between the users who use it on Studio Displays and the users who use it on 13-inch laptops. The Mac team at Apple won't pick a side or come up with two solutions.
That's not completely true, they've been pushing swipe between fullscreen apps for a while.
But that doesn't make any sense on an iMac.
So the recommendation from pro users is to use Alfred to manage windows.
I can’t tell if this is a serious comment or humor.
However, after the internship I went right back to fullscreen/window tiling in Linux, so I can't say I really preferred it. Even now as a GNOME user with a big monitor and Magic Trackpad on my desk, which gives me roughly equal access to either approach, I fullscreen everything.
But for other apps where interactions tend to be brief like Finder, Messages, Notes, Music, etc - yeah I don't usually expand them to full screen.
In general my browser is dead center or slightly to the right so I can access my other windows (terminal, throw away text editor, etc) easily where command tab is insufficient (when I have multiple terminal windows, eg)
I suppose you could splurge for a Mac desktop and then get the cheapest, smallest screen possible, but I hope it’s rare.
I'd like to be able to snap things to the middle third, especially on the ultrawides.
Only little calculator widgets, property panels, and modal dialogs that get immediately closed after use don't get maximized or at least docked to fill some region. I hate the cluttered, layered feeling of having a bunch of non-full-screen windows overlapping, I want to have a dozen apps open and making optimal use of the available display area.
Also just want to be 100% clear: Tahoe is bad and I hate the changes and I don't think the OS should prefer one way of working over the other. I just hope it's helpful to explain my perspective.
I have a 39" ultrawide and I keep every window maximized. I have OCD about this. I can't stand things all layered on top of each other. I like to focus on one screen at a time.
Chromium browsers have been rolling out split tabs and I use that on a couple of tasks where I'm constantly cutting/pasting between sites, but that's about it.
so in response I changed my windowing strategy to having a set of windows floating around at exactly the size I want them, and then the advantage of the enormous screen is just how many windows I can have open at once
that being said, I use KDE, not macOS, and I'd guess 90% of Mac users are on laptops, where this strategy sounds completely insane even to me. On laptops I still default to fullscreening or "half-screening" most apps.
As you said, browser and IDE are the big exceptions, plus things like Lightroom or my 3d printer's slicer.
Even VS Code usually lives as a smaller window when I'm using more a text editor rather than as an IDE.
I have been using it for years, and I just gave up entirely on managing anything; if I zoom out to see all my windows, it looks like the freaking Milky Way, full of windows I forgot about.
Meanwhile, I want to use my graphical, multi-window, preemptive multitasking operating system to, you know, use multiple applications at the same time.
Trying to maximize a window, even 23 years ago when I first moved to OS X, was a completely manual process. It was designed around windows, not walls. And screens were much smaller and lower res back then.
This goes towards something that I've felt for a little while: at some point in time around the early 2000s, operating system vendors abdicated their responsibility to innovate on interaction metaphors.
What I mean is, things like tabbed interfaces got popularized by Web browsers, not operating systems. Google Chrome and Firefox had to go out of their way to render tabs; there was no support built into the OS.
The OS interfaces we have now are not appreciably different from what we had in the early 2000s. It seems absurd that there has been almost no progress in the last 25 years. What change there has been feels like it could have been accomplished in user-space, plus it doesn't get applied consistently across applications, thus making it feel like not a core part of the OS.
MacOS in particular was supposed to place an emphasis on the desktop environment being the space for window- and document-level manipulation, as exemplified by the fact that applications did not have their own menu bars. All application menu bars were integrated together at the top of the screen. Why should it be any different with any other UI organizational feature? Shouldn't apps merely be a single window pane, accomplishing a single thing, so that you combine multiple apps together to get something akin to an IDE out of them?
Well, I don't know if they should be. But they can't. Because OS vendors never provided a good means to do it. Even after signalling they wanted it.
Look at how older versions of Word, Excel, and Visual Studio worked. The tool trays stay consistent as you move between document windows. The entire application is minimizable and quittable together as one.
Photoshop still uses this metaphor. In the early and mid-2000s, Photoshop on Windows had a window for the application separate from the documents, but on Apple's OS 9 and OS X, the only representation of the application itself was in the menu bar. Document windows and tool-tray windows both floated in the same desktop space as every other window.
I haven't checked on the GNU Image Manipulation Program, but I seem to remember it retained the same "no application window, tooltrays and doc windows exist in the DE" metaphor for much longer than Photoshop.
There is also a difference in the way that Chrome renders tabs in the window title area. That's a part of the UI chrome that one would expect to be in the purview of the UI toolkit, but Google took it on themselves.
https://en.wikipedia.org/wiki/Tab_(interface)
Don Hopkins himself can enlighten us about it (NeWS) better than me or literally anyone else in this thread, just wait.
My computer was running so slowly that I had to turn on Reduce Transparency somewhere in System Settings. I think I also turned off opening every app in its own space. And I hid the icons on the Desktop in Finder's settings somehow, which helped a lot. There are countless other little tweaks that are worth investigating.
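For what it's worth, a couple of those tweaks can also be applied from the terminal with `defaults`. This is just a sketch; the preference keys below are the commonly documented ones and may change between macOS releases (and the transparency key may require additional permissions on recent versions):

```shell
# Hide all icons on the Desktop (Finder must be restarted to apply)
defaults write com.apple.finder CreateDesktop -bool false
killall Finder

# Reduce transparency system-wide (same as the Accessibility setting)
defaults write com.apple.universalaccess reduceTransparency -bool true
```

To undo either tweak, write `true`/`false` back the other way, or delete the key with `defaults delete <domain> <key>`.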
I also highly recommend App Tamer (no affiliation). It lets you jail background apps at 10% cpu or whatever. It won't help with WindowServer or kernel_task (which also often runs at 100+% cpu), but it's something.
I can't help but feel that there's nobody at the wheel at Apple anymore. When I have to wait multiple seconds to open a window, to switch between apps, to go to my Applications folder, then something is terribly wrong. Computers have been running thousands of times slower than they should be for decades, but now it's reaching the point where daily work is becoming difficult.
I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples. Then we can finally boot up with better alternatives that force Apple/Microsoft/Google to try again. I could see Finder or File Explorer alternatives replacing the native ones.
That's because some app is spamming window updates.
It's been an ongoing problem for many releases. AFAICT, WindowServer 100% CPU is a symptom, not a cause.
FWIU there's really no backpressure mechanism for apps delegating compositing (via Core Animation / CALayers) to WindowServer, which is the real problem IMO.
https://news.ycombinator.com/item?id=47282085#47310011
Probably my least favorite redesign in the whole update. Why is everything an oval? It's just bizarre.
If the biggest flaw of an OS is the border radius of its windows, you've got yourself a pretty decent OS!
It's not gonna make me leave my darling Linux, ofc, but i think this whole debacle can only be interpreted as praise.
On second thought, it might also be considered a meditation on people's tendency to bike-shed.
Or to say it another way, if we see shit like this then we know the whole thing is a hack.
For example, there is not much you could do to Finder to make it worse.
This argument would also make Windows 11 a pretty decent OS by extension, via "If the biggest flaw of an OS is the position of the Start menu, you've got yourself a pretty decent OS."
In general I could use any minor nuisance as a proof of decency - or inject some to form this argument on purpose as a manufacturer.
People don't like it when their environment changes in minor, unsolicited ways. There's always going to be fuss about these things, which means the fuss itself can't be used to make any strong argument whatsoever.
That’s way more than just the “position of the start menu”
As someone who works on Windows, Mac, and Linux; Windows stands alone in my opinion as the "stepping on legos with no socks on" of operating systems.
There are loads of other flaws with the OS. It just so happens that people care a lot about the design of Apple's products, so people talk about these details.
macOS has been shit for as long as I've used it (8 years) and probably for much longer than that. There are many lists available of macOS problems (https://old.reddit.com/r/MacOS/comments/12rw1sn/a_long_list_... for example); it's just that there's not much point writing a new article about the Finder that's been shit, and unchanged, for a decade.
And the updates to Music (formerly iTunes) are so bad the entire team should be dressed down, Steve Jobs style.
There are things that definitely do bother me, like the Liquid Glass, but the window corners really don't. And I'm into design and constantly inspect parts of the UI with the Digital Color Meter app.
I get the UI consistency thing, but it's okay to transition to new UI gradually rather than making radical changes all at once. If this is still an issue two years from now, it will be more of a concern about their commitment.
What does it say?
Ads in a start menu can die in a fire though.
If you want ads in Spotlight or Launchpad, telling people to tolerate "opinionated, and likely worse but also not breaking" features is exactly how you get it. It's how Windows got there.
The platform would aggregate by major/minor version, and you could see in totality whether the current version of macOS/iOS would make Steve proud or miserable.
Ultimately I decided against it, for defamation/cease-and-desist reasons, and not wanting to find out. But it needs to exist.
Rounded corners are just... bizarre. Just because the laptop casing is physically rounded!? (Yet the menu bar squares it off at the top, and the bezel squares it off on the bottom...)
True, the "blessing" of forced online accounts, telemetry, and advertisement hasn't arrived on macOS yet. But I wonder how long it will take us to get there.
Not really. If you have malware with root access on your system, I think you're already pretty screwed, especially considering that you don't even need root to read all your saved passwords and personal files: https://xkcd.com/1200/
Does anyone actually do this? Especially for heavy-duty applications like my web browser and IDE, this has always felt like a bizarre assumption to me.