The greatest invention was the global menu bar, as seen on Apple computers. That is the only iteration of the menu bar that respects Fitts's Law: by putting it at the top of the screen, the menu bar has infinite height, so it is infinitely easy to reach.
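For reference, the usual Shannon formulation of Fitts's Law is MT = a + b * log2(D/W + 1), where D is the distance to the target and W is its size along the axis of motion. A toy sketch (the constants a and b below are made up for illustration, not measured values) shows why a target glued to the screen edge behaves as if it were arbitrarily tall:

```python
# A toy sketch of the Shannon formulation of Fitts's Law:
#   MT = a + b * log2(D / W + 1)
# where D is the distance to the target and W is its size along the
# axis of motion. The constants a and b here are made up for
# illustration, not measured values.

import math

def movement_time(distance_px: float, width_px: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted time (in seconds) to acquire a target of size
    width_px at distance distance_px away."""
    return a + b * math.log2(distance_px / width_px + 1)

# A 20 px tall menu bar floating in the middle of the screen:
print(movement_time(800, 20))       # ~0.9 s

# The same menu bar glued to the top edge: the cursor stops at the
# edge, so the effective target size along the approach axis is
# practically unbounded and the log term collapses toward zero.
print(movement_time(800, 10_000))   # ~0.12 s, essentially just a
```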
The global menu bar is bad UX for a windowing desktop because it visually disconnects actions from the window they belong to. It's not always obvious which window has focus, and using the menu bar can easily trigger an action in a different window or application than intended.
On Windows, I don’t stop to think “hmm, which edit menu is this”. And, if I have multiple windows from the same application open (multiple document interface), I know which document the edit menu is for.
There is a genuine tradeoff here. Global menu bars are easier targets and waste less space. Per-window menu bars are more clearly bound to their window. Neither approach is better on all counts.
To be fair, Microsoft got the memo: it took all the useful parts of the global menu bar and put them in the taskbar, which is itself glued to the bottom of the screen and thus respects Fitts's Law. Windows did put the application-specific menus into the per-window title bar, but even then the most important part of the title bar is the section with the close/minimize/maximize buttons. On Windows, a maximized window puts those buttons at the edge of the screen, again per Fitts's Law, whereas the Mac disobeys Fitts's Law here and puts those buttons below the global menu bar.
> Windows did put the application-specific menus into the per-window title bar
Except that's not what Windows does. Just about every app has a different place where you can find the menu (if there is one). Looking at a couple Microsoft apps on Windows 11:
- File Explorer: menu in an ellipsis in the toolbar; the toolbar is left-aligned, so the ellipsis menu sits near the middle of the window for a reasonably wide window.
- Edge: menu in an ellipsis to the right of the address bar.
- Windows Terminal: menu in a tiny down-facing caret next to the new tab button.
That's referring to the long-ago period when MS first decided to forgo the global application menu bar in favor of the Windows taskbar. Indeed, the fact that many programs no longer feature traditional menu bars, but still feature close/maximize/minimize buttons in their title bars, only further reinforces how much more important those buttons are. Even when I used a Mac, even with apps that had useful entries in the global application menu bar, I can't remember a single instance of voluntarily using it for anything (other than "please close this app", because the year is 2022 and the Mac still refuses to acknowledge that closing the last window means I want the app to close).
Good observation, though Microsoft forgot it (again) when they designed Windows 11 and moved the Start button from the bottom-left corner to the middle of the taskbar.
It's movable, of course, but it's still a downgrade from Windows 10. My biggest peeve is that there no longer seems to be an option to not condense the icons, and you can't change the notification color (when an app flashes its icon).
I miss notifications on my work computer all the time because the flash color just doesn't stand out very much with my dark theme, and I can't even expand the width because... well, it feels like they released an unfinished and unpolished product, again.
I hate that every new version comes with trade-offs that shouldn't need to exist.
Meanwhile, I don't think my Mac has ever lost functionality or customizability in the eight-ish years I've been using it as my daily driver.
edit: caveat... this is a work computer and obviously has group policies etc. that might affect things. I will correct this post if I'm wrong :)
Yes, I'm not trying to defend their modern design (in)sensibilities. :P I switched from Windows to Linux (with an aborted attempt at Mac) after realizing the OS-level ads weren't going anywhere.
Not sure how that's relevant to my comment; Microsoft was indisputably "inspired" by Apple, in the same way that Apple was indisputably "inspired" by Xerox.
I disagree. I have a 38" ultrawide monitor (that I love). That is a long way for the mouse pointer to go every time I want to select something from the menu bar.
To make matters worse, I have progressive lenses and the "sweet spot" for the text to be in focus within my field of view is relatively narrow. So not only would I have to move my mouse quite a bit if I had a global menu, I have to move my head quite a bit as well.
I think Fitts's Law is something like Moore's Law: neither can be counted on to hold indefinitely.
Fortunately for me, I use KDE and could switch between the two if I wanted.
The global menu bar is a design flaw and an evolutionary dead end. It doesn't respect Fitts's Law when people have multiple monitors. It also implies alt-tab switching between applications rather than between windows, which is, again, broken when you have multiple monitors (alt-tab to switch windows is the correct choice).
Not only that, but Fitts's Law is dying as mice become less and less common as a way to interact with a device. Touchpads and touchscreens have usurped them.
Fitts's Law is still alive and kicking in the touchpad era. As long as anything drives a cursor, it is relevant. A global menu bar is infinitely tall on a touchpad as well, because I can flick my finger upward at random and I will have reached it.
And I do not get what a global menu bar has to do with window switching and monitor switching. Tiling WM fans all know that focus-follows-mouse is saner than alt-tabbing like a maniac, and this doesn't invalidate the role of a global menu bar.
I know it’s not everybody’s cup of tea, but I wish that global menubar options on Linux were more robust.
There are extensions to add one to GNOME and XFCE, and KDE has it as a built-in option, which is great. The problem is that this feature relies on programs advertising their menus via dbus, and a lot of them don’t bother at all, meaning it’s sitting up there blank half the time. The only reason this is even possible is that the desktop paradigm moved menus into the space of the app UI, which I think is a mistake. Menus are such an important accessibility feature that even if they’re glued to windows, they should be a standardized, system-owned widget so overzealous IKEA-minded designers can’t screw with them.
This is very frustrating to me, because with hamburger menus and window-attached menus I feel like I’m chasing a frequently used widget around the screen.
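To make the "advertising" part concrete: on X11, appmenu-style global menu bars typically look for window properties that point at a menu exported over dbus. Here's a rough, hypothetical sketch (assuming the xprop and xdotool command-line tools and the property names Qt/KDE apps commonly set; Wayland and GTK-style properties aren't covered) of how you might check whether the focused window exports a menu at all:

```python
# Hypothetical sketch (X11 only): check whether the focused window
# advertises a dbus-exported menu, which is what appmenu-style global
# menu bars rely on. The property names are the ones Qt/KDE apps
# commonly set; GTK apps use different ones, and apps that set nothing
# are the ones that leave a global menu bar blank.
# Assumes the `xdotool` and `xprop` command-line tools are installed.

import subprocess

def active_window_id() -> str:
    # xdotool prints the X11 id of the currently focused window
    return subprocess.check_output(
        ["xdotool", "getactivewindow"], text=True
    ).strip()

def window_property(win_id: str, prop: str) -> str | None:
    # xprop prints e.g.:
    #   _KDE_NET_WM_APPMENU_SERVICE_NAME(UTF8_STRING) = ":1.42"
    # or "<prop>:  not found." when the window doesn't set it.
    out = subprocess.check_output(["xprop", "-id", win_id, prop], text=True)
    if "=" not in out:
        return None
    return out.split("=", 1)[1].strip().strip('"')

if __name__ == "__main__":
    win = active_window_id()
    service = window_property(win, "_KDE_NET_WM_APPMENU_SERVICE_NAME")
    path = window_property(win, "_KDE_NET_WM_APPMENU_OBJECT_PATH")
    if service and path:
        print(f"window {win} exports its menu at {service} {path}")
    else:
        print(f"window {win} doesn't advertise a menu over dbus")
```

If both properties come back empty, that's an app that would leave a global menu bar sitting there blank.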
I’ve never really understood this. None of the things you actually want to click obey Fitts's Law. It’s really an argument for having the toolbar at the top of the screen instead. Or having a taskbar at the bottom, which Windows had long before the Dock came along.
(I’m not really the target market for any of these things, to be clear, gimme the screen space back and don’t make me touch the mouse).
The Control Strip in System 7 predates the Start button and taskbar found in Windows 95. If we go back further, NeXTSTEP introduced the Dock, and let’s not forget that AmigaOS and RISC OS both had very similar functionality too, along with others I’m forgetting, all predating Windows 95.
The Control Strip was pretty anaemic though. It was more like the tray (i.e., a dumping ground for stuff people don't regularly interact with). But still, when you do want to play a CD, there you have it in one click, rather than one click blessed by Fitts's Law to drop a menu down, followed by moving back down carefully to find the thing you actually want. In the meantime I just typed the command with autocompletion in Emacs and moved on with my life.
That was designed for a non-multitasking OS with a tiny screen. You have to wonder why they never stopped drinking their own kool-aid when technology changed.
But Microsoft never got the memo.