The important thing to note is that the M1 hardware doesn't map well to newer OpenGL standards, which is no accident: Apple deprecated OpenGL in 2018 and has designed around Metal since:
> Regrettably, the M1 doesn’t map well to any graphics standard newer than OpenGL ES 3.1. While Vulkan makes some of these features optional, the missing features are required to layer DirectX and OpenGL on top. No existing solution on M1 gets past the OpenGL 4.1 feature set.
> How do we break the 4.1 barrier? Without hardware support, new features need new tricks. Geometry shaders, tessellation, and transform feedback become compute shaders. Cull distance becomes a transformed interpolated value. Clip control becomes a vertex shader epilogue. The list goes on.
OpenGL has been officially deprecated since macOS Mojave (2018), so it shouldn't come as a surprise to anyone that in 2024 the hardware doesn't map well to newer OpenGL features. The media narrative is trying to push this as an "outdoing Apple at their own game" thing, but Apple very clearly stopped supporting OpenGL and did so with advance warning years ago.
That said, managing to translate the newer calls into compute shaders and other tricks is a seriously impressive accomplishment.
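To make the emulation idea concrete (a conceptual sketch only, nothing to do with Asahi's actual code): a geometry-shader-style stage can be re-expressed as a data-parallel pass in which each input primitive is expanded independently, which is exactly the shape of a compute dispatch:

```python
# Conceptual sketch: a geometry-shader-like expansion recast as a
# data-parallel pass, the way a compute shader would run it.
# Each input triangle is expanded into its 3 edge segments independently,
# so every expansion could be one compute invocation.

def expand_triangle(tri):
    a, b, c = tri
    return [(a, b), (b, c), (c, a)]

triangles = [((0, 0), (1, 0), (0, 1)),
             ((1, 1), (2, 1), (1, 2))]

# On a GPU, each expansion would run in its own invocation and write into a
# pre-sized output buffer; this flat loop stands in for that dispatch.
lines = [seg for tri in triangles for seg in expand_triangle(tri)]
print(lines)   # 6 line segments from 2 triangles
```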
They shouldn't have deprecated OpenGL, though. It's really important for desktop software; several games, like Elite Dangerous, have lost Mac support because of it.
Of course they push Metal, but that's not interesting to desktop developers.
One of the coolest things (IMO) about the entire Asahi effort, and why I'm not at all surprised that they surpassed Apple, was the dedicated effort to build bespoke developer-friendly Python tooling early in the reverse engineering process.
> Since the hypervisor is built on m1n1, it works together with Python code running on a separate host machine. Effectively, the Python host can “puppeteer” the M1 and its guest OS remotely. The hypervisor itself is partially written in Python! This allows us to have a very fast test cycle, and we can even update parts of the hypervisor itself live during guest execution, without a reboot.
> We then started building a Python implementation of this RPC protocol and marshaling system. This implementation serves a triple purpose: it allows us to parse the DCP logs from the hypervisor to understand what macOS does, it allows us to build a prototype DCP driver entirely in Python, and it will in the future be used to automatically generate marshaling code for the Linux kernel DCP driver.
If you watch any of Asahi Lina's streams from the time before they had their full drivers implemented in Rust, she's able to weave together complex bitflag-manipulating pipelines at the speed of thought with self-documenting code, all in Python running on the host machine, all while joking with viewers via her adorable avatar. I've never seen anything like it before. The whole workflow is a tremendous and unprecedented accomplishment by the entire Asahi team.
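To give a flavor of that tooling style (a hypothetical sketch in the spirit of the m1n1 Python helpers, not their real API), a tiny descriptor-based register class is enough to make bitflag pipelines read like documentation:

```python
# Hypothetical sketch, *not* m1n1's actual API: named bitfields over a raw
# 32-bit register value, implemented with Python descriptors.

class Field:
    def __init__(self, hi, lo):
        self.hi, self.lo = hi, lo
        self.mask = (1 << (hi - lo + 1)) - 1

    def __get__(self, reg, owner):
        return (reg.value >> self.lo) & self.mask

    def __set__(self, reg, val):
        reg.value &= ~(self.mask << self.lo)
        reg.value |= (val & self.mask) << self.lo

class PowerCtrl:
    """Imaginary 32-bit power-control register (layout made up)."""
    ENABLE  = Field(0, 0)
    VOLTAGE = Field(7, 4)
    DOMAIN  = Field(18, 16)

    def __init__(self, value=0):
        self.value = value

reg = PowerCtrl()
reg.ENABLE = 1
reg.VOLTAGE = 0xA
reg.DOMAIN = 3
print(hex(reg.value))   # 0x300a1 -- named fields instead of magic shifts
```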
I completely agree, this is a great talk. I can listen to Bryan and friends speak for hours. I'm not sure if he's a natural storyteller or if he's learned it over time, but that combined with his experience just makes for a treat.
Knowing how to build effective scaffolding, in the right sequence, that continuously enables tighter and faster iteration loops on a project is a huge meta-skill, and part of why the most experienced engineers are orders of magnitude more effective than others.
I was wondering if it's possible to formalise this and cast it into a book or lecture so that new devs don't have to grind out ten years of hard-won experience. But at this point I think the only way to really learn it is to go through, e.g., dependency hell yourself at least once.
Hmm, is there a similarity with postmarketOS here? I have a fuzzy memory that pmbootstrap or some other tool was pretty important for starting ports to new devices?
6 years ago I said I'd never give apple another cent, but the Asahi Linux project, and especially their efforts around OpenGL, or more specifically their ES 3 support, finally convinced me to pick up a secondhand M1 last month. Amazing work by the team!
I do wonder if one day the USB4 ports will fully support USB 3; it's so confusing.
Anyway, I haven't felt the need to boot macOS so far :) and the installation was a breeze. Big thanks to the team.
As a bonus, the unified memory allowed me to get an LLM running locally, though I suspect it's CPU-bound and probably not using the new GPU driver.
I love this work, but I will point out that the Asahi GPU driver still struggles on certain real world workloads.
The one I run into quite often is that Google Maps will hang for long periods of time. This is a known issue and they are working on it, so I'm sure it will be fixed sometime this year.
> For a bit of context -- Google Maps loads images to the GPU at.. inopportune times. While games would typically load their images during a load screen (so slow image loading just means longer loading screens), Google Maps loads when scrolling around I think (so slow image loading means the whole map stutters). I don't think there's a fundamental driver bug we can fix here, but we can make image loading a lot faster which makes the symptoms go away.
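To illustrate the pattern being described (a toy sketch, nothing to do with the actual driver or Google Maps internals): prefetching uploads on a background thread keeps the slow step off the interactive path, which is also why simply making image loading faster makes the stutter go away.

```python
# Toy sketch of "upload ahead of time" vs "upload on demand and stutter".
from concurrent.futures import ThreadPoolExecutor
import time

def upload_to_gpu(tile_id):
    """Stand-in for a slow texture upload (the expensive step in the quote)."""
    time.sleep(0.05)
    return f"gpu-texture:{tile_id}"

pool = ThreadPoolExecutor(max_workers=4)
cache = {}  # tile_id -> Future for the uploaded texture

def prefetch(tile_id):
    # Kick off the upload before the tile scrolls into view so the
    # render loop never blocks waiting for it.
    if tile_id not in cache:
        cache[tile_id] = pool.submit(upload_to_gpu, tile_id)

def draw(tile_id):
    prefetch(tile_id)                # ensure an upload is at least in flight
    future = cache[tile_id]
    if future.done():
        return future.result()       # ready: draw the real tile
    return f"placeholder:{tile_id}"  # not ready: low-res stand-in, no stall

# Simulate scrolling: prefetch one tile ahead of the view as we move.
for visible in range(3):
    prefetch(visible + 1)
    print(draw(visible))
    time.sleep(0.06)                 # pretend a frame takes this long
```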
It uses Mesa and similar parts of the normal GPU support stack,
and as far as I vaguely remember it also involved some kernel patches outside of the Rust kernel module.
Really a great achievement, where in the end Rust was a tool that made certain parts easier, but the hard parts are ultimately unrelated to Rust (reverse engineering, coming up with solutions for the many issues of mapping GPU APIs Apple doesn't care much about onto hardware focused mainly on Metal, etc.).
Just thinking about how to prioritize things in all of the reversing is enough to overwhelm me. Have they talked about how they prioritize?
"Asahi's most recent update blog post, published in mid-January, highlighted HDMI support, support for DRM-protected websites via Google's proprietary Widevine package, Touchbar support for the handful of Apple Silicon Macs that use one, and more."
Seems odd to me that an open-source-focused Linux project would spend so much effort on supporting DRM before USB 3 functionality. So for them to go that direction implies to me that they have good reasons. Are they trying to satisfy users? My knee-jerk reaction would be that the users willing to run this would be accepting of not supporting DRM.
The Asahi team isn't some monolithic corporation with a prioritised backlog.
The skillsets and amount of work involved in getting Widevine working (quickly porting the binary from ARM64 ChromeOS) vs USB 3 (reverse engineering Apple's unique, undocumented implementation and writing a Linux kernel driver for it) are completely different.
It's open source, if someone volunteers a Widevine implementation for Asahi then the maintainers aren't going to say no.
Generally speaking Alyssa does GPU reverse engineering, Marcan does the other hardware reverse engineering, Asahi Lina writes the GPU driver and everyone else does various miscellaneous bits (like userspace binaries).
> The skillsets and amount of work involved in getting Widevine working (quickly porting the binary from ARM64 ChromeOS) vs USB 3 (reverse engineering Apple's unique, undocumented implementation and writing a Linux kernel driver for it) are completely different.
For one data point, I'd much rather have DRM supported than USB 3 support: one lets me do casual things and use my machine while I relax, while for most USB 3 devices I can just suck it up and wait for a file transfer, or do what I normally do and work off of my NAS.
> Rosenzweig's blog post didn't give any specific updates on Vulkan except to say that the team was "well on the road" to supporting it. In addition to supporting native Linux apps, supporting more graphics APIs in Asahi will allow the operating system to take better advantage of software like Valve's Proton, which already has a few games written for x86-based Windows PCs running on Arm-based Apple hardware.
Does anyone know whether these improvements will also help with gaming on macOS? I assume macOS ships its own driver without these APIs, but will e.g. Proton for Mac or Whisky (Wine for Mac) be able to make use of them?
Apple should embrace this if the project is in need of reference materials.
Apple is primarily a hardware company, so if someone buys their hardware because it will be compatible with an OS they like more, that's still a sale for Apple. No money is lost, because Apple doesn't charge for a macOS install on Apple hardware.
Apple does not want to be a hardware company; in fact, when they sell lots of hardware, their stock price goes down. If they can't charge a monthly fee for something, they're not doing it.
I didn’t say they’re not a hardware company, I said they don’t want to be one because it makes shareholders very uneasy about the stability of the company.
Honestly, I have been a bit disappointed by Apple. On one hand, they make some seriously awesome hardware with amazing performance in tasks like rendering. On the other hand, they seem incapable of getting awesome performance in other areas, the hardware is incredibly locked away from the supposed owners of the devices, and the pricing for RAM and storage is insane. I don't want to dislike Apple, given the great hardware and their stand against US law enforcement on things like encryption, but a handful of things shatter the experience for me. Asahi seems to be solving part of the problem, but the hardware being so ... expensive and locked away remains an issue.
I imagine they are, as well as the E2E encryption added to iCloud after they dropped the CSAM thing.
My pet theory is that a rogue VP started the on-device scanning project with whatever external org they worked with, and basically blindsided the rest of the company. Tim Cook probably called the person and was like, what the fuck. Or at least I'd like to imagine that. I wonder if that VP still works at Apple?
My understanding is that on-device scanning was meant to prevent an argument from {government, law enforcement, 'think of the children' types} against E2EE solutions.
In general, Apple's approach to a lot of things is to favour on-device solutions.
True, but it's really hard to imagine how they ever expected the "my own device spies on me by design" approach to fly. Maybe they thought the "for the children" argument would convince consumers, which was obviously a huge miscalculation. Even though they clearly thought of safeguards so it wouldn't lead to many issues in practice, it's just so incredibly wrong at the core that it creeps people out just thinking about it.
I can't think of anyone in my circles who would accept this. It's just so extremely out of bounds. And I lost a lot of respect for Apple as a result considering they thought it was ok to propose this.
And it would have been total theater anyway because obviously the real offenders would just use something else.
One could argue that it was intended not to fly, and that Apple successfully threaded the needle of getting E2EE out the door without invoking a legislative response on the basis of "think of the children/terrorists".
Look at what Apple did: they lined up E2EE, got the pushback from law enforcement, floated a solution that would have accomplished the stated goals, then effectively invoked the court of public opinion, which said "hell no", and then just launched E2EE anyway.
Now we are in the end state of successfully having E2EE without a bunch of backdoors, and the only real victim is Apple's public reputation with a bunch of Android nerds who were never going to like them anyway.
> the only real victim is Apple's public reputation with a bunch of Android nerds who were never going to like them anyway.
I don't agree. I have several Apple fan friends whose admiration for Apple really dropped. They're still on iOS and Mac because they're so invested (walled garden and all). But the enthusiasm and deep devotion is gone.
Not just because of that, though it was the first drop. But also the recent sideloading malicious compliance thing. That deep trust is gone.
I used to be an Apple fan too, but I dropped off earlier (wanting to decide for myself what I can run on my own hardware), so I'm definitely an Android nerd. But this made waves even inside the Apple camp.
It feels like the hardware engineers are just marching blind, making stronger hardware for these computers by the year. The hardware feels like it has no constraints when you run a game: the emulation layers the game has to run through seem to limit the FPS to 40, but interestingly there is zero performance impact from changing graphical settings all the way to ultra for many games running through Rosetta. This shows there is a ton of headroom available, if Apple's developers bothered to make it easy for game developers to write natively for these computers.
Instead, good faith between Apple and third-party developers is totally burned, and not even Valve is writing games for the Mac anymore, despite how absurdly performant these devices are, especially compared to ten years ago, when Mac gaming was supposed to get its renaissance per Apple's marketing and the good relationships they had with developers back then.
I switched to an M1 Air back at the beginning of 2021, was on Linux (desktop machine) about a decade before that. It's basically purely a work machine (software and management).
The hardware is amazing. The battery life is astounding.
The software, in almost every area, is inferior to my experience on Linux, with the Apple parts generally the worst of all (only Microsoft Office is worse). After 3 years of using it, I'm probably 70% as efficient on macOS as I was on Linux.
The one thing macOS did better than my Linux box, hands-down, was supporting a 4k monitor. My Linux install was a little long in the tooth, though, so hopefully it's gotten better?
I really look forward to using Asahi when/if it's ready. I'd love to go back to Linux.
Example gripes:
- macOS offers this great feature that announces the time every half hour (or whatever you set it to). For a year after a major OS update, it just didn't work after a reboot unless you went in and changed it. Solutions? Wait for Apple to fix it (or write a wonky script to fix it on boot, which I did). Eventually they did, but ugh.
- to support using 3 finger clicks as middle-click I had to buy software. On Windows this is just built in. On Linux it's on by default.
- to make Spotify work somewhat properly I had to download separate software (BeardedSpice). After 3 years, I forget what it actually does
- if I open Spotify and hit the Play button on my keyboard, Apple Music opens. (Once I actually play Spotify it fixes it, though)
- it took me a LONG time to figure out how to set Excel to open CSVs instead of Apple's Numbers app
- The Apple calendar app alerts me whenever I create an event in another app (Outlook, Google, whatever). Like I got invited. To an event I created. It didn't used to do that...
It seems a lot of those gripes are about specific apps, or simply not understanding that macOS does some things differently.
By contrast, I have never been able to use Linux successfully as a desktop OS, because there is a dearth of polished commercial apps and the hardware compatibility has always been iffy. macOS generally works out of the box with the hardware, and the apps are generally highly polished.
I caveat the apps as “generally” because there are some turds for sure. iTunes and now Music have always been horrible, and somehow they managed to make Music even worse than iTunes.
Historically Macs do benefit from 3rd party software (see also Alfred), over time the best of it seems to eventually get baked into the OS. I don’t see that as a downside really.
It's always about specific apps. I need about 10 apps to work on my work machine, and if any of them fights the OS, that's enough to degrade the whole experience.
In the olden days it was having a decent Unix command-line environment. Windows didn't, OS X and Linux did, and many of us switched. For others it's a good Adobe suite experience. For others it will be Excel and Office, etc.
I totally sympathise with the parent about Apple Music starting up every damn time the play button is pressed without a media app already running. That's a button I used to press a lot, and at some point I stopped playing music on the Mac altogether.
The move to kill kernel extensions nullified a lot of the historical advantage of the Mac: a lot of software that changed low-level behavior became much more unreliable. Having keyboard remappings get stuck or become unresponsive, for instance, is a huge QOL degradation.
Linux's desktop experience also lacks polish in a lot of areas. It's not a talent or expertise issue, it just comes down to lack of resources. You can do a lot of polishing if you say here's 50 million dollars and a staff of 50, have at it.
This x100. I have regularly tried to set up a Linux environment, thinking that there really isn't that much of a barrier to being productive on Linux, but with the lack of polished commercial apps and the hardware quirks I always seem to spend more time finding workarounds than actually working.
I have found more recently that I can get something reasonable (for me) set up, although I'm confident I'll always need a Mac nearby to do some things on.
I didn't really run into this as an issue at all. Then again, "commercial apps" sounds largely like a relic of the 90's. It has Slack and Teams and Spotify and can run most games thanks to Valve.
Biggest thing I ran into was that MS Word is too buggy to 100% reliably read anything created in alternate programs. I know it's Word and not LibreOffice or anything because it happens between different versions of Word too.
The tone of the community is really different, too, and IMO that's the worst thing about the transition from Linux to OS X. And OK, it's not a fair comparison, because Linux users tend to be way more technical, due in part to the OS family's barriers to entry.
Still, it's annoying when you search for how to do something in OS X and the first few threads are full of people saying "that isn't how it works! Instead of wanting to change it, you need to understand and follow the Apple Way™!" You're usually able to find the solution after a bit more searching, but the Apple Forums are especially bad.
I still remember the apologists (circa 2011) trying to justify OS X's system-wide autosave feature when it came out, despite its obvious pitfall: users could make inadvertent edits to their documents without noticing. When users pointed out that Apple's own Xcode didn't adopt the feature, the apologists' excuse was simply "IDEs are different".
Autosave also has versioning. You can always go back through the autosave history. You can also revert to the originally opened file. This is done from the File menu.
Also, if, like a good Mac user, you have Time Machine configured and on, you can browse your versions even further into the past.
Heavy iWork user, I don’t know about Xcode though.
Does the version browser show a line-by-line diff? When I last used it ~10 yrs ago it displayed a fancy Time Machine-like interface that was useless for telling if I accidentally inserted a character in page 11.
If your mental model for how the software works is not an accurate abstraction for how it actually functions, you're going to be frustrated. If, when confronted with evidence that your mental model is inaccurate, you refuse to update your mental model, you will have difficulty coming up with solutions for the things that bug you, and difficulty even asking the right questions.
Being mad at software for operating in a different paradigm than you are used to is not productive. This applies just as much to users coming from Windows (or imitations thereof) to macOS as it does to users coming to git from svn.
The "unreasonable men" who come up with better solutions are usually the people who have the deepest understanding of what is wrong with the status quo. I don't think the people who refuse to understand the problem are usually coming up with good solutions.
Understanding is a totally orthogonal concern. Just because someone doesn't like "The Apple Way(tm)" doesn't mean they don't understand it. Believing so just gives the "Apple's a cult" type people more ammo.
What most people on this thread are asking for is to be given the ability to mold/configure things that most other non-Apple apps provide. It's a well known trope to have the Apple culture say "You are wrong, do it this other way instead" rather than "here's how you get what you want".
Okay, yes, but git and an OS are very different pieces of software. One does one thing (version control), and to be effective at that single task it's highly opinionated. This is fine and good; the restriction allows the tool to laser-focus on being good at a few things. OTOH, an OS needs to do many very different things, as many different things as there are users. Creating a single paradigm, and obstructing basic configuration stuff like mouse input settings, is the opposite of what you want in this case. An OS should be flexible and customizable, to allow the user to fine-tune it to match their use case as closely as possible. Just telling an OS user to "get with the program" kind of misses the point, IMO. I prefer OSes which meet my needs, rather than the other way around.
You may wish to note that I'm explicitly not telling the OS user to just "get with the program". I'm telling the OS user that they need to understand the OS before they can effectively modify it to suit their needs.
> Still, it's annoying when you search for how to do something in OS X and the first few threads are full of people saying "that isn't how it works! Instead of wanting to change it, you need to understand and follow the Apple Way™!"
> Being mad at software for operating in a different paradigm than you are used to is not productive. This applies just as much to users coming from Windows (or imitations thereof) to macOS as it does to users coming to git from svn.
I guess I can see how you intended your comment to mean what you're saying, but reading these two comments in sequence I think one can understand my confusion. You may wish to directly say what you mean, rather than speak in glib generalities.
I don't think I'm the one being glib here. I'm trying to both illustrate that there's a problem that truly is general and not specific to macOS, and trying to make the fine distinction between telling users to understand and suck it up vs telling users to understand so that they can better figure out how to get what they want. That latter distinction is what you seem to be having more trouble with.
The way these modern Macs are set up in terms of permissions is really strange, and it has put some projects on the back burner for me.
My latest woe is getting newsboat to refresh my RSS feeds, either as a cron job or a launchd service. On my Mojave machine this was as trivial as it sounds: either a single line in the crontab, or a single line with some window dressing to turn it into a launchd plist file, and it worked. When I try to do this now it doesn't: newsboat runs and pulls changes, but is unable to actually write them to the cache database.
I have gone down the rabbit hole of giving full disk access to the shell, cron, launchd, launchctl, and the newsboat binary, but no dice. I can't get it to work, and I can't find anything about why this shouldn't work other than permitting full disk access. Even then, that shouldn't be required, because the cache.db file sits in the user's home directory.
I have looked at the newsboat logs at the most verbose setting and nothing stands out, other than the run taking 1/10th the time it does when I run newsboat -x reload interactively, and no writing to the cache.db file. Nothing in stderr or stdout. According to the newsboat logs it connects to the feeds and outputs some new article titles along with the time to connect. It does manage to create the cache lock file, but that's it. It's maddening, because at this point I feel like I'm out of levers to pull.
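For reference, the launchd side of this is just a LaunchAgent plist like the sketch below, saved under ~/Library/LaunchAgents and loaded with launchctl load. The label, the Homebrew binary path, and the 30-minute interval are all assumptions, and note this does nothing to solve the TCC/permissions problem described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label; anything unique works -->
    <key>Label</key>
    <string>local.newsboat.reload</string>
    <key>ProgramArguments</key>
    <array>
        <!-- Assumed Homebrew path on Apple Silicon; adjust to `which newsboat` -->
        <string>/opt/homebrew/bin/newsboat</string>
        <string>-x</string>
        <string>reload</string>
    </array>
    <!-- Run every 30 minutes -->
    <key>StartInterval</key>
    <integer>1800</integer>
    <key>StandardErrorPath</key>
    <string>/tmp/newsboat-reload.err</string>
</dict>
</plist>
```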
By 4k monitor do you mean a high-DPI monitor? Not much has changed with regard to resolution support, which should always have been fine, but a lot has changed in high-DPI support, with Wayland now having "real" (direct) fractional scaling, which makes it even better at approaching the problem than macOS, which has "fake" (render-and-scale) fractional scaling. KDE Plasma (the default desktop environment for Fedora Asahi) and Qt apps should support this already; I don't remember if GNOME has gotten around to defaulting to the newer protocol yet or if it's still experimental.
On Linux, I plugged in the 4k and everything was tiny tiny tiny. I tried to make it a little bigger, and some stuff grew, other stuff didn't, and some stuff just broke. In the end I was stuck in this weird Frankenstein of settings that looked like crap on any monitor. Like I said, this was over 3 years ago and the Linux install was very old at that point (I think it was a Linux Mint based on 18.04 at the latest, possibly even 16.04 still).
Mac has a slider that changes the scaling of everything, smoothly and all at once.
See but that's exactly the thing, on Mac you can just pick a scaling and every scaling option looks flawless across the board.
Integer scaling is silly for me: it's either huge or tiny.
Yes, KDE does it better, and with Wayland you can also do fractional scaling. But the original point stands: it sucks on (most) Linux and works flawlessly on the Mac.
It's not that GNOME can't, it's that it's not recommended. GNOME actually handles e.g. 150% scaling the exact same way macOS does: render at a higher resolution, then downscale. The reason it's not recommended over integer scales is that it's inefficient, not that it doesn't work. The same recommendation exists on macOS. The quality-of-life difference is that macOS is generally better about assuming a given display should be high-DPI the first time you use it (and in such cases doesn't even show you the tiny options by default). So long as you select the same scale, things should look identical between GNOME and macOS.
Windows/Android also use the superior direct-render approach. The reason you always heard about Windows having horrible scaling was legacy Windows apps not understanding how to use scaling; the actual scaling approach itself is very good (and almost all apps still getting updates should be fine by now).
> See but that's exactly the thing, on Mac you can just pick a scaling and every scaling option looks flawless across the board.
Not really, if you have a discerning eye.
Any non-integer resolution on macOS is rendered at twice that resolution, and then raster-scaled down and anti-aliased. So, if you have a MacBook with a 2560×1600 display and you select any resolution besides that one and 1280×800 (exactly half), the desktop render will be raster-scaled.
To be fair, it works better than Wayland's previous behaviour because of the 2× rendering factor, but you can absolutely still see scaling artifacts if you look for them, and I certainly do. Ringing around text in a dark-mode editor is especially obvious.
The only OS that has gotten HiDPI right is Windows.
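To put rough numbers on the difference (a toy sketch; the panel size and scale factors are illustrative, and the "looks like" size follows macOS terminology):

```python
# Toy comparison of the two fractional-scaling strategies discussed above.

def macos_style(native, looks_like):
    """Render everything at 2x the logical size, then raster-downscale to native."""
    render = (looks_like[0] * 2, looks_like[1] * 2)
    downscale = native[0] / render[0]   # != 1.0 for non-integer scales -> resampling
    return render, downscale

def direct_style(native, scale):
    """Wayland/Windows-style: lay out at native/scale, rasterize once at native."""
    logical = (round(native[0] / scale), round(native[1] / scale))
    return logical, native              # no second resampling pass

native = (2560, 1600)
print(macos_style(native, (1680, 1050)))  # renders 3360x2100, downscales by ~0.76
print(direct_style(native, 1.5))          # logical 1707x1067, one pass at 2560x1600
```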
“Better” still has some asterisks. For example, though KDE is by far the best of the major DEs in terms of fractional scaling support under Wayland, it still has some oddities, like Aurora window decoration themes not drawing properly with fractional scaling, limiting the user to one of a tiny handful of C++ window decorations (the overwhelming majority are Aurora). GTK apps under KDE can act a bit funny too, with e.g. your cursor drawing at 2x when hovering over a GTK window.
I deal with these things daily using a Thinkpad with a display that requires 1.5x scaling to be usable. Can’t wait for these issues to be solved and for the asterisks to disappear.
> GTK apps under KDE can act a bit funny too, with e.g. your cursor drawing at 2x when hovering over a GTK window.
Yeah, to my knowledge GTK4 doesn't support fractional scaling so far, but the new™ NGL and Vulkan renderers should be able to do it. Whether it's actually hooked up to do so is another question.
Some people prefer Linux (or Windows) to macOS, and that's perfectly reasonable. Some of your gripes are a little curious, though:
> Solutions? Wait for Apple to fix it (or write a wonky script to fix it on boot, which I did). Eventually they did, but ugh.
Writing a wonky script to fix something on boot sounds eerily like my Linux experiences.
> to support using 3 finger clicks as middle-click I had to buy software. On Windows this is just built in. On Linux it's on by default.
This could also be phrased as "in some ways macOS works differently to Windows and Linux. However, there is a solution for those who prefer the Linux way."
> to make Spotify work somewhat properly I had to download separate software...
That's Spotify, not macOS.
> if I open Spotify and hit the Play button on my keyboard, Apple Music opens
Yes, Apple Music opening for inexplicable reasons is very annoying.
> it took me a LONG time to figure out how to set Excel to open CSVs instead of Apple's Numbers app
That's just not understanding how macOS works. A 10-second web search would have ended your suffering.
> The Apple calendar app alerts me whenever I create an event in another app (Outlook, Google, whatever).
Your list sounds more like app-related things than OS-related things, to be honest. Also, contrasting your issue list with the issue list of Linux (Wayland barely works, audio issues are constant, the whole system is so delicate that an automatic update can easily brick the OS, and there are 10 different ways to install apps with no consistency at all: Flatpak, Snap, .tar.gz, AppImage, native binary, apt-get, software center ...) makes macOS sound really great.
My main gripe: on vanilla macOS, alt- (cmd-) tabbing cycles through applications first and then _all their windows_, rather than windows first, and does _nothing_ on a per-application basis.
I cannot imagine how this matches anyone's workflow. Does nobody at Apple work on a single project across multiple applications (documentation in a browser, your IDE, an emulator, some terminals, etc.)?
You have to install a specific app and give it some crazy permissions to fix those issues. Pfft
Command-` is the shortcut you want for cycling through windows within an app, otherwise Exposé/Mission Control is the layer that can be configured to think in “windows” instead of in “applications”.
Tiling WMs tend to make heavy use of the Super key. I would hate to lose all my keybinds because copy/paste and the like moved onto it. I do think the macOS approach is conceptually cool (emacs/readline binds everywhere is neat) and makes sense, but I also hate actually using macOS, especially the window management. I guess if there were a way for people to have it both ways, all would be well.
How? They exist and they're customizable, unlike Windows's.
And I'll bet you can't bind xkill to Ctrl-Alt-Esc, so that your cursor turns into a little skull-and-crossbones and whatever you click immediately vaporizes via a SIGKILL, on a Mac. :p
I'm referring specifically to the use of Ctrl as a primary key for both the shell and the UX. How did this happen?
Anyway, keybindings are certainly changeable if you're ready to dive into a hundred distinct projects, each defining its own key handling. There's no truly system-wide way to alter keybindings in any meaningful sense.
It happened because Windows adopted IBM CUA, and the ‘desktop Linux’ crowd had a fetish for copying Windows. Before that, X11 programs were typically fully configurable, with key bindings in X resources.
Currently I think KDE is the least-bad option, as common shortcuts can be remapped globally, but it could be a lot better: Qt can universally remap 'Control' shortcuts to the GUI key, but that's only available on Mac builds.
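For instance, here's the kind of per-application binding X resources allowed (an illustrative snippet using xterm's translation tables and its insert-selection/copy-selection actions):

```
! Illustrative ~/.Xresources snippet: per-app key bindings via translation tables
XTerm*vt100.translations: #override \n\
    Ctrl <Key>V: insert-selection(CLIPBOARD) \n\
    Ctrl Shift <Key>C: copy-selection(CLIPBOARD)
```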
On balance, I really like that Ctrl is overloaded in that way. Means that Super/Win/Cmd can be used for a wide variety of things on top.
For example, I use Super for tiling window management, launching programs, and other things. On macOS, with Yabai and skhd, you can't use plain Cmd to do that.
Overloading Ctrl has downsides, but I think it is a net benefit overall.
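For illustration, here's what an skhdrc can look like (a made-up sketch; it binds alt rather than plain cmd, since cmd-<letter> collides with application shortcuts as noted above):

```
# illustrative ~/.config/skhd/skhdrc -- alt instead of plain cmd
alt - h : yabai -m window --focus west
alt - l : yabai -m window --focus east
shift + alt - h : yabai -m window --swap west
```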
I mean, yeah, but in practice I run into the frustration that the terminal doesn't have a key dedicated to commands far more often than I need to do something globally.
Anyway, you can use Cmd, you just need to pass it an actual (non-modifier) key along with it; I have no use for a key that does only one thing.
The problem is that many apps on macOS use Cmd + one or more other modifiers for their own shortcuts. On Linux at least, no apps that I know of use Super in any shortcuts.
I'm trying to manage working on both Linux and macOS, and this has been the number one frustration. Fortunately, toshy.app (as well as Kinto) exists and does a pretty great job of remapping shortcuts to match macOS.
I thought Asahi Linux's lead developer didn't allow discussion of his project here? I remember there being quite a lot of fallout from that. What happened?
Probably just not wanting to deal with the unsavory side of the lurking userbase on HN, or the poor, ill-informed discussion on sensitive/political topics? There are certain things HN is not good at talking about.
I've seen multiple people express this aversion towards HN.
They can’t forbid discussion but they can try to make it so people visiting from HN can’t see the page; they aren’t the only ones, either, as the poster upthread mentioning jwz’s image redirect noted.
For a short time, he monitored the Referer header and made his website display an offensive message if you clicked through from HN.
HN's mods then edited links to his domain so they wouldn't send the Referer header, and he responded by implementing a browser exploit to detect whether certain HN URLs were in your browser history.
In any case, I apologize for using the wrong wording.
The M-series chips are pretty much unique in the market right now. Being able to do real work for several hours on battery is quite impressive. Hopefully that carries over to a Linux install.
Interesting! I didn't know they were the same person, but coming from a queer perspective, I'm inclined to just treat people as they ask to be treated. If someone is doing impressive work, I think it's important to respect the way they want to be treated, even if it's non-traditional.
Some parts of the programming community have always embraced counterculture, things the public sees as bizarre. Surprise surprise, it's also the part that spends a lot more time on the computer than even the already-addicted average programmer. So they tend to be godly at the craft compared to us lesser mortals who still maintain some normal social behaviour, and they can produce amazing stuff like this. Be thankful for it.
We knew you could, but you might as well not. Even if you don't see the empathy angle, who benefits from alienating people who do amazing work for free?
Even if you restrict his work to technology, it might be a wash at best.
Wasn't he one of those leading the charge in getting Cloudflare to drop a site due to bullying allegations? I think I'd rather have Cloudflare be neutral than have a few Linux drivers for new MacBooks a couple of years sooner than otherwise.
I didn't follow the saga as closely as I should have, perhaps, given the gravity. But this* didn't take long to find, only shortly preceding the issue.
But I don't see it as alienating. Marcan seems like an intelligent person, so he knows that his vtuber persona is weird. I suppose other people noticing and commenting on it is something he likes or even encourages. It's like... people who dye their hair green, or who wear thick gold chains or teeth grills. They want the attention.
As someone who is weird along many dimensions, I do expect people to treat harmless weirdness as normal. Expanding the set of acceptable behaviors to encompass all harmless ones is an important part of social progress.
Maybe consider that the loudness you resent is the result of reflexivity: in a world where no one would bat an eyelid at such persona creation and whatnot, there would only be the loudness you'd mistakenly perceive.
I'm wondering if you call authors using pen names "weird".
If I ever did YouTube videos of anything at all, I would more likely than not adopt some persona (and give it a weird accent to boot) so it would distinctly not be "me".
Look at you, HideousKojima. What's your real name? Why do you pretend to be this online alias? Is it normal to hide your identity and interact as if you were someone else?
Take it with as many grains of salt as you want: IMO we're watching anime eat the world, just like software did. Anime-style audio-visual stimuli are synthetic data with humans and their sexual drive in the loop, now also connected to Internet mob-judgement systems like Twitter for a decade or so. Instagram-style beauty filters and diffusion ML models are no technical match, as their feedback loops are much longer and more constrained by ideology as well as by physicality.
Some platforms like TikTok and the App Store have been fighting to un-realize the shift, but that's only been delaying the inevitable. We already saw an increasingly beautified Tim Cook, an anthropomorphized Earth, and two Chinese anime games in Apple's presentations last year; there's only going to be more.
This is such lazy framing. What the team has done is great, but this article says they've beaten Apple in a race Apple refused to run.
Duh. Of course they won.
The original blog post has tinges of this framing, which just reads sort of like dunking/hostility to me (for the reason above). This Ars article, and others I’ve seen, just run with it and make it the centerpiece.
Apple isn’t trying. It’s not a race. So framing it like that is a disservice to the reader.
Just celebrate the accomplishment. Write about the fact the team is doing so great. You don’t need to shove it into some incorrect narrative.
Conformant OpenGL 4.6 on the M1 (rosenzweig.io)
https://news.ycombinator.com/item?id=39371669 (100+ comments)