Hacker News | kryptiskt's comments

Wasn't it just a couple of weeks ago that they started supporting M3? That smells like progress to me.


One can start working on creation of a teleportation device. Doesn't mean we have it.

https://asahilinux.org/docs/platform/feature-support/m3/

What do you see as progress here? Nothing is supported, everything is "to be announced" (i.e. unsupported).


They likely meant this progress post showing a desktop booting on an M3 Mac (albeit with software-rendered graphics): https://www.reddit.com/r/AsahiLinux/comments/1qnddjd/m3_now_...


Looks believable that they are indeed the devs behind the project, but it's weird to post stuff like that to... reddit? They have a site for the project, why not post there?


You could read the updates... https://asahilinux.org/2025/10/progress-report-6-17/

Not marketing a not-yet-complete feature on their website makes total sense. People on the internet hating Asahi Linux just because seems weird to me.


> New MICROVM kernel for x86, supporting both i386 and amd64. NetBSD 11.0 introduces a dedicated MICROVM kernel designed for extremely fast virtual machine boot, leveraging PVH boot, VirtIO MMIO, and multiple kernel optimizations; it can boot in about 10 ms on 2020-era x86 CPUs.
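For context, QEMU exposes a matching `microvm` machine type. A hypothetical invocation might look like the sketch below; the kernel image name, boot arguments, and disk path are placeholders, and the flags are taken from QEMU's microvm documentation rather than from NetBSD's release notes:

```shell
# Sketch, not tested: boot a guest under QEMU's minimal 'microvm' machine
# type (no PCI/ACPI overhead), with a direct PVH kernel boot and a
# virtio-mmio block device. Kernel filename, boot args, and disk image
# are placeholders.
qemu-system-x86_64 \
    -M microvm -cpu host -enable-kvm -m 256 \
    -kernel netbsd-MICROVM \
    -append "console=com0" \
    -drive id=hd0,file=disk.img,format=raw,if=none \
    -device virtio-blk-device,drive=hd0 \
    -nodefaults -no-user-config -nographic
```

Skipping the firmware/BIOS stage via direct kernel boot is a large part of where the ~10 ms figure comes from.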

Exciting


This sounds pretty cool. I have a couple of old Dell 630s that were automotive diagnostic computers, since they were the last model with a real hardware serial port. Now I'm thinking of reviving them with Linux, but just to host old Windows VMs (all the auto diag software is Windows-only). Maybe I should give NetBSD a try here.


Why would that be? Do you expect this to make your Windows VMs start up faster?

It won't. This applies only to NetBSD guest VMs.


great callout


> If you want to code by hand, then do it! No one's stopping you. But we shouldn't pretend that you will be able to do that professionally for much longer.

Bullshit. The value in software isn't in the number of lines churned out, but in the usefulness of the resulting artifact. The right 10,000 lines of code can be worth a billion dollars, the cost to develop it is completely trivial in comparison. The idea that you can't take the time to handcraft software because it's too expensive is pernicious and risks lowering quality standards even further.


I thought typos were a signifier of human-created text these days, because an LLM is unlikely to land on something that is not a word.


I agree with you. I'll speculate no further.


If you are a bank or a bookmaker or similar you may well want to have total control of physical access to the machines. I know one bookmaker I worked with had their own mini-datacenter, mainly because of physical security.


I am pretty forward-thinking but even when I started writing my first web server 30+ years ago I didn’t foresee the day when the phrase “my bookie’s datacenter” might cross my lips.


Most trading venues are in Equinix data centers.


It works like this if you want logically separated commits from a chunk of programming you have done: stage a file or a hunk or two, write a commit message, commit, rinse and repeat.
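The loop above can be sketched in a throwaway repo (file names and commit messages are made up; in real use you'd reach for `git add -p` to pick individual hunks interactively):

```shell
set -e
# Sketch of the "stage a piece, commit, repeat" loop in a temporary repo.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo 'parser fix' > parser.c
echo 'new test'   > parser_test.c

git add parser.c                          # stage only the code change...
git commit -qm "parser: handle empty input"
git add parser_test.c                     # ...then the test as its own commit
git commit -qm "tests: cover empty-input case"

git rev-list --count HEAD                 # prints 2: two logical commits
```

With uncommitted edits spread across one file, `git add -p parser.c` lets you stage some hunks and leave the rest for a later commit.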


Absolutely: for all meaningful work I prefer small, logical commits using git add -p or similar, both for history clarity and for reviewer sanity. In initial “spike” or hack sessions (see early commits :)), it’s sometimes more monolithic, but as the codebase stabilized I refactored to have tidy, atomic commit granularity. I welcome suggestions on workflow or PR polish!


C# doesn't depend on a VM these days when it is AOT compiled. Same for Java, though C# is rather more user friendly in how it goes about it.
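As a sketch of what AOT compilation looks like in practice (assuming the .NET 8+ SDK; `myapp` is a placeholder project name):

```shell
# Hypothetical walkthrough: publish a C# console app as a native AOT binary.
# Requires the .NET 8+ SDK; 'myapp' and the target RID are placeholders.
dotnet new console -n myapp
cd myapp
dotnet publish -c Release -r linux-x64 -p:PublishAot=true
# The publish output is a self-contained native executable: the IL is
# compiled ahead of time, so there is no JIT at runtime.
```

The `PublishAot` property is the documented switch for .NET's Native AOT toolchain; the resulting binary still carries the runtime pieces it needs (GC, type system) linked in.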


> C# doesn't depend on a VM these days when it is AOT compiled

Maybe I'm being pedantic, but this is an oxymoron. Also, the premise is incorrect: it's not like the VM is gone, it's merely baked into the code at compile time. AOT compilation translates IL to native code, same as IL2CPP, but the VM is still there.

The term “virtual machine” is confusing. I think you meant to say JIT compiler :-)


The trouble with that level of pedantry is that you can then point to LLVM as a VM and say that Clang and other C/C++/Rust tools that AOT-compile through LLVM are "too tied to a virtual machine". Then you can go back through the history of cross-platform optimizing C/C++ compilers and find VMs in the design of almost all of those, too. LLVM is not hiding it in its very clear name, and low-level VMs were a thing for decades before someone named LLVM.

VMs have a long history in cross-compilation, even for "low-level" languages like C/C++. The AOT versus JIT distinction is blurry, and the "VM language" versus "non-VM language" boundary is blurrier still, especially when you take into account "standard runtimes" such as glibc and vcrt and whether or not those are statically linked.

Is a C program compiled with Clang through LLVM, dynamically linking glibc and statically linking a Boehm GC library, "running in a VM"? There's no wrong answer, it's a lot of shades of gray. I believe almost every pedantic way to answer that has an equally pedantic counter-argument.


IBM had such a culture back in the day, where they feted 1 kloc/day programmers. That was what Bill Gates sneered at with the "Measuring software productivity by lines of code is like measuring progress on an airplane by how much it weighs" quote.


But why? It's not all that much harder to do original work than redrawing the output of an AI.


I'm not an artist, so I won't comment on the "ease", but if it were true, then why would a ban be required?

Regardless, you should check out the AI features in the Adobe products [1]: generative removal, fill, etc. [2]

AI, in modern tools, is not just "draw the scene so I can trace it".

[1] https://www.adobe.com/ai/overview/features.html


Games Workshop is more entrenched in their niche than any of the FAANGs. They can do what they want, because nobody else can do WH40K.


I can guarantee you that there are more than a few small producers in Guangzhou that can, and are using whatever advantage they can leverage (including AI, like the rest of China's industry).


It's a license and IP issue, nothing technical; otherwise 3D printers would have put GW out of business overnight.


It's like saying that because even the newest Pokemon game has shitty textures, which anyone could improve on with AI, you can compete against Pokemon.

People don't buy merch of these well-established IPs for its astonishing production value. And especially not for how cheap it is.

