Vibe coding should be done in Python, and probably only in Python.
If a project is important enough to require C or x86 assembly, where memory management and undefined behavior have real consequences, then it’s important enough to warrant a real developer who understands every line. It shouldn’t be vibe coded at all.
Python’s “adorable concern for human problems” isn’t a bug here, it’s a feature. The garbage collection, the forgiving syntax, the interpreted nature: these create a sandbox where vibe coded solutions can fail safely. A buggy Python script throws an exception. A buggy C program gives you memory corruption or security holes that show up three deployments later.
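To make that contrast concrete, here's a toy sketch (a hypothetical off-by-one bug, not from any real codebase): in Python the mistake surfaces immediately as an exception at the call site, whereas the equivalent out-of-bounds read in C would silently return whatever bytes happen to sit past the end of the array.

```python
# Hypothetical off-by-one bug: the same mistake that silently
# reads adjacent memory in C raises a loud IndexError in Python.
def last_item(items):
    return items[len(items)]  # bug: should be len(items) - 1

try:
    last_item([1, 2, 3])
except IndexError as exc:
    print(f"caught immediately: {exc}")
```

The point isn't that Python code is bug-free, only that the failure mode is a traceback today rather than corrupted memory three deployments later.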
The question isn’t what language should AI write. It’s what problems should we trust to vibe coding. The answer: problems where Python’s safety net is enough. The moment you need C’s performance or assembly’s precision, you’ve crossed into territory that demands human accountability.
I don't think we will ever reach the same accuracy, simply because some languages are harder than others. Writing correct C (not just "working" C) is difficult because of the number of footguns and the amount of reasoning and implicit state tracking you have to do in your head. LLMs are not free from this; if anything, it's harder for them. Meanwhile, languages like Java are more difficult just because they are verbose and spread information over many files. Languages that are explicit, not overly verbose, and make it difficult to make subtle mistakes will always have the advantage with LLMs.
I'm more bullish on the Python -> Rust pipeline. The two languages have a lot of overlap in philosophy, have great interop, and have similar levels of guard rails (when it comes to multithreading, Rust even beats Python in terms of safety). And both languages seem well suited to being vibe coded.
I wouldn't limit it strictly to Python, but I agree that it should be limited to memory-safe languages. Vibe code in C#, Rust, or PHP if that suits your requirements, but C or assembly are indeed poor choices for the reasons you list.
Vibe coded Python can certainly have security holes too.
If you want a language that protects you from the largest amount of problems, how about Rust? Vulnerabilities will still be possible, but at least data races won't be possible.
> The moment you need C’s performance or assembly’s precision, you’ve crossed into territory that demands human accountability.
I disagree. I write a lot of one-off numerical simulations where something quick and dirty is useful but performance matters and the results can be easily verified without analyzing every line of code. Python would be a terrible choice.
Am I the only one who feels like this is the output of an LLM?
A poorly written comment by a human wastes time. A vibe comment by an LLM wastes both time and electricity, a cost that only shows up once global warming reaches 3°C.
The question isn't whether the comment is valuable or not. It's whether it is ethical to waste people's time with AI slop.
Ok but what happens when you reach the point where the LLM makes fewer mistakes than the human? Where it’s better at spotting potential memory corruption and security holes?
It doesn’t feel like we’re very far from that point.