Lambdas and some other complicated-looking extensions have been on the plate for quite some time, and it seems like none of them made it into C23. Lambdas in particular have been promoted for the most part by a single person, AFAICT. "C is becoming everything C++ has" is a drastic misrepresentation.
I think I take the middle position between you and GP: C is slowly working its way towards unnecessary complexity. Most of the 'X language is a C replacement' posts and threads here on HN seem to disregard C's development (particularly from 2017 onward) through committee standards releases. Everyone keeps arguing about Rust, Zig, Odin, etc. and claiming one or the other is clearly more C-like while the others are too complicated/expansive/growth-prone. But no one ever stops to note that the simple C in their heads is not C in 2022, but C in 1990.
C is not yet at the point where added features, idioms, or concepts have totally removed the ability to return to or regularly restrict yourself to some ‘simple’ subset of the language. However, the push to add more and more to the standard does not inspire hope that this situation will remain.
So while I don’t think it’s there yet, it certainly looks like complexity via continual expansion is the path forward. I may not like it, but it does seem to be a reasonable supposition by GP.
I agree, there are some proposals in there that I doubt fit well with the culture. But what has actually been accepted that makes you think the C in people's heads is not C in 2022? In my head, the biggest practical change from 1990 C is declare-anywhere (including declare-in-for-loop), which in my estimation has afforded a lot of added ergonomics at practically zero semantic cost. The next most relevant one is the standardization of the memory model, which is a good one AFAICT (though I've used it only a little so far). One unfortunate addition from 1999, VLAs, was even demoted to an optional feature in the following revision.
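To make the declare-in-for-loop point concrete, here's a contrived before/after sketch (the function names are mine, purely for illustration):

    /* C90 style: declarations must sit at the top of a block, so loop
       counters leak into the enclosing scope. */
    int sum_c90(const int *xs, int n) {
        int i;
        int total;
        total = 0;
        for (i = 0; i < n; i++)
            total += xs[i];
        return total;
    }

    /* C99 and later: declare at first use; i is scoped to the loop. */
    int sum_c99(const int *xs, int n) {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += xs[i];
        return total;
    }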
In 2022 I can jump into basically any C codebase and be immediately productive. Which can't be said of some other languages.
I don't think there is any one addition to C that has moved it too far up the complexity hierarchy. It is the cumulative effect of all the proposals submitted, some of which are approved, and the movement to continue adding more. Like I said, C still enables developers to restrict themselves to a simple and long-existing subset of features and idioms, making coding in C a productive exercise for experienced developers. But the continued push from all directions, with varying likelihood of acceptance, to change the language wears on me, to the point that I continually want to just make a C99 clone with my preferred improvements and use that instead.
Also, I suspect I phrased the 'C in head vs C in 2022' line backwards from what I thought I was saying. In my mind I was saying that the arguments for the simplicity of C had less to do with the language as it exists now and more to do with the closeness of the language to some ISA/actual machine's operating characteristics in the past. It was that closeness that enabled (or necessitated) C's simplicity.
It wasn't about the changes the committee has made to the language since the 1990s. The tie-in point was (in my mind) implied to be that, now that C has become even less of a mapping to actual hardware, the simplicity sought in a C replacement should look different. But commenters seem to be looking at this at a very surface level.
Consider that some of the C23 changes have in fact cleared up ambiguities and removed support for ancient hardware. So I don't see the renewed push behind the standard as purely a road to eventual disintegration. There are a few accomplished and considerate people in WG14; if you visit the page I linked above you'll get a sense of that.
I never particularly cared about ISAs. When I think about the "simplicity" and "closeness to hardware" I mean the control over data layout and control flow. Data layout and the architecture of the global data flow are often the key to 90-100% of the performance gains you can expect; and more or less stable binary interfaces mean interoperability and modularity, minimizing ecosystem lock-in. Needing a specific instruction is the rare case, and for that you're advised to write assembly or use compiler extensions.
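As a sketch of the kind of layout control I mean: struct field order maps directly to memory, so you can eliminate padding by hand. The sizes in the comments assume a typical LP64 ABI:

    #include <stdio.h>

    /* Assuming LP64: 4 bytes of padding after id, plus 4 bytes of tail
       padding, so sizeof is 24. */
    struct padded {
        int id;
        double weight;
        int flags;
    };

    /* Same fields, reordered by hand: no interior padding, sizeof is 16. */
    struct reordered {
        double weight;
        int id;
        int flags;
    };

    int main(void) {
        printf("%zu vs %zu\n", sizeof(struct padded), sizeof(struct reordered));
        return 0;
    }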
Because NULL is dogshit. It's a #define for 0. That's not one way to do an operation; that's one way to do an operation, horribly badly. That int on your stack? Sure, it can equal NULL. Hope that wasn't the result of (2 - 2).
In C, the NULL macro can be defined as either (void*)0 or a plain integer constant like 0; it's C++ that mandates the integer form, since (void*)0 doesn't implicitly convert to other pointer types there.
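A minimal sketch of the footgun (the function name is hypothetical; whether this even compiles depends on how the implementation's <stddef.h> defines NULL):

    #include <stddef.h>

    int was_it_null(void) {
        int x = 2 - 2;
        /* If NULL is plain 0, this comparison compiles silently and is
           true. If NULL is ((void*)0), comparing a non-constant int
           against it is a constraint violation and the compiler must
           issue a diagnostic. */
        return x == NULL;
    }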
The nullptr concept was introduced into C to fix a type ambiguity when NULL is used with generic selection or varargs functions. The ambiguity could have been solved by mandating that NULL be defined as (void*)0. My issue with nullptr is that it's an overkill solution that unnecessarily duplicates the concept of NULL in the language.
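The generic-selection ambiguity looks like this (TYPE_NAME is a made-up macro for illustration; compiles as C11):

    #include <stddef.h>
    #include <stdio.h>

    #define TYPE_NAME(x) _Generic((x), \
        int:     "int",                \
        void *:  "void *",             \
        default: "something else")

    int main(void) {
        /* Prints "int" if NULL expands to 0, "void *" if it expands to
           ((void*)0), and "something else" for e.g. 0L: the selection
           varies across implementations. C23's nullptr has its own
           distinct type (nullptr_t), so selecting on it is portable. */
        printf("NULL selects: %s\n", TYPE_NAME(NULL));
        return 0;
    }

The same implementation-definedness bites varargs: passing a bare NULL where a callee reads a pointer with va_arg is only guaranteed to work if NULL actually has pointer type.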