Windows is not exactly free of this sort of nonsense either. Just recently I built a new PC for a friend, and we wanted to keep using his old SSD and Windows installation. After messing about with BitLocker recovery keys, which was already cumbersome enough, we ran into a catch-22: we needed internet access to log in and verify his Microsoft account, but installing the driver for the new motherboard's networking chipset required being logged in to an account first. Eventually we found that USB tethering from a phone, which needs no special driver, gave us internet access and got around the issue, but it was not exactly an obvious solution.
That 20% is mostly covered by competitive online multiplayer games that use kernel-level anti-cheat systems which will only work on Windows. There's not a whole lot Valve can do about that, other than continuing to push Linux for gaming and hoping it gets popular enough to create an incentive for anti-cheat providers to start targeting Linux as well.
I never understood why game devs don't just segregate players based on their anti-cheat status. Have a setting in the game like "only play with anti-cheat verified players" that defaults to yes.
That way Linux gamers can still play with other Linux gamers if they want (along with the cheaters).
Not an ideal situation but probably better than nothing.
I think that would make Linux players into second-class citizens who could only play in a pool that is 90+% filled with Windows cheaters.
Segregating into two pools: Windows-verified, or Linux-unverified, would probably not work for Linux users either. It'd be the same problem (on a smaller scale) as not including kernel anticheat in Windows. No fun for the non-cheaters.
I'm not a gamer though, so I may be missing important details.
The real "a-ha" demystifying moment for me was not so much learning about the elementary rotation, translation or even perspective projection operations. It was understanding how all of those operations can be composed together into a single transformation and that all that 3D graphics really is, is transforming coordinates from one relative space to another.
One important revelation in that regard for instance, was that moving a camera within a world is mathematically exactly the same as moving the world in the opposite direction relative to the camera. Once you get a feel for how transformations and coordinate spaces work, you can start playing around with them and a whole new world of possibilities opens up to you.
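To make that revelation concrete, here's a minimal sketch (hypothetical helper names, not from any particular engine) of why moving the camera by +t and moving the world by -t relative to the camera are the same operation:

```java
// Minimal sketch: converting a world-space point into camera space.
// "Moving the camera forward" and "sliding the world backward" are
// the same subtraction, just viewed from different perspectives.
public class ViewTransform {
    // Transform a 2D world-space point into the camera's space.
    static double[] toCameraSpace(double[] point, double[] camera) {
        // Instead of moving the camera by +camera, we move the world
        // by -camera: subtract the camera's position from the point.
        return new double[] { point[0] - camera[0], point[1] - camera[1] };
    }

    public static void main(String[] args) {
        double[] point  = { 5.0, 3.0 };  // a landmark in the world
        double[] camera = { 2.0, 1.0 };  // where the camera sits

        double[] inView = toCameraSpace(point, camera);
        // The landmark appears at (3, 2) relative to the camera,
        // whether you think of the camera as having moved or the
        // world as having moved the opposite way.
        System.out.println(inView[0] + ", " + inView[1]);
    }
}
```

The same idea scales up to full 4x4 view matrices: the view matrix is just the inverse of the camera's world transform.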
Though in the real-world case, there's an important difference that breaks the symmetry: You experience acceleration, whereas everybody else standing around you doesn't.
Id Software very much skirted the edge of legality by making Commander Keen outside of office hours while still employed by Softdisk, and using Softdisk computers; Softdisk could easily have sued them if it had wanted to. They avoided that by striking a deal under which the id guys would continue to make games for Softdisk while working on Keen and later Wolfenstein 3D.
There was a lot of code reuse between games. John Carmack is on record somewhere saying that the enemy navigation code in Doom and Quake still had its origins in some of the earliest 8-bit games he wrote in the 1980s.
Honestly, I think that if Steve Jobs had lived, he would have continued to push the industry in a direction more aligned with his tastes, others would have followed suit, and whatever hot topics we'd be discussing today, they would be very different from the ones we are discussing now.
I think he would have been all over AI, and would have pushed Siri ahead instead of letting the product stagnate. I suspect he'd have pushed into robotics as well, especially home automation robots. Home automation in general, in fact.
His whole thing was being the smartest, most tasteful, and most creative person in the room. There was a lot of illusion/delusion there, but even with his failures he was absolutely focused on product design, user experience, and aesthetics in a way that Cook's Apple isn't.
Cook's Apple is a hugely successful predatory and cynical cash extraction bureaucracy, with a world-leading hardware division and a shockingly mediocre and failing software division.
The goal is penny-pinching acquisition, so we can expect more and more of this from Apple until there's a change of leadership. (If we're lucky...)
The magic of Jobs is/was that he truly was a self-starter and a self-taught man; he had the rare mix of traits necessary to be a visionary.
Frankly I think Jobs saw Cook as a key operator to ensure the firm's future survival and growth; I'd imagine Jobs foresaw the tremendous impact the smartphone would have, and all Cook had to do was be a shrewd operator, since Apple had built such a huge advantage over competitors by the time Jobs died.
He seemed very content in the end that Apple is on the right track and set up correctly for the future. I don't think he was talking about profit margins, but rather about the soul of the company, if there is such a thing.
Sad but probably true. I hadn't really considered that aspect. Anyone so influential no doubt changed the whole Zeitgeist, not just their own company's course.
I had this epiphany last year when I went through some old holiday pictures and saw a photo of a monument in a location that I had no memory of. So I spent some time retracing our steps on that day, based on other pictures from around the same time, and places that I knew we visited. It took a while, but eventually I managed to zero in on the place and felt pretty satisfied as I starred the location on Google Maps.
Since the monument in question was somewhat relevant to my work, I shared the picture in my company chat and asked if anyone had seen it and knew off the top of their head where it was. Almost immediately one colleague threw the picture into an AI reverse image search and instantly came up with where it was and what the monument represented. I was incredibly annoyed at that; not because someone was able to come up with the answer much faster than I did on my own, but because it took the FUN out of the whole thing.
That's when I realized that my instinctive dislike for AI is because it takes the fun out of everything for me. The process of figuring out where this photo was taken was much more rewarding than the eventual answer. Similarly, when programming I take pleasure in figuring out difficult problems and coming up with elegant solutions for them. Writing the actual code isn't the interesting or difficult part, and I don't need an AI to do that for me. AI is being hyped up by people who are not interested in the process of learning and understanding and who just want a quick shortcut to the answer, completely missing the point in my opinion.
You should try GeoGuessr if you haven't already. But there as well there are things that ruin the fun: "meta" knowledge, like if you see this tear in the sky, or this or that Google car then it's this country. I purposefully avoid learning any meta because that's not what makes GeoGuessr fun. The fun part is integrating several vague clues and arriving at the right conclusion.
Excellent realization. AI is fundamentally destructive, and is one reason I never use it, ever. Automation was never meant to infringe upon the creative domain, only the truly mechanical.
It can be better. On slop detection, shadowban the offender and have them discuss with two AI "maintainers", then after 30 messages reveal the ruse. Then ban.
We Europeans have been extremely naive and complacent for the past decades. We've been all too happy to rely on America for our tech and defense needs, and America has been all too happy to provide. There have been plenty of warning signs in the past that this situation is not as self-evident as we'd like to believe and not sustainable in the long term, but so long as things kept churning along, we'd rather just ignore the problem than proactively tackle it and become more self-sufficient.
And yes, now that America is showing its true colors with Trump leading the way, finally, finally we are starting to see that maybe this isn't how we should want things to be. It's still going to take a long time for Europe to shake off its dependence on the US tech industry and truly start to challenge it, but hopefully this is a wake-up call that will gradually push things in the right direction.
Several years ago we had issues with certification of our game on PS4 because the capitalization in Sony's Turkish translation for "wireless controller" was wrong, the culprit being the Turkish dotless I.

What was the cause? Some years prior we had had issues with internal system strings (read: stringified enums) breaking on certain international PCs because they were being upper/lowercased using locale-specific capitalization rules. As a quick fix, the choice was made to set the culture info to invariant globally across the entire game. This of course meant that all strings were now being upper/lowercased according to English rules, including user-facing UI strings, hence Turkish strings mixing up dotted and dotless I's in several places.

The solution? We just pre-uppercased that one "wireless controller" term in our localization sheet, because that was the only bit of text Sony cared about. An ugly fix, and we really should have gone through the code to properly separate system strings from UI text, but it got the job done.
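For anyone who hasn't run into it, the Turkish casing trap is easy to reproduce in any locale-aware runtime. Our game wasn't written in Java, but this sketch shows the same behavior:

```java
import java.util.Locale;

public class TurkishI {
    public static void main(String[] args) {
        Locale turkish = new Locale("tr", "TR");

        // Locale-aware casing: Turkish maps 'i' to dotted capital 'İ'
        // and 'I' to dotless lowercase 'ı'.
        System.out.println("wireless".toUpperCase(turkish)); // WİRELESS
        System.out.println("INPUT".toLowerCase(turkish));    // ınput

        // Locale-invariant casing (the "quick fix" from the story above):
        // safe for internal identifiers, wrong for Turkish UI text.
        System.out.println("wireless".toUpperCase(Locale.ROOT)); // WIRELESS
    }
}
```

The general lesson is to use invariant casing for machine-facing strings (enum names, keys) and the user's locale for anything displayed, rather than one global setting for both.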
This all reminds me a lot of the early 2000's, when big corporations thought they could save a lot of money by outsourcing development work to low-income countries and have their expensive in-house engineers only write specifications. Turns out most of those outsourcing parties won't truly understand the core ideas behind the system you're trying to build, won't think outside the box and make corrections where necessary, and will just build the thing exactly as written in the spec. The result being that to get the end product you want, the spec needs to be so finely detailed and refined that by the time you get both specification and implementation to the desired quality level, it would have been the same amount of effort (and probably less time and frustration) to just build the system in-house.
Of course outsourcing software development hasn't gone away, but it hasn't become anywhere near as prevalent and dominant as its proponents would've had you believe. I see the same happening with AI coding - it has its place, certainly for prototyping and quick-and-dirty solutions - but it cannot and will not truly replace human understanding, ingenuity, creativity and insight.
Somebody will have to add some syntax to these LLM prompting systems to include text that doesn't get converted, so the next round of PMs can ask you to document your prompts.
> As you say, by the time you specify everything, you've written the code
Sadly not when it’s a 2000-page Word document with a million formatting quirks and so many copy-paste versions you don’t know whether to trust “reqs_new25”, “reqs_new25v1”, or the x number of other copies floating around.
Then remember when we said tests should be the specs?
Then we said the end users are the specs?
All of them can be construed as a joke in our erratic search for the correct way to write software without those $150k developers that seem to be the only ones getting the job done, assuming they have a competent management hierarchy and stock options incentives.
[1] We have a waterfall software and I wonder whether Crockford’s license “JSON should be used for good, not evil” was meant for me
I think this erratic manner of trying to find the correct way is the issue. I am nearing my second year at company A in my industry, and while I did know they all kinda suck in their own special way, I honestly had no idea it was this bad until I had to try to make this craziness somehow work for us. Even where standards exist, I do not see people following them. Last year, the one girl who did seem to try to follow some common-sense approach got fired for effectively using common sense against the big boss's wishes.
What I am saying is that it is a mess from beginning to end, and I am honestly not sure there is any one factor that could solve it.
> Last year, the one girl, who did seem to try to follow some common sense approach, got fired for effectively using common sense against big boss wishes.
What did your manager say when you said this to them?
Sigh, sadly, I didn't, because I found out largely after the fact and, more amusingly, after a record profit year at the company, with the project in question clearly being a major part of everyone's goals (not mine; I was kinda roped in at the last stages).
that girl doing stuff is one thing, but if you're so scared you can't even ask your direct boss (not the big boss, obvs) a question, like, yo, I dunno.
Do any of these vibe coding tools write out the prompts as specs and then keep the specs up to date as you continue prompting? Seems like specs == formal prompts.
You don't need a tool for that. "You're going to assist me in writing a detailed software spec in markdown. At each step adjust the document to incorporate new information. Suggest improvements and highlight areas which have been ignored so far. Initial description: ..."
If you have multiple of those, you can tell it about required sections / format, or provide a good past example.
> This all reminds me a lot of the early 2000's, when big corporations thought they could save a lot of money by outsourcing development work to low-income countries and have their expensive in-house engineers only write specifications
I worked at [insert massive retailer here] a few years ago and this mindset is still very much alive.
In my experience, directly hiring (or hiring through a local company) developers in a “low-income country” (Eastern Europe and Latin America, in my case) goes a lot better than just contracting out a body of work to a third party. Especially if your company is already fully remote, you’re able to get developers who integrate onto your team just like American devs, and are just as good at coding.
No, when I was at a large, well-known company a year ago, job listings were 2:1 off-shore (India, South America) vs on-shore. There was also an increasing amount of contractors used, even for public-facing stuff you wouldn't expect.
It’s sad you’re getting downvoted by gatekeepers. It’s absolutely a good thing that more people have access. Maybe not for inflated coastal salaries and egos, however.
Software is going to completely change. The writing is on the wall, it's just a matter of who can step back to see it.
Giga-projects with 50k, 100k, 500K, lines of code are going to get decimated. By and large these programs are written to capture large audiences by offering a massive feature set. But how often is any one user ever actually needing those 100k LOC?
If LLMs can start nailing 5K LOC projects pretty reliably, with the technical moat cleared away (e.g. needing an IDE to run the code), software businesses are going to see a collapse in users as people just churn out bespoke single-task software for their own use case each day.
If you think it's fine because you can't prompt Gemini "Create an excel clone", you are doing the equivalent of drawing a robot with a vinyl record player in 1950 and calling it "Entertainment of the future!". In a world with functional robots, portable vinyl players for it to play music make no sense.
> it's just a matter of who can step back to see it
I’m not sure why those fully bought into the AI hype so often dismiss anyone who’s less bullish as simply too small-minded to see the writing on the wall.
That aside, while I do think software is going to change, I’m not sure that I agree with your particular version. What examples of apps with 100k, 500k+ LoC codebases are going to get decimated? The biggest successes in software today have moats around inventory or network effects. Are people going to make their own Netflix or Uber? Even at a smaller level, is the idea that local gym owners are going to replace gym management software with their own mini apps? Or restaurants are going to do their own point of sales apps? Unless AI can make the cost of maintaining software something close to zero time, which is a really tall order, why would business owners waste time on something outside of their core business. And if this is such an untapped opportunity why didn’t we see the start of it with the no-code movement?
Will a big chunk of the software market fall off and be replaced by custom mini apps that the layperson crafts with AI? Maybe? But I don’t see how one justifies the level of confidence I see from these predictions.
Because, as we all know, what is important about coding is outputting lines of code.
This is the main difficulty, and this is where AI will change the game forever.
Thinking about the code you write? That's worthless, do not do that. Improving a large code base? Again, no need to think to do that. Just inject random lines of code and it will do the trick.
Of course, in a world where your code has no value because it's just a copy/paste from elsewhere with a low life expectancy, AI shines.
But if you want something a bit more useful, a bit more clever, a bit more adapted to your use case, AI sucks. Because AI does not think.
AI is a threat to devs who do not think. Good riddance; they won't be missed.