> Ironically, being concerned and skeptical about running random executables from the internet is a good idea in general.
I agree you shouldn't run random executables, but the key word is "random". In this case, Ring Racers is a relatively established and somewhat well-known game, plus it's open-source.
It doesn't guarantee it's not harmful, of course, but ultimately someone with the mindset of "I should never run any programs that aren't preapproved by a big corporation" may as well just stick to Windows/macOS or mobile devices, where this is built into the ecosystem.
Open-source only matters if you have the time/skill/willingness to download said source (and that of any dependencies) and compile it.
Otherwise you're still running a random binary, and there's no telling whether the source is malicious or whether the binary was even built from the published source.
It's no guarantee, but it's a positive indicator of trustworthiness if a codebase is open source.
I don't have hard numbers on this, but in my experience it's pretty rare for an open-source codebase to contain malware. Few malicious actors are bold enough to publish the source of their malware. The exception that springs to mind is source-based supply-chain attacks, such as publishing malicious packages to PyPI, the index behind Python's pip package manager.
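To make the supply-chain point concrete: a lot of those attacks are plain typosquats of popular package names. Here's a toy sketch of the kind of name-similarity check an install-side tool might run (the "popular packages" list is hand-picked for illustration, not a real download ranking):

```python
# Toy sketch: flag requested package names that look like typosquats of
# popular PyPI packages. The POPULAR list is illustrative only; a real
# tool would use actual download rankings.
import difflib

POPULAR = ["requests", "numpy", "pandas", "sqlalchemy", "cryptography"]

def possible_typosquats(name: str) -> list[str]:
    """Return popular packages the requested name is suspiciously close to."""
    name = name.lower()
    close = difflib.get_close_matches(name, POPULAR, n=3, cutoff=0.85)
    # An exact match is the real package, not a squat.
    return [pkg for pkg in close if pkg != name]

print(possible_typosquats("requets"))     # ['requests']
print(possible_typosquats("sqlalchemy"))  # []
```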
You have a valid point that a binary might not correspond to the supposed source code, but I think this is quite uncommon.
> It's no guarantee, but it's a positive indicator of trustworthiness if a codebase is open source.
It's something we as techies like to believe out of solidarity or belief in the greater good, but I'm not sure it's actually justified? It would only work if there's a sizeable, technically inclined userbase of the project, so that someone is likely to have audited the code.
If you're malicious, you can still release malicious software with an open-source cover (ideally without the source including the malicious part, but even then, you can coast just fine until someone comes along and actually checks said source). If you're anonymous, there's little actual downside to detection: you can just try again under a different project.
Remember that the xz-utils backdoor was only discovered because they fucked up and caused a slowdown, not because of an unprompted audit.
> It would only work if there's a sizeable, technically inclined userbase of the project, so that someone is likely to have audited the code.
Not really. There's a long history of seemingly credible closed-source codebases turning out to have concealed malicious functionality, such as smart TVs spying on user activity, or the 'dieselgate' scandal, or the Sony rootkit. This kind of thing is extremely rare in Free and Open Source software. The creators don't want to run the risk of someone stumbling across the plain-as-day source code of malicious functionality. Open source software also generally makes it easy to remove malicious functionality, or even to create an ongoing fork project for this purpose. (The VSCodium project does this, roughly speaking. [0])
Firefox's telemetry is one of the more high-profile examples of unwanted behaviour in Free and Open Source software, and that probably doesn't even really count as malware.
> If you're malicious, you can still release malicious software with an open-source cover (ideally without the source including the malicious part, but even then, you can coast just fine until someone comes along and actually checks said source).
I already acknowledged this is possible; you don't need to spell it out. Again, I don't have hard numbers, but it seems to me that in practice this is quite rare compared to malicious closed-source software of the 'ordinary' kind.
A good example of binaries not matching the published source was SourceForge injecting adware into the binaries it hosted. [1]
> Remember that the xz-utils backdoor was only discovered because they fucked up and caused a slowdown, not because of an unprompted audit.
Right, that was a supply chain attack. They seem to be increasingly common, unfortunately.
Of course this is true. But you can keep going down the rabbit hole. How do you know there isn't a backdoor hidden in the source code? How do you know there isn't a compromised dependency, maybe intentionally?
Ultimately there needs to be trust at some point, because nobody is realistically going to do a detailed security analysis of the source code of everything they install. We do this all the time as software developers: why do I trust that `pip install SQLAlchemy==2.0.45` isn't going to install a cryptominer on my system? It's certainly not because I've inspected the source code; it's because there's a web of trust in the ecosystem (well-known package, lots of downloads, and if there were malware, someone would likely have noticed before me).
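For what it's worth, part of that web of trust is mechanically checkable. A minimal sketch (standard library only; the signals shown are my own illustrative picks, not an established vetting procedure) that pulls a package's public metadata from PyPI's JSON API:

```python
# Minimal sketch: surface a few "web of trust" signals from PyPI's public
# JSON API before running `pip install`. The choice of signals is
# illustrative, not an established vetting procedure.
import json
from urllib.request import urlopen

def trust_signals(package: str) -> dict:
    with urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
        data = json.load(resp)
    info = data["info"]
    return {
        "name": info["name"],
        "latest_version": info["version"],
        # A long release history is (weak) evidence of an established project.
        "release_count": len(data["releases"]),
        "project_urls": info.get("project_urls"),
        # Digests let you check a downloaded artifact against what PyPI serves.
        "sha256_of_latest_files": [f["digests"]["sha256"] for f in data["urls"]],
    }

print(json.dumps(trust_signals("SQLAlchemy"), indent=2))
```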
> still running a random binary
Again "random" here is untrue, there's nothing random about it. You're running a binary which is published by the maintainers of some software. You're deciding how much you trust those maintainers (and their binary publishing processes, and whoever is hosting their binary).
The problem is that on Windows or your typical Linux distro "how much you trust" needs to be "with full access to all of the information on my computer, including any online accounts I access through that computer". This is very much unlike Android, for example, where all apps are sandboxed by default.
That's a pretty high bar; I don't blame your friend at all for being skeptical.
Right, which goes back to the main point: "total control of your computing environment" fundamentally means that you are responsible for figuring out which applications to trust, based on your own choice of heuristics (FOSS? # of downloads/GitHub stars? Project age? Reputation of maintainers and file host? etc.) Many, maybe most, people don't actually want to do this, and would much rather outsource that determination of trust to Microsoft/Google/Apple.
> Right, which goes back to the main point: "total control of your computing environment" fundamentally means that you are responsible for figuring out which applications to trust, based on your own choice of heuristics
Hard disagree. Total control of my computing environment would mean being able to allow an application access to my documents, a space to save its configuration, perhaps my Videos folder, or even just certain files in that folder. Or, conversely, to deny it.
At the moment, none of the desktops give me the ability to set a level of trust for an application. I can't execute Dr. Robotnik's Ring Run (or whatever the example was) and specify what it can or cannot access. There may be a permission request for system-level access, but that can be explained away, as it usually is for iOS and Android apps requesting some scary-sounding permission group.
And it also doesn't stop malware from accessing my documents. Sometimes my Mac asks whether an application is allowed to access Documents, but the prompting isn't consistent.
> they are hidden away inside the settings, and they are not granular.
The switches default to off, though, with a prompt on the first attempt to access the protected resource.
The problem is that they're leaky as a sieve, and how the permission model and inheritance work is unclear. (I once had the Terminal app ask me for permission; does that now mean anything I run from the terminal automatically inherits it? And so on.)
> Open-source only matters if you have the time/skill/willingness to download said source (and that of any dependencies) and compile it.
Not really. The fact that an application is open-source means its originator can't rug-pull its users at some random future date (as so often happens with closed-source programs). End users don't need to compile the source for that to be true.
> Otherwise you're still running a random binary, and there's no telling whether the source is malicious or whether the binary was even built from the published source.
This is also not true in general. Most open-source programs are available from an established URL, for example a GitHub archive with an appropriate track record. And the risks of downloading and running a closed-source app are much the same.
The kind of rug-pulling you describe only works if the software implements an online licensing check/DRM, and either way has nothing to do with security against malicious behavior.
> GitHub archive with an appropriate track record
How do you judge the "track record"? GitHub stars can be bought. Marketing can be used to inflate legitimate usage of a program before introducing the malicious behavior.
> the risks of downloading and running a closed-source app are much the same
But that's my point: open-source doesn't really change the equation there unless you're actually auditing, building, and running said source. If you're just relying on a binary download, you're no better off than if you'd downloaded proprietary software in binary form.
> The kind of rug-pulling you describe only works if the software implements an online licensing check/DRM, and either way has nothing to do with security against malicious behavior.
My point was that an open-source program cannot rug-pull its users without leaving them the obvious remedy of forking the project and removing the offending code. With open source, that remedy is commonly exercised; with closed source, it's not possible and often illegal.
For both options, you have to trust the source, which makes that a non-issue. You can checksum the Linux kernel to satisfy yourself that it came from a trusted source. You can checksum the Windows kernel to satisfy yourself that you're about to be screwed.
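And to be precise about what that checksum buys you: it proves the file you have matches the file the publisher put out, not that the contents are benign. A minimal sketch of the verification step (the file name and expected digest are placeholders; kernel.org publishes real digests in a signed sha256sums.asc file):

```python
# Sketch: verify a downloaded tarball against a published SHA-256 digest.
# This proves integrity/provenance, not that the contents are benign.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder value: take the real digest from the project's signed
# checksum file, e.g. sha256sums.asc on kernel.org.
EXPECTED = "0000000000000000000000000000000000000000000000000000000000000000"

actual = sha256_of("linux-6.9.tar.xz")
print("OK" if actual == EXPECTED else "MISMATCH: do not use this file")
```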
> But that's my point: open-source doesn't really change the equation there unless you're actually auditing, building, and running said source.
In the open-source world, knowing how computers work is essential. In the closed-source world, knowing how computers work is somewhere between pointless and illegal. This is how open-source "changes the equation."
Modifying open-source code is welcome and accepted. Modifying closed-source code breaks the law. Take your pick.