I added the Lua scripting layer to FS2Open. It’s very useful for not-quite-coders. There are a lot of projects where the founders end up doing everything because the other contributors don’t stick around permanently.
I don’t see the yearly releases as saying you have to upgrade. Rather, having a consistent cadence makes it easier for the supply chain, and the short iteration time means there’s less pressure to rush something in half-baked or delay a release.
It should be, but a lot of developers don’t have formal security training, and neither does management, which often ends up selecting the contractors/developers and deciding on the technical approach.
If it’s explicitly not production ready, it should probably say so up front, not advertise itself as “strong encryption”. However painful that may be.
There's a lot of overhead as soon as you involve a filesystem rather than a block device, even on a dedicated disk, particularly with btrfs. I don't know if the same is true with macOS and APFS; this isn't the area I usually work in. However, copy-on-write filesystems (which I believe APFS is) are somewhat predisposed to fragment files as part of the dedup process; I don't know if APFS runs it online in some way, so it could have affected the article author's results.
Standard-library implementation details can also have a huge impact; e.g., I observed this with Rust on a prior project when I started fiddling with the read buffer size:
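Something along these lines (a minimal sketch, not the original project's code; it only uses std, and the absolute numbers will of course depend on the machine, filesystem, and file):

```rust
// Minimal sketch: read the same file line by line with different BufReader
// capacities. The capacity controls how large each underlying read() syscall
// is, which is the knob being fiddled with here.
use std::fs::File;
use std::io::{BufRead, BufReader};
use std::time::Instant;

fn bench(path: &str, capacity: usize) -> std::io::Result<()> {
    let file = File::open(path)?;
    let mut reader = BufReader::with_capacity(capacity, file);
    let mut line = String::new();
    let mut bytes = 0u64;
    let start = Instant::now();
    while reader.read_line(&mut line)? > 0 {
        bytes += line.len() as u64;
        line.clear();
    }
    let secs = start.elapsed().as_secs_f64();
    println!("{:>9} B buffer: {:.1} MiB/s", capacity, bytes as f64 / secs / (1024.0 * 1024.0));
    Ok(())
}

fn main() -> std::io::Result<()> {
    let path = std::env::args().nth(1).expect("usage: bench <file>");
    // Note: later iterations benefit from the page cache; for a fair
    // comparison, drop caches between runs or benchmark each size separately.
    for capacity in [8 << 10, 256 << 10, 4 << 20] {
        bench(&path, capacity)?;
    }
    Ok(())
}
```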
The other issue I see is that their I/O is implicitly synchronous and requires a memory copy. They might see better performance if they can memory-map the file, which could solve both issues. Then, if C# allows it, they can just parse the CSV in place; with a language like Rust you can even do this trivially in a zero-copy manner, though I suspect it's more involved with C#, since it requires setting up strings / parsers that point into the memory-mapped file.
At that point, the OS should theoretically be able to serve up the cached file for the application to do some logic with, without ever needing to copy the full contents again into separate strings.
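For illustration, here's a minimal Rust sketch of that zero-copy idea. It assumes the memmap2 crate, and summing the second column of a made-up CSV layout is just a placeholder for whatever the real logic would be:

```rust
// Minimal sketch: map the file and parse it in place. Every field is a &str
// borrowed directly from the mapping; nothing is copied into owned Strings.
// Assumes the `memmap2` crate as a dependency (not part of std).
use memmap2::Mmap;
use std::fs::File;

fn main() -> std::io::Result<()> {
    let path = std::env::args().nth(1).expect("usage: sum <file.csv>");
    let file = File::open(path)?;
    // Safety: the file must not be truncated or modified by another process
    // while the mapping is alive.
    let map = unsafe { Mmap::map(&file)? };
    let data = std::str::from_utf8(&map).expect("input must be valid UTF-8");

    let mut sum = 0f64;
    for line in data.lines() {
        // `line` and `field` are slices into the mapped memory, not copies.
        if let Some(field) = line.split(',').nth(1) {
            sum += field.trim().parse::<f64>().unwrap_or(0.0);
        }
    }
    println!("sum of column 2: {sum}");
    Ok(())
}
```

The only allocations here are whatever the parser materializes (the parsed floats); the mapping is backed by the page cache, so a warm file is never copied wholesale into userspace strings.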
C# has an abstraction for memory-mapped files. You can always use raw pointers and directly call the corresponding OS APIs with interop too.
However, the fastest-performing C# implementations in the 1BRC challenge ended up with inconclusive results as to whether memory-mapping is faster than the RandomAccess.Read API (which is basically a thin wrapper over read/pread calls): https://github.com/noahfalk/1brc/?tab=readme-ov-file#file-re...
You can relatively easily do 2 GiB/s reads with RandomAccess/FileStream as long as a sufficiently large buffer is used. FileStream's default settings already provide quite good performance and use an adaptive buffer size under the hood. Memory-mapping is convenient, but it's not a silver bullet (in this context): page-faulting, then mapping the page and filling it with data by performing the read within kernel space, is not necessarily cheaper than passing a pointer to a buffer to read into.
The challenges in Rust and C# are going to be very similar in this type of task, since C# can just pin the GC-allocated arrays to read into, or call into malloc or 'stackalloc' the temporary buffer inline, and the rest of the implementation will be subject to more or less identical constraints. C# is probably the closest* "high-level" language to Rust in feature set, even if this sounds strange. There's a sibling submission that covers another angle on this: https://news.ycombinator.com/item?id=41963259
* have not looked through Swift 6 changes in detail yet
I have tried to use it for C++, iOS, Python, Flutter, Docker, macOS binaries, an Ubuntu container, macOS, and NixOS. In every case it became a time sink that failed to work due to broken packages and convoluted code.
Core issues are that:
(1) To meet nix’s goal of declarative package management, everything in nix wraps software to create a bespoke interface for nix. But the documentation for the nix interface is extremely spotty and inconsistent, where it exists at all.
(2) The language is a mix between functional and shell code, to create a declarative spec. This means many mental gear shifts while reading. This is made even worse by the nixpkgs API being inconsistent, with case-specific variants of functions (eg to set a property on a package you might need to call a language-specific property setter, which is very confusing when a package is built with multiple languages)
(3) Many packages are broken or unsupported on one OS or the other.
Because of these, you need to be a coder AND have deep Linux knowledge AND deep Nix knowledge to use it for an extended period of time.
And then:
(4) Nix does not cleanly integrate with the packaging for a language, or expects running an AOT tool to generate a bespoke Nix derivation from the standard tooling
I and others have complained about Nix being too difficult to use for years, but the Nix community instead is more excited about flakes, which are even more convoluted than existing derivations and metastasize the existing architectural issues in nixpkgs into countless decentralized packages that will now need to be refactored if someone tries to overhaul the design of nixpkgs.
From my perspective, nix flakes are like if Linux was struggling with adoption because its internal and user APIs were undocumented and frequently broken, and the kernel developers got really excited about moving all the drivers out-of-tree into separate repositories because that’s “best practices”.
A killer cross-platform and project / system package management system with hermetic build environments would be a godsend. But nix’s practical implementation of that is too badly done for me to have ever been able to use it for anything but the tiniest most specialized of projects, or as a partial package manager for macOS (where I still have to install things manually or use homebrew).
There are a great many things that are theoretically possible with Nix, but I don’t know anything that I could practically recommend it for.
> Many packages are broken or unsupported on one OS or the other.
Report them. There are thousands of packages. Efforts like Zero Hydra Failures help, but there's still a lot to fix. If you raise an issue, it will be prioritised, because we'll know someone actually cares about it.
Keep in mind that's compared to other systems where only a small percentage of those packages are available at all and you're on your own - which isn't much different from being on NixOS with a broken package.
The worst case that comes to mind was iOS development. In that case there was supposedly packaging available that leveraged the Xcode cli, but it had ceased to be maintained and once I started fixing stuff, I began running into even deeper issues that made me question whether it ever truly worked.
In other cases it was things like mach-nix not working for certain Python packages, an evident refusal to upstream something like mach-nix that tries to work with standard upstream Python packaging, and the default Python API being a mess to work with.
Thus, in the latter case, while from the Nix POV nothing in nixpkgs was "broken", it working as designed left me choosing between investing large amounts of time to figure out how to write a derivation for the third-party Python script I wanted to work on, or dealing with third-party tools - and I expected all of that to be a rathole too. I wasn't looking to push some tooling on the third-party maintainer or develop something I'd just throw away; I wanted to add a relatively simple feature. As it was, I wound up spending the free time I had troubleshooting package management and ran out of time to actually get work done. I did file an issue with mach-nix, but the person who helped me also ran into problems.
This kind of “brokenness in depth” is exactly what I ran into when I tried to troubleshoot and fix nixpkg’s iOS support over a few months.
I have actually contributed a couple of packages back, and I wrote a brew-like derivation adapter (which may or may not still work on one of my macOS devices) that handled standard macOS installers/install archives (copy to the Applications folder). But it got broken by an upstream update, and fixing it became involved. IIRC whatever I did to fix it made it unsuitable for upstreaming, and when I did get another macOS device, I didn't even try to use Nix for applications that weren't immediately supported.
Most recently I tried to create an Ubuntu container with USB forwarding under NixOS. Once again: bespoke solution, scant documentation, and GPT-4o got confused. I failed, and spent less time copying my files off and installing Ubuntu than I did trying to get it to work under NixOS.
So this isn’t just one-off packages being broken that a maintainer needs to fix. It’s a deeply pervasive thing with the entire nix ecosystem that requires the community to internalize a need to make things work obviously and on the first try for common use cases.
I don't know anybody whose problem is "people want to pay me to work on Nix, with no output from the work other than more Nix", so I can't recommend Nix to anybody as a solution. You can't solve a problem with more problems. And after the container incident, I'm truly lost as to what people ARE doing with Nix besides just developing Nix, because I thought it was supposed to be most popular in DevOps contexts.
And yet the very last thing I would entrust to nix at this point would be a production service whose infrastructure needs to be completely understood and rapidly fixable, because the abysmal documentation means a minor outage could turn into a major catastrophe while people google for some obscure forum post or decipher nixpkgs layer by layer.
Yeah, that would be a problem for a few reasons, but I'd point that rant at Apple "releasing" incomplete and broken sources. This whole area has been changed massively recently by a few Darwin heroes and will be released in 24.11. It's also going to be much easier to maintain in the future.
So if you're interested, try the new Apple SDKs in a couple of months.
FTR, it solves quite a few of my daily issues, especially in DevOps contexts where brew is unusable.
> FTR, it solves quite a few of my daily issues, especially in DevOps contexts where brew is unusable.
Perhaps there needs to be more explicit scope communicated. Right now the nix(os) website advertises several use cases that it just doesn’t seem ready for (but this has been the case for ~5 years now, so I don’t expect it will ever be ready).
When I've asked, e.g., "is this ready for Flutter development?", I was told yes, then immediately ran into brick walls when I started trying to develop with it.
And when it comes to filing a bug report, I feel (or the maintainers will feel) that I'm obligated to grab logs, put in a certain amount of effort, format everything accordingly, etc. etc.
It's just an exhausting amount of mental complexity to deal with right now, to the point where it's hard not to say "just use Docker" is the right answer, however brute-force and wasteful that may be in comparison.
Improving the interaction with language ecosystems was one of the motivating reasons for how I approached the [rework][1] of Darwin support in nixpkgs. A lot of Rust stuff was simply impossible to build due to their SDK needs and how hard it was to override the SDK correctly, but that’s fixed now (with a few remaining cases that will be fixed in the final staging cycle before 24.11). I expect other ecosystems to benefit similarly, especially since Darwin support looks more like a native toolchain while still being built and provided by nixpkgs.
For example, Zed and Wezterm (previously failing intermittently on x86_64-darwin) now build on Darwin. Someone even has [Firefox building from source][2]. PyTorch will be able to support MPS, and MoltenVK will be able to use Metal 3 features if your system supports them.
A good summary of the pain points I have with Nix. I was introduced to it recently at DayJob because one of the engineers I work with is a huge advocate for it.
I think there's a good use case for a small subset of Nix + direnv to manage system packages for repositories. That's essentially what we use at DayJob - but all it does is install the system packages necessary for a containerized workflow. However, even that level of complexity, even if it's just a small flake.nix and .envrc, can be cumbersome for end users, and we actively try to hide that complexity as much as possible, because anyone who isn't a Nix expert and is asked to work with Nix is entering a deep rabbit hole of complexity that is probably ultimately unrelated to the problem they're trying to solve.
Part of this, I think, is a branding and documentation issue with Nix. As a counterexample: I work with Argo a lot, and they handle this sort of situation better. There are ArgoCD, Argo Workflows, Argo Events, and Argo Rollouts. They all sit under the Argo umbrella but do distinct things with clearly demarcated roles. Unlike with Nix, I never get confused when I'm searching for ArgoCD-related docs. I don't have to worry that I'm going to land on Argo Rollouts material and somehow not understand that Argo Rollouts is not the piece of Argo I'm trying to use right now. The lines between the technologies are clearly drawn, and there is no confusion about the best way to do XYZ in each of them.
Not so with Nix - if I'm new to Nix and I'm looking up how to structure something in a flake, not only are there different setups/configs for a build-tooling setup vs. a package-management setup, but even within a build-tooling setup there are probably 4-5 canonical ways to configure things, all with a ton of decisions. Heck, if I'm new to Nix it's probably not even immediately obvious that I should be doing it with a flake instead of The Old Way, and I might start implementing something The Old Way before finding out, buried in some GitHub issue from years ago, that I'm Doing It Wrong.
Thing that makes Nix amazing: It's infinitely customizable and welcomes that philosophy
Thing that prevents Nix from succeeding: It's infinitely customizable and welcomes that philosophy
TL;DR Nix favors configuration over convention, oftentimes to its detriment
Final footnote. I am writing this on a personal x86 machine managed with nix-darwin. It functions, but the number of nondeterministic calls out to Homebrew and the kludges required to make it work essentially defeat the purpose of managing the machine declaratively in the first place. All of the following extremely popular software available in nixpkgs - 1Password, VSCode, Firefox, Docker Desktop - fails to work out of the box on this machine, and requires either compiling the package yourself, jumping through hoops to get it to work with macOS code signing, or just ignoring the Nix aspect entirely and having Nix shell out to Homebrew. There are also quite a few packages for which binaries are simply unavailable in Nix.
What they should have written was "tons of refined sugar". That's the chemical that makes you fat, by making your food overly calorie-dense while not letting your body realize it needs to stop eating.
The false answers they'll give might include zero-calorie sweeteners, because people hate the idea that you can "have your cake and eat it too" (no meta-pun intended).
I think that you could probably put together a reasonable working definition of something like:
Substances that are artificially synthesized or heavily processed and that are added to food. For the purposes of this definition, ingredients and processes with a long history of use - salt, alcohol, fermented foods, smoking, etc. - are excluded.
Of course the purpose of this definition is to serve as a generalization in order to facilitate discussion. I'm certain that there are exceptions where modern additives are probably fairly obviously harmless such as vitamin/mineral fortification. Likewise there are traditional ingredients that we now know can be harmful such as alcohol, excessive salt, smoke, etc.
What I would imagine happens is that some food producer realizes that a lot of their product is going to waste and they have intermittent reports of food poisoning. So they add salt to be able to continue selling the same volume of product. This also may make the product more flavorful. Seems like a win all around to them.
Now the food is causing long-term issues in some people, but the American medical system introduces a lot of friction around chronic medical issues. These issues are underreported, so there isn't a lot of money available to research them. And the time between cause and effect is, well, decades, and it will be a while before we have clinical diagnostics that allow us to say "you specifically need to eat less salt".
Now we can slap regulations on the companies involved in food production to revise the levels of sodium in food. I’m not sure we know what the optimal levels are. But it will probably cost them millions of dollars factoring in food waste, changes to established shipping / storage guidelines, possibly even force them to change companies to deliver product faster or pull their product from certain retailers who find it no longer profitable to receive shipments given the low volume they can sell before the product is unsafe to sell.
But it’s only really possible to have the discussion of what the right solution is if the specific objection is stated. If someone is concerned about GMOs, the driving issue may be more related to where they can be grown, size of the product, crop vulnerability to disease, avoiding excessive use of herbicides or pesticides, adapting to ecological changes, and so forth.
There is a wide array of problems, from plastics to herbicides and pesticides, related to consumption. There's also the sustainability issue as laid out in this article. It's unclear what your contention is, other than that you might not like general statements about "chemicals". It's not possible to enumerate every issue. Your statement isn't contributing anything.
Everything we eat is "chemicals" that are broken down chemically to be turned into energy (edit: and used for structural purposes).
Sure, among like-minded folks, "chemicals" may be understood to mean artificial sweeteners, pesticides, GMOs, HFCS, etc., but it's unclear which of these they're objecting to, or even which agricultural sub-industry they're criticizing.
Heck even high amounts of sodium in the American diet is criticized, but strip it out entirely and you’ve got a different set of problems now.
Most likely each change was made for a reason that improved either the cost-effectiveness or the appeal of food, or solved issues relating to storage, availability, changing ecological factors, vulnerability to plant disease, malnutrition, etc.
It’s just not constructive to say something that’s so generic that it evaluates to “food could have healthier ingredients” or even “food could have more natural ingredients”. It’s just handwaving a bunch of supply chain issues as if people are just choosing to be arseholes.
It’s like taking potshots at tech for centralizing personal information into databases that keep getting compromised for identity theft. Yeah, there are issues with that paradigm, but that’s not to say that solving the issue is as simple as decentralizing all information storage - that introduces another set of issues (eg are end users really going to have sufficient cybersecurity chops to not lose their data themselves instead of a third-party).
It’s easy to complain about the solution when you aren’t familiar with the constraints that keep it from being perfect.
The main constraint to a solution is the size and scale of chemical companies who lobby to create rules in their favor. There's no practical solution to this problem, the best we can do is educate people to live and consume sustainably.
"chemicals are those ingredients with scary names" is not a useful definition - unless you think foods containing 3-Methylbutanal are problematic (bananas [1]). You have to be more specific, otherwise you end up deriding ingredients based on how they sound rather than how safe they are. HFCS for example, is 55% fructose and 45% glucose while regular sugar is 50% fructose and 50% glucose. So since fructose might be worse for the body (although this is disputed and it might be that glucose is worse), HFCS might be a little worse but it really is the quantities of sugar that matter than the kind.
What’s great about this comment is how damn complex just fucking water is.
You’ve got tap water, which can have chlorine or chloramine added to it. Yes, the water that you drink can be chlorinated. They do this because it kills off microbes that might be living in the pipes between the water distribution center and your faucet, because right now we believe that ingesting trace amounts of chlorine is better than contracting bacterial disease from your drinking water.
Then you have water that’s run through your filter, which might cut down on some larger particles.
Then you have reverse osmosis, which removes smaller particles and usually includes a carbon filter. This can actually be harmful over long periods of time, because the reverse-osmosis process removes the trace magnesium etc. that you usually get from water and can lead to mineral deficiencies.
Then you have distilled water, which has been vaporized and condensed. Same risk applies as reverse osmosis water.
And then you have deionized water, which has gone through an extra filtration step. Not usually intended for drinking, and same risk of mineral deficiencies with long-term consumption applies.
Now, in the context of "remove everything artificial", deionized water is probably the closest to being pure H2O. On the other hand, you need additives to avoid health issues from drinking that.
On the other end of the scale, tap water sounds horrible: it's chlorinated!
And I suppose if you keep going, you get to a point where you find the nearest natural lakebed composed of non-saltwater and just stick a straw into it. That’s probably the most “natural” source of freshwater, with absolutely zero additives, save for local pollution. There’s probably plenty of fecal matter from the local wildlife, but that’s natural, right? Note: Please do not try this at home or anywhere else.
So that’s…six varieties of water, each with their own profile of additives or “chemicals”. And in practice the water you get in your food is probably just going to be a mix from the municipal water supply, runoff, local wells, moist fertilizer, etc.
So before we even get to the chemicals in the food, we have to worry about the chemicals being put into the food to grow it. Oh, plus the chemical composition of the soil…hopefully there are no heavy metals nearby; some plants are particularly greedy about snatching them up.
So it’s a really complex problem. We can’t just say “no chemicals in food”. It’s just not that simple.
Casual conversations are not about the technical aspects of food production and distribution that have been refined for thousands of years.
Also, chemistry? As a subject? Incredibly pedantic. The exception is the rule for practically everything.
There are formulations of medications that select for one particular shape of a molecule that is otherwise identical in composition. And that may determine insurance coverage.
If you don’t want to have pedantic discussions, organic chemistry is not going to be a pleasant topic for you.
Odds are that none, or very few, of the people on Hacker News are farmers or chemists deeply involved with the agricultural industry, and I imagine this thread would land with them about as favorably as farmers complaining about the way the apps on their phones work would land here. Or complaining that computer nerds have ruined John Deere tractors by making them impossible to repair.
I.e., it's going to totally lack any sense of nuance about the business, politics, and logistical constraints involved in the existing solutions.
I skimmed all of that but I gather you are saying don't talk about food production unless you are an expert or you want to be pedantic or some bullshit like that. Everyone eats food, everyone can influence food production in one way or another, whether through grocery habits or local or national politics. There is absolutely no way I would want to be associated with such a limiting viewpoint such as yours.
What you’re doing is spreading unqualified FUD towards the work of scientists and engineers involved in bioengineering. We don’t need more ignorant opposition to STEM in the US. We already have large swathes of the population rejecting vaccines with an excellent safety record because taking their chances with an unknown disease known to do permanent neurovascular damage was more “natural”.
As opposed to you encouraging naivete among the general population about bioengineered products? We do need a good amount of opposition to the incredibly naive viewpoint that so many people like you have, of accepting whatever nonsense some scientist says as unquestionable truth. If the people involved in bioengineering feel so strongly that the population needs to take a particular drug, make that argument scientifically instead of going into histrionics about FUD or whatever.
Trust in scientists has plummeted in the last few years for very good reasons (vaccine mandates, for one). Trust is hard to build back up, so if you want the trust back, you will have to do the decades-long hard work of rebuilding it instead of complaining about it. It's not coming back just because you complain about it.
I don't know what "chemicals" means. Are you talking about preservatives, artificial colors/flavors, artificial sweeteners, certain natural fats, processed fats, contaminants, environmental chemicals, microplastics? I could go on. Saying "chemicals" is just a way to make an unfalsifiable claim. If someone shows evidence that, let's say, aspartame is harmless it's possible to just move the goalposts to the other "chemicals" because the list is nearly endless.
It's all of those things. Yes the list is nearly endless and by default they all should be considered harmful to humans.
Also, there is no need to stop using the word just because it can be used in arguments to make unfalsifiable claims. Talk about the claims instead; it's silly to talk about the word.
You seem to come from a perspective that we should consider these chemicals to be safe unless proven otherwise. That is an extremely naive perspective.
> You seem to come from a perspective that we should consider these chemicals to be safe unless proven otherwise. That is an extremely naive perspective.
Whatever you’re talking about has been ingested by millions or billions of people, so I don’t think it’s “naive” to assume a certain degree of safety for…whatever you’re talking about in American food.
Yeah, America's health profile is different from other countries' and we have a high rate of obesity, but only to a certain extent. We don't have a whole lot of people who walk into McDonald's and then drop dead after having the fries.
There’s a degree of reasonableness between “we should assume nothing is wrong” and “we should throw our food economy into chaos by outlawing ‘chemicals’ until we can have a two-generation double-blind randomly controlled study of every single one to prove safety.”
And this would probably have to include herbicides and pesticides, which might get taken up or broken down by the plants, or of which trace amounts might still exist on the product if it isn't properly prepared, etc.
It’s a dead-end proposal because you can’t shut down food production to that degree without, you know, starving people and causing the collapse of modern society. Which, I’m just spitballing here, is probably going to have worse acute effects than all those “chemicals” put together.
So clearly you need to prioritize what you think is causing harm, and I suspect that’s exactly what relevant research is doing.
Reminds me of a particularly sassy medical paper:
> Advocates of evidence based medicine have criticised the adoption of interventions evaluated by using only observational data. We think that everyone might benefit if the most radical protagonists of evidence based medicine organised and participated in a double blind, randomised, placebo controlled, crossover trial of the parachute.
Your argument is something like: if it doesn't work 100% perfectly, then don't try it at all. There's no need to go whole hog; there is a perfectly fine list of banned chemicals published by the EU right next door. We should start with that.
It's the literal definition of a chemical. Your body is a metabolic machine made of chemicals and performing all kinds of metabolic chemistry. People hysterical about "chemicals" and "toxins" are almost always uneducated and unspecific about which ones they mean. Plenty of manufactured chemicals are nontoxic or even good for us, and plenty are bad for us. So we won't get any improvement health-wise from a vague blanket boogeyman term like "chemicals". Learn some chemistry, educate yourself, and be part of the solution rather than an ignorant voice adding to the noise.
This is a reductive and simplistic viewpoint. You are basically looking at a dictionary definition and trying to argue from it. You are not in 8th grade anymore, and you are not talking to people who have just learned the definition of the word "chemical". Level up the conversation and learn that the sense of a word changes with context, for a start.
I’m not too familiar with this situation, but I think one thing that would help Open Source in general is a way to signal what level of user the thing is intended to target.
For instance, is this just something that’s being dumped out on the internet in case someone else finds it useful?
Is it part of your portfolio and intended to showcase your technical skill, but not necessarily be polished from a UX perspective?
Or is it intended to be useful for end users?
Maybe it would be good to have a visually distinct and consistent badge or checklist available for open source projects to communicate the high-level goals so that people’s expectations are set correctly and they know what kind of feedback is inappropriate.
Every project is going to nominally be as-is for obvious liability reasons.
- UX Tier 10 for completely tech-illiterate users
- UX Tier 9 for infrequent mainstream users (do not need to watch a tutorial)
For instance, a rapid covid test might have low sensitivity but high specificity. Meaning if it’s negative, you could still have the disease. But if it’s positive, you’re almost certainly sick. Ie the false negative rate is a lot higher than the false positive rate.
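To put made-up numbers on that (purely illustrative, not figures for any real test):

```latex
% Illustrative numbers only.
\text{sensitivity} = \frac{TP}{TP + FN} = 0.60
  \;\Rightarrow\; \text{false negative rate} = 1 - 0.60 = 0.40
\qquad
\text{specificity} = \frac{TN}{TN + FP} = 0.99
  \;\Rightarrow\; \text{false positive rate} = 1 - 0.99 = 0.01
```

With those numbers, 40% of infected people would still test negative, while only 1% of uninfected people would test positive.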
Technically a "rapid covid test" only detects the presence of certain viral genetic material. This usually means the patient is or recently was infected with SARS-CoV-2 (the virus) but it doesn't indicate anything about whether the patient has COVID-19 (the disease). Many infections are asymptomatic and thus not medically classified as a disease state.
This distinction might seem pedantic but it's important to be precise when discussing medical issues.
If you want to be precise… There are different types of “rapid COVID test”, the most popular of which detect antigens, not ‘viral genetic material’. PCR tests detect genetic material. Both tests seem to have differing levels of sensitivity to each variant of the virus.
Maybe this would help:
https://serverfault.com/questions/136515/read-only-bind-moun...