I can't see why anyone is still using Conda in 2025. In 2018, yeah, pip (now uv) was hard, and you could get a "just works" experience installing TensorFlow + NVIDIA on Conda. By 2023 it was the other way around, and it still is.
Well, when you're building Python packages that have non-Python dependencies and a big chunk of your users are on Windows, Conda is the only option, even in 2025 :)
Examples include quant libraries, in-house APIs/tools, etc.
Circa 2018, I figured out how to pack up the CUDA libraries inside Conda for Windows so I could have different Conda environments with different versions of CUDA. That was essential back then: if you had a model written with a certain version of TensorFlow, you had to have a matching version of CUDA, and if you used NVIDIA's we-need-your-email-address installers you could only have one version of CUDA installed at a time.
Worked great, except that Conda made the terrible mistake of compressing package files with bzip2, which took forever to decompress for huge packages.
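The off-the-shelf way to get that effect nowadays is to pin cudatoolkit per environment, something like this (version pins from memory of that era, purely illustrative):

    # one env per CUDA runtime, pinned to match the TensorFlow build
    conda create -n tf112 python=3.6 tensorflow-gpu=1.12 cudatoolkit=9.0
    conda create -n tf115 python=3.7 tensorflow-gpu=1.15 cudatoolkit=10.0
    # activate whichever one the model was written against
    conda activate tf115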
I see no reason you can't install any kind of non-Python thing that a Python system wants with uv, because a wheel is just a ZIP file: so long as it doesn't need to be installed in a particular place, you can just unpack it and go.
For that matter, you can install arbitrary content from a wheel with Pip.
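To make the "a wheel is just a ZIP file" point concrete: NVIDIA now publishes the CUDA libraries themselves as ordinary wheels on PyPI, and you can poke at one with nothing but unzip (package name here is just an example):

    pip download --no-deps nvidia-cublas-cu12
    unzip -l nvidia_cublas_cu12-*.whl   # a plain zip: shared libraries plus dist-info metadata, hardly any Python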
The problem is all the things that do need to be installed in a particular place. Linux seems to have a lot of those, especially if they're build-time dependencies for something else. Hence the need for, and slow development of, https://peps.python.org/pep-0725/ (relevant background, though I'm sure you know this sort of stuff: https://pypackaging-native.github.io/ ).
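For reference, the PEP's draft approach is to declare those external (non-PyPI) dependencies in pyproject.toml using PURL-style identifiers, roughly like this (it's still a draft, so the exact spelling may well change):

    [external]
    build-requires = [
      "virtual:compiler/c",
      "pkg:generic/pkg-config",
    ]
    host-requires = [
      "pkg:generic/openssl",
    ]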
Conda worked for me in the past, but at some point I was getting inexplicable segfaults from Python scripts. I switched back to just pip and everything worked fine again. And installation was much faster.