Using uv and PEP 723 for Self-Contained Python Scripts (thisdavej.com)
245 points by thisdavej 3 days ago | 118 comments





Slightly off-topic, but the fact that this script even needs a package manager in a language with a standard library as large as Python's is pretty shocking. Making an HTTP request is pretty basic stuff for a scripting language; you shouldn't need or want a library for it.

And I'm not blaming the author; the standard library docs even recommend using a third-party library (albeit not the one the author is using) for the closest equivalent (urllib.request)!

> The Requests package is recommended for a higher-level HTTP client interface.

Especially for a language that has not cared too much about backwards compatibility historically, having an ergonomic HTTP client seems like table stakes.


> Making an HTTP request is pretty basic stuff for a scripting language; you shouldn't need or want a library for it.

Sometimes languages/runtimes move slowly :) Speaking as a JS developer, this is how we made requests for a long time (before .fetch), inside the browser which is basically made for making requests:

    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://example.com', true);
    xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        console.log(this.responseText);
    };
    xhr.send('param=add_comment');
Of course, we quickly wanted a library for it; most of us ended up using jQuery.get() et al. because it wasn't comfortable, up until .fetch appeared (or various npm libraries, if you were an early nodejs adopter)

This takes me back. I'm glad `fetch` has become the canonical way to do this. XHR was a new capability at the time, but back then we were just starting to learn about all the nasty things people could do by maliciously issuing XHR requests and/or loading random executables onto the page. Clickjacking was all the rage and nothing equivalent to Content Security Policy existed at the time.

I don’t think it’s just slowness or stability. The original release of requests was in 2011 and the standard library module (urllib.request) was added in Python 3.3 in 2012.

It’s way older than that. It used to live in urllib2, which dates back to at least Python 2.1, released in April 2001.

It has two! — http.client and urllib.request — and they are really usable.

Lots of people just like requests though as an alternative, or for historical reasons, or because of some particular aspect of its ergonomics, or to have a feature they’d rather have implemented for them than have to write in their own calling code.

At this stage it’s like using jQuery just to find an element by css selector (instead of just using document.querySelector.)
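
For reference, a minimal sketch of a GET with each of the two (the host and path here are just placeholders):

    import json
    from http.client import HTTPSConnection
    from urllib.request import urlopen

    # urllib.request: one call for a simple GET, and the response is file-like
    with urlopen("https://example.com/api.json") as response:
        data = json.load(response)

    # http.client: lower level, but still only a few lines
    conn = HTTPSConnection("example.com")
    conn.request("GET", "/api.json")
    body = conn.getresponse().read()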


Requests used to have good PR back in the day and ended up entrenched as a transitive dependency for a lot of things. Because it’s for humans, right?

But recently I had to do something using raw urllib3, and you know what? It’s just as ergonomic.


That’s pretty much irrelevant given urllib3 is a third party dependency as well.

Sure historical popularity is a good reason for people who are already familiar with it to keep using it.

That is not really an excuse for why the standard library docs for the clients you mentioned link to requests though (especially if they were actually good, rather than just being legacy). If they really were good, why would the standard library itself suggest something else?


They could have used a database driver for MySQL, PostgreSQL, or MongoDB for a more realistic example (very common for sysadmin-type scripts that are only used once and then thrown away), and your complaint would be invalid, but then you'd have to set up the database and the example would no longer be fit for a quick blog post that gives you the opportunity to just copy-paste the code and run it for yourself.

Well, it would invalidate the "this script needs a package manager" part. The rest of my comment about the state of the HTTP client in Python would still be valid (but I probably wouldn't have discovered it).

The standard library does not give you a way to do async HTTP requests; that's what httpx provides. Since Python leans so heavily on async these days, this is really a bummer.

There’s absolutely no need for async http here. The script does one http request at the top of main. And a trivial one too (just a simple GET).

    # with `import json` and `from urllib.request import urlopen`
    response = urlopen(url)
    return json.load(response)
is what they’re saving themselves from.

In that case, sure, but if you have an entire async framework you don't want that blocking call.

For as much as Python is embracing async / coroutines, I'm surprised that their http functions do not support it yet.


> In that case, sure, but if you have an entire async framework you don't want that blocking call.

What “entire async framework”, do you mean asyncio or some other third party library? In the former case, are you using it just to feel cool like TFA?

> For as much as Python is embracing async / coroutines, I'm surprised that their http functions do not support it yet.

asyncio doesn’t even support async file io.


> What “entire async framework”, do you mean asyncio or some other third party library? In the former case, are you using it just to feel cool like TFA?

I was thinking more along the lines of a project like Home Assistant. For my personal stuff I have been using AnyIO.

> asyncio doesn’t even support async file io.

many operating systems do not support async file io to begin with.


I talked about that in my readme https://github.com/gabrielsroka/r

requests is really useful for non-trivial http requests (especially as urllib has terrible defaults around REST style interactions).

But here all the script is doing is a trivial GET; that's

    urllib.request.urlopen(url)

I agree. Requests is an embarrassment and an indictment of the Python standard library. And so are dataclasses; they should have just subsumed attrs.

>And I'm not blaming the author; the standard library docs even recommend using a third-party library (albeit not the one the author is using) for the closest equivalent (urllib.request)!

For perspective: urllib has existed since at least 1.4 (released in 1996), as long as python.org's archive goes back (https://docs.python.org/release/1.4/lib/node113.html#SECTION...). Requests dates to 2011. httpx (the author's choice) has a 0.0.1 release from 2015, but effectively didn't exist until 2019 and is still zerover after a failed 1.0.0 prerelease in 2021. Python can't be sanely compared to the modern package-manager-based upstarts because it's literally not from that generation. When Python came out, the idea of versioning the language (not referring to a year some standards document was published) was, as far as I can tell, kinda novel. Python is older than Java, AppleScript, and VB; over twice as old as Go; and over three times as old as Swift.

>Especially for a language that has not cared too much about backwards compatibility historically

It's always confused me that people actually see things this way. In my view, excessive concern for compatibility has severely inhibited Python (and especially packaging, if you want to include that despite being technically third-party) from fixing real problems. People should have switched over to 3.x much faster; the breaking changes were unambiguously for the better and could not have been made in non-breaking ways.

There are tons of things the developers refuse to remove from the standard library that they would never even remotely consider adding today if they weren't already there - typically citing "maintenance burden" for even the simplest things. Trying to get anything added is a nightmare: even if you convince everyone it looks like a good idea, you'll invariably be asked to prove interest by implementing it yourself (who's to say all the good ideas come from programmers?) and putting it on PyPI. (I was once told this myself even though I was proposing a method on a builtin. Incidentally, I learned those can be patched in CPython, thanks to a hack involving the GC implementation.) Then, even if you somehow manage to get people to notice you, and everyone likes it, now there is suddenly no reason to add it; after all, you're in a better position to maintain it externally, since it can be versioned separately.

If I were remaking Python today, the standard library would be quite minimal, although it would integrate bare necessities for packaging - APIs, not applications. (And the few things that really need to be in the standard library for a REPL to be functional and aware of the platform, would be in a namespace. They're a honking great idea. Let's do more of those.)


Lib/urllib.py was created "Tue Mar 22 15:37:06 1994", by renaming Lib/urlopen.py which was created "Mon Feb 21 17:07:07 1994".

I was referring to 3.x, but also to “minor” releases (not sure they use semver), where standard library functions and options are being removed occasionally.

So it is both “not conservative enough”, whilst as you say being overly conservative.

The main problem with “small libraries” is supply chain risk. This is why I try to use languages with a strong standard library (and first party external packages). Python would be a lot less useful without a strong standard library.


Anyone use PEP 723 + uv with an LSP based editor? What's your workflow? I looked into it briefly, the only thing I saw after a lot of digging around was to use `uv sync --script <script file>` and get the venv from the output of this command, activate that venv or specify it in your editor. Is there any other way, what I describe above seems a bit hacky since `sync` isn't meant to provide the venv path specifically, it just happens to display it.

Edit: I posted this comment before reading the article. Just read it now and I see that the author also kinda had a similar question. But I guess the author didn't happen to find the same workaround as I mention using the `sync` output. If the author sees this, maybe they can update the article if it's helpful to mention what I wrote above.


uv v0.6.10 has just been released with a more convenient way of doing this:

    uv python find --script foo.py
https://github.com/astral-sh/uv/releases/tag/0.6.10

https://docs.astral.sh/uv/reference/cli/#uv-python-find--scr...


How does it work? How does it find the environment?

Let's say I have a project in `/home/boumtac/dev/myproject` with the venv inside.

If I run `uv python find --script /home/boumtac/dev/myproject/my_task.py`, will it find the venv?


The philosophy of uv is that the venv is ephemeral; creating a new venv should be fast enough that you can do it on demand.

Do you have a standalone script or do you have a project? --script is for standalone scripts. You don’t use it with projects.

If you tell it to run a standalone script, it will construct the venv itself on the fly in $XDG_CACHE_HOME.

If you have a project, then it will look in the .venv/ subdirectory by default and you can change this with the $UV_PROJECT_ENVIRONMENT environment variable. If it doesn’t find an environment where it is expecting to, it will construct one.


Thanks! Came here to ask how to make pyright work with uv scripts...

pyright --pythonpath $(uv python find --script foo.py) foo.py



My general solution to project management problems with PEP 723 scripts is to develop the script as a regular Python application that has `pyproject.toml`. It lets you use all of your normal tooling. While I don't use an LSP-based editor, it makes things easy with Ruff and Pyright. I run my standard Poe the Poet (https://poethepoet.natn.io/) tasks for formatting, linting, and type checking as in any other project.

One drawback of this workflow is that by default, you duplicate the dependencies: you have them both in the PEP 723 script itself and `pyproject.toml`. I just switched a small server application from shiv (https://github.com/linkedin/shiv) to inline script metadata after a binary dependency broke the zipapp. I experimented with having `pyproject.toml` as the single source of truth for metadata in this project. I wrote the following code to embed the metadata in the script before it was deployed on the server. In a project that didn't already have a build and deploy step, you'd probably want to modify the PEP 723 script in place.

  #! /usr/bin/env python3
  # License: https://dbohdan.mit-license.org/@2025/license.txt
  
  import re
  import tomllib
  from pathlib import Path
  from string import Template
  
  import tomli_w
  
  DEPENDENCIES = "dependencies"
  PROJECT = "project"
  REQUIRES_PYTHON = "requires-python"
  
  DST = Path("bundle.py")
  PYPROJECT = Path("pyproject.toml")
  SRC = Path("main.py")
  
  BUNDLE = Template(
      """
  #! /usr/bin/env -S uv run --quiet --script
  # /// script
  $toml
  # ///
  
  $code
  """.strip()
  )
  
  
  def main() -> None:
      with PYPROJECT.open("rb") as f:
          pyproject = tomllib.load(f)
  
      toml = tomli_w.dumps(
          {
              DEPENDENCIES: pyproject[PROJECT][DEPENDENCIES],
              REQUIRES_PYTHON: pyproject[PROJECT][REQUIRES_PYTHON],
          },
          indent=2,
      )
  
      code = SRC.read_text()
      code = re.sub(r"^#![^\n]+\n", "", code)
  
      bundle = BUNDLE.substitute(
          toml="\n".join(f"# {line}" for line in toml.splitlines()),
          code=code,
      )
  
      DST.write_text(bundle)
  
  
  if __name__ == "__main__":
      main()

If you already have a pyproject.toml, and a "build and deploy step", why not just package normally? PEP 723 was developed for the part of the Python world that doesn't already live on PyPI (or a private package index).

I probably should! My motivation early into the project was to try different ways to distribute a TUI app in Python and see how practical they were.

I started with the most self-contained, Nuitka. I quickly switched to building a zipapp with shiv because it was faster and cross-platform if you avoided binary dependencies. I wanted to be able to share a Python application with others easily, especially with technical users who weren't heavily into Python. PEP 723 added the ability for those hypothetical users to inspect the app and modify it lightly with minimum effort. But since I am still the sole user, I can just build a wheel and install it with `uv tool install` on the server.


I'm generally not a fan of the incremental rustification of the Python ecosystem, but I started using uv a few weeks ago just for this particular case and have been liking it, to the point where I'm considering migrating my full projects as well from their current conda+poetry flow. Just a couple of days ago I also modified a script I've been using for a few years to patch pylsp so it can now see uv script envs, using the "uv sync --dry-run --script <path>" hack.

Out of curiosity, what are some problems with rustification? Is it an aversion to Rust specifically or a dislike of the ecosystem tools not being written in Python?

The former is subjective, but the latter doesn't seem like much of an issue compared to the language itself being written in C.


Speaking for myself:

I have no aversion to Rust (I've read some of it, and while foreign, it comes across as much more pleasant than C or C++), but the way it's promoted often is grating. I'm getting really tired in particular of how the speed of Rust is universally described as "blazing", and how "written in Rust" has a sparkle emoji as mandatory punctuation. But maybe that's just because I'm, well, older than Python itself.

I don't really care that the reference implementation isn't self-hosting (although it's nice that PyPy exists). Using non-Python for support (other than IDEs - I don't care about those and don't see a need to make more of them at all) is a bit grating in that it suggests a lack of confidence in the language.

But much more importantly, when people praise uv, they seem to attribute everything they like about it to either a) the fact that it's written in Rust or b) the fact that it's not written in Python, and in a lot of cases it just doesn't stand up to scrutiny.

uv in particular is just being compared to a low bar. Consider: `pip install` without specifying a package to install (which will just report an error that you need to specify a package) on my machine takes almost half a second to complete. (And an additional .2 seconds with `--python`.) In the process, it imports more than 500 modules. Seriously. (On Linux you can test it yourself by hacking the wrapper script. You'll have to split the main() call onto a separate line to check in between that and sys.exit().)
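
A rough sketch of that experiment, if you want to reproduce it without editing the wrapper script (the entry point below is the one modern pip wrapper scripts typically import; adjust for your pip version):

    import sys
    import time

    start = time.perf_counter()
    from pip._internal.cli.main import main as pip_main

    # `pip install` with no package named: pip just prints a usage error
    rc = pip_main(["install"])
    elapsed = time.perf_counter() - start

    print(f"exit code {rc}, {len(sys.modules)} modules imported, {elapsed:.2f}s",
          file=sys.stderr)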


It's more the latter, particularly when Rust is used in libraries (e.g. FastAPI) as opposed to tools, as it's destroying portability. For example, I use flet[0] in some of my projects, and I have to be increasingly careful about the other dependencies, as there is no support for the Rust toolchain within Dart/Flutter, and even if there was, it still sounds like it'd be a nightmare to maintain. The same applies to any other platforms/apps out there that support running Python for flexibility, where handling another language is just way out of scope (and I'm pretty sure there are quite a few). A key part of Python's existence is as glue between disparate system parts, and rustification is reducing its usefulness for an increasing number of niche cases where it once excelled.

[0] https://flet.dev


I can understand the sentiment somewhat. It's another layer of complexity, and it makes working on projects more difficult. The fact that pip or mypy code is all Python makes it much easier to interact with and patch if needed.

You can also write Cython for more perf-oriented code, but I can totally understand the value Rust brings to the table; it's just now another language you'll need to know or learn, and more layers like maturin or PyO3, while cffi is just there.

All the tooling coming from Astral is amazing and I use it every day, but I can see the increasing complexity of our toolchains, not in ergonomics (it's much better now) but in the tools themselves.


> The fact that pip or mypy code is all Python makes it much easier to interact with and patch if needed

But how often in your career have you actually done this?


Agree, this only seems like a problem for the vast minority who are developing developer tools. And, to be honest, they got themselves into this situation due to a skill issue, too; Astral is just stomping the existing tools with better ergonomics and speed.

I haven't done much patching myself, but I've done it enough times to really appreciate having the capability when I do want to do it. It's one of the reasons why I have a huge preference for Python: its flexibility.

I'm not doing that but I'm not developing developer tooling either. I can go read mypy code though!

You can read uv code too. Rust isn't that hard to read a lot of the time

I've been doing it recently as part of diagnosing Pip's performance issues.

A problem with rustification is that it puts a giant ecosystem on top of a giant ecosystem, with poorly matched tooling. C has a lot of home-ground advantage, and CPython is built on it.

Then you have PyPy which you’d have to accommodate somehow.

It doesn't help that, in a case where you have to build everything, the Rust build toolchain currently needs Python. That sure would make bootstrapping a bitch if Python and Rust became a circular dependency of one another.


> Then you have PyPy which you’d have to accommodate somehow.

Adding pypy support to a pyo3 + maturin project was literally just a matter of telling maturin to build that wheel. And I added graal while at it.

Hopefully they eventually add stable ABI support too so I don’t have to add individual pypy/graal wheel targets.

Or pyo3 and maturin may support hpy once that’s stable.


    I also modified a script I've been using for a few years to patch pylsp so it can now see uv script envs using the "uv sync --dry-run --script <path>" hack.
This sounds like a really useful modification to the LSP for Python. Would you be willing to share more about how you patched it and how you use it in an IDE?

I have a somewhat particular setup where I use conda to manage my envs, and autoenv[0] to ensure the env for a given project is active once I'm in the folder structure. So there's a .env file containing "conda activate <env_name>" in each. I also use Emacs as my sole IDE, but there are quite a few instances where support falls short for modern workflows. I use the pylsp language server, and it's only able to provide completions, etc for native libraries, since by default it doesn't know how to find the envs containing extra 3p packages.

And so I wrote a patcher[1] that searches the project folder and parents until it finds an appropriate .env file, and uses it to resolve the path to the project's env. With the latest changes to the patcher it now uses the output from "uv sync", which is the path to a standalone script's env, as well as the traditional "source venv_path/bin/activate" pattern to resolve envs for uv-managed projects.

[0] https://github.com/hyperupcall/autoenv [1] https://gitlab.com/-/snippets/2279333
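
For anyone who wants something similar without patching pylsp, here is a rough sketch of resolving a script's env in Python, using the `uv python find --script` command mentioned upthread instead of parsing the `uv sync` output (the function name is made up):

    import subprocess
    from pathlib import Path

    def script_env_root(script: str) -> Path:
        # uv >= 0.6.10 prints the path of the interpreter inside the script's venv
        interpreter = subprocess.run(
            ["uv", "python", "find", "--script", script],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        # .../env/bin/python -> .../env (Scripts\python.exe on Windows)
        return Path(interpreter).parent.parent

Note that pyright's --pythonpath wants the interpreter itself, so for that use case you'd pass the interpreter path directly, as in the one-liner upthread.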


what's the --dry-run hack?

Using "--dry-run" makes the command a no-op, but still prints the env path.

Bonus points for the "Bonus: where does uv install its virtual environments?" section! I was wondering the same thing for a long time but haven't had a chance to dig in. It's great that the venv is not recreated unless the dependencies or Python version get modified.

Thanks for the positive feedback! I was curious too and thought others would enjoy hearing what I learned.

You can also run `uv cache dir` to show the location.

I used to have a virtual environment for all my little scrappy scripts, which would contain libraries I use often like requests, rich, or pandas. I now exclusively use this type of shebang and dependency declaration. It also makes running throwaway ChatGPT scripts a lot easier, especially if you put PEP 723 instructions in your custom prompt.
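
For anyone who hasn't seen the pattern, it looks something like this (the script body, dependency names, and version bound are just an example):

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "requests",
    #     "rich",
    # ]
    # ///

    import requests
    from rich import print

    print(requests.get("https://example.com").status_code)

Running it with `uv run ./script.py` (or just executing it, thanks to the shebang) builds the environment on demand.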

This was discussed somewhat recently in https://news.ycombinator.com/item?id=42855258

This is a neat writeup on the use of uv, but it doesn't solve the "how to give a self-contained script to grandma" problem.

Now anyone you give your script to has to install uv first.


> This is a neat writeup on the use of uv, but it doesn't solve the "how to give a self-contained script to grandma" problem.

Not at the moment, but will your grandma run a script anyway? There is an interesting thing you can already do today for larger applications, which is to install uv alongside your app. You can make a curl-to-bash thing or similar that first installs uv into a program-specific location and then uses that to bootstrap your program. Is it a good idea? I don't know, but you can do that.


For simple scripts (I never succeeded in using it on something really complex, but it's great when you don't want to use bash but need something like Python) I had used this approach, which still works nowadays and has no uv dependency (it only requires pip to be installed in the same Python interpreter that you're using to run your script):

https://www.franzoni.eu/single-file-editable-python-scripts-...


You can write a bash shebang that curls into shell. Unfortunately, when I did it and gave it to grandma, it failed because grandma had substituted Oil shell and linked it as sh, which is not best practice. I think the grandmother shell script is simply impossible. They have spent decades acquiring an idiosyncratic Unix environment.

Good thing that these days init is systemd instead of a series of scripts that were de jure POSIX shell but de facto full of bashisms, and would fail to boot if you swapped /bin/sh away from bash.

At least Ubuntu helped force the ecosystem to at least pretend to support /bin/dash too.


For this case, it might be easier to package the script using pyinstaller. That way, she can just run it. Packaging it that way is more work on your side though.

I think uv should become a default package for most operating systems.

It automatically downloads interpreters from some internet source. It's a security nightmare. It can be configured not to do that, but that's not the default.

I'm not sure that's fair. It downloads standalone builds which astral themselves maintain. I'd say they're pretty trust-worthy.

If you're worried about installing code from internet sources, which I think is valid, then pip/uv/package-managers-in-general open cans of worms anyway.


> I'm not sure that's fair. It downloads standalone builds which astral themselves maintain. I'd say they're pretty trust-worthy.

That's not how trust works. Trust exists as a relationship between two entities. From a security perspective, an entity being "trust-worthy" is meaningless. What matters is whether I trust it or not.

If I install, for example, Debian GNU/Linux, then I'm trusting Debian. I wouldn't expect it to come with a tool that will automatically go and download and run binaries from some other place that I have no knowledge of.

To be clear, it's not a jab at uv as a developer tool. If you're doing dev work then you have to accept the risk. It's about uv being bundled as a system tool such that you can send a script to grandma.


Moreover, you aren't just trusting Astral; you are trusting whatever malware-download MITM infrastructure might be afflicting your network.

It's a package manager. The job of package managers is to download code that you then run. That certainly has security implications, but that doesn't differentiate uv from pip, Poetry, Cargo, CPAN, npm, RubyGems, ...

"uv run file.py" downloading remote dependencies was a surprise to me

Why would you use `uv run file.py` instead of `python3 file.py`? It seems to me that downloading dependencies is the point of doing that.

Well, you can just give her the `./install_uv.sh && ./run_script.sh` command, e.g.

`( curl -LsSf https://astral.sh/uv/install.sh | sh ) && ./run_script.sh`


Next up: uv competitor compiled with cosmopolitan libc.

You don't need to run the script as `py wordlookup.py` or make a batch file `wordlookup.cmd` in Windows.

The standard Python installation on Windows installs the py launcher and sets it as the default file handler for .py (and .pyw) files. So if you try to run `wordlookup.py`, Windows will let the py launcher handle it. You can check this with `ftype | find "Python"` or look in the registry.

You can make it even easier than that, though. If you add .py to the PATHEXT environment variable, you can run .py files without typing the .py extension, just like .exe and .bat.


Although uv is amazing, this is not a unique feature of the project.

Hatch has had this feature for a year or so, too. https://hatch.pypa.io/latest/how-to/run/python-scripts/


As mentioned in the article, along with PDM.

I have also been switching to uv recently, frequently with --script, and I love it. What I haven't yet figured out, though, is how to integrate it with VS Code's debugger to run the script with F5. It seems to insist on running what it thinks is the right Python, not respecting the shebang.

Shameless plug for an old approach I use for various scripts when I think bash is not enough:

https://www.franzoni.eu/single-file-editable-python-scripts-...

This doesn't require uv, just pip within the same interpreter, but I wouldn't use it for something big, and it still requires deps to be updated every now and then, of course (I never tried it with raw deps; I always pin dependencies).


Oh hey, I have seen your post. Making a script download the dependencies on its own is an interesting challenge. I am a big fan of inline script metadata in Python, and I was an early adopter when pipx implemented PEP 722 (the precursor to PEP 723), but I made my version for fun.

https://pip.wtf/ was on HN not that long ago (https://news.ycombinator.com/item?id=38383635). I had my own take on it that used virtual environments, supported Windows, and was d a free license: https://github.com/dbohdan/pip-wtenv.


* was under a free license

Speaking as someone who writes Python code for a living, I like the language, but I consider the ecosystem dire. No one seems able to propose a solution to the problem of 'how do I call someone else's code?' that isn't yelling 'MOAR PACKAGE MANAGERS' in their best Jeremy Clarkson impression.

I have no idea how any of it works and I see no point in learning any of it because by the time I've worked it out, it'll all have changed anyway.

At work, there are plenty of nutjobs who seem to enjoy this bullshit, and as long as following the instructions in the documentation allows me to get the codebase running on my machine, I don't have to deal with any of it.

At home, I refuse to use any Python package that isn't in the Debian repositories. Sure, it's all 'out of date', but if your package pushes breaking changes every fortnight, I'm not interested in using it anyway.

If people are still talking about how great uv is in five years' time, maybe I'll give it a go then.


I totally agree, but uv is the real deal. It's not another Poetry, Pipenv, etc.

uv takes Python infra from "jesus this is awful" to "wow this is actually quite nice". It is a game changer.

You should really try it now. Waiting 5 years is just needless self-flagellation.

IMO the only risk is that Astral runs out of money, but given how dire the situation without uv is, I'd say it's easily worth the risk.


The Python ecosystem will catch up. Before Bambu Lab, a lot of 3D printer companies produced garbage printers, and after Bambu Lab every 3D printer company has almost 1:1 copied their printers, implying that they were selling garbage all those years, because they had no trouble catching up with Bambu Lab the moment they had to (to stay relevant).

Not worth trying to drag folks with this mindset into the future. The way I see his workflow (and I do get it, I'm stubborn with some financial stuff) is the same way he sees using uv and other new stuff. I agree uv is the real deal and will be around for a while. It has totally reignited my love of writing Python. I will say, the love of uv on Hacker News has surprised me. I was expecting a lot more replies like theirs.

Yeah, me too. HN tends to be quite stuck-in-the-mud heavy (e.g. you often see this in discussions around Rust).

Tbf I kind of understand his point of view - there have been many many failed attempts to fix Python tooling and it's easy to expect uv to be just another failed attempt.

I think it says a lot about just how bad the situation before uv was that even HN is positive about it.


Speaking as someone who enjoys reading after dark, I like lanterns, but I consider the ecosystem dire. No one seems able to propose a solution to the problem of 'how do I keep this lantern lit all night without soot and fuel on my hands?'. At home I refuse to try any fuel that I can't get from the meatpacker's leftovers anyway. If people are still talking about electricity and bulbs in five years' time, maybe I'll give it a go then.

I've used plenty, but uv is basically a one stop shop with a logical workflow.

It has sane defaults, so really I'd recommend most people just use it for everything, unless they have some very specific reasons not to.


How long do these isolated uv-created venvs persist? If you have a lot of scripts, are there going to be a lot of venvs hanging around, ready for reuse if the same script is run again?

They hardlink files so venvs don't take up a lot of disk space

I've switched all my scripts to include their own dependencies, and it works like a charm.

Only problem I haven't been able to solve is how to convince my IDE (PyCharm) to run all scripts through uv before executing them / debugging them.

PyCharm does have uv support, but from what I can see only for uv managed projects, not for individual scripts with embedded requirements.


Can uv install Python?

Is it possible to curl the uv binary and then invoke such a packaged script with --no-cache to run everything, including the Python installation, from /tmp?


Yes, uv can install Python: https://docs.astral.sh/uv/guides/install-python/

There are a bunch of environment variables for controlling where it puts things, here's one that looks relevant: https://docs.astral.sh/uv/configuration/environment/#uv_pyth...


Yes. uv can install specific Python versions from python-build-standalone (they have taken over maintenance of it: https://astral.sh/blog/python-build-standalone) instead of from python.org, because python.org doesn't release relocatable binaries (and doesn't release binaries for Linux, I think). It works well but does have a few minor quirks: https://gregoryszorc.com/docs/python-build-standalone/main/q...

I'm fairly certain the answer to this is "yes". You'd probably need to futz with env vars to get all the caches etc. into /tmp, though. It needs to put Python _somewhere_.

Yes - in fact I recently uninstalled pyenv and Poetry, switched to uv, and used it to install Python and Poetry (for work projects).

Does anyone know a way to reuse that trick for Jupyter notebooks? So that one could share a notebook declaring its dependencies.

Not sure it's as performant, but jupyter notebooks can install dependencies at run time with `!pip install my_cool_library==2.3.1`.

It feels a little bit less elegant, and you don't get access to uv's caching goodness, but that'd more or less achieve what you're looking for!


That works with uv as well. Launch Jupyter with

   uv tool run jupyter lab
and then put

   !uv pip install my_cool_library
   !uv pip install other_library
   ...
in the first cell. Now you get full uv caching goodness.

Oooh, now that's glorious!

The only thing to be aware of with this approach is that you don't get an isolated venv for each notebook. So if you're working on one notebook that needs my_lib 1.X and one that needs my_lib 2.X, you'll need to manually create separate venvs and make sure you start Jupyter in the right one for each notebook.

Right, though that requires an existing virtualenv with jupyter in the first place.

I was dreaming more of something where I can send a notebook to someone non-technical, and it just bootstraps an env with Jupyter and the required dependencies on the fly.


I don't know of a way to do this for jupyter, but marimo (alternative notebook environment to jupyter) does support self declared dependencies, and indeed uses uv to provide that support.

There are libraries like `juv` that let you use uv in Jupyter. What I haven’t found yet is a nice and convenient way of running all that in vs code.

It's one of the coolest features of uv. I'm using it to vibe scripts and execute them immediately - https://everything.intellectronica.net/p/the-little-scripter

Tangential: I want to whip up simple apps (or instruct an LLM to do so), but it's simpler to do formatted text/tables, inputs, graphs, etc. in a single HTML file + JS. Which library should I adopt? Marimo seems the closest, but are there lighter options for web popups, graphs, inputs, etc.?

I let Claude build them (which it does in React). Then I copy and paste them into o3-mini-high and ask it to port to raw html and JS. It pulls in some chart libraries and goes to town. Give it a crack and see.

The ads make this hard to read on mobile.

I didn't see any. Mobile browsers have nice ad blockers, too.

Does firefox?


    uv add --script wordlookup.py httpx

    # /// script
    # requires-python = ">=3.13"
    # dependencies = [
    #     "httpx",
    # ]
    # ///

    import httpx

That seems pretty redundant. Why can't uv just infer the dependencies from the import statements?


There are a few things:

- uv won't necessarily know which modules aren't already present, because imports can resolve to modules being developed in the current directory

- It's not easy to determine what imports will happen in a Python script due to transitive dependencies and dynamic loading

- The names of python modules are not always the same as the packages that contain them - for example the yaml module comes from the pyyaml package

Also the second line of the Zen of Python is "explicit is better than implicit": https://peps.python.org/pep-0020/
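
A minimal illustration of the module-vs-package name mismatch mentioned above, the kind of thing inference would have to special-case:

    # /// script
    # dependencies = [
    #     "pyyaml",
    # ]
    # ///

    import yaml  # distribution is "pyyaml", import name is "yaml"

    print(yaml.safe_load("answer: 42"))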


- It's not easy to determine what imports will happen

  fair enough, but adding an option to auto-import from the main file would solve 97% of use cases
- uv won't necessarily know which modules aren't already present

  easy to check
- The names of python modules are not always the same

  a great opportunity to add a convenient, simple mapping / name resolution
- Zen

Should DRY (don't repeat yourself) be higher up?


This is covered in the PEP 723 document; it's not a uv invention.

https://peps.python.org/pep-0723/#why-not-infer-the-requirem...


which version of httpx?

same as in dependencies = ["httpx"]

the current one by default


> This approach eliminates the need for complex setup tools like requirements.txt or package managers...

And yet, the rest of the article is about uv. According to uv itself:

> An extremely fast Python package and project manager, written in Rust.

It's a package manager!


A package manager that is a quick and snappy binary that doesn't need a big runtime, unlike most of the Python tools.

What exactly is your standard for "big"?

    $ du ~/.local/pipx/venvs/uv/bin/uv | cut -f 1
    38812
Stripped and dynamically linked, BTW. Compare the system-provided Python:

    $ du /usr/bin/python3.12 | cut -f 1
    7832
(But also, if you hope not to pay the cost of a Python runtime, what is your remaining use case for uv?)

Seems like you're dismissing the uv single file setup approach without fully understanding it. I'd recommend giving it a try. It's indeed simpler and snappier than any other package manager to date.

Python is fun again!! Omg, it's like when I first started with Python, before I knew all the pitfalls that were coming. uv just makes it work again (unaware of all the uv pitfalls atm, don't spoil it for me yet :)

I'm not dismissing uv, I'm critiquing the article.


