You can change the endpoint, which anyone could have learned from reading the comments on yesterday's release or from reading the wiki.
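For illustration, here is a minimal sketch of what "changing the endpoint" means in practice: Ollama exposes an OpenAI-compatible API locally, so a client only needs its base URL swapped out. The URL and model name below are assumptions based on Ollama's documented defaults, not on iTerm2's actual settings.

```python
# Sketch: pointing an OpenAI-compatible chat request at a local Ollama
# server instead of api.openai.com. Assumes Ollama's default port (11434)
# and its documented OpenAI-compatible path; the model name is illustrative.
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434/v1"  # local Ollama, OpenAI-compatible

def build_chat_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build the same chat-completions request an OpenAI client would send,
    but aimed at the local endpoint. No API key is required by Ollama."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("suggest a shell command to list open ports")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending that request (with Ollama running) returns a response in the same shape as OpenAI's, which is why swapping the endpoint is all a compatible client needs.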
> I think that one of the greatest errors that was made with putting this in iTerm2 was making a big show of it, and by not letting you use local models (such as with Ollama) instead of having OpenAI be the only option.
There is not a single line in this that is true. No big show was made, and you can use Ollama with it. The “big show” was made by other people, not by the developer behind iTerm2.