PcChip on June 19, 2023 | on: OpenLLM
Let's say I wanted to use one of their quantized models with this OpenLLM project. How would I do that?
Aerroon on June 19, 2023
Sorry, I don't know. I suspect it's not possible (yet?). OpenLLM lists a bunch of models in its GitHub README. I think the best way would be to use those for now.
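
A minimal sketch of what "use those for now" might look like, assuming the `openllm start` CLI and the `openllm.client.HTTPClient` interface shown in the project's README at the time; the model name and endpoint here are illustrative, and this does not cover quantized models:

    # In a separate shell, start the server with one of the README-listed
    # models, e.g.:  openllm start dolly-v2
    # (command and model name are assumptions based on the README of the time)
    import openllm

    # Connect to the locally running OpenLLM server (default port 3000).
    client = openllm.client.HTTPClient("http://localhost:3000")

    # Send a prompt and print whatever the server returns.
    result = client.query("Explain the difference between a llama and an alpaca.")
    print(result)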