Hacker News

Let's say I wanted to use one of their quantized models with this OpenLLM project. How would I do that?


Sorry, I don't know. I suspect it's not possible (yet?). OpenLLM lists a bunch of supported models in its GitHub README; for now, the best bet is probably to stick to those.



