
Even Facebook is more open than OpenAI. They've released, under somewhat open licenses, models like Galactica and OPT-175B, which is similar in size to GPT-3, though maybe not as good.

Here's a guide to running BLOOM, another 175-billion-parameter model, on your local computer using just the CPU; maybe something similar would also work for Facebook's models. https://towardsdatascience.com/run-bloom-the-largest-open-ac...

With this you can expect it to take around three minutes for each word/token the model outputs.
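For reference, a minimal sketch of what CPU-only inference looks like with the Hugging Face transformers library. This is not the exact approach in the linked guide (which streams the full checkpoint shard by shard so it fits in ordinary RAM); the model name bigscience/bloom-560m here is a small stand-in so the snippet actually runs on a normal machine, while the full model is bigscience/bloom and needs hundreds of GB of weights:

  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  # Small stand-in checkpoint; the full 176B model is "bigscience/bloom".
  model_name = "bigscience/bloom-560m"

  tokenizer = AutoTokenizer.from_pretrained(model_name)
  model = AutoModelForCausalLM.from_pretrained(
      model_name,
      torch_dtype=torch.bfloat16,   # roughly halves memory vs. float32
      low_cpu_mem_usage=True,       # avoid a second full copy of the weights while loading
  )
  model.eval()

  prompt = "The main advantage of open large language models is"
  inputs = tokenizer(prompt, return_tensors="pt")

  with torch.no_grad():
      # Greedy decoding on CPU; with the full 176B model each new token
      # can take minutes, as noted above.
      output_ids = model.generate(**inputs, max_new_tokens=20)

  print(tokenizer.decode(output_ids[0], skip_special_tokens=True))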



