
If you really want to do it yourself, I think it's going to be cost-prohibitive.

You would need to fork over several hundred dollars a month for hardware with a GPU (assuming you need it online 24/7), then spend time configuring drivers, dependencies, and so on.

Better to pay a provider that already has the necessary hardware: artificialanalysis.ai/models/…

Right now Groq and DeepInfra are the cheapest. Or if you already have AWS/Azure/whatever credits, use those first, obviously, as all the major cloud platforms have an AI service meant for executing jobs against LLMs these days.
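For what it's worth, Groq and DeepInfra both expose OpenAI-compatible HTTP endpoints, so switching providers is mostly a matter of changing the base URL. A minimal sketch, assuming a Groq endpoint, an illustrative model name, and an API key in the `GROQ_API_KEY` environment variable:

```python
import json
import os
import urllib.request

# Illustrative endpoint; DeepInfra and others use the same
# OpenAI-compatible request shape at a different base URL.
API_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt: str, model: str = "llama-3.1-8b-instant") -> urllib.request.Request:
    """Build a chat-completion request; the key is read from the environment."""
    payload = {
        "model": model,  # model name is an assumption, check the provider's list
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

# Actually sending it is just urllib.request.urlopen(build_request("Hello")),
# which needs a valid key and an account, so it's omitted here.
```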

Hmm, you’re raising fair questions. I always knew it would be hard; I hadn’t thought about keeping up with new model upgrades, though.
