I’m looking for a resource-efficient AI model for text generation (math, coding, etc.) that will work with LocalAI. Which model should I use? I don’t want it to use more than 1-3 GB of RAM. I’ll run it on a VPS to use with Nextcloud.
Edit: I’m using Mistral AI and Groq.com instead of self-hosting the models. They both have generous free plans.
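For anyone curious, here’s roughly how that looks with Groq’s OpenAI-compatible endpoint (a minimal sketch; the model name is just an example, so check Groq’s docs for what’s currently available):

```python
# Minimal sketch: calling Groq's OpenAI-compatible API with the openai client.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    api_key="YOUR_GROQ_API_KEY",
)

response = client.chat.completions.create(
    model="llama3-8b-8192",  # example model id; substitute whatever Groq currently offers
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(response.choices[0].message.content)
```

Since the API is OpenAI-compatible, the same pattern works for Nextcloud’s OpenAI/LocalAI integration app: point it at the base URL above and supply the API key.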
None. There is no model that can output anything even remotely usable on that tiny amount of RAM, and certainly not with the few CPU cycles your VPS has to offer.