I’m looking for a resource-efficient AI model for text generation (math, coding, etc.) that will work with LocalAI. Which model should I use? I don’t want it to use more than 1–3 GB RAM. I’ll run it on a VPS to use with Nextcloud.

Edit: I’m using Mistral AI and Groq.com instead of self-hosting the models. They both have generous free plans.
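For anyone going the same route: both Mistral AI and Groq expose OpenAI-compatible chat-completions endpoints, so the same client code can target either one. A minimal sketch below, assuming standard OpenAI-style request bodies; the model names (`llama-3.1-8b-instant`, `mistral-small-latest`) are examples and may change, so check each provider’s current model list.

```python
import json
import os
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Build an OpenAI-style chat-completions request for a hosted provider.

    Works for both Groq (https://api.groq.com/openai/v1) and
    Mistral (https://api.mistral.ai/v1), since both follow the
    OpenAI chat-completions request format.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Example: point at Groq's OpenAI-compatible endpoint.
# Send with urllib.request.urlopen(req) once GROQ_API_KEY is set.
req = build_chat_request(
    "https://api.groq.com/openai/v1",
    os.environ.get("GROQ_API_KEY", ""),
    "llama-3.1-8b-instant",
    "Write a Python one-liner to reverse a string.",
)
print(req.full_url)
```

Nextcloud’s AI integration can be pointed at any OpenAI-compatible base URL, so switching providers is mostly a matter of changing the URL, key, and model name.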

  • Showroom7561@lemmy.ca · 7 days ago

    I don’t know if any specific model will be the right answer, but Qualcomm has its Snapdragon event going on right now, and many of the advancements they are touting are specifically for local AI processing.

    So, computing power will improve significantly over the next few years, with AI being the largest beneficiary.