Hi all, I’d like to hear some suggestions on self-hosting LLMs on a remote server and accessing them via a client app or a convenient website. I’d love to hear about your setups, or about products that left a good impression on you.

I’ve hosted Ollama before, but I don’t think it’s intended for remote use. Then again, I’m not really an expert, and maybe there are add-ons or other ways to make it work.
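For context on what I mean by remote access: Ollama does expose an HTTP API (on port 11434 by default, and it can be made to listen on other interfaces via the `OLLAMA_HOST` environment variable), so in principle a client can call it over the network once the server is reachable. A minimal sketch of such a client, assuming a hypothetical hostname `my-server.local`:

```python
import json
import urllib.request

# Hypothetical remote host; 11434 is Ollama's default API port.
OLLAMA_URL = "http://my-server.local:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("llama3", "Why is the sky blue?")
# Actually sending it requires a reachable server:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The catch, as I understand it, is that Ollama has no built-in authentication, so exposing that port directly to the internet is risky — which is part of why I’m asking about setups.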

Thanks in advance!

  • Bluefruit@lemmy.world
    edited 16 days ago

    If I can figure it out, I’ll be sure to post something lol.

    So far, I found a Python project that is supposed to enable RAG, but I have yet to try it. And after reinstalling my Linux PC with Pop!_OS, I’m having less than stellar success getting Ollama to run.