Not sure if this belongs here, but I want to host AI locally, like LLMs and the newer ComfyUI models, and I'm not sure what kind of setup or parts would work best on a slim budget. I'm also not sure if now is the right time to buy, with inflation and such.

I don't have a price in mind yet, but I'm wondering roughly how much it would cost and what parts I'd need.

If you have any questions or concerns, please leave a comment.

  • Atherel@lemmy.dbzer0.com · 9 hours ago

    As others have said, it all depends on what you expect. I run Stable Diffusion on my gaming PC with 32GB of RAM and an AMD 9070 XT, and it works fine. It also ran fine on a 6800 XT before that one died. A GPU with 16GB of VRAM helps a lot; I'd say 12GB is the minimum. Less will limit you in both which models you can run and how fast they are.
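
    If you want a feel for what "fits in VRAM" means in practice, here's a rough sketch using the diffusers library; the checkpoint name and prompt are just examples, not recommendations. Loading in half precision roughly halves VRAM use, which is what makes 12-16GB cards workable. (On AMD cards, the ROCm build of PyTorch still uses the "cuda" device name.)

    ```python
    # Rough sketch: load a Stable Diffusion checkpoint in half precision
    # so it fits on a ~12-16GB card. Model ID and prompt are placeholders.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # example checkpoint, swap for your own
        torch_dtype=torch.float16,         # half precision roughly halves VRAM use
    )
    pipe = pipe.to("cuda")                 # ROCm builds of PyTorch also use "cuda"
    pipe.enable_attention_slicing()        # trades some speed for lower peak VRAM

    image = pipe("a lighthouse at sunset").images[0]
    image.save("out.png")
    ```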

    For LLMs, just try it out. Smaller models work fine without special hardware as long as you're the only user. Tools like Jan or LM Studio make them easy to run.
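
    If you go with LM Studio, it can expose an OpenAI-compatible server on localhost, so a few lines of Python are enough to talk to whatever model you've loaded. A minimal sketch, assuming the default port 1234; the model string is a placeholder, since the local server answers with whatever model is loaded:

    ```python
    # Minimal sketch: query a local LM Studio (or similar) OpenAI-compatible
    # server. Port 1234 is LM Studio's default; adjust to your setup.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # local server, nothing leaves your machine
        api_key="not-needed",                 # local servers typically ignore the key
    )

    reply = client.chat.completions.create(
        model="local-model",  # placeholder; the loaded model responds regardless
        messages=[{"role": "user", "content": "Explain VRAM in one sentence."}],
    )
    print(reply.choices[0].message.content)
    ```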