• Stampela@startrek.website
    16 hours ago

    On a totally unrelated note, https://msty.app/ is extremely easy to use and runs LLMs locally. Good model choices are Llama 3.2, Granite, DeepSeek R1, and Dolphin 3; it can run on Nvidia GPUs, Apple silicon, and plain CPUs. They say AMD works too and have known how since November, but it’s not working. Not as friendly, but https://lmstudio.ai/ runs on just about any hardware.