moomoomoo309@programming.dev to Selfhosted@lemmy.world • "What AI services are you selfhosting? Or, have tested and passed on" • 21 hours ago

Your M.2 port can probably fit an M.2-to-PCIe adapter, and you can use a GPU with that. Ollama supports AMD GPUs just fine nowadays (well, as well as it can; ROCm is still hit or miss).
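As a rough sketch of the setup described above, assuming ollama's ROCm build is installed and working: `HSA_OVERRIDE_GFX_VERSION` is a commonly cited ROCm workaround for consumer cards that aren't on the official support list, but the version value is card-specific (the one below targets the RDNA2/gfx1030 family and is just an example).

```shell
# Hypothetical config fragment, not a verified recipe for any specific card.
# Spoof the GPU architecture so ROCm treats an unsupported card as a
# supported one -- only needed if ollama falls back to CPU on your GPU.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Start the ollama server; its startup log reports whether it detected
# a ROCm-capable GPU or fell back to CPU inference.
ollama serve &

# Pull and run a model; watch the server log to confirm GPU offload.
ollama run llama3.2
```

If the card still isn't picked up, the server log is the first place to look: it lists the detected GPUs and the compute capability ROCm reports for them.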