Ollama supports a specific set of AMD GPUs via the ROCm library; additional AMD GPU support is provided by the Vulkan library. The likelovewant/ollama-for-amd project and its wiki aim to extend support to AMD GPUs that official Ollama does not currently cover due to limitations in ROCm. With ollama-rocm, linux-mainline 6.14, and the latest mesa-git, previously unrecognized GPUs started to be picked up by Ollama; a Radeon 6700M with 10 GB of VRAM, for example, runs fine while also being used by simulation programs. On the NVIDIA side, older cards can be limiting: the Tesla K80 is only supported up to CUDA 11, whereas modern multi-GPU setups such as eight RTX 4090s are usable. When choosing a quantization, if you have a powerful GPU with a lot of VRAM, stick with FP16 or Q8 for the best quality.
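As a sketch of how this looks in practice: on an officially unsupported RDNA2-class AMD card, a common workaround (and the approach the ollama-for-amd builds lean on) is to spoof a supported ROCm target with the `HSA_OVERRIDE_GFX_VERSION` environment variable, then pull a high-quality quantization tag. The gfx version and model tag below are illustrative assumptions; adjust them for your card and check the model library for available tags.

```shell
# Assumption: an RDNA2 GPU (e.g. gfx1031) that ROCm does not officially
# list. Spoofing the supported gfx1030 target often makes it work;
# pick the override that matches your GPU's architecture family.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Start the server; its startup log reports which GPU backend it found.
ollama serve &

# With plenty of VRAM, prefer an FP16 or Q8 tag over smaller quants.
# The tag below is an example; verify it exists in the Ollama library.
ollama pull llama3:8b-instruct-q8_0
ollama run llama3:8b-instruct-q8_0 "Hello"
```

If the override is wrong for your card, Ollama silently falls back to CPU inference, so it is worth confirming GPU offload in the server log or with `ollama ps` after loading a model.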