Ollama fails to utilize GPU after driver update (NVIDIA)
Daniel Nachtrub
As we're working - just like everyone else :-) - with AI tooling, we're using ollama to host our LLMs. After updating to the recent NVIDIA driver (555.85), we noticed that ollama is no longer using our GPU.
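A quick way to spot the fallback (a minimal sketch - it assumes ollama runs in a container named `ollama`, so adjust the name to your setup) is to load a model and check which processor it ends up on:

```bash
# Show loaded models and which processor they run on.
# On the affected driver the PROCESSOR column reports CPU instead of GPU.
docker exec -it ollama ollama ps

# The container logs also hint at ollama falling back to CPU.
docker logs ollama 2>&1 | grep -i gpu
```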
Testing the GPU mapping to the container shows the GPU is still there:
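For reference, this is roughly how we checked it (assuming a Docker setup with the NVIDIA container toolkit and, again, a container named `ollama`):

```bash
# Run nvidia-smi inside the ollama container to confirm the GPU is mapped.
docker exec -it ollama nvidia-smi

# Alternatively, test GPU passthrough with a throwaway CUDA container.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```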
Long story short: the root cause seems to be an incompatibility between NVIDIA driver 555.85 and ollama. Downgrade the driver (for example to 552.44) and all is fine again :-)
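If you want to verify which driver version is actually in use after the downgrade (on the host and inside the container), nvidia-smi can report it directly:

```bash
# Print the installed driver version - should read 552.44 after the downgrade.
nvidia-smi --query-gpu=driver_version --format=csv,noheader

# Re-check inside the container to make sure ollama sees the GPU again.
docker exec -it ollama nvidia-smi --query-gpu=driver_version --format=csv,noheader
```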