I wrote a step-by-step guide a while ago on how to set up Ollama with a larger context length: https://prompt.16x.engineer/guide/ollama
TLDR:

ollama run deepseek-r1:14b
/set parameter num_ctx 8192
/save deepseek-r1:14b-8k
ollama serve

Note that /set and /save are entered at the interactive prompt that `ollama run` opens; the saved model bakes in the 8k context window, and `ollama serve` is then run from the shell.
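If you'd rather not save a custom model, you can also override the context window per request through Ollama's HTTP API via the `options.num_ctx` field on `/api/generate`. A minimal sketch, assuming the default server address `localhost:11434` and the model name from the TLDR above:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str, num_ctx: int = 8192) -> dict:
    # Per-request context window override via the "options" field,
    # as an alternative to baking num_ctx into a saved model.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a stream
        "options": {"num_ctx": num_ctx},
    }

def generate(model: str, prompt: str, num_ctx: int = 8192) -> str:
    # Sends the request to a locally running `ollama serve` instance.
    data = json.dumps(build_payload(model, prompt, num_ctx)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example (requires the server to be running):
# print(generate("deepseek-r1:14b", "Why is the sky blue?"))
```

This keeps the base model untouched, at the cost of repeating the option on every call.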