Custom local LLM models
complete
Neil Chudleigh
Shipped in v1.34. Make sure Ollama is running!
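A quick way to confirm Ollama is running before pointing the app at it: by default the Ollama server listens on `http://localhost:11434`, and a plain GET to that address answers with a short status message. A minimal check, assuming the default port:

```shell
# Probe the default local Ollama endpoint and print a hint either way.
if curl -s --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
  echo "Ollama is reachable on localhost:11434"
else
  echo "Ollama is not reachable; start it with: ollama serve"
fi
```

If the server is up, `ollama list` will also show which local models are available to select.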
S4GU4R0
Neil Chudleigh Thank you! I figured this had already been implemented, but I couldn't find it.
Jonathan
Support for connecting to a remote Ollama instance would be useful.