Use Ollama with RecurseChat
You can use a local Ollama server with RecurseChat through Ollama’s support for the OpenAI-compatible API.
The steps are:
- Serve an Ollama model by running `ollama serve`.
- Create a new OpenAI API model: go to the top right of the model tab, click the “New Model” button, then click “New OpenAI Chat Completion model”.
- To use the Ollama model, set the base URL to `http://127.0.0.1:11434/v1` and set the model ID to an Ollama model ID such as `mistral`.
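To check that the Ollama endpoint responds before pointing RecurseChat at it, the request can be sketched with Python’s standard library alone. This is a minimal sketch, assuming Ollama is serving on the default port `11434` and the `mistral` model has already been pulled; the helper name is illustrative, not part of either tool’s API.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption: you have not changed host/port).
BASE_URL = "http://127.0.0.1:11434/v1"

def build_chat_request(model, messages):
    """Assemble an OpenAI-style chat completion request for the local server."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

if __name__ == "__main__":
    # Assumes `ollama serve` is running and `mistral` is available locally.
    req = build_chat_request(
        "mistral", [{"role": "user", "content": "Say hello."}]
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])
```

If this prints a model reply, the same base URL and model ID should work when entered in RecurseChat’s OpenAI model settings.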