Use another AI Service

The beauty of BoltAI is that you can use the same features with different AI Services beyond OpenAI. BoltAI supports popular AI Services such as Anthropic, Google AI, Perplexity AI, Mistral AI, or local LLMs via Ollama.

Let's set up Ollama and use a local model with BoltAI.

Install Ollama

Ollama is a tool that helps us run large language models on our local machine and makes experimentation more accessible.

Installing Ollama is straightforward.

  1. Go to the Ollama website and download the latest version

  2. Run Ollama. You should see the Ollama icon in the Mac menu bar

  3. Open Terminal and run "ollama run mistral" to download and chat with the Mistral model

You can find more details in the Ollama GitHub repository.
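The terminal steps above can be sketched as a short shell session. Mistral is just one example; any model from the Ollama library works the same way:

```shell
# Download the Mistral model without starting a chat session
ollama pull mistral

# List the models available on your machine
ollama list

# Start an interactive chat with the model
ollama run mistral
```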

Set up Ollama in BoltAI

  1. Open BoltAI app, go to Settings > Models. Click on the plus button (+) and choose Ollama

  2. Choose a default model. Optionally, you can click "Refresh" to fetch the list of available models

  3. Click "Save Changes"
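Under the hood, BoltAI talks to Ollama's local HTTP API (by default at http://localhost:11434). The "Refresh" step presumably queries the /api/tags endpoint, which returns the models installed locally. Here is a minimal sketch of fetching and parsing that response; the helper names and the sample payload are illustrative, not part of BoltAI:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def model_names(tags_response: dict) -> list:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]


def list_local_models() -> list:
    """Fetch the locally installed models (requires Ollama to be running)."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return model_names(json.load(resp))


# Example /api/tags payload, trimmed to the field used above
sample = {"models": [{"name": "mistral:latest"}, {"name": "llama3:latest"}]}
print(model_names(sample))  # ['mistral:latest', 'llama3:latest']
```

If the list comes back empty, pull a model first (e.g. "ollama pull mistral") and click "Refresh" again.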

Use Ollama in Chat UI

  1. Start a new chat, then switch to Ollama AI Service.

  2. (Optional) Choose a different model or a custom System Instruction

  3. You're ready to chat

Use Ollama with AI Command

Go to Settings > Commands, choose an AI Command, and set the AI Provider to Ollama. You can now use this AI Command 100% offline.

