Use another AI Service
The beauty of BoltAI is that you can use the same features with AI services beyond OpenAI. BoltAI supports popular AI services such as Anthropic, Google AI, Perplexity AI, and Mistral AI, as well as local LLMs via Ollama.
Let's set up Ollama and use a local model with BoltAI.
Ollama is a tool that lets you run large language models on your local machine, making experimentation more accessible.
Installing Ollama is pretty straightforward.
1. Go to the Ollama website and download the latest version.
2. Run Ollama. You should see the Ollama icon in the Mac menu bar.
3. Open Terminal and run `ollama run mistral` to run and chat with the Mistral model.

You can find more details in the Ollama GitHub repository.
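If you prefer working in Terminal, the basic command-line flow looks like this. These are standard Ollama CLI commands; the model names are just examples, and any model from the Ollama library works the same way.

```bash
# Download (if needed) and start an interactive chat with the Mistral model
ollama run mistral

# List the models you have downloaded locally
ollama list

# Download a model without starting a chat session
ollama pull llama3
```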
1. Open the BoltAI app and go to Settings > Models. Click the plus button (+) and choose Ollama.
2. Choose a default model. Optionally, you can click "Refresh" to fetch the list of available models.
3. Click "Save Changes".
1. Start a new chat, then switch to the Ollama AI Service.
2. (Optional) Choose a different model or a custom System Instruction.
3. You're ready to chat.
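For reference, a chat with a local model boils down to a request against Ollama's standard chat endpoint, as in the sketch below. The prompt and model name are placeholders, and BoltAI's exact request format may differ.

```bash
# Send one chat message to the local Mistral model via Ollama's HTTP API
curl http://localhost:11434/api/chat -d '{
  "model": "mistral",
  "messages": [{ "role": "user", "content": "Why is the sky blue?" }],
  "stream": false
}'
```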
Go to Settings > Commands, choose an AI Command, and set the AI Provider to Ollama. You can now use this AI Command 100% offline.
Demo: