How to use a local Whisper instance in BoltAI

Requires v1.27.2 or later

The Whisper model is a speech-to-text model from OpenAI that you can use to transcribe audio files. In BoltAI, you can use the Whisper model via the OpenAI API, Groq, or a custom server.
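
For reference, a transcription request against the hosted OpenAI API looks roughly like the sketch below, written with the official openai Python SDK. The file name is just a placeholder; "whisper-1" is OpenAI's hosted Whisper model.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Transcribe a local audio file with OpenAI's hosted Whisper model.
with open("meeting.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

print(transcript.text)
```

A local Whisper server simply replaces this hosted endpoint with one running on your own machine, which is what the rest of this guide sets up.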

Follow this step-by-step guide to set up and use a local Whisper instance in BoltAI.

0. Prerequisites

Make sure your local Whisper instance is up and running. Setting one up is outside the scope of this guide, but here are some pointers:

  • tiny-openai-whisper-api
  • whisper.cpp
  • lightning-whisper-mlx
  • whisper-turbo-mlx
  • openedai-whisper

A quick way to verify that the server is responding is shown right after this list.
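
This check is only a sketch: the host, port, and path (http://localhost:8000/v1/audio/transcriptions) are assumptions, so adjust them to whatever your server actually exposes. It posts a short audio file to the OpenAI-style transcription route and prints the result.

```python
import requests

# Assumed local endpoint; change the host, port, and path to match your server.
ENDPOINT = "http://localhost:8000/v1/audio/transcriptions"

with open("sample.wav", "rb") as audio_file:
    response = requests.post(
        ENDPOINT,
        files={"file": audio_file},
        data={"model": "whisper-1"},  # many local servers ignore or remap this value
    )

response.raise_for_status()
print(response.json().get("text"))  # OpenAI-style responses return {"text": "..."}
```

If this prints a transcript, the server is ready for BoltAI.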

1. Add a custom service in BoltAI

  • Go to Settings > Models. Click the plus (+) button and select "OpenAI-compatible Server"

  • Fill in the form and make sure to enter the full URL to the API endpoint (see the sketch after this list for an example)

  • Click "Save (Skip Verification)"

2. Configure BoltAI to use this custom service

  • Go to Settings > Advanced > Voice Settings

  • In the "Whisper Settings" section, select your newly created service ("Local Whisper" in this example)

Now both the main chat UI and the Inline Whisper will use this local Whisper instance instead of OpenAI.

That's it

It's pretty simple, isn't it? If you have any questions, feel free to send me an email. I'm happy to help.
