How to use Mistral AI on macOS with BoltAI
Mistral AI is a French artificial intelligence company. It produces open-source large language models, most notably Mistral 7B and Mixtral 8x7B, and is committed to open-source software development, welcoming external contributions.
Mistral AI's models are available under the Apache 2.0 License, and the company provides an API for pay-as-you-go access to its latest models.
BoltAI supports Mistral AI's API (pay-as-you-go plan). If you don't have an API Key yet, go to their website and join the waitlist.
Here are some useful links:
Join the waitlist
View your Mistral AI subscription
View your Mistral API Keys
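Before entering the key into BoltAI, you can quickly verify that it works. The snippet below is a minimal sketch in Python, assuming Mistral's publicly documented https://api.mistral.ai/v1/models endpoint with Bearer-token authentication; the MISTRAL_API_KEY environment variable name is just a convention used here, and the third-party requests package is required.

```python
# Quick sanity check for a Mistral API key before adding it to BoltAI.
# Assumes Mistral's REST endpoint https://api.mistral.ai/v1/models and
# Bearer-token auth. Requires "requests" (pip install requests).
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]  # export MISTRAL_API_KEY=... first

response = requests.get(
    "https://api.mistral.ai/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()  # raises if the key is invalid or expired

# Print the model identifiers the key has access to, e.g. "mistral-tiny".
for model in response.json().get("data", []):
    print(model["id"])
```

If the request succeeds and prints a list of model identifiers, the key is ready to use in BoltAI.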
Once your API Key is ready, setting it up in BoltAI is simple: go to Settings > Models, click the (+) button, and choose "Mistral AI".
Give it a friendly name.
Enter the API key.
Set the default model. Mistral AI offers three options: Mistral Tiny, Mistral Small, and Mistral Medium (see the sketch after these steps for the matching API identifiers).
Set the maximum context length. Mistral models support up to a 32k-token context, but you can lower this to reduce cost.
You can optionally set it as the default AI service in BoltAI.
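Under the hood, a chat in BoltAI presumably maps to a request against Mistral's chat completions API. The sketch below is an assumption based on Mistral's public API documentation, not something confirmed by this article: the endpoint https://api.mistral.ai/v1/chat/completions, the model identifiers mistral-tiny, mistral-small, and mistral-medium, and the max_tokens parameter are all taken from those docs. It can be handy for double-checking model names and output limits outside the app.

```python
# Minimal sketch of a Mistral chat completion request, assuming the
# https://api.mistral.ai/v1/chat/completions endpoint and the model IDs
# mistral-tiny / mistral-small / mistral-medium. Requires "requests".
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

payload = {
    "model": "mistral-small",  # or "mistral-tiny" / "mistral-medium"
    "messages": [
        {"role": "user", "content": "Say hello in French."},
    ],
    # Caps the response length; similar in spirit to limiting the
    # context length in BoltAI to keep costs down.
    "max_tokens": 256,
}

response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```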
You should now be able to start a chat with Mistral.