Overview

An AI plugin allows you to extend the capabilities of a large language model (LLM) using Function Calling (also known as Tool Use).

BoltAI supports multiple AI plugins: Web Search, Web Browsing, AppleScript, Shell Access, FFmpeg, and more.

How does it work?

Without AI plugins, the LLM answers your prompt using only the knowledge from its training data, which might not be up-to-date.

With AI plugins, the LLM smartly decides which plugin to use, and how to use it, to give you the most accurate answer.
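Under the hood, this follows the standard Function Calling flow: the model is given a list of available functions, decides on its own whether to call one, and the result is fed back so it can compose the final answer. BoltAI handles all of this automatically; the sketch below (using the OpenAI Python SDK, with a made-up `get_weather` function) is only an illustration of what the exchange looks like.

```python
from openai import OpenAI

client = OpenAI()

# 1. Describe a tool to the model. The "get_weather" function is made up
#    for illustration; BoltAI registers its plugins in the same way.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# 2. The model decides on its own whether the prompt needs the tool.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
)

# 3. If the model requested a tool call, the client runs it and sends the
#    result back so the model can write the final, up-to-date answer.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, call.function.arguments)
```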

What can I do with AI plugins in BoltAI?

BoltAI supports multiple AI plugins:

  • Web Search: allows the LLM to search for external, up-to-date information using a search engine.

  • Web Browsing: allows the LLM to read the content of an external web page.

  • WolframAlpha LLM: uses the WolframAlpha LLM API to give the most accurate answers.

  • Image Generation: allows the LLM to generate AI images from your prompts.

  • Audio Transcription: allows the LLM to transcribe your local audio files.

  • FFmpeg: uses ffmpeg to compress, modify, and improve your video or audio files.

  • AppleScript: controls your computer using AppleScript code.

  • and more...

See each plugin's help page for more information.

Why can't I see the AI Plugin option?

This feature requires an AI model that supports Function Calling. If you don't see the Plugin option, it means the AI model you're using doesn't support this capability.

Try switching to a more capable AI model.

Which license do I need to use the AI Plugin feature?

  • Basic License: does not have access to the AI Plugin feature

  • Plus License: can use Web Search and Web Browsing plugins

  • Premium License: can access all plugins

  • Setapp users and Legacy License holders: can access all plugins

FAQ

  1. Can I use AI plugins without using OpenAI models? Yes. You can use AI plugins with any LLM that supports Function Calling, such as Anthropic Claude 3, Gemini 1.5 Pro, or Groq LLaMA3 70b.

  2. Can I use AI plugins with a local model on Ollama? Not yet. Technically, it's possible if the local model supports Function Calling, but BoltAI also requires the inference server to follow OpenAI's tool use spec, which Ollama doesn't (see the sketch below).
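For reference, "OpenAI's tool use spec" here means the `tool_calls` format of the Chat Completions API. When the model wants to call a tool, a compatible inference server must return an assistant message shaped roughly like the following (all values are illustrative):

```python
# The assistant message an OpenAI-compatible server returns when the model
# decides to call a tool, per OpenAI's Chat Completions tool use spec.
# All values below are illustrative.
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{
        "id": "call_abc123",          # server-generated call ID
        "type": "function",
        "function": {
            "name": "web_search",     # hypothetical plugin function
            "arguments": '{"query": "latest macOS release"}',  # JSON string
        },
    }],
}
```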
