5 min read · Updated Mar 2, 2026

Supported LLMs

Vantage supports multiple large language model (LLM) providers through Intuidy AI. You can use the built-in default provider, or connect your own API key to any of the supported providers below.


Provider Overview

| Provider | Models | API Key Required | Setup Complexity |
| --- | --- | --- | --- |
| Intuidy AI | Default model | No | None (works out of the box) |
| OpenAI | GPT-4o, GPT-4o-mini, GPT-4 Turbo, GPT-3.5 Turbo | Yes | Low |
| Claude (Anthropic) | Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku | Yes | Low |
| Gemini (Google) | Gemini Pro, Gemini Ultra, Gemini 1.5 Pro | Yes | Low |
| DeepSeek | DeepSeek Chat, DeepSeek Coder | Yes | Low |
| Grok (xAI) | Grok-1, Grok-2 | Yes | Low |
| Mistral | Mistral Large, Mistral Medium, Mistral Small, Mixtral | Yes | Low |

Provider Details

Intuidy AI (Default)

The built-in AI provider that works with no configuration. Ideal for teams that want to start using AI immediately without managing API keys or provider accounts.


OpenAI

The most widely used AI provider, offering the GPT family of models.

| Model | Context Window | Best For |
| --- | --- | --- |
| GPT-4o | 128K tokens | Complex analysis, nuanced reasoning |
| GPT-4o-mini | 128K tokens | Fast, cost-effective general use |
| GPT-4 Turbo | 128K tokens | Legacy compatibility |
| GPT-3.5 Turbo | 16K tokens | Budget-friendly, simpler tasks |

Claude (Anthropic)

Known for detailed, thoughtful responses and strong performance on long-form analysis.

| Model | Context Window | Best For |
| --- | --- | --- |
| Claude 3.5 Sonnet | 200K tokens | Best all-around performance |
| Claude 3 Opus | 200K tokens | Most capable, highest quality |
| Claude 3 Haiku | 200K tokens | Fast, cost-effective |

Gemini (Google)

Google's multimodal AI models with strong data analysis capabilities.

| Model | Context Window | Best For |
| --- | --- | --- |
| Gemini 1.5 Pro | 1M tokens | Large dataset analysis |
| Gemini Pro | 32K tokens | General-purpose tasks |
| Gemini Ultra | 32K tokens | Most capable |
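Context window size determines how much text a model can consider in a single request. As a rough sketch of how the windows above compare in practice (assuming the common rule of thumb of roughly 4 characters per token, which varies by model and tokenizer), you can estimate whether a document fits a given model before choosing a provider:

```python
# Rough context-window fit check.
# Assumes ~4 characters per token, a common heuristic; actual token
# counts depend on the model's tokenizer. Window sizes come from the
# tables in this article.

CONTEXT_WINDOWS = {
    "gpt-4o": 128_000,
    "gpt-3.5-turbo": 16_000,
    "claude-3.5-sonnet": 200_000,
    "gemini-1.5-pro": 1_000_000,
}

def estimate_tokens(text: str) -> int:
    """Crude token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def fits(model: str, text: str, reply_budget: int = 4_000) -> bool:
    """True if the text plus room for a reply fits the model's window."""
    return estimate_tokens(text) + reply_budget <= CONTEXT_WINDOWS[model]

doc = "x" * 500_000  # ~125K estimated tokens
print(fits("gpt-4o", doc))             # False: 125K + 4K exceeds 128K
print(fits("claude-3.5-sonnet", doc))  # True: fits in 200K
print(fits("gemini-1.5-pro", doc))     # True: fits in 1M
```

A document that overflows one provider's window may still fit another's, which is why the long-context row in the "Choosing a Provider" table points to Gemini and Claude.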

DeepSeek

Emerging provider with strong performance on coding and analytical tasks.

| Model | Best For |
| --- | --- |
| DeepSeek Chat | General conversation and analysis |
| DeepSeek Coder | Code-heavy tasks and technical analysis |

Grok (xAI)

xAI's models designed for real-time, up-to-date information and analysis.

| Model | Best For |
| --- | --- |
| Grok-2 | Latest capabilities, general use |
| Grok-1 | Basic analysis, fast responses |

Mistral

European AI provider offering efficient, high-quality models.

| Model | Best For |
| --- | --- |
| Mistral Large | Complex analysis, highest quality |
| Mistral Medium | Balanced quality and speed |
| Mistral Small | Fast, cost-effective |
| Mixtral | Multi-expert architecture, diverse tasks |

Choosing a Provider

| Priority | Recommended Provider |
| --- | --- |
| No setup needed | Intuidy AI (default) |
| Best general quality | OpenAI (GPT-4o) or Claude (3.5 Sonnet) |
| Long documents / large datasets | Gemini (1M token context) or Claude (200K context) |
| Cost efficiency | OpenAI (GPT-4o-mini), Mistral (Small), or DeepSeek |
| EU data residency | Mistral |
| Code & technical analysis | DeepSeek Coder or OpenAI (GPT-4o) |
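If you are scripting around this decision (for example, picking a provider per workspace), the table above can be expressed as a simple lookup. This is an illustrative sketch only: the priority keys and provider labels are names invented for this example, not Vantage configuration values.

```python
# The "Choosing a Provider" table as a lookup helper.
# Keys and labels are hypothetical, for illustration only.

RECOMMENDED = {
    "no_setup": "Intuidy AI (default)",
    "general_quality": "OpenAI (GPT-4o) or Claude (3.5 Sonnet)",
    "long_context": "Gemini (1M token context) or Claude (200K context)",
    "cost": "OpenAI (GPT-4o-mini), Mistral (Small), or DeepSeek",
    "eu_residency": "Mistral",
    "code": "DeepSeek Coder or OpenAI (GPT-4o)",
}

def recommend(priority: str) -> str:
    """Return the recommendation for a priority; fall back to the default."""
    return RECOMMENDED.get(priority, RECOMMENDED["no_setup"])

print(recommend("eu_residency"))  # Mistral
print(recommend("unknown"))       # Intuidy AI (default)
```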

Switching Providers

You can switch providers at any time:

  1. Go to Settings → AI Features → Intuidy AI
  2. Select the new provider
  3. Enter API credentials if required
  4. Choose a model
  5. Click Save AI Settings

All AI features — assistant, tile summaries, workflow nodes — will immediately use the new provider.