# AI Learning & Knowledge
Vantage's AI system builds contextual knowledge about your business over time, improving the relevance and accuracy of AI-generated insights, summaries, and responses.
## Learned Knowledge Feed
The Learned Knowledge Feed tracks what the AI has learned about your company and data:
- Access via Settings → AI Features
- View accumulated knowledge snippets
- Each entry shows what the AI has learned and when
- Knowledge is used automatically in future AI interactions
As you use the platform — running workflows, generating tile summaries, interacting with the AI assistant — the system builds a richer understanding of your data patterns, business terminology, and analytical preferences.
## AI Context Management
### Context Sources
AI knowledge in Vantage comes from several sources:
| Source | How It's Created |
|---|---|
| Onboarding Info | Company description, industry, and website provided during account setup |
| Manual Context | Custom instructions and context snippets set in Settings → AI Features → Context |
| Implicit Learning | Patterns detected from your data, dashboards, and workflow usage |
| Data Context | The specific data visible in tiles and flowing through workflows |
### Managing Context
- View context — Go to Settings → AI Features → Context to see all active context snippets.
- Edit context — Update any snippet and click "Save Context".
- Add custom instructions — Enter free-text directives to shape AI behavior.
- Review learned knowledge — Check the Learned Knowledge Feed for accumulated insights.
### Context Hierarchy
Context is applied at multiple levels, with more specific context taking priority:
```
Organization-Level Context (broadest)
          ↓
Company-Level Context (company-specific)
          ↓
Custom Instructions (user-defined rules)
          ↓
Page/Tile Context (current data and view)
```
This means the AI always has the most relevant context for any given interaction — from broad company information down to the specific data you're looking at.
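To make the layering concrete, here is a minimal sketch of how a specificity-ordered merge could work. This is an illustration only — the function name `resolve_context` and the snippet keys are invented for the example and are not part of Vantage's actual API.

```python
def resolve_context(layers):
    """Merge context layers ordered from broadest to most specific.

    Later (more specific) layers override keys set by earlier ones,
    mirroring the hierarchy above.
    """
    merged = {}
    for layer in layers:  # broad -> specific
        merged.update(layer)
    return merged


# Hypothetical snippets at each level of the hierarchy
org = {"tone": "formal", "industry": "retail"}
company = {"tone": "concise"}                       # overrides the org-level tone
custom = {"currency_format": "USD with commas"}     # user-defined rule
tile = {"data_scope": "current tile's visible rows"}

context = resolve_context([org, company, custom, tile])
# "tone" resolves to "concise": the company layer wins over the org default,
# while untouched keys like "industry" carry through from broader layers.
```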
## Token Usage for AI
All AI operations consume tokens. The cost depends on:
| Factor | Impact |
|---|---|
| Provider and model | More powerful models cost more per request |
| Data volume | Processing more rows costs more tokens |
| Feature type | Workflow nodes, tile summaries, and assistant queries have different costs |
| Context size | Larger context (more snippets, more instructions) uses more tokens per request |
Monitor your AI token usage in Settings → Account → Usage & Tokens, which breaks down consumption by category and operation type.
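As a rough mental model of how the factors in the table combine, the sketch below treats per-request cost as a fixed base plus per-row and context components. The formula and all the numbers are assumptions for illustration, not Vantage's actual billing logic.

```python
def estimate_tokens(base_cost, rows, tokens_per_row, context_tokens):
    """Toy per-request token estimate (illustrative, not Vantage's pricing).

    base_cost      -- fixed overhead for the feature type (e.g. a tile summary)
    rows           -- number of data rows processed
    tokens_per_row -- marginal cost per row for the chosen model
    context_tokens -- size of the active context (snippets + instructions)
    """
    return base_cost + rows * tokens_per_row + context_tokens


# e.g. a tile summary over 500 rows with a moderately sized context
estimate = estimate_tokens(base_cost=200, rows=500, tokens_per_row=3,
                           context_tokens=350)
# -> 2050 tokens: trimming unused context snippets or reducing the data
# volume lowers the per-request cost, which is why both appear in the table.
```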
## Best Practices
- Be specific in custom instructions — Vague instructions produce vague results. "Format currency as USD with commas" is better than "Use nice formatting."
- Keep context current — Update your company overview and industry as your business evolves.
- Start with Intuidy AI — The default provider requires no setup and works well for most use cases.
- Monitor token usage — Track which operations consume the most tokens and optimize accordingly.
- Use the Query Settings toggle — Enable "Process Large Datasets" only when accuracy is critical; disable for faster results on exploratory queries.