
How to Make the Most Out of ContextFlo

Learn best practices for using ContextFlo effectively: from setting up your data sources to creating saved queries and reviewing LLM interactions.

January 11, 2025 · 8 min read · ContextFlo Team

ContextFlo transforms how your team analyzes data by connecting your LLM directly to your data warehouse with rich context about your tables, metrics, and business definitions.

But like any tool, you'll get better results by following a few best practices. This guide will help you maximize the value you get from ContextFlo.

How ContextFlo Helps Answer Your Questions

When you ask a question through your LLM, ContextFlo follows a systematic workflow:

1. Search for Similar Patterns

ContextFlo uses semantic search to find saved queries, metrics, and business concepts related to your question. This gives the LLM proven query patterns to work from.

2. Explore Tables

If the LLM doesn't find a relevant resource to work with, it will explore the tables in your data warehouse to understand your schema and identify relevant data sources.

3. Verify Table Schemas

The LLM retrieves the exact column names, data types, and descriptions for any tables it needs to query. This prevents hallucinated column names and incorrect joins.

4. Execute & Analyze

ContextFlo runs the query against your actual data warehouse and returns results for the LLM to analyze and present to you.

5. Save Useful Queries

When you save useful queries, they become available to help answer similar questions in the future, creating a growing knowledge base.
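
The loop above is easiest to see end to end in code. The sketch below is purely conceptual: none of the class or function names are ContextFlo's actual API, and the semantic search of step 1 is reduced to naive keyword matching to keep the example short.

```python
# Conceptual sketch of the five-step loop described above.
# All names here are hypothetical placeholders, not ContextFlo's API.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class KnowledgeBase:
    """Toy stand-in for the saved-query store (step 5 writes to it, step 1 reads it)."""
    saved_queries: dict = field(default_factory=dict)  # query name -> SQL

    def search(self, question: str) -> list:
        # Step 1: a real system would use semantic (embedding) search;
        # keyword matching keeps this sketch short.
        terms = question.lower().split()
        return [sql for name, sql in self.saved_queries.items()
                if any(term in name.lower() for term in terms)]

def answer(question: str, kb: KnowledgeBase, run_sql: Callable) -> list:
    # Step 1: look for proven query patterns first.
    matches = kb.search(question)

    # Steps 2-3: with no match, the LLM would list tables and fetch exact
    # column names and types before writing SQL (omitted in this sketch).
    sql = matches[0] if matches else "SELECT 1  -- freshly generated SQL would go here"

    # Step 4: execute against the warehouse and hand the rows back for analysis.
    rows = run_sql(sql)

    # Step 5: persist the query so similar questions can reuse it next time.
    kb.saved_queries[question] = sql
    return rows

# Example usage with a fake executor standing in for the warehouse:
kb = KnowledgeBase()
print(answer("monthly revenue by region", kb, run_sql=lambda sql: [("EMEA", 120_000)]))
```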

Best Practices

Ensure ContextFlo Has Access to Relevant Data

The LLM can only answer questions about data it can access. Make sure ContextFlo is connected to the right tables with current information:

  • Sync the tables that matter: Don't sync everything—focus on tables your team actually queries. Too many irrelevant tables create noise.
  • Keep data fresh: If your warehouse has a "production" and "staging" version of tables, make sure ContextFlo is pointed at production (unless you intentionally want staging for testing).
  • Watch for schema changes: When tables are restructured or renamed, resync them so the LLM has the latest column names and types.
  • Add context for important tables: For high-traffic tables, add descriptions with gotchas (e.g., "deleted_at IS NULL means active records", "test_ prefix indicates test data").
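
As an illustration of that last point, here is one way a table description with gotchas might be structured. The layout and the table and column names below are assumptions made for the example, not ContextFlo's actual format.

```python
# Illustrative table context with its gotchas spelled out for the LLM.
# The structure, table, and column names are assumptions, not ContextFlo's format.
orders_context = {
    "table": "orders",
    "description": "One row per customer order, including soft-deleted and test rows.",
    "gotchas": [
        "deleted_at IS NULL means the record is active; non-NULL means soft-deleted.",
        "account_id values with a test_ prefix are test data and should be excluded.",
        "created_at is stored in PST; convert to UTC before cross-table comparisons.",
    ],
}
```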

Common issue: Teams often connect their data warehouse but forget to sync new tables as they're created. Set a quarterly reminder to review what tables are available in your warehouse vs. what's synced in ContextFlo.
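
One lightweight way to run that review is to diff the warehouse catalog against the tables you have synced. The sketch below assumes a Postgres-compatible warehouse that exposes information_schema; the connection string, schema name, and synced_tables set are placeholders to replace with your own values.

```python
# Quarterly check: what exists in the warehouse vs. what is synced to ContextFlo.
# Assumes a Postgres-compatible warehouse; all connection details are placeholders.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:password@warehouse-host/analytics")  # placeholder

# Tables currently synced into ContextFlo (maintained by hand or exported).
synced_tables = {"orders", "customers", "subscriptions"}

with engine.connect() as conn:
    rows = conn.execute(text(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = 'public'"
    ))
    warehouse_tables = {row[0] for row in rows}

print("In the warehouse but not synced:", sorted(warehouse_tables - synced_tables))
print("Synced but missing from the warehouse:", sorted(synced_tables - warehouse_tables))
```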

Start a New Thread for New Questions

When switching topics, start a fresh conversation with your LLM. This keeps leftover context from a previous analysis from confusing the new one, and it stops long chats from running out of context window and losing the thread.

Don't do this

Continue asking about marketing metrics in the same thread where you were analyzing customer churn. The LLM might mix concepts or reuse irrelevant queries.

Do this instead

Start a new conversation when you switch from customer analytics to marketing analytics. This gives the LLM a clean slate to focus on the new domain.

Use Saved Queries for Frequently Asked Questions

When you find a query that works well, save it with a descriptive name. ContextFlo will automatically suggest it for similar questions in the future. Once you're done with an analysis, just ask your LLM, "Can you save this query?"

Some example saved queries:

  • 📊"Show MRR growth by product line for the last 30 days"
  • 👥"List customers with declining usage in a given quarter"
  • 💰"Break down ARR by customer segment and region"
  • 🔍"Calculate retention by signup month"

Saved queries act as an institutional knowledge base: they capture proven analysis patterns that work for your specific data model. Admins can view these saved queries in ContextFlo and review or edit them easily.
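
To make the first example above concrete, here is roughly what the SQL behind "Show MRR growth by product line for the last 30 days" could look like once saved. The mrr_facts table and its columns are assumptions about a typical schema, not anything ContextFlo prescribes.

```python
# Illustrative shape of a saved query: a descriptive name plus the SQL it captures.
# The mrr_facts table and its columns are hypothetical; substitute your own schema.
saved_query = {
    "name": "Show MRR growth by product line for the last 30 days",
    "description": "Daily MRR per product line over the trailing 30 days.",
    "sql": """
        SELECT product_line,
               CAST(recorded_at AS DATE) AS day,
               SUM(mrr_usd)              AS mrr
        FROM mrr_facts
        WHERE recorded_at >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY product_line, day
        ORDER BY day, product_line
    """,
}
```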

Add Organization-Specific Prompts in Settings

Use the organization settings to add custom instructions that apply to every query. This helps the LLM understand your specific business context and conventions. To do so, go to Settings → Org Info → Data analysis guide.

Examples of effective org-specific prompts:

Data quirks:

"The 'created_at' column in the orders table uses PST timezone. Always convert to UTC for comparisons."

Reporting conventions:

"When analyzing revenue, always exclude test accounts (account_id starting with 'test_') and internal employees (email domain '@company.com')."

Preferred metrics:

"For growth analysis, prioritize net revenue retention (NRR) over gross retention. Our board focuses on NRR as the primary health metric."

For Admins: Review User/LLM Interactions Regularly

As an admin, periodically review the activity log to see where users and the LLM are getting confused. This helps you identify gaps in your context that need to be addressed.

What to look for:

  • 🔍 Repeated questions: If users ask the same thing multiple times, create a saved query or metric definition
  • ⚠️ Failed queries: Look for patterns in query failures—do you need better table descriptions or business concepts?
  • 🤔 Confusion around metrics: If the LLM misinterprets a metric, add a clear definition or update your org-specific prompts
  • 📋 Common domains: Identify frequently queried areas and ensure those tables have comprehensive context

Putting It All Together

The key to maximizing ContextFlo is treating it like a knowledge management system, not just a query tool. The more you invest in maintaining context quality—through accurate table descriptions, saved queries, metrics, and org-specific guidance—the more valuable it becomes for your entire team.

These best practices compound over time. Each saved query makes future analyses faster. Each refined table description prevents confusion. Each org-specific prompt reduces back-and-forth. The teams that get the most value from ContextFlo are the ones that treat it as a living knowledge base that evolves with their needs.

Ready to get the most out of ContextFlo?

Review your activity log to see where queries are getting stuck. Update your org-specific prompts with conventions the LLM keeps missing. Save queries your team asks repeatedly. Your future self (and your teammates) will thank you.