Merged
38 changes: 38 additions & 0 deletions apps/backend/src/lib/ai/prompts.ts
@@ -102,6 +102,44 @@ SQL QUERY GUIDELINES:
- Recent signups: SELECT * FROM users ORDER BY signed_up_at DESC LIMIT 10
- Events today: SELECT COUNT(*) FROM events WHERE toDate(event_at) = today()
- Event types: SELECT event_type, COUNT(*) as count FROM events GROUP BY event_type ORDER BY count DESC LIMIT 10

TOOL RESULT BUDGET (HARD LIMIT):
- The queryAnalytics tool returns { success: false } if the result JSON exceeds 50,000 characters.
NO ROWS reach you in that case — you get { success: false, error, rowCount, characters, columnsReturned }
and you MUST re-query with a more specific SQL statement.
- The events.data JSON blob typically triples per-row cost. Never SELECT * on events unless you have
  a very small LIMIT and truly need every column.

PREFER AGGREGATION OVER RAW ROWS:
For "how many", "top N", "distribution", "unique count", "average", "over time" questions,
push the math into SQL using ClickHouse functions. Examples:

Count: SELECT COUNT(*) FROM events WHERE event_type='$token-refresh' AND event_at >= today()
Distinct count: SELECT uniqExact(user_id) FROM events WHERE event_at >= today() - INTERVAL 7 DAY
Top N: SELECT user_id, COUNT(*) AS c FROM events GROUP BY user_id ORDER BY c DESC LIMIT 10
Quantiles: SELECT quantile(0.5)(amount), quantile(0.95)(amount) FROM events
Time bucketing: SELECT toStartOfHour(event_at) AS bucket, COUNT(*) AS c FROM events
WHERE event_at >= now() - INTERVAL 1 DAY GROUP BY bucket ORDER BY bucket
JSON key discovery: SELECT arrayJoin(JSONExtractKeys(data)) AS k, COUNT(*) AS c FROM events
GROUP BY k ORDER BY c DESC LIMIT 20
Multi-metric: SELECT COUNT(*), uniqExact(user_id), min(event_at), max(event_at)
FROM events WHERE event_type='$token-refresh'

WHEN INDIVIDUAL ROWS MATTER (user explicitly asked to see records):
- ALWAYS use LIMIT <= 50.
- ALWAYS specify the exact columns you need — never SELECT * on events.
- Drop the 'data' column unless the user specifically asked about event payloads.

GROUP BY REQUIRES ORDER BY + LIMIT unless you expect <= 50 groups; otherwise the result may
exceed the 50,000-character budget and fail.

HANDLING { success: false } ERRORS:
When the tool returns success:false with "Result too large":
1. Read rowCount — if it's large (>100), switch to aggregation (COUNT, uniqExact, GROUP BY...).
2. Read columnsReturned — if it includes 'data', re-query without it.
3. Re-query with a narrower WHERE clause or a smaller LIMIT.
4. Do NOT present the error to the user — fix the query and try again.
5. Do NOT claim you saw rows that you didn't — the error response contains no row data.
`,
"docs-ask-ai": `
# Stack Auth AI Assistant System Prompt
15 changes: 15 additions & 0 deletions apps/backend/src/lib/ai/tools/sql-query.ts
@@ -37,6 +37,21 @@ export function createSqlQueryTool(auth: SmartRequestAuth | null, targetProjectI
})
.then(async (resultSet) => {
const rows = await resultSet.json<Record<string, unknown>[]>();
const serialized = JSON.stringify(rows);
if (serialized.length > 50_000) {
return {
success: false as const,
error:
`Result too large: ${rows.length} rows, ${serialized.length} characters (limit 50000). ` +
`To fix: ` +
`(1) Use aggregation (COUNT, uniqExact, GROUP BY, topK, quantile) instead of fetching rows. ` +
`(2) If you need rows, add a WHERE clause or reduce LIMIT. ` +
`(3) Select only the columns you need — avoid the 'data' column on events unless essential.`,
rowCount: rows.length,
characters: serialized.length,
columnsReturned: rows.length > 0 ? Object.keys(rows[0]) : [],
};
}
return {
success: true as const,
rowCount: rows.length,
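The guard added in sql-query.ts follows a simple pattern: serialize once, compare against the budget, and return diagnostics instead of rows. A standalone sketch of that pattern (the helper name `capResult` is invented for illustration; the real code runs inline in the tool):

```typescript
const BUDGET = 50_000;

// Returns the rows if their JSON fits the character budget, otherwise a
// diagnostics-only failure object, mirroring the tool's behavior.
function capResult(rows: Record<string, unknown>[]) {
  const serialized = JSON.stringify(rows);
  if (serialized.length > BUDGET) {
    return {
      success: false as const,
      rowCount: rows.length,
      characters: serialized.length,
      // Column names come from the first row, so the model can spot
      // an expensive 'data' column without ever seeing row contents.
      columnsReturned: rows.length > 0 ? Object.keys(rows[0]) : [],
    };
  }
  return { success: true as const, rowCount: rows.length, rows };
}
```

Serializing before the size check means the cost is paid once either way — the success path would need the same JSON to send back to the model.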
28 changes: 26 additions & 2 deletions apps/dashboard/src/components/commands/ask-ai.tsx
@@ -5,7 +5,7 @@ import { getPublicEnvVar } from "@/lib/env";
import { useChat, type UIMessage } from "@ai-sdk/react";
import { ArrowSquareOutIcon, CaretDownIcon, CheckIcon, CopyIcon, DatabaseIcon, PaperPlaneTiltIcon, SparkleIcon, SpinnerGapIcon, UserIcon } from "@phosphor-icons/react";
import { useUser } from "@stackframe/stack";
import { throwErr } from "@stackframe/stack-shared/dist/utils/errors";
import { captureError, throwErr } from "@stackframe/stack-shared/dist/utils/errors";
import { runAsynchronously } from "@stackframe/stack-shared/dist/utils/promises";
import { convertToModelMessages, DefaultChatTransport } from "ai";
import { usePathname } from "next/navigation";
@@ -480,6 +480,23 @@ function getToolInvocations(message: UIMessage): ToolInvocationPart[] {
.map((part) => part as unknown as ToolInvocationPart);
}

// Classifies raw AI provider errors into user-friendly messages.
// The raw error is captured to Sentry separately via captureError — never shown to the user.
function getFriendlyAiErrorMessage(error: Error): string {
const causeMessage = (error as { cause?: { message?: string } }).cause?.message ?? "";
const blob = `${error.message} ${causeMessage}`;
if (/maximum context length|context_length_exceeded|too many tokens|context length/i.test(blob)) {
return "The conversation got too long. Try starting a new chat or asking a more focused question.";
}
if (/rate limit|429|quota|too many requests/i.test(blob)) {
return "Service is busy. Please try again in a moment.";
}
if (/timeout|ECONNRESET|fetch failed|network/i.test(blob)) {
return "Request timed out. Please try again.";
}
return "Something went wrong. Please try again.";
}

// Word streaming hook - handles the progressive word reveal animation
function useWordStreaming(content: string) {
const [displayedWordCount, setDisplayedWordCount] = useState(0);
@@ -575,6 +592,13 @@ const AIChatPreviewInner = memo(function AIChatPreview({

const aiLoading = status === "submitted" || status === "streaming";

// Log the raw AI error once per error (Sentry captures the original message)
useEffect(() => {
if (aiError) {
captureError("ask-ai", aiError);
}
}, [aiError]);

// Send initial query on mount (once) with debounce
useDebouncedAction({
action: async () => {
@@ -732,7 +756,7 @@ const AIChatPreviewInner = memo(function AIChatPreview({
{aiError && (
<div className="flex items-start gap-2 text-[12px] text-red-400/90 px-3 py-2 bg-red-500/[0.08] rounded-lg ring-1 ring-red-500/20">
<span className="shrink-0 mt-0.5">⚠</span>
<span>{aiError.message || "Failed to get response. Please try again."}</span>
<span>{getFriendlyAiErrorMessage(aiError)}</span>
</div>
)}
</div>