This folder contains the server-side AI integration for CV optimization and HTML translation.
- `ai-provider-service.ts`: provider-neutral facade used by `/api/upload` and `/api/translate`.
- `ai-provider-config.ts`: reads env vars, validates provider selection, resolves base URLs, and exposes the shared runtime config.
- `ai-provider-openai-compatible-client.ts`: runs the shared OpenAI-compatible `chat.completions` requests.
- `ai-provider-openai-compatible-request.ts`: builds shared prompts, messages, PDF file parts, and optional structured output settings.
- `ai-provider-gemini-native-pdf.ts`: keeps Gemini's native Files API PDF flow behind the generic facade.
- `pdf-processing/README.md`: documents native PDF processing, extraction quality gates, and fallback rules.
- `cv-optimizer-system-prompt.ts`: provider-neutral system prompt persona for CV optimization.
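A minimal sketch of the facade idea, assuming hypothetical names (the real exports in `ai-provider-service.ts` likely differ): route handlers call one provider-neutral function, and the per-provider clients stay internal.

```typescript
// Sketch only: provider-neutral dispatch. All names here are illustrative
// assumptions, not the module's actual exports.
type Provider = "openai" | "gemini" | "openrouter";

interface ChatClient {
  complete(prompt: string): Promise<string>;
}

// Stubs standing in for the real OpenAI-compatible and Gemini clients.
const clients: Record<Provider, ChatClient> = {
  openai: { complete: async (p) => `openai answered: ${p}` },
  gemini: { complete: async (p) => `gemini answered: ${p}` },
  openrouter: { complete: async (p) => `openrouter answered: ${p}` },
};

// Route handlers call this one function; the provider choice stays internal.
async function runCompletion(provider: Provider, prompt: string): Promise<string> {
  return clients[provider].complete(prompt);
}
```

Keeping the dispatch behind one function is what lets the routes stay provider-agnostic.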
Use `.env` or server env vars only. Supported values for the provider selector:

- `AI_PROVIDER=openai`
- `AI_PROVIDER=gemini`
- `AI_PROVIDER=openrouter`
Required variables:
- `AI_PROVIDER`
- `AI_PROVIDER_API_KEY`
- `AI_PROVIDER_MODEL`
Optional variables:
- `AI_PROVIDER_BASE_URL`
- `AI_TEMPERATURE`
- `AI_MAX_OUTPUT_TOKENS`
- `AI_TOP_P`
- `AI_OPENROUTER_HTTP_REFERER`
- `AI_OPENROUTER_TITLE`
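For example, a minimal `.env` for the Gemini provider might look like the following (the model name and tuning values are placeholders, not recommendations):

```env
AI_PROVIDER=gemini
AI_PROVIDER_API_KEY=your-api-key
AI_PROVIDER_MODEL=your-model-name

# optional tuning
AI_TEMPERATURE=0.2
AI_MAX_OUTPUT_TOKENS=4096
```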
```mermaid
flowchart TD
    Upload["/api/upload or /api/translate"] --> Facade["ai-provider-service.ts"]
    Facade --> Shared["OpenAI-compatible chat builder/client"]
    Facade --> GeminiPdf["Gemini native PDF adapter"]
    Facade --> PdfFallback["Validated PDF text fallback workflow"]
    Shared --> Providers["OpenAI | Gemini OpenAI endpoint | OpenRouter"]
    GeminiPdf --> Gemini["Gemini Files API + generateContent"]
    PdfFallback --> PdfExtract["Page-by-page extraction + quality gates"]
    PdfExtract --> Providers
```
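The "quality gates" node in the diagram could be sketched like this; the threshold and the per-page heuristic are assumptions for illustration, not the project's actual rules (those live in `pdf-processing/README.md`):

```typescript
// Illustrative quality gate for extracted PDF text. The threshold and
// heuristic here are assumptions, not the project's real validation rules.
interface PageText {
  page: number; // 1-based page index
  text: string; // text extracted from that page
}

// Reject extraction when any page yields suspiciously little text,
// which usually indicates a scanned page that would need OCR.
function passesQualityGate(pages: PageText[], minCharsPerPage = 40): boolean {
  if (pages.length === 0) return false;
  return pages.every((p) => p.text.trim().length >= minCharsPerPage);
}
```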
- Browser translation still runs first in the client; the server provider is only the fallback.
- The remote translation fallback reuses `AI_PROVIDER_MODEL`; there is no separate translation model env var anymore.
- Gemini-specific naming from the previous integration was removed from the public service layer.
- PDF handling is provider-aware:
  - `gemini`: native Files API branch first, then validated text fallback when the failure is explicitly file/PDF-related.
  - `openai` and `openrouter`: OpenAI-compatible file content parts first, then validated text fallback when the failure is explicitly file/PDF-related.
- Generic provider failures do not trigger PDF fallback.
- If extracted PDF text looks incomplete or unreliable, the workflow fails with an explicit no-OCR error instead of sending partial text to the LLM.
- There is no provider fallback chain. Missing env vars or unsupported models return explicit errors.
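The fallback rules above can be sketched roughly as follows; the error-classification helper and function names are assumptions, not the actual implementation:

```typescript
// Sketch of the provider-aware PDF fallback rules; names are illustrative.
class PdfTextQualityError extends Error {} // explicit no-OCR failure

// Assumed helper: true only for errors explicitly tied to file/PDF handling.
// The real classification is more precise than this message check.
function isPdfRelated(err: unknown): boolean {
  return err instanceof Error && /pdf|file/i.test(err.message);
}

async function processPdf(
  nativeBranch: () => Promise<string>, // Files API or file content parts
  textFallback: () => Promise<string>, // validated page-by-page extraction
): Promise<string> {
  try {
    return await nativeBranch();
  } catch (err) {
    // Generic provider failures do NOT trigger the PDF fallback.
    if (!isPdfRelated(err)) throw err;
    return textFallback(); // may itself throw PdfTextQualityError (no OCR)
  }
}
```

The key design point is that only explicitly file/PDF-related failures reach the fallback branch; everything else surfaces as an explicit error.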