// Package copilotproxy provides an OpenAI-compatible API wrapper for the GitHub Copilot API.
//
// This application serves as a bridge between standard OpenAI API clients and GitHub Copilot's
// proprietary API, allowing users to leverage their existing GitHub Copilot subscription through
// familiar OpenAI-style interfaces.
//
// # Architecture Overview
//
// The application follows a clean architecture pattern with the following components:
//
// - app: Core application logic and server implementation
// - auth: Authentication and authorization mechanisms
// - llm: Language model integration and API handling
// - rpc: Remote procedure call handling for client-server communication
// - models: Data structures for API requests and responses
// - utils: Helper functions and utilities
//
// # Authentication System
//
// Multiple authentication methods are supported:
// 1. Direct Copilot API keys
// 2. GitHub OAuth tokens (automatically exchanged for Copilot API keys)
// 3. Local VS Code Copilot configuration (automatically detected)
//
// The application can also run with authentication disabled via the
// --disable-auth flag, which is intended for local testing and development.
// When running with --disable-auth:
// - No API keys or tokens are required in requests
// - All requests are processed with administrative privileges
// - All OpenAI-compatible endpoints are available
//
// For production use, it's recommended to set the LLM_API_SECRET environment variable
// to a secure random value and require proper authentication.
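//
// As an illustrative sketch only (the header name, port, and validation
// behavior are assumptions, not confirmed details of this package), a client
// holding the shared secret would attach it as a Bearer token:
//
//	req, _ := http.NewRequest(http.MethodPost,
//		"http://localhost:8080/v1/chat/completions", body)
//	req.Header.Set("Authorization", "Bearer "+os.Getenv("LLM_API_SECRET"))
//	resp, err := http.DefaultClient.Do(req)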
//
// # Feature Highlights
//
// - OpenAI-compatible endpoint for chat completions
// - Support for streaming responses
// - Automatic token refresh
// - Rate limiting
// - VS Code Copilot extension monitoring
// - Comprehensive CLI options
//
// # Copilot Integration
//
// The application integrates with GitHub Copilot using the following mechanisms:
// 1. Direct API key usage for authentication
// 2. GitHub OAuth token exchange for API keys
// 3. Lazy authentication and model list caching
// - Fetches and caches the Copilot API key and available model list on first request
// - Refreshes both only after a 30-minute TTL expires
// - An idle service does not poll GitHub; credentials and the model list are fetched only on demand
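//
// The lazy refresh above can be sketched as a timestamped cache; the type and
// field names here are illustrative, not the package's actual identifiers:
//
//	type copilotCache struct {
//		apiKey    string
//		models    []string
//		fetchedAt time.Time
//	}
//
//	func (c *copilotCache) stale() bool {
//		return time.Since(c.fetchedAt) > 30*time.Minute
//	}
//
// A request handler would check stale() first, and only then re-exchange the
// GitHub token and refetch the model list.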
//
// # Getting Started
//
// To use this package as a library:
//
// import "github.com/anschmieg/copilot-proxy/internal/llm"
//
// service := llm.NewCopilotService(config)
// response, err := service.GetChatCompletions(request)
//
// To run as a standalone server:
//
// go run cmd/main.go
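//
// Once the server is up, any OpenAI-style client can point at it. A hedged
// example (the base URL and port are assumptions; the endpoint path follows
// the standard OpenAI chat completions convention):
//
//	body := strings.NewReader(`{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}`)
//	resp, err := http.Post("http://localhost:8080/v1/chat/completions", "application/json", body)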
//
// # API Reference
//
// For detailed API documentation, see docs/copilot-api.md
//
// # Version History
//
// Current API Version: 2025-04-01
//
// See CHANGELOG.md for version history details.
package copilotproxy