# WireMock Chat Backend Sample App

A CDK application demonstrating integration with WireMock-mocked OpenAI APIs running in LocalStack.

## Overview

This sample app deploys a serverless chat backend using:

- **AWS Lambda** - Handles API requests
- **Amazon API Gateway** - REST API endpoints
- **WireMock** - Mocks OpenAI API responses

### API Endpoints

| Method | Endpoint | Description              |
|--------|----------|--------------------------|
| GET    | /models  | List available AI models |
| POST   | /chat    | Send a chat message      |

### WireMock OpenAI Endpoints Used

- `GET /models` - Returns list of available models
- `POST /chat/completions` - Chat completion response

## Prerequisites

- [LocalStack](https://localstack.cloud/) installed
- [Node.js](https://nodejs.org/) 18+ installed
- [AWS CDK Local](https://github.com/localstack/aws-cdk-local) (`npm install -g aws-cdk-local`)
- WireMock extension configured with OpenAI stubs

## Setup

### 1. Start LocalStack with WireMock Extension

To use the OpenAI mock, [create the mock API in WireMock Cloud](https://app.wiremock.cloud/mock-apis/create-flow) and pull its configuration locally, following the instructions in the [WireMock README](../README.md).

Once pulled, the configuration will be in the `.wiremock` directory. Point LocalStack at it when starting:

```bash
LOCALSTACK_WIREMOCK_API_TOKEN="<your-wiremock-api-token>" \
LOCALSTACK_WIREMOCK_CONFIG_DIR="/path/to/.wiremock" \
localstack start
```

### 2. Install Dependencies

```bash
cd wiremock/sample-app-runner
npm install
```

### 3. Bootstrap CDK

```bash
cdklocal bootstrap
```

### 4. Deploy the Stack

```bash
cdklocal deploy
```

After deployment, you'll see output similar to:

```
Outputs:
WiremockChatStack.ApiEndpoint = https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev/
WiremockChatStack.ChatApiEndpoint = https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev/chat
WiremockChatStack.ModelsEndpoint = https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev/models
```

## Usage

### List Available Models

```bash
curl https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev/models
```

### Send a Chat Message

```bash
curl -X POST https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, how are you?"}'
```
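The same calls can also be made from Node.js 18+ using the built-in `fetch`. The sketch below is a hypothetical client, not part of the sample app: the `CHAT_API_BASE` environment variable and function names are assumptions, and `<api-id>` stays a placeholder until you substitute the value from your CDK outputs.

```javascript
// Hypothetical Node.js 18+ client for the deployed chat backend.
// Replace <api-id> with the value shown in the CDK deployment outputs.
const API_BASE =
  process.env.CHAT_API_BASE ||
  "https://<api-id>.execute-api.localhost.localstack.cloud:4566/dev";

// Build the fetch options for POST /chat; kept as a pure function so the
// request shape is easy to inspect and test without a network call.
function chatRequestInit(message) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
  };
}

// List the mocked models, then send a chat message (run after deployment).
async function main() {
  const models = await (await fetch(`${API_BASE}/models`)).json();
  console.log("models:", models);

  const answer = await (
    await fetch(`${API_BASE}/chat`, chatRequestInit("Hello, how are you?"))
  ).json();
  console.log("answer:", answer);
}
```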

## Testing

```bash
npm test
```

## How It Works

1. **GET /models**: Lambda fetches the list of available models from WireMock's `/models` endpoint and returns them.

2. **POST /chat**: Lambda calls WireMock's `/chat/completions` endpoint with the user's message and returns the AI response along with usage statistics.
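The two steps above can be sketched roughly as follows. This is an illustrative outline, not the sample app's actual source: the handler and helper names, the `OPENAI_BASE_URL` environment variable, and the model id are assumptions; only the two WireMock endpoints come from the description above.

```javascript
// Illustrative sketch of the two Lambda handlers (Node.js 18+, built-in fetch).
// Base URL of the WireMock-mocked OpenAI API; assumed to arrive via env var.
const OPENAI_BASE_URL = process.env.OPENAI_BASE_URL || "http://localhost:4566";

// Extract just the model ids from an OpenAI-style GET /models response
// ({ object: "list", data: [{ id: ... }, ...] }).
function listModelIds(modelsResponse) {
  return modelsResponse.data.map((m) => m.id);
}

// Shape the user's message into an OpenAI-style chat completion request.
function buildCompletionRequest(message) {
  return {
    model: "gpt-4o-mini", // whichever model the WireMock stubs answer for
    messages: [{ role: "user", content: message }],
  };
}

// Pull the assistant reply and usage statistics out of a completion response.
function extractReply(completion) {
  return {
    reply: completion.choices[0].message.content,
    usage: completion.usage,
  };
}

// GET /models (exported as the Lambda handler in the real module).
async function modelsHandler() {
  const res = await fetch(`${OPENAI_BASE_URL}/models`);
  return { statusCode: 200, body: JSON.stringify(listModelIds(await res.json())) };
}

// POST /chat.
async function chatHandler(event) {
  const { message } = JSON.parse(event.body);
  const res = await fetch(`${OPENAI_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCompletionRequest(message)),
  });
  return { statusCode: 200, body: JSON.stringify(extractReply(await res.json())) };
}
```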