This guide explains how to run tests locally and in CI/CD.
```bash
# Install test dependencies
uv sync --group test

# Start PostgreSQL test database in Docker
./scripts/setup_test_db.sh
```

This will:
- Start PostgreSQL 16 in Docker on port 5433
- Create the `devbin_test` database
- Wait for the database to be ready
```bash
# Run all tests
uv run pytest

# Run only unit tests (no database required)
uv run pytest tests/unit/ -v

# Run with coverage report
uv run pytest --cov=app --cov-report=html

# Run in parallel (faster)
uv run pytest -n auto

# Run specific test file
uv run pytest tests/unit/test_token_utils.py -v
```

Tests are organized by type:

```
tests/
├── unit/         # Fast tests, no external dependencies
├── integration/  # Tests with database and file system
├── api/          # Full API endpoint tests
└── security/     # Security-focused tests
```
```bash
# Start test database
docker-compose -f docker-compose.test.yml up -d

# Stop test database
docker-compose -f docker-compose.test.yml down

# Clean up database and volumes
docker-compose -f docker-compose.test.yml down -v

# View logs
docker-compose -f docker-compose.test.yml logs -f
```

Test database connection details:

- Host: localhost
- Port: 5433 (to avoid conflicts with the dev database on 5432)
- Database: devbin_test
- User: postgres
- Password: postgres
- Connection string: `postgresql://postgres:postgres@localhost:5433/devbin_test`
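If the connection details above don't seem to work, a quick stdlib check can confirm whether anything is listening on the test port at all. This is just a sketch; `is_port_open` is a throwaway helper, not part of the project:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if is_port_open("localhost", 5433):
        print("Test database port is reachable")
    else:
        print("Cannot reach localhost:5433 -- run ./scripts/setup_test_db.sh")
```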
Tests use environment variables from pytest.ini by default:

```ini
APP_DATABASE_URL = postgresql+asyncpg://postgres:postgres@localhost:5433/devbin_test
APP_BASE_FOLDER_PATH = /tmp/devbin_test_files
APP_DEBUG = true
APP_ALLOW_CORS_WILDCARD = true
```

You can override these by setting environment variables before running tests:

```bash
export APP_DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/my_test_db
uv run pytest
```

The test database is automatically configured in .github/workflows/test.yml:
```yaml
services:
  postgres:
    image: postgres:16
    env:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: devbin_test
    ports:
      - 5432:5432
```

In CI, tests use port 5432 (the default PostgreSQL port in the service container):

```bash
# CI sets APP_DATABASE_URL to use the service container
pytest -v --cov=app --cov-report=xml
coverage report --fail-under=80
```

Unit tests (`pytest tests/unit/ -m unit`):
- No database or file system
- Mock external dependencies
- < 1ms per test
- Test utilities, validators, pure functions
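A unit test in this style might look like the following sketch; `slugify` is a hypothetical pure helper standing in for the real utilities under `app/utils/`:

```python
import pytest

# Hypothetical pure helper -- stands in for functions in app/utils/
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

@pytest.mark.unit
def test_slugify_joins_words_with_hyphens():
    assert slugify("My First Paste") == "my-first-paste"

@pytest.mark.unit
def test_slugify_handles_extra_whitespace():
    assert slugify("  hello   world ") == "hello-world"
```

No database, no mocks, sub-millisecond runtime: exactly the profile described above.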
Integration tests (`pytest tests/integration/ -m integration`):
- Real database with transaction rollback
- File system operations (temp directories)
- 10-100ms per test
- Test service layer
API tests (`pytest tests/api/`):
- Full HTTP request/response cycle
- All middleware included
- 50-200ms per test
- Test endpoints, rate limiting, caching
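The full request/response cycle can be illustrated with a stdlib stand-in server; the real API tests drive the application itself through its test client, so the handler and `/health` endpoint below are purely illustrative:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Tiny stand-in server -- the real tests exercise the app with all middleware.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

def test_health_endpoint_returns_ok():
    server = HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        port = server.server_address[1]
        with urlopen(f"http://127.0.0.1:{port}/health") as resp:
            assert resp.status == 200
            assert json.loads(resp.read()) == {"status": "ok"}
    finally:
        server.shutdown()

test_health_endpoint_returns_ok()
```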
```bash
uv run pytest --cov=app --cov-report=html
open htmlcov/index.html  # or xdg-open on Linux
```

Coverage targets:
- Overall: 80%+ (enforced in CI)
- Critical modules: 90%+
  - `app/services/paste_service.py`
  - `app/utils/token_utils.py`
  - `app/api/subroutes/pastes.py`
Problem: connection refused or could not connect to server
Solution:
- Check if the test database is running: `docker ps | grep devbin_test`
- Start the database: `./scripts/setup_test_db.sh`
- Check database logs: `docker-compose -f docker-compose.test.yml logs`
Problem: Port 5433 is already in use
Solution:
- Change the port in `docker-compose.test.yml`
- Update `pytest.ini` to match
- Restart the database
Problem: Tests pass sometimes, fail other times (flaky tests)
Solution:
- Check if tests are properly isolated (no shared state)
- Verify database cleanup between tests
- Check for timing issues (use `freezegun` for time-based tests)
Problem: Tests take too long to run
Solution:
- Run tests in parallel: `pytest -n auto`
- Run only unit tests: `pytest tests/unit/`
- Run a specific test file instead of the entire suite
- Check for N+1 query issues in integration tests
- Use descriptive names: `test_create_paste_with_valid_data_returns_200`
- Test one thing: Each test should verify one specific behavior
- Use fixtures: Reuse common setup via pytest fixtures
- Clean up: Tests should not leave artifacts (files, DB records)
- Mark tests: Use `@pytest.mark.unit` or `@pytest.mark.integration`
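Custom marks like these generally need to be registered so pytest doesn't warn about (or, with `--strict-markers`, reject) unknown marks. A sketch of what that registration might look like in pytest.ini; the descriptions are illustrative, not this project's exact configuration:

```ini
[pytest]
markers =
    unit: fast tests with no external dependencies
    integration: tests that touch the database or file system
```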
- Use `faker` for realistic test data (IPs, user agents, names)
- Use the `sample_paste_data` fixture for consistent paste creation
- Create factory functions for complex test objects
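A factory function of the kind suggested above can be sketched as follows; the field names are assumptions for illustration, not the app's actual paste schema:

```python
from typing import Any

# Illustrative factory -- field names are assumptions, not the real schema.
def make_paste_data(**overrides: Any) -> dict[str, Any]:
    """Build a valid paste payload, letting tests override only what they care about."""
    data = {
        "title": "Example paste",
        "content": "print('hello')",
        "language": "python",
        "ttl_hours": 24,
    }
    data.update(overrides)
    return data

# Tests stay readable: only the field under test is spelled out.
expired = make_paste_data(ttl_hours=0)
assert expired["ttl_hours"] == 0
assert expired["title"] == "Example paste"
```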
- Mock external services: Time, disk usage checks, network calls
- Use real implementations: Database, file system (with temp dirs)
- Mock sparingly: Real implementations catch more bugs
Tests run automatically on:
- Push to `master` or `develop`
- Pull requests

For a pull request to merge:
- All tests must pass
- Coverage must not decrease
- Coverage must be >= 80%
- Linting must pass (ruff)
To view CI results:
- Go to the GitHub Actions tab
- Click on the latest workflow run
- View test results and coverage report