The easiest way to contribute an API spec is through the registry publisher page. You can also use the CLI:

```shell
# Compile and publish in one step
lapsh publish my-api.yaml --provider acme
```

For local development and testing:
1. Find the OpenAPI spec for the API you want to add. Good sources:
   - APIs.guru -- large directory of OpenAPI specs
   - The API provider's developer docs (often link to their spec)
   - GitHub repos (search for `openapi.yaml` or `swagger.json`)

2. Add it to `examples/verbose/openapi/`:

   ```shell
   cp ~/downloaded-spec.yaml examples/verbose/openapi/my-api.yaml
   ```

3. Compile:

   ```shell
   lapsh compile examples/verbose/openapi/my-api.yaml -o examples/lap/openapi/my-api.lap
   lapsh compile examples/verbose/openapi/my-api.yaml -o examples/lap/openapi/my-api.lean.lap --lean
   ```

4. Fix any warnings -- if the compiler emits warnings about malformed fields, the source spec may need cleanup, or the compiler may need a fix.
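Before compiling, it can help to sanity-check that the downloaded file really is an OpenAPI document. A minimal sketch for a JSON spec (the spec content below is made up; for YAML the same check works via `yaml.safe_load`):

```python
import json

# Made-up minimal spec standing in for a downloaded swagger.json / openapi.json
raw = '{"openapi": "3.0.0", "info": {"title": "My API", "version": "1.0"}, "paths": {}}'

spec = json.loads(raw)

# OpenAPI 3.x documents carry an "openapi" version field at the root;
# Swagger 2.0 documents carry "swagger" instead.
version = spec.get("openapi") or spec.get("swagger")
assert version is not None, "not an OpenAPI/Swagger document"
print(version)
```

If this check fails, `lapsh compile` warnings are likely the source spec's fault rather than the compiler's.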
The compiler lives in `lap/core/` and consists of:

| File | Purpose |
|---|---|
| `lap/core/compilers/openapi.py` | OpenAPI → LAP compilation |
| `lap/core/parser.py` | LAP text → AST parsing |
| `lap/core/formats/lap.py` | LAP data structures and serialization |
| `lap/core/converter.py` | LAP → OpenAPI conversion |
| `lap/core/differ.py` | LAP diff engine |
| `lap/core/utils.py` | Token counting, file I/O |
- Better type inference -- improve how OpenAPI schemas map to LAP types
- Handle edge cases -- polymorphic schemas (`oneOf`/`anyOf`), deeply nested objects
- Parser robustness -- handle malformed input gracefully
- New directives -- propose new `@directive` syntax for missing features
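To give a flavor of the `oneOf`/`anyOf` edge case, here is a hypothetical sketch of union-type inference (the function name and output format are made up for illustration; the real mapping lives in `lap/core/compilers/openapi.py`):

```python
def infer_type(schema: dict) -> str:
    """Map a small subset of OpenAPI schema to a readable type string.
    Hypothetical sketch, not the actual LAP type system."""
    # Polymorphic schemas become a union of their variants.
    if "oneOf" in schema or "anyOf" in schema:
        variants = schema.get("oneOf") or schema.get("anyOf")
        return " | ".join(infer_type(v) for v in variants)
    # Arrays recurse into their item schema.
    if schema.get("type") == "array":
        return f"[{infer_type(schema.get('items', {}))}]"
    if schema.get("type") == "object":
        return "object"
    return schema.get("type", "any")

print(infer_type({"oneOf": [{"type": "string"}, {"type": "integer"}]}))
# → string | integer
```

The recursion is what makes deeply nested objects tricky: each level multiplies the variants a real compiler has to represent compactly.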
```shell
# Run tests
pytest tests/ -v
pytest integrations/test_integrations.py -v

# Test compilation on all specs
lapsh benchmark-all examples/verbose/openapi/

# Inspect a specific spec
lapsh inspect examples/verbose/openapi/stripe-charges.yaml
```

Integrations live in `integrations/`:
| Path | Framework |
|---|---|
| `integrations/langchain/` | LangChain |
| `integrations/crewai/lap_tool.py` | CrewAI |
| `integrations/mcp/lap_mcp_server.py` | MCP |
1. Create `integrations/<framework>/` with:
   - `__init__.py`
   - Main integration file
   - Example usage

2. Graceful degradation -- the integration must work without the framework installed (use try/except imports with stubs). See `integrations/crewai/lap_tool.py` for the pattern.

3. Add tests in `integrations/test_integrations.py`

4. Document it in `docs/guide-integrate.md`
```python
"""<Framework> integration for LAP."""
from pathlib import Path

from lap.core.parser import parse_lap
from lap.core.utils import read_file_safe

# Graceful degradation
try:
    from <framework> import SomeBase
    _HAS_FRAMEWORK = True
except ImportError:
    _HAS_FRAMEWORK = False

    class SomeBase:
        pass


class LAP<Framework>Tool(SomeBase):
    """LAP tool for <Framework>."""

    def __init__(self, specs_dir: str):
        self.specs_dir = Path(specs_dir)

    def _run(self, api_name: str, **kwargs) -> str:
        # Load and return LAP content
        ...
```

- Python 3.10+ with type hints
- Dataclasses for data structures
- No external dependencies in core (only `pyyaml`, `tiktoken`, `rich`)
- Framework dependencies are optional (extras)
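To see the degraded path of the skeleton in action, here is a self-contained stand-in (all names are hypothetical; no framework or LAP imports required):

```python
from pathlib import Path


class SomeBase:
    """Stub base class, as in the skeleton's except-ImportError branch."""


class LAPDemoTool(SomeBase):
    """Minimal stand-in for a LAP<Framework>Tool."""

    def __init__(self, specs_dir: str):
        self.specs_dir = Path(specs_dir)

    def _run(self, api_name: str, **kwargs) -> str:
        # The real tool would read <specs_dir>/<api_name>.lap here.
        return f"would load {self.specs_dir / (api_name + '.lap')}"


tool = LAPDemoTool("examples/lap/openapi")
print(tool._run("my-api"))
```

Because the stub base carries no behavior, the class definition and construction succeed even when the framework is absent -- which is exactly what the integration tests rely on.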
```shell
# All tests
pytest tests/ -v

# Integration tests
pytest integrations/test_integrations.py -v
```

- Compiler -- every spec in `examples/verbose/` should compile without errors
- Parser -- round-trip: compile → parse → serialize should be stable
- Validator -- 100% endpoint/parameter/error preservation
- Integrations -- basic load/query tests, with and without framework installed
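For the "without framework installed" half of the integration tests, one common trick is to block the import inside the test itself; a sketch, with a stand-in module name:

```python
import sys

# Setting a sys.modules entry to None makes a later `import <name>`
# raise ImportError, simulating "framework not installed".
sys.modules["some_framework"] = None

try:
    import some_framework  # noqa: F401
    has_framework = True
except ImportError:
    has_framework = False

# The integration's graceful-degradation branch should now be active.
assert has_framework is False
```

This avoids needing two separate environments to cover both code paths in CI.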
- Tests pass (`pytest tests/ -v`)
- All specs still compile (`lapsh benchmark-all examples/verbose/openapi/`)
- No new warnings on existing specs
- Documentation updated if adding features
- Type hints on public API