# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

LinkedDataHub (LDH) is a low-code Knowledge Graph application platform that enables managing data, creating visualizations, and building apps on RDF Knowledge Graphs. It's a completely data-driven platform where applications and documents are defined as data, managed using a single generic HTTP API, and presented using declarative technologies.

## Build System and Development Commands

LinkedDataHub uses Maven as the primary build system with Docker for containerization.

### Development Setup
```bash
# Initial setup (requires .env file configuration)
./bin/server-cert-gen.sh .env nginx ssl
docker-compose up --build
```

### Core Build Commands
```bash
# Maven build (Java 17 required)
mvn clean install

# Build specific profiles
mvn -Pstandalone clean install # Standalone WAR
mvn -Pdependency clean install # JAR dependency
mvn -Prelease clean install # Release with signing

# Docker-based development
docker-compose up --build # Start all services
docker-compose down -v # Stop and remove volumes
sudo rm -rf data uploads && docker-compose down -v # Complete reset
```

### Testing
```bash
# HTTP tests (require a running application)
cd http-tests
./run.sh ssl/owner/cert.pem [password] ssl/secretary/cert.pem [password]

# Test individual suites
find ./document-hierarchy/ -name '*.sh' -exec bash {} \;
```
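
Each HTTP test script boils down to issuing a request and asserting on the response status. The following is a hedged sketch of that pattern only — `assert_status` is a hypothetical helper, and the URL in the usage note is a placeholder, not the actual harness:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper (not part of the repo's harness): request a URL and
# fail unless the expected HTTP status code comes back.
assert_status() {
  local expected="$1" url="$2"
  local actual
  # -k because the default setup uses a self-signed server certificate
  actual=$(curl -k -s -o /dev/null -w '%{http_code}' "$url")
  [ "$actual" = "$expected" ] || { echo "expected $expected, got $actual: $url" >&2; return 1; }
}

# Against a running instance, e.g.:
#   assert_status 200 "https://localhost:4443/"
```

The real scripts additionally pass the WebID client certificate (curl's `-E cert.pem:password`) so that requests are authenticated.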

## Architecture Overview

### Core Application Structure
- **JAX-RS based**: Uses Jersey framework for RESTful web services
- **Multi-application architecture**: Separate admin and end-user applications
- **Data-driven design**: Applications and resources defined as RDF data
- **XSLT-based UI**: Client-side rendering using Saxon-JS with XSLT transformations

### Key Components

#### Applications (`com.atomgraph.linkeddatahub.apps.model`)
- `AdminApplication` - Administrative interface and functions
- `EndUserApplication` - Main user-facing application
- Applications are data-driven and loaded from RDF datasets
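
To illustrate the last point, an application is itself just an RDF resource in the admin dataset. The snippet below is a hypothetical sketch only — the namespaces and terms follow AtomGraph conventions but are assumptions to verify against the actual admin dataset:

```turtle
@prefix lapp: <https://w3id.org/atomgraph/linkeddatahub/apps#> .
@prefix ldt:  <https://www.w3.org/ns/ldt#> .

# Hypothetical description of an end-user application (URIs are placeholders)
<my-app#this> a lapp:EndUserApplication ;
    ldt:base <https://localhost:4443/> .
```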

#### Security & Authentication (`com.atomgraph.linkeddatahub.server.filter.request.auth`)
- WebID-based authentication with client certificates
- OAuth2 integration (Google)
- Authorization filters and context management
- Multi-level security: Agent, Authorization, and Application filters

#### Data Management (`com.atomgraph.linkeddatahub.model`)
- RDF-native data handling with Jena
- Import/Export functionality for CSV, RDF, and other formats
- SPARQL endpoint integration with separate admin and end-user stores
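
Both stores speak the standard SPARQL Protocol, so they can be queried over plain HTTP. A hedged sketch — `sparql_select` is a hypothetical helper, and the endpoint URL in the usage note is a placeholder (the actual endpoints and port mappings are defined in docker-compose.yml):

```shell
# Hypothetical helper: run a SELECT query over the SPARQL Protocol (GET form).
sparql_select() {
  local endpoint="$1" query="$2"
  # -G turns --data-urlencode into query-string parameters, per the SPARQL Protocol;
  # -k allows the default self-signed certificate, -sf fails on HTTP errors
  curl -ksf -G \
    -H 'Accept: application/sparql-results+json' \
    --data-urlencode "query=$query" \
    "$endpoint"
}

# Against a running instance, e.g.:
#   sparql_select 'https://localhost:4443/sparql' 'SELECT * { ?s ?p ?o } LIMIT 10'
```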

#### Resource Handling (`com.atomgraph.linkeddatahub.resource`)
- RESTful resource endpoints for CRUD operations
- File upload and content-addressed storage
- Transformation and generation utilities

### Service Architecture
The application runs as a multi-container setup:
- **nginx**: Reverse proxy and SSL termination
- **linkeddatahub**: Main Java application (Tomcat)
- **fuseki-admin/fuseki-end-user**: Separate SPARQL stores
- **varnish-frontend/varnish-admin/varnish-end-user**: Caching layers

### Data Flow
1. Requests come in through the nginx proxy
2. Varnish provides a caching layer
3. The LinkedDataHub application handles the business logic
4. Data is persisted to the appropriate Fuseki triplestore
5. XSLT transforms the data for client presentation
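
Step 5 is worth emphasizing: the server serves RDF, and HTML rendering happens client-side via Saxon-JS. A hedged sketch of observing this with content negotiation — `get_representation` is a hypothetical helper, and the URLs in the usage note are placeholders:

```shell
# Hypothetical helper: fetch one representation of a document URI.
# -k allows the default self-signed server certificate; -sf fails on HTTP errors.
get_representation() {
  local uri="$1" media_type="$2"
  curl -ksf -H "Accept: $media_type" "$uri"
}

# Against a running instance (client certificate args omitted), e.g.:
#   get_representation "https://localhost:4443/" text/turtle   # raw RDF data
#   get_representation "https://localhost:4443/" text/html     # page shell rendered client-side
```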

### Key Extension Points
- **Vocabulary definitions** in `com.atomgraph.linkeddatahub.vocabulary`
- **Custom resource handlers** in `com.atomgraph.linkeddatahub.resource`
- **Import processors** in `com.atomgraph.linkeddatahub.imports`
- **XSLT transformations** in `src/main/webapp/static/com/atomgraph/linkeddatahub/xsl`

## CLI Tools
LinkedDataHub includes extensive CLI tools in the `bin/` directory:
- Resource management: `create-container.sh`, `create-item.sh`, `get.sh`, `post.sh`, `put.sh`
- Import functionality: `imports/create-csv-import.sh`, `imports/import-rdf.sh`
- Admin operations: `admin/model/add-class.sh`, `admin/acl/create-authorization.sh`
- Certificate management: `webid-keygen.sh`, `server-cert-gen.sh`

Add CLI tools to PATH for development:
```bash
export PATH="$(find bin -type d -exec realpath {} \; | tr '\n' ':')$PATH"
```

## Development Notes
- Java 17 is required for compilation
- The application uses AtomGraph's Processor and Web-Client libraries as core dependencies
- XSLT stylesheets are processed during build to inline XML entities
- Saxon-JS SEF files are generated during the Maven package phase for client-side XSLT
- WebID certificates are required for authenticated API access
- The system expects Jena CLI tools to be available (`JENA_HOME` environment variable)
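
On the last point, a quick hedged sanity check that the Jena tools are reachable — `check_jena` is a hypothetical helper, not a repo script; it assumes `JENA_HOME` points at an Apache Jena distribution, which ships `riot` under `bin/`:

```shell
# Hypothetical sanity check: is an Apache Jena distribution reachable via JENA_HOME?
check_jena() {
  if [ -n "${JENA_HOME:-}" ] && [ -x "$JENA_HOME/bin/riot" ]; then
    echo "Jena found at $JENA_HOME"
  else
    echo "JENA_HOME not set or Jena tools missing"
  fi
}
```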