Commit 8f8a65c: Release version 5.3.0
Merge commit, 2 parents: 8abf914 + 0d77bd8

File tree: 148 files changed, +4996 −2855 lines

CHANGELOG.md

Lines changed: 28 additions & 0 deletions

@@ -1,3 +1,31 @@
+## [5.3.0] - 2026-03-05
+### Added
+- Class-based navigation (#270)
+- `VARNISH_*_BACKEND_PORT` env params for configurable Varnish backend ports
+- Bearer auth token support in entrypoint; service credentials moved to `secrets/credentials.trig`
+- `gsp_append_quads` function in the entrypoint
+- New ACL HTTP tests for system endpoints
+- Ignored paths in `OntologyFilter` (#269)
+
+### Changed
+- Renamed `DirectGraphStoreImpl` to `DocumentHierarchyGraphStoreImpl`
+- Dataspace nav list now visible for unauthenticated agents
+- Client-side SPARQL query execution uses `POST` instead of `GET`
+- Full context dataset now passed to XSLT
+- Introduced `ServiceContext` to decouple HTTP infrastructure from `Service`
+- Split dataspace metadata from service metadata in configuration
+- Moved types to `system.trig`; `lapp:endUserApplication`/`lapp:adminApplication` now inferred on the fly
+- Refactored CSV/RDF import scripts
+
+### Fixed
+- UTF-8 charset handling for text-based media types in uploaded files
+- Fixed links to the admin app
+- URI resolution fix in `AuthorizationFilter`
+- Left sidebar CSS fixes
+
+### Removed
+- Removed system endpoint resources from default RDF datasets
+
 ## [5.2.1] - 2026-01-20
 ### Changed
 - Package view rendering refactored to use type-driven discovery with `ldh:view`/`ldh:inverseView` properties matching resource types against `rdfs:domain`/`rdfs:range` constraints
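The changelog item about client-side SPARQL execution switching from `GET` to `POST` can be sketched as below. This is an illustrative fragment only, not LinkedDataHub's actual client code: the `https://example.org/sparql` endpoint is a placeholder, and `python3` is assumed to be available for percent-encoding.

```shell
# Hypothetical sketch: SPARQL over GET vs. POST.
# With GET the query must be percent-encoded into the URL, which breaks down
# for long queries; with POST the raw query travels in the request body.
query='CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 10'

# GET style: percent-encode the query into the URL's query string
encoded=$(python3 -c 'import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$query")
get_url="https://example.org/sparql?query=${encoded}"

# POST style: the unmodified query is the request body
post_args=(-X POST -H "Content-Type: application/sparql-query" --data-binary "$query")

echo "$get_url"
```

POST also keeps the query out of server access logs and avoids proxy URL-length limits, which is the usual reason clients make this switch.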

CLAUDE.md

Lines changed: 6 additions & 0 deletions

@@ -17,6 +17,8 @@ LinkedDataHub uses Maven as the primary build system with Docker for containeriz
 docker-compose up --build
 ```
 
+Service credentials (used by the entrypoint for Bearer auth) are stored in `secrets/credentials.trig`.
+
 ### Core Build Commands
 ```bash
 # Maven build (Java 17 required)
@@ -74,6 +76,10 @@ find ./document-hierarchy/ -name '*.sh' -exec bash {} \;
 - File upload and content-addressed storage
 - Transformation and generation utilities
 
+#### Service Layer
+- `ServiceContext` decouples HTTP infrastructure from `Service`, holding dataspace and service metadata separately
+- Dataspace metadata and service metadata are split in configuration; types for `lapp:endUserApplication`/`lapp:adminApplication` are inferred on the fly from `system.trig`
+
 ### Service Architecture
 The application runs as a multi-container setup:
 - **nginx**: Reverse proxy and SSL termination
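The CLAUDE.md addition says the entrypoint reads service credentials from `secrets/credentials.trig` for Bearer auth. How the token is attached to requests can be sketched as below; this is a stand-in only — the `token` value and the `curl_args` array are hypothetical, and the diff does not show the entrypoint's actual token-extraction code.

```shell
# Hypothetical sketch: attaching a Bearer token to curl arguments.
# "example-token" stands in for a credential that the real entrypoint
# would read from secrets/credentials.trig.
token="example-token"
curl_args=(-H "Authorization: Bearer ${token}" -H "Accept: text/turtle")

# The array would then be spliced into a curl call, e.g.:
#   curl "${curl_args[@]}" "$endpoint"
printf '%s\n' "${curl_args[@]}"
```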

Dockerfile

Lines changed: 3 additions & 1 deletion

@@ -89,7 +89,9 @@ ENV SIGN_UP_CERT_VALIDITY=
 
 ENV LOAD_DATASETS=
 
-ENV CONTEXT_DATASET_URL=file:///var/linkeddatahub/datasets/system.trig
+ENV CONTEXT_DATASET_URL=file:///var/linkeddatahub/datasets/dataspaces.trig
+
+ENV SERVICES_DATASET_URL=file:///var/linkeddatahub/datasets/system.trig
 
 ENV ADMIN_DATASET_URL=file:///var/linkeddatahub/datasets/admin.trig
 

Makefile

Lines changed: 17 additions & 0 deletions

@@ -0,0 +1,17 @@
+.PHONY: sef drop cert release
+
+# Generate Saxon-JS SEF files for client-side XSLT transformations
+sef:
+	./generate-sef.sh
+
+# Wipe local data directories (datasets, Fuseki, SSL certs, uploads) — irreversible!
+drop:
+	@read -p "Are you sure? [y/N] " ans && [ "$$ans" = "y" ] && sudo rm -rf datasets fuseki ssl uploads || echo "Aborted."
+
+# Generate server SSL certificate using the .env config
+cert:
+	./bin/server-cert-gen.sh .env nginx ssl
+
+# Run the full Maven release process (prepare, deploy to Sonatype, merge to master/develop)
+release:
+	./release.sh
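The new `drop` target guards its `rm -rf` behind an interactive confirmation. The same pattern can be sketched as a standalone shell function; the `confirm_then` name is made up for illustration — the Makefile inlines the `read`/test chain directly, and writes `$$ans` only because make requires `$$` to escape a literal `$`.

```shell
# Sketch of the confirmation guard used by the new `drop` target:
# the command runs only when the user answers exactly "y".
confirm_then() {
    read -p "Are you sure? [y/N] " ans && [ "$ans" = "y" ] && "$@" || echo "Aborted."
}

echo "y" | confirm_then echo "dropped"   # answer "y": runs the command, prints "dropped"
echo "n" | confirm_then echo "dropped"   # any other answer: prints "Aborted."
```

Note the `&& ... ||` chain is a shortcut, not a true if/else: if the guarded command itself fails, the `|| echo "Aborted."` branch also fires, which is acceptable here since the message is purely informational.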
Lines changed: 28 additions & 30 deletions

@@ -5,28 +5,24 @@ print_usage()
 {
     printf "Creates a SPARQL CONSTRUCT query.\n"
     printf "\n"
-    printf "Usage: %s options\n" "$0"
+    printf "Usage: %s options TARGET_URI\n" "$0"
     printf "\n"
     printf "Options:\n"
     printf "  -f, --cert-pem-file CERT_FILE        .pem file with the WebID certificate of the agent\n"
     printf "  -p, --cert-password CERT_PASSWORD    Password of the WebID certificate\n"
     printf "  -b, --base BASE_URI                  Base URI of the application\n"
     printf "  --proxy PROXY_URL                    The host this request will be proxied through (optional)\n"
     printf "\n"
-    printf "  --title TITLE                        Title of the chart\n"
-    printf "  --description DESCRIPTION            Description of the chart (optional)\n"
-    printf "  --slug STRING                        String that will be used as URI path segment (optional)\n"
+    printf "  --title TITLE                        Title of the query\n"
+    printf "  --description DESCRIPTION            Description of the query (optional)\n"
+    printf "  --uri URI                            URI of the query (optional)\n"
     printf "\n"
     printf "  --query-file ABS_PATH                Absolute path to the text file with the SPARQL query string\n"
+    printf "  --service SERVICE_URI                URI of the SPARQL service specific to this query (optional)\n"
 }
 
 hash turtle 2>/dev/null || { echo >&2 "turtle not on \$PATH. Aborting."; exit 1; }
 
-urlencode() {
-    python -c 'import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], sys.argv[2]))' \
-        "$1" "$urlencode_safe"
-}
-
 args=()
 while [[ $# -gt 0 ]]
 do
@@ -63,8 +59,8 @@ do
         shift # past argument
         shift # past value
         ;;
-        --slug)
-        slug="$2"
+        --uri)
+        uri="$2"
         shift # past argument
         shift # past value
         ;;
@@ -73,6 +69,11 @@ do
         shift # past argument
         shift # past value
         ;;
+        --service)
+        service="$2"
+        shift # past argument
+        shift # past value
+        ;;
         *) # unknown arguments
         args+=("$1") # save it in an array for later
         shift # past argument
@@ -81,6 +82,8 @@ do
 done
 set -- "${args[@]}" # restore args
 
+target="$1"
+
 if [ -z "$cert_pem_file" ] ; then
     print_usage
     exit 1
@@ -102,43 +105,38 @@ if [ -z "$query_file" ] ; then
     exit 1
 fi
 
-if [ -z "$slug" ] ; then
-    slug=$(uuidgen | tr '[:upper:]' '[:lower:]') # lowercase
-fi
-encoded_slug=$(urlencode "$slug")
-
-container="${base}queries/"
 query=$(<"$query_file") # read query string from file
 
-target="${container}${encoded_slug}/"
-
 args+=("-f")
 args+=("$cert_pem_file")
 args+=("-p")
 args+=("$cert_password")
 args+=("-t")
 args+=("text/turtle") # content type
-args+=("$target")
 if [ -n "$proxy" ]; then
     args+=("--proxy")
     args+=("$proxy")
 fi
 
+if [ -n "$uri" ] ; then
+    subject="<${uri}>"
+else
+    subject="_:subject"
+fi
+
 turtle+="@prefix ldh: <https://w3id.org/atomgraph/linkeddatahub#> .\n"
-turtle+="@prefix dh: <https://www.w3.org/ns/ldt/document-hierarchy#> .\n"
 turtle+="@prefix dct: <http://purl.org/dc/terms/> .\n"
-turtle+="@prefix foaf: <http://xmlns.com/foaf/0.1/> .\n"
 turtle+="@prefix sp: <http://spinrdf.org/sp#> .\n"
-turtle+="_:query a sp:Construct .\n"
-turtle+="_:query dct:title \"${title}\" .\n"
-turtle+="_:query sp:text \"\"\"${query}\"\"\" .\n"
-turtle+="<${target}> a dh:Item .\n"
-turtle+="<${target}> foaf:primaryTopic _:query .\n"
-turtle+="<${target}> dct:title \"${title}\" .\n"
+turtle+="${subject} a sp:Construct .\n"
+turtle+="${subject} dct:title \"${title}\" .\n"
+turtle+="${subject} sp:text \"\"\"${query}\"\"\" .\n"
 
+if [ -n "$service" ] ; then
+    turtle+="${subject} ldh:service <${service}> .\n"
+fi
 if [ -n "$description" ] ; then
-    turtle+="_:query dct:description \"${description}\" .\n"
+    turtle+="${subject} dct:description \"${description}\" .\n"
 fi
 
 # submit Turtle doc to the server
-echo -e "$turtle" | turtle --base="$target" | put.sh "${args[@]}"
+echo -e "$turtle" | turtle --base="$target" | post.sh "${args[@]}"
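The reworked script no longer mints a `queries/{slug}/` document URI; the target is passed in as `TARGET_URI`, and the query resource is either named by `--uri` or left as a blank node. That subject-selection step can be sketched in isolation — the `build_turtle` helper and the example URI below are illustrative, not part of the script:

```shell
# Sketch of the subject-selection logic: a named subject when a URI is
# supplied, a blank node otherwise.
build_turtle() {
    local uri="$1" title="$2"
    local subject turtle=""
    if [ -n "$uri" ]; then
        subject="<${uri}>"
    else
        subject="_:subject"
    fi
    turtle+="@prefix dct: <http://purl.org/dc/terms/> .\n"
    turtle+="@prefix sp: <http://spinrdf.org/sp#> .\n"
    turtle+="${subject} a sp:Construct .\n"
    turtle+="${subject} dct:title \"${title}\" .\n"
    printf '%b' "$turtle"   # %b expands the embedded \n escapes
}

build_turtle "" "My query"                        # subject: _:subject
build_turtle "https://example.org/queries/1" "Q1" # subject: <https://example.org/queries/1>
```

With a blank-node subject the server is free to assign the resource a URI relative to the target document, which is why `--uri` can stay optional.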
Lines changed: 11 additions & 68 deletions

@@ -5,7 +5,7 @@ print_usage()
 {
     printf "Uploads a file.\n"
     printf "\n"
-    printf "Usage: %s options\n" "$0"
+    printf "Usage: %s options TARGET_URI\n" "$0"
     printf "\n"
     printf "Options:\n"
     printf "  -f, --cert-pem-file CERT_FILE        .pem file with the WebID certificate of the agent\n"
@@ -14,22 +14,14 @@ print_usage()
     printf "  --proxy PROXY_URL                    The host this request will be proxied through (optional)\n"
     printf "\n"
     printf "  --title TITLE                        Title of the file\n"
-    printf "  --container CONTAINER_URI            URI of the parent container (optional)\n"
     printf "  --description DESCRIPTION            Description of the file (optional)\n"
-    printf "  --slug STRING                        String that will be used as URI path segment (optional)\n"
     printf "\n"
     printf "  --file ABS_PATH                      Absolute path to the file\n"
-    printf "  --file-content-type MEDIA_TYPE       Media type of the file (optional)\n"
-    #printf "  --file-slug STRING                   String that will be used as the file's URI path segment (optional)\n"
+    printf "  --content-type MEDIA_TYPE            Media type of the file (optional)\n"
 }
 
 hash curl 2>/dev/null || { echo >&2 "curl not on \$PATH. Aborting."; exit 1; }
 
-urlencode() {
-    python -c 'import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], sys.argv[2]))' \
-        "$1" "$urlencode_safe"
-}
-
 args=()
 while [[ $# -gt 0 ]]
 do
@@ -66,28 +58,13 @@ do
         shift # past argument
         shift # past value
         ;;
-        --slug)
-        slug="$2"
-        shift # past argument
-        shift # past value
-        ;;
-        --container)
-        container="$2"
-        shift # past argument
-        shift # past value
-        ;;
         --file)
         file="$2"
         shift # past argument
         shift # past value
         ;;
-        --file-content-type)
-        file_content_type="$2"
-        shift # past argument
-        shift # past value
-        ;;
-        --file-slug)
-        file_slug="$2"
+        --content-type)
+        content_type="$2"
         shift # past argument
         shift # past value
         ;;
@@ -99,6 +76,8 @@ do
 done
 set -- "${args[@]}" # restore args
 
+target="$1"
+
 if [ -z "$cert_pem_file" ] ; then
     print_usage
     exit 1
@@ -119,50 +98,23 @@ if [ -z "$file" ] ; then
     print_usage
     exit 1
 fi
-if [ -z "$file_content_type" ] ; then
+if [ -z "$content_type" ] ; then
     # determine content-type if not provided
-    file_content_type=$(file -b --mime-type "$file")
+    content_type=$(file -b --mime-type "$file")
 fi
 
-if [ -z "$slug" ] ; then
-    slug=$(uuidgen | tr '[:upper:]' '[:lower:]') # lowercase
-fi
-encoded_slug=$(urlencode "$slug")
-
-# need to create explicit file URI since that is what this script returns (not the graph URI)
-
-#if [ -z "$file_slug" ] ; then
-#    file_slug=$(uuidgen | tr '[:upper:]' '[:lower:]') # lowercase
-#fi
-
-if [ -z "$container" ] ; then
-    container="${base}files/"
-fi
-
-target="${container}${encoded_slug}/"
-
 # https://stackoverflow.com/questions/19116016/what-is-the-right-way-to-post-multipart-form-data-using-curl
 
 rdf_post+="-F \"rdf=\"\n"
 rdf_post+="-F \"sb=file\"\n"
 rdf_post+="-F \"pu=http://www.semanticdesktop.org/ontologies/2007/03/22/nfo#fileName\"\n"
-rdf_post+="-F \"ol=@${file};type=${file_content_type}\"\n"
+rdf_post+="-F \"ol=@${file};type=${content_type}\"\n"
 rdf_post+="-F \"pu=http://purl.org/dc/terms/title\"\n"
 rdf_post+="-F \"ol=${title}\"\n"
 rdf_post+="-F \"pu=http://www.w3.org/1999/02/22-rdf-syntax-ns#type\"\n"
 rdf_post+="-F \"ou=http://www.semanticdesktop.org/ontologies/2007/03/22/nfo#FileDataObject\"\n"
-rdf_post+="-F \"su=${target}\"\n"
-rdf_post+="-F \"pu=http://purl.org/dc/terms/title\"\n"
-rdf_post+="-F \"ol=${title}\"\n"
-rdf_post+="-F \"pu=http://www.w3.org/1999/02/22-rdf-syntax-ns#type\"\n"
-rdf_post+="-F \"ou=https://www.w3.org/ns/ldt/document-hierarchy#Item\"\n"
-rdf_post+="-F \"pu=http://xmlns.com/foaf/0.1/primaryTopic\"\n"
-rdf_post+="-F \"ob=file\"\n"
-rdf_post+="-F \"pu=http://rdfs.org/sioc/ns#has_container\"\n"
-rdf_post+="-F \"ou=${container}\"\n"
 
 if [ -n "$description" ] ; then
-    rdf_post+="-F \"sb=file\"\n"
     rdf_post+="-F \"pu=http://purl.org/dc/terms/description\"\n"
     rdf_post+="-F \"ol=${description}\"\n"
 fi
@@ -176,14 +128,5 @@ if [ -n "$proxy" ]; then
     target="${target/$target_host/$proxy_host}"
 fi
 
-# POST RDF/POST multipart form and capture the effective URL
-effective_url=$(echo -e "$rdf_post" | curl -w '%{url_effective}' -f -v -s -k -X PUT -H "Accept: text/turtle" -E "$cert_pem_file":"$cert_password" -o /dev/null --config - "$target")
-
-# If using proxy, rewrite the effective URL back to original hostname
-if [ -n "$proxy" ]; then
-    # Replace proxy host with original host in the effective URL
-    rewritten_url="${effective_url/$proxy_host/$target_host}"
-    echo "$rewritten_url"
-else
-    echo "$effective_url"
-fi
+# POST RDF/POST multipart form
+echo -e "$rdf_post" | curl -f -v -s -k -X POST -H "Accept: text/turtle" -E "$cert_pem_file":"$cert_password" -o /dev/null --config - "$target"
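The trimmed upload script feeds curl an RDF/POST multipart form through `--config -`, one `-F` line per form field. The assembly step can be sketched with hardcoded stand-in values (`./report.csv`, `text/csv`, `Example file` are placeholders; the real script reads them from `--file`/`--content-type`/`--title` and falls back to `file -b --mime-type` for detection):

```shell
# Sketch: building curl config lines for an RDF/POST multipart form.
# The values below are stand-ins; the real script reads them from options.
file="./report.csv"        # would come from --file
content_type="text/csv"    # would come from --content-type or file -b --mime-type
title="Example file"

rdf_post="-F \"rdf=\"\n"
rdf_post+="-F \"sb=file\"\n"
rdf_post+="-F \"pu=http://www.semanticdesktop.org/ontologies/2007/03/22/nfo#fileName\"\n"
rdf_post+="-F \"ol=@${file};type=${content_type}\"\n"
rdf_post+="-F \"pu=http://purl.org/dc/terms/title\"\n"
rdf_post+="-F \"ol=${title}\"\n"

echo -e "$rdf_post"   # this text is what gets piped into: curl --config - "$target"
```

In RDF/POST encoding, `sb` opens a blank-node subject and the following `pu`/`ol`/`ou` pairs attach predicates with literal or URI objects, so each `-F` line contributes one form field of one triple.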

0 commit comments