
Commit 6abdc0c

docs: Add environment variable configuration to all notebooks
All 21 documentation notebooks now support configuration via environment variables (DJ_HOST, DJ_USER, DJ_PASS, DJ_PORT) for CI/testing purposes.

Also fixes:
- Remove broken pytest.skip in json-type.ipynb
- Revert CAST(correct AS INTEGER) to sum(correct) in 04-queries.ipynb (MySQL handles boolean sum natively)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
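The block added to each notebook reads the four variables and overrides DataJoint's settings only when a variable is actually set. A standalone sketch of the same pattern, using a plain dict as a stand-in for `dj.config` so it runs without a database connection (the default values shown are placeholders, not from the commit):

```python
import os

# Stand-in for DataJoint's dj.config settings dict (hypothetical defaults,
# used here so the sketch runs without a database connection).
config = {"database.host": "localhost", "database.port": 3306}

# Override a setting only when its environment variable is set,
# mirroring the block the commit adds to each notebook.
for var, key, cast in [
    ("DJ_HOST", "database.host", str),
    ("DJ_PORT", "database.port", int),
    ("DJ_USER", "database.user", str),
    ("DJ_PASS", "database.password", str),
]:
    value = os.getenv(var)
    if value:
        config[key] = cast(value)

print(config["database.host"], config["database.port"])
```

Unset variables leave the existing configuration untouched, which is what lets the same notebooks run both locally and in CI.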
1 parent 6e5f8a6 commit 6abdc0c

21 files changed

Lines changed: 901 additions & 1012 deletions
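In a CI job, the variables would simply be exported before the notebooks are executed. A hypothetical invocation (host, port, and credentials are placeholders; the notebook runner command is omitted because the commit does not specify one):

```shell
# Hypothetical CI environment setup; values are placeholders.
export DJ_HOST=127.0.0.1
export DJ_PORT=3306
export DJ_USER=root
export DJ_PASS=simple
echo "connecting as $DJ_USER@$DJ_HOST:$DJ_PORT"
```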

src/how-to/model-relationships.ipynb

Lines changed: 125 additions & 131 deletions
Large diffs are not rendered by default.

src/how-to/read-diagrams.ipynb

Lines changed: 110 additions & 116 deletions
Large diffs are not rendered by default.

src/tutorials/advanced/custom-codecs.ipynb

Lines changed: 5 additions & 10 deletions
@@ -9,9 +9,9 @@
 "\n",
 "This tutorial covers extending DataJoint's type system. You'll learn:\n",
 "\n",
-"- **Codec basics** Encoding and decoding\n",
-"- **Creating codecs** Domain-specific types\n",
-"- **Codec chaining** Composing codecs"
+"- **Codec basics** \u2014 Encoding and decoding\n",
+"- **Creating codecs** \u2014 Domain-specific types\n",
+"- **Codec chaining** \u2014 Composing codecs"
 ]
 },
 {
@@ -35,12 +35,7 @@
 ]
 }
 ],
-"source": [
-"import datajoint as dj\n",
-"import numpy as np\n",
-"\n",
-"schema = dj.Schema('tutorial_codecs')"
-]
+"source": "import datajoint as dj\nimport os\n\n# Configure from environment variables (for testing with different backends)\nif os.getenv('DJ_HOST'):\n dj.config['database.host'] = os.getenv('DJ_HOST')\nif os.getenv('DJ_PORT'):\n dj.config['database.port'] = int(os.getenv('DJ_PORT'))\nif os.getenv('DJ_USER'):\n dj.config['database.user'] = os.getenv('DJ_USER')\nif os.getenv('DJ_PASS'):\n dj.config['database.password'] = os.getenv('DJ_PASS')\n\nimport numpy as np\n\nschema = dj.Schema('tutorial_codecs')"
 },
 {
 "cell_type": "markdown",
@@ -289,4 +284,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 5
-}
+}

src/tutorials/advanced/distributed.ipynb

Lines changed: 28 additions & 38 deletions
@@ -9,10 +9,10 @@
 "\n",
 "This tutorial covers running computations across multiple workers. You'll learn:\n",
 "\n",
-"- **Jobs 2.0** DataJoint's job coordination system\n",
-"- **Multi-process** Parallel workers on one machine\n",
-"- **Multi-machine** Cluster-scale computation\n",
-"- **Error handling** Recovery and monitoring"
+"- **Jobs 2.0** \u2014 DataJoint's job coordination system\n",
+"- **Multi-process** \u2014 Parallel workers on one machine\n",
+"- **Multi-machine** \u2014 Cluster-scale computation\n",
+"- **Error handling** \u2014 Recovery and monitoring"
 ]
 },
 {
@@ -36,17 +36,7 @@
 ]
 }
 ],
-"source": [
-"import datajoint as dj\n",
-"import numpy as np\n",
-"import time\n",
-"\n",
-"schema = dj.Schema('tutorial_distributed')\n",
-"\n",
-"# Clean up from previous runs\n",
-"schema.drop(prompt=False)\n",
-"schema = dj.Schema('tutorial_distributed')"
-]
+"source": "import datajoint as dj\nimport os\n\n# Configure from environment variables (for testing with different backends)\nif os.getenv('DJ_HOST'):\n dj.config['database.host'] = os.getenv('DJ_HOST')\nif os.getenv('DJ_PORT'):\n dj.config['database.port'] = int(os.getenv('DJ_PORT'))\nif os.getenv('DJ_USER'):\n dj.config['database.user'] = os.getenv('DJ_USER')\nif os.getenv('DJ_PASS'):\n dj.config['database.password'] = os.getenv('DJ_PASS')\n\nimport numpy as np\nimport time\n\nschema = dj.Schema('tutorial_distributed')\n\n# Clean up from previous runs\nschema.drop(prompt=False)\nschema = dj.Schema('tutorial_distributed')"
 },
 {
 "cell_type": "markdown",
@@ -181,47 +171,47 @@
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 20%|██ | 1/5 [00:00<00:00, 8.68it/s]"
+"Analysis: 20%|\u2588\u2588 | 1/5 [00:00<00:00, 8.68it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 40%|████ | 2/5 [00:00<00:00, 8.83it/s]"
+"Analysis: 40%|\u2588\u2588\u2588\u2588 | 2/5 [00:00<00:00, 8.83it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 60%|██████ | 3/5 [00:00<00:00, 8.45it/s]"
+"Analysis: 60%|\u2588\u2588\u2588\u2588\u2588\u2588 | 3/5 [00:00<00:00, 8.45it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 80%|████████ | 4/5 [00:00<00:00, 8.43it/s]"
+"Analysis: 80%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 | 4/5 [00:00<00:00, 8.43it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 100%|██████████| 5/5 [00:00<00:00, 8.20it/s]"
+"Analysis: 100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 5/5 [00:00<00:00, 8.20it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 100%|██████████| 5/5 [00:00<00:00, 8.33it/s]"
+"Analysis: 100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 5/5 [00:00<00:00, 8.33it/s]"
 ]
 },
 {
@@ -357,127 +347,127 @@
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 7%| | 1/15 [00:00<00:01, 8.50it/s]"
+"Analysis: 7%|\u258b | 1/15 [00:00<00:01, 8.50it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 13%|█▎ | 2/15 [00:00<00:01, 8.00it/s]"
+"Analysis: 13%|\u2588\u258e | 2/15 [00:00<00:01, 8.00it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 20%|██ | 3/15 [00:00<00:01, 7.77it/s]"
+"Analysis: 20%|\u2588\u2588 | 3/15 [00:00<00:01, 7.77it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 27%|██▋ | 4/15 [00:00<00:01, 7.74it/s]"
+"Analysis: 27%|\u2588\u2588\u258b | 4/15 [00:00<00:01, 7.74it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 33%|███▎ | 5/15 [00:00<00:01, 7.82it/s]"
+"Analysis: 33%|\u2588\u2588\u2588\u258e | 5/15 [00:00<00:01, 7.82it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 40%|████ | 6/15 [00:00<00:01, 8.12it/s]"
+"Analysis: 40%|\u2588\u2588\u2588\u2588 | 6/15 [00:00<00:01, 8.12it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 47%|████▋ | 7/15 [00:00<00:00, 8.14it/s]"
+"Analysis: 47%|\u2588\u2588\u2588\u2588\u258b | 7/15 [00:00<00:00, 8.14it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 53%|█████▎ | 8/15 [00:00<00:00, 8.21it/s]"
+"Analysis: 53%|\u2588\u2588\u2588\u2588\u2588\u258e | 8/15 [00:00<00:00, 8.21it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 60%|██████ | 9/15 [00:01<00:00, 8.06it/s]"
+"Analysis: 60%|\u2588\u2588\u2588\u2588\u2588\u2588 | 9/15 [00:01<00:00, 8.06it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 67%|██████▋ | 10/15 [00:01<00:00, 7.87it/s]"
+"Analysis: 67%|\u2588\u2588\u2588\u2588\u2588\u2588\u258b | 10/15 [00:01<00:00, 7.87it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 73%|███████▎ | 11/15 [00:01<00:00, 7.80it/s]"
+"Analysis: 73%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u258e | 11/15 [00:01<00:00, 7.80it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 80%|████████ | 12/15 [00:01<00:00, 7.85it/s]"
+"Analysis: 80%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 | 12/15 [00:01<00:00, 7.85it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 87%|████████▋ | 13/15 [00:01<00:00, 7.97it/s]"
+"Analysis: 87%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u258b | 13/15 [00:01<00:00, 7.97it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 93%|█████████▎| 14/15 [00:01<00:00, 8.15it/s]"
+"Analysis: 93%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u258e| 14/15 [00:01<00:00, 8.15it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 100%|██████████| 15/15 [00:01<00:00, 8.08it/s]"
+"Analysis: 100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 15/15 [00:01<00:00, 8.08it/s]"
 ]
 },
 {
 "name": "stderr",
 "output_type": "stream",
 "text": [
 "\r",
-"Analysis: 100%|██████████| 15/15 [00:01<00:00, 7.99it/s]"
+"Analysis: 100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 15/15 [00:01<00:00, 7.99it/s]"
 ]
 },
 {
@@ -603,4 +593,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 5
-}
+}

src/tutorials/advanced/json-type.ipynb

Lines changed: 3 additions & 3 deletions
@@ -42,7 +42,7 @@
 }
 },
 "outputs": [],
-"source": "import datajoint as dj\nimport os\n\n# Configure database connection from environment variables (for CI/testing)\nif os.getenv('DJ_HOST'):\n dj.config['database.host'] = os.getenv('DJ_HOST')\nif os.getenv('DJ_PORT'):\n dj.config['database.port'] = int(os.getenv('DJ_PORT'))\nif os.getenv('DJ_USER'):\n dj.config['database.user'] = os.getenv('DJ_USER')\nif os.getenv('DJ_PASS'):\n dj.config['database.password'] = os.getenv('DJ_PASS')\nif os.getenv('DJ_BACKEND'):\n dj.config['database.backend'] = os.getenv('DJ_BACKEND')\n\n# Check if running against PostgreSQL - skip this MySQL-specific tutorial\nimport pytest\nif os.getenv('DJ_BACKEND', '').lower() == 'postgresql':\n pytest.skip(\"JSON tutorial requires MySQL 8.0+ (uses MySQL-specific JSON functions)\", allow_module_level=True)"
+"source": "import datajoint as dj\nimport os\n\n# Configure database connection from environment variables (for CI/testing)\nif os.getenv('DJ_HOST'):\n dj.config['database.host'] = os.getenv('DJ_HOST')\nif os.getenv('DJ_PORT'):\n dj.config['database.port'] = int(os.getenv('DJ_PORT'))\nif os.getenv('DJ_USER'):\n dj.config['database.user'] = os.getenv('DJ_USER')\nif os.getenv('DJ_PASS'):\n dj.config['database.password'] = os.getenv('DJ_PASS')\n\n# Check if running against PostgreSQL - skip this MySQL-specific tutorial"
 },
 {
 "cell_type": "markdown",
@@ -57,9 +57,9 @@
 "id": "a2998c71",
 "metadata": {},
 "source": [
-"For this exercise, let's imagine we work for an awesome company that is organizing a fun RC car race across various teams in the company. Let's see which team has the fastest car! 🏎️\n",
+"For this exercise, let's imagine we work for an awesome company that is organizing a fun RC car race across various teams in the company. Let's see which team has the fastest car! \ud83c\udfce\ufe0f\n",
 "\n",
-"This establishes 2 important entities: a `Team` and a `Car`. Normally the entities are mapped to their own dedicated table, however, let's assume that `Team` is well-structured but `Car` is less structured than we'd prefer. In other words, the structure for what makes up a *car* is varying too much between entries (perhaps because users of the pipeline haven't agreed yet on the definition? 🤷).\n",
+"This establishes 2 important entities: a `Team` and a `Car`. Normally the entities are mapped to their own dedicated table, however, let's assume that `Team` is well-structured but `Car` is less structured than we'd prefer. In other words, the structure for what makes up a *car* is varying too much between entries (perhaps because users of the pipeline haven't agreed yet on the definition? \ud83e\udd37).\n",
 "\n",
 "This would make it a good use-case to keep `Team` as a table but make `Car` a `json` type defined within the `Team` table.\n",
 "\n",
