
Fix DANDI sandbox upload broken by dandi-cli 0.73.2 #1037

Merged
bendichter merged 2 commits into main from fix-dandi-sandbox-api-key
Feb 11, 2026

Conversation

@bendichter
Collaborator

Summary

  • Root cause: dandi-cli 0.73.2 (released Nov 15, 2025) changed API key env var lookup from the generic DANDI_API_KEY to instance-specific vars (e.g. DANDI_SANDBOX_API_KEY for the sandbox). NWB GUIDE only set DANDI_API_KEY, so sandbox uploads silently failed (timeout after 5 min).
  • Impact: All E2E DANDI upload tests have been failing since Nov 15, 2025 (daily tests show 2 consistent failures).
  • Fix: Set both DANDI_API_KEY and DANDI_SANDBOX_API_KEY when uploading to sandbox, maintaining backward compatibility with older dandi versions.
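The fix described above can be sketched as a small helper. The function name and signature here are illustrative only, not NWB GUIDE's actual code; the two environment variable names are the ones named in this PR.

```python
import os


def export_dandi_api_key(api_key: str, staging: bool) -> None:
    """Hypothetical helper: expose the key under every name dandi-cli may read."""
    # dandi-cli < 0.73.2 reads the generic variable for every instance
    os.environ["DANDI_API_KEY"] = api_key
    if staging:
        # dandi-cli >= 0.73.2 looks up an instance-specific variable for
        # the sandbox, so set it as well for backward/forward compatibility
        os.environ["DANDI_SANDBOX_API_KEY"] = api_key
```

Setting both variables unconditionally is harmless for older dandi-cli versions, which simply ignore the instance-specific one.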

Test plan

  • E2E tests should now pass the "Upload pipeline output to DANDI" and "Review upload results" tests
  • Verify DANDI upload still works for non-sandbox (production) uploads

🤖 Generated with Claude Code

bendichter and others added 2 commits February 11, 2026 16:42
dandi-cli 0.73.2 (released Nov 15, 2025) changed API key lookup from
the generic DANDI_API_KEY env var to instance-specific env vars (e.g.
DANDI_SANDBOX_API_KEY for the sandbox). This broke sandbox uploads
because nwb-guide only set DANDI_API_KEY.

Now set both DANDI_API_KEY and DANDI_SANDBOX_API_KEY when uploading
to the sandbox, maintaining compatibility with both old and new dandi.

See: dandi/dandi-cli#1731

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@bendichter
Collaborator Author

Note on CI failures

The macOS (macos-latest) job failures in this PR are unrelated to this change. They are caused by a tables (PyTables) build failure during conda environment setup:

ld: library 'hdf5' not found
ERROR:: Could not find a local HDF5 installation.
ERROR: Failed to build 'tables' when getting requirements to build wheel

Root cause: neuroconv==0.6.1 requires tables<3.9.2 on macOS via the [behavior] extra. tables==3.9.1 has no pre-built wheel for macOS arm64, so pip tries to build from source. This previously worked when the CI runner's environment had HDF5 headers accessible to pip's build subprocess, but now fails (likely due to a runner image update).

This is a pre-existing issue on main that only surfaces on conda cache misses. It is already fixed in neuroconv>=0.6.5, which relaxes the constraint to tables>=3.10.1 (compatible with the conda-forge pytables=3.10.2 already in the env file). PRs #1034, #1035, and #1036 bump neuroconv and will resolve this.

This PR's actual change is a one-line fix to set DANDI_SANDBOX_API_KEY (in addition to DANDI_API_KEY) when uploading to the DANDI sandbox, which has been required since dandi-cli==0.73.2 (dandi/dandi-cli#1731, released Nov 15 2025). This is what broke the E2E DANDI upload tests in the daily runs.

@rly
Collaborator

rly commented Feb 11, 2026

Other test failures appear to stem from an incompatibility with the latest version of spikeinterface, which we should address in a separate PR. The changes here look good, so I will merge, bypassing CI requirements, after the UI tests pass.

[electron] [2026-02-11 21:47:00,237] ERROR in app: Exception on /neuroconv/metadata [POST]
Traceback (most recent call last):
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/flask/app.py", line 1484, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/flask/app.py", line 1469, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/flask_restx/api.py", line 404, in wrapper
    resp = resource(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/flask/views.py", line 109, in view
    return current_app.ensure_sync(self.dispatch_request)(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/flask_restx/resource.py", line 46, in dispatch_request
    resp = meth(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/nwb-guide/nwb-guide/src/pyflask/namespaces/neuroconv.py", line 68, in post
    return get_metadata_schema(
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/nwb-guide/nwb-guide/src/pyflask/manageNeuroconv/manage_neuroconv.py", line 440, in get_metadata_schema
    converter = instantiate_custom_converter(resolved_source_data, interfaces)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/nwb-guide/nwb-guide/src/pyflask/manageNeuroconv/manage_neuroconv.py", line 403, in instantiate_custom_converter
    return CustomNWBConverter(source_data=source_data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/neuroconv/nwbconverter.py", line 81, in __init__
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/neuroconv/datainterfaces/ecephys/phy/phydatainterface.py", line 44, in __init__
    super().__init__(folder_path=folder_path, exclude_cluster_groups=exclude_cluster_groups, verbose=verbose)
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/neuroconv/datainterfaces/ecephys/basesortingextractorinterface.py", line 23, in __init__
    self.sorting_extractor = self.get_extractor()(**source_data)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/spikeinterface/extractors/phykilosortextractors.py", line 231, in __init__
    BasePhyKilosortSortingExtractor.__init__(
  File "/home/runner/miniconda3/envs/nwb-guide/lib/python3.12/site-packages/spikeinterface/extractors/phykilosortextractors.py", line 138, in __init__
    unit_ids[i] = new_si_id
    ~~~~~~~~^^^
ValueError: assignment destination is read-only
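The final ValueError in this traceback is NumPy's standard error for assigning into an array whose writeable flag is off, as happens with read-only memory-mapped arrays. A minimal reproduction, independent of spikeinterface itself:

```python
import numpy as np

unit_ids = np.arange(3)
unit_ids.setflags(write=False)  # simulate a read-only (e.g. memory-mapped) array

try:
    unit_ids[0] = 99  # in-place assignment into a read-only buffer
except ValueError as err:
    print(err)  # NumPy reports: assignment destination is read-only
```

The usual remedy is to make a writable copy (e.g. `unit_ids = np.array(unit_ids)`) before assigning, which is the kind of change a spikeinterface-side fix would involve.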

@bendichter bendichter enabled auto-merge February 11, 2026 22:27
@bendichter bendichter disabled auto-merge February 11, 2026 22:30
@bendichter bendichter merged commit 5658817 into main Feb 11, 2026
14 of 25 checks passed
@bendichter bendichter deleted the fix-dandi-sandbox-api-key branch February 11, 2026 22:31
@bendichter
Collaborator Author

bendichter commented Feb 11, 2026

yes, that is an issue with pandas>=3 in spikeinterface, which I address in #1034
