
Commit 722578a

Author: Kyle
Parent: 4d5b035

Bump neuroconv to 0.6.4

4 files changed: 8 additions, 8 deletions


environments/environment-Linux.yml

2 additions, 2 deletions
@@ -15,8 +15,8 @@ dependencies:
   - flask-cors == 4.0.0
   - flask_restx == 1.1.0
   - werkzeug < 3.0 # werkzeug 3.0 deprecates features used by flask 2.3.2. Remove this when updating flask.
-  - neuroconv[dandi,compressors] == 0.6.3
-  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.3
+  - neuroconv[dandi,compressors] == 0.6.4
+  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.4
   - spikeinterface >= 0.101.0 # Previously included via neuroconv[ecephys]; needed for tutorial data generation
   - pandas < 3.0 # pandas 3.0 uses Arrow backend by default, returning read-only arrays that break spikeinterface Phy extractor
   - neo == 0.14.1 # 0.14.2 is not compatible with neuroconv < 0.7.5

environments/environment-MAC-apple-silicon.yml

2 additions, 2 deletions
@@ -23,8 +23,8 @@ dependencies:
   - werkzeug < 3.0 # werkzeug 3.0 deprecates features used by flask 2.3.2. Remove this when updating flask.
   # NOTE: the NeuroConv wheel on PyPI includes sonpy which is not compatible with arm64, so build and install
   # NeuroConv from GitHub, which will remove the sonpy dependency when building from Mac arm64
-  - neuroconv[dandi,compressors] == 0.6.3
-  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.3
+  - neuroconv[dandi,compressors] == 0.6.4
+  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.4
   - spikeinterface >= 0.101.0 # Previously included via neuroconv[ecephys]; needed for tutorial data generation
   - pandas < 3.0 # pandas 3.0 uses Arrow backend by default, returning read-only arrays that break spikeinterface Phy extractor
   - neo == 0.14.1 # 0.14.2 is not compatible with neuroconv < 0.7.5

environments/environment-MAC-intel.yml

2 additions, 2 deletions
@@ -18,8 +18,8 @@ dependencies:
   - flask-cors == 4.0.0
   - flask_restx == 1.1.0
   - werkzeug < 3.0 # werkzeug 3.0 deprecates features used by flask 2.3.2. Remove this when updating flask.
-  - neuroconv[dandi,compressors] == 0.6.3
-  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.3
+  - neuroconv[dandi,compressors] == 0.6.4
+  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.4
   - spikeinterface >= 0.101.0 # Previously included via neuroconv[ecephys]; needed for tutorial data generation
   - pandas < 3.0 # pandas 3.0 uses Arrow backend by default, returning read-only arrays that break spikeinterface Phy extractor
   - neo == 0.14.1 # 0.14.2 is not compatible with neuroconv < 0.7.5

environments/environment-Windows.yml

2 additions, 2 deletions
@@ -18,8 +18,8 @@ dependencies:
   - flask-cors === 3.0.10
   - flask_restx == 1.1.0
   - werkzeug < 3.0 # werkzeug 3.0 deprecates features used by flask 2.3.2. Remove this when updating flask.
-  - neuroconv[dandi,compressors] == 0.6.3
-  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.3
+  - neuroconv[dandi,compressors] == 0.6.4
+  - dandi < 0.74.0 # 0.74.0 renamed dandi-staging to dandi-sandbox, breaking neuroconv 0.6.4
   - spikeinterface >= 0.101.0 # Previously included via neuroconv[ecephys]; needed for tutorial data generation
   - pandas < 3.0 # pandas 3.0 uses Arrow backend by default, returning read-only arrays that break spikeinterface Phy extractor
   - neo == 0.14.1 # 0.14.2 is not compatible with neuroconv < 0.7.5
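The commit keeps the `dandi < 0.74.0` pin while bumping `neuroconv` to an exact `== 0.6.4`, alongside `spikeinterface >= 0.101.0` and `pandas < 3.0`. As a sanity check of how these pins interact, a minimal pure-Python sketch (the `parse`/`satisfies` helpers are hypothetical, and parsing is simplified to dotted integer versions with no pre-release handling; real resolvers follow PEP 440 semantics):

```python
def parse(version):
    # Simplified: split a dotted version string into an integer tuple.
    # Real pins (PEP 440) also handle pre-releases, epochs, etc.
    return tuple(int(part) for part in version.split("."))

def satisfies(version, op, bound):
    # Evaluate a single pin like "< 0.74.0" against a candidate version.
    compare = {
        "==": lambda a, b: a == b,
        "<":  lambda a, b: a < b,
        ">=": lambda a, b: a >= b,
    }
    return compare[op](parse(version), parse(bound))

# Pins from the environment files above:
print(satisfies("0.6.4", "==", "0.6.4"))     # neuroconv pin accepts exactly 0.6.4
print(satisfies("0.73.1", "<", "0.74.0"))    # dandi 0.73.x is allowed
print(satisfies("0.74.0", "<", "0.74.0"))    # dandi 0.74.0 (renamed staging) is excluded
print(satisfies("0.101.0", ">=", "0.101.0")) # spikeinterface lower bound is inclusive
```

The exact `== 0.6.4` pin plus the `< 0.74.0` ceiling mirrors the comment in the diff: the dandi-staging rename in 0.74.0 breaks this neuroconv version, so the upper bound must stay until neuroconv itself is updated.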
