Add plugins to wheels #1450
Conversation
jswhit2 seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. You have already signed the CLA but the status is still pending? Let us recheck it.
The plugins are working for Linux, but not for Windows or macOS. It seems like the conda libnetcdf package for Windows should be installing the plugins, but they are not being found, even if I set NETCDF_PLUGIN_DIR to point to where they should be installed. It looks like brew on macOS should install at least the zstd and bzip2 plugins, but netcdf4-python can't find them there either. EDIT: it looks like Homebrew does not install the netCDF plugins on macOS.
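As a quick sanity check when debugging this kind of "plugins not found" problem, a small script can list the candidate HDF5 filter plugins in a directory. This is only a sketch; the fallback conda location below (`lib/hdf5/plugin` under the environment prefix) is an assumption about where the packages install them:

```python
import glob
import os

def find_hdf5_plugins(plugin_dir):
    """Return shared libraries (candidate HDF5 filter plugins) found in plugin_dir."""
    hits = []
    for pattern in ("*.so", "*.dylib", "*.dll"):
        hits.extend(glob.glob(os.path.join(plugin_dir, pattern)))
    return sorted(os.path.basename(h) for h in hits)

if __name__ == "__main__":
    # NETCDF_PLUGIN_DIR / HDF5_PLUGIN_PATH are the variables discussed above;
    # fall back to a hypothetical conda location if neither is set.
    plugin_dir = (os.environ.get("NETCDF_PLUGIN_DIR")
                  or os.environ.get("HDF5_PLUGIN_PATH")
                  or os.path.join(os.environ.get("CONDA_PREFIX", ""), "lib", "hdf5", "plugin"))
    print(find_hdf5_plugins(plugin_dir))
```

An empty result for the directory you expected the plugins in is a quick way to tell "not installed" apart from "installed but not searched".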
I'll investigate this next week. I thought we had all the plugins in the Windows conda package. |
Updated the libnetcdf version to 4.9.3 on Windows (this version includes the netCDF plugins). The blosc plugin test fails with an HDF5 lib error, so I've skipped that test on Windows for now. Switched to using conda for the macOS wheel, and now the plugins are found there as well. However, the delocate step fails on macOS since the libs linked by the plugins are not found.
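A per-platform skip like the one described can be expressed with a standard marker. A minimal sketch using the stdlib `unittest` (the actual test suite's runner and test body may differ; the test body here is a placeholder):

```python
import sys
import unittest

class CompressionTests(unittest.TestCase):
    @unittest.skipIf(sys.platform == "win32",
                     "blosc plugin fails with an HDF5 lib error on Windows")
    def test_blosc(self):
        # Placeholder for the real blosc round-trip test.
        self.assertTrue(True)
```

On Windows the test is reported as skipped with the stated reason; everywhere else it runs normally.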
After converting the macOS wheels to use conda-installed libs, the compression plugin tests are working. The blosc test is still failing on Windows, though. I noticed that delvewheel was not installing the blosc DLL into the wheel, so I manually added it (along with zstd and lz4). That didn't help. Interestingly, the blosc test passes in the "regular" GitHub Actions tests; the only difference there is that HDF5_PLUGIN_PATH is set to point to the location of the conda-installed plugins.
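One detail worth keeping in mind about the HDF5_PLUGIN_PATH difference: HDF5 reads that variable when the library initializes, so it has to be in the environment before the first import of netCDF4 to have any effect. A sketch (the plugin path below is a hypothetical conda location, and the import is commented out so the snippet stands alone):

```python
import os

# Must be set before `import netCDF4` (which loads HDF5), or it has no effect.
os.environ.setdefault("HDF5_PLUGIN_PATH", "/opt/conda/lib/hdf5/plugin")  # hypothetical path

# import netCDF4  # HDF5 will now search the directory above for filter plugins
```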
This h5py issue seems relevant to the Windows blosc test failure we are seeing: "the plugin calls back into the HDF5 API (several plugins do in order to determine chunk size and type information) then it needs to be the same instance of the HDF5 library that h5py is using - so plugins installed system wide that call into the API will not work". This suggests that perhaps the plugin installed in the wheel is not using the same instance of the HDF5 library that netcdf4-python is using.
@NotSqrt could you test one of the Linux wheels generated here in a clean environment (outside the manylinux docker image environment) to make sure the plugins work? I can do the same for macOS.
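For reference, the "clean environment" smoke test can be scripted: create a throwaway venv, install the wheel into it, and try the import. A stdlib-only sketch of the import-check part (the `pip install` of the wheel is left as a comment since the wheel filename varies per build):

```python
import os
import subprocess
import tempfile
import venv

def import_ok_in_fresh_venv(module_name):
    """Create a throwaway venv and check whether `module_name` imports inside it."""
    with tempfile.TemporaryDirectory() as d:
        venv.create(d, with_pip=False)
        bindir = "Scripts" if os.name == "nt" else "bin"
        python = os.path.join(d, bindir, "python")
        # For a real wheel test, `pip install /path/to/netCDF4-*.whl` into the
        # venv first, then check the import.
        proc = subprocess.run([python, "-c", f"import {module_name}"],
                              capture_output=True)
        return proc.returncode == 0
```

Running the plugin tests (rather than just the import) inside the fresh venv is what actually exercises the bundled HDF5 filter plugins.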
Plugin tests pass on macOS in a clean virtual environment.
I've tested the x86_64 linux wheel from https://github.com/Unidata/netcdf4-python/actions/runs/19615699182, it works just like the wheel I've been building in my copy of the container. |
@ocefpaf will you have a chance to review anytime soon?
ocefpaf left a comment:
Just some minor comments. LGTM!
Here are the wheel sizes for macOS:
I'll investigate why macos14 is ~20 MB. That is not bad... but not good either. PS: I don't have a mac to test these wheels. It would be nice to test them on an "end user" laptop. @dopplershift sorry for the ping, but can you test them?
Here are the contents of the mac and windows wheels.
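Since a wheel is just a zip archive, its contents (for instance, which plugin DLLs delvewheel actually bundled) can be listed without installing it. A small stdlib sketch:

```python
import zipfile

def wheel_contents(wheel_path):
    """List the files inside a wheel; wheels are ordinary zip archives."""
    with zipfile.ZipFile(wheel_path) as wf:
        return sorted(wf.namelist())

def bundled_libs(wheel_path):
    """Filter the listing down to bundled shared libraries / filter plugins."""
    return [n for n in wheel_contents(wheel_path)
            if n.endswith((".so", ".dylib", ".dll", ".pyd"))]
```

This makes it easy to diff the mac and windows wheels and confirm whether blosc, zstd, and lz4 made it into each.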
The macOS arm64 wheels include
I need to figure out if
@ocefpaf I'd like to go ahead and merge while the issue of the size of the macos arm64 wheels is being worked on. Sound OK to you?
I'm fine with that. Hopefully the size reduction will trickle down to the wheel in the next build. |
To trigger the actions following #1447 and the new docker netcdf-manylinux images