Explicitly calculate dtype element size in netCDF3 records #466
Conversation
@martindurant I tried testing, but don't know whether it's not working or simply user error opening in xarray (cell [8] in https://gist.github.com/rsignell/cb6e3ed842abedb797e2cd8ccc39169c).
Should have been:
(remove
Note from our discussion: you can split any of the big arrays on the first dimension, which here is depth, length 40 (
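A minimal sketch of the splitting idea above, assuming a hypothetical record variable of shape `(40, 100, 200)` (depth first, as in the comment; the other dimensions and dtype are made up for illustration). Each depth slice becomes one byte range, so the reference set grows linearly with the number of slices:

```python
import numpy as np

# Hypothetical sketch: split a big record variable on its first
# dimension (depth, length 40) into one byte-range reference per slice.
shape = (40, 100, 200)                 # assumed (depth, y, x)
itemsize = np.dtype(">f4").itemsize    # assumed big-endian float32

# Bytes covered by one depth slice of the variable.
slice_bytes = int(np.prod(shape[1:])) * itemsize

# Byte offset of each depth slice relative to the variable's start.
offsets = [i * slice_bytes for i in range(shape[0])]

print(len(offsets), slice_bytes)  # 40 slices of 80000 bytes each
```

Finer splits (e.g. on the second dimension too) would multiply the number of references accordingly, which is the size/request-count trade-off discussed below.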
Test should be fixed by fsspec/filesystem_spec#1634
I'll merge this now and we can open any further issues that might arise. |
We'll need to decide what the best size is in the end. Having smaller chunks means more requests and a bigger reference set on disk. Since it's pretty easy to generate, we could make multiple variants and profile them.
Fixes #465
@rsignell-usgs please test. This is not what I thought was happening. I don't know why I can't trust numpy's `dt.itemsize`.
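One way `dt.itemsize` can disagree with an explicitly computed element size: for a compound dtype, `.itemsize` includes any alignment padding numpy inserted between fields. A minimal sketch (not the PR's actual code; the field layout is a made-up example):

```python
import numpy as np

# The same two fields, with and without C-struct alignment.
aligned = np.dtype([("a", "u1"), ("b", ">i4")], align=True)
packed = np.dtype([("a", "u1"), ("b", ">i4")])

# Explicit size: sum of the raw field sizes, with no padding.
explicit = np.dtype("u1").itemsize + np.dtype(">i4").itemsize

# align=True pads field "a" out to 4 bytes so "b" is aligned,
# so .itemsize (8) exceeds the packed/explicit size (5).
print(aligned.itemsize, packed.itemsize, explicit)  # 8 5 5
```

For a netCDF3 record, where on-disk layout is fixed by the file format rather than by numpy's struct rules, computing the size from the fields explicitly avoids depending on which padding convention the dtype happened to be built with.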