Conversation
docs/getting_started.md
```diff
- ds = xr.open_zarr(s3.get_mapper('chalmerscloudiceclimatology/record/gridsat/2020/ccic_gridsat_202001010000.zarr'))
+ aws_file_path = "chalmerscloudiceclimatology/record/gridsat/2021/ccic_gridsat_202101010000.zarr"
+ store = zarr.storage.FsspecStore(s3, path=aws_file_path)
+ ds = xr.open_zarr(store, consolidated=True)
```
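For context, the new lines in the diff are not self-contained: they assume an `s3` filesystem object already exists. A minimal sketch of the full flow, assuming Zarr ≥ 3 and s3fs are installed and that the CCIC bucket permits anonymous reads:

```python
import s3fs
import xarray as xr
import zarr

# Assumptions: the CCIC bucket allows anonymous reads, and Zarr 3's
# FsspecStore wants an async-capable fsspec filesystem.
s3 = s3fs.S3FileSystem(anon=True, asynchronous=True)

aws_file_path = "chalmerscloudiceclimatology/record/gridsat/2021/ccic_gridsat_202101010000.zarr"
store = zarr.storage.FsspecStore(s3, path=aws_file_path)
ds = xr.open_zarr(store, consolidated=True)
```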
`ds = xr.open_zarr('s3://chalmerscloudiceclimatology/record/gridsat/2021/ccic_gridsat_202101010000.zarr')` will simply work with a Zarr 3 installation and xarray (xarray will try to guess the store from the S3 key).
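Spelled out as a sketch, with anonymous access passed explicitly (an assumption on my part; drop `storage_options` if AWS credentials are configured):

```python
import xarray as xr

# xarray/zarr infer an s3fs-backed store from the "s3://" prefix;
# s3fs must be installed for this to work.
ds = xr.open_zarr(
    "s3://chalmerscloudiceclimatology/record/gridsat/2021/ccic_gridsat_202101010000.zarr",
    storage_options={"anon": True},  # assumption: bucket permits anonymous reads
)
```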
Also note that this PR fails the GitHub checks.

Update: Zarr 3 requires at least Python 3.11, but we use 3.10 in the environment YAML files, and Python 3.11 is not compatible with the PyTorch packages we specify there. I opened issue #101 for this. Related: PR #102
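The version constraint above can be checked at runtime before attempting an install; a minimal sketch (the helper name is my own, not part of any CCIC code):

```python
import sys

def supports_zarr3(version_info=sys.version_info) -> bool:
    """Return True if this interpreter meets Zarr 3's minimum Python (3.11)."""
    return tuple(version_info[:2]) >= (3, 11)

# A Python 3.10 environment, as in the current YAML files, does not qualify.
print(supports_zarr3((3, 10)))
```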
Thanks a lot for digging into this. Your suggestion is a lot cleaner. To make it work I still had to …
Ah, yes, of course, s3fs should be a dependency if you read from S3 buckets. I assumed the user would already have that on their end; I had in mind the more local use we do at Chalmers when I changed …

For future reference: I think something has changed with how …
This PR updates the instructions for accessing CCIC data from AWS to ensure compatibility with Zarr version 3.0 and above. The previous approach no longer worked due to changes in the Zarr API.