Develop prototype code for applying "fixes" to decadal data #98
Here's what I've written for this so far: https://gist.github.com/ellesmith88/532e6395afe1b53567dd564b15696d0e It's written as it would be in daops and dachar - with the generic functions and then dictionaries containing the info for each fix - so it's quite long at the moment. A few notes:
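As an illustration of that structure - generic fix functions plus dictionaries of per-fix info - here is a minimal sketch; all names and values are hypothetical, and the gist is the actual code:

```python
import xarray as xr

# generic fix function: add a scalar coordinate to a dataset
def add_scalar_coord(ds, coord_name, value, attrs):
    ds = ds.assign_coords({coord_name: value})
    ds[coord_name].attrs.update(attrs)
    return ds

# per-fix info, keyed by fix name (as it might look in dachar)
fixes = {
    "add_reftime": {
        "func": add_scalar_coord,
        "kwargs": {
            "coord_name": "reftime",
            "value": "1986-11-01T00:00:00",
            "attrs": {"long_name": "Start date of the forecast"},
        },
    },
}

# as daops might apply them
def apply_fixes(ds, fixes):
    for fix in fixes.values():
        ds = fix["func"](ds, **fix["kwargs"])
    return ds
```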
I'll have another look at the last 2 points. An additional point:
After:
I've tried overwriting this in ds.lat_bnds.attrs["coordinates"] and ds.lat_bnds.encoding["coordinates"] as we have done with fill values, but I can't seem to get rid of it.
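For reference, a minimal sketch of the overwrites described above (the file name is the original from this issue; this reproduces the attempted approach rather than a working fix):

```python
import xarray as xr

ds = xr.open_dataset(
    "tas_Amon_EC-Earth3_dcppA-hindcast_s1986-r1i1p1f1_gr_198611-198710.nc"
)

# overwrite the unwanted "coordinates" attribute in both places,
# mirroring what was done for fill values (encoding["_FillValue"] = None)
ds.lat_bnds.attrs["coordinates"] = None
ds.lat_bnds.encoding["coordinates"] = None
```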
Updated here: https://gist.github.com/ellesmith88/532e6395afe1b53567dd564b15696d0e Looks like the dropping of the time_bnds units is something that always happens when using xarray, but they still exist in the xarray dataset under `encoding`.
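Continuing from a dataset opened as in the sketch above, a quick check of where the units end up after decoding:

```python
# after decoding, the bounds units are gone from .attrs
print(ds.time_bnds.attrs.get("units"))     # -> None
# ...but xarray keeps them in .encoding for re-encoding on write
print(ds.time_bnds.encoding.get("units"))  # -> the original units string
```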
Questions to BSC:
Note to @agstephens and @ellesmith88: liaise with Piotr from the Met Office when we have implemented the fixes.
Hi @agstephens
Yes. Exactly.
Yes. The new sub_experiment_id has the month in addition to the year.
Yes. You can compute leadtime as reftime + time.
The reftime is the start year (taken from the file name) - 11-01 00:00.
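A minimal sketch of that computation, assuming "reftime + time" means the offset of each time value from the reftime, expressed in days (the units attribute below is an assumption):

```python
import numpy as np
import xarray as xr

ds = xr.open_dataset(
    "tas_Amon_EC-Earth3_dcppA-hindcast_s1986-r1i1p1f1_gr_198611-198710.nc"
)

# reftime: 1 November of the start year taken from the file name ("s1986")
reftime = np.datetime64("1986-11-01")

# leadtime: days elapsed between the reftime and each time value
leadtime = (ds["time"] - reftime) / np.timedelta64(1, "D")

ds = ds.assign_coords(reftime=reftime, leadtime=("time", leadtime.data))
ds["leadtime"].attrs["units"] = "days"
```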
Release plan:
Working notes...

Update all versions of packages.
Writing to elasticsearch

Internal link to Kibana: https://kibana-ror-master.130.246.131.9.nip.io/app/dev_tools#/console

Get the API key and write to:
Then run these tests...
It is possible to see the changes happening in Kibana by polling the following (which shows the index being created, populated and then deleted):
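For example, polling could look something like this from outside the Kibana console (the endpoint and API key are placeholders, and the `_cat/indices` call is an assumption about what was being polled):

```python
import time
import requests

# hypothetical endpoint and credentials - substitute the real ones
ES_URL = "https://<elasticsearch-host>/_cat/indices/c3s*?format=json"
HEADERS = {"Authorization": "ApiKey <api-key>"}

# poll the c3s* indexes to watch them being created, populated and deleted
for _ in range(20):
    resp = requests.get(ES_URL, headers=HEADERS)
    print(resp.json())
    time.sleep(5)
```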
There is no need to create an index; it happens automatically.

Read access from outside the firewall

I have updated the settings in Kibana so that indexes will be available outside the firewall if they are named "c3s*".

Test the fix

Here is how to test the new decadal fixes...
Temporary patch (to my version of Intake) - for decoding issues using Dask.DataFrame - note that JV had this issue on the CDS and version fixes resolved it. My patch is just:
We can resolve it later. Then the tests run:
and
We'll need to run the
After testing the fixes in
closed via roocs/daops#87 and roocs/dachar#93
We have an example file to start developing the required "fixes" for the decadal data to be added to the CMIP6 holdings for C3S.
The original NetCDF (pre-fixes) file is here:
https://github.com/cp4cds/c3s34g_master/blob/master/Decadal/ESGF-original/tas_Amon_EC-Earth3_dcppA-hindcast_s1986-r1i1p1f1_gr_198611-198710.nc?raw=true
The modified NetCDF (post-fixes) file is here:
https://github.com/cp4cds/c3s34g_master/blob/master/Decadal/tas_Amon_EC-Earth3_dcppA-hindcast_s198611-r1i1p1f1_gr_198611-198710.nc?raw=true
The initial task is to write some xarray code that will apply the changes (using the first file) and then write out an equivalent of the second file. Once we are happy with the approach, we can embed it into dachar and daops.
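A skeleton for that prototype might look like this (a sketch only; the individual fixes go between the open and the write):

```python
import xarray as xr

# open the original (pre-fixes) ESGF file
ds = xr.open_dataset(
    "tas_Amon_EC-Earth3_dcppA-hindcast_s1986-r1i1p1f1_gr_198611-198710.nc"
)

# ... apply the fixes listed under "Rules" below ...

# write out the equivalent of the modified (post-fixes) file
ds.to_netcdf(
    "tas_Amon_EC-Earth3_dcppA-hindcast_s198611-r1i1p1f1_gr_198611-198710.nc"
)
```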
Rules to create compliant format for files

Rules (a sketch applying the attribute changes follows the list):

- Change the `time:long_name` value to "valid_time"
- Add a `reftime` variable
- Update the `:startdate` global attribute (e.g. `:startdate = "s198611" ;`)
- Add a `leadtime` variable
- Update `startdate`, `sub_experiment_id`
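As flagged above, a hedged sketch of the attribute rules in xarray; the new values follow the thread (the month added to the year), and `ds` is the dataset opened in the skeleton above:

```python
# change time:long_name to "valid_time"
ds["time"].attrs["long_name"] = "valid_time"

# update the startdate and sub_experiment_id global attributes
ds.attrs["startdate"] = "s198611"          # was "s1986"
ds.attrs["sub_experiment_id"] = "s198611"  # was "s1986"

# the reftime and leadtime variables would be added as in the
# leadtime sketch earlier in the thread
```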