Saltflux option for CICE #799

Merged (24 commits) Dec 7, 2022

Conversation

@dabail10 (Contributor, Author) commented Dec 2, 2022

For detailed information about submitting Pull Requests (PRs) to the CICE-Consortium,
please refer to: https://github.com/CICE-Consortium/About-Us/wiki/Resource-Index#information-for-developers

PR checklist

  • Short (1 sentence) summary of your PR:
    This adds the saltflux option to CICE, along with a couple of cleanup items.
  • Developer(s):
    dabail10 (D. Bailey)
  • Suggest PR reviewers from list in the column to the right.
  • Please copy the PR test results link or provide a summary of testing completed below.
    Bit-for-bit with the aux_cice tests in CESM.
    Also ran base_suite on cheyenne using intel, pgi, and gnu. Added a salt test (only in the gnu base_suite here).
    https://github.com/CICE-Consortium/Test-Results/wiki/cice_by_hash_forks#f19faa770d7e06903bbae621687adadc0bce42ba
  • How much do the PR code changes differ from the unmodified code?
    • bit for bit
    • different at roundoff level
    • more substantial
  • Does this PR create or have dependencies on Icepack or any other models?
    • Yes
    • No
  • Does this PR update the Icepack submodule? If so, the Icepack submodule must point to a hash on Icepack's main branch.
    • Yes
    • No
  • Does this PR add any new test cases?
    • Yes
    • No
  • Is the documentation being updated? ("Documentation" includes information on the wiki or in the .rst files from doc/source/, which are used to create the online technical docs at https://readthedocs.org/projects/cice-consortium-cice/. A test build of the technical docs will be performed as part of the PR testing.)
    • Yes
    • No, does the documentation need to be updated at a later time?
      • Yes
      • No
  • Please provide any additional information or relevant details below:
    In addition to the saltflux option (Issue #706, "Add true salt flux computation"), I have added a few cleanup items:
  1. Fix an ACC directive in ice_dyn_evp_1d.F90 (Issue #783, "OPENACC code in ice_dyn_evp_1d.F90").
  2. Fix tmpstr (Issue #798, "tmpstr not defined in ice_init.F90").
  3. Remove the CESM1_PIO code in io_pio2/ice_restart.F90. This is no longer necessary now that we are not using landblock elimination (Issue #675, "Add CESM1_PIO for fill value check").
  4. Put an ifdef around the OpenMP initialization code in ice_init.F90. Not sure what is going on here yet (Issue #800, "OpenMP Initialization in ice_grid.F90").
  5. Add an option (wrapped in a CESMCOUPLED ifdef) to initialize ice in the 'default' case based on Tair instead of sst; see the sketch after this list. When coupling in CESM, the SST has not yet been received from the ocean at initialization, but Tair has.
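
For item 5, a minimal sketch of the idea (Tair, Tffresh, sst, and Tf are standard CICE fields and parameters, but the exact criterion used in this PR is an assumption):

#ifdef CESMCOUPLED
      ! CESM: SST has not yet arrived from the ocean at initialization,
      ! so decide where to place initial ice from the air temperature.
      ! Hypothetical criterion -- the actual test in the PR may differ.
      put_ice = (Tair(i,j) <= Tffresh)       ! put_ice is a hypothetical logical
#else
      put_ice = (sst(i,j) <= Tf(i,j) + p2)   ! hypothetical standalone criterion
#endif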

@apcraig (Contributor) left a comment:

I think we need to do some standalone CICE testing to make sure we understand what the results are going to be. I think the log comparisons are going to fail because of the new diagnostics even if it's all bit-for-bit. I can help with this if needed.

! north/south salt
call total_salt (work2)
totsn = global_sum(work2, distrb_info, field_loc_center, tarean)
totss = global_sum(work2, distrb_info, field_loc_center, tareas)
Contributor: Looks like you could reuse work1 here rather than adding a new array, work2.

Contributor (Author): Maybe. I thought they needed to be independent.

Contributor (Author): In this code we do not need work2, but in the print_points code we do. The array work2 is used in a number of places, so it is not really saving us memory.

Contributor: OK, it just looks like work2 was added in init_mass_diags just for the new salt diagnostic. work2 must exist separately in print_points and elsewhere.
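
If work1 is in fact free at this point, the reuse would be a one-line change; a sketch (whether work1 is actually available in init_mass_diags is the open question above):

! north/south salt, reusing the existing scratch array work1
call total_salt (work1)
totsn = global_sum(work1, distrb_info, field_loc_center, tarean)
totss = global_sum(work1, distrb_info, field_loc_center, tareas)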

@@ -260,6 +260,7 @@ subroutine input_data
highfreq, natmiter, atmiter_conv, calc_dragio, &
ustar_min, emissivity, iceruf, iceruf_ocn, &
fbot_xfer_type, update_ocn_f, l_mpond_fresh, tfrz_option, &
saltflux_option, ice_ref_salinity, &
Contributor: ice_ref_salinity could be shifted left one space, to align the &.

Contributor (Author): Ok.

@@ -1414,6 +1419,12 @@ subroutine input_data
write(nu_diag,*) subname//' WARNING: For consistency, set tfrz_option = mushy'
endif
endif
if (ktherm == 1 .and. trim(saltflux_option) /= 'constant') then
Contributor: Should this be an error, or just a warning that nobody will see?

Contributor (Author): I'm basically going with the example of tfrz_option here. It is not advisable to use ktherm = 1 with saltflux_option /= 'constant', but people could do it if they wanted.
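
If it stays a warning, it would presumably mirror the tfrz_option block shown in the diff above; a sketch (the exact message text is an assumption):

if (ktherm == 1 .and. trim(saltflux_option) /= 'constant') then
   write(nu_diag,*) subname//' WARNING: ktherm = 1 and saltflux_option = '//trim(saltflux_option)
   write(nu_diag,*) subname//' WARNING: For consistency, set saltflux_option = constant'
endif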

Contributor: Fine by me, was just asking.

ice_IOUnitsInUse(nu_diag) = .true. ! reserve unit nu_diag
#ifdef CESMCOUPLED
! CESM can have negative unit numbers.
if (nu_diag < 0) nu_diag_set = .true.
Contributor: Had a quick look at the nu_diag_set usage and don't love it overall. Can we put #ifdef CESMCOUPLED around it everywhere, so it's clear it has nothing to do with CICE?

Contributor: Including the declaration in ice_fileunits.F90.

Contributor: Or could we get rid of nu_diag_set and use the CESMCOUPLED cpp to "turn off" whatever CICE is doing to set/release nu_diag? With CESMCOUPLED, if we can assume nu_diag is set by the coupler layer, we shouldn't need nu_diag_set.

Contributor (Author): This is fine.
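
A minimal sketch of wrapping nu_diag_set everywhere, following the snippet quoted above (the exact declaration and kind are assumptions):

! in ice_fileunits.F90 -- declaration only compiled when coupled
#ifdef CESMCOUPLED
      logical (kind=log_kind), public :: &
         nu_diag_set = .false.   ! true if nu_diag was supplied by the coupler
#endif

! where the unit is reserved (from the snippet above)
      ice_IOUnitsInUse(nu_diag) = .true. ! reserve unit nu_diag
#ifdef CESMCOUPLED
      ! CESM can have negative unit numbers
      if (nu_diag < 0) nu_diag_set = .true.
#endif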

@@ -239,7 +243,8 @@ subroutine release_all_fileunits
call release_fileunit(nu_rst_pointer)
call release_fileunit(nu_history)
call release_fileunit(nu_hdr)
if (nu_diag /= ice_stdout) call release_fileunit(nu_diag)
! CESM can have negative unit numbers
if (nu_diag > 0 .and. nu_diag /= ice_stdout) call release_fileunit(nu_diag)
Contributor: Maybe rather than changing this logic, it should just be:

#ifndef CESMCOUPLED
if (nu_diag /= ice_stdout) call release_fileunit(nu_diag)
#endif

Contributor (Author): We still need to make sure not to release the negative file units here. However, I am fine with wrapping this in a CESMCOUPLED ifdef.
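
One way to satisfy both points; a sketch, not the merged code:

#ifdef CESMCOUPLED
      ! CESM can have negative unit numbers; never release those
      if (nu_diag > 0 .and. nu_diag /= ice_stdout) call release_fileunit(nu_diag)
#else
      if (nu_diag /= ice_stdout) call release_fileunit(nu_diag)
#endif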

@apcraig (Contributor) commented Dec 2, 2022:

Overall, I think it would be nice if the unit number management in CICE were not altered by the coupled models. If we could come up with a few rules, like:

  • the coupler layer will always set nu_diag, but no other unit numbers
  • when coupled, nu_diag should not be set or released by CICE
  • the CESMCOUPLED cpp can be used as needed to change the nu_diag implementation

then it should be easy to clean up the unit number management (really just nu_diag) in CICE and make it a lot clearer. Does that make sense?

@dabail10 (Contributor, Author) commented Dec 2, 2022

I think this is a good idea.

@apcraig (Contributor) commented Dec 2, 2022

GitHub Actions indicates the code is not compiling. I recommend standalone testing.

@dabail10 (Contributor, Author) commented Dec 2, 2022

I fixed the issue with nu_diag_set in ice_init.F90, so it should compile now. Although I don't think it will actually run, because it requires the new Icepack submodule?

@apcraig (Contributor) commented Dec 2, 2022

I was just noticing that it needs the updated Icepack.

@dabail10 (Contributor, Author) commented Dec 2, 2022

I will do some standalone testing though.

@apcraig (Contributor) commented Dec 6, 2022

You should be able to update Icepack on your branch and push to update the PR.

How about also changing the name of the test from salt to saltflux? Thanks.

@apcraig (Contributor) commented Dec 6, 2022

I think the Icepack update is incorrect; GitHub Actions is also failing. You could try the following and see if it helps:

  • mv icepack icepack.old
  • git submodule update --init
  • cd icepack
  • git pull origin main
  • git log (should show 8637a2d)
  • cd ../
  • git add icepack
  • git commit -am "update icepack"
  • git push ...

@dabail10 (Contributor, Author) commented Dec 6, 2022

Hmm. I thought I had the right one. I will check it.

@dabail10 (Contributor, Author) commented Dec 6, 2022

Sorry. I was pointing to ESCOMP instead of Consortium main for Icepack.

@dabail10 (Contributor, Author) commented Dec 6, 2022

I just synced my ESCOMP main with Consortium main. I do like this option in GitHub now.

@dabail10 (Contributor, Author) commented Dec 6, 2022

OK. Updating Icepack is creating some conflicts with the current Icepack hash.

@apcraig (Contributor) commented Dec 6, 2022

I think Icepack is OK now. Will run some tests just to be sure.

@dabail10 (Contributor, Author) commented Dec 6, 2022

There were some weird conflicts in CICE, but I was able to resolve these.

@apcraig (Contributor) commented Dec 7, 2022

The full test suite on cheyenne is all bit-for-bit with 3 compilers.
