Flextopo #194

Merged: 30 commits merged into master from flextopo on Apr 14, 2022

Conversation

@laurenebouaziz (Contributor) commented Jan 12, 2022

Implementation of the FlexTopo model with a flexible number of classes and modular options for the equations/presence of the different storages.

In the TOML file, the classes are specified under [model] with:
classes = ["H", "P", "W"]
The storage functions for each class are then selected per storage with, for example:
selectSi = ["interception_no_storage", "interception_overflow", "interception_overflow"]

The snow storage and the glacier module are common to all classes.
Interception, horton ponding, horton runoff, root-zone and fast storage are class specific (and modular: they can be included or excluded for a class through the choice of the function).
The slow reservoir is a single reservoir shared by the different classes.

To do (things discussed with Willem and Martijn on 12 Jan 2022):

  • adapt I/O to be able to write outstates and outmaps for SVectors with dimension class. This implies making I/O more generic, with multiple options for dimensions (layers or classes), and deciding how to best handle the coordinates (1, 2, 3 or H, P, W)
  • adapt reading of parameter maps from NetCDF with an extra dimension (make sure the values of the parameters can be defined in the TOML; see what the best way is to do this: [input.vertical] imax['H'], imax[1], [input.vertical.H], etc.)
  • check options to write to csv:

[[csv.column]]
header = "Sh_H"
map = "sub_S06"
parameter = "vertical.Sh[1]"
reducer = "mean"

or

[[csv.column]]
header = "Sh_H"
map = "sub_S06"
parameter = "vertical.Sh"
reducer = "mean"
dim = 'H'

  • add documentation
  • add example test case (preferably out of hydromt)
  • replace short names of the variables with more explicit names with capital letters

Changes made in utils.jl slipped in during testing and probably just need to be reverted/removed.

Note:
I tested explicitly writing vertical.Sw, vertical.Sww before the function:
vertical.Sw, vertical.Sww = lateral_snow_transport!(
    vertical.Sw,
    vertical.Sww,
    lateral.land.sl,
    network.land,
)
but this gives the error: LoadError: setfield!: immutable struct of type FLEXTOPO cannot be changed.
The same is true if implemented in SBM. Not sure what this means for the update of the snow and snow water?
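
For reference, a minimal sketch (with made-up struct and field names, not the Wflow code) of why this error appears: fields of an immutable struct cannot be rebound, but the vectors they hold can still be mutated in place, which is presumably why the in-place update inside lateral_snow_transport! works.

```julia
struct Storages
    snow::Vector{Float64}
    snowwater::Vector{Float64}
end

s = Storages(zeros(3), zeros(3))

# s.snow = ones(3)     # ERROR: setfield!: immutable struct of type Storages cannot be changed
s.snow .= 1.0          # OK: broadcast assignment mutates the existing vector in place
s.snowwater[2] = 0.5   # OK: element assignment also mutates in place
```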

@verseve (Member) commented Jan 19, 2022

After our discussion on 12 January 2022, I made the following changes to the code:

  • Dimension classes is used as an extra dimension in flextopo. This should be specified in the TOML file (was already the case), for example: ["H", "P", "W"], and in the same way in the staticmaps.nc file. The extra dimension is provided during the initialization of the model, and written to states and gridded output.
  • It is now also possible to set model parameters with an extra dimension like layer and classes with a value for each coordinate of the dimension, for example:
[input.vertical]
imax.value = [2.0, 2.0, 3.0]
  • For scalar output (CSV and NetCDF) an index_dim can be provided in the TOML file for model parameters with an extra dimension (layer or classes) that are stored as SVectors. For example:
[[csv.column]]
header = "imax_H"
map = "sub_S06"
parameter = "vertical.imax"
reducer = "mean"
index_dim = 1 # class "H"

@laurenebouaziz (Contributor, Author):

great! thanks @verseve !!
I will test this this week, but just a few questions before:

[input.vertical]
imax.value = [2.0, 2.0, 3.0]

does this mean that we can only assign values to imax?
It would be useful if, as for the other parameters, we could also assign a map, and a scale and offset based on a map.
Is this also possible with the current implementation?

and is it also possible to read in parameters from staticmaps with a classes dimension?

@verseve (Member) commented Jan 19, 2022

[input.vertical]
imax.value = [2.0, 2.0, 3.0]

does this mean that we can only assign values to imax? It would be useful if, as for the other parameters, we could also assign a map, and a scale and offset based on a map. Is this also possible with the current implementation?

No, you can also assign values through a NetCDF variable. Note that the NetCDF variable should have dimension classes if the variable is used for different classes. The order of classes is specified in the TOML file.

Scale and offset should also work.

and is it also possible to read in parameters from staticmaps with a classes dimension?

Yes, see answer above.

@laurenebouaziz (Contributor, Author):

ok, nice that reading in parameters with classes dimension also works, I guess this is what you mean with:

No, you can also assign values through a NetCDF variable. Note that the NetCDF variable should have dimension classes if the variable is used for different classes. The order of classes is specified in the TOML file.

and about:

Scale and offset should also work.

what would be the syntax to apply a scale and offset on a map in the staticmaps?
something like this?

[input.vertical.imax]
#remove from previous list
index_dim = 1
netcdf.variable.name = "map_in_staticmaps"
scale = 2
offset = 0

@verseve (Member) commented Jan 19, 2022

ok, nice that reading in parameters with classes dimension also works, I guess this is what you mean with:

Yep, this works fine.

what would be the syntax to apply a scale and offset on a map in the staticmaps? something like this?

[input.vertical.imax]
#remove from previous list
index_dim = 1
netcdf.variable.name = "map_in_staticmaps"
scale = 2
offset = 0

Yes, except that this is not working with index_dim. Scale and offset are applied to all elements:

A = read_standardized(nc, var, dim_sel) .* mod.scale .+ mod.offset

@laurenebouaziz (Contributor, Author):

Yes, except that this is not working with index_dim. Scale and offset are applied to all elements:

Is it possible to implement scale and offset with the index_dim?
I think it would make it more flexible for calibration, when a parameter of a specific class requires calibration without affecting the values of the other classes.

@verseve (Member) commented Jan 19, 2022

Is it possible to implement scale and offset with the index_dim?
I think it would make it more flexible for calibration, when a parameter of a specific class requires calibration without affecting the values of the other classes.

Sure, this is possible. Probably good to support multiple indexing then, I guess there is also the use case to modify more than one index during calibration?

@laurenebouaziz (Contributor, Author):

Yes, good suggestion. This would imply that if index_dim is not specified, all components are changed, and if it is specified, only the specified component is changed?

@verseve (Member) commented Jan 19, 2022

And what if you want to change multiple indices at once, for example index 1 and 3 in the example below. Is this useful?

[input.vertical.imax]
index_dim = [1,3]
netcdf.variable.name = "map_in_staticmaps"
scale = [2.0, 0.5]
offset = [0.0, 0.0]
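
A sketch of how applying scale and offset to selected indices could look (hypothetical helper, not the Wflow implementation; the parameter is represented here as a simple classes x cells matrix for illustration):

```julia
# Apply scale/offset only to the rows (classes) listed in index_dim,
# leaving the other classes untouched.
function apply_scale_offset(A::Matrix{Float64}, index_dim, scale, offset)
    B = copy(A)
    for (k, idx) in enumerate(index_dim)
        B[idx, :] .= A[idx, :] .* scale[k] .+ offset[k]
    end
    return B
end

# corresponds to index_dim = [1, 3], scale = [2.0, 0.5], offset = [0.0, 0.0]
apply_scale_offset(ones(3, 5), [1, 3], [2.0, 0.5], [0.0, 0.0])
```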

@laurenebouaziz (Contributor, Author):

Yes, this sounds useful as well. And about "map_in_staticmaps": can this be a map without a "classes" dimension?

@verseve (Member) commented Jan 19, 2022

Yes, this sounds useful as well. And about "map_in_staticmaps": can this be a map without a "classes" dimension?

Yes, this is definitely possible; then it is a parameter without an extra dimension like classes or layer. index_dim is then not used/required, only scale and offset.

@visr (Member) commented Jan 19, 2022

Just to check @verseve, you mention the classes are specified in the TOML like ["H", "P", "W"]. Must this match the netCDF dimension labels exactly, or can it be a subset? What about different order?

And with this:

index_dim = 1 # class "H"

Would that be an index into the list of classes in the TOML, or the netCDF? Do you think it might be a good idea to instead specify the label?

class = "H"

@visr (Member) commented Jan 19, 2022

And @verseve, now that you dug into the code more, do you wish to proceed with the SVector approach? I still have to check the code, but is it correct that we run per class, but then have to recreate each SVector nclass times (with setindex)? Do we already have an idea about performance compared to Python? @laurenebouaziz if you still have the alternate design we discussed in your notes, could you post it in the thread, even if we won't end up using it?

@verseve (Member) commented Jan 19, 2022

Just to check @verseve, you mention the classes are specified in the TOML like ["H", "P", "W"]. Must this match the netCDF dimension labels exactly, or can it be a subset? What about different order?

Yes, this must match the NetCDF dimension exactly (order and size).

Would that be and index into the list of classes in the TOML, or the netCDF? Do you think it might be a good idea to instead specify the label?

class = "H"

The index_dim is now the index into the model parameter (for CSV and scalar NetCDF output). Indeed, for classes, index 1 corresponds to class H in the example. This is a very simple generic approach (model independent), but less explicit (a user needs to connect the information). We could also resolve the index in the code; it requires a bit more code, but I think the advantage is that it is more explicit.
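
Resolving the label in the code could be as simple as the following sketch (illustrative only, names assumed):

```julia
classes = ["H", "P", "W"]   # order as specified in the TOML file
label = "H"                 # what a `class = "H"` entry in the TOML would provide

index = findfirst(isequal(label), classes)
index === nothing && error("class $label not found in classes $classes")
# index == 1, so the value for class "H" can be selected with e.g. Sh[index]
```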

@verseve (Member) commented Jan 19, 2022

And @verseve, now that you dug into the code more, do you wish to proceed with the SVector approach?

I think it is fine to proceed with the SVector approach. I haven't checked the update functions/performance yet; my impression is that if we run in the same way as the sbm model (unsaturated zone layers), it should be fine.
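
For reference, an illustrative sketch of the SVector approach discussed above: SVectors are immutable, so updating the storage of one class means rebuilding the whole SVector with the non-mutating setindex from StaticArrays.

```julia
using StaticArrays

function update_per_class(Si::SVector{N, Float64}) where {N}
    for k in 1:N
        new_value = Si[k] + 0.1               # stand-in for the class-specific storage update
        Si = Base.setindex(Si, new_value, k)  # returns a new SVector with element k replaced
    end
    return Si
end

update_per_class(SVector(0.0, 0.0, 0.0))  # SVector(0.1, 0.1, 0.1)
```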

@laurenebouaziz (Contributor, Author):

Here is a copy of the notes we took for the other setup we had discussed.
I did not yet make a good comparison of run times for Julia and Python.

#W=wetland(0) H=hillslope(1)  P=plateau(2)  WD = drained wetland(3) 
classes = ['W','H','P']
timestepsecs = 3600

#selection of reservoir configuration - 'None' means reservoir should not be modelled for this class
selectSi = interception_overflow2, interception_no_reservoir, interception_overflow2
selectSu = unsatZone_LP_beta, unsatZone_LP_beta, unsatZone_forAgri_hourlyEp
selectSa = None, None, agriZone_hourlyEp_Sa_beta_frostSamax
selectSf = fastRunoff_lag2, fastRunoff_lag2, fastRunoff_lag2
selectSfa = None, None, fastRunoff_lag_agriDitch
selectSw = snow_rain_hourlyEp, snow_rain_hourlyEp, snow_rain_hourlyEp

const storage_functions = Dict{String, Function}(  # a binding name is needed here; "storage_functions" is just a placeholder
    "interception_overflow2" => interception_overflow2,
)

struct InterceptionReservoir{T}
	# for 1 class and all pixels
	data::Vector{T}
end

function interception_overflow2(res::InterceptionReservoir)
  n = length(res.data)  # number of pixels
  for i in 1:n
    # compute; fluxa/fluxb are placeholders for the actual flux fields
    res.fluxa[i] = min(res.fluxb[i], 10)
  end
end

interception2


struct TopoFlexReservoirs{T}
	i::InterceptionReservoir{T}
	u::Unsat..
end



struct TopoFlex
	classes::Vector{Pair{TopoFlexReservoirs, Function}}
end

[
[interception_overflow2, unsatZone_LP_beta]#wetland
#hillslope
]

About:

We could also resolve the index in the code; it requires a bit more code, but I think the advantage is that it is more explicit.

Agree, it could also prevent user errors about starting with index 0 or 1 (i.e. I believe the c parameter has dim values [0,1,2,3] in staticmaps.nc, but in Julia the index would be 1,2,3,4, right?)

@laurenebouaziz (Contributor, Author) commented Jan 27, 2022

I tested the following things which all work nicely:

  • write outstates with classes dimension
  • write outmaps with classes dimensions
  • read in params with classes dimension from staticmaps
  • read in params with assigned values from ini file
  • multiplication of parameters (so far over the whole element)
  • write csv for index_dim 1, 2 and 3 (in this branch it is also still possible to have the three at once with [...], but I guess this will be changed when merging with the changes made in #dim-layer-csv)
  • reinit also works, but the lines don't match exactly (I noticed a similar behavior also for sbm, see attached figure). Not sure what is happening there.

I also changed the names of the variables to longer, more explicit names without capital letters, to make them more consistent with the other concepts in Wflow. I still need to:

  • make an example staticmaps, forcing and configfile and
  • write the documentation.

What remains of things we had discussed before is:

  • multiplication per element
  • refer to index_dim with the name of the dimension instead of its index?

Is it still possible for you to have a look at this, @verseve (no rush)? (I added a new version of the TOML file with the new names in your postbox in case you need it.) And thanks for all the changes so far :) !!

[attached figure: states_check_sbm]

@verseve (Member) commented Jan 28, 2022

  • reinit also works, but the lines don't match exactly (I noticed a similar behavior also for sbm, see attached figure). Not sure what is happening there.

Did you check the test in \Wflow\test\run.jl, part of the Wflow tests? This is testing restart of runs (state handling). Perhaps something is wrong with the dates in your TOML file?

@verseve (Member) commented Feb 2, 2022

And @laurenebouaziz, for running flextopo as part of Delft-FEWS, 3D data/SVectors can be imported by the General Adapter run through the z coordinates of the mapped internal FEWS locations. For sbm this is now implemented in #193. For flextopo this is a bit of a hack, since the extra dimension in flextopo refers to a class and not a depth. Other options are:

  • for a FEWS run, export the SVector as 2D maps (renaming the variable by adding the class).
  • import as an ensemble; I haven't checked this and am not sure if/how this works with other ensembles.

@laurenebouaziz (Contributor, Author):

Thanks @verseve for checking how it would work in FEWS for flextopo. For scalar NetCDF at point locations, I don't think it is a problem to separate the output for the different classes with separate names and import them separately in FEWS. However, for outmaps and outstates, I don't think there is a way to split the output for each class in the TOML? And would the states still be picked up correctly in a FEWS warm states run? I guess for outmaps and outstates it would probably be better to misuse the z coordinate for the classes, what do you think?

@verseve (Member) commented Feb 11, 2022

Update:

  • rebased; indexing in the TOML file is now based on label (as in the master branch), so e.g. class = "H".
  • flextopo is now also compatible with FEWS; in case of a FEWS run the extra dimension classes has Float values.
  • dimension can be supplied when modifying an input NetCDF variable:
[input.vertical.imax]
netcdf.variable.name = "map_in_staticmaps"
scale = 2.0
offset = 0.0
class = "H"

@laurenebouaziz (Contributor, Author):

I finally had some time to test the model and compare it with the Python runs, and I was able to reproduce the results :)

  • About what you previously suggested:
> And what if you want to change multiple indices at once, for example index 1 and 3 in the example below. Is this useful?
> 
> ```toml
> [input.vertical.imax]
> index_dim = [1,3]
> netcdf.variable.name = "map_in_staticmaps"
> scale = [2.0, 0.5]
> offset = [0.0, 0.0]
> ```

I think it is a good idea and I implemented this option for parameter multiplication over several classes in commit c5d5c91.
@verseve, maybe we can have a look together to see if this needs to be improved or adapted?

I think, as it was, parameter multiplication only worked for a single class, and it seemed to me that there was no possibility to change the parameters over multiple classes at the same time; for example, the code below does not work:

[input.vertical.alfa]
netcdf.variable.name = "alfa"
scale = 1.3
offset = 0
class = "p"

[input.vertical.alfa]
netcdf.variable.name = "alfa"
scale = 1.5
offset = 0
class = "h"

  • Also I noticed that with the changes made in 18e0d25 writing states and outmaps with multiple classes did not work anymore. I think it is due to the fact that:

extra_dim::NamedTuple{(:name, :value),Tuple{String,Vector{Float64}}},

expects a vector of Float64 while the classes are a vector of strings, so if I replace this line with:
extra_dim::NamedTuple{(:name, :value),Tuple{String,Vector{String}}},
it works for flextopo, but I don't think it will work for sbm anymore, so I wanted to check with you how to best fix this, @verseve?

  • Last thing, for the FEWS run, I was wondering if it is correct that classes here becomes 3 and not [1,2,3]:
    classes = fews_run ? Float64.(collect(length(flextopo.classes))) : flextopo.classes

@verseve (Member) commented Mar 17, 2022

* Also I noticed that with the changes made in [18e0d25](https://github.com/Deltares/Wflow.jl/commit/18e0d255040e80bd6ee50708c2e402803bb1c56e) writing states and outmaps with multiple classes did not work anymore. I think it is due to the fact that:

extra_dim::NamedTuple{(:name, :value),Tuple{String,Vector{Float64}}},

expects a vector of Float64 while the classes are a vector of strings, so if I replace this line with: extra_dim::NamedTuple{(:name, :value),Tuple{String,Vector{String}}}, it works for flextopo, but I don't think it will work for sbm anymore, so wanted to check with you how to best fix this @verseve ?

I think this should fix this: extra_dim::NamedTuple{(:name, :value),Tuple{String, Vector{T}}} where T <: Union{String, Float64},
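
A quick illustrative check (with a made-up function name) that such a parametric signature accepts both the sbm case (Float64 layer coordinates) and the flextopo case (String classes):

```julia
function describe_extra_dim(
    extra_dim::NamedTuple{(:name, :value), Tuple{String, Vector{T}}},
) where {T <: Union{String, Float64}}
    return "$(extra_dim.name) with $(length(extra_dim.value)) coordinates"
end

describe_extra_dim((name = "layer", value = [0.1, 0.2, 0.5]))    # sbm-style
describe_extra_dim((name = "classes", value = ["H", "P", "W"]))  # flextopo-style
```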

@verseve (Member) commented Mar 17, 2022

* last thing, for the fewsrun, I was wondering if it is correct that classes here becomes 3 and not [1,2,3]:
  https://github.com/Deltares/Wflow.jl/blob/c5d5c91dbcc9439c670bedaf1c7353edd439f07f/src/flextopo_model.jl#L660

Thanks for spotting this @laurenebouaziz, indeed for FEWS the classes should be [1.0,2.0,3.0] and not 3.
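
For illustration (not the committed fix), the difference between the two expressions:

```julia
classes = ["H", "P", "W"]

Float64.(collect(length(classes)))    # 0-dimensional array holding 3.0
Float64.(collect(1:length(classes)))  # [1.0, 2.0, 3.0], one coordinate per class
```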

@laurenebouaziz (Contributor, Author):

Thanks @verseve! Both are fixed now.

@verseve (Member) commented Mar 17, 2022

* About what you previously suggested:
> And what if you want to change multiple indices at once, for example index 1 and 3 in the example below. Is this useful?
> 
> ```toml
> [input.vertical.imax]
> index_dim = [1,3]
> netcdf.variable.name = "map_in_staticmaps"
> scale = [2.0, 0.5]
> offset = [0.0, 0.0]
> ```

I think it is a good idea and I implemented this option for parameter multiplication over several classes in commit c5d5c91. @verseve, maybe we can have a look together to see if this needs to be improved or adapted?

I did briefly check the code and agree that modifying multiple indices at once is useful. I did notice it is only implemented for the class dimension; I recommend also implementing this for layer.

@verseve linked an issue on Apr 8, 2022 that may be closed by this pull request
@verseve (Member) commented Apr 8, 2022

Would be good to increase codecov for this PR, @laurenebouaziz?

@visr (Member) commented Apr 8, 2022

Indeed that would be good, currently:

7.82% of diff hit (target 88.71%)

(although these numbers are a bit inflated due to unrelated config.input > config changes)

I see you did quite some checks offline, but the best would be to add (some of) these checks to the tests such that we know we won't break it in the future.

@laurenebouaziz (Contributor, Author):

Yes, good point. What kind of tests do you recommend I add?

@visr (Member) commented Apr 8, 2022

The code coverage output may help to identify some functions or features that are currently not covered by tests:

https://app.codecov.io/gh/Deltares/Wflow.jl/compare/194/diff#D6L531

Try to see what new functionality is not currently being run.

@visr (Member) left a comment


I made a few comments that would be good to address before merging, but overall this seems pretty close to being finished, nice job.

Review comments were made on src/utils.jl, src/flextopo_model.jl, and src/flextopo.jl (all marked resolved or outdated).
@verseve (Member) commented Apr 12, 2022

I think it would be good to add to the docs a bit more info on changing/setting input parameters with an extra dimension:

  1. Set to a fixed value, e.g.:
[input.vertical]
c.value = [10.0, 12.0, 9.0, 8.5]
  2. Modifying input parameters with an extra dimension.

I will add some info on this to docs\src\user_guide\step2_settings_file.md, and also mention there the extra dimension class (now only layer is mentioned there).

verseve and others added 4 commits April 12, 2022 15:31
@laurenebouaziz (Contributor, Author):

Thanks @verseve for completing the docs on parameter multiplication with an extra dimension!
Thanks @visr for the review! I addressed the changes in commit 7dde6ff. There is still one open question about removing "config" from the functions; not sure what is best here.
I expanded the test in 97a0704, which should increase the codecov percentage.
Let me know if you think it is ready to merge now!

@visr (Member) commented Apr 13, 2022

Thanks for addressing the comments. Indeed the code coverage looks good now. Shall we remove the extra config argument?

@verseve (Member) commented Apr 13, 2022

Thanks for addressing the comments. Indeed the code coverage looks good now. Shall we remove the extra config argument?

Yes, I agree to remove the extra config argument if it is not used!

@laurenebouaziz merged commit 23338f3 into master on Apr 14, 2022
@laurenebouaziz deleted the flextopo branch on April 14, 2022 at 11:17
Successfully merging this pull request may close these issues.

Flextopo implementation in Julia