large share of memory footprint in two variables #272
We have two cohort arrays that are dimensioned by nlevleaf: ts_net_uptake and year_net_uptake. nlevleaf is quite large, default = 40, so the pair is equivalent to 80 scalar cohort-level variables. These two variables may be taking up more than half of the total memory. We should investigate how to reduce/remove these arrays. |
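For concreteness, a minimal sketch of the kind of declarations at issue (illustrative only; the names follow the discussion below, but the real declarations in the FATES cohort type may differ):

```fortran
module cohort_sketch_mod
  ! Illustrative sketch, not the real FATES declarations: two
  ! nlevleaf-sized arrays per cohort cost as much as 80 scalar members.
  implicit none
  integer, parameter :: r8 = selected_real_kind(12)
  integer, parameter :: nlevleaf = 40          ! leaf layers; default = 40

  type :: ed_cohort_sketch
     real(r8) :: ts_net_uptake(nlevleaf)       ! per-timestep net uptake, by leaf layer
     real(r8) :: year_net_uptake(nlevleaf)     ! annually accumulated net uptake, by leaf layer
  end type ed_cohort_sketch
end module cohort_sketch_mod
```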
OK, so the reason we have two of these is that ts_net_uptake is calculated inside iteration loops, so we can't just update year_net_uptake directly. The reason we have 40 layers is less clear. Ostensibly, this is the maximum plausible LAI divided by dinc, the layer thickness, which is currently 1.0. 40 seems a bit high, but is probably there to accommodate crazy things happening. I can imagine we could halve it and all would be well (maybe we want it to crash if LAI > 20?). The alternative is only looking at the bottom 'n' layers of the canopy, but that gets bogged down in canopy thickness changing. Or we could not do this optimization at all, or something else I haven't thought of yet... |
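A sketch of that sizing logic as described above (names and the max-LAI ceiling are assumptions for illustration, not the actual FATES code):

```fortran
program nlevleaf_sizing
  ! Assumed sizing rule from the comment above:
  ! nlevleaf ~ maximum plausible LAI / layer thickness dinc.
  implicit none
  integer, parameter :: r8 = selected_real_kind(12)
  real(r8), parameter :: dinc    = 1.0_r8     ! layer thickness in LAI units
  real(r8), parameter :: max_lai = 40.0_r8    ! implied ceiling; 20 would halve the arrays
  print *, 'nlevleaf =', ceiling(max_lai / dinc)   ! prints 40
end program nlevleaf_sizing
```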
Without having looked at the code recently... Maybe we could create a patch-level array to hold this info that is dimensioned (pft, nlayer, nlevleaf)? I don't know how much memory that would save; it depends on how much larger maxcohortperpatch is than numpft*nclmax. |
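A back-of-envelope comparison of the two layouts; the dimension values here (numpft, nclmax, maxcohortperpatch) are assumptions for illustration, not FATES defaults:

```fortran
program memory_compare
  ! Reals per patch: current cohort-level storage of the two arrays vs.
  ! the proposed patch-level (pft, nlayer, nlevleaf) array.
  ! All dimension values below are illustrative assumptions.
  implicit none
  integer, parameter :: nlevleaf          = 40
  integer, parameter :: numpft            = 12    ! assumed
  integer, parameter :: nclmax            = 2     ! assumed canopy layers
  integer, parameter :: maxcohortperpatch = 100   ! assumed

  print *, 'cohort-level reals per patch:', 2 * maxcohortperpatch * nlevleaf  ! 8000
  print *, 'patch-level  reals per patch:', 2 * numpft * nclmax * nlevleaf    ! 1920
end program memory_compare
```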
Hmm. I guess it is really just a PFT/leaflayer quantity. I think? |
Looking more at the code, I think we can get away with just one of the two; or rather, I think we can remove ts_net_uptake. We could add to ccohort%year_net_uptake in the same place we fill in ts_net_uptake: line 475 of FatesPlantRespPhotosynthMod.F90. Another option: we could track just the bottom-most (and thereby, hypothetically, the most carbon-starved) layer, along with the index of that layer. And instead of calling the routine yearly, we could call it at a higher frequency, which would help ameliorate the case of plants that want to trim multiple layers per year. |
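A sketch of the first option, with hypothetical names and signature (the real statement is near line 475 of FatesPlantRespPhotosynthMod.F90; it reuses the sketch module above, and note the caveat about iteration loops raised in the next comment):

```fortran
subroutine accumulate_uptake(ccohort, anet, lmr, dtime, iv)
  ! Hypothetical sketch: add the timestep increment straight into
  ! year_net_uptake instead of staging it in ts_net_uptake. This is only
  ! safe if it executes once per accepted timestep, i.e. not inside a
  ! solver iteration that may repeat the calculation.
  use cohort_sketch_mod, only : ed_cohort_sketch, r8
  implicit none
  type(ed_cohort_sketch), intent(inout) :: ccohort
  real(r8), intent(in) :: anet    ! net assimilation for this leaf layer
  real(r8), intent(in) :: lmr     ! leaf maintenance respiration
  real(r8), intent(in) :: dtime   ! model timestep [s]
  integer,  intent(in) :: iv      ! leaf layer index

  ccohort%year_net_uptake(iv) = ccohort%year_net_uptake(iv) + (anet - lmr) * dtime
end subroutine accumulate_uptake
```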
The issue with sub-annual calling is places that have an off season, as I recall. If, say, leaves come into positive carbon balance in March, and you are cropping leaves based on their 3-month net uptake, it's still going to look like the bottom leaves are highly unproductive for a while after they start to have npp > 0. On the bottom-layer tracking: how does it work if the LAI grows from 5.9 to 6.1, so that 5-6 is no longer the bottom? (Particularly if it then declines back to 5.9, having missed out on the intervening period?) |
Isn't the ts_net_uptake calculation still inside the temperature iteration also? (Apologies for negative responses to ideas. I've been grappling with this too...) |
Sorry for the late response @rosiealice; this problem made my brain hurt as I tried to figure it out. @rosiealice, regarding your point on the bottom-layer tracking issue, thinking out loud here: I guess the uptake rate we would track would be for whatever the lowest layer is for the fully flushed plant at its target (or carrying capacity), at the current trimming rate. If a plant is constantly at a leaf area/biomass that does not reach this layer, then I suppose we would not be updating the uptake rate in that lowest layer, and wouldn't need to, because the plant doesn't even fill out that far and it wouldn't be relevant anyway. Now, if the plant grows in stature, the index of the bottom leaf layer gets larger as it grows; I'm asking myself whether the uptakes that have been tracked at the lower layers are still relevant/meaningful/accurate enough to influence the decision on how to trim with respect to the new and larger lowest leaf-layer index. If the plant was not productive in lower leaf layers as a smaller plant, I would suppose that information would still be relevant as the plant gets larger, and would be meaningful as we decide to trim leaf layers at its larger stature. This assumes that it has not jumped into a new canopy layer (but the vectorized uptake array would not catch this either). |
Following up on @rgknox's update and responses to @rosiealice's comment #272 (comment): in general the bottom layer shouldn't change with growth, via the same logic laid out in #420 (comment), which says that, so long as the bleaf and canopy-area allometries use the same exponent, LAI should be invariant with tree size. This isn't strictly true in the understory, as the SLA scaling could change due to things happening above; but even in that case, the inexactness arising from only tracking the carbon balance of the lowermost expressed leaf layer shouldn't be very large. So I support trying to see if only tracking the lowermost layer works. |
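For reference, the invariance argument in equation form (symbols here are assumptions for illustration, not taken from #420):

```latex
% If leaf biomass and crown area are power laws in diameter d with the
% same exponent beta, the size dependence of LAI cancels:
\[
  \mathrm{LAI}
  = \frac{\mathrm{SLA}\, b_{\mathrm{leaf}}}{A_{\mathrm{crown}}}
  = \frac{\mathrm{SLA}\, c_1 d^{\beta}}{c_2 d^{\beta}}
  = \frac{c_1}{c_2}\,\mathrm{SLA},
\]
% independent of d, so the index of the bottom expressed leaf layer
% should not shift as the tree grows (barring SLA changes from shading).
```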
a (possibly) better idea: what if, instead of just tracking the bottom layer, we tracked the bottom two leaf layers, then used the local values of the carbon-balance increment as a function of the bleaf increment to identify the approximate intercept of that function, and then trimmed to that value (assuming that it is within +/-
@ckoven could you explain in a little more detail? This sounds interesting. |
@ckoven and I stepped up to the whiteboard, and I think I have a loose idea of how a solution could be approached. My interpretation is that, by knowing the annual carbon balance of the two lowest layers, we can calculate the slope and then use a linear extrapolation to see at what trimming value we would have zero net uptake. I have to look at the code more, but I will self-assign and keep you all updated. |
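In sketch form, under the assumption that we store the annual net uptake of the two lowest expressed leaf layers (all names here are hypothetical, not FATES code):

```fortran
function trim_target(u_nm1, u_n, lai_bottom, dinc) result(lai_zero)
  ! Hypothetical sketch of the whiteboard idea: linearly extrapolate
  ! layer net uptake as a function of cumulative LAI and return the LAI
  ! at which it crosses zero, as a candidate trimming target.
  implicit none
  integer,  parameter   :: r8 = selected_real_kind(12)
  real(r8), intent(in)  :: u_nm1       ! annual net uptake, second-lowest layer
  real(r8), intent(in)  :: u_n         ! annual net uptake, lowest layer
  real(r8), intent(in)  :: lai_bottom  ! cumulative LAI at the lowest layer
  real(r8), intent(in)  :: dinc        ! layer thickness in LAI units
  real(r8) :: lai_zero, slope

  slope = (u_n - u_nm1) / dinc            ! d(uptake)/d(LAI) across the two layers
  if (slope < 0.0_r8) then
     lai_zero = lai_bottom - u_n / slope  ! zero crossing of the fitted line
  else
     lai_zero = lai_bottom                ! uptake not declining; no trim signal
  end if
end function trim_target
```

Note that if u_n is still positive at the lowest layer, the same line predicts the zero crossing deeper than the current canopy, i.e. a signal to grow rather than trim.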
Cool. Now you put it like that, I think I understand the idea a bit more... |