Hey team!
We've got `temperature_dependence_v2` to the point where the user can use `set_temperature!` to specify a temperature and change all the relevant rates, and then run `simulate` without anything crashing 🎉
So hopefully everything is working and @hanamayall is rewriting the tests now...
One issue we're facing, however, is that scaling all the rates to per-second units (I think the default time scale without temperature dependence is one generation of the smallest producer) causes huge changes in the rate values. For example, in the classic 3-species chain the metabolic and growth rates are millions of times lower than in the standard model (see code below).
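For intuition, here is the unit conversion at play. The generation time is an assumed number picked to match the roughly 6.4-million-fold change we see below, not something computed from the model:

```julia
# Illustration only: converting a per-generation rate to a per-second rate.
# A generation time of ~74 days is an assumption for this sketch.
generation = 74 * 24 * 60 * 60               # seconds per assumed generation (~6.4e6 s)
r_per_generation = 1.0                       # growth rate of 1 per generation
r_per_second = r_per_generation / generation # ≈ 1.6e-7 s⁻¹, millions of times smaller
```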
This means that temperature dependence is not compatible with the default use of `TerminateSteadyState`, which terminates the simulations within about one timestep. I've tested both removing `TerminateSteadyState` and reducing its tolerances (Tests 2 & 3, respectively, in the code example below), and either option resolves this.
What we are unsure about is which way to go: should users of temperature dependence have to change the tolerances manually, should the default `TerminateSteadyState` tolerances scale with something like growth rate, or should `simulate` have a switch, triggered by the use of temperature dependence, that sets the `TerminateSteadyState` tolerances to something more suitable? A rough sketch of the second option follows.
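For concreteness, here is a minimal sketch of the scaled-tolerance option, reusing `MP`, `B0`, and the callback pieces from the code below. The base tolerances (1e-8, 1e-6) and the choice of the maximum growth rate as the characteristic rate are assumptions for illustration, not settled defaults:

```julia
# Hypothetical sketch: scale the TerminateSteadyState tolerances by a
# characteristic rate of the model so that termination behaves consistently
# whatever the time units. Base tolerances and rate choice are assumptions.
rate_scale = maximum(MP.biorates.r) # fastest producer growth rate (1/s after set_temperature!)
scaled_callback = CallbackSet(
    PositiveDomain(),
    TerminateSteadyState(1e-8 * rate_scale, 1e-6 * rate_scale),
    BEFWM2.ExtinguishSpecies(1e-5, false),
)
out = simulate(MP, B0, callback = scaled_callback)
```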
The second, related issue is that the number of timesteps needed to run a model to equilibrium is enormous (for a larger food web this would run into the billions). As an example from the literature, Hana says Binzer ran her models for 100,000 years, with all rates in units of 1/s... The current setup crashes my computer when I try to run the simulation for 1,000 years; I assume the length of the simulation is simply too great.
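Rough arithmetic behind that claim (the average step size is an assumption):

```julia
# Back-of-envelope: steps needed for a 100,000-year run with per-second rates.
seconds_per_year = 60 * 60 * 24 * 365
total_time = seconds_per_year * 100_000 # ≈ 3.15e12 s
avg_step = 3600.0                       # assume the solver averages ~1-hour steps
total_time / avg_step                   # ≈ 8.8e8 steps, and more for stiffer webs
```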
The code below has examples of running the three-species model for 10 and 100 years, saving at every 1% of the run. Even though the 100-year simulation saves the same number of points, it still runs for roughly 10x as long as the 10-year simulation, and both are thousands of times slower than the model without temperature dependence. Is there a good way to speed up really long simulations like this, or do we have to accept that temperature dependence is slower? I have tried non-adaptive timesteps with a large `dt`, but unsurprisingly this is unstable.
Finally, similar to changing the tolerances of `TerminateSteadyState`: is it best to make the user specify `dt` when using temperature dependence, since the default is inappropriate, or should the default scale with `tmax`?
Here is the code to play with 😀 let us know if this makes no sense
```julia
# Set up
using BEFWM2, Plots, DifferentialEquations # import packages; make sure you are on the temperature_dependence_v2 branch

# Build a food web
foodweb = FoodWeb([0 0 0; 1 0 0; 0 1 0]) # create a tri-trophic network

# functional_response has to be ClassicResponse
MP = ModelParameters(foodweb, functional_response = ClassicResponse(foodweb)) # generate model parameters using the Classic functional response
unchanged_MP = deepcopy(MP)

# Time to introduce temperature dependence!
set_temperature!(MP, 293.15, ExponentialBA())
# How do the new rates compare?
# Growth
unchanged_MP.biorates.r[1] / MP.biorates.r[1] # ~6,400,000 times smaller

# Metabolism
unchanged_MP.biorates.x[2] / MP.biorates.x[2] # ~4,800,000 times smaller
unchanged_MP.biorates.x[3] / MP.biorates.x[3] # ~4,800,000 times smaller

# Attack rate
unchanged_MP.functional_response.aᵣ[2] / MP.functional_response.aᵣ[2] # ~240,000 times smaller
unchanged_MP.functional_response.aᵣ[6] / MP.functional_response.aᵣ[6] # ~240,000 times smaller

# Handling time
unchanged_MP.functional_response.hₜ[2] / MP.functional_response.hₜ[2] # ~63,000 times bigger
unchanged_MP.functional_response.hₜ[6] / MP.functional_response.hₜ[6] # ~63,000 times bigger

# Test 0: Baseline expectation
B0 = repeat([unchanged_MP.environment.K[1] / 2], 3) # starting biomass of half the carrying capacity of the system
out = @timed simulate(unchanged_MP, B0)
out.time # ~0.0035 seconds
plot(out[1])
# Test 1: Simulation as normal
B0 = repeat([MP.environment.K[1] / 2], 3) # starting biomass of half the carrying capacity of the system
out = @timed simulate(MP, B0) # retcode: Terminated
maximum(out[1].t) # Terminates after 1.2 timesteps
out.time # ~0.0004 seconds
# Conclusion: very fast but terminates too early

# Test 2: Turn off TerminateSteadyState
modified_callback = CallbackSet(PositiveDomain(), BEFWM2.ExtinguishSpecies(1e-5, false))
out = @timed simulate(MP, B0, callback = modified_callback)
out.time # ~0.0012 seconds
maximum(out[1].t) # 500; full simulation ran
plot(out[1])
# Conclusion: still fast but not a long enough timeseries

# Run for longer (10 years)
tmax = 60 * 60 * 24 * 365 * 10
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~2.2 seconds
plot(out[1]) # Looks to be at equilibrium?

# Run for longer (100 years)
tmax = 60 * 60 * 24 * 365 * 100
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~29 seconds
plot(out[1]) # Seems stable for sure

# Run for longer (1000 years)
tmax = 60 * 60 * 24 * 365 * 1000
steps = [0:tmax/100:tmax;]
# WARNING - THIS CRASHES MY JULIA 🤯
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
# Test 3: Reduce tolerance of TerminateSteadyState
modified_callback = CallbackSet(PositiveDomain(), TerminateSteadyState(1e-12, 1e-10), BEFWM2.ExtinguishSpecies(1e-5, false))
out = @timed simulate(MP, B0, callback = modified_callback)
out.time # ~0.0012 seconds
plot(out[1])

# Run for longer (10 years)
tmax = 60 * 60 * 24 * 365 * 10
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~2.2 seconds
plot(out[1])

# Conclusion: can either turn off TerminateSteadyState or reduce the tolerances...
```
Note: I've since sped this up ~2000 times by using `δt` instead of `saveat` and actually changing when we save output; the mismatch between the defaults for `δt` and `TerminateSteadyState` remains.
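If we go the scaling route for the saving interval, a minimal sketch could look like this. It assumes `simulate` accepts `δt` as the saving interval (as in the note above), and the tmax/100 fraction is an arbitrary choice for illustration:

```julia
# Hypothetical sketch: default the saving interval to a fixed fraction of tmax,
# so long runs always save a bounded number of points. tmax/100 is an assumption.
tmax = 60 * 60 * 24 * 365 * 100 # 100 years in seconds
out = simulate(MP, B0, callback = modified_callback, tmax = tmax, δt = tmax / 100)
```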