
Change the temperature dependence timescale and corresponding tolerance 🐌 #81

Open
Thomalpas opened this issue Dec 13, 2022 · 4 comments
Thomalpas commented Dec 13, 2022

Hey team!

We've got temperature_dependence_v2 to the point where the user can use set_temperature! to specify a temperature and change all the relevant rates, and then run simulate without anything crashing 🎉
So hopefully everything is working and @hanamayall is rewriting the tests now...

One issue we're facing, however, is that scaling all the rates to per second (the default timescale without temperature dependence is, I think, one generation of the smallest producer) causes huge changes in the rates. For example, in the classic three-species chain, metabolism and growth are millions of times lower than in the standard model (see the code below).

This means that temperature dependence is not compatible with the default use of TerminateSteadyState, which terminates the simulations within about one timestep. I've tested both removing TerminateSteadyState and reducing its tolerances (Tests 2 and 3, respectively, in the code example below), and either option resolves this.

What we are unsure about is the best fix: should the user of temperature dependence have to change the tolerances manually, should the default tolerance in TerminateSteadyState scale with something like growth rate, or should simulate have a switch, triggered by the use of temperature dependence, that alters the TerminateSteadyState tolerances to something more suitable?
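One way to make the "scale with growth rate" option concrete: both TerminateSteadyState tolerances are compared against du/dt, which shrinks by the same factor as the rates, so both could be multiplied by that factor. A minimal sketch; the 1e-8/1e-6 defaults are what I believe DiffEqCallbacks uses, and the ~6.4e6 slowdown is the illustrative growth-rate ratio from the code below:

```julia
# Sketch: scale TerminateSteadyState tolerances by the rate slowdown, so that
# "steady state" means the same thing on the per-second timescale.
# Both tolerances are checked against du/dt, which shrinks with the rates.
abstol_default, reltol_default = 1e-8, 1e-6  # assumed DiffEqCallbacks defaults
rate_ratio = 1 / 6.4e6                       # growth-rate slowdown (illustrative)
abstol = abstol_default * rate_ratio         # ~1.6e-15
reltol = reltol_default * rate_ratio         # ~1.6e-13
```

These scaled values could then be passed as `TerminateSteadyState(abstol, reltol)` in the callback set, as in Test 3 below.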

The second, related issue is that the number of timesteps needed to run a model to equilibrium is enormous (for a larger food web this would run into the billions). As an example from the literature, Hana says Binzer ran her models for 100,000 years, with all rates in units of 1/s... The current setup crashes my computer when I try to run the simulation for 1,000 years, I assume because the simulation length is too great.

The code below includes examples of running the three-species model for 10 and 100 years, saving 100 evenly spaced points each time. Even though the 100-year simulation saves the same number of points, it still runs for roughly 10× as long as the 10-year simulation, which in turn is thousands of times slower than without temperature dependence. Is there a good way to speed up really long simulations like this, or do we have to accept that temperature dependence is slower? I have tried non-adaptive timesteps with a large dt, but unsurprisingly this is unstable.

Finally, similar to changing the TerminateSteadyState tolerance: is it best to let the user specify dt when using temperature dependence (since the default is inappropriate), or should the default scale with tmax?
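To make the "scale with tmax" idea concrete, one default that stays sane for any run length is to derive the save interval from tmax itself, i.e. always save a fixed number of evenly spaced points. A minimal sketch; `n_points` is an arbitrary choice of ours, not a package default:

```julia
# Sketch: a save interval that scales with tmax, so long runs don't
# accumulate millions of saved points.
seconds_per_year = 60 * 60 * 24 * 365
tmax = 100 * seconds_per_year   # a 100-year run, as in the examples below
n_points = 100                  # arbitrary: always save ~100 evenly spaced points
dt_save = tmax / n_points       # save interval grows with tmax
```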

Here is the code to play with 😀 let us know if this makes no sense

```julia
# Set up: import packages; make sure you are on the temperature_dependence_v2 branch
using BEFWM2, Plots, DifferentialEquations

# Build a food web
foodweb = FoodWeb([0 0 0; 1 0 0; 0 1 0]) # create a tri-trophic chain

# functional_response has to be ClassicResponse
MP = ModelParameters(foodweb, functional_response = ClassicResponse(foodweb))
unchanged_MP = deepcopy(MP)

# Time to introduce temperature dependence!
set_temperature!(MP, 293.15, ExponentialBA())

# How do the new rates compare?

# Growth
unchanged_MP.biorates.r[1] / MP.biorates.r[1] # ~6,400,000 times smaller

# Metabolism
unchanged_MP.biorates.x[2] / MP.biorates.x[2] # ~4,800,000 times smaller
unchanged_MP.biorates.x[3] / MP.biorates.x[3] # ~4,800,000 times smaller

# Attack rate
unchanged_MP.functional_response.aᵣ[2] / MP.functional_response.aᵣ[2] # ~240,000 times smaller
unchanged_MP.functional_response.aᵣ[6] / MP.functional_response.aᵣ[6] # ~240,000 times smaller

# Handling time
unchanged_MP.functional_response.hₜ[2] / MP.functional_response.hₜ[2] # ~63,000 times bigger
unchanged_MP.functional_response.hₜ[6] / MP.functional_response.hₜ[6] # ~63,000 times bigger

# Test 0: baseline expectation
B0 = repeat([unchanged_MP.environment.K[1] / 2], 3) # start at half the carrying capacity
out = @timed simulate(unchanged_MP, B0)
out.time # ~0.0035 seconds
plot(out[1])

# Test 1: simulation as normal
B0 = repeat([MP.environment.K[1] / 2], 3) # start at half the carrying capacity

out = @timed simulate(MP, B0) # retcode: Terminated
maximum(out[1].t) # terminates after 1.2 timesteps
out.time # ~0.0004 seconds
# Conclusion: very fast, but terminates too early

# Test 2: turn off TerminateSteadyState
modified_callback = CallbackSet(PositiveDomain(), BEFWM2.ExtinguishSpecies(1e-5, false))
out = @timed simulate(MP, B0, callback = modified_callback)
out.time # ~0.0012 seconds
maximum(out[1].t) # 500; full simulation ran
plot(out[1])
# Conclusion: still fast, but not a long enough time series

# Run for longer (10 years)
tmax = 60 * 60 * 24 * 365 * 10
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~2.2 seconds

plot(out[1]) # looks to be at equilibrium?

# Run for longer (100 years)
tmax = 60 * 60 * 24 * 365 * 100
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~29 seconds

plot(out[1]) # seems stable for sure

# Run for longer (1000 years)
tmax = 60 * 60 * 24 * 365 * 1000
steps = [0:tmax/100:tmax;]
# WARNING - THIS CRASHES MY JULIA 🤯
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)

# Test 3: reduce the tolerance of TerminateSteadyState
modified_callback = CallbackSet(PositiveDomain(), TerminateSteadyState(1e-12, 1e-10), BEFWM2.ExtinguishSpecies(1e-5, false))
out = @timed simulate(MP, B0, callback = modified_callback)
out.time # ~0.0012 seconds
plot(out[1])

# Run for longer (10 years)
tmax = 60 * 60 * 24 * 365 * 10
steps = [0:tmax/100:tmax;]
out = @timed simulate(MP, B0, callback = modified_callback, tmax = tmax, saveat = steps)
out.time # ~2.2 seconds

plot(out[1])

# Conclusion: we can either turn off TerminateSteadyState or reduce its tolerance...
```
@ismael-lajaaiti (Collaborator) commented:

I still haven't had time to investigate this.
But a side note: when you include a Julia code block with backticks:

```
# A julia code block with no color. Full of sadness.
f(x) = x^2
f(2)
```

You can add "julia" after the opening backticks to have a coloured code block.

```julia
# Same code block, but full of joy.
f(x) = x^2
f(2)
```

I've updated your comment to add this.

@Thomalpas (Collaborator, Author) commented:

Note: I've sped this up ~2000× by using δt instead of saveat and changing when we actually save output. The mismatch between the defaults for δt and TerminateSteadyState remains.

@Thomalpas (Collaborator, Author) commented:

We'll change the timescale to days, which should help with most of these issues.
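For reference, switching from seconds to days rescales every rate by a factor of 86,400, which closes much of the gap with the default per-generation timescale. A quick sketch using the illustrative growth-rate slowdown from the opening post:

```julia
# Per-second -> per-day conversion: each rate is multiplied by 86,400.
seconds_per_day = 60 * 60 * 24              # 86_400
r_per_second = 1.0 / 6.4e6                  # illustrative temperature-scaled growth rate
r_per_day = r_per_second * seconds_per_day  # ~0.0135, far closer to order 1
```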

@Thomalpas Thomalpas reopened this Dec 15, 2022
@Thomalpas Thomalpas assigned Thomalpas and hanamayall and unassigned Thomalpas Dec 15, 2022
@Thomalpas Thomalpas changed the title Temperature Dependence is SLOW 🐌 Change the temperature dependence timescale and corresponding tolerance 🐌 Dec 15, 2022
@iago-lito (Collaborator) commented:

Triage @hanamayall @Thomalpas. Has this been addressed/solved yet? :)

4 participants