# Performance of `getu` vs `SymbolicIndexingInterface.observed` (#39)
I'm having trouble with this bug, but:

```julia
# Using the same initial model
julia> f2 = getu(sol, s1 + s2)

julia> @btime $f2($sol, 2)
  88.906 ns (5 allocations: 752 bytes)

julia> obs2 = SymbolicIndexingInterface.observed(sol, s1 + s2)

julia> @btime $obs2($sol.u[2], parameter_values($sol), $sol.t[2])
  228.753 ns (6 allocations: 832 bytes)

julia> f2 = getu(sol, s1 + s2) # recreate f2

julia> @btime $f2($sol, 2)
  229.476 ns (6 allocations: 832 bytes)
```

I think this is because of how the observed function in [...]

---

The view in your example is wrong, and will give an incorrect result. It should be [...]
---

Oh, that explains why I thought it was slower. Thanks for pointing that out.
---

I looked at a couple of cases comparing `getu` and `SII.observed`, and for unknowns `getu` seems to have ideal performance (i.e. the same as direct indexing into the vector). When computing the value of expressions, it looks like `SII.observed` is faster, but for some reason if I redefine the function for `SII.observed` after using `getu`, I get the same performance as `getu`. Does `getu` replace the cached generated function, or am I missing something?

I also tried this on a larger model, and there the time for `SII.observed` was still around 100 ns for a simple expression, but `getu` was slower. Is the time for `getu` expected to increase with the size of the model?

I was wondering if the slowdown in `getu` was related to the length of the solution, but I tested that and it's not it.

Versions: [...] and Julia 1.10
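For reference, my understanding of the two calling conventions being compared (a sketch only, assuming the same solved system `sol` with unknowns `s1`, `s2` as in the benchmarks above; it is not runnable on its own):

```julia
using SymbolicIndexingInterface

# getu builds (and caches) a getter that is called with the value
# provider itself, optionally with a timestep index:
f = getu(sol, s1 + s2)
f(sol)      # value of s1 + s2 at every saved timestep
f(sol, 2)   # value at the second saved timestep

# observed returns the lower-level generated function f(u, p, t);
# it must be called with the full state vector at one timestep
# (not a view of part of it), the parameter object, and the time:
obs = SymbolicIndexingInterface.observed(sol, s1 + s2)
obs(sol.u[2], parameter_values(sol), sol.t[2])
```

So the `observed` benchmark above times only the generated `f(u, p, t)` call, while the `getu` benchmark also includes the indexing into `sol`, which may explain part of the difference.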