
Feature request: fast re-solve for same problem with different parameter values #894

Open
MFairley opened this issue Dec 28, 2020 · 1 comment

Comments

@MFairley

I also posted this here: https://discourse.julialang.org/t/reduce-memory-allocation-for-repeatedly-solving-problem-using-optim-jl/52292

Could we have a way to cache the problem structure to enable fast re-solves for the same problem with different data/parameters without allocating more memory? Warm starts would also be great.

The following code allocates a lot of memory.

using Random
using Optim, NLSolversBase
using BenchmarkTools

# Objective: weighted sum, i.e. dot(x, w)
function fun(x, w)
    mapreduce((xi, wi) -> xi * wi, +, x, w)
end

# Gradient is constant in x: ∇f(x) = w
function fun_grad!(g, x, w)
    g .= w
end

# Hessian of a linear objective is identically zero
function fun_hess!(h, x, w)
    h .= 0.0
end

const n = 10
const x0 = zeros(n)
const lx = ones(n) * -1.0
const ux = ones(n)

function solve_subproblem(w)
    # Closures capturing the current parameter vector w
    f = (x) -> fun(x, w)
    g! = (g, x) -> fun_grad!(g, x, w)
    h! = (h, x) -> fun_hess!(h, x, w)

    # The problem objects are rebuilt on every call, which is where
    # most of the allocation happens
    df = TwiceDifferentiable(f, g!, h!, x0)
    dfc = TwiceDifferentiableConstraints(lx, ux)

    return optimize(df, dfc, x0, IPNewton())
end

function solve(m)
    for i = 1:m
        w = rand(n)
        solve_subproblem(w)
    end
end

@benchmark solve(10000)
BenchmarkTools.Trial: 
  memory estimate:  9.01 GiB
  allocs estimate:  73275049
  --------------
  minimum time:     6.539 s (5.82% GC)
  median time:      6.539 s (5.82% GC)
  mean time:        6.539 s (5.82% GC)
  maximum time:     6.539 s (5.82% GC)
  --------------
  samples:          1
  evals/sample:     1
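For reference, a minimal sketch of a partial workaround under current Optim.jl: preallocate a parameter buffer and mutate it in place between solves, so the closures and the `TwiceDifferentiable`/`TwiceDifferentiableConstraints` objects are constructed only once. The names `w_buf` and `solve_reuse` are illustrative, not from the original report.

```julia
using Random
using Optim, NLSolversBase

const n = 10
const x0 = zeros(n)
const lx = -ones(n)
const ux = ones(n)

# Preallocated parameter buffer; the closures capture it, and we
# overwrite it in place instead of allocating a fresh w per solve.
const w_buf = zeros(n)

f = x -> mapreduce((xi, wi) -> xi * wi, +, x, w_buf)
g! = (g, x) -> (g .= w_buf)
h! = (h, x) -> (h .= 0.0)

# Built once, reused across all solves.
const df = TwiceDifferentiable(f, g!, h!, x0)
const dfc = TwiceDifferentiableConstraints(lx, ux)

function solve_reuse(m)
    for _ in 1:m
        rand!(w_buf)  # new parameter values, no new allocation
        optimize(df, dfc, x0, IPNewton())
    end
end
```

Two caveats: `TwiceDifferentiable` caches its last evaluation, so mutating `w_buf` behind its back can leave stale cached values if the solver happens to re-query the same point, and `optimize` itself still allocates the full `IPNewton` state each call, which is exactly the allocation this issue asks to avoid. So this only trims the setup cost; the state reuse requested here would need support inside Optim.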
@pkofod
Member

pkofod commented Dec 30, 2020

Yeah it doesn't appear that this is possible for IPNewton because a lot of stuff happens in the function that sets up the initial state. I will say though that it doesn't appear to spend an extreme amount of time in GC. But I agree, this could be improved.
