Bayesian Models are not running with Julia 1.9.2 #124

Closed · sourish-cmi opened this issue Jul 19, 2023 · 7 comments
Labels: bug (Something isn't working)

Comments

@sourish-cmi (Collaborator)

Describe the bug
The following code is not running with Julia 1.9.2:

julia> model = fit(@formula(MPG ~ HP + WT+Gear), df, LinearRegression(),Prior_Ridge())
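
For completeness, a minimal end-to-end reproduction sketch, assuming the mtcars data from RDatasets (which provides the MPG, HP, WT and Gear columns used in the formula above):

    using CRRao, RDatasets, StatsModels
    df = dataset("datasets", "mtcars")   # columns include MPG, HP, WT, Gear
    model = fit(@formula(MPG ~ HP + WT + Gear), df, LinearRegression(), Prior_Ridge())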

Error

Sampling 100%|█████████████████████████████████████████████████████████████████████| Time: 0:00:01
ERROR: MethodError: no method matching ADgradient(::Val{:ForwardDiff}, ::WARNING: both Bijectors and Base export "stack"; uses of it in module Turing must be qualified
Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:v, :σ, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:v, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:v, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}; gradientconfig::ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}})

Closest candidates are:
  ADgradient(::Val{:ForwardDiff}, ::Any; chunk, tag, x) got unsupported keyword argument "gradientconfig"
   @ LogDensityProblemsADForwardDiffExt ~/.julia/packages/LogDensityProblemsAD/tXC2y/ext/LogDensityProblemsADForwardDiffExt.jl:98
  ADgradient(::Val{kind}, ::Any; kwargs...) where kind
   @ LogDensityProblemsAD ~/.julia/packages/LogDensityProblemsAD/tXC2y/src/LogDensityProblemsAD.jl:68
  ADgradient(::ADTypes.AutoForwardDiff{C}, ::Any) where C got unsupported keyword argument "gradientconfig"
   @ LogDensityProblemsADADTypesExt ~/.julia/packages/LogDensityProblemsAD/tXC2y/ext/LogDensityProblemsADADTypesExt.jl:26
  ...

Stacktrace:
  [1] kwerr(::NamedTuple{(:gradientconfig,), Tuple{ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}}}}, ::Function, ::Val{:ForwardDiff}, ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:v, :σ, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:v, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:v, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Base ./error.jl:165
  [2] ADgradient(ad::Turing.Essential.ForwardDiffAD{0, true}, ℓ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:v, :σ, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:v, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:v, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Turing.Essential ~/.julia/packages/Turing/UsWJl/src/essential/ad.jl:102
  [3] ADgradient(ℓ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:v, :σ, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:v, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:v, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Turing.Essential ~/.julia/packages/Turing/UsWJl/src/essential/ad.jl:82
  [4] initialstep(rng::Random.MersenneTwister, model::DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, spl::DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, vi::DynamicPPL.TypedVarInfo{NamedTuple{(:v, :σ, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:v, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:v, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}; init_params::Nothing, nadapts::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing.Inference ~/.julia/packages/Turing/UsWJl/src/inference/hmc.jl:161
  [5] step(rng::Random.MersenneTwister, model::DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, spl::DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}; resume_from::Nothing, init_params::Nothing, kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:nadapts,), Tuple{Int64}}})
    @ DynamicPPL ~/.julia/packages/DynamicPPL/UFajj/src/sampler.jl:111
  [6] macro expansion
    @ ~/.julia/packages/AbstractMCMC/fWWW0/src/sample.jl:125 [inlined]
  [7] macro expansion
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:328 [inlined]
  [8] (::AbstractMCMC.var"#21#22"{Bool, String, Nothing, Int64, Int64, Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:nadapts,), Tuple{Int64}}}, Random.MersenneTwister, DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, Int64, Int64})()
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/fWWW0/src/logging.jl:12
  [9] with_logstate(f::Function, logstate::Any)
    @ Base.CoreLogging ./logging.jl:514
 [10] with_logger(f::Function, logger::LoggingExtras.TeeLogger{Tuple{LoggingExtras.EarlyFilteredLogger{TerminalLoggers.TerminalLogger, AbstractMCMC.var"#1#3"{Module}}, LoggingExtras.EarlyFilteredLogger{Logging.ConsoleLogger, AbstractMCMC.var"#2#4"{Module}}}})
    @ Base.CoreLogging ./logging.jl:626
 [11] with_progresslogger(f::Function, _module::Module, logger::Logging.ConsoleLogger)
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/fWWW0/src/logging.jl:36
 [12] macro expansion
    @ ~/.julia/packages/AbstractMCMC/fWWW0/src/logging.jl:11 [inlined]
 [13] mcmcsample(rng::Random.MersenneTwister, model::DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, sampler::DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, N::Int64; progress::Bool, progressname::String, callback::Nothing, discard_initial::Int64, thinning::Int64, chain_type::Type, kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:nadapts,), Tuple{Int64}}})
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/fWWW0/src/sample.jl:116
 [14] sample(rng::Random.MersenneTwister, model::DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, sampler::DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, N::Int64; chain_type::Type, resume_from::Nothing, progress::Bool, nadapts::Int64, discard_adapt::Bool, discard_initial::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing.Inference ~/.julia/packages/Turing/UsWJl/src/inference/hmc.jl:133
 [15] sample
    @ ~/.julia/packages/Turing/UsWJl/src/inference/hmc.jl:103 [inlined]
 [16] #sample#2
    @ ~/.julia/packages/Turing/UsWJl/src/inference/Inference.jl:146 [inlined]
 [17] sample(rng::Random.MersenneTwister, model::DynamicPPL.Model{CRRao.var"#LinearRegression#1"{Float64}, (:X, :y), (), (), Tuple{Matrix{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, alg::NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}, N::Int64)
    @ Turing.Inference ~/.julia/packages/Turing/UsWJl/src/inference/Inference.jl:139
 [18] linear_reg(formula::FormulaTerm{Term, Tuple{Term, Term, Term}}, data::DataFrame, turingModel::CRRao.var"#LinearRegression#1"{Float64}, sim_size::Int64)
    @ CRRao ~/.julia/packages/CRRao/wXk3z/src/bayesian/linear_regression.jl:8
 [19] fit(formula::FormulaTerm{Term, Tuple{Term, Term, Term}}, data::DataFrame, modelClass::LinearRegression, prior::Prior_Ridge, h::Float64, sim_size::Int64)
    @ CRRao ~/.julia/packages/CRRao/wXk3z/src/bayesian/linear_regression.jl:100
 [20] fit(formula::FormulaTerm{Term, Tuple{Term, Term, Term}}, data::DataFrame, modelClass::LinearRegression, prior::Prior_Ridge)
    @ CRRao ~/.julia/packages/CRRao/wXk3z/src/bayesian/linear_regression.jl:83
 [21] top-level scope
    @ REPL[18]:1


It looks like some packages are out of sync, but I am not sure which one caused the error.

@ShouvikGhosh2048 @ayushpatnaikgit @codetalker7

sourish-cmi added the bug label on Jul 19, 2023
@ShouvikGhosh2048 (Collaborator)

A similar issue occurs in #123.

@sourish-cmi (Collaborator, Author)

Yes, it seems like the same problem. Maybe Turing is not yet compatible with Julia 1.9.

@sourish-cmi (Collaborator, Author) commented Apr 1, 2024

In linear_regression.jl, line 8 calls turingModel(X, y), but that function does not appear to be defined anywhere. Please check urgently; I think this is breaking the entire Bayesian module.

https://github.com/xKDR/CRRao.jl/blob/main/src/bayesian/linear_regression.jl

@ShouvikGhosh2048 @ayushpatnaikgit @codetalker7 @ajaynshah @SusanXKDR @mousum-github

@ShouvikGhosh2048 (Collaborator) commented Apr 1, 2024

turingModel is passed as a parameter to the linear_reg function:

function linear_reg(formula::FormulaTerm, data::DataFrame, turingModel::Function, sim_size::Int64)

When we use linear_reg, we create the model and pass it to the function:

function fit(
    formula::FormulaTerm,
    data::DataFrame,
    modelClass::LinearRegression,
    prior::Prior_Ridge,
    h::Float64 = 0.01,
    sim_size::Int64 = 1000
)
    @model LinearRegression(X, y) = begin
        p = size(X, 2)
        # priors
        a0 = 0.1
        b0 = 0.1
        v ~ InverseGamma(h, h)
        σ ~ InverseGamma(a0, b0)
        # α ~ Normal(0, v * σ)
        β ~ filldist(Normal(0, v * σ), p)
        # likelihood
        # y ~ MvNormal(α .+ X * β, σ)
        y ~ MvNormal(X * β, σ)
    end
    return linear_reg(formula, data, LinearRegression, sim_size)
end

@sourish-cmi (Collaborator, Author) commented Apr 1, 2024

I understand that turingModel is passed as a parameter to the linear_reg function. But I am not sure that LinearRegression(X, y) is being passed as a function; it is not defined as an ordinary function, but via the macro @model LinearRegression(X, y) = ... end.

Also, LinearRegression is used here as a function while it is defined as a struct in CRRao.jl; it would be better to use a different name.
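
To illustrate the rename suggestion: the @model macro does define a callable under the given name (which is why the snippet above works; the local definition shadows the exported LinearRegression struct inside fit), but a distinct name would avoid the confusion. Below is a minimal standalone sketch with a hypothetical name RidgeLinearModel; it is not the actual CRRao source, and the MvNormal call uses the non-deprecated σ^2 * I form:

    using Turing, LinearAlgebra

    # Priors roughly mirror the Prior_Ridge defaults quoted above (h = 0.01, a0 = b0 = 0.1).
    @model function RidgeLinearModel(X, y, h = 0.01)
        p = size(X, 2)
        v ~ InverseGamma(h, h)
        σ ~ InverseGamma(0.1, 0.1)
        β ~ filldist(Normal(0, v * σ), p)
        y ~ MvNormal(X * β, σ^2 * I)
    end

    X = randn(30, 3)
    y = X * [1.0, 2.0, 3.0] .+ 0.1 .* randn(30)
    model = RidgeLinearModel(X, y)     # an ordinary call returning a DynamicPPL.Model
    chain = sample(model, NUTS(), 500)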

@sourish-cmi (Collaborator, Author)

@ShouvikGhosh2048 @ayushpatnaikgit

I think CRRao is pinned to a really old version of Turing (0.23.2), which Turing no longer supports. We should bump CRRao to the latest Turing release, 0.30.7.

@ShouvikGhosh2048 can you please try it?
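
For reference, a rough sketch of what the bump would involve, assuming the standard Pkg workflow (the exact compat bound is to be settled in the actual change):

    # In CRRao.jl's Project.toml, relax the [compat] entry for Turing, e.g.
    #   Turing = "0.30"
    # then, in a local checkout of the package:
    #   pkg> activate .
    #   pkg> update Turing
    #   pkg> test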

@sourish-cmi (Collaborator, Author)

This is being fixed, and a new release, 0.1.1, is being prepared.
