
Implementation of Dual for NaNMath.pow does not use NaNMath.log #716

Closed
jClugstor opened this issue Nov 1, 2024 · 1 comment · Fixed by #717

Comments

@jClugstor (Contributor)

I would expect this:

```julia
using ForwardDiff
using NaNMath

function new_pow(x)
    NaNMath.pow(x[1], x[2])
end

ForwardDiff.gradient(new_pow, [-1.0, 1.0])
```

to return NaN, I think.
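
(For context on why NaN is the expected result: the gradient needs both partials of $x^y$,

$$\frac{\partial}{\partial x} x^y = y\,x^{y-1}, \qquad \frac{\partial}{\partial y} x^y = x^y \log x,$$

and at $(x, y) = (-1.0, 1.0)$ the second one involves $\log(-1)$, which NaNMath.log maps to NaN.)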

The issue is right here:

```julia
for f in (:(Base.:^), :(NaNMath.pow))
    @eval begin
        @define_binary_dual_op(
            $f,
            begin
                vx, vy = value(x), value(y)
                expv = ($f)(vx, vy)
                powval = vy * ($f)(vx, vy - 1)
                if isconstant(y)
                    logval = one(expv)
                elseif iszero(vx) && vy > 0
                    logval = zero(vx)
                else
                    logval = expv * log(vx)
                end
                new_partials = _mul_partials(partials(x), partials(y), powval, logval)
                return Dual{Txy}(expv, new_partials)
            end,
```

In `logval = expv * log(vx)`, I think NaNMath.log should be used when f is NaNMath.pow.
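
For reference, a standalone check of the difference, plus a hypothetical sketch of the pairing I have in mind (the `_log_for` helper is my illustration, not ForwardDiff code):

```julia
using NaNMath

# Base.log throws a DomainError for negative reals; NaNMath.log returns NaN.
# log(-1.0)        # throws DomainError
NaNMath.log(-1.0)  # NaN

# Hypothetical pairing: pick the log that matches the pow being differentiated.
_log_for(::typeof(Base.:^)) = Base.log
_log_for(::typeof(NaNMath.pow)) = NaNMath.log

# With f = NaNMath.pow, the y-partial factor expv * log(vx) becomes NaN
# at a negative base instead of throwing:
f, vx, vy = NaNMath.pow, -1.0, 1.0
expv = f(vx, vy)                 # -1.0
logval = expv * _log_for(f)(vx)  # NaN
```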

@KristofferC (Collaborator)

Makes sense to me.
