
Mutation function? #2

Open
Bobingstern opened this issue Sep 27, 2021 · 2 comments

@Bobingstern

This is great! I want to use this for neuroevolution, but I can’t figure out how to mutate all the weights by a probability…

@hasinaxp
Owner

hasinaxp commented Sep 27, 2021

To mutate the weights depending on a probability, you first need to convert the probability into a delta error; the rest is the same. However, this repository is just a simple example, so it's not very efficient in terms of performance. For better performance in the case of neuroevolution, you should use both ReLU and sigmoid as activation functions in different layers and introduce dropout.
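As a rough illustration (not the exact code in this repository), mutating every weight with a given probability can look something like the sketch below; the names `mutate`, `mutation_rate`, and `mutation_strength` are placeholders, and the weights are assumed to be a flat list of floats:

```python
import random

def mutate(weights, mutation_rate=0.05, mutation_strength=0.1):
    """Return a copy of `weights` where each weight is perturbed
    with probability `mutation_rate` by a small Gaussian amount."""
    mutated = []
    for w in weights:
        if random.random() < mutation_rate:
            # only this weight gets nudged; the rest are copied unchanged
            w += random.gauss(0.0, mutation_strength)
        mutated.append(w)
    return mutated
```

In a neuroevolution loop you would apply this to each offspring's weights after copying them from a parent network.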

This concept comes from finding a local minimum. Our goal is to minimize the error, so we calculate the derivative of the error with respect to each weight. This gives us the direction, i.e. whether to increase or decrease a given weight. We then adjust the weight by a small amount in that direction; the size of that step is set by the learning rate. You need a basic knowledge of calculus to understand how to compute partial derivatives and find the direction of the minimum. You can search for gradient descent for more information.
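To make this concrete, here is a small, self-contained sketch of one gradient-descent step using a numerical (finite-difference) gradient; the function names and values are illustrative only, not taken from this repository:

```python
def gradient_descent_step(error_fn, weights, learning_rate=0.01, eps=1e-6):
    """One gradient-descent step: estimate each partial derivative
    numerically, then move each weight a small step downhill."""
    updated = []
    for i, w in enumerate(weights):
        bumped = list(weights)
        bumped[i] = w + eps
        # finite-difference estimate of d(error)/d(weight_i)
        grad = (error_fn(bumped) - error_fn(weights)) / eps
        updated.append(w - learning_rate * grad)
    return updated

# Example: minimise a simple quadratic error whose minimum is at w = [2, -1]
error = lambda w: (w[0] - 2) ** 2 + (w[1] + 1) ** 2
w = [0.0, 0.0]
for _ in range(1000):
    w = gradient_descent_step(error, w, learning_rate=0.1)
print(w)  # approaches [2, -1]
```

Real libraries compute the gradient analytically via backpropagation instead of finite differences, but the update rule (subtract the learning rate times the gradient) is the same idea.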

@Bobingstern
Author

Oh ok, I have never properly learned calculus before but I’ll look into it, thanks!
