Batch-gradient descent #139
Comments
@ashwani-rathee, can you please assign this issue to me under the gssoc 21 label?
Thank you, sir. I will try to make a PR as soon as possible, but I require a few more days. It is a request.
Sure, @Sibasish-Padhy
I am a GSSOC'21 participant.
No progress on this; it needs to be reassigned.
@ashwani-rathee sir, I am working on it. My laptop was damaged and needed to be repaired, hence it took a week. Please understand the situation, sir; I will try to make a PR by Monday if possible.
Please reassign this issue to me, @ashwani-rathee sir.
I have already started working on it.
@Sibasish-Padhy sure, Go ahead |
Sir, I have created a pull request for issue #139. Please review it and suggest changes. @ashwani-rathee
@ashwani-rathee, please assign me this issue.
@ashwani-rathee?? Please assign.
Can I work on the README for this, @ashwani-rathee @geekquad @yukti845?
A very helpful optimization technique that is quite time-efficient and even reduces space complexity. It comes in handy when gradient descent has a relatively fast learning rate or the objective is non-convex. I would like to code the optimization technique and prepare a well-documented README file explaining it in detail. Please assign this issue to me under the gssoc21 label. Thank you.
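For context, batch gradient descent computes the gradient over the *entire* training set before each parameter update. A minimal sketch of the idea on a toy linear-regression problem (the function name, hyperparameters, and data here are illustrative assumptions, not code from this repository):

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, epochs=2000):
    """Fit linear weights by full-batch gradient descent on the MSE loss."""
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(epochs):
        # Gradient of mean squared error, averaged over the whole batch
        grad = (2.0 / m) * X.T @ (X @ w - y)
        w -= lr * grad
    return w

# Toy data: y = 3*x + 2, with the bias folded in as a constant column
X = np.column_stack([np.linspace(0.0, 1.0, 50), np.ones(50)])
y = 3.0 * X[:, 0] + 2.0
w = batch_gradient_descent(X, y)  # w approaches [3, 2]
```

Because every update uses all samples, each step follows the true gradient of the loss, at the cost of one full pass over the data per iteration.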