Create Readme #2

Open
Inominie opened this issue Nov 19, 2018 · 1 comment

Comments

@Inominie (Member)

Create a README file to lay out the setup procedure and the intuitions behind the project.

Inominie assigned and then unassigned Inominie on Nov 19, 2018
@josephcoveai

I first tried a deeper neural network with a large number of nodes and experimented with different activation functions; ReLU turned out to be the most reliable. I also started with a dropout rate of 0.5. What had the most positive effect on my results was the convolutional layers, so I used two convolutional layers with a pooling layer between them. I then lowered the dropout to 0.1 and placed it between the input and output layers of the network. Counter-intuitively, I found that smaller networks can actually be more effective than larger ones.
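For anyone reproducing this, a minimal Keras sketch of the architecture described above could look like the following. The input shape, filter counts, hidden-layer width, and class count are illustrative assumptions; only the overall structure (two convolutional layers with pooling between them, ReLU activations, and a 0.1 dropout placed in the dense part of the network) comes from the comment above.

```python
# Minimal sketch of the described layout (TensorFlow/Keras).
# Input shape, filter counts, hidden width, and class count are
# placeholders, not values taken from the project.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import (
    Conv2D, MaxPooling2D, Flatten, Dense, Dropout,
)

model = Sequential([
    # First convolutional layer; ReLU proved the most reliable activation.
    Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    # Pooling layer placed between the two convolutional layers.
    MaxPooling2D((2, 2)),
    # Second convolutional layer.
    Conv2D(64, (3, 3), activation="relu"),
    Flatten(),
    # Fully connected part of the network.
    Dense(64, activation="relu"),
    # Dropout lowered from 0.5 to 0.1, placed between the input
    # and output layers of the dense head.
    Dropout(0.1),
    # Output layer; 10 classes assumed for illustration.
    Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```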
