I first tried a deeper neural network with many nodes and experimented with different activation functions; ReLU proved the most reliable. I also started with a dropout rate of 0.5. What had the most positive effect on my results was adding convolutional layers, so I used two convolutional layers with a pooling layer between them. I then lowered the dropout to 0.1 and placed it between the input and output layers of the network. Counter-intuitively, I found that smaller networks can actually be more effective than larger ones.
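To make the layer ordering concrete, here is a minimal pure-Python sketch of the stack described above: conv → pool → conv, with ReLU activations and a light dropout (0.1) before the output layer. All sizes, kernels, and the input are illustrative assumptions, not the project's actual data or hyperparameters.

```python
import random

def relu(x):
    # ReLU activation: the function that proved most reliable here
    return x if x > 0 else 0.0

def conv2d(image, kernel):
    """Valid 2D convolution followed by ReLU (single channel, toy version)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(relu(s))
        out.append(row)
    return out

def max_pool(image, size=2):
    """Non-overlapping max pooling."""
    out = []
    for i in range(0, len(image) - size + 1, size):
        row = []
        for j in range(0, len(image[0]) - size + 1, size):
            row.append(max(image[i + a][j + b]
                           for a in range(size) for b in range(size)))
        out.append(row)
    return out

def dropout(vec, p=0.1):
    """Training-time dropout: zero each unit with probability p (inverted scaling)."""
    return [0.0 if random.random() < p else v / (1 - p) for v in vec]

# Hypothetical 8x8 input and a 3x3 edge-style kernel, just to trace shapes:
img = [[float((i + j) % 2) for j in range(8)] for i in range(8)]
k = [[1.0, 0.0, -1.0]] * 3

x = conv2d(img, k)                  # 8x8 -> 6x6 feature map
x = max_pool(x)                     # 6x6 -> 3x3 after pooling
x = conv2d(x, k)                    # 3x3 -> 1x1 (second conv layer)
feat = [v for row in x for v in row]
feat = dropout(feat, p=0.1)         # dropout 0.1 before the output layer
print(len(x), len(x[0]))            # -> 1 1
```

The shape trace shows why the pooling layer sits between the two convolutions: it shrinks the feature map so the second convolution works on a coarser, more abstract representation.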
Create a README file that lays out the setup procedure and the intuitions behind the project.