mirzadeh/IAP2019

Knet Tutorial

This tutorial introduces the programming language Julia and the Knet deep learning framework. By the end, the reader should be able to define, train, evaluate, and visualize basic MLP, CNN, and RNN models. Each notebook works stand-alone, but the notebooks build on concepts introduced earlier, so I recommend reading them in order. Every Knet function outside the standard Julia library is defined or explained before use.

To run the notebooks on a Jupyter server, start Julia in this directory, then install and run IJulia by typing the following at the julia> prompt (see IJulia.jl for more information):

julia> using Pkg
julia> Pkg.add("IJulia")
julia> using IJulia
julia> notebook()

These notebooks are also available on Google Drive, which lets you run them on Google Colab, provided you first add Julia support using the colab_install_julia notebook.

Contents:

  • 00.Julia_is_fast: comparison of Julia's speed to C, Python and numpy.
  • 01.Getting_to_know_Julia: basic Julia tutorial from juliabox.com.
  • 02.mnist: introduction to the MNIST handwritten digit recognition dataset.
  • 03.lin: define, train, and visualize simple linear models; introduces gradients, SGD, and using the GPU.
  • 04.mlp: multi-layer perceptrons, nonlinearities, model capacity, overfitting, regularization, dropout.
  • 05.cnn: convolutional neural networks, sparse and shared weights using conv4 and pool operations.
  • 06.rnn: introduction to recurrent neural networks.
  • 07.imdb: a simple RNN sequence classification model for sentiment analysis of IMDB movie reviews.
  • 08.charlm: a character based RNN language model that can write Shakespeare sonnets and Julia programs.
  • 09.s2s: a sequence to sequence RNN model typically used for machine translation.
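To give a flavor of the style the notebooks build toward, here is a minimal sketch of a linear-model training step like the one 03.lin develops, assuming Knet's Param/@diff/grad/update! API; the data, dimensions, and learning rate here are made up for illustration.

```julia
using Knet

# Trainable parameters: Param marks arrays for differentiation.
w = Param(randn(1, 10))
b = Param(zeros(1))

# Mean squared error of a linear model on a minibatch.
loss(x, y) = sum(abs2, w * x .+ b .- y) / size(y, 2)

x, y = randn(10, 100), randn(1, 100)
for epoch in 1:10
    J = @diff loss(x, y)          # record a differentiable computation
    for p in params(J)
        update!(p, grad(J, p))    # SGD step using the recorded gradients
    end
end
```

The notebooks explain each of these pieces (Param, @diff, grad, update!) before using them, so this sketch is best read after 03.lin.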
