shallowgrad: a simplified version of PyTorch, based on NumPy and made for educational purposes.

Its small implementation aims to demystify the abstractions of modern deep learning frameworks while providing an interface similar to PyTorch's.

MNIST in shallowgrad


import numpy as np
from shallowgrad.nn import nn
from optimizers.optimizers import Adam 

# load MNIST
# ...

X_train = X_train.reshape(-1, 28 * 28)  # flatten each 28x28 image to a 784-dim vector
Y = Y_train.reshape(-1, 1)              # labels as a column vector
X = np.array(X_train / 255)             # scale pixel values to [0, 1]

# three fully-connected layers: 784 -> 2500 -> 1000 -> 10
l1 = nn.Linear(784, 2500, activation='ReLU', bias=True)
l2 = nn.Linear(2500, 1000, activation='ReLU', bias=True)
l3 = nn.Linear(1000, 10, bias=True)
loss = nn.CrossEntropyLoss()
optim = Adam(layers=[l1, l2, l3], lr=3e-4)

y_hat = []
y_true = []
BS = 256
NUM_EPOCHS = 100

# training loop
for _ in range(NUM_EPOCHS):
    # sample a random mini-batch
    samp = np.random.randint(0, X.shape[0], size=(BS))
    x = X[samp]
    y = Y[samp]

    # forward pass through the network
    x = l1(x)
    x = l2(x)
    x = l3(x)
    preds_batch = np.argmax(x, axis=1).reshape(-1, 1)

    # Append batch predictions and true labels 
    y_hat.append(preds_batch)
    y_true.append(y)

    # compute the cross-entropy loss, backpropagate, and update the weights
    l = loss(x, y)
    loss.backwards()
    optim.step()
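
Because the loop accumulates batch predictions in y_hat and labels in y_true, the training accuracy over the sampled batches can be estimated once training finishes. The snippet below is a minimal sketch using plain NumPy on the arrays already collected above; it assumes nothing beyond them.

# stack the per-batch predictions and labels into single column vectors
y_hat_all = np.concatenate(y_hat, axis=0)
y_true_all = np.concatenate(y_true, axis=0)

# fraction of sampled training examples predicted correctly
accuracy = np.mean(y_hat_all == y_true_all)
print(f"training accuracy: {accuracy:.4f}")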
