
About

Nets is a vanilla deep learning framework, made using only NumPy. This project started as an assignment I completed at the University of Oslo and Stanford University.

It has since been rewritten in an object-oriented style, heavily inspired by PyTorch and TensorFlow.

The Nets package makes NO claim to rival established deep learning packages like PyTorch or TensorFlow. Instead, it was built to understand how these libraries work and handle forward / backward propagation, by implementing one from scratch. While working through this material, I found it worthwhile to share as much of my work as possible, in the hope that it helps students and anyone who wants to learn more about deep learning.

Installation

Nets was developed in Python 3.7 and uses the following libraries and standard-library modules:

  • NumPy

  • json (standard library)

  • time (standard library)

  • Pandas (Optional)

  • SciPy (Optional)

  • Scikit-Learn (Optional)

To use this package, clone the repository at https://github.com/arthurdjn/nets and, from the root folder, run on the command line:

git clone https://github.com/arthurdjn/nets
cd nets
pip install .

Or you can install it directly with pip:

pip install nets

This will install the package in your Python environment and download the latest dependencies. You can now use and tweak the parameters of Nets' models.
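
To check that the installation worked, you can try importing the package and printing its version. This assumes Nets exposes a standard __version__ attribute, which is common but not guaranteed; a plain import nets is enough otherwise.

python -c "import nets; print(nets.__version__)"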

Usage

Nets' architecture follows PyTorch's. It provides basic neural network building blocks so you can create your own models with NumPy. You will need to wrap your arrays in a Tensor class to keep track of the gradients, just like in PyTorch and TensorFlow.
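
As a minimal sketch of what this looks like, assuming a PyTorch-like Tensor(data, requires_grad=...) constructor and a backward() method (the exact API may differ):

import numpy as np
import nets

# Wrap a NumPy array so that operations on it are recorded for autograd.
# NOTE: the requires_grad flag and .backward() are assumed to mirror PyTorch.
x = nets.Tensor(np.random.randn(4, 10), requires_grad=True)

# Operations on Tensors build a computational graph...
y = (x * 2).sum()

# ...which the autograd system traverses backwards to fill in x.grad.
y.backward()
print(x.grad)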

Models

A model is a Module subclass in which weights, biases and parameter transformations are defined. Every module has a forward method that MUST be overridden; it computes the forward propagation, i.e. the transformation of an input tensor. If you use the autograd system, no back-propagation code needs to be written. However, if you prefer to compute the gradients manually, you will also need to override the backward method.

Your model should inherit from the Module class and override the forward method.

Example:

import nets
import nets.nn as nn

class Model(nn.Module):
    """
    Create your own model.
    The attributes should be your submodels used during the forward pass.
    You don't have to necessary affect the activation function as an attribute,
    unless you want to set a manual backward pass.
    """
    def __init__(self, input_dim, hidden_dim, output_dim):
        # Initialization
        super().__init__() # Don't forget to add this line
        self.layer1 = nn.Linear(input_dim, hidden_dim)
        self.layer2 = nn.Linear(hidden_dim, hidden_dim)
        self.layer3 = nn.Linear(hidden_dim, output_dim)

    def forward(self, inputs):
        # Forward pass
        out1 = nets.tanh(self.layer1(inputs))
        out2 = nets.tanh(self.layer2(out1))
        return self.layer3(out2)

model = Model(10, 100, 2)

# Let's check the architecture
model

Out:

Model(
   (layer1): Linear(input_dim=10, output_dim=100, bias=True)
   (layer2): Linear(input_dim=100, output_dim=100, bias=True)
   (layer3): Linear(input_dim=100, output_dim=2, bias=True)
)
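
Calling the model on a Tensor dispatches to forward, just like calling the layers inside it. The following is a hedged sketch of a forward and backward pass, reusing the assumed Tensor API from above with a toy scalar loss:

import numpy as np

# Forward pass on a batch of 8 samples with 10 features each
# (matching the input_dim=10 used when building the model).
x = nets.Tensor(np.random.randn(8, 10), requires_grad=True)
predictions = model(x)  # dispatches to Model.forward

# Toy scalar "loss" so there is something to back-propagate;
# a real training loop would use a proper loss function.
loss = (predictions * predictions).sum()
loss.backward()  # the autograd system fills in the parameter gradients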

Docs

Access comprehensive developer documentation for Nets


Tutorials

Get beginner tutorials and create state-of-the-art models


Resources

Check the GitHub page and contribute to the project
