tf_and_torch

This repository contains handy TensorFlow and PyTorch scripts for

  • Building custom models
  • Data loading and visualization
  • Model training and optimization
  • Visualizing model feature maps and kernels
  • Visualizing training
And much more to come...

TensorFlow

Basic Training Pipeline:

  1. Keras pipeline for Feed Forward NN
    Split the dataset with KFold, wrap each split in the tf.data Dataset API, then build and train the feed-forward NN. Finally, save the model and visualize the training process (a minimal sketch follows below).
    1. Tensorflow/Basic Pipeline/Basic.py.py
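
A minimal sketch of such a pipeline, assuming scikit-learn's KFold and a toy dataset in place of the real one (layer sizes and file names here are illustrative, not the ones used in Basic.py.py):

```python
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from sklearn.model_selection import KFold

# Toy data standing in for the real dataset (illustrative only)
X = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,)).astype("float32")

def build_model():
    # Simple feed-forward NN built with the Keras Sequential API
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(kfold.split(X)):
    # Wrap each split in the tf.data Dataset API
    train_ds = tf.data.Dataset.from_tensor_slices((X[train_idx], y[train_idx])).shuffle(1024).batch(32)
    val_ds = tf.data.Dataset.from_tensor_slices((X[val_idx], y[val_idx])).batch(32)

    model = build_model()
    history = model.fit(train_ds, validation_data=val_ds, epochs=10, verbose=0)
    model.save(f"ffnn_fold_{fold}.h5")  # save the trained model for this fold

    # Visualize the training process
    plt.plot(history.history["loss"], label=f"fold {fold} train")
    plt.plot(history.history["val_loss"], label=f"fold {fold} val")

plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```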

PyTorch

Custom CNN models made from scratch:

  1. NN using ModuleList
    A simple neural network built with nn.ModuleList(), so its architecture can be defined at runtime (see the sketch after this list).

    1. PyTorch/Models/NN.py
  2. VGG based models
    VGG is the simplest symmetric CNN to start with.

    1. PyTorch/Models/VGG_6.py
    2. PyTorch/Models/VGG_16.py
  3. ResNet based models
    ResNet (Residual Network) is another simple CNN; it uses skip connections to mitigate the vanishing-gradient problem, which lets the network go deeper (see the residual-block sketch after this list).

    1. PyTorch/Models/ResNet_8.py
    2. PyTorch/Models/ResNet_12.py
    3. PyTorch/Models/Resnet_18.py (graph)
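
A minimal sketch of the nn.ModuleList() idea referenced above (the layer sizes are illustrative, not the ones used in NN.py):

```python
import torch
import torch.nn as nn

class FlexibleNN(nn.Module):
    """Feed-forward net whose depth and width are decided at runtime from `layer_sizes`."""

    def __init__(self, layer_sizes):
        super().__init__()
        # nn.ModuleList registers every layer so its parameters are tracked by the optimizer
        self.layers = nn.ModuleList(
            nn.Linear(in_f, out_f)
            for in_f, out_f in zip(layer_sizes[:-1], layer_sizes[1:])
        )

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i < len(self.layers) - 1:  # no activation after the output layer
                x = torch.relu(x)
        return x

# The architecture is chosen at runtime, e.g. from a config file or CLI argument
model = FlexibleNN([784, 128, 64, 10])
out = model(torch.randn(32, 784))  # -> shape (32, 10)
```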

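And a minimal sketch of the skip connection the ResNet variants are built around (the channel count is illustrative):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: output = ReLU(F(x) + x)."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        identity = x                               # the skip connection
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity                       # add the input back in
        return torch.relu(out)

block = ResidualBlock(64)
y = block(torch.randn(1, 64, 32, 32))  # same shape in and out
```
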
Siamese and Triplet nets made from scratch:

  1. Siamese Network
    Siamese networks are twin networks that try to learn the best possible embeddings for the input data. The distance between these embeddings is then used to distinguish between images.

    1. PyTorch/Models/Simple_siamese.py
  2. Triplet Network
    Same as the Siamese networks, but here three networks try to learn the best possible embeddings for the input data.

    1. PyTorch/Models/Simple_triplet.py

I use the MNIST dataset as an example for both of the above models.
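
A minimal sketch of both ideas with a single shared embedding network (the embedding architecture and margin are illustrative, not those of Simple_siamese.py or Simple_triplet.py):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Shared network that maps an MNIST image to an embedding vector."""

    def __init__(self, embedding_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 256), nn.ReLU(),
            nn.Linear(256, embedding_dim),
        )

    def forward(self, x):
        return self.net(x)

embed = EmbeddingNet()

# Siamese: the same weights embed both inputs; the distance between the
# embeddings says how similar the two images are
x1, x2 = torch.randn(8, 1, 28, 28), torch.randn(8, 1, 28, 28)
distance = F.pairwise_distance(embed(x1), embed(x2))

# Triplet: anchor, positive and negative all pass through the same network;
# the loss pushes the positive closer to the anchor than the negative, by a margin
anchor, positive, negative = (torch.randn(8, 1, 28, 28) for _ in range(3))
loss = nn.TripletMarginLoss(margin=1.0)(embed(anchor), embed(positive), embed(negative))
```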

Training

The training folder contains various training pipelines.

Visualize Model

In this folder you can find scripts to visualize the kernels and weights learnt by a model. This is a good way to audit the learning process of your models.
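
A minimal sketch of the idea, using a pretrained torchvision ResNet-18 for illustration (any trained CNN's first convolution layer works the same way):

```python
import matplotlib.pyplot as plt
import torchvision

# A pretrained model stands in for one of the models trained in this repo
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
kernels = model.conv1.weight.detach().cpu()            # shape: (64, 3, 7, 7)

# Normalize to [0, 1] so the filters render as images
kernels = (kernels - kernels.min()) / (kernels.max() - kernels.min())

fig, axes = plt.subplots(8, 8, figsize=(8, 8))
for ax, kernel in zip(axes.flat, kernels):
    ax.imshow(kernel.permute(1, 2, 0).numpy())         # (H, W, C) for imshow
    ax.axis("off")
fig.suptitle("First-layer kernels")
plt.show()
```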