Pytorch

VMP 1.10.2020

PyTorch, No Tears
Python
Intermediate Python
Python Numpy Tutorial
Библиотека программиста: a selection of books on machine learning
PyTorch tutorial
TheAlgorithms/Python
pytorch.org
Pytorch.org tutorials
Tutorials: beginner deep_learning_60min_blitz
Pytorch
Pytorch examples
Pytorch tutorials
github DeepFaceLab

Pytorch
yunjey pytorch-tutorial
deeplearningzerotoall PyTorch
zergtant pytorch-handbook
chenyuntc pytorch-book
pytorch_geometric
PyTorch-GAN
MorvanZhou PyTorch-Tutorial
DEEP LEARNING (book)
The book “Programming with PyTorch: Building Deep Learning Applications”
Making PyTorch and C++ work together: using TorchScript
TorchScriptTutorial
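
A minimal sketch of the TorchScript workflow from the links above, using a toy model; the file name is illustrative. The traced module can later be loaded from C++ via torch::jit::load.

import torch

# Trace a toy module with an example input to get a ScriptModule
model = torch.nn.Linear(4, 2).eval()
example = torch.randn(1, 4)
traced = torch.jit.trace(model, example)
traced.save("linear.pt")  # illustrative file name; load from C++ with torch::jit::load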
Deep_Learning_for_Vision_Systems_by_Mohamed_Elgendy_z_lib_org.pdf

The uWSGI project
Introduction to WSGI servers: Part One
Flask Documentation (1.1.x)
Setting up mod_wsgi (Apache) for Flask
WSGI Servers Full Stack Python
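
For reference alongside the WSGI links above, a minimal Flask application exposed as a WSGI callable; the uWSGI invocation in the comment is illustrative.

# app.py – a minimal WSGI application built with Flask
from flask import Flask

app = Flask(__name__)  # "app" is the WSGI callable a server imports

@app.route("/")
def index():
    return "Hello from Flask behind a WSGI server"

# Example launch under uWSGI (options are illustrative):
#   uwsgi --http :8000 --wsgi-file app.py --callable app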


Neural Networks and Computer Vision – Lesson 66: Building the first neural network

Neural Networks and Computer Vision – Lesson 75: Classification in PyTorch
PyTorch-YOLOv3
YOLOv3-in-PyTorch
yolov3

PyTorch documentation
torch.nn
Convolution Layers
Conv2d
A Beginner’s Guide To Understanding Convolutional Neural Networks Part 1
A Beginner’s Guide To Understanding Convolutional Neural Networks Part 2
CS231n: Convolutional Neural Networks for Visual Recognition
Understanding of Convolutional Neural Network (CNN) — Deep Learning
Padding and Stride

Neural Networks and Computer Vision – Lesson 1

Neural Networks and Computer Vision – Lesson 103: Convolution and cascades of convolutions

Neural Networks and Computer Vision – Lesson 104: Convolution and cascades of convolutions

Neural Networks and Computer Vision – Lesson 105: Convolution and cascades of convolutions

Neural Networks and Computer Vision – Lesson 110: The LeNet architecture (1998)

Neural Networks and Computer Vision – Lessons 116–124: AlexNet (2012) and VGG (2014)

Neural Networks and Computer Vision – Lessons 125–130: GoogLeNet and ResNet (2015)

Neural Networks and Computer Vision – Lesson 131: Recognizing handwritten digits with a convolutional network

torch.nn.Conv2d(in_channels: int, out_channels: int, kernel_size: Union[T, Tuple[T, T]], stride: Union[T, Tuple[T, T]] = 1, padding: Union[T, Tuple[T, T]] = 0, dilation: Union[T, Tuple[T, T]] = 1, groups: int = 1, bias: bool = True, padding_mode: str = 'zeros')
Parameters
in_channels (int) – Number of channels in the input image
out_channels (int) – Number of channels produced by the convolution
kernel_size (int or tuple) – Size of the convolving kernel
stride (int or tuple, optional) – Stride of the convolution. Default: 1
padding (int or tuple, optional) – Zero-padding added to both sides of the input. Default: 0
padding_mode (string, optional) – 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'
dilation (int or tuple, optional) – Spacing between kernel elements. Default: 1
groups (int, optional) – Number of blocked connections from input channels to output channels. Default: 1
bias (bool, optional) – If True, adds a learnable bias to the output. Default: True
Pytorch-how-and-when-to-use-Module-Sequential-ModuleList-and-ModuleDict
>>> # With square kernels and equal stride
>>> m = nn.Conv2d(16, 33, 3, stride=2)
>>> # non-square kernels and unequal stride and with padding
>>> m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2))
>>> # non-square kernels and unequal stride and with padding and dilation
>>> m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2), dilation=(3, 1))
>>> input = torch.randn(20, 16, 50, 100)
>>> output = m(input)
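
The output spatial size follows from the parameters above: H_out = floor((H_in + 2*padding - dilation*(kernel_size - 1) - 1) / stride + 1), and likewise for W_out. A quick check of the formula against the last example:

>>> m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2), dilation=(3, 1))
>>> output = m(torch.randn(20, 16, 50, 100))
>>> # H_out = (50 + 2*4 - 3*(3-1) - 1)//2 + 1 = 26
>>> # W_out = (100 + 2*2 - 1*(5-1) - 1)//1 + 1 = 100
>>> output.shape
torch.Size([20, 33, 26, 100])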

import torch
import torch.nn as nn
import torch.nn.functional as F

class MyCNNClassifier(nn.Module):
    def __init__(self, in_c, n_classes):
        super().__init__()
        self.conv1 = nn.Conv2d(in_c, 32, kernel_size=3, stride=1, padding=1)
        self.bn1 = nn.BatchNorm2d(32)

        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, stride=1, padding=1)
        self.bn2 = nn.BatchNorm2d(64)

        self.fc1 = nn.Linear(64 * 28 * 28, 1024)
        self.fc2 = nn.Linear(1024, n_classes)

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = F.relu(x)

        x = self.conv2(x)
        x = self.bn2(x)
        x = F.relu(x)

        x = x.view(x.size(0), -1)  # flatten to (batch, 64 * 28 * 28)

        x = self.fc1(x)
        x = torch.sigmoid(x)  # F.sigmoid is deprecated; torch.sigmoid is equivalent
        x = self.fc2(x)

        return x

model = MyCNNClassifier(1, 10)
print(model)

MyCNNClassifier(
  (conv1): Conv2d(1, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (conv2): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (fc1): Linear(in_features=50176, out_features=1024, bias=True)
  (fc2): Linear(in_features=1024, out_features=10, bias=True)
)
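
A quick smoke test of the classifier above, assuming 28×28 single-channel inputs (e.g. MNIST), which is what the 64 * 28 * 28 input size of fc1 is dimensioned for:

x = torch.randn(4, 1, 28, 28)  # a fake batch of four MNIST-sized images
out = model(x)
print(out.shape)  # torch.Size([4, 10]) – one logit per class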

PyTorch-VAE
implementing-an-autoencoder-in-pytorch
Building Autoencoder in Pytorch
L1aoXingyu pytorch-beginner
kaggle.com autoencoders-with-pytorch
pytorch-beginner 08-AutoEncoder
kaggle Convolutional Autoencoder
Denoising-Autoencoder-in-Pytorch
github.com Autoencoders+pytorch
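
The links above cover autoencoders in PyTorch; as a minimal sketch (the layer sizes are illustrative, chosen for flattened 28×28 images):

import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(),
                                     nn.Linear(128, 32))
        self.decoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(),
                                     nn.Linear(128, 28 * 28), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

ae = AutoEncoder()
x = torch.rand(16, 28 * 28)      # a fake batch of flattened images
loss = nn.MSELoss()(ae(x), x)    # reconstruction loss to minimize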

pytorch mobile flutter
torch_mobile flutter plugin
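
A minimal sketch of exporting a model for PyTorch Mobile, which the flutter plugin above then loads on the device; the model and file name are illustrative.

import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

net = torch.nn.Sequential(torch.nn.Linear(4, 2)).eval()
scripted = torch.jit.script(net)              # convert to TorchScript
mobile_ready = optimize_for_mobile(scripted)  # fuse/fold ops for mobile
mobile_ready.save("model_mobile.pt")          # load this file from the mobile app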

GitHub PyTorch
Pytorch3d
PyTorch GitHub
PyTorch Geometric

PyTorch tutorial: from installation to a finished neural network
NVIDIA released a wrapper over PyTorch for training models
The most common mistakes made when training neural networks
StyleGAN2: an improved neural network for generating human faces
StyleGAN2 — Official TensorFlow Implementation
PyTorch — a quick guide (2019)
Understanding PyTorch with an example
PyTorch Tutorial: How to Develop Deep Learning Models with Python

torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, padding_mode='zeros')
And this URL has a helpful visualization of the process.
in_channels is 3 for color images (three channels), 1 for black-and-white images, and 4 for some satellite imagery.
out_channels is the number of filters, i.e. the number of feature maps the convolution produces.
Let’s create an example to “prove” that.
import torch
import torch.nn as nn

c = nn.Conv2d(1, 3, stride=1, kernel_size=(4, 5))
print(c.weight.shape)  # (out_channels, in_channels, kH, kW)
print(c.weight)
Out
torch.Size([3, 1, 4, 5])
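
Continuing the example: the weight shape reads (out_channels, in_channels, kH, kW), the bias has one entry per filter, and applying the layer yields 3 feature maps. The 28×28 input below is illustrative.

print(c.bias.shape)  # torch.Size([3]) – one bias per filter
x = torch.randn(1, 1, 28, 28)
print(c(x).shape)    # torch.Size([1, 3, 25, 24]): 28-4+1=25, 28-5+1=24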

Neural Networks and Computer Vision – Lesson 1

PyTorch at Tesla – Andrej Karpathy, Tesla

Pytorch Bidirectional LSTM example
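
Alongside the bidirectional LSTM example linked above, a minimal sketch (sizes are illustrative): with bidirectional=True the output feature dimension doubles, and h_n stacks num_layers * num_directions.

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=2,
               batch_first=True, bidirectional=True)
x = torch.randn(3, 7, 10)  # (batch, seq_len, features)
output, (h_n, c_n) = lstm(x)
print(output.shape)  # torch.Size([3, 7, 40]) – 2 * hidden_size
print(h_n.shape)     # torch.Size([4, 3, 20]) – num_layers * num_directions first
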
PyTorch’s popularity grew by an average of 243% over the year

Pytorch & related libraries

  1. pytorch: Tensors and Dynamic neural networks in Python with strong GPU acceleration.


NLP & Speech Processing:

  1. pytorch text: Torch text related contents.
  2. pytorch-seq2seq: A framework for sequence-to-sequence (seq2seq) models implemented in PyTorch.
  3. anuvada: Interpretable Models for NLP using PyTorch.
  4. audio: simple audio I/O for pytorch.
  5. loop: A method to generate speech across multiple speakers
  6. fairseq-py: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
  7. speech: PyTorch ASR Implementation.
  8. OpenNMT-py: Open-Source Neural Machine Translation in PyTorch http://opennmt.net
  9. neuralcoref: State-of-the-art coreference resolution based on neural nets and spaCy huggingface.co/coref
  10. sentiment-discovery: Unsupervised Language Modeling at scale for robust sentiment classification.
  11. MUSE: A library for Multilingual Unsupervised or Supervised word Embeddings
  12. nmtpytorch: Neural Machine Translation Framework in PyTorch.
  13. pytorch-wavenet: An implementation of WaveNet with fast generation
  14. Tacotron-pytorch: Tacotron: Towards End-to-End Speech Synthesis.
  15. AllenNLP: An open-source NLP research library, built on PyTorch.
  16. PyTorch-NLP: Text utilities and datasets for PyTorch pytorchnlp.readthedocs.io
  17. quick-nlp: Pytorch NLP library based on FastAI.
  18. TTS: Deep learning for Text2Speech
  19. LASER: Language-Agnostic SEntence Representations
  20. pyannote-audio: Neural building blocks for speaker diarization: speech activity detection, speaker change detection, speaker embedding
  21. gensen: Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning.
  22. translate: Translate – a PyTorch Language Library.
  23. espnet: End-to-End Speech Processing Toolkit espnet.github.io/espnet
  24. pythia: A software suite for Visual Question Answering
  25. UnsupervisedMT: Phrase-Based & Neural Unsupervised Machine Translation.
  26. jiant: The jiant sentence representation learning toolkit.
  27. BERT-PyTorch: Pytorch implementation of Google AI’s 2018 BERT, with simple annotation
  28. InferSent: Sentence embeddings (InferSent) and training code for NLI.
  29. uis-rnn: This is the library for the Unbounded Interleaved-State Recurrent Neural Network (UIS-RNN) algorithm, corresponding to the paper Fully Supervised Speaker Diarization. arxiv.org/abs/1810.04719
  30. flair: A very simple framework for state-of-the-art Natural Language Processing (NLP)
  31. pytext: A natural language modeling framework based on PyTorch fb.me/pytextdocs
  32. voicefilter: Unofficial PyTorch implementation of Google AI’s VoiceFilter system http://swpark.me/voicefilter
  33. BERT-NER: Pytorch-Named-Entity-Recognition-with-BERT.
  34. transfer-nlp: NLP library designed for flexible research and development
  35. texar-pytorch: Toolkit for Machine Learning and Text Generation, in PyTorch texar.io
  36. pytorch-kaldi: pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by pytorch, while feature extraction, label computation, and decoding are performed with the kaldi toolkit.
  37. NeMo: Neural Modules: a toolkit for conversational AI nvidia.github.io/NeMo
  38. pytorch-struct: A library of vectorized implementations of core structured prediction algorithms (HMM, Dep Trees, CKY, ..,)
  39. espresso: Espresso: A Fast End-to-End Neural Speech Recognition Toolkit
  40. transformers: huggingface Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. huggingface.co/transformers (see the sketch after this list)
  41. reformer-pytorch: Reformer, the efficient Transformer, in Pytorch
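
A minimal sketch of item 40 (huggingface transformers), assuming the package is installed; the checkpoint name is illustrative, and indexing with [0] works for both the tuple and ModelOutput return types across library versions.

from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
inputs = tokenizer("PyTorch, no tears.", return_tensors="pt")
outputs = model(**inputs)
print(outputs[0].shape)  # last hidden state: (1, sequence_length, 768)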

CV:

  1. pytorch vision: Datasets, Transforms and Models specific to Computer Vision (see the sketch after this list).
  2. pt-styletransfer: Neural style transfer as a class in PyTorch.
  3. OpenFacePytorch: PyTorch module to use OpenFace’s nn4.small2.v1.t7 model
  4. img_classification_pk_pytorch: Quickly comparing your image classification models with the state-of-the-art models (such as DenseNet, ResNet, …)
  5. SparseConvNet: Submanifold sparse convolutional networks.
  6. Convolution_LSTM_pytorch: A multi-layer convolution LSTM module
  7. face-alignment: 2D and 3D face alignment library built using pytorch adrianbulat.com
  8. pytorch-semantic-segmentation: PyTorch for Semantic Segmentation.
  9. RoIAlign.pytorch: This is a PyTorch version of RoIAlign. This implementation is based on crop_and_resize and supports both forward and backward on CPU and GPU.
  10. pytorch-cnn-finetune: Fine-tune pretrained Convolutional Neural Networks with PyTorch.
  11. detectorch: Detectorch – detectron for PyTorch
  12. Augmentor: Image augmentation library in Python for machine learning. http://augmentor.readthedocs.io
  13. s2cnn: This library contains a PyTorch implementation of the SO(3) equivariant CNNs for spherical signals (e.g. omnidirectional cameras, signals on the globe)
  14. TorchCV: A PyTorch-Based Framework for Deep Learning in Computer Vision.
  15. maskrcnn-benchmark: Fast, modular reference implementation of Instance Segmentation and Object Detection algorithms in PyTorch.
  16. image-classification-mobile: Collection of classification models pretrained on the ImageNet-1K.
  17. medicaltorch: A medical imaging framework for Pytorch http://medicaltorch.readthedocs.io
  18. albumentations: Fast image augmentation library.
  19. kornia: Differentiable computer vision library.
  20. pytorch-text-recognition: Text recognition combo – CRAFT + CRNN.
  21. facenet-pytorch: Pretrained Pytorch face detection and recognition models ported from davidsandberg/facenet.
  22. detectron2: Detectron2 is FAIR’s next-generation research platform for object detection and segmentation.
  23. vedaseg: A semantic segmentation framework by pytorch
  24. ClassyVision: An end-to-end PyTorch framework for image and video classification.
  25. detecto: Computer vision in Python with less than 10 lines of code
  26. pytorch3d: PyTorch3D is FAIR’s library of reusable components for deep learning with 3D data pytorch3d.org
  27. MMDetection: MMDetection is an open source object detection toolbox, a part of the OpenMMLab project.
  28. neural-dream: A PyTorch implementation of the DeepDream algorithm. Creates dream-like hallucinogenic visuals.
  29. FlashTorch: Visualization toolkit for neural networks in PyTorch!
  30. Lucent: Tensorflow and OpenAI Clarity’s Lucid adapted for PyTorch.
  31. MMDetection3D: MMDetection3D is OpenMMLab’s next-generation platform for general 3D object detection, a part of the OpenMMLab project.
  32. MMSegmentation: MMSegmentation is a semantic segmentation toolbox and benchmark, a part of the OpenMMLab project.
  33. MMEditing: MMEditing is an image and video editing toolbox, a part of the OpenMMLab project.
  34. MMAction2: MMAction2 is OpenMMLab’s next generation action understanding toolbox and benchmark, a part of the OpenMMLab project.
  35. MMPose: MMPose is a pose estimation toolbox and benchmark, a part of the OpenMMLab project.
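
A minimal sketch of item 1 (torchvision): running a pretrained classifier on a dummy batch. Note that pretrained=True downloads the weights on first use.

import torch
import torchvision.models as models

resnet = models.resnet18(pretrained=True).eval()
with torch.no_grad():
    logits = resnet(torch.randn(1, 3, 224, 224))  # dummy ImageNet-sized batch
print(logits.shape)  # torch.Size([1, 1000]) – one score per ImageNet class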

Probabilistic/Generative Libraries:

  1. ptstat: Probabilistic Programming and Statistical Inference in PyTorch
  2. pyro: Deep universal probabilistic programming with Python and PyTorch http://pyro.ai
  3. probtorch: Probabilistic Torch is a library for deep generative models that extends PyTorch.
  4. paysage: Unsupervised learning and generative models in python/pytorch.
  5. pyvarinf: Python package facilitating the use of Bayesian Deep Learning methods with Variational Inference for PyTorch.
  6. pyprob: A PyTorch-based library for probabilistic programming and inference compilation.
  7. mia: A library for running membership inference attacks against ML models.
  8. pro_gan_pytorch: ProGAN package implemented as an extension of PyTorch nn.Module.
  9. botorch: Bayesian optimization in PyTorch

Other libraries:

  1. pytorch extras: Some extra features for pytorch.
  2. functional zoo: PyTorch, unlike Lua Torch, has autograd at its core, so using the modular structure of torch.nn modules is not necessary; one can easily allocate the needed Variables and write a function that uses them, which is sometimes more convenient. This repo contains model definitions in this functional way, with pretrained weights for some models.
  3. torch-sampling: This package provides a set of transforms and data structures for sampling from in-memory or out-of-memory data.
  4. torchcraft-py: Python wrapper for TorchCraft, a bridge between Torch and StarCraft for AI research.
  5. aorun: Aorun intends to be a Keras with PyTorch as the backend.
  6. logger: A simple logger for experiments.
  7. PyTorch-docset: PyTorch docset! use with Dash, Zeal, Velocity, or LovelyDocs.
  8. convert_torch_to_pytorch: Convert torch t7 model to pytorch model and source.
  9. pretrained-models.pytorch: The goal of this repo is to help reproduce research paper results.
  10. pytorch_fft: PyTorch wrapper for FFTs
  11. caffe_to_torch_to_pytorch
  12. pytorch-extension: This is a CUDA extension for PyTorch which computes the Hadamard product of two tensors.
  13. tensorboard-pytorch: This module saves PyTorch tensors in tensorboard format for inspection. Currently supports scalar, image, audio, histogram features in tensorboard.
  14. gpytorch: GPyTorch is a Gaussian Process library, implemented using PyTorch. It is designed for creating flexible and modular Gaussian Process models with ease, so that you don’t have to be an expert to use GPs.
  15. spotlight: Deep recommender models using PyTorch.
  16. pytorch-cns: Compressed Network Search with PyTorch
  17. pyinn: CuPy fused PyTorch neural networks ops
  18. inferno: A utility library around PyTorch
  19. pytorch-fitmodule: Super simple fit method for PyTorch modules
  20. inferno-sklearn: A scikit-learn compatible neural network library that wraps pytorch.
  21. pytorch-caffe-darknet-convert: convert between pytorch, caffe prototxt/weights and darknet cfg/weights
  22. pytorch2caffe: Convert PyTorch model to Caffemodel
  23. pytorch-tools: Tools for PyTorch
  24. sru: Training RNNs as Fast as CNNs (arxiv.org/abs/1709.02755)
  25. torch2coreml: Torch7 -> CoreML
  26. PyTorch-Encoding: PyTorch Deep Texture Encoding Network http://hangzh.com/PyTorch-Encoding
  27. pytorch-ctc: PyTorch-CTC is an implementation of CTC (Connectionist Temporal Classification) beam search decoding for PyTorch. C++ code borrowed liberally from TensorFlow with some improvements to increase flexibility.
  28. candlegp: Gaussian Processes in Pytorch.
  29. dpwa: Distributed Learning by Pair-Wise Averaging.
  30. dni-pytorch: Decoupled Neural Interfaces using Synthetic Gradients for PyTorch.
  31. skorch: A scikit-learn compatible neural network library that wraps pytorch
  32. ignite: Ignite is a high-level library to help with training neural networks in PyTorch.
  33. Arnold: Arnold – DOOM Agent
  34. pytorch-mcn: Convert models from MatConvNet to PyTorch
  35. simple-faster-rcnn-pytorch: A simplified implementation of Faster R-CNN with competitive performance.
  36. generative_zoo: generative_zoo is a repository that provides working implementations of some generative models in PyTorch.
  37. pytorchviz: A small package to create visualizations of PyTorch execution graphs.
  38. cogitare: Cogitare – A Modern, Fast, and Modular Deep Learning and Machine Learning framework in Python.
  39. pydlt: PyTorch based Deep Learning Toolbox
  40. semi-supervised-pytorch: Implementations of different VAE-based semi-supervised and generative models in PyTorch.
  41. pytorch_cluster: PyTorch Extension Library of Optimised Graph Cluster Algorithms.
  42. neural-assembly-compiler: A neural assembly compiler for pyTorch based on adaptive-neural-compilation.
  43. caffemodel2pytorch: Convert Caffe models to PyTorch.
  44. extension-cpp: C++ extensions in PyTorch
  45. pytoune: A Keras-like framework and utilities for PyTorch
  46. jetson-reinforcement: Deep reinforcement learning libraries for NVIDIA Jetson TX1/TX2 with PyTorch, OpenAI Gym, and Gazebo robotics simulator.
  47. matchbox: Write PyTorch code at the level of individual examples, then run it efficiently on minibatches.
  48. torch-two-sample: A PyTorch library for two-sample tests
  49. pytorch-summary: Model summary in PyTorch similar to model.summary() in Keras
  50. mpl.pytorch: Pytorch implementation of MaxPoolingLoss.
  51. scVI-dev: Development branch of the scVI project in PyTorch
  52. apex: An experimental PyTorch extension (will be deprecated at a later point)
  53. ELF: ELF: a platform for game research.
  54. Torchlite: A high-level library on top of (not only) Pytorch
  55. joint-vae: Pytorch implementation of JointVAE, a framework for disentangling continuous and discrete factors of variation
  56. SLM-Lab: Modular Deep Reinforcement Learning framework in PyTorch.
  57. bindsnet: A Python package used for simulating spiking neural networks (SNNs) on CPUs or GPUs using PyTorch
  58. pro_gan_pytorch: ProGAN package implemented as an extension of PyTorch nn.Module
  59. pytorch_geometric: Geometric Deep Learning Extension Library for PyTorch
  60. torchplus: Implements the + operator on PyTorch modules, returning sequences.
  61. lagom: lagom: A light PyTorch infrastructure to quickly prototype reinforcement learning algorithms.
  62. torchbearer: torchbearer: A model training library for researchers using PyTorch.
  63. pytorch-maml-rl: Reinforcement Learning with Model-Agnostic Meta-Learning in Pytorch.
  64. NALU: Basic pytorch implementation of NAC/NALU from the Neural Arithmetic Logic Units paper by Trask et al. arxiv.org/pdf/1808.00508.pdf
  65. QuCumber: Neural Network Many-Body Wavefunction Reconstruction
  66. magnet: Deep Learning Projects that Build Themselves http://magnet-dl.readthedocs.io/
  67. opencv_transforms: OpenCV implementation of Torchvision’s image augmentations
  68. fastai: The fast.ai deep learning library, lessons, and tutorials
  69. pytorch-dense-correspondence: Code for “Dense Object Nets: Learning Dense Visual Object Descriptors By and For Robotic Manipulation” arxiv.org/pdf/1806.08756.pdf
  70. colorization-pytorch: PyTorch reimplementation of Interactive Deep Colorization richzhang.github.io/ideepcolor
  71. beauty-net: A simple, flexible, and extensible template for PyTorch. It’s beautiful.
  72. OpenChem: OpenChem: Deep Learning toolkit for Computational Chemistry and Drug Design Research mariewelt.github.io/OpenChem
  73. torchani: Accurate Neural Network Potential on PyTorch aiqm.github.io/torchani
  74. PyTorch-LBFGS: A PyTorch implementation of L-BFGS.
  75. gpytorch: A highly efficient and modular implementation of Gaussian Processes in PyTorch.
  76. hessian: hessian in pytorch.
  77. vel: Velocity in deep-learning research.
  78. nonechucks: Skip bad items in your PyTorch DataLoader, use Transforms as Filters, and more!
  79. torchstat: Model analyzer in PyTorch.
  80. QNNPACK: Quantized Neural Network PACKage – mobile-optimized implementation of quantized neural network operators.
  81. torchdiffeq: Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation.
  82. redner: A differentiable Monte Carlo path tracer
  83. pixyz: a library for developing deep generative models in a more concise, intuitive and extendable way.
  84. euclidesdb: A multi-model machine learning feature embedding database http://euclidesdb.readthedocs.io
  85. pytorch2keras: Convert PyTorch dynamic graph to Keras model.
  86. salad: Semi-Supervised Learning and Domain Adaptation.
  87. netharn: Parameterized fit and prediction harnesses for pytorch.
  88. dgl: Python package built to ease deep learning on graph, on top of existing DL frameworks. http://dgl.ai.
  89. gandissect: Pytorch-based tools for visualizing and understanding the neurons of a GAN. gandissect.csail.mit.edu
  90. delira: Lightweight framework for fast prototyping and training deep neural networks in medical imaging delira.rtfd.io
  91. mushroom: Python library for Reinforcement Learning experiments.
  92. Xlearn: Transfer Learning Library
  93. geoopt: Riemannian Adaptive Optimization Methods with pytorch optim
  94. vegans: A library providing various existing GANs in PyTorch.
  95. torchgeometry: TGM: PyTorch Geometry
  96. AdverTorch: A Toolbox for Adversarial Robustness (attack/defense/training) Research
  97. AdaBound: An optimizer that trains as fast as Adam and as good as SGD.
  98. fenchel-young-losses: Probabilistic classification in PyTorch/TensorFlow/scikit-learn with Fenchel-Young losses
  99. pytorch-OpCounter: Count the FLOPs of your PyTorch model.
  100. Tor10: A generic tensor-network library designed for quantum simulation, based on pytorch.
  101. Catalyst: High-level utils for PyTorch DL & RL research. It was developed with a focus on reproducibility, fast experimentation and code/ideas reusing. Being able to research/develop something new, rather than write another regular train loop.
  102. Ax: Adaptive Experimentation Platform
  103. pywick: High-level batteries-included neural network training library for Pytorch
  104. torchgpipe: A GPipe implementation in PyTorch torchgpipe.readthedocs.io
  105. hub: Pytorch Hub is a pre-trained model repository designed to facilitate research reproducibility.
  106. pytorch-lightning: Rapid research framework for Pytorch. The researcher’s version of keras.
  108. tensorwatch: Debugging, monitoring and visualization for Deep Learning and Reinforcement Learning from Microsoft Research.
  109. wavetorch: Numerically solving and backpropagating through the wave equation arxiv.org/abs/1904.12831
  110. diffdist: diffdist is a python library for pytorch. It extends the default functionality of torch.autograd and adds support for differentiable communication between processes.
  111. torchprof: A minimal dependency library for layer-by-layer profiling of Pytorch models.
  112. osqpth: The differentiable OSQP solver layer for PyTorch.
  113. mctorch: A manifold optimization library for deep learning.
  114. pytorch-hessian-eigenthings: Efficient PyTorch Hessian eigendecomposition using the Hessian-vector product and stochastic power iteration.
  115. MinkowskiEngine: Minkowski Engine is an auto-diff library for generalized sparse convolutions and high-dimensional sparse tensors.
  116. pytorch-cpp-rl: PyTorch C++ Reinforcement Learning
  117. pytorch-toolbelt: PyTorch extensions for fast R&D prototyping and Kaggle farming
  118. argus-tensor-stream: A library for real-time video stream decoding to CUDA memory tensorstream.argus-ai.com
  119. macarico: learning to search in pytorch
  120. rlpyt: Reinforcement Learning in PyTorch
  121. pywarm: A cleaner way to build neural networks for PyTorch. blue-season.github.io/pywarm
  122. learn2learn: PyTorch Meta-learning Framework for Researchers http://learn2learn.net
  123. torchbeast: A PyTorch Platform for Distributed RL
  124. higher: higher is a pytorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual training steps.
  125. Torchelie: Torchélie is a set of utility functions, layers, losses, models, trainers and other things for PyTorch. torchelie.readthedocs.org
  126. CrypTen: CrypTen is a Privacy Preserving Machine Learning framework written using PyTorch that allows researchers and developers to train models using encrypted data. CrypTen currently supports Secure multi-party computation as its encryption mechanism.
  127. cvxpylayers: cvxpylayers is a Python library for constructing differentiable convex optimization layers in PyTorch
  128. RepDistiller: Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
  129. kaolin: PyTorch library aimed at accelerating 3D deep learning research
  130. PySNN: Efficient Spiking Neural Network framework, built on top of PyTorch for GPU acceleration.
  131. sparktorch: Train and run Pytorch models on Apache Spark.
  132. pytorch-metric-learning: The easiest way to use metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
  133. autonomous-learning-library: A PyTorch library for building deep reinforcement learning agents.
  134. flambe: An ML framework to accelerate research and its path to production. flambe.ai
  135. pytorch-optimizer: Collections of modern optimization algorithms for PyTorch, includes: AccSGD, AdaBound, AdaMod, DiffGrad, Lamb, RAdam, Yogi.
  136. PyTorch-VAE: A Collection of Variational Autoencoders (VAE) in PyTorch.
  137. ray: A fast and simple framework for building and running distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library. ray.io
  138. Pytorch Geometric Temporal: A temporal extension library for PyTorch Geometric
  139. Poutyne: A Keras-like framework for PyTorch that handles much of the boilerplating code needed to train neural networks.
  140. Pytorch-Toolbox: This is a toolbox project for Pytorch, aiming to make Pytorch code easier to write, more readable, and concise.
  141. Pytorch-contrib: It contains reviewed implementations of ideas from recent machine learning papers.
  142. EfficientNet PyTorch: It contains an op-for-op PyTorch reimplementation of EfficientNet, along with pre-trained models and examples.
  143. PyTorch/XLA: PyTorch/XLA is a Python package that uses the XLA deep learning compiler to connect the PyTorch deep learning framework and Cloud TPUs.
  144. webdataset: WebDataset is a PyTorch Dataset (IterableDataset) implementation providing efficient access to datasets stored in POSIX tar archives.
  145. volksdep: volksdep is an open-source toolbox for deploying and accelerating PyTorch, Onnx and Tensorflow models with TensorRT.

Tutorials, books, & examples

  1. Practical Pytorch: Tutorials explaining different RNN models
  2. DeepLearningForNLPInPytorch: An IPython Notebook tutorial on deep learning, with an emphasis on Natural Language Processing.
  3. pytorch-tutorial: tutorial for researchers to learn deep learning with pytorch.
  4. pytorch-exercises: pytorch-exercises collection.
  5. pytorch tutorials: Various pytorch tutorials.
  6. pytorch examples: A repository showcasing examples of using pytorch
  7. pytorch practice: Some example scripts on pytorch.
  8. pytorch mini tutorials: Minimal tutorials for PyTorch adapted from Alec Radford’s Theano tutorials.
  9. pytorch text classification: A simple implementation of CNN based text classification in Pytorch
  10. cats vs dogs: Example of network fine-tuning in pytorch for the kaggle competition Dogs vs. Cats Redux: Kernels Edition. Currently #27 (0.05074) on the leaderboard.
  11. convnet: This is a complete training example for Deep Convolutional Networks on various datasets (ImageNet, Cifar10, Cifar100, MNIST).
  12. pytorch-generative-adversarial-networks: simple generative adversarial network (GAN) using PyTorch.
  13. pytorch containers: This repository aims to help former Torchies more seamlessly transition to the “Containerless” world of PyTorch by providing a list of PyTorch implementations of Torch Table Layers.
  14. T-SNE in pytorch: t-SNE experiments in pytorch
  15. AAE_pytorch: Adversarial Autoencoders (with Pytorch).
  16. Kind_PyTorch_Tutorial: Kind PyTorch Tutorial for beginners.
  17. pytorch-poetry-gen: a char-RNN based on pytorch.
  18. pytorch-REINFORCE: PyTorch implementation of REINFORCE, This repo supports both continuous and discrete environments in OpenAI gym.
  19. PyTorch-Tutorial: Build your neural network easy and fast https://morvanzhou.github.io/tutorials/
  20. pytorch-intro: A couple of scripts to illustrate how to do CNNs and RNNs in PyTorch
  21. pytorch-classification: A unified framework for the image classification task on CIFAR-10/100 and ImageNet.
  22. pytorch_notebooks – hardmaru: Random tutorials created in NumPy and PyTorch.
  23. pytorch_tutoria-quick: Quick PyTorch introduction and tutorial. Targets computer vision, graphics and machine learning researchers eager to try a new framework.
  24. Pytorch_fine_tuning_Tutorial: A short tutorial on performing fine tuning or transfer learning in PyTorch.
  25. pytorch_exercises: pytorch-exercises
  26. traffic-sign-detection: nyu-cv-fall-2017 example
  27. mss_pytorch: Singing Voice Separation via Recurrent Inference and Skip-Filtering Connections – PyTorch Implementation. Demo: js-mim.github.io/mss_pytorch
  28. DeepNLP-models-Pytorch: Pytorch implementations of various Deep NLP models from cs-224n (Stanford Univ: NLP with Deep Learning)
  29. Mila introductory tutorials: Various tutorials given for welcoming new students at MILA.
  30. pytorch.rl.learning: for learning reinforcement learning using PyTorch.
  31. minimal-seq2seq: Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch
  32. tensorly-notebooks: Tensor methods in Python with TensorLy tensorly.github.io/dev
  33. pytorch_bits: time-series prediction related examples.
  34. skip-thoughts: An implementation of Skip-Thought Vectors in PyTorch.
  35. video-caption-pytorch: pytorch code for video captioning.
  36. Capsule-Network-Tutorial: Pytorch easy-to-follow Capsule Network tutorial.
  37. code-of-learn-deep-learning-with-pytorch: This is code of book “Learn Deep Learning with PyTorch” item.jd.com/17915495606.html
  38. RL-Adventure: Pytorch easy-to-follow step-by-step Deep Q Learning tutorial with clean readable code.
  39. accelerated_dl_pytorch: Accelerated Deep Learning with PyTorch at Jupyter Day Atlanta II.
  40. RL-Adventure-2: PyTorch4 tutorial of: actor critic / proximal policy optimization / acer / ddpg / twin dueling ddpg / soft actor critic / generative adversarial imitation learning / hindsight experience replay
  41. Generative Adversarial Networks (GANs) in 50 lines of code (PyTorch)
  42. adversarial-autoencoders-with-pytorch
  43. transfer learning using pytorch
  44. how-to-implement-a-yolo-object-detector-in-pytorch
  45. pytorch-for-recommenders-101
  46. pytorch-for-numpy-users
  47. PyTorch Tutorial: PyTorch Tutorials in Chinese.
  48. grokking-pytorch: The Hitchiker’s Guide to PyTorch
  49. PyTorch-Deep-Learning-Minicourse: Minicourse in Deep Learning with PyTorch.
  50. pytorch-custom-dataset-examples: Some custom dataset examples for PyTorch
  51. Multiplicative LSTM for sequence-based Recommenders
  52. deeplearning.ai-pytorch: PyTorch Implementations of Coursera’s Deep Learning(deeplearning.ai) Specialization.
  53. MNIST_Pytorch_python_and_capi: This is an example of how to train a MNIST network in Python and run it in c++ with pytorch 1.0
  54. torch_light: Tutorials and examples include Reinforcement Training, NLP, CV
  55. portrain-gan: torch code to decode (and almost encode) latents from art-DCGAN’s Portrait GAN.
  56. mri-analysis-pytorch: MRI analysis using PyTorch and MedicalTorch
  57. cifar10-fast: Demonstration of training a small ResNet on CIFAR10 to 94% test accuracy in 79 seconds as described in this blog series.
  58. Intro to Deep Learning with PyTorch: A free course by Udacity and Facebook, with a good intro to PyTorch, and an interview with Soumith Chintala, one of the original authors of PyTorch.
  59. pytorch-sentiment-analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
  60. pytorch-image-models: PyTorch image models, scripts, pretrained weights — (SE)ResNet/ResNeXT, DPN, EfficientNet, MobileNet-V3/V2/V1, MNASNet, Single-Path NAS, FBNet, and more.
  61. CIFAR-ZOO: Pytorch implementation for multiple CNN architectures and improve methods with state-of-the-art results.
  62. d2l-pytorch: This is an attempt to modify Dive into Deep Learning, Berkeley STAT 157 (Spring 2019) textbook’s code into PyTorch.
  63. thinking-in-tensors-writing-in-pytorch: Thinking in tensors, writing in PyTorch (a hands-on deep learning intro).
  64. NER-BERT-pytorch: PyTorch solution of named entity recognition task Using Google AI’s pre-trained BERT model.
  65. pytorch-sync-batchnorm-example: How to use Cross Replica / Synchronized Batchnorm in Pytorch.
  66. SentimentAnalysis: Sentiment analysis neural network trained by fine-tuning BERT on the Stanford Sentiment Treebank, thanks to Hugging Face’s Transformers library.
  67. pytorch-cpp: C++ implementations of PyTorch tutorials for deep learning researchers (based on the Python tutorials from pytorch-tutorial).
  68. Deep Learning with PyTorch: Zero to GANs: Interactive and coding-focused tutorial series on introduction to Deep Learning with PyTorch (video).
  69. Deep Learning with PyTorch: Deep Learning with PyTorch teaches you how to implement deep learning algorithms with Python and PyTorch; the book includes a case study: building an algorithm capable of detecting malignant lung tumors using CT scans.
  70. Serverless Machine Learning in Action with PyTorch and AWS: Serverless Machine Learning in Action is a guide to bringing your experimental PyTorch machine learning code to production using serverless capabilities from major cloud providers like AWS, Azure, or GCP.

Paper implementations

  1. google_evolution: This implements one of the result networks from Large-scale evolution of image classifiers by Esteban Real et al.
  2. pyscatwave: Fast Scattering Transform with CuPy/PyTorch; read the paper here
  3. scalingscattering: Scaling The Scattering Transform : Deep Hybrid Networks.
  4. deep-auto-punctuation: a pytorch implementation of auto-punctuation learned character by character.
  5. Realtime_Multi-Person_Pose_Estimation: This is a pytorch version of Realtime_Multi-Person_Pose_Estimation; the original code is here.
  6. PyTorch-value-iteration-networks: PyTorch implementation of the Value Iteration Networks (NIPS ’16) paper
  7. pytorch_Highway: Highway network implemented in pytorch.
  8. pytorch_NEG_loss: NEG loss implemented in pytorch.
  9. pytorch_RVAE: Recurrent Variational Autoencoder that generates sequential data implemented in pytorch.
  10. pytorch_TDNN: Time Delayed NN implemented in pytorch.
  11. eve.pytorch: An implementation of the Eve optimizer, proposed in Improving Stochastic Gradient Descent with Feedback, Koushik and Hayashi, 2016.
  12. e2e-model-learning: Task-based end-to-end model learning.
  13. pix2pix-pytorch: PyTorch implementation of “Image-to-Image Translation Using Conditional Adversarial Networks”.
  14. Single Shot MultiBox Detector: A PyTorch Implementation of Single Shot MultiBox Detector.
  15. DiscoGAN: PyTorch implementation of “Learning to Discover Cross-Domain Relations with Generative Adversarial Networks”
  16. official DiscoGAN implementation: Official implementation of “Learning to Discover Cross-Domain Relations with Generative Adversarial Networks”.
  17. pytorch-es: This is a PyTorch implementation of Evolution Strategies.
  18. piwise: Pixel-wise segmentation on VOC2012 dataset using pytorch.
  19. pytorch-dqn: Deep Q-Learning Network in pytorch.
  20. neuraltalk2-pytorch: image captioning model in pytorch (finetunable CNN in the with_finetune branch)
  21. vnet.pytorch: A Pytorch implementation for V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation.
  22. pytorch-fcn: PyTorch implementation of Fully Convolutional Networks.
  23. WideResNets: WideResNets for CIFAR10/100 implemented in PyTorch. This implementation requires less GPU memory than the official Torch implementation: https://github.com/szagoruyko/wide-residual-networks.
  24. pytorch_highway_networks: Highway networks implemented in PyTorch.
  25. pytorch-NeuCom: Pytorch implementation of DeepMind’s differentiable neural computer paper.
  26. captionGen: Generate captions for an image using PyTorch.
  27. AnimeGAN: A simple PyTorch Implementation of Generative Adversarial Networks, focusing on anime face drawing.
  28. Cnn-text classification: This is the implementation of Kim’s Convolutional Neural Networks for Sentence Classification paper in PyTorch.
  29. deepspeech2: Implementation of DeepSpeech2 using Baidu Warp-CTC. Creates a network based on the DeepSpeech2 architecture, trained with the CTC activation function.
  30. seq2seq: This repository contains implementations of Sequence to Sequence (Seq2Seq) models in PyTorch
  31. Asynchronous Advantage Actor-Critic in PyTorch: This is a PyTorch implementation of A3C as described in Asynchronous Methods for Deep Reinforcement Learning. Since PyTorch has an easy way to control shared memory across processes, asynchronous methods like A3C can be implemented easily.
  32. densenet: This is a PyTorch implementation of the DenseNet-BC architecture as described in the paper Densely Connected Convolutional Networks by G. Huang, Z. Liu, K. Weinberger, and L. van der Maaten. This implementation gets a CIFAR-10+ error rate of 4.77 with a 100-layer DenseNet-BC with a growth rate of 12. Their official implementation and links to many other third-party implementations are available in the liuzhuang13/DenseNet repo on GitHub.
  33. nninit: Weight initialization schemes for PyTorch nn.Modules. This is a port of the popular nninit for Torch7 by @kaixhin.
  34. faster rcnn: This is a PyTorch implementation of Faster RCNN. This project is mainly based on py-faster-rcnn and TFFRCNN. For details about R-CNN please refer to the paper Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks by Shaoqing Ren, Kaiming He, Ross Girshick, Jian Sun.
  35. doomnet: PyTorch’s version of Doom-net implementing some RL models in ViZDoom environment.
  36. flownet: Pytorch implementation of FlowNet by Dosovitskiy et al.
  37. sqeezenet: Implementation of SqueezeNet in pytorch; pretrained models on CIFAR-10 to come. The plan is to train the model on CIFAR-10 and add block connections too.
  38. WassersteinGAN: wassersteinGAN in pytorch.
  39. optnet: This repository is by Brandon Amos and J. Zico Kolter and contains the PyTorch source code to reproduce the experiments in our paper OptNet: Differentiable Optimization as a Layer in Neural Networks.
  40. qp solver: A fast and differentiable QP solver for PyTorch. Crafted by Brandon Amos and J. Zico Kolter.
  41. Continuous Deep Q-Learning with Model-based Acceleration : Reimplementation of Continuous Deep Q-Learning with Model-based Acceleration.
  42. Learning to learn by gradient descent by gradient descent: PyTorch implementation of Learning to learn by gradient descent by gradient descent.
  43. fast-neural-style: pytorch implementation of fast-neural-style, The model uses the method described in Perceptual Losses for Real-Time Style Transfer and Super-Resolution along with Instance Normalization.
  44. PytorchNeuralStyleTransfer: Implementation of Neural Style Transfer in Pytorch.
  45. Fast Neural Style for Image Style Transform by Pytorch: Fast Neural Style for Image Style Transform by Pytorch .
  46. neural style transfer: An introduction to PyTorch through the Neural-Style algorithm (https://arxiv.org/abs/1508.06576) developed by Leon A. Gatys, Alexander S. Ecker and Matthias Bethge.
  47. VIN_PyTorch_Visdom: PyTorch implementation of Value Iteration Networks (VIN): Clean, Simple and Modular. Visualization in Visdom.
  48. YOLO2: YOLOv2 in PyTorch.
  49. attention-transfer: Attention transfer in pytorch, read the paper here.
  50. SVHNClassifier: A PyTorch implementation of Multi-digit Number Recognition from Street View Imagery using Deep Convolutional Neural Networks.
  51. pytorch-deform-conv: PyTorch implementation of Deformable Convolution.
  52. BEGAN-pytorch: PyTorch implementation of BEGAN: Boundary Equilibrium Generative Adversarial Networks.
  53. treelstm.pytorch: Tree LSTM implementation in PyTorch.
  54. AGE: Code for paper “Adversarial Generator-Encoder Networks” by Dmitry Ulyanov, Andrea Vedaldi and Victor Lempitsky which can be found here
  55. ResNeXt.pytorch: Reproduces ResNet-V3 (Aggregated Residual Transformations for Deep Neural Networks) with pytorch.
  56. pytorch-rl: Deep Reinforcement Learning with pytorch & visdom
  57. Deep-Leafsnap: LeafSnap replicated using deep neural networks to test accuracy compared to traditional computer vision methods.
  58. pytorch-CycleGAN-and-pix2pix: PyTorch implementation for both unpaired and paired image-to-image translation.
  59. A3C-PyTorch: PyTorch implementation of Advantage Async Actor-Critic (A3C) algorithms
  60. pytorch-value-iteration-networks: Pytorch implementation of Value Iteration Networks (NIPS 2016 best paper)
  61. PyTorch-Style-Transfer: PyTorch Implementation of Multi-style Generative Network for Real-time Transfer
  62. pytorch-deeplab-resnet: pytorch-deeplab-resnet-model.
  63. pointnet.pytorch: pytorch implementation for “PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation” https://arxiv.org/abs/1612.00593
  64. pytorch-playground: Base pretrained models and datasets in pytorch (MNIST, SVHN, CIFAR10, CIFAR100, STL10, AlexNet, VGG16, VGG19, ResNet, Inception, SqueezeNet).
  65. pytorch-dnc: Neural Turing Machine (NTM) & Differentiable Neural Computer (DNC) with pytorch & visdom.
  66. pytorch_image_classifier: Minimal but practical image classifier pipeline using pytorch; finetuned from ResNet18, reaching 99% accuracy on small custom datasets.
  67. mnist-svhn-transfer: PyTorch Implementation of CycleGAN and SGAN for Domain Transfer (Minimal).
  68. pytorch-yolo2: pytorch-yolo2
  69. dni: Implement Decoupled Neural Interfaces using Synthetic Gradients in Pytorch
  70. wgan-gp: A pytorch implementation of Paper “Improved Training of Wasserstein GANs”.
  71. pytorch-seq2seq-intent-parsing: Intent parsing and slot filling in PyTorch with seq2seq + attention
  72. pyTorch_NCE: An implementation of the Noise Contrastive Estimation algorithm for pyTorch. Working, yet not very efficient.
  73. molencoder: Molecular AutoEncoder in PyTorch
  74. GAN-weight-norm: Code for “On the Effects of Batch and Weight Normalization in Generative Adversarial Networks”
  75. lgamma: Implementations of polygamma, lgamma, and beta functions for PyTorch
  76. bigBatch: Code used to generate the results appearing in “Train longer, generalize better: closing the generalization gap in large batch training of neural networks”
  77. rl_a3c_pytorch: Reinforcement learning with implementation of A3C LSTM for Atari 2600.
  78. pytorch-retraining: Transfer Learning Shootout for PyTorch’s model zoo (torchvision)
  79. nmp_qc: Neural Message Passing for Computer Vision
  80. grad-cam: Pytorch implementation of Grad-CAM
  81. pytorch-trpo: PyTorch Implementation of Trust Region Policy Optimization (TRPO)
  82. pytorch-explain-black-box: PyTorch implementation of Interpretable Explanations of Black Boxes by Meaningful Perturbation
  83. vae_vpflows: Code in PyTorch for the convex combination linear IAF and the Householder Flow, J.M. Tomczak & M. Welling https://jmtomczak.github.io/deebmed.html
  84. relational-networks: Pytorch implementation of “A simple neural network module for relational reasoning” (Relational Networks) https://arxiv.org/pdf/1706.01427.pdf
  85. vqa.pytorch: Visual Question Answering in Pytorch
  86. end-to-end-negotiator: Deal or No Deal? End-to-End Learning for Negotiation Dialogues
  87. odin-pytorch: Principled Detection of Out-of-Distribution Examples in Neural Networks.
  88. FreezeOut: Accelerate Neural Net Training by Progressively Freezing Layers.
  89. ARAE: Code for the paper “Adversarially Regularized Autoencoders for Generating Discrete Structures” by Zhao, Kim, Zhang, Rush and LeCun.
  90. forward-thinking-pytorch: Pytorch implementation of “Forward Thinking: Building and Training Neural Networks One Layer at a Time” https://arxiv.org/pdf/1706.02480.pdf
  91. context_encoder_pytorch: PyTorch Implement of Context Encoders
  92. attention-is-all-you-need-pytorch: A PyTorch implementation of the Transformer model in “Attention is All You Need”.
  93. OpenFacePytorch: PyTorch module to use OpenFace’s nn4.small2.v1.t7 model
  94. neural-combinatorial-rl-pytorch: PyTorch implementation of Neural Combinatorial Optimization with Reinforcement Learning.
  95. pytorch-nec: PyTorch Implementation of Neural Episodic Control (NEC)
  96. seq2seq.pytorch: Sequence-to-Sequence learning using PyTorch
  97. Pytorch-Sketch-RNN: a pytorch implementation of arxiv.org/abs/1704.03477
  98. pytorch-pruning: PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference
  99. DrQA: A pytorch implementation of Reading Wikipedia to Answer Open-Domain Questions.
  100. YellowFin_Pytorch: auto-tuning momentum SGD optimizer
  101. samplernn-pytorch: PyTorch implementation of SampleRNN: An Unconditional End-to-End Neural Audio Generation Model.
  102. AEGeAN: Deeper DCGAN with AE stabilization
  103. pytorch-SRResNet: pytorch implementation for Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network arXiv:1609.04802v2
  104. vsepp: Code for the paper “VSE++: Improved Visual Semantic Embeddings”
  105. Pytorch-DPPO: Pytorch implementation of Distributed Proximal Policy Optimization: arxiv.org/abs/1707.02286
  106. UNIT: PyTorch Implementation of our Coupled VAE-GAN algorithm for Unsupervised Image-to-Image Translation
  107. efficient_densenet_pytorch: A memory-efficient implementation of DenseNets
  108. tsn-pytorch: Temporal Segment Networks (TSN) in PyTorch.
  109. SMASH: An experimental technique for efficiently exploring neural architectures.
  110. pytorch-retinanet: RetinaNet in PyTorch
  111. biogans: Implementation supporting the ICCV 2017 paper “GANs for Biological Image Synthesis”.
  112. Semantic Image Synthesis via Adversarial Learning: A PyTorch implementation of the paper “Semantic Image Synthesis via Adversarial Learning” in ICCV 2017.
  113. fmpytorch: A PyTorch implementation of a Factorization Machine module in cython.
  114. ORN: A PyTorch implementation of the paper “Oriented Response Networks” in CVPR 2017.
  115. pytorch-maml: PyTorch implementation of MAML: arxiv.org/abs/1703.03400
  116. pytorch-generative-model-collections: Collection of generative models in Pytorch version.
  117. vqa-winner-cvprw-2017: Pytorch implementation of the winner from the VQA Challenge Workshop in CVPR’17.
  118. tacotron_pytorch: PyTorch implementation of Tacotron speech synthesis model.
  119. pspnet-pytorch: PyTorch implementation of PSPNet segmentation network
  120. LM-LSTM-CRF: Empower Sequence Labeling with Task-Aware Language Model http://arxiv.org/abs/1709.04109
  121. face-alignment: Pytorch implementation of the paper “How far are we from solving the 2D & 3D Face Alignment problem? (and a dataset of 230,000 3D facial landmarks)”, ICCV 2017
  122. DepthNet: PyTorch DepthNet Training on Still Box dataset.
  123. EDSR-PyTorch: PyTorch version of the paper ‘Enhanced Deep Residual Networks for Single Image Super-Resolution’ (CVPRW 2017)
  124. e2c-pytorch: Embed to Control implementation in PyTorch.
  125. 3D-ResNets-PyTorch: 3D ResNets for Action Recognition.
  126. bandit-nmt: This is code repo for our EMNLP 2017 paper “Reinforcement Learning for Bandit Neural Machine Translation with Simulated Human Feedback”, which implements the A2C algorithm on top of a neural encoder-decoder model and benchmarks the combination under simulated noisy rewards.
  127. pytorch-a2c-ppo-acktr: PyTorch implementation of Advantage Actor Critic (A2C), Proximal Policy Optimization (PPO) and Scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation (ACKTR).
  128. zalando-pytorch: Various experiments on the Fashion-MNIST dataset from Zalando.
  129. sphereface_pytorch: A PyTorch Implementation of SphereFace.
  130. Categorical DQN: A PyTorch Implementation of Categorical DQN from A Distributional Perspective on Reinforcement Learning.
  131. pytorch-ntm: pytorch ntm implementation.
  132. mask_rcnn_pytorch: Mask RCNN in PyTorch.
  133. graph_convnets_pytorch: PyTorch implementation of graph ConvNets, NIPS’16
  134. pytorch-faster-rcnn: A pytorch implementation of faster RCNN detection framework based on Xinlei Chen’s tf-faster-rcnn.
  135. torchMoji: A pyTorch implementation of the DeepMoji model: state-of-the-art deep learning model for analyzing sentiment, emotion, sarcasm etc.
  136. semantic-segmentation-pytorch: Pytorch implementation for Semantic Segmentation/Scene Parsing on MIT ADE20K dataset
  137. pytorch-qrnn: PyTorch implementation of the Quasi-Recurrent Neural Network – up to 16 times faster than NVIDIA’s cuDNN LSTM
  138. pytorch-sgns: Skipgram Negative Sampling in PyTorch.
  139. SfmLearner-Pytorch : Pytorch version of SfmLearner from Tinghui Zhou et al.
  140. deformable-convolution-pytorch: PyTorch implementation of Deformable Convolution.
  141. skip-gram-pytorch: A complete pytorch implementation of skipgram model (with subsampling and negative sampling). The embedding result is tested with Spearman’s rank correlation.
  142. stackGAN-v2: Pytorch implementation for reproducing StackGAN_v2 results in the paper StackGAN++: Realistic Image Synthesis with Stacked Generative Adversarial Networks by Han Zhang*, Tao Xu*, Hongsheng Li, Shaoting Zhang, Xiaogang Wang, Xiaolei Huang, Dimitris Metaxas.
  143. self-critical.pytorch: Unofficial pytorch implementation for Self-critical Sequence Training for Image Captioning.
  144. pygcn: Graph Convolutional Networks in PyTorch.
  145. dnc: Differentiable Neural Computers, for Pytorch
  146. prog_gans_pytorch_inference: PyTorch inference for “Progressive Growing of GANs” with CelebA snapshot.
  147. pytorch-capsule: Pytorch implementation of Hinton’s Dynamic Routing Between Capsules.
  148. PyramidNet-PyTorch: A PyTorch implementation for PyramidNets (Deep Pyramidal Residual Networks, arxiv.org/abs/1610.02915)
  149. radio-transformer-networks: A PyTorch implementation of Radio Transformer Networks from the paper “An Introduction to Deep Learning for the Physical Layer”. arxiv.org/abs/1702.00832
  150. honk: PyTorch reimplementation of Google’s TensorFlow CNNs for keyword spotting.
  151. DeepCORAL: A PyTorch implementation of ‘Deep CORAL: Correlation Alignment for Deep Domain Adaptation.’, ECCV 2016
  152. pytorch-pose: A PyTorch toolkit for 2D Human Pose Estimation.
  153. lang-emerge-parlai: Implementation of EMNLP 2017 Paper “Natural Language Does Not Emerge ‘Naturally’ in Multi-Agent Dialog” using PyTorch and ParlAI
  154. Rainbow: Rainbow: Combining Improvements in Deep Reinforcement Learning
  155. pytorch_compact_bilinear_pooling v1: This repository has a pure Python implementation of Compact Bilinear Pooling and Count Sketch for PyTorch.
  156. CompactBilinearPooling-Pytorch v2: (Yang Gao, et al.) A Pytorch Implementation for Compact Bilinear Pooling.
  157. FewShotLearning: Pytorch implementation of the paper “Optimization as a Model for Few-Shot Learning”
  158. meProp: Codes for “meProp: Sparsified Back Propagation for Accelerated Deep Learning with Reduced Overfitting”.
  159. SFD_pytorch: A PyTorch Implementation of Single Shot Scale-invariant Face Detector.
  160. GradientEpisodicMemory: Continuum Learning with GEM: Gradient Episodic Memory. https://arxiv.org/abs/1706.08840
  161. DeblurGAN: Pytorch implementation of the paper DeblurGAN: Blind Motion Deblurring Using Conditional Adversarial Networks.
  162. StarGAN: StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation.
  163. CapsNet-pytorch: PyTorch implementation of NIPS 2017 paper Dynamic Routing Between Capsules.
  164. CondenseNet: CondenseNet: An Efficient DenseNet using Learned Group Convolutions.
  165. deep-image-prior: Image restoration with neural networks but without learning.
  166. deep-head-pose: Deep Learning Head Pose Estimation using PyTorch.
  167. Random-Erasing: This repository contains the source code for the paper “Random Erasing Data Augmentation”.
  168. FaderNetworks: Fader Networks: Manipulating Images by Sliding Attributes – NIPS 2017
  169. FlowNet 2.0: FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks
  170. pix2pixHD: Synthesizing and manipulating 2048×1024 images with conditional GANs tcwang0509.github.io/pix2pixHD
  171. pytorch-smoothgrad: SmoothGrad implementation in PyTorch
  172. RetinaNet: An implementation of RetinaNet in PyTorch.
  173. faster-rcnn.pytorch: This project is a faster Faster R-CNN implementation, aimed at accelerating the training of Faster R-CNN object detection models.
  174. mixup_pytorch: A PyTorch implementation of the paper Mixup: Beyond Empirical Risk Minimization in PyTorch.
  175. inplace_abn: In-Place Activated BatchNorm for Memory-Optimized Training of DNNs
  176. pytorch-pose-hg-3d: PyTorch implementation for 3D human pose estimation
  177. nmn-pytorch: Neural Module Network for VQA in Pytorch.
  178. bytenet: Pytorch implementation of bytenet from “Neural Machine Translation in Linear Time” paper
  179. bottom-up-attention-vqa: vqa, bottom-up-attention, pytorch
  180. yolo2-pytorch: YOLOv2 is one of the most popular one-stage object detectors. This project adopts PyTorch as the development framework to increase productivity and uses ONNX to convert models into Caffe2 to benefit engineering deployment.
  181. reseg-pytorch: PyTorch Implementation of ReSeg (arxiv.org/pdf/1511.07053.pdf)
  182. binary-stochastic-neurons: Binary Stochastic Neurons in PyTorch.
  183. pytorch-pose-estimation: PyTorch Implementation of Realtime Multi-Person Pose Estimation project.
  184. interaction_network_pytorch: Pytorch Implementation of Interaction Networks for Learning about Objects, Relations and Physics.
  185. NoisyNaturalGradient: Pytorch Implementation of paper “Noisy Natural Gradient as Variational Inference”.
  186. ewc.pytorch: An implementation of Elastic Weight Consolidation (EWC), proposed in James Kirkpatrick et al. Overcoming catastrophic forgetting in neural networks 2016(10.1073/pnas.1611835114).
  187. pytorch-zssr: PyTorch implementation of 1712.06087 “Zero-Shot” Super-Resolution using Deep Internal Learning
  188. deep_image_prior: An implementation of image reconstruction methods from Deep Image Prior (Ulyanov et al., 2017) in PyTorch.
  189. pytorch-transformer: pytorch implementation of Attention is all you need.
  190. DeepRL-Grounding: This is a PyTorch implementation of the AAAI-18 paper Gated-Attention Architectures for Task-Oriented Language Grounding
  191. deep-forecast-pytorch: Wind Speed Prediction using LSTMs in PyTorch (arxiv.org/pdf/1707.08110.pdf)
  192. cat-net: Canonical Appearance Transformations
  193. minimal_glo: Minimal PyTorch implementation of Generative Latent Optimization from the paper “Optimizing the Latent Space of Generative Networks”
  194. LearningToCompare-Pytorch: Pytorch Implementation for Paper: Learning to Compare: Relation Network for Few-Shot Learning.
  195. poincare-embeddings: PyTorch implementation of the NIPS-17 paper “Poincaré Embeddings for Learning Hierarchical Representations”.
  196. pytorch-trpo(Hessian-vector product version): This is a PyTorch implementation of “Trust Region Policy Optimization (TRPO)” with exact Hessian-vector product instead of finite differences approximation.
  197. ggnn.pytorch: A PyTorch Implementation of Gated Graph Sequence Neural Networks (GGNN).
  198. visual-interaction-networks-pytorch: This is an implementation of DeepMind’s Visual Interaction Networks paper using PyTorch.
  199. adversarial-patch: PyTorch implementation of adversarial patch.
  200. Prototypical-Networks-for-Few-shot-Learning-PyTorch: Implementation of Prototypical Networks for Few Shot Learning (arxiv.org/abs/1703.05175) in Pytorch
  201. Visual-Feature-Attribution-Using-Wasserstein-GANs-Pytorch: Implementation of Visual Feature Attribution using Wasserstein GANs (arxiv.org/abs/1711.08998) in PyTorch.
  202. PhotographicImageSynthesiswithCascadedRefinementNetworks-Pytorch: Photographic Image Synthesis with Cascaded Refinement Networks – Pytorch Implementation
  203. ENAS-pytorch: PyTorch implementation of “Efficient Neural Architecture Search via Parameters Sharing”.
  204. Neural-IMage-Assessment: A PyTorch Implementation of Neural IMage Assessment.
  205. proxprop: Proximal Backpropagation – a neural network training algorithm that takes implicit instead of explicit gradient steps.
  206. FastPhotoStyle: A Closed-form Solution to Photorealistic Image Stylization
  207. Deep-Image-Analogy-PyTorch: A python implementation of Deep-Image-Analogy based on pytorch.
  208. Person-reID_pytorch: PyTorch for Person re-ID.
  209. pt-dilate-rnn: Dilated RNNs in pytorch.
  210. pytorch-i-revnet: Pytorch implementation of i-RevNets.
  211. OrthNet: TensorFlow and PyTorch layers for generating Orthogonal Polynomials.
  212. DRRN-pytorch: An implementation of Deep Recursive Residual Network for Super Resolution (DRRN), CVPR 2017
  213. shampoo.pytorch: An implementation of shampoo.
  214. Neural-IMage-Assessment 2: A PyTorch Implementation of Neural IMage Assessment.
  215. TCN: Sequence modeling benchmarks and temporal convolutional networks locuslab/TCN
  216. DCC: This repository contains the source code and data for reproducing results of Deep Continuous Clustering paper.
  217. packnet: Code for PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning arxiv.org/abs/1711.05769
  218. PyTorch-progressive_growing_of_gans: PyTorch implementation of Progressive Growing of GANs for Improved Quality, Stability, and Variation.
  219. nonauto-nmt: PyTorch Implementation of “Non-Autoregressive Neural Machine Translation”
  220. PyTorch-GAN: PyTorch implementations of Generative Adversarial Networks.
  221. PyTorchWavelets: PyTorch implementation of the wavelet analysis found in Torrence and Compo (1998)
  222. pytorch-made: MADE (Masked Autoencoder Density Estimation) implementation in PyTorch
  223. VRNN: Pytorch implementation of the Variational RNN (VRNN), from A Recurrent Latent Variable Model for Sequential Data.
  224. flow: Pytorch implementation of ICLR 2018 paper Deep Learning for Physical Processes: Integrating Prior Scientific Knowledge.
  225. deepvoice3_pytorch: PyTorch implementation of convolutional networks-based text-to-speech synthesis models
  226. psmm: Implementation of the Pointer Sentinel Mixture Model, as described in the paper by Stephen Merity et al.
  227. tacotron2: Tacotron 2 – PyTorch implementation with faster-than-realtime inference.
  228. AccSGD: Implements pytorch code for the Accelerated SGD algorithm.
  229. QANet-pytorch: An implementation of QANet with PyTorch (EM/F1 = 70.5/77.2 after 20 epochs, about 20 hours on one 1080Ti card).
  230. ConvE: Convolutional 2D Knowledge Graph Embeddings
  231. Structured-Self-Attention: Implementation for the paper A Structured Self-Attentive Sentence Embedding, which is published in ICLR 2017: arxiv.org/abs/1703.03130 .
  232. graphsage-simple: Simple reference implementation of GraphSAGE.
  233. Detectron.pytorch: A pytorch implementation of Detectron. Both training from scratch and inferring directly from pretrained Detectron weights are available.
  234. R2Plus1D-PyTorch: PyTorch implementation of the R2Plus1D convolution based ResNet architecture described in the paper “A Closer Look at Spatiotemporal Convolutions for Action Recognition”
  235. StackNN: A PyTorch implementation of differentiable stacks for use in neural networks.
  236. translagent: Code for Emergent Translation in Multi-Agent Communication.
  237. ban-vqa: Bilinear attention networks for visual question answering.
  238. pytorch-openai-transformer-lm: This is a PyTorch implementation of the TensorFlow code provided with OpenAI’s paper “Improving Language Understanding by Generative Pre-Training” by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever.
  239. T2F: Text-to-Face generation using Deep Learning. This project combines two of the recent architectures StackGAN and ProGAN for synthesizing faces from textual descriptions.
  240. pytorch – fid: A Port of Fréchet Inception Distance (FID score) to PyTorch
  241. vae_vpflows: Code in PyTorch for the convex combination linear IAF and the Householder Flow, J.M. Tomczak & M. Welling jmtomczak.github.io/deebmed.html
  242. CoordConv-pytorch: Pytorch implementation of CoordConv introduced in ‘An intriguing failing of convolutional neural networks and the CoordConv solution’ paper. (arxiv.org/pdf/1807.03247.pdf)
  243. SDPoint: Implementation of “Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks”, published in CVPR 2018.
  244. SRDenseNet-pytorch: SRDenseNet-pytorch(ICCV_2017)
  245. GAN_stability: Code for paper “Which Training Methods for GANs do actually Converge? (ICML 2018)”
  246. Mask-RCNN: A PyTorch implementation of the architecture of Mask RCNN, serves as an introduction to working with PyTorch
  247. pytorch-coviar: Compressed Video Action Recognition
  248. PNASNet.pytorch: PyTorch implementation of PNASNet-5 on ImageNet.
  249. NALU-pytorch: Basic pytorch implementation of NAC/NALU from Neural Arithmetic Logic Units arxiv.org/pdf/1808.00508.pdf
  250. LOLA_DiCE: Pytorch implementation of LOLA (arxiv.org/abs/1709.04326) using DiCE (arxiv.org/abs/1802.05098)
  251. generative-query-network-pytorch: Generative Query Network (GQN) in PyTorch as described in “Neural Scene Representation and Rendering”
  252. pytorch_hmax: Implementation of the HMAX model of vision in PyTorch.
  253. FCN-pytorch-easiest: Aims to be the easiest, ready-to-use PyTorch implementation of FCN (Fully Convolutional Networks).
  254. transducer: A Fast Sequence Transducer Implementation with PyTorch Bindings.
  255. AVO-pytorch: Implementation of Adversarial Variational Optimization in PyTorch.
  256. HCN-pytorch: A pytorch reimplementation of { Co-occurrence Feature Learning from Skeleton Data for Action Recognition and Detection with Hierarchical Aggregation }.
  257. binary-wide-resnet: PyTorch implementation of Wide Residual Networks with 1-bit weights by McDonnel (ICLR 2018)
  258. piggyback: Code for Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights arxiv.org/abs/1801.06519
  259. vid2vid: Pytorch implementation of our method for high-resolution (e.g. 2048×1024) photorealistic video-to-video translation.
  260. poisson-convolution-sum: Implements an infinite sum of poisson-weighted convolutions
  261. tbd-nets: PyTorch implementation of “Transparency by Design: Closing the Gap Between Performance and Interpretability in Visual Reasoning” arxiv.org/abs/1803.05268
  262. attn2d: Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction
  263. yolov3: YOLOv3: Training and inference in PyTorch pjreddie.com/darknet/yolo
  264. deep-dream-in-pytorch: Pytorch implementation of the DeepDream computer vision algorithm.
  265. pytorch-flows: PyTorch implementations of algorithms for density estimation
  266. quantile-regression-dqn-pytorch: Quantile Regression DQN a Minimal Working Example
  267. relational-rnn-pytorch: An implementation of DeepMind’s Relational Recurrent Neural Networks in PyTorch.
  268. DEXTR-PyTorch: Deep Extreme Cut http://www.vision.ee.ethz.ch/~cvlsegmentation/dextr
  269. PyTorch_GBW_LM: PyTorch Language Model for Google Billion Word Dataset.
  270. Pytorch-NCE: The Noise Contrastive Estimation for softmax output written in Pytorch
  271. generative-models: Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN.
  272. convnet-aig: PyTorch implementation for Convolutional Networks with Adaptive Inference Graphs.
  273. integrated-gradient-pytorch: This is the pytorch implementation of the paper – Axiomatic Attribution for Deep Networks.
  274. MalConv-Pytorch: Pytorch implementation of MalConv.
  275. trellisnet: Trellis Networks for Sequence Modeling
  276. Learning to Communicate with Deep Multi-Agent Reinforcement Learning: pytorch implementation of Learning to Communicate with Deep Multi-Agent Reinforcement Learning paper.
  277. pnn.pytorch: PyTorch implementation of CVPR’18 – Perturbative Neural Networks http://xujuefei.com/pnn.html.
  278. Face_Attention_Network: Pytorch implementation of face attention network as described in Face Attention Network: An Effective Face Detector for the Occluded Faces.
  279. waveglow: A Flow-based Generative Network for Speech Synthesis.
  280. deepfloat: This repository contains the SystemVerilog RTL, C++, HLS (Intel FPGA OpenCL to wrap RTL code) and Python needed to reproduce the numerical results in “Rethinking floating point for deep learning”
  281. EPSR: Pytorch implementation of Analyzing Perception-Distortion Tradeoff using Enhanced Perceptual Super-resolution Network. This work has won the first place in PIRM2018-SR competition (region 1) held as part of the ECCV 2018.
  282. ClariNet: A Pytorch Implementation of ClariNet arxiv.org/abs/1807.07281
  283. pytorch-pretrained-BERT: PyTorch version of Google AI’s BERT model with script to load Google’s pre-trained models
  284. torch_waveglow: A PyTorch implementation of the WaveGlow: A Flow-based Generative Network for Speech Synthesis.
  285. 3DDFA: The pytorch improved re-implementation of TPAMI 2017 paper: Face Alignment in Full Pose Range: A 3D Total Solution.
  286. loss-landscape: Code for visualizing the loss landscape of neural nets.
  287. famos: Pytorch implementation of the paper “Copy the Old or Paint Anew? An Adversarial Framework for (non-) Parametric Image Stylization” available at http://arxiv.org/abs/1811.09236.
  288. back2future.pytorch: This is a Pytorch implementation of Janai, J., Güney, F., Ranjan, A., Black, M. and Geiger, A., Unsupervised Learning of Multi-Frame Optical Flow with Occlusions. ECCV 2018.
  289. FFTNet: Unofficial implementation of the FFTNet vocoder paper.
  290. FaceBoxes.PyTorch: A PyTorch Implementation of FaceBoxes.
  291. Transformer-XL: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. https://github.com/kimiyoung/transformer-xl
  292. associative_compression_networks: Associative Compression Networks for Representation Learning.
  293. fluidnet_cxx: FluidNet re-written with ATen tensor lib.
  294. Deep-Reinforcement-Learning-Algorithms-with-PyTorch: This repository contains PyTorch implementations of deep reinforcement learning algorithms.
  295. Shufflenet-v2-Pytorch: This is a Pytorch implementation of faceplusplus’s ShuffleNet-v2.
  296. GraphWaveletNeuralNetwork: This is a Pytorch implementation of Graph Wavelet Neural Network. ICLR 2019.
  297. AttentionWalk: This is a Pytorch implementation of Watch Your Step: Learning Node Embeddings via Graph Attention. NIPS 2018.
  298. SGCN: This is a Pytorch implementation of Signed Graph Convolutional Network. ICDM 2018.
  299. SINE: This is a Pytorch implementation of SINE: Scalable Incomplete Network Embedding. ICDM 2018.
  300. GAM: This is a Pytorch implementation of Graph Classification using Structural Attention. KDD 2018.
  301. neural-style-pt: A PyTorch implementation of Justin Johnson’s Neural-style.
  302. TuckER: TuckER: Tensor Factorization for Knowledge Graph Completion.
  303. pytorch-prunes: Pruning neural networks: is it time to nip it in the bud?
  304. SimGNN: SimGNN: A Neural Network Approach to Fast Graph Similarity Computation.
  305. Character CNN: PyTorch implementation of the Character-level Convolutional Networks for Text Classification paper.
  306. XLM: PyTorch original implementation of Cross-lingual Language Model Pretraining.
  307. DiffAI: A provable defense against adversarial examples and library for building compatible PyTorch models.
  308. APPNP: Combining Neural Networks with Personalized PageRank for Classification on Graphs. ICLR 2019.
  309. NGCN: A Higher-Order Graph Convolutional Layer. NeurIPS 2018.
  310. gpt-2-Pytorch: Simple Text-Generator with OpenAI gpt-2 Pytorch Implementation
  311. Splitter: Splitter: Learning Node Representations that Capture Multiple Social Contexts. (WWW 2019).
  312. CapsGNN: Capsule Graph Neural Network. (ICLR 2019).
  313. BigGAN-PyTorch: The author’s officially unofficial PyTorch BigGAN implementation.
  314. ppo_pytorch_cpp: This is an implementation of the proximal policy optimization algorithm for the C++ API of Pytorch.
  315. RandWireNN: Implementation of: “Exploring Randomly Wired Neural Networks for Image Recognition”.
  316. Zero-shot Intent CapsNet: GPU-accelerated PyTorch implementation of “Zero-shot User Intent Detection via Capsule Neural Networks”.
  317. SEAL-CI Semi-Supervised Graph Classification: A Hierarchical Graph Perspective. (WWW 2019).
  318. MixHop: MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing. ICML 2019.
  319. densebody_pytorch: PyTorch implementation of CloudWalk’s recent paper DenseBody.
  320. voicefilter: Unofficial PyTorch implementation of Google AI’s VoiceFilter system http://swpark.me/voicefilter.
  321. NVIDIA/semantic-segmentation: A PyTorch Implementation of Improving Semantic Segmentation via Video Propagation and Label Relaxation, In CVPR2019.
  322. ClusterGCN: A PyTorch implementation of “Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks” (KDD 2019).
  323. NVlabs/DG-Net: A PyTorch implementation of “Joint Discriminative and Generative Learning for Person Re-identification” (CVPR19 Oral).
  324. NCRF: Cancer metastasis detection with neural conditional random field (NCRF)
  325. pytorch-sift: PyTorch implementation of SIFT descriptor.
  326. brain-segmentation-pytorch: U-Net implementation in PyTorch for FLAIR abnormality segmentation in brain MRI.
  327. glow-pytorch: PyTorch implementation of Glow, Generative Flow with Invertible 1×1 Convolutions (arxiv.org/abs/1807.03039)
  328. EfficientNets-PyTorch: A PyTorch implementation of EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks.
  329. STEAL: STEAL – Learning Semantic Boundaries from Noisy Annotations nv-tlabs.github.io/STEAL
  330. EigenDamage-Pytorch: Official implementation of the ICML’19 paper “EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis”.
  331. Aspect-level-sentiment: Code and dataset for ACL2018 paper “Exploiting Document Knowledge for Aspect-level Sentiment Classification”
  332. breast_cancer_classifier: Deep Neural Networks Improve Radiologists’ Performance in Breast Cancer Screening arxiv.org/abs/1903.08297
  333. DGC-Net: A PyTorch implementation of “DGC-Net: Dense Geometric Correspondence Network”.
  334. universal-triggers: Universal Adversarial Triggers for Attacking and Analyzing NLP (EMNLP 2019)
  335. Deep-Reinforcement-Learning-Algorithms-with-PyTorch: PyTorch implementations of deep reinforcement learning algorithms and environments.
  336. simple-effective-text-matching-pytorch: A pytorch implementation of the ACL2019 paper “Simple and Effective Text Matching with Richer Alignment Features”.
  337. Adaptive-segmentation-mask-attack (ASMA): A pytorch implementation of the MICCAI2019 paper “Impact of Adversarial Examples on Deep Learning Models for Biomedical Image Segmentation”.
  338. NVIDIA/unsupervised-video-interpolation: A PyTorch Implementation of Unsupervised Video Interpolation Using Cycle Consistency, In ICCV 2019.
  339. Seg-Uncertainty: Unsupervised Scene Adaptation with Memory Regularization in vivo, In IJCAI 2020.
  340. pulse: Self-Supervised Photo Upsampling via Latent Space Exploration of Generative Models
  341. distance-encoding: Distance Encoding – Design Provably More Powerful GNNs for Structural Representation Learning.

Talks & conferences

  1. PyTorch Conference 2018: First PyTorch developer conference, held in 2018.

Pytorch elsewhere

  1. the-incredible-pytorch: The Incredible PyTorch: a curated list of tutorials, papers, projects, communities and more relating to PyTorch.
  2. generative models: Collection of generative models, e.g. GAN, VAE in Tensorflow, Keras, and Pytorch. http://wiseodd.github.io
  3. pytorch vs tensorflow: an informative thread on reddit.
  4. Pytorch discussion forum
  5. pytorch notebook: docker-stack: A project similar to Jupyter Notebook Scientific Python Stack
  6. drawlikebobross: Draw like Bob Ross using the power of Neural Networks (With PyTorch)!
  7. pytorch-tvmisc: Totally Versatile Miscellanea for Pytorch
  8. pytorch-a3c-mujoco: Implement A3C for Mujoco gym envs.
  9. PyTorch in 5 Minutes.
  10. pytorch_chatbot: A Marvelous ChatBot implemented using PyTorch.
  11. malmo-challenge: Malmo Collaborative AI Challenge – Team Pig Catcher
  12. sketchnet: A model that takes an image and generates Processing source code to regenerate that image
  13. Deep-Learning-Boot-Camp: A nonprofit, community-run 5-day Deep Learning Bootcamp http://deep-ml.com.
  14. Amazon_Forest_Computer_Vision: Satellite Image tagging code using PyTorch / Keras with lots of PyTorch tricks. kaggle competition.
  15. AlphaZero_Gomoku: An implementation of the AlphaZero algorithm for Gomoku (also called Gobang or Five in a Row)
  16. pytorch-cv: Repo for Object Detection, Segmentation & Pose Estimation.
  17. deep-person-reid: Pytorch implementation of deep person re-identification approaches.
  18. pytorch-template: PyTorch template project
  19. Deep Learning With PyTorch Textbook: A practical guide to building neural network models in text and vision using PyTorch. Purchase on Amazon; GitHub code repo.
  20. compare-tensorflow-pytorch: Compare outputs between layers written in Tensorflow and layers written in Pytorch.
  21. hasktorch: Tensors and neural networks in Haskell
  22. Deep Learning With Pytorch Deep Learning with PyTorch teaches you how to implement deep learning algorithms with Python and PyTorch.
  23. nimtorch: PyTorch – Python + Nim
  24. derplearning: Self Driving RC Car Code.
  25. pytorch-saltnet: Kaggle | 9th place single model solution for TGS Salt Identification Challenge.
  26. pytorch-scripts: A few Windows specific scripts for PyTorch.
  27. pytorch_misc: Code snippets created for the PyTorch discussion board.
  28. awesome-pytorch-scholarship: A list of awesome PyTorch scholarship articles, guides, blogs, courses and other resources.
  29. MentisOculi: A raytracer written in PyTorch (raynet?)
  30. DoodleMaster: “Don’t code your UI, Draw it !”
  31. ocaml-torch: OCaml bindings for PyTorch.
  32. extension-script: Example repository for custom C++/CUDA operators for TorchScript.
  33. pytorch-inference: PyTorch 1.0 inference in C++ on Windows10 platforms.
  34. pytorch-cpp-inference: Serving PyTorch 1.0 Models as a Web Server in C++.
  35. tch-rs: Rust bindings for PyTorch.
  36. TorchSharp: .NET bindings for the Pytorch engine
  37. ML Workspace: All-in-one web IDE for machine learning and data science. Combines Jupyter, VS Code, PyTorch, and many other tools/libraries into one Docker image.
  38. PyTorch Style Guide Style guide for PyTorch code. Consistent and good code style helps collaboration and prevents errors!


PyTorch – convolutional neural networks

Lecture: Convolutional neural networks

torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, padding_mode=’zeros’)
Parameters
• in_channels (int) – Number of channels in the input image
• out_channels (int) – Number of channels produced by the convolution
• kernel_size (int or tuple) – Size of the convolving kernel
• stride (int or tuple, optional) – Stride of the convolution. (Default: 1)
• padding (int or tuple, optional) – Zero-padding added to both sides of the input (Default: 0)
• padding_mode (string, optional) – ‘zeros’, ‘reflect’, ‘replicate’ or ‘circular’. (Default: ‘zeros’)
• dilation (int or tuple, optional) – Spacing between kernel elements. (Default: 1)
• groups (int, optional) – Number of blocked connections from input to output channels. (Default: 1)
• bias (bool, optional) – If True, adds a learnable bias to the output. (Default: True)
And this URL has a helpful visualization of the process.
So in_channels at the start is 3 for images with 3 channels (color images); for black-and-white images it should be 1, and some satellite images have 4.
out_channels is the number of channels the convolution produces, i.e. the number of filters.
Let’s create an example to “prove” that.
import torch
import torch.nn as nn

c = nn.Conv2d(1, 3, stride=1, kernel_size=(4, 5))  # 1 input channel, 3 filters of size 4x5
print(c.weight.shape)
print(c.weight)
Out
torch.Size([3, 1, 4, 5])
Parameter containing:
tensor([[[[ 0.1571, 0.0723, 0.0900, 0.1573, 0.0537],
[-0.1213, 0.0579, 0.0009, -0.1750, 0.1616],
[-0.0427, 0.1968, 0.1861, -0.1787, -0.2035],
[-0.0796, 0.1741, -0.2231, 0.2020, -0.1762]]],

[[[ 0.1811, 0.0660, 0.1653, 0.0605, 0.0417],
[ 0.1885, -0.0440, -0.1638, 0.1429, -0.0606],
[-0.1395, -0.1202, 0.0498, 0.0432, -0.1132],
[-0.2073, 0.1480, -0.1296, -0.1661, -0.0633]]],

[[[ 0.0435, -0.2017, 0.0676, -0.0711, -0.1972],
[ 0.0968, -0.1157, 0.1012, 0.0863, -0.1844],
[-0.2080, -0.1355, -0.1842, -0.0017, -0.2123],
[-0.1495, -0.2196, 0.1811, 0.1672, -0.1817]]]], requires_grad=True)
If we alter the number of out_channels,
c = nn.Conv2d(1, 5, stride=1, kernel_size=(4, 5))
print(c.weight.shape)  # torch.Size([5, 1, 4, 5])
we get 5 filters, each of size 4×5, since that is our kernel size. If we set in_channels to 2 (some images may have only 2 channels),
c = nn.Conv2d(2, 5, stride=1, kernel_size=(4, 5))
print(c.weight.shape)  # torch.Size([5, 2, 4, 5])
each filter will have 2 channels.
I think they take their terminology from this book, and since it doesn’t call them filters, the docs don’t use that term either.
So you are right: the filters are what the conv layer learns, and the number of filters equals the number of out channels. They are initialized randomly.
The number of activations is computed from the batch size and the image dimensions:
import torch
import torch.nn as nn

bs = 16
x = torch.randn(bs, 3, 28, 28)  # batch of 16 RGB 28x28 images
c = nn.Conv2d(3, 10, kernel_size=5, stride=1, padding=2)  # padding=2 keeps the 28x28 spatial size
out = c(x)
print(out.nelement())  # 125440 activations = 16 * 10 * 28 * 28

https://www.programcreek.com/python/example/107691/torch.nn.Conv2d

CONV2D
CLASS torch.nn.Conv2d(in_channels: int, out_channels: int, kernel_size: Union[T, Tuple[T, T]], stride: Union[T, Tuple[T, T]] = 1, padding: Union[T, Tuple[T, T]] = 0, dilation: Union[T, Tuple[T, T]] = 1, groups: int = 1, bias: bool = True, padding_mode: str = ‘zeros’)
Applies a 2D convolution over an input signal composed of several input planes.
In the simplest case, the output value of the layer with input size $(N, C_{\text{in}}, H, W)$ and output $(N, C_{\text{out}}, H_{\text{out}}, W_{\text{out}})$ can be precisely described as:

$$\text{out}(N_i, C_{\text{out}_j}) = \text{bias}(C_{\text{out}_j}) + \sum_{k=0}^{C_{\text{in}}-1} \text{weight}(C_{\text{out}_j}, k) \star \text{input}(N_i, k)$$

where $\star$ is the valid 2D cross-correlation operator, $N$ is a batch size, $C$ denotes a number of channels, $H$ is a height of input planes in pixels, and $W$ is width in pixels.
• stride controls the stride for the cross-correlation, a single number or a tuple.
• padding controls the amount of implicit zero-paddings on both sides for padding number of points for each dimension.
• dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but this link has a nice visualization of what dilation does.
• groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example,
o At groups=1, all inputs are convolved to all outputs.
o At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels, and producing half the output channels, and both subsequently concatenated.
o At groups = in_channels, each input channel is convolved with its own set of filters, of size $\left\lfloor \frac{\text{out\_channels}}{\text{in\_channels}} \right\rfloor$ (see the sketch after this list).
The parameters kernel_size, stride, padding, dilation can either be:
• a single int – in which case the same value is used for the height and width dimension
• a tuple of two ints – in which case, the first int is used for the height dimension, and the second int for the width dimension
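To make the groups behavior concrete, here is a minimal sketch (the channel counts are arbitrary example values, not taken from the docs) comparing the weight shapes at groups=1 and groups=2:
import torch
import torch.nn as nn

# groups=1: every filter spans all 8 input channels
c1 = nn.Conv2d(8, 4, kernel_size=3, groups=1)
print(c1.weight.shape)  # torch.Size([4, 8, 3, 3])

# groups=2: two convolutions side by side, each seeing 4 of the 8 input channels
c2 = nn.Conv2d(8, 4, kernel_size=3, groups=2)
print(c2.weight.shape)  # torch.Size([4, 4, 3, 3])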
NOTE
Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, not a full cross-correlation. It is up to the user to add proper padding.
NOTE
When groups == in_channels and out_channels == K * in_channels, where K is a positive integer, this operation is also termed in literature as depthwise convolution.
In other words, for an input of size $(N, C_{\text{in}}, H_{\text{in}}, W_{\text{in}})$, a depthwise convolution with a depthwise multiplier K can be constructed by the arguments $(\text{in\_channels} = C_{\text{in}}, \text{out\_channels} = C_{\text{in}} \times K, \ldots, \text{groups} = C_{\text{in}})$.
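A minimal sketch of the depthwise case (C_in and K are arbitrary example values chosen here for illustration):
import torch
import torch.nn as nn

C_in, K = 3, 4
# depthwise convolution: groups == in_channels, out_channels == K * in_channels
dw = nn.Conv2d(in_channels=C_in, out_channels=C_in * K, kernel_size=3,
               padding=1, groups=C_in)
print(dw.weight.shape)  # torch.Size([12, 1, 3, 3]) – each filter sees a single input channel
x = torch.randn(1, C_in, 8, 8)
print(dw(x).shape)      # torch.Size([1, 12, 8, 8])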
NOTE
In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting torch.backends.cudnn.deterministic = True. Please see the notes on Reproducibility for background.
Parameters
• in_channels (int) – Number of channels in the input image
• out_channels (int) – Number of channels produced by the convolution
• kernel_size (int or tuple) – Size of the convolving kernel
• stride (int or tuple, optional) – Stride of the convolution. Default: 1
• padding (int or tuple, optional) – Zero-padding added to both sides of the input. Default: 0
• padding_mode (string, optional) – ‘zeros’, ‘reflect’, ‘replicate’ or ‘circular’. Default: ‘zeros’
• dilation (int or tuple, optional) – Spacing between kernel elements. Default: 1
• groups (int, optional) – Number of blocked connections from input channels to output channels. Default: 1
• bias (bool, optional) – If True, adds a learnable bias to the output. Default: True
Shape:
• Input: $(N, C_{\text{in}}, H_{\text{in}}, W_{\text{in}})$
• Output: $(N, C_{\text{out}}, H_{\text{out}}, W_{\text{out}})$, where

$$H_{\text{out}} = \left\lfloor \frac{H_{\text{in}} + 2 \times \text{padding}[0] - \text{dilation}[0] \times (\text{kernel\_size}[0] - 1) - 1}{\text{stride}[0]} + 1 \right\rfloor$$

$$W_{\text{out}} = \left\lfloor \frac{W_{\text{in}} + 2 \times \text{padding}[1] - \text{dilation}[1] \times (\text{kernel\_size}[1] - 1) - 1}{\text{stride}[1]} + 1 \right\rfloor$$
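These formulas can be checked directly. The sketch below (reusing the layer from the Examples further down; the helper name conv_out is ours) evaluates them in plain Python and compares against an actual forward pass:
import torch
import torch.nn as nn

def conv_out(size, padding, dilation, kernel, stride):
    # floor((size + 2*padding - dilation*(kernel-1) - 1) / stride) + 1
    return (size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2), dilation=(3, 1))
x = torch.randn(20, 16, 50, 100)
print(conv_out(50, 4, 3, 3, 2))   # 26  (H_out)
print(conv_out(100, 2, 1, 5, 1))  # 100 (W_out)
print(m(x).shape)                 # torch.Size([20, 33, 26, 100])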
Variables
• ~Conv2d.weight (Tensor) – the learnable weights of the module of shape $(\text{out\_channels}, \frac{\text{in\_channels}}{\text{groups}}, \text{kernel\_size}[0], \text{kernel\_size}[1])$. The values of these weights are sampled from $\mathcal{U}(-\sqrt{k}, \sqrt{k})$, where $k = \frac{\text{groups}}{C_{\text{in}} \cdot \prod_{i=0}^{1} \text{kernel\_size}[i]}$
• ~Conv2d.bias (Tensor) – the learnable bias of the module of shape (out_channels). If bias is True, then the values of these weights are sampled from $\mathcal{U}(-\sqrt{k}, \sqrt{k})$, where $k = \frac{\text{groups}}{C_{\text{in}} \cdot \prod_{i=0}^{1} \text{kernel\_size}[i]}$
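A quick sketch verifying the stated initialization bound (layer sizes reused from the Examples below; with groups=1, $k = 1 / (C_{\text{in}} \cdot \text{kernel\_size}[0] \cdot \text{kernel\_size}[1])$):
import math
import torch
import torch.nn as nn

c = nn.Conv2d(16, 33, kernel_size=(3, 5))
bound = math.sqrt(1 / (16 * 3 * 5))         # sqrt(k) with groups=1
print(bool(c.weight.abs().max() <= bound))  # True – weights drawn from U(-sqrt(k), sqrt(k))
print(bool(c.bias.abs().max() <= bound))    # True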
Examples
>>> # With square kernels and equal stride
>>> m = nn.Conv2d(16, 33, 3, stride=2)
>>> # non-square kernels and unequal stride and with padding
>>> m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2))
>>> # non-square kernels and unequal stride and with padding and dilation
>>> m = nn.Conv2d(16, 33, (3, 5), stride=(2, 1), padding=(4, 2), dilation=(3, 1))
>>> input = torch.randn(20, 16, 50, 100)
>>> output = m(input)

By the end of 2020, the PyTorch machine learning library will have more developers working on it than TensorFlow. PyTorch was created by Facebook, TensorFlow by Google; both are open source. TensorFlow is considered the de facto standard, having appeared earlier than PyTorch, but according to OpenHub, over the past year both projects had roughly the same number of active developers. The TensorFlow user community is still much larger than PyTorch’s, yet in the research community Facebook’s library has pulled ahead and is now used far more widely. A cited advantage of PyTorch is that it is a native library for Python, currently the most widely used programming language for machine learning tasks, whereas TensorFlow is exposed to Python through a dedicated API. In addition, PyTorch uses a dynamic graph model, which simplifies programming, although TensorFlow has offered a similar feature since version 2.0.

Deep Learning (with PyTorch)

Week 1 – Lecture: History, motivation, and evolution of Deep Learning

Week 1 – Practicum: Classification, linear algebra, and visualisation

Week 2 – Lecture: Stochastic gradient descent and backpropagation

Week 3 – Lecture: Convolutional neural networks