Deep learning in Rust, with shape-checked tensors and neural networks (Rust; updated Jul 23, 2024)
Automatic differentiation made easier for C++
Tensors and dynamic neural networks in pure Rust.
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
Drop-in autodiff for NumPy.
FastAD is a C++ implementation of automatic differentiation supporting both forward and reverse mode.
Differentiate python calls from Julia
Fazang is a Fortran library for reverse-mode automatic differentiation, inspired by Stan/Math library.
[WIP] Lightweight automatic differentiation & deep learning framework implemented in pure Julia.
A toy deep learning framework implemented from scratch in pure NumPy. Aka a homemade PyTorch.
Yaae: Yet another autodiff engine (written in NumPy).
XLuminA, a highly efficient, auto-differentiating discovery framework for super-resolution microscopy.
Forward mode automatic differentiation for Fortran
A minimalist neural networks library built on a tiny autograd engine
Algorithmic differentiation with hyper-dual numbers in C++ and Python
JAX tutorial notebooks: basics, crash & tips, usage of optax/JaxOptim/Numpyro
Chemical Explosive Mode Analysis for computational/experimental combustion diagnostics using Julia SciML features
Scala embedded universal probabilistic programming language
C++20 numerical and analytical derivative computations
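Several of the libraries listed above (the Fortran forward-mode library, the hyper-dual-number package, the tiny autograd engines) are built on the same core idea: forward-mode automatic differentiation via dual numbers, where each value carries its derivative and arithmetic operators propagate both. The sketch below is a minimal, self-contained illustration of that technique in Python; it is not taken from any of the listed projects, and the `Dual` class and `derivative` helper are names chosen here for illustration.

```python
# Minimal forward-mode autodiff with dual numbers (illustrative sketch,
# not the implementation of any library listed above).
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # primal (function) value
        self.der = der  # tangent (derivative) value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and df/dx at x by seeding the tangent with 1."""
    out = f(Dual(x, 1.0))
    return out.val, out.der

# f(x) = 3x^2 + 2x, so f(4) = 56 and f'(4) = 26
val, der = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
```

Reverse-mode engines (such as the Stan-inspired Fortran library above) instead record the computation and propagate derivatives backward, which is more efficient when a function has many inputs and few outputs.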