Introduction
This autograd engine provides a framework for automatic differentiation and backpropagation. It is designed to be efficient and flexible, and its main goal is to make the internal concepts underlying deep learning easy to understand.
Key Features
- Automatic Differentiation: The engine supports automatic differentiation, allowing you to compute gradients of complex functions with ease.
- Performance: The engine is written with efficiency in mind.
- Flexibility: The engine is designed to be modular and extensible, allowing developers to easily add new features and integrations.
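To make the automatic-differentiation feature concrete, here is a minimal sketch of scalar reverse-mode autograd in the spirit of such an engine. All names (`Value`, `make`, `add`, `mul`, `backward`) are illustrative assumptions, not the engine's actual API.

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <memory>
#include <vector>

// A scalar node in the computation graph: holds a value, its gradient,
// the nodes it was computed from, and a closure that applies the chain rule.
struct Value {
    double data;
    double grad = 0.0;
    std::function<void()> backward_fn = [] {};
    std::vector<std::shared_ptr<Value>> parents;
    explicit Value(double d) : data(d) {}
};

using V = std::shared_ptr<Value>;

V make(double d) { return std::make_shared<Value>(d); }

V add(V a, V b) {
    V out = make(a->data + b->data);
    out->parents = {a, b};
    out->backward_fn = [a, b, o = out.get()] {
        a->grad += o->grad;  // d(a+b)/da = 1
        b->grad += o->grad;  // d(a+b)/db = 1
    };
    return out;
}

V mul(V a, V b) {
    V out = make(a->data * b->data);
    out->parents = {a, b};
    out->backward_fn = [a, b, o = out.get()] {
        a->grad += b->data * o->grad;  // d(a*b)/da = b
        b->grad += a->data * o->grad;  // d(a*b)/db = a
    };
    return out;
}

// Backpropagation: topologically sort the graph, seed the root gradient
// with 1, then apply each node's chain-rule closure in reverse order.
void backward(V root) {
    std::vector<Value*> topo, visited;
    std::function<void(Value*)> dfs = [&](Value* v) {
        for (Value* seen : visited) if (seen == v) return;
        visited.push_back(v);
        for (auto& p : v->parents) dfs(p.get());
        topo.push_back(v);
    };
    dfs(root.get());
    root->grad = 1.0;
    for (auto it = topo.rbegin(); it != topo.rend(); ++it) (*it)->backward_fn();
}
```

For example, for `y = a*b + a` with `a = 2` and `b = 3`, calling `backward(y)` yields `a->grad == 4` (that is, b + 1) and `b->grad == 2` (that is, a).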
Usage
Building the Engine
To build the engine, run the following in the root directory:

```
bash build.sh
```

This compiles the source code into a shared library that other projects can link against.
Unit tests
To run the unit tests, build the project first and then run the following from the root directory:

```
./build/bin/micro_torch_unit_tests.exe
```
The unit tests cover:
- Basic operations (+, -, *, /, …).
- Automatic differentiation.
- Simple networks such as NOT, AND, and OR gates.
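A common way to test automatic differentiation, as the list above suggests, is to compare the engine's gradients against a central-difference approximation. The helper below is a hypothetical illustration of that check, not a function from the actual test suite.

```cpp
#include <cassert>
#include <cmath>
#include <functional>

// Central-difference approximation of df/dx at x.
// A unit test can compare the autograd result against this value
// within a small tolerance.
double numerical_derivative(const std::function<double(double)>& f,
                            double x, double h = 1e-5) {
    return (f(x + h) - f(x - h)) / (2.0 * h);
}
```

For f(x) = x² + 3x, the true derivative at x = 2 is 2x + 3 = 7, and the approximation agrees to within the step-size error.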
Using the Engine
Here is a simple network that learns a NOT gate:
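The original example appears to be missing here, so the following is a self-contained sketch of the idea: a single sigmoid neuron trained by gradient descent to map 0 to 1 and 1 to 0. It writes out the chain rule by hand rather than calling the engine's actual API, and all names (`NotGate`, `forward`, `train`) are illustrative assumptions.

```cpp
#include <cassert>
#include <cmath>

// A one-neuron "network": y = sigmoid(w*x + b), trained so that
// y(0) ≈ 1 and y(1) ≈ 0, i.e. a NOT gate.
struct NotGate {
    double w = 0.0, b = 0.0;

    static double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

    double forward(double x) const { return sigmoid(w * x + b); }

    void train(int epochs = 5000, double lr = 0.5) {
        const double xs[2] = {0.0, 1.0};  // inputs
        const double ys[2] = {1.0, 0.0};  // NOT-gate targets
        for (int e = 0; e < epochs; ++e) {
            for (int i = 0; i < 2; ++i) {
                double y = forward(xs[i]);
                // Chain rule for L = (y - t)^2 with y = sigmoid(z):
                // dL/dz = 2*(y - t) * y * (1 - y)
                double dz = 2.0 * (y - ys[i]) * y * (1.0 - y);
                w -= lr * dz * xs[i];  // dz/dw = x
                b -= lr * dz;          // dz/db = 1
            }
        }
    }
};
```

After training, `forward(0.0)` is close to 1 and `forward(1.0)` is close to 0; with the engine, the manual `dz` computation would instead be produced by backpropagation.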