Adding requires_grad flag to Value class #12

Open
rhalbersma wants to merge 10 commits into master

Conversation

rhalbersma
I've added a requires_grad flag to the Value class that defaults to False, mimicking PyTorch's behavior.

The topological sort of the computational graph can then stop expanding nodes that don't require gradient propagation. For example, in an expression such as x + 2, the scalar 2 is still wrapped in Value(2), but the backward pass no longer propagates a gradient into that node.
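
A rough usage sketch of that behavior (assuming the constructor keyword proposed in this PR; the import path just follows the repo layout):

```python
from micrograd.engine import Value

x = Value(3.0, requires_grad=True)
y = x + 2            # the scalar 2 is wrapped as Value(2) with requires_grad=False
y.backward()

print(x.grad)        # 1.0 -- the gradient still reaches x
# the wrapped constant is never expanded during the backward pass,
# so no gradient is computed or stored for it
```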

I've also incorporated these changes into the neural network where all parameters receive requires_grad=True. All unit tests continue to pass and the demo notebook finds the exact same loss after 99 iterations.
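
A hypothetical sketch of the nn-side change (the class name follows micrograd's nn.py; the requires_grad keyword is the one this PR proposes, not the upstream API):

```python
import random
from micrograd.engine import Value

class Neuron:
    def __init__(self, nin):
        # only the trainable parameters opt in to gradient tracking;
        # every other node defaults to requires_grad=False
        self.w = [Value(random.uniform(-1, 1), requires_grad=True) for _ in range(nin)]
        self.b = Value(0.0, requires_grad=True)

    def parameters(self):
        return self.w + [self.b]
```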

Note that I also did some other small refactorings, such as separating local gradient computation from gradient propagation, which made it easier to propagate the gradient selectively based on the requires_grad flags. I've also renamed the internal _backward attribute to a constructor argument, gradient_fn.
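
A condensed sketch of how those pieces could fit together, based on my reading of this description rather than the actual diff (only addition is shown):

```python
class Value:
    def __init__(self, data, children=(), gradient_fn=None, requires_grad=False):
        self.data = data
        self.grad = 0.0
        self.requires_grad = requires_grad
        self._prev = set(children)
        # the local gradient rule is supplied by the op that created this node
        self.gradient_fn = gradient_fn

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out_requires_grad = self.requires_grad or other.requires_grad

        def gradient_fn(out):
            # local gradients of addition: d(out)/d(self) = d(out)/d(other) = 1
            if self.requires_grad:
                self.grad += out.grad
            if other.requires_grad:
                other.grad += out.grad

        return Value(self.data + other.data, (self, other),
                     gradient_fn if out_requires_grad else None,
                     requires_grad=out_requires_grad)

    def backward(self):
        # topological sort that only expands nodes requiring gradients
        topo, visited = [], set()
        def build(v):
            if v not in visited and v.requires_grad:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0
        for v in reversed(topo):
            if v.gradient_fn is not None:
                v.gradient_fn(v)   # propagation is separated from the local rule
```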

All in all, the engine.py module was reduced by 15 lines, and weighs in at less than 80 lines now.

@rhalbersma rhalbersma changed the title Add requires_grad flag to Value class Adding requires_grad flag to Value class Dec 18, 2020
@rhalbersma
Author

Any update on this? Is this repo not taking PRs, or should I polish the PR a bit more?

@fcakyon

fcakyon commented Apr 10, 2021

@rhalbersma unfortunately this repo doesn't take PRs..
