Zygote + ForwardDiff support for complex differentiation #977
avik-pal commented: This is Zygote trying to use ForwardDiff for differentiating the broadcasting of complex numbers, which won't work nicely. Can you add a …
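For context, a minimal sketch of the limitation being described: ForwardDiff's dual numbers are subtypes of `Real`, so they cannot carry complex values. The snippet below is illustrative only and is not the exact code path inside Zygote's broadcasting rule.

```julia
using ForwardDiff

# Works: ForwardDiff differentiates a real-to-real function.
ForwardDiff.derivative(abs2, 1.0)            # 2.0

# Fails: ForwardDiff.derivative requires a Real input, and
# Dual <: Real cannot represent a complex perturbation.
# ForwardDiff.derivative(abs2, 1.0 + 2.0im)  # MethodError
```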
avik-pal changed the title from "Lux support for complex differentiation" to "Zygote + ForwardDiff support for complex differentiation" on Oct 16, 2024.
Original issue description:

Hi! In my efforts to assess different modes of nested differentiation, I am interested in the combination of complex-step differentiation with reverse-mode AD. I have the following example of a small neural network to which I pass complex numbers; computing the gradient in reverse mode raises an error.
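The code of the original example was not captured in this extract. Below is a hypothetical minimal reproduction, assuming a one-layer Lux `Dense` model and a real-valued loss of the complex output; the layer sizes, activation, and data are placeholders, not the reporter's actual code.

```julia
using Lux, Random, Zygote

rng = Random.default_rng()
model = Dense(2 => 2, tanh)          # placeholder architecture
ps, st = Lux.setup(rng, model)

x = randn(rng, ComplexF32, 2, 4)     # complex-valued inputs

# Real-valued loss of the complex network output, so the reverse-mode
# gradient with respect to the (real) parameters is well defined.
loss(ps) = sum(abs2, first(model(x, ps, st)))

# On affected versions this reverse-mode call reportedly errors when
# Zygote's broadcasting rule pushes complex values through ForwardDiff.
grad = Zygote.gradient(loss, ps)
```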
I am posting this issue here because this used to work with previous versions of Lux, so I am wondering whether recent changes have broken this kind of use.

@avik-pal, something like this is what I was using before the new nested AD feature in Lux, so I would like all the differentiation modes working with the same architecture and Lux version, since I am hoping to compare them. Happy to provide more information!
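Since the report contrasts complex-step differentiation with reverse-mode AD, here is a minimal sketch of the plain complex-step rule for a scalar function (illustrative, not taken from the report); it is valid only where `f` is real-analytic.

```julia
# For analytic f and tiny h, f(x + im*h) ≈ f(x) + im*h*f'(x),
# so imag(f(x + im*h))/h ≈ f'(x) with no subtractive cancellation.
complex_step(f, x; h = 1e-20) = imag(f(x + im * h)) / h

complex_step(sin, 1.0)   # ≈ cos(1.0) ≈ 0.5403
```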