
GradType: a subset of dtype that is differentiable, containing all float and complex dtypes #128793

Open
dvorst opened this issue Jun 16, 2024 · 2 comments
Labels
module: autograd - Related to torch.autograd, and the autograd engine in general
module: docs - Related to our documentation, both in docs/ and docblocks
triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@dvorst
Contributor

dvorst commented Jun 16, 2024

🚀 The feature, motivation and pitch

In the documentation, dtype is often used as a type hint where in fact only differentiable dtypes (float and complex) are allowed. Hence this request to add GradType, a subset of dtype covering all float and complex dtypes, i.e. the dtypes that can carry a gradient. This would let type hints indicate more precisely what is allowed.

I would like to implement this feature myself if accepted.
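
A minimal sketch of what this could look like, assuming a runtime helper rather than a static-typing construct; the names GradType and is_grad_dtype are illustrative, not existing torch API:

```python
import torch

# Hypothetical names: GradType and is_grad_dtype are illustrative only.
# torch.dtype already exposes is_floating_point and is_complex, which
# together identify exactly the dtypes that can require gradients.
GradType = torch.dtype  # static alias; differentiability is checked at runtime

def is_grad_dtype(dtype: torch.dtype) -> bool:
    """True if tensors of this dtype can have requires_grad=True."""
    return dtype.is_floating_point or dtype.is_complex

print(is_grad_dtype(torch.float32))    # True
print(is_grad_dtype(torch.complex64))  # True
print(is_grad_dtype(torch.int64))      # False
```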

Alternatives

No response

Additional context

No response

cc @svekars @brycebortree @ezyang @albanD @gqchen @pearu @nikitaved @soulitzer @Varal7

@mikaylagawarecki added the triage review, topic: docs, module: docs, and module: autograd labels and removed the topic: docs label on Jun 17, 2024
@mikaylagawarecki
Contributor

Do you have specific examples of functions that are type-hinted with dtype that would benefit from this?

@mikaylagawarecki added the triaged label and removed the triage review label on Jun 17, 2024
@dvorst
Contributor Author

dvorst commented Jun 17, 2024

I suppose everything for which a gradient is calculated during backprop, such as torch.nn.Linear.
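
For instance, torch.nn.Linear accepts a dtype keyword argument hinted as dtype, yet only float and complex dtypes actually work there, since the layer's parameters must be able to require gradients. A quick illustration (the exact error message may vary across versions):

```python
import torch
import torch.nn as nn

# A float dtype works: parameters can require gradients.
layer = nn.Linear(4, 2, dtype=torch.float64)
print(layer.weight.dtype)  # torch.float64

# An integer dtype fails, because parameters default to requires_grad=True
# and only floating point and complex tensors can require gradients.
try:
    nn.Linear(4, 2, dtype=torch.int64)
except RuntimeError as e:
    print(e)
```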
