Conversation
* With this extra linear module, we now also check that the gradients wrt the args that require grad are correctly backpropagated.
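A framework-agnostic sketch of what such a gradient check does (all names here are hypothetical; the actual test presumably relies on torch autograd rather than finite differences): compare the analytic gradient with respect to an argument that requires grad against a central-difference estimate.

```python
def loss(w, x):
    # A tiny stand-in for a linear module: scalar weight times input, squared.
    return (w * x) ** 2

def analytic_grad_w(w, x):
    # d/dw (w*x)^2 = 2 * (w*x) * x
    return 2 * (w * x) * x

def numeric_grad_w(w, x, eps=1e-6):
    # Central finite difference wrt w.
    return (loss(w + eps, x) - loss(w - eps, x)) / (2 * eps)

w, x = 0.7, 1.3
assert abs(analytic_grad_w(w, x) - numeric_grad_w(w, x)) < 1e-4
```

If the hooking machinery mishandled an argument, the backpropagated gradient would diverge from the numeric estimate, which is exactly what this kind of check is meant to catch.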
Codecov Report: ✅ All modified and coverable lines are covered by tests.
PierreQuinton left a comment:
It's not clear to me that we actually use non-empty kwargs in the test, but I trust you checked. LGTM
We do it in:

```python
def forward(self, input: Tensor) -> Tensor:
    return self.with_string_arg(s="two", input=input)
```

=> The hooked module is called with non-empty kwargs, and those are now correctly passed to the hook. I'm going to add a few more architectures to test this more heavily.
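A minimal pure-Python sketch of the behavior under test (hypothetical names; the real code hooks `torch.nn.Module` calls): keyword arguments given to the wrapped forward must reach every registered hook unchanged.

```python
class HookedModule:
    """Toy wrapper that forwards kwargs both to the function and to its hooks."""

    def __init__(self, forward_fn):
        self.forward_fn = forward_fn
        self.hooks = []

    def register_hook(self, hook):
        self.hooks.append(hook)

    def __call__(self, *args, **kwargs):
        output = self.forward_fn(*args, **kwargs)
        for hook in self.hooks:
            # The fix being tested: kwargs reach the hook intact.
            hook(args, kwargs, output)
        return output

seen = {}
m = HookedModule(lambda input, s="one": f"{s}:{input}")
m.register_hook(lambda args, kwargs, out: seen.update(kwargs))
assert m(input=3, s="two") == "two:3"
assert seen == {"input": 3, "s": "two"}
```

Before the fix, a wrapper like this would effectively call `hook(args, {}, output)`, dropping the keyword arguments on the floor.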
Force-pushed from 09d6daa to 2ec986a
This adds support for modules that have keyword arguments passed to their `forward` method. Seems to work on a simple example, but needs heavier testing.
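A runnable, torch-free sketch of the pattern the description refers to (names are hypothetical, modeled on the test quoted above): a parent module whose `forward` passes a fixed keyword argument to a wrapped submodule, which is the case the new kwargs support has to handle.

```python
class WithStringArg:
    """Toy submodule whose forward takes a string keyword argument."""

    def forward(self, input, s):
        return f"{s}-{input}"

class Parent:
    def __init__(self):
        self.with_string_arg = WithStringArg()

    def forward(self, input):
        # The keyword argument s="two" must survive any hooking machinery
        # wrapped around the submodule call.
        return self.with_string_arg.forward(s="two", input=input)

assert Parent().forward("x") == "two-x"
```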