Fix GradCAM ReLU placement per original paper #809
Open
Mr-Neutr0n wants to merge 2 commits into salesforce:main from
Conversation
Move ReLU (clamp) from intermediate gradients to the final gradcam computation. Per the original GradCAM paper, ReLU should be applied to the weighted combination of feature maps (cams * grads), not to the gradients alone. Fixes salesforce#789 Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Thanks for the contribution! Before we can merge this, we need @Mr-Neutr0n to sign the Salesforce Inc. Contributor License Agreement.

Author
I have signed the CLA. Please recheck.
Summary
Changes
- Removed `.clamp(0)` from the gradient computation on line 178
- Added `.clamp(min=0)` to the final result: `gradcams = (cams * grads).clamp(min=0)`

Background
The GradCAM paper (Selvaraju et al., 2017) specifies that ReLU is applied to the final linear combination of weighted feature maps to obtain the class-discriminative localization map. Applying ReLU to gradients prematurely removes negative gradient information that may be relevant for the weighted combination.
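The difference can be shown with a minimal sketch (tensor names `cams` and `grads` follow the PR description; the shapes and standalone function are illustrative, not the repo's actual code). When a feature activation and its gradient are both negative, their product is positive and should survive the final ReLU, but clamping the gradients first zeroes it out:

```python
import torch

def gradcam_old(cams: torch.Tensor, grads: torch.Tensor) -> torch.Tensor:
    # Previous behavior: ReLU applied to gradients before weighting.
    return cams * grads.clamp(min=0)

def gradcam_new(cams: torch.Tensor, grads: torch.Tensor) -> torch.Tensor:
    # Fixed behavior: ReLU applied to the weighted combination,
    # matching the paper's formulation.
    return (cams * grads).clamp(min=0)

cams = torch.tensor([-2.0, 3.0])
grads = torch.tensor([-1.0, 0.5])

print(gradcam_old(cams, grads))  # tensor([0.0000, 1.5000])
print(gradcam_new(cams, grads))  # tensor([2.0000, 1.5000])
```

The first element illustrates the information loss: a negative activation with a negative gradient contributes positively to the localization map under the paper's formulation, but is discarded entirely when the gradients are clamped in isolation.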
Test plan
Fixes #789
🤖 Generated with Claude Code