[Feature] get_attention parameter in GlobalAttentionPooling #3837
Conversation
To trigger regression tests:
LGTM. Thanks!
@decoherencer Could you fix these issues caught by CI?
You need to remove the trailing spaces at the end of L458 and add an empty line after L1268.
Sorry. I got into a weird state with my pip, which I only noticed now.
@decoherencer No worries. There is something wrong with the CI. If you don't mind, I can directly modify your code when CI is back. Thanks.
Sure. I pushed the lint fix; it seems to have passed the last CI lint checks before the CPU build crashed. Feel free to modify it if there are any other errors.
@decoherencer Thanks for your patience. The PR has been merged.
Nice. Looking forward to helping more.
Thanks. That will be great!
Description
@mufeili Added an optional `get_attention` parameter to `GlobalAttentionPooling` that returns the node attention weights, as requested in https://discuss.dgl.ai/t/about-globalattentionpooling/2766
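For context, the pooling layer computes a gate score per node, softmax-normalizes the scores into attention weights, and aggregates node features with those weights; `get_attention=True` additionally returns the weights. A minimal NumPy sketch of that mechanism, assuming a single graph and a plain linear gate (the function name, `gate_weights` argument, and single-graph simplification are illustrative, not DGL's actual implementation):

```python
import numpy as np

def global_attention_pool(feat, gate_weights, get_attention=False):
    """Gated attention readout over the nodes of one graph.

    feat:         (num_nodes, feat_dim) node features
    gate_weights: (feat_dim, 1) linear gate (stand-in for gate_nn)
    """
    scores = feat @ gate_weights                   # (num_nodes, 1) gate scores
    scores = scores - scores.max()                 # shift for numerical stability
    attn = np.exp(scores) / np.exp(scores).sum()   # softmax over nodes
    readout = (attn * feat).sum(axis=0)            # (feat_dim,) weighted sum
    if get_attention:
        return readout, attn                       # also expose the weights
    return readout

rng = np.random.default_rng(0)
feat = rng.normal(size=(5, 4))
w = rng.normal(size=(4, 1))
readout, attn = global_attention_pool(feat, w, get_attention=True)
print(readout.shape, attn.shape)
```

The attention weights sum to 1 over the graph's nodes, which is what makes them directly interpretable as per-node importance scores.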
Checklist
or have been fixed to be compatible with this change
Changes