[Feature] get_attention parameter in GlobalAttentionPooling #3837

Merged · 6 commits · Mar 24, 2022
get_attention parameter in GlobalAttentionPooling
decoherencer committed Mar 13, 2022
commit 1bdfcc7c6eb2fad416d3d908d92057a689933941
14 changes: 11 additions & 3 deletions python/dgl/nn/pytorch/glob.py
@@ -419,7 +419,7 @@ def __init__(self, gate_nn, feat_nn=None):
         self.gate_nn = gate_nn
         self.feat_nn = feat_nn
 
-    def forward(self, graph, feat):
+    def forward(self, graph, feat, get_attention=False):
         r"""
 
         Compute global attention pooling.
@@ -431,12 +431,17 @@ def forward(self, graph, feat):
         feat : torch.Tensor
             The input node feature with shape :math:`(N, D)` where :math:`N` is the
             number of nodes in the graph, and :math:`D` means the size of features.
+        get_attention : bool, optional
+            Whether to return the attention values from gate_nn. Default to False.
 
         Returns
        -------
         torch.Tensor
             The output feature with shape :math:`(B, D)`, where :math:`B` refers
             to the batch size.
+        torch.Tensor, optional
+            The attention values of shape :math:`(N, 1)`, where :math:`N` is the number of
+            nodes in the graph. This is returned only when :attr:`get_attention` is ``True``.
         """
         with graph.local_scope():
             gate = self.gate_nn(feat)
@@ -450,8 +455,11 @@
             graph.ndata['r'] = feat * gate
             readout = sum_nodes(graph, 'r')
             graph.ndata.pop('r')
 
-            return readout
+            if get_attention:
+                return readout, gate
+            else:
+                return readout
 
 
 class Set2Set(nn.Module):
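
For reference, a minimal usage sketch of the flag this PR adds. The toy graph and gate_nn below are illustrative, and it assumes a DGL build that includes this change:

import dgl
import torch
import torch.nn as nn
from dgl.nn.pytorch.glob import GlobalAttentionPooling

# Toy graph with 4 nodes and 5-dimensional node features (illustrative).
g = dgl.graph(([0, 1, 2], [1, 2, 3]))
feat = torch.randn(4, 5)

# gate_nn maps each node feature to a scalar attention logit.
pool = GlobalAttentionPooling(gate_nn=nn.Linear(5, 1))

# Default behavior is unchanged: forward returns only the readout.
readout = pool(g, feat)                             # shape (1, 5)

# New in this PR: also return the per-node attention values.
readout, attn = pool(g, feat, get_attention=True)   # shapes (1, 5) and (4, 1)

Returning a tuple only when get_attention is True keeps the default return value backward compatible with existing callers.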