Hi there,
I ran into a problem when using GAT with DGL. The forward pass works fine, but when I call loss.backward(), the model raises the following error:
```
  File "run_twosides_ML.py", line 148, in train
    loss.backward()
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/site-packages/torch/tensor.py", line 118, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/site-packages/torch/autograd/__init__.py", line 93, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/site-packages/torch/autograd/function.py", line 77, in apply
    return self._forward_cls.backward(self, *args)
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/site-packages/dgl/nn/pytorch/softmax.py", line 74, in backward
    g.ndata.pop(accum_name)
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/_collections_abc.py", line 795, in pop
    value = self[key]
  File "/mnt/cephfs2/asr/users/ming.tu/anaconda3/envs/py36/lib/python3.6/site-packages/dgl/view.py", line 57, in __getitem__
    return self._graph.get_n_repr(self._nodes)[key]
KeyError: 'accum'
```