PinSAGE error when FastText pretrained embeddings are used

When I uncomment "field.build_vocab(getattr(textset, key), vectors='fasttext.simple.300d')" to use the FastText embeddings, I get this error:

File "model.py", line 96, in train
opt = torch.optim.Adam(model.parameters(), lr=args.lr)
File "/Users/n0f00bt/PycharmProjects/untitled/venv/lib/python3.7/site-packages/torch/optim/adam.py", line 44, in __init__
super(Adam, self).__init__(params, defaults)
File "/Users/n0f00bt/PycharmProjects/untitled/venv/lib/python3.7/site-packages/torch/optim/optimizer.py", line 51, in __init__
self.add_param_group(param_group)
File "/Users/n0f00bt/PycharmProjects/untitled/venv/lib/python3.7/site-packages/torch/optim/optimizer.py", line 213, in add_param_group
raise ValueError("can't optimize a non-leaf Tensor")
ValueError: can't optimize a non-leaf Tensor

There's a small bug in L55 of layers.py. You need to change

self.emb.weight[:] = field.vocab.vectors

to

self.emb.weight.data[:] = field.vocab.vectors
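
Slice-assigning into the Parameter itself is tracked by autograd, which is what turns the embedding weight into a non-leaf tensor and makes Adam raise the error above; writing through .data bypasses autograd so the weight stays a leaf Parameter. For reference, here is a minimal, self-contained sketch of the two patterns (the sizes, names, and stand-in tensor below are made up for illustration, not taken from the PinSAGE example):

import torch
import torch.nn as nn

# Hypothetical stand-in for field.vocab.vectors.
pretrained = torch.randn(1000, 300)

emb = nn.Embedding(1000, 300)

# Problematic pattern (layers.py L55): the slice assignment goes through
# autograd, so the weight no longer counts as a leaf tensor for the optimizer.
# emb.weight[:] = pretrained

# Fixed pattern: write through .data so autograd is bypassed and
# emb.weight remains a leaf Parameter.
emb.weight.data[:] = pretrained

# An equivalent alternative is an in-place copy under no_grad():
# with torch.no_grad():
#     emb.weight.copy_(pretrained)

# Adam now accepts the parameters without the non-leaf error.
opt = torch.optim.Adam(emb.parameters(), lr=1e-3)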