Hello!
I have noticed strange behaviour when saving/loading graphs with an incorrect label dimension:
import torch
import dgl
g = dgl.graph((torch.tensor([1,2,3]), torch.tensor([2,1,3])))
dgl.save_graphs(
filename = 'test.bin',
g_list=[g], labels = {"test": torch.tensor(42)}
)
When we load it back, the result is unexpected (the label tensor value is different):
print(dgl.load_graphs(filename = 'test.bin'))
([Graph(num_nodes=4, num_edges=3,
ndata_schemes={}
edata_schemes={})], {'test': tensor(0)})
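The difference seems to come down to tensor dimensionality: torch.tensor(42) is a 0-dimensional tensor, while torch.tensor([42]) has shape (1,). A quick way to see this (pure PyTorch, no DGL needed):

```python
import torch

scalar = torch.tensor(42)    # 0-dim tensor, shape torch.Size([])
vector = torch.tensor([42])  # 1-dim tensor, shape torch.Size([1])

print(scalar.dim(), vector.dim())  # 0 1

# unsqueeze(0) turns the 0-dim scalar into the 1-dim form that round-trips:
assert torch.equal(scalar.unsqueeze(0), vector)
```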
The “correct” way to save is to add a matching dimension to the label tensor:
dgl.save_graphs(
filename = 'test.bin',
g_list=[g], labels = {"test": torch.tensor([42])}
)
dgl.load_graphs(filename = 'test.bin')
([Graph(num_nodes=4, num_edges=3,
ndata_schemes={}
edata_schemes={})], {'test': tensor([42])})
It would perhaps be beneficial to add a dimension check in save_graphs to prevent this mistake.
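One possible shape for such a check (a hypothetical helper sketch, not part of the DGL API):

```python
import torch

def validate_labels(labels):
    """Raise ValueError for 0-dimensional label tensors, which do not
    survive a save_graphs/load_graphs round trip."""
    for name, tensor in labels.items():
        if tensor.dim() == 0:
            raise ValueError(
                f"Label '{name}' is 0-dimensional; "
                f"use e.g. tensor.unsqueeze(0) before saving."
            )

validate_labels({"test": torch.tensor([42])})  # passes silently
# validate_labels({"test": torch.tensor(42)})  # would raise ValueError
```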
A typical situation where this happens is when one wishes to save each graph separately, together with its scalar label, for future data-loading purposes.
Thank you!