DGLError: Expect all graphs to have the same schema on nodes

Hi there DGL community,

I am facing a problem that occurs during DGL graph batching.
Upon inspection, my individual graphs, as well as my batched graphs, look fine. However, when I try to print the batched graphs coming out of the dataloader, I get the following error:

DGLError                                  Traceback (most recent call last)
<ipython-input-21-98b43f91c76b> in <module>()
----> 1 for batched_g in data_loader:
      2   print (batched_g)

/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py in __next__(self)
    519             if self._sampler_iter is None:
    520                 self._reset()
--> 521             data = self._next_data()
    522             self._num_yielded += 1
    523             if self._dataset_kind == _DatasetKind.Iterable and \

/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py in _next_data(self)
    559     def _next_data(self):
    560         index = self._next_index()  # may raise StopIteration
--> 561         data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
    562         if self._pin_memory:
    563             data = _utils.pin_memory.pin_memory(data)

/usr/local/lib/python3.7/dist-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
     45         else:
     46             data = self.dataset[possibly_batched_index]
---> 47         return self.collate_fn(data)

<ipython-input-10-2eef0974b7c3> in collate_graphs(samples)
      8     index_list.append(sample['index'])
      9 
---> 10   batched_graph = dgl.batch(graph_list)
     11   batched_label = torch.tensor(label_list)
     12   batched_index = torch.tensor(index_list)

/usr/local/lib/python3.7/dist-packages/dgl/batch.py in batch(graphs, ndata, edata, node_attrs, edge_attrs)
    197             # TODO: do we require graphs with no nodes/edges to have the same schema?  Currently
    198             # we allow empty graphs to have no features during batching.
--> 199             ret_feat = _batch_feat_dicts(frames, ndata, 'nodes["{}"].data'.format(ntype))
    200             retg.nodes[ntype].data.update(ret_feat)
    201 

/usr/local/lib/python3.7/dist-packages/dgl/batch.py in _batch_feat_dicts(frames, keys, feat_dict_name)
    235     # sanity checks
    236     if is_all(keys):
--> 237         utils.check_all_same_schema(schemas, feat_dict_name)
    238         keys = schemas[0].keys()
    239     else:

/usr/local/lib/python3.7/dist-packages/dgl/utils/checks.py in check_all_same_schema(schemas, name)
    130                 'Expect all graphs to have the same schema on {}, '
    131                 'but graph {} got\n\t{}\nwhich is different from\n\t{}.'.format(
--> 132                     name, i, schema, schemas[0]))
    133 
    134 def check_all_same_schema_for_keys(schemas, keys, name):

DGLError: Expect all graphs to have the same schema on nodes["_N"].data, but graph 3 got
	{}
which is different from
	{'features': Scheme(shape=(4,), dtype=torch.float32)}.

Graph 3 is by no means special; this number (3) changes if I change the batch size.

If someone could help, I would be eternally thankful!
Instead of posting more code here, I am happy to share my Colab notebook with anyone willing to take a closer look.

Many thanks in advance!

Roy

Hi,

This means graph 3 doesn't have a 'features' entry in its ndata. Is this the case in your scenario?
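You can check this with something like the sketch below (assuming your dataset samples store the graph under a 'graph' key, as your collate function suggests; adjust the key names to match your data):

# Hypothetical check: list every graph in the dataset that is missing
# the 'features' entry in ndata.
for i, sample in enumerate(dataset):
    g = sample['graph']
    if 'features' not in g.ndata:
        print(f"graph {i} has no node features:", g)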

dgl.batch requires all graphs to have the same feature schema. The error means that some graph is missing a feature that the others have.
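As a minimal sketch of the requirement (the feature name 'features' and the shape (4,) are taken from your error message; the graphs themselves are made up):

import dgl
import torch

g1 = dgl.graph((torch.tensor([0, 1]), torch.tensor([1, 2])))
g1.ndata['features'] = torch.randn(g1.num_nodes(), 4)

g2 = dgl.graph((torch.tensor([0]), torch.tensor([1])))
# Without the next line, dgl.batch raises the same DGLError you saw,
# because g2's node schema is empty while g1's contains 'features'.
g2.ndata['features'] = torch.zeros(g2.num_nodes(), 4)

bg = dgl.batch([g1, g2])
print(bg.ndata['features'].shape)  # torch.Size([total_num_nodes, 4])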

Hi, thanks! Indeed, some graphs were just filled with random data. I'm now trying to batch only the "proper" graphs, which seems to improve things; a rough sketch of the filtering is below. I'll report back as this progresses.
Cheers.
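The filtering I'm trying looks roughly like this (a sketch, not the exact notebook code; the 'graph' key, the expected feature shape, and the dataset / collate_graphs objects are the ones from my notebook):

from torch.utils.data import DataLoader

# Keep only samples whose graph carries node features of the expected shape.
def has_proper_features(sample, feat_name='features', feat_dim=4):
    g = sample['graph']
    return feat_name in g.ndata and g.ndata[feat_name].shape[1] == feat_dim

clean_dataset = [s for s in dataset if has_proper_features(s)]
data_loader = DataLoader(clean_dataset, batch_size=32, collate_fn=collate_graphs)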

Hi,
I got this:

DGLError: Expect all graphs to have the same schema on nodes["_N"].data, but graph 1 got
	{'feat': Scheme(shape=(571,), dtype=torch.float32)}
which is different from
	{'feat': Scheme(shape=(659,), dtype=torch.float32)}.
If my graphs really do have feature schemas with different shapes, what should I do to use the dataloader?

I think you need to make sure (as mentioned above) that all graphs have the same feature schema, including the same shape.

Hi,
thanks a lot! Actually, these graphs do have features, just with different shapes, i.e. (571,) and (659,). In some cases, we really do need graphs whose feature schemas have different shapes.

Indeed. But in that case, you may want to look into using heterographs instead: in a heterograph, different node types can carry features of different shapes.
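For example (a sketch with made-up node and edge types; the feature shapes are the ones from your error message):

import dgl
import torch

# Two node types in the same heterograph can have features of different shapes.
hg = dgl.heterograph({
    ('type_a', 'links', 'type_b'): (torch.tensor([0, 1]), torch.tensor([0, 0])),
})
hg.nodes['type_a'].data['feat'] = torch.randn(hg.num_nodes('type_a'), 571)
hg.nodes['type_b'].data['feat'] = torch.randn(hg.num_nodes('type_b'), 659)

# dgl.batch still works, as long as all graphs agree on the schema per node type.
bg = dgl.batch([hg, hg])
print(bg.nodes['type_a'].data['feat'].shape)  # torch.Size([4, 571])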

Many thanks for your invaluable suggestion!

With pleasure, any time! 🙂
