Is it possible to use dgl.sparse.SparseMatrix as node features in message passing?

Hi, I am working with large homogeneous graphs (2M+ nodes) with sparse features.
To fit the features in memory, I want to store them as sparse matrices.
Are sparse matrices (torch sparse tensors or DGL sparse matrices) currently supported as node features in message passing?
Here is an example of what I am trying to do:

import dgl.sparse as dglsp
import dgl.function as fn

feat_sp = dglsp.from_torch_sparse(feat)   # feat is a torch sparse COO tensor
graph.ndata['h'] = feat_sp                # assign the sparse matrix as a node feature
graph.update_all(fn.copy_u('h', 'm'),     # message: copy source feature
                 fn.sum('m', 'h'))        # reduce: sum incoming messages

When I run this, I get an error:

Thanks!

Hi @ysteven13, DGL sparse matrices are not compatible with the DGL message passing primitives (update_all with built-in functions). You can instead use SpMM with a DGL sparse matrix for feature aggregation. See: Building a Graph Convolutional Network Using Sparse Matrices — DGL 1.1.3 documentation
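
For what it's worth, here is a minimal sketch of what that SpMM-style aggregation could look like. The toy graph g and feature tensor feat below are placeholders for your real data; the adjacency is built from the graph's edges with dglsp.spmatrix, and since the features stay sparse the product A @ X is a sparse-sparse matmul rather than update_all:

import torch
import dgl
import dgl.sparse as dglsp

# Toy stand-ins for the real 2M-node graph and sparse feature matrix.
g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
feat = torch.eye(4, 8).to_sparse()               # torch sparse COO features

# Adjacency as a DGL sparse matrix, indexed (dst, src) so that
# A @ X sums each node's in-neighbor features, like copy_u + sum.
src, dst = g.edges()
A = dglsp.spmatrix(torch.stack([dst, src]),
                   shape=(g.num_nodes(), g.num_nodes()))

X = dglsp.from_torch_sparse(feat)                # features as a DGL SparseMatrix

# Sparse @ sparse multiplication replaces update_all(copy_u, sum);
# the aggregated result H is itself a SparseMatrix.
H = A @ X
print(H.to_dense())

If the aggregated features are needed densely downstream (e.g. as input to a linear layer), calling H.to_dense(), possibly per mini-batch, may be preferable to keeping everything sparse.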

