Is it possible to use sparse tensors on GATConv layers?

I haven’t tried it yet, but I was wondering whether it is possible to pass a sparse tensor to a GATConv layer instead of a dense one?

Thank you.

Hi,

Yes, but why do you want to do this? Could you share a piece of pseudocode to show us what you want to do?

Thanks VoVAllen.

It was just curiosity, really. My feature vector is very sparse and a bit long (3 kB), so my dataset ends up being rather large. I was wondering if I could store everything as sparse vectors and save on both storage and transfer times, but I wanted to ask before running experiments.

So the problem is that your feature vector is sparse. My suggestion is to first linearly project the feature vector to a dense one with smaller dimensionality (say 300, for example), and then feed that tensor to GATConv.
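A minimal sketch of that idea in PyTorch (all sizes and names here are illustrative, not from the thread): a linear layer projects the wide features down, and its dense output is what would then be handed to the GAT layer.

```python
import torch
import torch.nn as nn

# Hypothetical sizes: 3000-dim input features projected down to 300 dims.
in_dim, proj_dim = 3000, 300
num_nodes = 8

# The projection layer; its output is an ordinary dense tensor,
# so it can be passed straight to a GATConv as the node feature matrix.
project = nn.Linear(in_dim, proj_dim)

feats = torch.randn(num_nodes, in_dim)  # stand-in for the real node features
h = project(feats)                      # dense tensor of shape (num_nodes, proj_dim)
# h would then go into the GAT layer, e.g. something like
# gat = dgl.nn.GATConv(proj_dim, 64, num_heads=4); out = gat(graph, h)
```

The projection is learned end-to-end together with the GAT weights, so no separate preprocessing step (like PCA) is needed.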

Hi Zihao,

I tried that (via PCA) and saw a noticeable loss in accuracy whenever I chose fewer than 1k components. I eventually decided to remove the PCA, as the dataset is unbalanced 1:10 and having all components available seems to help. Thanks for your suggestion!

What if you don’t use PCA, but instead learn the projection directly with an nn.Linear(feature_size, small_feature_size) in PyTorch?
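One nice property of this approach: if the features are stored as a torch sparse tensor, the first projection can consume them directly, since torch.sparse.mm multiplies a sparse matrix by a dense one and supports autograd through the dense weight. A small sketch, with made-up sizes:

```python
import torch

num_nodes, feature_size, small_feature_size = 4, 3000, 300

# A sparse COO feature matrix: one nonzero entry per node, just for illustration.
indices = torch.tensor([[0, 1, 2, 3],       # node ids
                        [5, 42, 7, 2999]])  # active feature columns
values = torch.ones(4)
x = torch.sparse_coo_tensor(indices, values, (num_nodes, feature_size))

# A learnable projection weight; gradients flow through torch.sparse.mm.
weight = torch.nn.Parameter(torch.randn(feature_size, small_feature_size) * 0.01)
h = torch.sparse.mm(x, weight)              # dense (num_nodes, small_feature_size)

h.sum().backward()                          # weight.grad is now populated
```

So the features could stay sparse on disk and in memory, with only the small projected tensor (and the GAT layers after it) being dense.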

I hadn’t thought of that; I’ll try it right away. Thank you for the idea.