I am trying to implement the node dropout described in the paper “Self-supervised Graph Learning for Recommendation”. Should I modify the message passing inside the conv module (e.g., GATConv), or should I create a new graph with nodes dropped before training? Any advice is appreciated!
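For context, here is roughly what I mean by the second option: a minimal, framework-agnostic sketch of node dropout applied to a plain edge list before training (the function name `node_dropout` is just a placeholder, not a DGL API; in DGL I imagine this would correspond to building a subgraph per epoch):

```python
import random

def node_dropout(num_nodes, edges, drop_prob=0.1, seed=0):
    """Sketch of SGL-style node dropout on a plain edge list.

    Each node is dropped with probability `drop_prob`, and every edge
    incident to a dropped node is removed. Returns the kept node ids
    and the surviving edges. (This is a placeholder illustration, not
    DGL code.)
    """
    rng = random.Random(seed)
    kept_nodes = {v for v in range(num_nodes) if rng.random() >= drop_prob}
    kept_edges = [(u, v) for (u, v) in edges if u in kept_nodes and v in kept_nodes]
    return kept_nodes, kept_edges

# Toy graph: 5 nodes, a small cycle of edges
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]
kept_nodes, kept_edges = node_dropout(5, edges, drop_prob=0.3)
```

My understanding is that the paper generates two such corrupted views per epoch and contrasts their node embeddings, so the augmentation happens at the graph level rather than inside the conv layer.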
Does DGL provide any support for data augmentation or contrastive learning?
Thank you all in advance.