Hello Graph Enthusiasts,
I am interested in training on one knowledge graph and then predicting links in another. For example, if I have two graphs in different languages but in the same knowledge domain, it would be cool to train a model on just one of them and then use it to predict links in the second graph.
My approach so far has been to initialize node embeddings consistently across both graphs (using multilingual BERT, for example) and hope that the model trained on the first graph can exploit these shared embeddings to make predictions on the second. Unfortunately the results have been quite poor, and I suspect it’s because the hidden representation learned on the first graph differs too much from a hidden representation that would be useful for prediction on the second.
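To make the initialization idea concrete, here is a minimal sketch of the "same label, same initial vector" scheme. A real pipeline would encode each entity label with a multilingual sentence encoder such as multilingual BERT (e.g. via HuggingFace transformers) and mean-pool the token vectors; to keep this self-contained, the encoder is stubbed with a deterministic hash-seeded random vector, so it only demonstrates the consistency property (identical labels get identical vectors across graphs), not the cross-lingual similarity that mBERT would provide. All names and dimensions are illustrative.

```python
import hashlib

import numpy as np

EMB_DIM = 8  # mBERT would give 768-dim vectors; kept small for illustration


def text_embedding(label: str) -> np.ndarray:
    """Stand-in for a multilingual text encoder (e.g. multilingual BERT).

    Deterministically maps an entity label to a unit vector, so the same
    surface form receives the same embedding in every graph. Unlike mBERT,
    this stub does NOT place translations ("Germany"/"Deutschland") nearby.
    """
    seed = int.from_bytes(hashlib.sha256(label.encode("utf-8")).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(EMB_DIM)
    return v / np.linalg.norm(v)


def init_node_embeddings(entity_labels):
    """Build the initial node-embedding matrix for one graph."""
    return np.stack([text_embedding(lbl) for lbl in entity_labels])


# Two graphs in the same domain; shared labels get identical initialization.
graph_a_entities = ["Berlin", "Germany", "capital_city"]
graph_b_entities = ["Berlin", "Deutschland", "Hauptstadt"]
emb_a = init_node_embeddings(graph_a_entities)
emb_b = init_node_embeddings(graph_b_entities)
```

The point of the stub is only that the initialization is a pure function of the label, so both graphs live in the same input space before any training happens.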
So far I have tried to modify KBAT [0] and R-GCN [1]. Unfortunately, KBAT was recently shown to suffer from test data leakage [2], so the results I obtained with it are unreliable (I wasn’t able to achieve any transfer learning there anyway). Is anyone aware of other graph attention networks for link prediction? The transfer learning experiments I ran with R-GCN didn’t work out either. I’m planning to check out CompGCN [3] next.
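For anyone wondering what the train-on-A / predict-on-B protocol looks like mechanically, here is a toy sketch. It is not R-GCN or KBAT: to stay self-contained it uses a DistMult-style scorer in plain NumPy, freezes the entity vectors at their (shared, language-independent) initialization, and learns only the relation embeddings on graph A; those relation embeddings are then reused to score candidate links in graph B. The entities, triples, learning rate, and the small perturbation standing in for "graph B was initialized from the same encoder" are all made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def train_relation_embeddings(triples, ent_emb, n_rel, epochs=200, lr=0.1):
    """Fit only the relation vectors of a DistMult scorer on one graph.

    Entity vectors stay frozen, so everything the model learns about a
    relation is expressed in the shared embedding space and can, in
    principle, transfer to another graph initialized the same way.
    """
    rel_emb = rng.standard_normal((n_rel, DIM)) * 0.1
    n_ent = ent_emb.shape[0]
    for _ in range(epochs):
        for h, r, t in triples:
            # Push the score of the observed (positive) triple up.
            s = np.sum(ent_emb[h] * rel_emb[r] * ent_emb[t])
            rel_emb[r] += lr * (1.0 - sigmoid(s)) * ent_emb[h] * ent_emb[t]
            # Push the score of one corrupted-tail (negative) triple down.
            t_neg = int(rng.integers(n_ent))
            if t_neg == t:
                t_neg = (t_neg + 1) % n_ent
            s_neg = np.sum(ent_emb[h] * rel_emb[r] * ent_emb[t_neg])
            rel_emb[r] -= lr * sigmoid(s_neg) * ent_emb[h] * ent_emb[t_neg]
    return rel_emb


def score(h, r, t, ent_emb, rel_emb):
    """DistMult triple score: <e_h, w_r, e_t>."""
    return float(np.sum(ent_emb[h] * rel_emb[r] * ent_emb[t]))


# Graph A: 4 entities, 1 relation, 2 observed triples.
ent_emb_a = rng.standard_normal((4, DIM))
ent_emb_a /= np.linalg.norm(ent_emb_a, axis=1, keepdims=True)
triples_a = [(0, 0, 1), (2, 0, 3)]
rel_emb = train_relation_embeddings(triples_a, ent_emb_a, n_rel=1)

# Graph B: a different graph whose aligned entities received near-identical
# initial vectors from the shared encoder (simulated here by a tiny
# perturbation of graph A's vectors).
ent_emb_b = ent_emb_a + 0.01 * rng.standard_normal((4, DIM))

pos = score(0, 0, 1, ent_emb_b, rel_emb)  # link seen (in aligned form) in A
neg = score(0, 0, 2, ent_emb_b, rel_emb)  # corrupted tail
```

With real data the gap between `pos` and `neg` is exactly what fails when the hidden representation drifts away from the shared input space, which is why freezing or regularizing the entity embeddings toward their initialization is one knob worth trying.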
I was just curious whether any of you have thought about knowledge graph transfer learning and whether you would take a different approach. Any references to relevant literature would also be highly appreciated; it seems KG transfer learning hasn’t been studied much yet. I’m also happy to share any of the code I’ve written (custom embedding initialization, training on one KG and predicting on another, etc.).
Thank you!
[0] https://arxiv.org/abs/1906.01195
[1] https://arxiv.org/abs/1703.06103
[2] https://arxiv.org/abs/1911.03903
[3] https://arxiv.org/abs/1911.03082v2