Knowledge Graph Embeddings

I was trying to train Knowledge Graph Embeddings with dgl-ke. I have two key conceptual questions:

  1. When are Knowledge Graphs better than NLP approaches in Question-Answering (QA) systems? Couldn’t we build a model with BERT that would perform better at QA?
  2. Couldn’t Knowledge Graphs be trained with Graph Convolutional Networks (GCN) or Graph Attention Networks (GAT) to obtain the embeddings? Why do we need models like TransE or TransR for this? Please help clarify.

Hi, for question 1, the answer depends on the knowledge base of the QA system.

If the QA system is based on textual data, like web pages and documents, then it’s better to use pretrained language models like BERT trained on that corpus. If the QA system is based on Knowledge Graphs (KGs), then language models cannot be applied directly, since KGs consist of structured triples rather than text.

There are also works on training joint language and knowledge models, for example, ERNIE, KEPLER, and CoLAKE.

For the second question: yes, we can use GCN or GAT to generate embeddings. The advantage of KGE models like TransE is that they are much cheaper to train.
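To illustrate why KGE models are cheap, here is a minimal sketch of the TransE scoring idea in NumPy: a triple (head, relation, tail) is scored by how close head + relation lands to tail in embedding space, with no message passing over neighbors as in GCN/GAT. The embeddings below are random toy values for illustration, not trained ones.

```python
import numpy as np

def transe_score(h, r, t):
    """TransE score: negative L2 distance of (h + r) from t.
    Higher (less negative) means the triple is more plausible."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(0)
dim = 8

# Toy embeddings (random for illustration; in practice these are learned).
head = rng.normal(size=dim)
rel = rng.normal(size=dim)
tail_true = head + rel + rng.normal(scale=0.01, size=dim)  # nearly satisfies h + r ≈ t
tail_rand = rng.normal(size=dim)                           # an unrelated entity

print(transe_score(head, rel, tail_true))  # close to 0 (plausible triple)
print(transe_score(head, rel, tail_rand))  # more negative (implausible triple)
```

Training only needs this score plus a margin loss over negative samples, so the cost per triple is a handful of vector operations, which is why dgl-ke can scale KGE training to very large graphs.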

Thanks for the clarification