DGL for Abstract Meaning Representation

Abstract Meaning Representation (AMR) is a formalism for encoding the meaning of English sentences as graph structures that abstract away from surface syntax. Much of the current work revolves around converting a sequence to a graph and then back to a sequence. The state of the art for sequence-to-graph (StoG) and graph-to-sequence (GtoS) uses graph-attention-based neural networks plus other techniques such as copy mechanisms. The LDC AMR datasets used for training consist of about 60K sentence/graph pairs.
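For concreteness, here is a minimal sketch (assuming DGL and PyTorch are installed; the integer ids and vocabularies are illustrative, not from any real AMR pipeline) of how the AMR graph for “The boy wants to go” could be represented as a DGL graph:

```python
import dgl
import torch

# AMR for "The boy wants to go" (PENMAN notation):
#   (w / want-01
#      :ARG0 (b / boy)
#      :ARG1 (g / go-01
#               :ARG0 b))
# Nodes: 0 = want-01, 1 = boy, 2 = go-01
src = torch.tensor([0, 0, 2])  # want-01 -> boy, want-01 -> go-01, go-01 -> boy
dst = torch.tensor([1, 2, 1])
g = dgl.graph((src, dst), num_nodes=3)

# Hypothetical vocabularies mapping concepts and relations to integer ids.
g.ndata['concept'] = torch.tensor([10, 42, 17])  # e.g. want-01, boy, go-01
g.edata['rel'] = torch.tensor([0, 1, 0])         # e.g. :ARG0, :ARG1, :ARG0
print(g)
```

The reentrancy (the boy is the :ARG0 of both want-01 and go-01) is exactly what makes AMRs graphs rather than trees, and it maps directly onto DGL’s edge lists.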

I have not seen any DGL implementations for AMR, only models in raw PyTorch. Is DGL applicable to this type of problem, and are there example models that are reasonably close to this kind of implementation?

I’m familiar with DNNs, but I’m a DGL newbie. I’m starting with the tutorial, but I’m hoping someone can point me to the right example PyTorch model to get started with.

I’d appreciate any advice on which types of models to start with, and any pointers on things to look out for when moving one of the SotA models into a DGL implementation.


We have a graph-to-text example, GraphWriter, using the WebNLG knowledge graph. There is also a good reference, CycleGT, for cycle training of graph-to-text and text-to-graph; it also uses WebNLG and is based on DGL.
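As a starting point for the graph-encoder side, here is a sketch (not taken from GraphWriter or CycleGT; the layer sizes and feature names are assumptions) of a graph-attention encoder built from DGL’s built-in GATConv, which could produce per-node states for a sequence decoder to attend over:

```python
import torch
import torch.nn as nn
from dgl.nn.pytorch import GATConv

class GraphEncoder(nn.Module):
    """Toy graph-attention encoder: embeds node ids, then applies two GAT layers."""
    def __init__(self, vocab_size, hidden_size=256, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gat1 = GATConv(hidden_size, hidden_size // num_heads, num_heads)
        self.gat2 = GATConv(hidden_size, hidden_size // num_heads, num_heads)

    def forward(self, g):
        h = self.embed(g.ndata['concept'])  # (N, hidden)
        h = self.gat1(g, h).flatten(1)      # concat attention heads -> (N, hidden)
        h = self.gat2(g, h).flatten(1)
        return h                            # per-node states for the decoder

# Usage with the toy AMR graph above (add self-loops first, since GATConv
# rejects zero-in-degree nodes by default):
#   g = dgl.add_self_loop(g)
#   node_states = GraphEncoder(vocab_size=100)(g)
```

The SotA AMR models add more on top (edge-relation embeddings, copy mechanisms, a sequence decoder), but this is the piece that DGL handles natively and where a port would start.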

For AMR, we haven’t implemented examples yet, but we are very interested in it. I’m not sure which models are worth implementing; I think we can work together to create some examples.
