Do Seq2Graph models exist?

Hi,

I was wondering whether there are auto-encoders in which the decoder creates graphs from a single vector in latent space?

I would like to train a model which generates a graph from a predefined sequence, which is why I care mostly about the decoder.
Many simple variational autoencoders seem to use a 2d tensor in the latent space, from which the graph is reconstructed.
However, is it possible to generate graphs from a 1d tensor?


Hi, building a graph from a sequence consists of two steps:

  1. entity recognition;
  2. relation extraction.

So a 2d tensor could contain this information. In theory, it can be crammed into a single vector, as in a VAE, and the decoder can then generate a 2d tensor, the adjacency matrix of the graph, from that vector.
However, the entities would have to be predefined and fixed. Also, if the number of entities N is large, the output is N×N, and each entry of the matrix is a probability distribution over relation types, which is hard to train.
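
For concreteness, here is a minimal PyTorch sketch of such a decoder (assuming a fixed entity set of size N and R relation types; `GraphDecoder`, `latent_dim`, etc. are placeholder names I made up, not from any particular library). Note how the output layer has to grow as N*N*R:

```python
import torch
import torch.nn as nn

class GraphDecoder(nn.Module):
    """Maps a 1d latent vector to an N x N x R tensor of relation logits
    (one categorical distribution over relation types per entity pair)."""
    def __init__(self, latent_dim=64, num_entities=10, num_relations=5):
        super().__init__()
        self.n = num_entities
        self.r = num_relations
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(),
            # output size scales as N*N*R, which is what blows up for large N
            nn.Linear(256, num_entities * num_entities * num_relations),
        )

    def forward(self, z):
        logits = self.net(z)                             # (batch, N*N*R)
        return logits.view(-1, self.n, self.n, self.r)   # (batch, N, N, R)

# Usage: decode a batch of latent vectors into per-pair relation probabilities.
decoder = GraphDecoder(latent_dim=64, num_entities=10, num_relations=5)
z = torch.randn(2, 64)                                   # batch of 2 latent vectors
probs = torch.softmax(decoder(z), dim=-1)                # distribution over relation types
print(probs.shape)                                       # torch.Size([2, 10, 10, 5])
```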
So, if the entity vocabulary is large or cannot be predefined, I think it's more plausible to use the two-step pipeline above, or to generate the graph from a 2d tensor.