I am dealing with many large graphs where the **structure** and **edge values** are the same. The node values, however, vary greatly.

```
# structure and edge values are the same in every graph
# but node values vary in each graph
graph 1 = (node: 2.1)-[edge: 1.0]-(node: 3.6)-[edge: 4.5]-(node: 5.7)
graph 2 = (node: 1.8)-[edge: 1.0]-(node: 2.9)-[edge: 4.5]-(node: 6.2)
```

Since the structure of the graphs does not vary, *does it even make sense to use a GNN*? The number of edges can grow quadratically with the number of nodes, so would a GNN waste a lot of time constructing edge-based embeddings/matrices for every graph, or do I still need to encode which nodes are connected to which other nodes?
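To make the question concrete, here is a minimal numpy sketch of what I understand a GCN-style layer would do in my setting (the 3-node path graph, weights, and normalization are just placeholders): since the structure is shared, the normalized adjacency could be built once and reused for every graph, with only the node features changing.

```python
import numpy as np

# Hypothetical 3-node path graph from the example above: 0 - 1 - 2.
# The structure is shared, so the normalized adjacency is built ONCE
# and reused for every graph; only the node feature matrix X changes.
edges = [(0, 1), (1, 2)]
n = 3

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A_hat = A + np.eye(n)                     # self-loops (common GCN convention)
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))  # symmetric normalization D^-1/2 A D^-1/2

rng = np.random.default_rng(0)
W = rng.normal(size=(1, 4))               # one scalar node feature -> 4 hidden dims

def gcn_layer(x):
    """One GCN-style propagation: neighbors mixed via the shared A_norm."""
    return np.maximum(A_norm @ x @ W, 0.0)  # ReLU

# Two graphs share A_norm; only the node values differ.
x1 = np.array([[2.1], [3.6], [5.7]])
x2 = np.array([[1.8], [2.9], [6.2]])
h1, h2 = gcn_layer(x1), gcn_layer(x2)
```

So the per-graph cost seems to reduce to a (sparse) matrix multiply with a fixed matrix, which is part of why I am unsure whether a full GNN framework buys me anything here.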

Alternatively, I was thinking about training a regular 1D CNN or LSTM on the features of each node, but I am worried that such models would misinterpret the nodes as being ordered sequentially when no such ordering exists.
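One order-insensitive alternative I considered is a DeepSets-style encoder: a shared per-node MLP followed by sum pooling, which by construction gives the same output for any permutation of the nodes (all weights below are random placeholders, just to illustrate the invariance):

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(1, 8))   # shared per-node MLP, layer 1
W2 = rng.normal(size=(8, 4))   # shared per-node MLP, layer 2

def deepsets_encode(x):
    """Per-node MLP + sum pooling: output is identical under any
    permutation of the node rows, so no spurious ordering is learned."""
    h = np.maximum(x @ W1, 0.0) @ W2  # same MLP applied to every node
    return h.sum(axis=0)              # order-invariant aggregation

x = np.array([[2.1], [3.6], [5.7]])
perm = x[[2, 0, 1]]                   # same nodes, different order
assert np.allclose(deepsets_encode(x), deepsets_encode(perm))
```

But this throws away the edge structure entirely, which is what makes me wonder whether the connectivity still matters even though it never varies.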

Is there an obvious kind of GNN model for this use case?