Concatenating in-going and out-going adjacency for GatedGraphConv

Hi team,

I am working on reproducing the Gated Graph Neural Network. In that work, the adjacency matrix A should be the concatenation of the out-going adjacency and the in-going adjacency, and A is then fed into GatedGraphConv.

Here are my code snippets for building the in-going graph and the out-going graph separately:

import numpy as np
import networkx as nx
import dgl

A_out = np.array([[0, 1, 0, 0],
                  [0, 0, 0.5, 0.5],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]])

A_in = np.array([[0, 0, 0, 0],
                 [0.5, 0, 0.5, 0],
                 [0, 1, 0, 0],
                 [0, 1, 0, 0]])
# We cannot concatenate A_out and A_in and pass the result to networkx,
# since the concatenated matrix is not square (non-square adjacency error)

# Transform to networkx.graph
nx_g_out = nx.from_numpy_array(A_out)
nx_g_in = nx.from_numpy_array(A_in)
# nx_g_out.adj: {0: {1: {'weight': 1.0}}, 1: {0: {'weight': 1.0}, 2: {'weight': 1.0}, 3: {'weight': 0.5}}, 2: {1: {'weight': 1.0}}, 3: {1: {'weight': 0.5}}}
# nx_g_in.adj:  {0: {1: {'weight': 0.5}}, 1: {0: {'weight': 0.5}, 2: {'weight': 1.0}, 3: {'weight': 1.0}}, 2: {1: {'weight': 1.0}}, 3: {1: {'weight': 1.0}}}

# Transform to dgl.graph
dgl_g_out = dgl.from_networkx(nx_g_out)
dgl_g_in = dgl.from_networkx(nx_g_in)
# The adjacency matrices of dgl_g_out and dgl_g_in come out identical, as shown below:
'''
tensor(indices=tensor([[0, 1, 1, 1, 2, 3],
                       [1, 0, 2, 3, 1, 1]]),
       values=tensor([1., 1., 1., 1., 1., 1.]),
       size=(4, 4), nnz=6, layout=torch.sparse_coo)
tensor(indices=tensor([[0, 1, 1, 1, 2, 3],
                       [1, 0, 2, 3, 1, 1]]),
       values=tensor([1., 1., 1., 1., 1., 1.]),
       size=(4, 4), nnz=6, layout=torch.sparse_coo)
'''

How can I concatenate the out-going adjacency and the in-going adjacency to generate the graph for GatedGraphConv?

Hi,

What do you mean by concatenating the adjacency matrices? We already have a GatedGraphConv module (dgl/gatedgraphconv.py at master · dmlc/dgl · GitHub).
Do you mean multiple edge types?
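
If multiple edge types is what you are after, here is a rough sketch of how GatedGraphConv consumes them. The toy graph, feature size, and number of propagation steps below are all made up for illustration:

import torch
import dgl
from dgl.nn import GatedGraphConv

# Toy graph with 4 nodes and 4 edges; the first two edges get type 0
# (e.g. "out-going") and the last two get type 1 (e.g. "in-going").
g = dgl.graph(([0, 1, 2, 1], [1, 2, 1, 3]))
etypes = torch.tensor([0, 0, 1, 1])          # one type id per edge

feat = torch.randn(g.num_nodes(), 16)        # made-up node features
conv = GatedGraphConv(in_feats=16, out_feats=16, n_steps=3, n_etypes=2)
out = conv(g, feat, etypes)                  # shape: (4, 16)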

Hi,

Thanks for replying. I want to create the graph as in the figure, which combines the out-going adjacency and the in-going adjacency into a single new matrix, roughly as sketched below.

How can I feed the generated graph into GatedGraphConv? When I tried, I got a non-square adjacency error. Thanks in advance for your reply.
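
Concretely, what I have in mind is something like this (a sketch reusing A_out and A_in from my first post and the figure's A_s notation):

# Concatenate the two n x n matrices along the column axis, as in the figure,
# giving an n x 2n "connection matrix" (here 4 x 8).
A_s = np.concatenate([A_out, A_in], axis=1)
print(A_s.shape)  # (4, 8) -- rectangular, hence the non-square adjacency error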

Hi, could you share the paper/manuscript the figure comes from? Note that an adjacency matrix is, by definition, square, while your matrix is rectangular (which is why the figure calls it a “connection matrix”). Given that connection matrix A_s, I wonder what the subsequent operations look like.

Thanks for replying. Here is the paper: [AAAI ’19] Session-based Recommendation with Graph Neural Networks.

nx.from_numpy_array creates an undirected NetworkX graph by default, and dgl.from_networkx then converts it to a directed graph, which is why you get edges in both directions. To address the issue, you can simply do:

import torch

def get_weighted_structure(adj):
    # Row index = destination node, column index = source node,
    # so each node aggregates along its own row of the matrix.
    dst, src = adj.nonzero()
    eweight = adj[dst, src]
    return src, dst, eweight

src_out, dst_out, e_out = get_weighted_structure(A_out)
src_in, dst_in, e_in = get_weighted_structure(A_in)
# Merge the two edge sets into a single directed DGL graph
src = np.concatenate([src_out, src_in])
dst = np.concatenate([dst_out, dst_in])
g = dgl.graph((src, dst))
# Keep the weights from both matrices as an edge feature
g.edata['w'] = torch.from_numpy(np.concatenate([e_out, e_in]))
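
If you then want to push this graph through GatedGraphConv, one option (my own suggestion, not something the paper prescribes; feature size and number of steps below are made up) is to tag the out-going and in-going halves as two edge types. Note that, as far as I know, GatedGraphConv only uses the edge types and does not consume g.edata['w'], so if you need the normalized weights you would have to apply them in a custom message function.

from dgl.nn import GatedGraphConv

# Edge type 0 for the edges coming from A_out, type 1 for those from A_in.
etypes = torch.cat([torch.zeros(len(e_out)), torch.ones(len(e_in))]).long()

feat = torch.randn(g.num_nodes(), 16)        # made-up node features
conv = GatedGraphConv(in_feats=16, out_feats=16, n_steps=3, n_etypes=2)
out = conv(g, feat, etypes)                  # shape: (num_nodes, 16)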
