NodeFlow object cannot be passed between processes

I need to deliver a NodeFlow object to a different process. The code is as follows.

import multiprocessing

import dgl

q = multiprocessing.Queue()

def read_loader(loader, q):
    # Iterate the sampler and push each NodeFlow onto the queue.
    for nf in loader:
        #nf.copy_from_parents()
        q.put(nf)

g = dgl.contrib.graph_store.create_graph_from_store(
        args.dataset, "shared_mem")
train_loader = dgl.contrib.sampling.NeighborSampler(g, args.batch_size,
                                                        args.num_neighbors,
                                                        neighbor_type='in',
                                                        shuffle=True,
                                                        num_workers=16,
                                                        num_hops=args.n_layers+1,
                                                        seed_nodes=train_nid,
                                                        prefetch=False)

p = multiprocessing.Process(target=read_loader, args=(train_loader, q), daemon=True)
p.start()

But I got the following error:

AttributeError: Can't pickle local object 'SharedMemoryDGLGraph.__init__.<locals>.<lambda>'
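For context, this error is a general Python limitation rather than something specific to DGL: `pickle` cannot serialize lambdas or other locally defined functions, so any object that stores one on an instance attribute cannot be sent through a `multiprocessing.Queue`. A minimal standalone reproduction of the same class of error (`HoldsLambda` and `can_pickle` are illustrative names, not part of the original code):

```python
import pickle

class HoldsLambda:
    def __init__(self):
        # A lambda defined inside __init__ is a local function;
        # pickle cannot serialize it, so the whole instance fails.
        self.fn = lambda x: x + 1

def can_pickle(obj):
    """Return True if obj survives pickle.dumps, False otherwise."""
    try:
        pickle.dumps(obj)
        return True
    except (pickle.PicklingError, AttributeError, TypeError):
        return False

print(can_pickle([1, 2, 3]))      # True: plain data pickles fine
print(can_pickle(HoldsLambda()))  # False: the stored lambda blocks it
```

The `SharedMemoryDGLGraph` in the traceback stores such a lambda, so both the graph and anything referencing it hit this limitation.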

How can I solve this problem?

My suggestion is to create a separate NeighborSampler inside each process. The sampler and the NodeFlows it yields both hold a reference to the `SharedMemoryDGLGraph`, whose `__init__` stores a lambda, so neither can be pickled and sent through a `multiprocessing.Queue`. Instead, pass only plain, picklable arguments to the worker and build the graph handle and sampler there.
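The pattern can be sketched with a stand-in for the sampler (`FakeSampler`, `worker`, and `run` are hypothetical names; in the real code the worker would call `create_graph_from_store` and construct its own `NeighborSampler` from plain arguments):

```python
import multiprocessing

class FakeSampler:
    """Stand-in for NeighborSampler: holds a lambda, so instances
    cannot be pickled and must be built inside the worker."""
    def __init__(self, n):
        self.n = n
        self.transform = lambda x: 2 * x  # unpicklable attribute

    def __iter__(self):
        return (self.transform(i) for i in range(self.n))

def worker(n, q):
    # Construct the sampler in the child process from picklable
    # arguments; only plain results ever cross the queue.
    sampler = FakeSampler(n)
    for item in sampler:
        q.put(item)
    q.put(None)  # sentinel: signals the parent we are done

def run(n=3):
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(n, q), daemon=True)
    p.start()
    results = []
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item)
    p.join()
    return results

if __name__ == "__main__":
    print(run())  # [0, 2, 4]
```

Only `n` and the queue are pickled when the process starts; the unpicklable sampler never crosses the process boundary, which sidesteps the `AttributeError` above.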