0.5.x: error when using DistEmbedding

When I run the unsupervised distributed demo and modify the load_subtensor method at around line 245, the process hangs.

I changed

batch_inputs = g.ndata['features'][input_nodes].to(device)

to

batch_inputs = emb(input_nodes).to(device)

where emb is a DistEmbedding.

But why does it hang for about 30 minutes and then time out?

Does my input_nodes parameter need to be mapped somehow, or am I using something else incorrectly?

Can a DistEmbedding be created directly on the GPU?

Another question: when I use an init function, an error occurs on the graph server saying it can't find init_emb.
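One common cause of a "can't find init_emb" error on the server side (an assumption here, since the full traceback isn't shown) is that the init function is defined somewhere the server process cannot resolve it, e.g. nested inside another function. Serialization records the function's qualified name, so the receiving process must be able to look that name up. A minimal pure-Python sketch with `pickle` (the function names are illustrative, not DGL API):

```python
import pickle

# Module-level function: picklable by qualified name, so another process
# that can import this module can resolve and call it.
def init_emb(shape, dtype):
    return [0.0] * shape[0]

def make_local():
    # Nested function: pickle cannot reference it by an importable name.
    def local_init(shape, dtype):
        return [0.0] * shape[0]
    return local_init

restored = pickle.loads(pickle.dumps(init_emb))
print(restored([3], "float32"))  # [0.0, 0.0, 0.0]

try:
    pickle.dumps(make_local())
    print("pickled")
except (pickle.PicklingError, AttributeError):
    print("cannot pickle nested function")
```

If this is the cause, moving the init function to the top level of a module that the server can also import usually resolves the error.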

You cannot create an embedding on the GPU directly.
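Since the table cannot live on the GPU, the usual pattern is to keep the embedding in CPU memory and copy only the rows for the current mini-batch to the device, which is what the `.to(device)` in the modified line does. A toy sketch with plain-Python stand-ins (`cpu_table` and `lookup_and_move` are illustrative, not a real API):

```python
# Stand-in for a CPU-resident embedding table: 10 rows of dimension 4.
cpu_table = [[float(i)] * 4 for i in range(10)]

def lookup_and_move(table, ids, device="cuda:0"):
    """Gather the requested rows on the CPU, then (conceptually) move
    just that small batch to the device instead of the whole table."""
    rows = [table[i] for i in ids]  # CPU-side gather
    return rows, device             # stand-in for rows.to(device)

rows, dev = lookup_and_move(cpu_table, [1, 3])
print(rows)  # [[1.0, 1.0, 1.0, 1.0], [3.0, 3.0, 3.0, 3.0]]
```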

Can you provide the complete error information? Thanks!

The hanging is most likely caused by a failure of the server process. The input_nodes should be local IDs, not the original global IDs.
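To illustrate the local-vs-global distinction, here is a toy sketch of the mapping (`partition_of` and `global_to_local` are hypothetical stand-ins for what a real partition book computes for you):

```python
# Hypothetical layout: global node IDs 0..9 split across 2 partitions.
# partition_of[g] is the partition owning global ID g; a node's local ID
# is its rank among the nodes owned by that partition.
partition_of = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

def global_to_local(global_ids, part_id, partition_of):
    """Map global node IDs to per-partition local IDs (illustrative only)."""
    owned = [g for g, p in enumerate(partition_of) if p == part_id]
    index = {g: i for i, g in enumerate(owned)}
    return [index[g] for g in global_ids]

print(global_to_local([5, 7, 9], 1, partition_of))  # [0, 2, 4]
```

Indexing the embedding with global IDs after this remapping is expected would read the wrong rows, or out-of-range ones, which is consistent with a server-side failure and the client hanging until timeout.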

Thanks for the reply; you are right.