I found this statement in the API documentation for local_var: "However, inplace operations do change the shared tensor values, so will be reflected to the original graph."
Could you give an example of an in-place operation that changes the shared tensor values? It is not clear to me, and I do not know how this function can be safely used when writing a customized model.
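To make my confusion concrete, here is a plain-Python analogy of what I think "shared tensor values" means. This is only my mental model, not DGL code: a shallow copy of a feature dict shares the underlying objects, so rebinding a key is invisible to the original while in-place mutation leaks back.

```python
# Plain-Python analogy (NOT DGL): a shallow copy shares the stored objects.
original = {"h": [1.0, 2.0, 3.0]}

# Rebinding the key on the copy does not touch the original.
local = dict(original)
local["h"] = [0.0, 0.0, 0.0]
print(original["h"])  # [1.0, 2.0, 3.0]

# Mutating the shared object in place IS reflected in the original.
local = dict(original)
local["h"][0] = 99.0
print(original["h"])  # [99.0, 2.0, 3.0]
```

Is this roughly what happens with `local_var()`, where `g.ndata['h'] = new_tensor` is safe but something like `g.ndata['h'] += 1` would modify the original graph's tensor?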
Related Implementation Question:
In the code of GATConv, the forward function calls graph = graph.local_var().
Does this mean that this implementation of GATConv will not tune the graph node features during training?
For example, if I want to tune the node features during training, do I need to implement my own GATConv rather than use this version?
Thank you very much for your answer.
The same question is also asked in a GitHub issue.
By the way, is it preferred to ask on the forum or on GitHub?