Hi Xiang,

Thanks for your impressive work!
I have a question regarding running the ogbn-arxiv dataset. It seems that directly running this large dataset under the current framework on a single GPU is not possible. Could you provide any tips on efficiently conducting a robust evaluation on ogbn-arxiv? Specifically, I have several questions:
1. Do you use any sampling techniques when training the GNN models? If so, which sampling method would you recommend? (A rough illustration of what I have in mind is sketched below this list.)
2. Do you run ogbn-arxiv on a single GPU or on a CPU? If it can be done on a single GPU, how much memory does it need?
3. It seems that directly applying Mettack and Nettack to such a large graph would take weeks to finish, which would be infeasible, especially in the poisoning setting. I am not sure whether this observation is correct.
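For question 1, here is a rough sketch of the kind of mini-batch neighbor sampling I have in mind (this is not code from your repository; it uses PyTorch Geometric's `NeighborLoader`, and the 2-layer GCN, fan-outs, hidden size, and batch size are all placeholder choices of mine), just to make the question concrete:

```python
# Illustrative sketch only: train a small GCN on ogbn-arxiv with neighbor
# sampling so that only a sampled subgraph sits on the GPU per batch.
# All hyperparameters here are arbitrary placeholders.
import torch
import torch.nn.functional as F
from ogb.nodeproppred import PygNodePropPredDataset
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import GCNConv

dataset = PygNodePropPredDataset(name='ogbn-arxiv', root='data')
data = dataset[0]
train_idx = dataset.get_idx_split()['train']

# Sample 10 neighbors per node for each of the 2 GNN layers.
loader = NeighborLoader(
    data,
    num_neighbors=[10, 10],
    batch_size=1024,
    input_nodes=train_idx,
    shuffle=True,
)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, out_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = GCN(data.num_features, 128, dataset.num_classes).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for batch in loader:
    batch = batch.to(device)
    optimizer.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the first `batch_size` nodes in each sampled subgraph are the
    # seed (training) nodes; the rest are sampled neighbors.
    loss = F.cross_entropy(out[:batch.batch_size],
                           batch.y[:batch.batch_size].squeeze())
    loss.backward()
    optimizer.step()
```

I am mainly wondering whether you rely on something like the above, or on a different sampling scheme, when evaluating on ogbn-arxiv.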
Thanks in advance!