Training stress with Allegro #80
3 comments · 8 replies
-
I had the same problem...
-
Update: I solved the first part of the problem, regarding training with stress. It looks like the stress output does not pass the equivariance test, so I removed the equivariance-test flag from the nequip-train command.
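For reference, the change amounts to dropping the flag from the command line (a sketch, with `config.yaml` standing in for the actual config file):

```bash
# before: the stress output fails the built-in equivariance check and the run aborts
nequip-train config.yaml --equivariance-test

# after: train without the check
nequip-train config.yaml
```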
-
My understanding is that this is a bit of an open question. Certainly you don't need to converge models all the way to get a sense of the relative ranking of hyperparameters, but how many epochs that takes will depend on the system and the size of your dataset: one epoch of training means something very different on a 1M-frame dataset than on a 100-frame one (at a batch size of 5, that is 200,000 gradient updates per epoch versus 20).
-
Hello,
I am trying to train an Allegro model with stress. I have made the following modifications to my config file:
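In outline, the changes look like this (a minimal sketch, assuming the standard `StressForceOutput` model builder and a `stress` term in `loss_coeffs` from nequip 0.5.x; the rest of the config is unchanged):

```yaml
model_builders:
  - allegro.model.Allegro
  - PerSpeciesRescale
  - StressForceOutput   # replaces ForceOutput so the model also predicts stress
  - RescaleEnergyEtc

loss_coeffs:
  forces: 1.0
  stress: 1.0           # train on stress alongside energies and forces
  total_energy:
    - 1.0
    - PerAtomMSELoss
```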
I tried training this separately on NVT and NPT data. Both times the run failed the equivariance test with an error.
I am using nequip==0.5.5, torch==1.11.0+cu113, and mir-allegro==0.2.0. Can you please suggest a solution for this?
I have another question, regarding hyperparameter tuning. I am trying to find a good combination of number of layers, batch size, learning rate, cutoff, and number of features for my dataset. As I understand it, nequip-benchmark helps me gauge the model's runtime performance for my future MD calculations. If my aim is instead to understand the effect of hyperparameters on the training and validation errors, and thereby find the combination with the lowest error, do I need to use nequip-train? And if so, can I run just 10 epochs for every hyperparameter combination and compare the MAE and RMSE at the 10th epoch, as sketched below? This would save a good amount of computational time, since I would not need to train each combination to convergence.
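Concretely, something like the following is what I have in mind (a sketch only; `max_epochs` is the standard nequip config key, and the directory layout is hypothetical):

```bash
# hypothetical layout: one config per hyperparameter combination, each with
# its own run_name and max_epochs: 10; the logged validation MAE/RMSE at
# epoch 10 are then compared across runs
for cfg in sweep_configs/*.yaml; do
    nequip-train "$cfg"
done
```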