[Retiarii] Usage of DataLoaders in one-shot NAS Trainers #4218
thomasschmied started this conversation in New Feature Design Discussion
Describe the issue:
Hello!
I've been working with NNI recently and I really like the Retiarii Mutation API.
While running one-shot NAS experiments on custom datasets with ENAS and DARTS, I encountered a problem. All trainer classes in `nni.retiarii.oneshot.pytorch` (e.g., `EnasTrainer`) construct their `torch.utils.data.DataLoader` instances in `_init_dataloader`. Furthermore, the `_init_dataloader` function does a 50:50 split of the given PyTorch `Dataset` instance to construct the train and validation sets. However, this behaviour is rather limiting. It is particularly problematic when custom datasets, or datasets with predefined train-valid-test splits, are used. Therefore, my question: would it be possible to change this behaviour?
Possible solutions:
- Pass the `DataLoader` instances directly to the trainer class instead of constructing them in `_init_dataloader`.
- Extract `_init_dataloader` from the trainer's `__init__` and make the function configurable by the user.

Currently, I bypass this problem by overriding the behaviour of `_init_dataloader` (see the sketch below). However, I believe these changes would make the library applicable to a broader range of use cases. I am not sure whether other people have encountered this problem before.
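For anyone hitting the same issue, my workaround looks roughly like the following (a minimal sketch: the subclass name and the extra `valid_set` argument are hypothetical, and it assumes the base constructor sets `self.batch_size` and `self.workers` before calling `_init_dataloader`):

```python
from torch.utils.data import DataLoader
from nni.retiarii.oneshot.pytorch import EnasTrainer

class PresplitEnasTrainer(EnasTrainer):
    """EnasTrainer variant that respects a predefined train/validation
    split instead of splitting a single dataset 50:50."""

    def __init__(self, *args, valid_set=None, **kwargs):
        # Stash the predefined validation set before the base
        # constructor runs, since __init__ calls _init_dataloader.
        self._valid_set = valid_set
        super().__init__(*args, **kwargs)

    def _init_dataloader(self):
        # self.dataset is the predefined training split; the
        # validation loader comes from the separate validation set.
        self.train_loader = DataLoader(self.dataset,
                                       batch_size=self.batch_size,
                                       shuffle=True,
                                       num_workers=self.workers)
        self.valid_loader = DataLoader(self._valid_set,
                                       batch_size=self.batch_size,
                                       shuffle=False,
                                       num_workers=self.workers)
```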
Environment:
Thank you,
Thomas
Replies: 2 comments
- We might refactor all the one-shot trainers from a high-level perspective and make customization friendlier. We will consider your use case when we do that. Meanwhile, the recommended practice is to copy or inherit the existing one-shot trainers and adapt them to your own scenario.
- Moving the issue to a discussion to get more awareness for the design discussion.