Replies: 2 comments 2 replies
-
Two years and that question never got answered. I can't imagine how much pressure the PyTorch devs must be under to ignore requests for such a long period.
-
And yes, MultiEpochsDataLoader is amazing: it shrank my model's training time from three days to 6 hours using only 8 workers. Imagine if I added more. It's very strange that such an amazing and polished library as PyTorch turned its gaze away from such a critical problem. I can't believe they were respawning all the workers and re-initializing the datasets in them for every epoch; it's unbelievable that they left it like that.
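For context, here is a minimal sketch of the MultiEpochsDataLoader pattern being discussed (the widely shared version originates in the timm repository). The trick is to create the worker iterator once and wrap the batch sampler in an infinitely repeating sampler, so workers are spawned a single time instead of once per epoch. The `_DataLoader__initialized` toggle works around DataLoader's attribute guard and is an implementation detail that may differ across PyTorch versions; treat this as a sketch, not canonical PyTorch API.

```python
import torch
import torch.utils.data


class _RepeatSampler:
    """Wraps a (batch) sampler and repeats it forever,
    so the underlying DataLoader iterator never exhausts."""

    def __init__(self, sampler):
        self.sampler = sampler

    def __iter__(self):
        while True:
            yield from iter(self.sampler)


class MultiEpochsDataLoader(torch.utils.data.DataLoader):
    """DataLoader that creates its worker iterator once and reuses it
    across epochs, instead of respawning workers every epoch."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Bypass DataLoader's __setattr__ guard to swap in the repeating sampler.
        self._DataLoader__initialized = False
        self.batch_sampler = _RepeatSampler(self.batch_sampler)
        self._DataLoader__initialized = True
        # Created once; subsequent epochs keep pulling from this iterator.
        self.iterator = super().__iter__()

    def __len__(self):
        # Length of one pass over the original (non-repeating) batch sampler.
        return len(self.batch_sampler.sampler)

    def __iter__(self):
        for _ in range(len(self)):
            yield next(self.iterator)


if __name__ == "__main__":
    ds = torch.utils.data.TensorDataset(torch.arange(10).float().unsqueeze(1))
    dl = MultiEpochsDataLoader(ds, batch_size=2, shuffle=False, num_workers=0)
    epoch1 = [batch[0].tolist() for batch in dl]  # one full epoch: 5 batches
    epoch2 = [batch[0].tolist() for batch in dl]  # second epoch reuses the iterator
```

Shown here with `num_workers=0` for brevity; the payoff is with `num_workers > 0`, where the worker processes (and any per-worker dataset state) survive across epochs.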
-
Hey everyone,
I've reused the MultiEpochsDataLoader because the normal DataLoader slows down at the start of every epoch, and it works just as desired.
Is there any reason why this isn't a standard PyTorch class? Are there any problems with this type of data loading?
Best regards,
Filos
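One thing worth checking before reaching for the workaround: since version 1.7, PyTorch's stock DataLoader has a `persistent_workers` flag that addresses the per-epoch worker respawn directly by keeping the worker processes alive between epochs (it requires `num_workers > 0`). A minimal sketch:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

ds = TensorDataset(torch.arange(100).float())

# persistent_workers=True keeps the worker processes alive across epochs,
# so the dataset is not re-initialized at the start of each one.
dl = DataLoader(ds, batch_size=10, num_workers=2, persistent_workers=True)

for epoch in range(2):  # workers are spawned once, on the first epoch
    for (batch,) in dl:
        pass  # training step would go here
```

Whether this fully matches MultiEpochsDataLoader's behavior may depend on your sampler and PyTorch version, but it is the built-in answer to the worker-respawn cost described above.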