Using `.cache()` (with the default memory cacher) does nothing when the Dataset is used in a multi-process DataLoader. This is a gotcha that should probably be pointed out in the documentation and the tutorial, as it is easy to overlook.

In my case I was dropping torchdata's cache into a program that already had its DataLoaders defined. The DataLoaders were initialized with a positive int for num_workers, and it took me a while to figure out why the cache didn't seem to work at all.
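The underlying mechanism can be reproduced without torchdata at all: a DataLoader with `num_workers > 0` pickles the dataset into each worker process, so any plain in-memory cache is filled in the workers' copies and the parent process never sees it. A minimal sketch (the `CachingDataset` class and `worker` helper are hypothetical stand-ins, not torchdata's API):

```python
import multiprocessing as mp

class CachingDataset:
    """Toy dataset with a naive in-memory cache (a plain dict),
    mimicking what a default memory cacher does."""
    def __init__(self, n=8):
        self.n = n
        self._cache = {}

    def __getitem__(self, idx):
        if idx not in self._cache:
            self._cache[idx] = idx * idx  # stand-in for an expensive load
        return self._cache[idx]

def worker(ds, indices):
    # Simulates a DataLoader worker: it receives a pickled *copy* of ds.
    for i in indices:
        ds[i]
    return len(ds._cache)

if __name__ == "__main__":
    ds = CachingDataset()
    with mp.Pool(1) as pool:
        cached_in_worker = pool.apply(worker, (ds, range(8)))
    print(cached_in_worker)   # 8 -- the worker's copy filled its cache
    print(len(ds._cache))     # 0 -- the parent's cache is untouched
```

With `num_workers=0` the cache lives in the main process and works as expected; with workers, every epoch starts from empty per-worker caches (and with default worker lifetimes, the copies are discarded when the loader iteration ends).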