enable option to disable pin_memory in pytorch #239

Merged 2 commits on Oct 30, 2024
4 changes: 2 additions & 2 deletions dlio_benchmark/data_loader/torch_data_loader.py
@@ -143,7 +143,7 @@ def read(self):
             batch_size=self.batch_size,
             sampler=sampler,
             num_workers=self._args.read_threads,
-            pin_memory=True,
+            pin_memory=self._args.pin_memory,
             drop_last=True,
             worker_init_fn=dataset.worker_init,
             **kwargs)
@@ -152,7 +152,7 @@ def read(self):
             batch_size=self.batch_size,
             sampler=sampler,
             num_workers=self._args.read_threads,
-            pin_memory=True,
+            pin_memory=self._args.pin_memory,
             drop_last=True,
             worker_init_fn=dataset.worker_init,
             **kwargs)  # 2 is the default value
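The change above replaces a hard-coded `pin_memory=True` with the configured value. Pinning allocates page-locked host buffers so host-to-GPU copies can run asynchronously, but it costs host RAM and setup time, which is pure overhead on CPU-only nodes. A minimal sketch of the pass-through, using hypothetical stand-ins for the benchmark's argument object (the real code hands these kwargs to `torch.utils.data.DataLoader`):

```python
# Hypothetical stand-in for the parsed configuration object (self._args).
class Args:
    def __init__(self, read_threads=1, pin_memory=True):
        self.read_threads = read_threads
        self.pin_memory = pin_memory

def build_loader_kwargs(args, batch_size):
    # Mirrors the call sites in torch_data_loader.py: pin_memory is no
    # longer hard-coded to True but taken from the configuration.
    return dict(
        batch_size=batch_size,
        num_workers=args.read_threads,
        pin_memory=args.pin_memory,
        drop_last=True,
    )

print(build_loader_kwargs(Args(pin_memory=False), 8)["pin_memory"])  # False
```

Passing `Args(pin_memory=False)` lets a CPU-only run skip the page-locked allocation entirely, while the default preserves the previous behavior.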
3 changes: 3 additions & 0 deletions dlio_benchmark/utils/config.py
@@ -118,6 +118,7 @@ class ConfigArguments:
     data_loader_sampler: DataLoaderSampler = None
     reader_classname: str = None
     multiprocessing_context: str = "fork"
+    pin_memory: bool = True

     # derived fields
     required_samples: int = 1
@@ -521,6 +522,8 @@ def LoadConfig(args, config):
             args.preprocess_time = reader['preprocess_time']
         if 'preprocess_time_stdev' in reader:
             args.preprocess_time_stdev = reader['preprocess_time_stdev']
+        if 'pin_memory' in reader:
+            args.pin_memory = reader['pin_memory']

     # training relevant setting
     if 'train' in config:
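The config change follows the pattern used for the other reader keys: a defaulted field on `ConfigArguments`, overridden only when the key appears in the parsed `reader` section. A simplified sketch of that logic (a stand-in for `dlio_benchmark/utils/config.py`, not the full class):

```python
class ConfigArguments:
    # Default matches the previously hard-coded behavior, so existing
    # configs that omit the key are unaffected.
    pin_memory: bool = True

def load_reader_section(args, reader):
    # Override the default only when the key is present in the
    # reader section of the parsed config.
    if 'pin_memory' in reader:
        args.pin_memory = reader['pin_memory']
    return args

print(load_reader_section(ConfigArguments(), {'pin_memory': False}).pin_memory)  # False
print(load_reader_section(ConfigArguments(), {}).pin_memory)  # True
```

Guarding on key presence rather than reading unconditionally keeps the dataclass default as the single source of truth for the fallback value.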
3 changes: 3 additions & 0 deletions docs/source/config.rst
@@ -201,6 +201,9 @@ reader
    * - read_threads*
      - 1
      - number of threads to load the data (for tensorflow and pytorch data loader)
+   * - pin_memory
+     - True
+     - whether to pin the memory for pytorch data loader
    * - computation_threads
      - 1
      - number of threads to preprocess the data
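With the key documented, disabling pinning from a workload config might look like the fragment below (a hypothetical sketch; only `read_threads` and `pin_memory` are taken from this PR, the other keys are illustrative):

```yaml
reader:
  read_threads: 4
  pin_memory: false   # skip page-locked staging buffers, e.g. on CPU-only nodes
```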