Touchups
fchollet committed Jun 30, 2023
1 parent 107d5a1 commit 2a367f0
Showing 1 changed file with 1 addition and 2 deletions.
guides/distributed_training_with_torch.py: 3 changes (1 addition, 2 deletions)
@@ -27,8 +27,7 @@
 single-device training.
 
 Specifically, this guide teaches you how to use PyTorch's `DistributedDataParallel`
-module wrapper to train Keras
-models on multiple GPUs, with minimal changes to your code,
+module wrapper to train Keras, with minimal changes to your code,
 on multiple GPUs (typically 2 to 16) installed on a single machine (single host,
 multi-device training). This is the most common setup for researchers and small-scale
 industry workflows.
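For context, the passage edited above is about wrapping a Keras model in PyTorch's `DistributedDataParallel` module for single-host, multi-device training. The sketch below illustrates that setup under stated assumptions: Keras 3 on the torch backend (where a Keras model is a `torch.nn.Module`), NCCL-capable GPUs on one machine, and a placeholder model, batch, and rendezvous port that are illustrative, not code from the guide.

# A minimal sketch, not the guide's actual code: one process per GPU,
# each wrapping the same Keras model in DistributedDataParallel.
import os

os.environ["KERAS_BACKEND"] = "torch"

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import keras


def train(rank, world_size):
    # One process per GPU: pin this process to its device and join the
    # process group. The rendezvous address and port are arbitrary choices.
    os.environ["MASTER_ADDR"] = "localhost"
    os.environ["MASTER_PORT"] = "12355"
    torch.cuda.set_device(rank)
    dist.init_process_group("nccl", rank=rank, world_size=world_size)

    # On the torch backend, a Keras model is a torch.nn.Module, so it can
    # be wrapped by DistributedDataParallel like any other module.
    model = keras.Sequential(
        [
            keras.Input(shape=(100,)),
            keras.layers.Dense(64, activation="relu"),
            keras.layers.Dense(10),
        ]
    ).to(rank)
    ddp_model = torch.nn.parallel.DistributedDataParallel(
        model, device_ids=[rank]
    )

    optimizer = torch.optim.Adam(ddp_model.parameters(), lr=1e-3)
    loss_fn = torch.nn.CrossEntropyLoss()

    # Placeholder batch; a real run would use a DataLoader with a
    # DistributedSampler so each replica sees a different data shard.
    x = torch.randn(32, 100).to(rank)
    y = torch.randint(0, 10, (32,)).to(rank)

    optimizer.zero_grad()
    loss = loss_fn(ddp_model(x), y)
    loss.backward()  # DDP all-reduces gradients across replicas here
    optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    mp.spawn(train, args=(world_size,), nprocs=world_size)

Because `DistributedDataParallel` all-reduces gradients during `backward()`, every replica applies the same update each step, which is the synchronous data parallelism the guide's introduction describes.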
