Commit 611a688

Minor fixes for distributed tutorials (pytorch#1978)
1 parent 6c3b79d commit 611a688

2 files changed: +2 -2 lines changed


beginner_source/dist_overview.rst

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ three main components:
 after the backward pass instead of using DDP to communicate gradients. This can
 decouple communications from computations and allow finer-grain control over
 what to communicate, but on the other hand, it also gives up the performance
-optimizations offered by DDP. The
+optimizations offered by DDP.
 `Writing Distributed Applications with PyTorch <../intermediate/dist_tuto.html>`__
 shows examples of using c10d communication APIs.

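The passage touched by this hunk describes communicating gradients yourself with c10d collectives after the backward pass instead of relying on DDP. As background only, a minimal sketch of that pattern; the helper name, the assumption that a process group is already initialized, and the average-by-world-size step are illustrative choices, not code from the tutorial.

.. code:: python

    import torch
    import torch.distributed as dist

    def average_gradients(model: torch.nn.Module) -> None:
        # Assumes dist.init_process_group(...) has already been called and
        # that loss.backward() has populated param.grad on every rank.
        world_size = dist.get_world_size()
        for param in model.parameters():
            if param.grad is not None:
                # Sum gradients across all ranks, then divide to average them.
                dist.all_reduce(param.grad, op=dist.ReduceOp.SUM)
                param.grad /= world_size

Calling such a helper between loss.backward() and optimizer.step() approximates DDP's gradient averaging by hand, without the communication/computation overlap optimizations the paragraph mentions giving up.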
intermediate_source/dist_tuto.rst

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@ with your local sysadmin or use your favorite coordination tool (e.g.,
 `pdsh <https://linux.die.net/man/1/pdsh>`__,
 `clustershell <https://cea-hpc.github.io/clustershell/>`__, or
 `others <https://slurm.schedmd.com/>`__). For the purpose of this
-tutorial, we will use a single machine and fork multiple processes using
+tutorial, we will use a single machine and spawn multiple processes using
 the following template.
 
 .. code:: python

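The template the tutorial refers to is not included in this hunk. As a rough sketch only of what spawning multiple processes on a single machine can look like (the gloo backend, localhost rendezvous address, port 29500, and world size of 2 are placeholder assumptions, not the tutorial's actual template):

.. code:: python

    import os

    import torch.distributed as dist
    import torch.multiprocessing as mp

    def run(rank: int, world_size: int) -> None:
        # Per-process work goes here (point-to-point or collective calls).
        print(f"Hello from rank {rank} of {world_size}")

    def init_process(rank: int, world_size: int) -> None:
        # Every process must agree on how to reach rank 0 for rendezvous.
        os.environ["MASTER_ADDR"] = "127.0.0.1"
        os.environ["MASTER_PORT"] = "29500"
        dist.init_process_group("gloo", rank=rank, world_size=world_size)
        run(rank, world_size)
        dist.destroy_process_group()

    if __name__ == "__main__":
        world_size = 2
        # mp.spawn starts world_size processes, calling init_process(rank, world_size) in each.
        mp.spawn(init_process, args=(world_size,), nprocs=world_size)

The wording change from "fork" to "spawn" matches this usage: torch.multiprocessing.spawn starts fresh interpreter processes rather than forking the parent.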