Answered by PaParaZz1, Aug 15, 2024

You should use the `torchrun` command to enable multiple workers; here is the related doc in PyTorch.
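As a rough sketch of what launching multiple workers with `torchrun` looks like in practice (the script name `minimal_ddp_worker.py` and the worker body are placeholders for illustration, not taken from this thread):

```python
# minimal_ddp_worker.py -- hypothetical example script, launched with e.g.:
#   torchrun --nproc_per_node=4 minimal_ddp_worker.py
import os

import torch.distributed as dist


def main():
    # torchrun sets RANK, LOCAL_RANK, WORLD_SIZE, MASTER_ADDR and MASTER_PORT
    # in the environment; init_process_group picks them up via the default
    # env:// init method.
    dist.init_process_group(backend="gloo")  # use "nccl" on GPU machines
    rank = dist.get_rank()
    world_size = dist.get_world_size()
    print(f"worker {rank}/{world_size} started (local rank {os.environ['LOCAL_RANK']})")

    # ... per-worker training / collection logic goes here ...

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Each process started by `torchrun` runs the same script, so per-worker behavior is controlled through the rank obtained from `torch.distributed`.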