This repository has been archived by the owner on Dec 14, 2023. It is now read-only.

Enabling Multi-GPU training #73

Open
julkaztwittera opened this issue Jun 12, 2023 · 3 comments
Labels
enhancement New feature or request

Comments

@julkaztwittera

How do I enable multi-GPU training? No matter how many GPUs I use, only one process starts.

@ExponentialML ExponentialML added the enhancement New feature or request label Jun 25, 2023
@maximepeabody
Contributor

I'm also curious about this

@Splendon

Is there any progress?

@zyshin

zyshin commented Sep 15, 2023

FYI, since the training code is wrapped in Accelerator, I just launch training with multiple GPUs and it seems to work. Also, make sure that some functions are called only once, in the main process, by checking `accelerator.is_main_process` (e.g., `create_output_folders`, `save_pipe`, and the evaluation part).
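
For illustration, here is a minimal sketch of that guard pattern. The launch command, script name `train.py`, output path, and the stand-in `create_output_folders` below are placeholders based on this comment, not the repo's actual CLI or function signatures:

```python
# Launch with multiple processes, e.g.:
#   accelerate launch --num_processes 2 train.py --config my_config.yaml
# (script name and flags are placeholders; use the repo's actual entry point)
import os

from accelerate import Accelerator

accelerator = Accelerator()


def create_output_folders(output_dir: str) -> str:
    # Stand-in for the repo's create_output_folders; the real function
    # likely takes different arguments.
    os.makedirs(output_dir, exist_ok=True)
    return output_dir


# One-time side effects (folder creation, saving the pipeline, evaluation)
# should run only on the main process so the extra GPU workers don't race
# or duplicate the work.
if accelerator.is_main_process:
    output_dir = create_output_folders("./outputs/train_run")

# Make every process wait until the folder exists before continuing.
accelerator.wait_for_everyone()

# ... training loop goes here; checkpointing and evaluation would use the
# same `if accelerator.is_main_process:` guard around save_pipe and the
# evaluation code mentioned above ...
```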
