Support Single-Device Training#64

Open
NatureGeorge wants to merge 2 commits into KellerJordan:master from NatureGeorge:fix_dist
Conversation

@NatureGeorge
Previously, Muon assumed distributed training by default and directly called torch.distributed.get_world_size() and torch.distributed.get_rank().

This pull request introduces a simple fix using torch.distributed.is_available() and torch.distributed.is_initialized().
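A minimal sketch of the guard this fix describes (the helper name `get_dist_info` is illustrative, not taken from the PR): when the process group is unavailable or uninitialized, fall back to single-device defaults of rank 0 and world size 1.

```python
import torch.distributed as dist

def get_dist_info():
    # Query rank/world size only when torch.distributed is both compiled in
    # and the process group has actually been initialized; otherwise assume
    # single-device training (rank 0, world size 1).
    if dist.is_available() and dist.is_initialized():
        return dist.get_rank(), dist.get_world_size()
    return 0, 1
```

With this guard, the same optimizer code path works under both `torchrun` launches and plain single-process runs without any distributed setup.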

