
Commit 3f53682
Apply isort and black reformatting
Signed-off-by: jomitchellnv <[email protected]>
jomitchellnv committed Jul 8, 2024
1 parent 4dfbe36 commit 3f53682
Showing 1 changed file with 5 additions and 4 deletions.
9 changes: 5 additions & 4 deletions nemo/lightning/megatron_parallel.py
@@ -62,9 +62,10 @@ def default_data_step(dataloader_iter: Iterator[DataT]) -> DataT:
         DataT: The data moved to the device.
     """
     if parallel_state.get_context_parallel_world_size() > 1:
-        raise ValueError("Default data step is being used in a context parallel environment."
-            "Please define your own data step that appropriately slices the data for context parallel."
-        )
+        raise ValueError(
+            "Default data step is being used in a context parallel environment."
+            "Please define your own data step that appropriately slices the data for context parallel."
+        )
 
     match next(dataloader_iter):
         # If its wrapped in a tuple, unpack it.
@@ -74,7 +75,7 @@ def default_data_step(dataloader_iter: Iterator[DataT]) -> DataT:
         case batch:
             pass
 
-    return move_data_to_device(batch, torch.cuda.current_device())
+    return move_data_to_device(batch, torch.cuda.current_device())

Check failure: Code scanning / CodeQL
Potentially uninitialized local variable (Error)
Local variable 'batch' may be used before it is initialized.


 def default_forward_step(model: nn.Module, batch, *args, **kwargs) -> torch.Tensor:
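
A note on the scanning failure above: 'case batch:' is a bare capture pattern, which matches any subject and always binds the name, so 'batch' is guaranteed to be assigned by the time the 'return' executes. The alert is a false positive of CodeQL's definite-assignment analysis. Below is a minimal sketch of one way to quiet it, using a hypothetical pre-binding that is not part of this commit:

from typing import Any, Iterator

def data_step_sketch(dataloader_iter: Iterator[Any]) -> Any:
    batch = None  # hypothetical pre-binding, purely to satisfy static analysis
    match next(dataloader_iter):
        case (inner, *_):  # sequence-wrapped batch: keep the first element
            batch = inner
        case batch:  # bare capture pattern: matches anything, always binds
            pass
    return batch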

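Separately, the two string literals inside the ValueError are adjacent with no separator, both before and after this reformatting, so the raised message reads "...context parallel environment.Please define...". Implicit concatenation joins literals verbatim; a trailing space on the first literal would fix the rendering, along these lines:

# Adjacent string literals concatenate with no space inserted, so the first
# literal needs to end with one for the message to read cleanly:
msg = (
    "Default data step is being used in a context parallel environment. "
    "Please define your own data step that appropriately slices the data "
    "for context parallel."
)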

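Finally, for readers who hit the new ValueError: the custom data step it asks for slices each batch along the sequence dimension across context-parallel ranks. The sketch below is a naive contiguous split under assumed conventions (dict-of-tensors batches laid out as [batch, seq, ...], sequence length divisible by the CP size; the function name is hypothetical), not NeMo's implementation:

import torch
from typing import Any, Dict, Iterator

from megatron.core import parallel_state  # same module the default step queries


def cp_slicing_data_step(dataloader_iter: Iterator[Any]) -> Dict[str, torch.Tensor]:
    """Hypothetical data step that shards the sequence dimension across CP ranks."""
    batch = next(dataloader_iter)
    if isinstance(batch, tuple):  # some iterators wrap the batch in a tuple
        batch = batch[0]

    cp_size = parallel_state.get_context_parallel_world_size()
    if cp_size > 1:
        cp_rank = parallel_state.get_context_parallel_rank()
        sliced = {}
        for key, tensor in batch.items():
            chunk = tensor.size(1) // cp_size  # assumes seq length divides evenly
            sliced[key] = tensor[:, cp_rank * chunk : (cp_rank + 1) * chunk]
        batch = sliced

    device = torch.cuda.current_device()
    return {key: tensor.to(device, non_blocking=True) for key, tensor in batch.items()}

Production context-parallel attention usually splits the sequence into 2 * cp_size chunks and pairs them so causal-mask work is balanced across ranks; treat the contiguous split above as illustrative only.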