[BUG] Fix bug with drop_last when mod is 0 (#136)
Closes #130. Will be accompanied by a PR to `cugraph` that reports how many batches reached each worker, so it is obvious when there are more GPUs than batches.

Resolves a bug where `drop_last` deleted all of the input whenever the input length modulo the batch size was 0.
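The root cause is Python's negative-slice semantics: when the remainder `d` is 0, `perm[:-0]` is equivalent to `perm[:0]`, which is empty. A minimal sketch of the bug and the fix, using a plain Python list in place of the torch tensor `perm` (the names and batch size here are illustrative):

```python
# 8 inputs with batch_size 4: the input divides evenly, so d == 0.
perm = list(range(8))
batch_size = 4
d = len(perm) % batch_size

# Buggy behavior: perm[:-0] == perm[:0] == [] -- every input is dropped.
buggy = perm[:-d]
assert buggy == []

# Fixed behavior: only trim the tail when there is a remainder to drop.
trimmed = perm[:-d] if d > 0 else perm
assert trimmed == perm
```

The same guard applies unchanged to a `torch.Tensor`, since tensor slicing follows the same `[:-0]` convention.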

Authors:
  - Alex Barghi (https://github.com/alexbarghi-nv)

Approvers:
  - Tingyu Wang (https://github.com/tingyu66)

URL: #136
alexbarghi-nv authored Feb 6, 2025
1 parent 8bf2012 commit 5baac8b
Showing 2 changed files with 4 additions and 2 deletions.
3 changes: 2 additions & 1 deletion python/cugraph-pyg/cugraph_pyg/loader/link_loader.py
@@ -186,7 +186,8 @@ def __iter__(self):

         if self.__drop_last:
             d = perm.numel() % self.__batch_size
-            perm = perm[:-d]
+            if d > 0:
+                perm = perm[:-d]

         input_data = torch_geometric.sampler.EdgeSamplerInput(
             input_id=self.__input_data.input_id[perm],
3 changes: 2 additions & 1 deletion python/cugraph-pyg/cugraph_pyg/loader/node_loader.py
@@ -137,7 +137,8 @@ def __iter__(self):

         if self.__drop_last:
             d = perm.numel() % self.__batch_size
-            perm = perm[:-d]
+            if d > 0:
+                perm = perm[:-d]

         input_data = torch_geometric.sampler.NodeSamplerInput(
             input_id=self.__input_data.input_id[perm],