
Dumping the entire sequence into GPU causes memory out error. #4

Open
nivesh48 opened this issue Sep 18, 2020 · 4 comments

Comments

@nivesh48

In def __getitem__(self, index) in dataset.py, you are stacking the entire train/test set of sequences and returning it. However, __getitem__ is iterated over and should return images with labels for a single index.

@fshamshirdar
Owner

__getitem__ is not stacking the entire train/test set of sequences. It returns just one sequence (of trajectory_length size) from either the train or the test set, not both. The "test" variable in the constructor indicates whether it is for testing or training. You may want to change trajectory_length to a smaller number (~5) so that it fits into your GPU.
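As a standalone sketch of the idea being described (the `WindowDataset` class name and the index-mapping loop are illustrative assumptions, not the repository's actual code), each dataset item is one window of `trajectory_length` consecutive frames taken from a single sequence:

```python
# Minimal sketch of a sliding-window dataset: each item is one window of
# trajectory_length consecutive frames from ONE sequence, never the whole
# train/test set. Names here are illustrative, not the repository's code.
from torch.utils.data import Dataset


class WindowDataset(Dataset):
    def __init__(self, sequence_lengths, trajectory_length=5):
        # sequence_lengths: number of frames in each KITTI-style sequence
        self.sequence_lengths = sequence_lengths
        self.trajectory_length = trajectory_length
        self.size = sum(sequence_lengths)

    def __len__(self):
        # A window can start at every frame except the last
        # trajectory_length frames of each sequence.
        return self.size - self.trajectory_length * len(self.sequence_lengths)

    def __getitem__(self, index):
        # Map the flat index to (sequence id, start frame inside sequence).
        for seq_id, length in enumerate(self.sequence_lengths):
            windows = length - self.trajectory_length
            if index < windows:
                frames = list(range(index, index + self.trajectory_length))
                return seq_id, frames  # real code would load/stack the images
            index -= windows
        raise IndexError("index out of range")
```

With this structure, shrinking trajectory_length directly shrinks the number of frames loaded per item, which is why it helps with GPU memory.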

@nivesh48
Author

For instance,

def __getitem__(self, index):
      ...............
      index = index - (size-self.trajectory_length)
      ................
      return imgstack, poses

Let's take trajectory_length = 5:

def __len__(self):
      return self.size - (self.trajectory_length * len(self.sequences))

seq = ["00", "01"]
size = 4540 + 1100 = 5640
def __len__() returns 5640 - 5*2 = 5630
def __getitem__(index): where the largest index = 5630 - 1 = 5629 (PyTorch automatically assigns indices from 0 to len-1)

and for i in range(index, index+self.trajectory_length) will return 5 stacked images 5630 times, and the images stacked for each sample overlap with those of the previous one.
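The overlap described here can be seen with two consecutive indices (an illustration only, not the repository's code):

```python
# Two consecutive dataset indices produce windows sharing 4 of their
# 5 frames, assuming trajectory_length = 5.
trajectory_length = 5
window_a = list(range(0, trajectory_length))      # window at index 0: frames 0..4
window_b = list(range(1, 1 + trajectory_length))  # window at index 1: frames 1..5
shared = sorted(set(window_a) & set(window_b))    # frames both windows contain
```

This overlap between neighbouring samples is expected for a sliding-window dataset; it is separate from whether any single item exhausts GPU memory.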

@fshamshirdar
Owner

fshamshirdar commented Sep 18, 2020

This is a different question.

Alright, considering your example: sequences of 4540 and 1100 frames. Given index 5629, the for loop iterates only 5 times (given trajectory_length is 5), meaning that it will select images 1094 to 1098 from the second sequence.

Taking the len function into account, the second part of the equation, (trajectory_length * len(sequences)), offsets the data so that there are no out-of-range indices.
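The arithmetic behind this claim can be checked directly (a standalone sketch using the example numbers from this thread; the variable names are illustrative):

```python
# Checking the __len__ arithmetic for the example sequences "00" and "01"
# (4540 and 1100 frames), with trajectory_length = 5.
seq_lengths = [4540, 1100]
trajectory_length = 5

size = sum(seq_lengths)                                  # 5640 frames total
n_items = size - trajectory_length * len(seq_lengths)    # 5630 dataset items
last_index = n_items - 1                                 # 5629

# Offsetting past the windows of the first sequence maps the last flat
# index to a start frame inside the second sequence.
start = last_index - (seq_lengths[0] - trajectory_length)  # frame 1094
end = start + trajectory_length - 1                        # frame 1098
in_range = end < seq_lengths[1]                            # no out-of-range access
```

So the last window covers frames 1094 to 1098 of the 1100-frame sequence and stays in range, which is what the offset term is for.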

@nivesh48
Author

You are not getting exactly what I am trying to convey.

When inheriting from torch.utils.data.Dataset, __getitem__ is called for every index from 0 to len-1. Since your len function returns size - (trajectory_length * len(sequences)), your __getitem__ will return a pair of 5 stacked images and 5 poses that many times. I would suggest you roughly work through the math of your loop on paper once. You can kindly close this issue now.
