Dumping the entire sequence into the GPU causes an out-of-memory error. #4
Comments
__getitem__ is not stacking the entire train/test set sequences. It returns just one window (of trajectory_length frames) from either the train or the test set, not both. The "test" variable in the constructor indicates whether the dataset is for testing or training. You may want to reduce trajectory_length to a smaller number (~5) so that it fits into your GPU.
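A minimal sketch of that per-window __getitem__ behaviour (the class and variable names here are illustrative, not the repository's actual dataset.py):

```python
import torch
from torch.utils.data import Dataset

class TrajectoryDataset(Dataset):
    """Sketch: each item is ONE window of trajectory_length consecutive
    frames and poses, never the whole train/test split at once."""

    def __init__(self, images, poses, trajectory_length=5, test=False):
        # `images` / `poses` are assumed to already belong to a single
        # split (train or test), selected via the `test` flag upstream.
        self.images = images              # e.g. list of CxHxW tensors
        self.poses = poses                # e.g. list of pose tensors
        self.trajectory_length = trajectory_length
        self.test = test

    def __len__(self):
        # Leave room for a full window at the last valid start index.
        return len(self.images) - self.trajectory_length

    def __getitem__(self, index):
        # Stack only trajectory_length frames starting at `index`.
        frames = torch.stack(
            [self.images[index + i] for i in range(self.trajectory_length)])
        targets = torch.stack(
            [self.poses[index + i] for i in range(self.trajectory_length)])
        return frames, targets
```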
For instance, let's take trajectory_length = 5 and seq = ["00", "01"], with 4540 and 1100 images respectively, and index 5529.
This is a different question. Alright, considering your example of two sequences with 4540 and 1100 images: given index 5529, the for loop iterates only 5 times, from 5529 up to 5534 (since trajectory_length is 5), meaning it selects images 1094 to 1099 of the second sequence. Taking the len function into account, the second term of the equation, trajectory_len * len(sequences), offsets the length so that there are no overlaps and no index-out-of-range errors.
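As a rough numeric check of that offset, using the sequence sizes from the example above (variable names are only illustrative):

```python
trajectory_len = 5
sequence_sizes = [4540, 1100]          # frames in sequences "00" and "01"
total_frames = sum(sequence_sizes)     # 5640

# __len__ subtracts trajectory_len once per sequence, so no start index
# can produce a window that runs past the end of the data.
dataset_len = total_frames - trajectory_len * len(sequence_sizes)  # 5630
# valid start indices are therefore 0 .. dataset_len - 1
```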
You are not getting what I am trying to convey. When inheriting from torch.utils.data.Dataset, __getitem__ is called for each index from 0 to len-1. Since your len function returns size - (trajectory_len * len(sequences)), your __getitem__ will return a pair of 5 stacked images and 5 poses for each of those indices. I would suggest you quickly work through the math of your loop on paper. You can kindly close this issue now.
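For completeness, a small usage sketch of how a DataLoader would drive that __getitem__, assuming the TrajectoryDataset sketch above (`images` and `poses` are placeholders for an already-loaded split):

```python
from torch.utils.data import DataLoader

# The loader asks for indices 0 .. len(dataset) - 1; each item is one pair of
# 5 stacked frames and 5 poses, so the GPU only ever sees small windows.
dataset = TrajectoryDataset(images, poses, trajectory_length=5, test=False)
loader = DataLoader(dataset, batch_size=4, shuffle=True)

for frames, targets in loader:
    # frames: (4, 5, C, H, W), targets: (4, 5, pose_dim) under the sketch above
    frames = frames.cuda()             # move one small batch at a time
    targets = targets.cuda()
```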
In def __getitem__(self, index) in dataset.py, you are stacking the entire train and test set sequences and returning them. However, __getitem__ is iterated over and should return images with labels for a single index.