Issue with Validation depth data processing #244

Open
KoushikSamudrala opened this issue Jan 26, 2023 · 0 comments

Comments

@KoushikSamudrala

Hello everyone,
I am trying to fine-tune the PackNet-SfM checkpoint on my custom image data. I use the depth maps from an RGB-D camera as ground truth and feed the RGB images for training. To plot a validation loss curve, I set aside part of the RGB data for evaluation in the config file and pointed the depth path at the folder of depth images. The model recognizes the training data and reports the training loss, but during validation it fails while interpolating batch['depth'] to the size of depth_pp and throws the following error:
[screenshot: validation error traceback]
Has anyone faced this error before, and what can be done to avoid it?
I'm using Colab to run the scripts and here is what my config .yaml file looks like:
[screenshot: config .yaml file]
I modified the scripts modelwrapper.py and Image_dataset.py so that the depth images are included in the batch dictionary.
I also tried reducing batch_size to avoid the error, but it persists even with a batch size of 1. Any help in this regard is appreciated.
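Not the repo's official fix, but a minimal sketch of one way to handle this kind of size mismatch: explicitly resize the ground-truth depth to the spatial size of the post-processed prediction before computing validation metrics. The function name `align_depth` is hypothetical; `depth_pp` and `batch['depth']` follow the names in the traceback, and tensors are assumed to be PackNet-SfM-style `(B, 1, H, W)` (or `(B, H, W)`) float tensors.

```python
# Hypothetical helper, assuming (B, 1, H, W) or (B, H, W) depth tensors.
import torch
import torch.nn.functional as F

def align_depth(gt_depth: torch.Tensor, depth_pp: torch.Tensor) -> torch.Tensor:
    """Resize gt_depth to match depth_pp's spatial size (H, W).

    Nearest-neighbor interpolation is used so resizing does not blend
    valid depth values with invalid (zero) pixels.
    """
    if gt_depth.dim() == 3:                      # (B, H, W) -> (B, 1, H, W)
        gt_depth = gt_depth.unsqueeze(1)
    if gt_depth.shape[-2:] != depth_pp.shape[-2:]:
        gt_depth = F.interpolate(gt_depth,
                                 size=depth_pp.shape[-2:],
                                 mode="nearest")
    return gt_depth

# Usage in the validation step, before the metric computation:
#   gt = align_depth(batch['depth'], depth_pp)
```

Note that F.interpolate requires a 4D `(B, C, H, W)` input for 2D resizing, which is why the 3D case is unsqueezed first; a mismatch there (e.g. passing `(B, H, W)` directly) produces errors like the one in the screenshot.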
