
[Bug] MultiTaskGP without data #2360

Open
AdrianSosic opened this issue Jun 4, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@AdrianSosic
Contributor

AdrianSosic commented Jun 4, 2024

🐛 Bug

Evaluation of a MultiTaskGP fails when no training data is available. The same situation works without problems for a SingleTaskGP, which conceptually should simply return values from the prior. The MultiTaskGP, however, crashes with a ZeroDivisionError.

To reproduce

SingleTaskGP

import torch
from botorch.models.gp_regression import SingleTaskGP

N_train = 0
N_test = 10

train_X = torch.rand(N_train, 2, dtype=torch.float64)
train_Y = torch.sin(train_X).sum(dim=1, keepdim=True)

test_X = torch.rand(N_test, 2, dtype=torch.float64)

model = SingleTaskGP(train_X, train_Y)
model.posterior(test_X).mean

MultiTaskGP

import torch
from botorch.models.multitask import MultiTaskGP

N_train = 0
N_test = 10

train_X1, train_X2 = torch.rand(N_train, 2), torch.rand(N_train, 2)
i1, i2 = torch.zeros(N_train, 1), torch.ones(N_train, 1)
train_X = torch.cat(
    [
        torch.cat([train_X1, i1], -1),
        torch.cat([train_X2, i2], -1),
    ]
)
train_Y = torch.randn(train_X.shape[0], 1)

test_X1, test_X2 = torch.rand(N_test, 2), torch.rand(N_test, 2)
i1, i2 = torch.zeros(N_test, 1), torch.ones(N_test, 1)
test_X = torch.cat(
    [
        torch.cat([test_X1, i1], -1),
        torch.cat([test_X2, i2], -1),
    ]
)

model = MultiTaskGP(train_X, train_Y, task_feature=-1)
model.posterior(test_X).mean

Stack trace/error message

ZeroDivisionError: integer division or modulo by zero

Expected Behavior

Just like the SingleTaskGP, the MultiTaskGP should return the prior values.

System information

Please complete the following information:

  • BoTorch Version: 0.11.0
  • GPyTorch Version: 1.11
  • PyTorch Version: 2.3.0
  • OS: macOS
AdrianSosic added the bug label Jun 4, 2024
@sdaulton
Contributor

sdaulton commented Jun 5, 2024

Thanks for flagging this. This appears to be an issue in IndexKernel: specifically, InterpolatedLinearOperator doesn't work when left_interp_indices and right_interp_indices are empty tensors. Would you mind opening an issue in gpytorch?

@AdrianSosic
Contributor Author

Sure thing, done! Thanks for the help 🥇
