This repository has been archived by the owner on Dec 18, 2023. It is now read-only.

AffineTransformed returns all zeros as samples #180

Open
feynmanliang opened this issue May 20, 2020 · 0 comments

Issue Description

Applying an AffineTransform that shifts by +1 to a U[0, 0.1] distribution should produce a U[1, 1.1] distribution. However, after the transform, all samples produced by inference are 0.

The 0s are set at https://github.com/facebookincubator/beanmachine/blob/master/beanmachine/ppl/world/variable.py#L106. It looks like the distribution's .support is Interval(0, 0.1) before applying the transform but _Real after it.
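The change in reported support can be observed in PyTorch alone, without Bean Machine (a minimal sketch; the exact printed representation of the constraints may vary across torch versions):

```python
import torch.distributions as dist

base = dist.Uniform(0.0, 0.1)
transformed = dist.TransformedDistribution(
    base, dist.transforms.AffineTransform(loc=1, scale=1))

# The base distribution reports its true support as an interval constraint...
print(base.support)         # e.g. Interval(lower_bound=0.0, upper_bound=0.1)

# ...but the transformed distribution reports the AffineTransform's
# codomain (the whole real line), losing the interval information.
print(transformed.support)

# Sampling the transformed distribution directly still behaves correctly:
samples = transformed.sample((10,))
print(samples.min(), samples.max())  # all values lie in [1.0, 1.1]
```

This suggests the zeros come from how Bean Machine interprets the (now unbounded) support, not from the sampling path of the distribution itself.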

Steps to Reproduce

Happy path

import torch.distributions as dist
import beanmachine.ppl as bmp

distn = dist.Uniform(0.0, 0.1)

test_rv = bmp.random_variable(lambda: distn)
print(f"Directly sampling the distribution:\n {test_rv().function().sample_n(10)}")
mh = bmp.SingleSiteAncestralMetropolisHastings()
samples = mh.infer([test_rv()], [], 10, num_chains=1)
print(f"Using ancestral sampling:\n {samples.get_variable(test_rv())}")

Output

Directly sampling the distribution:
 tensor([0.0705, 0.0308, 0.0847, 0.0166, 0.0036, 0.0702, 0.0592, 0.0095, 0.0174,
        0.0169])
Using ancestral sampling:
 tensor([[0.0315, 0.0232, 0.0954, 0.0654, 0.0193, 0.0687, 0.0661, 0.0167, 0.0428,
         0.0200]], grad_fn=<SliceBackward>)

Sad path (the only difference is the AffineTransform)

import torch.distributions as dist
import beanmachine.ppl as bmp

distn = dist.TransformedDistribution(
        dist.Uniform(0.0, 0.1),
        dist.transforms.AffineTransform(loc=1, scale=1))

test_rv = bmp.random_variable(lambda: distn)
print(f"Directly sampling the distribution:\n {test_rv().function().sample_n(10)}")
mh = bmp.SingleSiteAncestralMetropolisHastings()
samples = mh.infer([test_rv()], [], 10, num_chains=1)
print(f"Using ancestral sampling:\n {samples.get_variable(test_rv())}")

Output

Directly sampling the distribution:
 tensor([1.0456, 1.0438, 1.0513, 1.0372, 1.0348, 1.0866, 1.0195, 1.0504, 1.0766,
        1.0444])
Using ancestral sampling:
 tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]], grad_fn=<SliceBackward>)

Expected Behavior

The second example should produce samples from U[1.0, 1.1].
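A possible workaround in the meantime (my assumption, not part of the original report): encode the shift directly in the Uniform's parameters instead of wrapping it in a TransformedDistribution, so .support remains an interval constraint that Bean Machine can handle.

```python
import torch.distributions as dist

# Workaround sketch: express U[1.0, 1.1] directly rather than as a
# shifted U[0.0, 0.1]; .support stays Interval(1.0, 1.1).
shifted = dist.Uniform(1.0, 1.1)
print(shifted.support)  # an interval constraint, not _Real
```

Passing this `shifted` distribution to `bmp.random_variable` as in the repro above should then sample in [1.0, 1.1], though this only helps for affine transforms that can be folded into the distribution's parameters.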
