
Evaluation loses temporal information #19

Open
harrystuart opened this issue Jun 25, 2021 · 1 comment

Comments

@harrystuart

Hey guys,

Thanks for the repo. When I am evaluating the model, input_sentence_i does not seem to be influenced by input_sentence_i-1. Is it possible to run the model in inference mode so that it retains memory of the previous sentences you entered?

Thanks

@bryanlimy
Owner

bryanlimy commented Jun 25, 2021

Hi, the vanilla Transformer (which this repository implements) does not incorporate information across data points, so the closest thing you could do with this model is to combine multiple input sentences and treat them as a single input. Recent work by Kossen et al. attempts to apply self-attention across data points, you might want to have a look!
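
A minimal sketch of the "combine multiple sentences into one input" idea is below. It is not the repository's actual API: the `model_predict` callable, the `<sep>` separator, and the history size are all assumptions, standing in for whatever single-sentence inference helper the repo exposes.

```python
# Sketch only: keep a rolling history of previous inputs and prepend it to the
# current sentence, so the Transformer sees earlier turns as part of a single
# input sequence. Names here are hypothetical, not the repository's API.

MAX_HISTORY = 3          # hypothetical: how many previous sentences to keep
SEPARATOR = " <sep> "    # hypothetical separator token between sentences

history = []

def predict_with_context(model_predict, sentence):
    """Concatenate recent sentences with the new one and run inference.

    `model_predict` is assumed to be whatever single-sentence inference
    function the repository provides (e.g. a `predict(sentence)` helper).
    """
    combined = SEPARATOR.join(history + [sentence])
    reply = model_predict(combined)
    history.append(sentence)
    if len(history) > MAX_HISTORY:
        history.pop(0)   # drop the oldest sentence to bound the input length
    return reply
```

Note that the combined string still has to fit within the model's maximum sequence length, so the history window has to stay small.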
