Attention Transformer #9

Open · leonlenk opened this issue Jul 5, 2024 · 0 comments
leonlenk commented Jul 5, 2024

  • Create a transformer with only attention (no dense layers); see the sketch below for one possible shape of a block
  • Discuss the model's depth and its limitations
  • The notebook should also go over how layernorm works
  • You can reuse what has been built previously (embeddings, utils, attention)
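
For anyone picking this up, a rough sketch of what an attention-only block could look like (not prescriptive: it assumes PyTorch and uses `torch.nn.MultiheadAttention` as a stand-in for the attention module from the earlier notebooks), plus layernorm written out by hand for the explanation part:

```python
# Assumptions: PyTorch, pre-norm residual blocks, and torch.nn.MultiheadAttention
# standing in for the repo's own attention implementation.
import torch
import torch.nn as nn


class AttentionOnlyBlock(nn.Module):
    """One transformer block with layernorm + self-attention only (no dense/MLP sublayer)."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pre-norm residual: normalize, self-attend, add the result back to the input.
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        return x + attn_out


class AttentionOnlyTransformer(nn.Module):
    """A stack of attention-only blocks; depth is the knob the notebook should discuss."""

    def __init__(self, embed_dim: int, num_heads: int, depth: int):
        super().__init__()
        self.blocks = nn.ModuleList(
            AttentionOnlyBlock(embed_dim, num_heads) for _ in range(depth)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = block(x)
        return x


def layernorm(x: torch.Tensor, gamma: torch.Tensor, beta: torch.Tensor,
              eps: float = 1e-5) -> torch.Tensor:
    """What nn.LayerNorm computes: normalize each token over the feature dimension,
    then apply a learned scale (gamma) and shift (beta)."""
    mean = x.mean(dim=-1, keepdim=True)
    var = x.var(dim=-1, unbiased=False, keepdim=True)
    return gamma * (x - mean) / torch.sqrt(var + eps) + beta


if __name__ == "__main__":
    x = torch.randn(2, 16, 64)  # (batch, seq_len, embed_dim)
    model = AttentionOnlyTransformer(embed_dim=64, num_heads=4, depth=3)
    print(model(x).shape)  # torch.Size([2, 16, 64])
```

Pre-norm is used here only so the residual stream stays unnormalized between blocks; the notebook could contrast it with post-norm when discussing how deep an attention-only stack can usefully go.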
leonlenk added the week6 label Jul 5, 2024
Bkwan27 self-assigned this Sep 26, 2024