Fix: Apply Uniform Attention Masks Explicitly #152
Description
This PR changes the attention base class implementation to assume explicitly that the attention mask is applied uniformly across a single sample. This ensures that the same parts of the input are masked out or attended to for every query. Without this assumption, the mask would need to be repeated for every query, incurring significant overhead. In the current state, all queries have access to all key/value pairs. While implementations utilizing a more efficient or distributed matching are possible, that approach is not pursued for now.
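As an illustration of the idea (not the actual code from this PR), the following NumPy sketch shows how a single per-sample key mask can be broadcast over the query axis of the attention scores, instead of materializing one mask per query. The function name `uniform_mask_attention` and the mask convention (boolean, True = attend) are assumptions for this example.

```python
import numpy as np

def uniform_mask_attention(q, k, v, key_mask):
    """Scaled dot-product attention with one mask shared by all queries.

    q: (num_queries, d); k, v: (num_keys, d)
    key_mask: (num_keys,) boolean; True = attend, False = mask out.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)  # (num_queries, num_keys)
    # Broadcast the single per-sample mask over the query axis instead of
    # repeating a (num_queries, num_keys) mask for every query.
    scores = np.where(key_mask[None, :], scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

q = np.random.randn(4, 8)
k = np.random.randn(6, 8)
v = np.random.randn(6, 8)
mask = np.array([True, True, True, False, False, True])
out = uniform_mask_attention(q, k, v, mask)  # shape (4, 8)
```

Because the mask is uniform across the sample, every query row sees the same set of masked keys, so a `(num_keys,)` vector suffices where a per-query mask would need `(num_queries, num_keys)` entries.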
Which issue does this PR tackle?
How does it solve the problem?
How are the changes tested?
Checklist for Contributors