Explicit Memory #8

Open
ClashLuke opened this issue Apr 30, 2022 · 0 comments
Labels
core: Improves core model while keeping core idea intact
ML: Requires machine-learning knowledge (can be built up on the fly)
research: Creative project that might fail but could give high returns

Comments

ClashLuke commented Apr 30, 2022

Many modern architectures, such as Memorizing Transformers, RETRO, and PKM, have an explicit memory from which the model can retrieve information and, optionally, into which it can store new information. Some hypothesise that Mixture-of-Experts models embed fuzzy representations of books and other material they must memorise directly into their weights.
That's why adding an explicit memory to our models could give them a considerable boost in performance. Instead of storing this information in dense layers, where the weights have to trade off between representing concepts and memorising sequences, our model would be able to do both.
This issue is about implementing such an explicit memory (be it PKM, MoE, or even a new architecture) and thereby improving the convergence of our language model at the same runtime.
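For concreteness, here is a minimal sketch of what a product-key memory (PKM) layer could look like, following the structure from Lample et al.'s "Large Memory Layers with Product Keys". This is not a proposal for our final design; the module name and hyperparameters (`n_keys`, `topk`) are illustrative assumptions:

```python
# Minimal product-key memory (PKM) sketch in PyTorch.
# Hypothetical module; names and defaults are illustrative, not our final design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProductKeyMemory(nn.Module):
    def __init__(self, dim: int, n_keys: int = 512, topk: int = 32):
        super().__init__()
        # Two sub-key tables of n_keys half-dimensional keys each; their
        # Cartesian product addresses n_keys**2 memory slots.
        self.keys = nn.Parameter(torch.randn(2, n_keys, dim // 2) * dim ** -0.5)
        self.values = nn.EmbeddingBag(n_keys ** 2, dim, mode="sum")
        self.topk = topk
        self.n_keys = n_keys

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim). Split the query in half, one part per sub-key table.
        q1, q2 = x.chunk(2, dim=-1)
        s1 = q1 @ self.keys[0].t()  # (batch, n_keys) scores for the first half
        s2 = q2 @ self.keys[1].t()  # (batch, n_keys) scores for the second half
        v1, i1 = s1.topk(self.topk, dim=-1)
        v2, i2 = s2.topk(self.topk, dim=-1)
        # Combine the two top-k lists: topk**2 candidate slots, scored additively.
        scores = v1.unsqueeze(-1) + v2.unsqueeze(-2)            # (batch, topk, topk)
        indices = i1.unsqueeze(-1) * self.n_keys + i2.unsqueeze(-2)
        scores, best = scores.flatten(1).topk(self.topk, dim=-1)
        indices = indices.flatten(1).gather(-1, best)
        # Softmax-weighted sum of the selected value embeddings.
        return self.values(indices, per_sample_weights=F.softmax(scores, dim=-1))

# Usage: mem = ProductKeyMemory(dim=512); out = mem(torch.randn(8, 512))
```

The product-key trick is what keeps retrieval cheap: by scoring two tables of `n_keys` sub-keys and combining the top-k of each half, the layer addresses `n_keys**2` memory slots while computing only `2 * n_keys` key scores plus `topk**2` candidate sums per query.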

@ClashLuke ClashLuke added research Creative project that might fail but could give high returns ML Requires machine-learning knowledge (can be built up on the fly) labels Apr 30, 2022
@ClashLuke ClashLuke added the core Improves core model while keeping core idea intact label May 8, 2022