
Add memory to transformer #340

Arij-Aladel opened this issue Aug 12, 2021 · 0 comments

Arij-Aladel commented Aug 12, 2021

@louismartin
Hello! Thanks for sharing this work.
I am working on a new transformer architecture and I would like to try PKM. While reading the instructions and the repo, I ran into a simple question regarding `mem_enc_positions` and `mem_dec_positions`. The README explains:

"To add a memory in (for instance) the layers 4 and 7 of an encoder, you can simply provide --use_memory true --mem_enc_positions 4,7"

but in this line we see:

```python
for layer_id, pos in mem_positions:
```

From that, I understand that each element of `mem_positions` should be a tuple `(int, str)` = `(layer_id, pos)`, not a single value. According to the explanation above, 4 and 7 are layer indices, so I was wondering: what is the correct syntax for the `mem_enc_positions` and `mem_dec_positions` parameters?
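
For reference, here is a minimal sketch of how a comma-separated flag value like `4,7` could be parsed into `(layer_id, pos)` tuples so that the unpacking above works. The `'+'` suffix and the `'in'`/`'after'` position names are my assumptions for illustration, not necessarily the repo's actual convention:

```python
def parse_mem_positions(s, n_layers):
    """
    Hypothetical parser: turn a flag value like "4,7" or "4+,7" into a
    list of (layer_id, pos) tuples, matching the unpacking
    `for layer_id, pos in mem_positions`.

    Assumed convention (for illustration only):
      "4"  -> (4, 'in')     memory inside layer 4 (e.g. replacing its FFN)
      "4+" -> (4, 'after')  memory inserted after layer 4
    """
    positions = []
    for x in s.split(','):
        x = x.strip()
        if not x:
            continue
        if x.endswith('+'):
            layer_id, pos = int(x[:-1]), 'after'
        else:
            layer_id, pos = int(x), 'in'
        assert 0 <= layer_id < n_layers, f"invalid layer id: {layer_id}"
        positions.append((layer_id, pos))
    assert len(positions) == len(set(positions)), "duplicate memory positions"
    return positions


# Under this assumption, "--mem_enc_positions 4,7" would give
# [(4, 'in'), (7, 'in')], so each element unpacks cleanly into (layer_id, pos).
print(parse_mem_positions("4,7", n_layers=12))  # [(4, 'in'), (7, 'in')]
```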
