How to pretrain Llama-2 with tokenmonster #25
Unanswered · jerryjalapeno asked this question in Q&A

Hello, how would we go about pretraining a Llama-2 model from scratch? All the existing codebases and documentation assume a BPE-based tokenizer, so how would we use tokenmonster instead? Is there a drop-in replacement, or do we need to edit the training codebases ourselves? Specifically, I am using axolotl to pretrain.
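No drop-in replacement is confirmed in this thread. One common workaround is to pre-tokenize the corpus yourself with tokenmonster's Python bindings and hand the trainer ready-made `input_ids`, sidestepping the BPE tokenizer entirely. The sketch below assumes that approach; the vocabulary name, sequence length, and output path are illustrative, and you would still need to size the model's token embedding layer to match the tokenmonster vocabulary.

```python
# Minimal sketch: pre-tokenize a corpus with tokenmonster into fixed-length
# sequences for a causal-LM trainer that accepts ready-made input_ids.
import tokenmonster
from datasets import Dataset

SEQ_LEN = 2048  # assumed context length for the pretraining run

# Load a prebuilt vocabulary by name (downloaded on first use), or pass a
# path to a local .vocab file instead. The name here is illustrative.
vocab = tokenmonster.load("english-32000-balanced-v1")

def pack_corpus(texts, seq_len=SEQ_LEN):
    """Tokenize each document, concatenate the token ids, and slice them
    into fixed-length training sequences (the ragged tail is dropped)."""
    ids = []
    for text in texts:
        ids.extend(int(t) for t in vocab.tokenize(text))
    n_full = len(ids) // seq_len
    return [ids[i * seq_len:(i + 1) * seq_len] for i in range(n_full)]

corpus = ["Example document one.", "Example document two."]  # placeholder data
sequences = pack_corpus(corpus)

# For causal-LM pretraining the labels are typically the input ids themselves.
ds = Dataset.from_dict({"input_ids": sequences, "labels": sequences})
ds.save_to_disk("pretokenized-tokenmonster")  # hypothetical output path
```

Whether axolotl can consume such a dataset directly depends on the version; it normally drives a HuggingFace tokenizer, so check its documentation for a pre-tokenized dataset path or be prepared to patch the tokenization step.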
Replies: 1 comment
- To do this you need a very large amount of compute; are you sure?