[Feature request] LongT5 and Instructions! #303
Hi there! longt5 is indeed supported by Optimum, so it should be quite simple to add! I'm in the process of improving the contributing guide, but in the meantime, you could search for references to t5 in the codebase, make copies for each class, and make suitable modifications to get it to work with longt5. However, considering that I haven't made a solid contributing guide, you can also just list some existing longt5 models on the HF hub you would like to test, along with example usage code and output in the python transformers library, so that when we do add it, we can compare to make sure it's correct! 🤗
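For what a "reference usage" submission could look like, here is a minimal sketch in the Python transformers library. The model id and helper name are illustrative assumptions, not taken from this thread; substitute the longt5 checkpoint you actually want tested.

```python
# Hypothetical reference-usage sketch for a LongT5 model request:
# run the checkpoint with the Python transformers library, so the
# ONNX version can later be compared against its output.
# The default model id below is an example, not one named in this thread.

def summarize_with_longt5(
    text: str,
    model_id: str = "pszemraj/long-t5-tglobal-base-16384-book-summary",
) -> str:
    """Return the summary produced by a LongT5 summarization pipeline."""
    # Imported lazily so the sketch can be read/loaded without transformers installed.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=model_id)
    return summarizer(text)[0]["summary_text"]
```

Calling `summarize_with_longt5(long_article)` downloads the checkpoint on first use, so keep the example input and expected output alongside it when filing the request.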
While writing the contributing guide, I started adding support for longt5 (which was very simple). I'm making a separate PR for it, to guide users in the future if they want to add new models. By the way, do you have any longt5 models that you would like to have ONNX models ready for? Most are just pre-trained models. I've tested with this summarization model, and it does seem to operate correctly.
Would this work for translation models in long-t5, like this one? https://huggingface.co/KETI-AIR-Downstream/long-ke-t5-base-translation-aihub-bidirection
Yes, it should 👍 You can convert the model to ONNX with our conversion script.
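For reference, a sketch of what the conversion invocation looks like, using the translation model linked above. The exact script path and flags are assumptions that may differ between versions of the repository, so check them against the current README before running.

```shell
# Assumed invocation of the transformers.js conversion script
# (run from a clone of the repository; flags may vary by version):
python -m scripts.convert --quantize \
  --model_id KETI-AIR-Downstream/long-ke-t5-base-translation-aihub-bidirection
```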
I'm getting this error whenever I use a longt5 model with an input of fewer than ~30 tokens. Is there a reason for this?

```
[INFO:CONSOLE(34)] "D:/a/_work/1/s/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:26 onnxruntime::ReshapeHelper::ReshapeHelper(const TensorShape &, TensorShapeVector &, bool) i < input_shape.NumDimensions() was false. The dimension with value zero exceeds the dimension size of the input tensor.", source: https://cdn.jsdelivr.net/npm/@xenova/transformers@latest (34)
[INFO:CONSOLE(70)] "An error occurred during model execution: "Error: failed to call OrtRun(). error code = 6.".", source: https://cdn.jsdelivr.net/npm/@xenova/transformers@latest (70)
[INFO:CONSOLE(70)] "Inputs given to model: [object Object]", source: https://cdn.jsdelivr.net/npm/@xenova/transformers@latest (70)
[INFO:CONSOLE(34)] "Uncaught (in promise) Error: failed to call OrtRun(). error code = 6.", source: https://cdn.jsdelivr.net/npm/@xenova/transformers@latest (34)
```
Hey! I wanted to know what the process / requirements were to add new models to the repo. I was working with LongT5 and wanted to see if it was possible to get that added to the compatibility list.
Thanks!