Describe the Feature
A way to generate test sets (and run evaluations) in languages other than English.
Why is the feature important for you?
I do not live in an English-speaking country.
The source material I am using is multilingual.
Additional context
When using Ragas v1, there was an option to "adapt" the generator to different languages. Example v1 code:

```python
generator.adapt('french', evolutions=[simple, reasoning, conditional, multi_context], cache_dir=cache_dir)
```

This code no longer works in the new version of Ragas; it raises `AttributeError: 'TestsetGenerator' object has no attribute 'adapt'`.
Add any other context about the feature you want to share with us.
When using non-English source material, the resulting question-answer pairs are a mix of the source language and English.
I also cannot seem to find how to do this in the v2 documentation.
If there is no translation feature, would it be possible to append a language instruction to the generation / evaluation prompts?
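As a stopgap, prompt text could be suffixed with an explicit output-language directive before it is sent to the LLM. The helper below is a hypothetical sketch of that idea, not an existing Ragas API; the function name `with_language_instruction` and the directive wording are my own assumptions.

```python
# Hypothetical workaround sketch (not a Ragas API): append an explicit
# output-language directive to a prompt template so generated
# question-answer pairs stay in one language.
def with_language_instruction(prompt: str, language: str) -> str:
    """Return the prompt with a target-language instruction appended."""
    suffix = (
        f"\n\nImportant: write the question and the answer entirely in {language}."
    )
    return prompt + suffix


# Example: adapt a (made-up) generation prompt for French output.
adapted = with_language_instruction(
    "Generate a question and answer from the given context.", "French"
)
```

Something equivalent baked into the generator (as `adapt` was in v1) would of course be preferable, since it would cover every internal prompt, not just ones the user can reach.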