Clarifications and additions to record and prepare output #51

Open
JasperDekoninck opened this issue Dec 9, 2023 · 0 comments
The following aspects of the output and questions of the record and prepare process were a bit confusing to me; it would be great if they could be improved :)

prepare

  • Allow specifying multiple models at the same time. Vulnerabilities are more valuable the more generalizable they are, so encouraging reporters to cover as many models as possible is better.
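As a rough sketch of what this could look like (the `--model` flag and the model names below are illustrative assumptions, not the tool's actual CLI):

```python
import argparse

# Hypothetical sketch: let a single `prepare` invocation accept several models.
parser = argparse.ArgumentParser(prog="lve prepare")
parser.add_argument(
    "--model",
    nargs="+",  # e.g. --model gpt-3.5-turbo gpt-4 llama-2-7b
    required=True,
    help="one or more models the vulnerability applies to",
)

args = parser.parse_args(["--model", "gpt-3.5-turbo", "gpt-4"])
print(args.model)  # ['gpt-3.5-turbo', 'gpt-4']
```

The prepared LVE could then store the whole list, rather than forcing one report per model.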

record

  • Ask extra questions for sampling parameters such as temperature, max_tokens, and top_p (leave empty for default). I have no idea how I would go about changing them right now.
  • Do not just say the model failed the test: at first, I thought this meant the instance was not a good example of the vulnerability (I actually ran the ToxicityChecker with threshold 0 before realizing it was the other way around, since I simply thought the Toxicity Model was bad). Rename passed to something more appropriate, such as vulnerability present. Also, it might be good to add the exact toxicity/bias score to the outputs, which would give the user the possibility to do some checks.
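For the parameter questions, something like the following would work (the function, defaults, and answer format here are my own illustration, not code from the repo):

```python
# Illustrative sketch: turn raw answers into sampling parameters,
# keeping the provider default when an answer is left empty.
DEFAULTS = {"temperature": None, "max_tokens": None, "top_p": None}

def ask_parameters(answers):
    """Map user answers to a parameter dict; empty string means default."""
    params = {}
    for name, default in DEFAULTS.items():
        raw = answers.get(name, "").strip()
        if raw == "":
            params[name] = default  # leave empty for provider default
        else:
            params[name] = int(raw) if name == "max_tokens" else float(raw)
    return params

print(ask_parameters({"temperature": "0.7", "max_tokens": "", "top_p": "0.9"}))
# {'temperature': 0.7, 'max_tokens': None, 'top_p': 0.9}
```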
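And for the last point, the recorded output could carry both a clearer label and the raw checker score; the field names and threshold below are only a suggestion, not the current output format:

```python
# Suggested result record: an explicit label plus the raw checker score,
# instead of a bare passed/failed flag.
THRESHOLD = 0.5  # example threshold, not the repo's actual value

def make_result(toxicity_score):
    return {
        "vulnerability_present": toxicity_score >= THRESHOLD,
        "toxicity_score": toxicity_score,  # raw score lets users re-check
        "threshold": THRESHOLD,
    }

print(make_result(0.83))
# {'vulnerability_present': True, 'toxicity_score': 0.83, 'threshold': 0.5}
```

With the raw score in the output, the threshold-0 confusion above would have been obvious immediately.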