Merge main into levenshtein #1458

Merged: 54 commits merged into levenshtein-evaluator on Mar 25, 2024
Conversation

@aakrem (Collaborator) commented on Mar 25, 2024:

No description provided.

MohammedMaaz and others added 30 commits on January 30, 2024 at 15:06
support passing agenta api key as an env var
…ance will be there & inherits context through Layout and fixes all the alert popup usages
…ion-modal-bigger

Increased code evaluation modal width
…to-playground-only-when-we-click-on-the-variant-name

Removed cell redirect on variant and testset
Remove unused modal import
VenkataRavitejaGullapudi and others added 24 commits on March 16, 2024 at 12:35
fix: Delete evaluator modal is light in dark mode #1424
…n-views

[Docs]: Added human evaluation views section
…oundary

Fix: 324 | Unblock UI on error boundary
…-status-column

Freeze evaluation status column
Added dialog to indicate test set is being saved in create test set UI
docs: add Drewski2222 as a contributor for code
…-in-Human-evaluation

Improve card view in Human Evaluation
…-output-on-result-overflow

We can't see the full output value in evaluation scenarios
@aakrem merged commit 1b57abb into levenshtein-evaluator on Mar 25, 2024
9 checks passed
7 participants