
Commit

Update - read llm output from json response and added backward compatibility
aybruhm committed Jan 9, 2024
1 parent dc1da36 commit 19bd17e
Showing 1 changed file with 6 additions and 1 deletion.
7 changes: 6 additions & 1 deletion agenta-backend/agenta_backend/services/llm_apps_service.py
@@ -38,7 +38,12 @@ async def get_llm_app_output(uri: str, datapoint: Any, parameters: dict) -> AppOutput:
             url, json=payload, timeout=httpx.Timeout(timeout=5, read=None, write=5)
         )
         response.raise_for_status()
-        return AppOutput(output=response.json(), status="success")
+        response_data = response.json()
+        if isinstance(response_data, dict):
+            llm_output = response_data["message"]
+        else:
+            llm_output = response_data
+        return AppOutput(output=llm_output, status="success")
 
 
 async def run_with_retry(
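
For context, here is a minimal sketch of what the updated handler could look like in full. Only the response-handling lines come from this commit; the AppOutput stand-in, the payload and URL construction, and the client setup are assumptions added to make the snippet self-contained.

from dataclasses import dataclass
from typing import Any

import httpx


@dataclass
class AppOutput:  # minimal stand-in for the project's AppOutput model
    output: Any
    status: str


async def get_llm_app_output(uri: str, datapoint: Any, parameters: dict) -> AppOutput:
    # Payload and URL construction are assumptions; the diff only covers response handling.
    payload = {"inputs": datapoint, "parameters": parameters}
    url = uri  # assumed: the deployed app's invocation endpoint

    async with httpx.AsyncClient() as client:
        response = await client.post(
            url, json=payload, timeout=httpx.Timeout(timeout=5, read=None, write=5)
        )
        response.raise_for_status()

        # Newer app variants return a JSON object whose "message" field holds the
        # LLM output; older variants return the output value directly, so fall
        # back to the raw JSON body for backward compatibility.
        response_data = response.json()
        if isinstance(response_data, dict):
            llm_output = response_data["message"]
        else:
            llm_output = response_data
        return AppOutput(output=llm_output, status="success")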
