From 35c19bb23d05f7863c7f922ed78b16fda68d41fe Mon Sep 17 00:00:00 2001
From: Nicola Di Cicco <93935338+nicoladicicco@users.noreply.github.com>
Date: Tue, 24 Sep 2024 15:10:18 +0900
Subject: [PATCH] Update README.md

---
 README.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index b955169..a8fcf06 100644
--- a/README.md
+++ b/README.md
@@ -44,11 +44,13 @@ CityA,CityB,10
 CityA,CityC,8
 """
 
-response = translate(llm, description)
+response = translate(llm, description, interactive=true)
 ```
 
 The `translate` function will first produce a Markdown representation of the problem, and then return the generated Julia code for parsing the input data and building the model.
 
+The flag `interactive=true` will enable a simple interactive command-line application, where you will be able to inspect, edit and regenerate each intermediate output.
+
 This example uses Google Gemini as an LLM. You will need an API key and a model id to access proprietary API endpoints. Use `help?>` in the Julia REPL to learn more about the available models.
 
 At each generation step, it will prompt the user in an interactive menu to accept the answer, edit the prompt and/or the generated text, or generate another answer with the same prompt.
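
For context, a minimal sketch of how the updated snippet might be used end to end. Only the `translate(llm, description, interactive=true)` call comes from this patch; the `GoogleLLM` constructor name, the model-id placeholder, and the problem description below are illustrative assumptions.

```julia
# Illustrative sketch only; assumes the package is already loaded.
# `GoogleLLM` and "<model-id>" are placeholders. Only the `translate(...)`
# call with `interactive=true` is taken from this patch.
llm = GoogleLLM("<model-id>")   # assumed constructor; requires a Gemini API key

description = """
Find the shortest tour visiting every city exactly once.
Each line gives the distance between two cities:
CityA,CityB,10
CityA,CityC,8
"""

# With interactive=true, each generation step opens a command-line menu where you
# can accept the answer, edit the prompt or the generated text, or regenerate it.
response = translate(llm, description, interactive=true)
```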