From e75a05974c9b6036a4f31901768bbe04d969b973 Mon Sep 17 00:00:00 2001
From: Jean-Francois Baffier
Date: Sat, 21 Sep 2024 10:20:03 +0900
Subject: [PATCH 1/3] Documentation update

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index b955169..9712d36 100644
--- a/README.md
+++ b/README.md
@@ -4,7 +4,7 @@
 [![Coverage](https://codecov.io/gh/Azzaare/ConstraintsTranslator.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/Azzaare/ConstraintsTranslator.jl)
 [![Aqua](https://raw.githubusercontent.com/JuliaTesting/Aqua.jl/master/badge.svg)](https://github.com/JuliaTesting/Aqua.jl)
 
-A package for translating natural-language descriptions of optimization problems into Constraint Programming models to be solved via [`CBLS.jl`](https://github.com/JuliaConstraints/CBLS.jl) using Large Language Models (LLMs).
+A package for translating natural-language descriptions of optimization problems into Constraint Programming models using Large Language Models (LLMs). For this pre-stable version stage, our target is to have models solved via [`CBLS.jl`](https://github.com/JuliaConstraints/CBLS.jl). Eventually, we expect this library to work for most of Julia CP ecosystem, alongside toher CP modeling languages such as MiniZinc, OR-Tools, etc.
 
 This package acts as a light wrapper around common LLM API endpoints, supplying appropriate system prompts and context informations to the LLMs to generate CP models. Specifically, we first prompt the model for generating an high-level representation of the problem in editable Markdown format, and then we prompt the model to generate Julia code.
 
From 0f4bcb1ff94beb41b6749c44dddf4a610f6883d1 Mon Sep 17 00:00:00 2001
From: Nicola Di Cicco <93935338+nicoladicicco@users.noreply.github.com>
Date: Tue, 24 Sep 2024 11:54:26 +0900
Subject: [PATCH 2/3] Update README.md

---
 README.md | 11 +++++++----
 1 file changed, 7 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 9712d36..494dc34 100644
--- a/README.md
+++ b/README.md
@@ -4,7 +4,7 @@
 [![Coverage](https://codecov.io/gh/Azzaare/ConstraintsTranslator.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/Azzaare/ConstraintsTranslator.jl)
 [![Aqua](https://raw.githubusercontent.com/JuliaTesting/Aqua.jl/master/badge.svg)](https://github.com/JuliaTesting/Aqua.jl)
 
-A package for translating natural-language descriptions of optimization problems into Constraint Programming models using Large Language Models (LLMs). For this pre-stable version stage, our target is to have models solved via [`CBLS.jl`](https://github.com/JuliaConstraints/CBLS.jl). Eventually, we expect this library to work for most of Julia CP ecosystem, alongside toher CP modeling languages such as MiniZinc, OR-Tools, etc.
+A package for translating natural-language descriptions of optimization problems into Constraint Programming models using Large Language Models (LLMs). For this pre-stable version stage, our target is to have models solved via [`CBLS.jl`](https://github.com/JuliaConstraints/CBLS.jl). Eventually, we expect this library to work for most of Julia CP ecosystem, alongside other CP modeling languages such as MiniZinc, OR-Tools, etc.
 
 This package acts as a light wrapper around common LLM API endpoints, supplying appropriate system prompts and context informations to the LLMs to generate CP models. Specifically, we first prompt the model for generating an high-level representation of the problem in editable Markdown format, and then we prompt the model to generate Julia code.
 
@@ -22,11 +22,15 @@ To begin playing with the package, you can start from the example below:
 ```julia
 using ConstraintsTranslator
 
-llm = GoogleLLM("gemini-1.5-pro")
-
 # Optional setup of a terminal editor (uncomment and select a viable editor on your machine such as vim, nano, emacs, ...)
+# The EDITOR variable *must* be set for the interactive mode to work properly.
 ENV["EDITOR"] = "vim"
 
+# Optional setup of an API key for proprietary models.
+# The appropriate API key environment variables *must* be set for using properietary LLMs.
+ENV["GOOGLE_API_KEY"] = "42"
+
+llm = GoogleLLM("gemini-1.5-pro")
 description = """
 We need to determine the shortest possible route for a salesman who must visit
 a set of cities exactly once and return to the starting city.
@@ -54,4 +58,3 @@ This example uses Google Gemini as an LLM. You will need an API key and a model
 At each generation step, it will prompt the user in an interactive menu to accept the answer, edit the prompt and/or the generated text, or generate another answer with the same prompt.
 
 The LLM expects the user to provide examples of the input data format. If no examples are present, the LLM will make assumptions about the data format based on the problem description.
-

From 7794774ccae4a6d957dceda0eae752435ee3beb5 Mon Sep 17 00:00:00 2001
From: Nicola Di Cicco <93935338+nicoladicicco@users.noreply.github.com>
Date: Tue, 24 Sep 2024 12:02:49 +0900
Subject: [PATCH 3/3] Update README.md

---
 README.md | 27 +++++++++++++++++----------
 1 file changed, 17 insertions(+), 10 deletions(-)

diff --git a/README.md b/README.md
index 494dc34..4c24338 100644
--- a/README.md
+++ b/README.md
@@ -17,18 +17,25 @@ We currently support the following LLM APIs:
 Groq and Gemini are currently offering rate-limited free access to their APIs, and llama.cpp is free and open-source. We are still actively experimenting with this package, and we are not in a position to pay for API access.
 
 We might consider adding support for other APIs in the future.
 
 ## Workflow example
-To begin playing with the package, you can start from the example below:
+Before playing with the package, we need to set up two environment variables:
+1. The EDITOR variable for specifying a text editor (such as `vim`, `nano`, `emacs`, ...). This will be used during interactive execution.
+2. An API key. This is necessary only for interacting with proprietary LLMs.
+
+We can configure these variables by, e.g., appending the following to your `.bashrc` or equivalent:
+```bash
+export EDITOR="vim"
+export GOOGLE_API_KEY="42"
+```
+Or we can configure them in Julia:
 ```julia
-using ConstraintsTranslator
-
-# Optional setup of a terminal editor (uncomment and select a viable editor on your machine such as vim, nano, emacs, ...)
-# The EDITOR variable *must* be set for the interactive mode to work properly.
 ENV["EDITOR"] = "vim"
-
-# Optional setup of an API key for proprietary models.
-# The appropriate API key environment variables *must* be set for using properietary LLMs.
 ENV["GOOGLE_API_KEY"] = "42"
+```
+
+Finally, we can start playing with the package. Below is an example of translating a natural-language description of the Traveling Salesman Problem:
+```julia
+using ConstraintsTranslator
 
 llm = GoogleLLM("gemini-1.5-pro")
 
@@ -53,8 +60,8 @@ response = translate(llm, description)
 
 The `translate` function will first produce a Markdown representation of the problem, and then return the generated Julia code for parsing the input data and building the model.
 
-This example uses Google Gemini as an LLM. You will need an API key and a model id to access proprietary API endpoints. Use `help?>` in the Julia REPL to learn more about the available models.
-
 At each generation step, it will prompt the user in an interactive menu to accept the answer, edit the prompt and/or the generated text, or generate another answer with the same prompt.
 
 The LLM expects the user to provide examples of the input data format. If no examples are present, the LLM will make assumptions about the data format based on the problem description.
+
+This example uses Google Gemini as an LLM. You will need an API key and a model id to access proprietary API endpoints. Use `help?>` in the Julia REPL to learn more about the available models.
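
As a sketch of how a `git format-patch` series like the one above is consumed, the snippet below builds a throwaway repository, exports a commit as a patch with `format-patch` (the same tool that produced this series), rewinds, and re-applies it with `git am`, which preserves the author, date, and subject seen in each patch header. The repository contents, identities, and filenames here are illustrative only, not part of the series.

```shell
# Illustrative demo: round-trip a commit through format-patch / git am.
set -e
tmp="$(mktemp -d)"
cd "$tmp"
git init -q demo
cd demo
git config user.email "demo@example.com"
git config user.name "Demo"

# Create a history of two commits touching README.md.
echo "# ConstraintsTranslator.jl" > README.md
git add README.md
git commit -qm "Initial commit"
echo "Documentation line." >> README.md
git commit -qam "Documentation update"

# Export the latest commit as a mail-formatted patch file,
# e.g. ../patches/0001-Documentation-update.patch.
git format-patch -1 -o ../patches >/dev/null

# Rewind the branch, then re-apply the commit from the patch.
git reset -q --hard HEAD~1
git am -q ../patches/0001-Documentation-update.patch

git log -1 --format=%s   # prints: Documentation update
```

To apply this three-patch README series the same way, one would save the messages to files (or a single mbox) and run `git am` on them inside a clone of the repository; `git am --abort` backs out of a failed application.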