diff --git a/README.md b/README.md
index df7905d..f004c96 100644
--- a/README.md
+++ b/README.md
@@ -5,7 +5,7 @@
 [![Tests](https://github.com/simonw/llm-jq/actions/workflows/test.yml/badge.svg)](https://github.com/simonw/llm-jq/actions/workflows/test.yml)
 [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-jq/blob/main/LICENSE)
 
-Write and execute jq programs with the help of LLM
+Write and execute Python scripts with the help of LLM
 
 See [Run a prompt to generate and execute jq programs using llm-jq](https://simonwillison.net/2024/Oct/27/llm-jq/) for background on this project.
 
@@ -13,15 +13,15 @@ See [Run a prompt to generate and execute jq programs using llm-jq](https://simo
 Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
 ```bash
-llm install llm-jq
+llm install llm-py
 ```
 
 ## Usage
 
-Pipe JSON directly into `llm jq` and describe the result you would like:
+Pipe JSON directly into `llm py` and describe the result you would like:
 
 ```bash
 curl -s https://api.github.com/repos/simonw/datasette/issues | \
-  llm jq 'count by user.login, top 3'
+  llm py 'count by user login, top 3'
 ```
 Output:
 ```json
@@ -43,12 +43,12 @@ Output:
 ```
 group_by(.user.login) | map({login: .[0].user.login, count: length}) | sort_by(-.count) | .[0:3]
 ```
-The JSON is printed to standard output, the jq program is printed to standard error.
+The JSON is printed to standard output, the Python script is printed to standard error.
 
 Options:
 
-- `-s/--silent`: Do not print the jq program to standard error
-- `-o/--output`: Output just the jq program, do not run it
+- `-s/--silent`: Do not print the Python script to standard error
+- `-o/--output`: Output just the Python script, do not run it
 - `-v/--verbose`: Show the prompt sent to the model and the response
 - `-m/--model X`: Use a model other than the configured LLM default model
 - `-l/--length X`: Use a length of the input other than 1024 as the example
@@ -58,8 +58,9 @@ By default, the first 1024 bytes of JSON will be sent to the model as an example
 ## Development
 
 To set up this plugin locally, first checkout the code. Then create a new virtual environment:
+
 ```bash
-cd llm-jq
+cd llm-py
 python -m venv venv
 source venv/bin/activate
 ```
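The diff keeps the example jq program (`group_by(.user.login) | map({login: .[0].user.login, count: length}) | sort_by(-.count) | .[0:3]`) as unchanged context. For comparison, an equivalent Python script — the kind of output the renamed plugin is described as producing, not its actual output — could look like this sketch; the `issues` list is a made-up stand-in for the GitHub issues API response:

```python
import json
from collections import Counter

# Hypothetical sample shaped like the GitHub issues API response
issues = [
    {"user": {"login": "alice"}},
    {"user": {"login": "alice"}},
    {"user": {"login": "bob"}},
    {"user": {"login": "carol"}},
    {"user": {"login": "alice"}},
    {"user": {"login": "bob"}},
]

# count by user login, top 3 — mirrors the jq program in the diff
counts = Counter(issue["user"]["login"] for issue in issues)
top3 = [{"login": login, "count": count} for login, count in counts.most_common(3)]
print(json.dumps(top3, indent=2))
```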