Update README
kyleconroy committed Oct 30, 2024
1 parent abe5a0b commit 4a98e9a
Showing 1 changed file with 9 additions and 8 deletions.
README.md: 17 changes (9 additions & 8 deletions)

````diff
@@ -5,23 +5,23 @@
 [![Tests](https://github.com/simonw/llm-jq/actions/workflows/test.yml/badge.svg)](https://github.com/simonw/llm-jq/actions/workflows/test.yml)
 [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-jq/blob/main/LICENSE)
 
-Write and execute jq programs with the help of LLM
+Write and execute Python scripts with the help of LLM
 
 See [Run a prompt to generate and execute jq programs using llm-jq](https://simonwillison.net/2024/Oct/27/llm-jq/) for background on this project.
 
 ## Installation
 
 Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
 ```bash
-llm install llm-jq
+llm install llm-py
 ```
 ## Usage
 
-Pipe JSON directly into `llm jq` and describe the result you would like:
+Pipe JSON directly into `llm py` and describe the result you would like:
 
 ```bash
 curl -s https://api.github.com/repos/simonw/datasette/issues | \
-  llm jq 'count by user.login, top 3'
+  llm py 'count by user login, top 3'
 ```
 Output:
 ```json
@@ -43,12 +43,12 @@ Output:
 ```
 group_by(.user.login) | map({login: .[0].user.login, count: length}) | sort_by(-.count) | .[0:3]
 ```
-The JSON is printed to standard output, the jq program is printed to standard error.
+The JSON is printed to standard output, the Python script is printed to standard error.
 
 Options:
 
-- `-s/--silent`: Do not print the jq program to standard error
-- `-o/--output`: Output just the jq program, do not run it
+- `-s/--silent`: Do not print the Python script to standard error
+- `-o/--output`: Output just the Python script, do not run it
 - `-v/--verbose`: Show the prompt sent to the model and the response
 - `-m/--model X`: Use a model other than the configured LLM default model
 - `-l/--length X`: Use a length of the input other than 1024 as the example
@@ -58,8 +58,9 @@ By default, the first 1024 bytes of JSON will be sent to the model as an example
 ## Development
 
 To set up this plugin locally, first checkout the code. Then create a new virtual environment:
+
 ```bash
-cd llm-jq
+cd llm-py
 python -m venv venv
 source venv/bin/activate
 ```
````
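For context, the jq program left unchanged in the example above (`group_by(.user.login) | map({login: .[0].user.login, count: length}) | sort_by(-.count) | .[0:3]`) computes the top three users by issue count. A minimal sketch of the same computation in Python, the kind of script the renamed plugin would plausibly generate; the `top_users` helper and the inline sample data are illustrative, not part of the plugin:

```python
import json
from collections import Counter

def top_users(issues, n=3):
    # Count issues per user login and return the n most frequent,
    # mirroring the jq pipeline:
    #   group_by(.user.login) | map({login, count}) | sort_by(-.count) | .[0:n]
    counts = Counter(issue["user"]["login"] for issue in issues)
    return [{"login": login, "count": count} for login, count in counts.most_common(n)]

# Demo with inline sample data; a generated script would instead
# read the API response from standard input with json.load(sys.stdin).
sample = [
    {"user": {"login": "alice"}},
    {"user": {"login": "bob"}},
    {"user": {"login": "alice"}},
]
print(json.dumps(top_users(sample), indent=2))
```

As in the jq version, the result is a JSON array of `{login, count}` objects sorted by descending count.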
