Add examples infrastructure and add example for iterative computation
Kobzol committed Dec 4, 2023
1 parent fd05d92 commit 3604508
Showing 5 changed files with 88 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/requirements.txt
@@ -2,6 +2,7 @@ mkdocs==1.2.3
mkdocs-material==7.3.2
mkdocs-minify-plugin==0.5.0
mkdocs-git-revision-date-localized-plugin==0.10.0
mkdocs-gen-files==0.5.0
mike==1.1.2
requests==2.31.0
jinja2==3.0.3
8 changes: 8 additions & 0 deletions examples/README.md
@@ -0,0 +1,8 @@
# Examples
Here you can find several examples of how HyperQueue can be used for various use cases, both with the command-line
interface and with the Python API.

You can view these examples either in the [documentation](https://it4innovations.github.io/hyperqueue/stable/examples/iterative-computation/)
or on [GitHub](https://github.com/It4innovations/hyperqueue/tree/main/examples).

- [Iterative computation](iterative-computation)
58 changes: 58 additions & 0 deletions examples/iterative-computation/README.md
@@ -0,0 +1,58 @@
# Iterative computation
It is a common use case to perform an iterative computation, e.g. running a randomized simulation until the results are
stable/accurate enough, or training a machine learning model while the loss keeps dropping.

While there is currently no built-in support in HQ for iteratively submitting new tasks to an existing job, you can perform
an iterative computation relatively easily with the following approach:

1. Submit an HQ job that performs a computation
2. Wait for the job to finish
3. Read the output of the job and decide if computation should continue
4. If yes, go to 1.

## Python API
With the Python API, we can simply write the outermost iteration loop in Python and repeatedly submit jobs until some
end criterion has been reached:

```python
from hyperqueue import Job, Client

client = Client()

while True:
    job = Job()
    job.program(["my-program"], stdout="out.txt")

    # Submit a job
    submitted = client.submit(job)

    # Wait for it to complete
    client.wait_for_jobs([submitted])

    # Read the output of the job
    with open("out.txt") as f:
        # Check some termination condition and end the loop if it is satisfied
        if f.read().strip() == "done":
            break
```
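
Since each iteration overwrites `out.txt`, you may also want to keep the output of every iteration and put an upper
bound on the number of iterations. A possible variation is sketched below; the per-iteration file names, the
`MAX_ITERATIONS` limit and `my-program` itself are only illustrative assumptions, not a prescribed part of the API.

```python
from hyperqueue import Job, Client

client = Client()

# Hypothetical safety limit so that the loop cannot run forever
MAX_ITERATIONS = 100

for iteration in range(MAX_ITERATIONS):
    job = Job()
    # Store the output of each iteration in a separate file
    stdout = f"out-{iteration}.txt"
    job.program(["my-program"], stdout=stdout)

    # Submit the job and wait for it to complete
    submitted = client.submit(job)
    client.wait_for_jobs([submitted])

    # Stop once the program reports that it is done
    with open(stdout) as f:
        if f.read().strip() == "done":
            break
```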

## Command-line interface
With the command-line interface, you can perform the iterative loop e.g. in Bash.

```bash
#!/bin/bash

while :
do
    # Submit a job and wait for it to complete
    ./hq submit --wait ./compute.sh

    # Read the output of the job
    output=$(./hq job cat last stdout)

    # Decide if we should end or continue
    if [ "${output}" -eq 0 ]; then
        break
    fi
done
```
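
The loop above assumes that `./compute.sh` writes `0` to its standard output once the computation has converged, and a
non-zero value otherwise. The actual script is not shown here; a minimal illustrative stand-in (using a random number
instead of a real convergence check) could look like this:

```bash
#!/bin/bash

# Illustrative stand-in for the real computation: perform one step and decide
# whether the result is already good enough.
result=$(( RANDOM % 10 ))

if [ "${result}" -lt 2 ]; then
    # Converged: tell the outer loop to stop
    echo 0
else
    # Not converged yet: request another iteration
    echo 1
fi
```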
6 changes: 6 additions & 0 deletions mkdocs.yml
@@ -13,6 +13,9 @@ nav:
- Getting Started:
    - Quickstart: quickstart.md
    - Cheatsheet: cheatsheet.md
- Examples:
    - examples/README.md
    - Iterative computation: examples/iterative-computation/README.md
- Deployment:
    - deployment/index.md
    - Server: deployment/server.md
@@ -81,6 +84,9 @@ plugins:
    canonical_version: stable
- nedoc:
    path: python/apidoc
- gen-files:
    scripts:
      - scripts/doc_copy_examples.py

extra:
  analytics:
15 changes: 15 additions & 0 deletions scripts/doc_copy_examples.py
@@ -0,0 +1,15 @@
"""
Copy all files from the `examples` directory to `<built-docs>/examples`, so that they can be rendered in the
documentation.
"""
import glob
import os.path

import mkdocs_gen_files


for path in glob.glob("examples/**/*", recursive=True):
    if os.path.isfile(path):
        with open(path) as src_file:
            with mkdocs_gen_files.open(path, "w") as dest_file:
                dest_file.write(src_file.read())
