Commit

[Example] Update pytorch example
Signed-off-by: Sylveon <[email protected]>
LFsWang committed Oct 21, 2024
1 parent 5c064aa commit e2bc625
Showing 19 changed files with 226 additions and 135 deletions.
26 changes: 14 additions & 12 deletions .github/workflows/pytorch.yml
@@ -13,12 +13,12 @@ on:
     branches: [ '*' ]
     paths:
       - ".github/workflows/pytorch.yml"
-      - "pytorch-mobilenet-image/**"
+      - "pytorch-resnet18-image/**"
   pull_request:
     branches: [ '*' ]
     paths:
       - ".github/workflows/pytorch.yml"
-      - "pytorch-mobilenet-image/**"
+      - "pytorch-resnet18-image/**"

 jobs:
   build:
@@ -41,25 +41,27 @@ jobs:
     - name: Install WasmEdge + WASI-NN + PyTorch
       run: |
-        VERSION=0.13.4
+        VERSION=0.14.1
         curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | sudo bash -s -- -v $VERSION --plugins wasi_nn-pytorch -p /usr/local
-        export PYTORCH_VERSION="1.8.2"
-        # For the Ubuntu 20.04 or above, use the libtorch with cxx11 abi.
-        export PYTORCH_ABI="libtorch-cxx11-abi"
-        curl -s -L -O --remote-name-all https://download.pytorch.org/libtorch/lts/1.8/cpu/${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip
+        export PYTORCH_VERSION="2.4.1"
+        # For the AOTI example, use the libtorch build without the cxx11 ABI.
+        export PYTORCH_ABI="libtorch"
+        curl -s -L -O --remote-name-all https://download.pytorch.org/libtorch/cpu/${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip
         unzip -q "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip"
         rm -f "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip"
         export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$(pwd)/libtorch/lib
     - name: Example
       run: |
         export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$(pwd)/libtorch/lib
-        cd pytorch-mobilenet-image/rust
+        cd pytorch-resnet18-image/rust
         cargo build --target wasm32-wasi --release
         cd ..
-        wasmedge compile rust/target/wasm32-wasi/release/wasmedge-wasinn-example-mobilenet-image.wasm wasmedge-wasinn-example-mobilenet-image-aot.wasm
-        wasmedge compile rust/target/wasm32-wasi/release/wasmedge-wasinn-example-mobilenet-image-named-model.wasm wasmedge-wasinn-example-mobilenet-image-named-model-aot.wasm
+        wasmedge compile rust/target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image.wasm wasmedge-wasinn-example-resnet18-image-aot.wasm
+        wasmedge compile rust/target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image-named-model.wasm wasmedge-wasinn-example-resnet18-image-named-model-aot.wasm
         echo "Run without named model"
-        wasmedge --dir .:. wasmedge-wasinn-example-mobilenet-image-aot.wasm mobilenet.pt input.jpg
+        wasmedge --dir .:. wasmedge-wasinn-example-resnet18-image-aot.wasm resnet18.pt input.jpg
         echo "Run with named model"
-        wasmedge --dir .:. --nn-preload demo:PyTorch:CPU:mobilenet.pt wasmedge-wasinn-example-mobilenet-image-named-model-aot.wasm demo input.jpg
+        wasmedge --dir .:. --nn-preload demo:PyTorch:CPU:resnet18.pt wasmedge-wasinn-example-resnet18-image-named-model-aot.wasm demo input.jpg
+        echo "Run with AOTI"
+        wasmedge --dir .:. --nn-preload demo:PyTorchAOTI:CPU:$(pwd)/resnet18_pt2.so wasmedge-wasinn-example-resnet18-image-named-model.wasm demo input.jpg
105 changes: 0 additions & 105 deletions pytorch-mobilenet-image/README.md

This file was deleted.

18 changes: 0 additions & 18 deletions pytorch-mobilenet-image/gen_mobilenet_model.py

This file was deleted.

Binary file removed pytorch-mobilenet-image/mobilenet.pt
144 changes: 144 additions & 0 deletions pytorch-resnet18-image/README.md
@@ -0,0 +1,144 @@
# ResNet18 Example for WASI-NN with PyTorch Backend

This package is a high-level Rust bindings example for [wasi-nn], demonstrating ResNet18 image classification with the PyTorch backend.

[wasi-nn]: https://github.com/WebAssembly/wasi-nn

## Dependencies

This crate depends on the `wasi-nn` crate, declared in `Cargo.toml`:

```toml
[dependencies]
wasi-nn = "0.6.0"
```
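
For orientation, the example's inference flow follows the high-level `GraphBuilder` API of the `wasi-nn` crate: build a graph from the model, create an execution context, set the input tensor, compute, and read the output. The snippet below is a hedged, minimal sketch rather than the example's actual source; the raw-tensor file name and the 1000-class output size are assumptions based on this README.

```rust
// Minimal sketch of the wasi-nn flow (illustrative; not the exact example source).
use wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // e.g. `wasmedge --dir .:. wasmedge-wasinn-example-resnet18-image.wasm resnet18.pt input.jpg`
    let model_path = std::env::args().nth(1).expect("missing model path");

    // Load the TorchScript model file and create an execution context.
    let weights = std::fs::read(&model_path).expect("failed to read the model file");
    let graph = GraphBuilder::new(GraphEncoding::Pytorch, ExecutionTarget::CPU)
        .build_from_bytes([&weights])
        .expect("failed to load the model");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");

    // A 1x3x224x224 f32 tensor in NCHW layout (602112 bytes). Here it is read from the
    // pre-generated raw tensor file; the real example converts input.jpg itself.
    let tensor = std::fs::read("image-1x3x224x224.rgb").expect("missing raw tensor file");
    ctx.set_input(0, TensorType::F32, &[1, 3, 224, 224], &tensor)
        .expect("failed to set the input tensor");

    // Run inference and read back the 1000 ImageNet class scores.
    ctx.compute().expect("inference failed");
    let mut output = vec![0f32; 1000];
    ctx.get_output(0, &mut output).expect("failed to read the output");

    let best = output
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .unwrap();
    println!("top class index: {}, score: {}", best.0, best.1);
}
```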

## Build

Compile the application to WebAssembly:

```bash
cargo build --target=wasm32-wasi --release
```

Because this example demonstrates two ways of using wasi-nn, the build produces two WASM files: [`target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image.wasm`](wasmedge-wasinn-example-resnet18-image.wasm) and [`target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image-named-model.wasm`](wasmedge-wasinn-example-resnet18-image-named-model.wasm).
To speed up execution, you can enable the AOT mode of WasmEdge:

```bash
wasmedgec rust/target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image.wasm wasmedge-wasinn-example-resnet18-image-aot.wasm

wasmedgec rust/target/wasm32-wasi/release/wasmedge-wasinn-example-resnet18-image-named-model.wasm wasmedge-wasinn-example-resnet18-image-named-model-aot.wasm
```

## Run

### Generate Model

First, generate the fixture of the pre-trained ResNet18 model with the script:

```bash
pip3 install torch==2.4.1 numpy pillow --extra-index-url https://download.pytorch.org/whl/cpu
# generate the model fixture
python3 gen_resnet18_model.py
```

(Or you can use the pre-generated one at [`resnet18.pt`](resnet18.pt))

### Test Image

The testing image `input.jpg` is downloaded from <https://github.com/bytecodealliance/wasi-nn/raw/main/rust/examples/images/1.jpg> (Apache-2.0 license).

### Generate Tensor

If you want to generate the [raw tensor](image-1x3x224x224.rgb), you can run:

```bash
python3 gen_tensor.py input.jpg image-1x3x224x224.rgb
```
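
For reference, the raw tensor is nothing more than the decoded image resized to 224x224, normalized, and written out as 1x3x224x224 little-endian f32 values (602112 bytes). Below is a hedged Rust sketch of that preprocessing using the `image` crate; the resize filter and the ImageNet mean/std constants are the standard torchvision values and are assumptions here, not taken from this repository's scripts:

```rust
use image::imageops::FilterType;

/// Convert an image file into a 1x3x224x224 f32 NCHW tensor, serialized as bytes.
fn image_to_tensor(path: &str, height: u32, width: u32) -> Vec<u8> {
    let img = image::open(path)
        .expect("failed to open the image")
        .resize_exact(width, height, FilterType::Triangle)
        .to_rgb8();

    // Standard torchvision ImageNet normalization (assumed, not repo-specific).
    let mean = [0.485f32, 0.456, 0.406];
    let std = [0.229f32, 0.224, 0.225];

    // NCHW layout: the whole R plane, then G, then B.
    let plane = (height * width) as usize;
    let mut tensor = vec![0f32; 3 * plane];
    for (x, y, pixel) in img.enumerate_pixels() {
        for c in 0..3 {
            let v = pixel[c] as f32 / 255.0;
            tensor[c * plane + (y * width + x) as usize] = (v - mean[c]) / std[c];
        }
    }

    // Serialize as little-endian f32 bytes, matching the .rgb fixture format.
    tensor.iter().flat_map(|v| v.to_le_bytes()).collect()
}
```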

### Execute

Users should [install the WasmEdge with WASI-NN PyTorch backend plug-in](https://wasmedge.org/docs/start/install#wasi-nn-plug-in-with-pytorch-backend).

Execute the WASM with `wasmedge` (with PyTorch support):

- Case 1: Pass the model file on the command line

```bash
wasmedge --dir .:. wasmedge-wasinn-example-resnet18-image.wasm resnet18.pt input.jpg
```

You will get the output:

```console
Loaded graph into wasi-nn with ID: 0
Created wasi-nn execution context with ID: 0
Read input tensor, size in bytes: 602112
Executed graph inference
1.) [954](18.0458)banana
2.) [940](15.6954)spaghetti squash
3.) [951](14.1337)lemon
4.) [942](13.2925)butternut squash
5.) [941](10.6792)acorn squash
```

- Case 2: Apply the named model feature (a sketch of this variant follows the output below)
  > Requires wasi-nn >= 0.5.0 and the WasmEdge wasi_nn plug-in >= 0.13.4.
  > The `--nn-preload` argument follows the format `<name>:<encoding>:<target>:<model_path>`.
```bash
wasmedge --dir .:. --nn-preload demo:PyTorch:CPU:resnet18.pt wasmedge-wasinn-example-resnet18-image-named-model.wasm demo input.jpg
```

You will get the same output:

```console
Loaded graph into wasi-nn with ID: 0
Created wasi-nn execution context with ID: 0
Read input tensor, size in bytes: 602112
Executed graph inference
1.) [954](18.0458)banana
2.) [940](15.6954)spaghetti squash
3.) [951](14.1337)lemon
4.) [942](13.2925)butternut squash
5.) [941](10.6792)acorn squash
```
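
The named-model variant used in Case 2 differs only in how the graph is created: instead of reading the model file itself, it looks up the graph the host preloaded via `--nn-preload` under the given name (`demo` above). A minimal, hedged sketch assuming the `build_from_cache` helper of the same `wasi-nn` crate; input and output handling stay the same as in the earlier sketch:

```rust
use wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding};

fn main() {
    // e.g. `wasmedge --dir .:. --nn-preload demo:PyTorch:CPU:resnet18.pt \
    //        wasmedge-wasinn-example-resnet18-image-named-model.wasm demo input.jpg`
    let model_name = std::env::args().nth(1).expect("missing model name");

    // Look up the graph preloaded by the host instead of loading a file here.
    let graph = GraphBuilder::new(GraphEncoding::Pytorch, ExecutionTarget::CPU)
        .build_from_cache(&model_name)
        .expect("no preloaded model registered under this name");
    let _ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");
    // set_input / compute / get_output proceed exactly as in the earlier sketch.
}
```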

## Run from AOTInductor

### Generate Model

The PyTorch backend can also load a model compiled with AOTInductor (a shared library). To compile the PyTorch model, follow the official PyTorch tutorial:

* https://pytorch.org/tutorials/recipes/torch_export_aoti_python.html


Or you can use the pre-generated one at [`resnet18_pt2.so`](resnet18_pt2.so). However, it may not be suitable for your machine; it is recommended to use [`gen_resnet18_aoti.py`](gen_resnet18_aoti.py) to recompile the model.

> Notice: The AOTInductor output from the pip-installed PyTorch uses the old C++ ABI, which may be incompatible with the WasmEdge release builds. You may need to install libtorch **without the cxx11 ABI** and rebuild WasmEdge with `-DWASMEDGE_USE_CXX11_ABI=OFF`:

```bash
# Example: build WasmEdge with CMake
cmake -Bbuild -GNinja -DWASMEDGE_USE_CXX11_ABI=OFF -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=PyTorch .
```

### Execute

To run the AOTInductor model, use `--nn-preload` with the `PyTorchAOTI` encoding and give the absolute path of the shared library:

```bash
export LD_LIBRARY_PATH=/path_to_libtorch/lib
./wasmedge --dir .:. --nn-preload demo:PyTorchAOTI:CPU:/absolute_path_model/resnet18_pt2.so wasmedge-wasinn-example-resnet18-image-named-model.wasm demo input.jpg
```

```console
Loaded graph into wasi-nn with ID: 0
Created wasi-nn execution context with ID: 0
Read input tensor, size in bytes: 602112
Executed graph inference
1.) [954](18.0458)banana
2.) [940](15.6954)spaghetti squash
3.) [951](14.1337)lemon
4.) [942](13.2925)butternut squash
5.) [941](10.6792)acorn squash
```
39 changes: 39 additions & 0 deletions pytorch-resnet18-image/gen_resnet18_aoti.py
@@ -0,0 +1,39 @@
# For more detail, follow the link below
# https://pytorch.org/tutorials/recipes/torch_export_aoti_python.html

import os
import torch
from torchvision.models import ResNet18_Weights, resnet18

model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.eval()

with torch.inference_mode():

    # Specify the generated shared library path
    aot_compile_options = {
        "aot_inductor.output_path": os.path.join(os.getcwd(), "resnet18_pt2.so"),
    }
    if torch.cuda.is_available():
        device = "cuda"
        aot_compile_options.update({"max_autotune": True})
    else:
        device = "cpu"

    model = model.to(device=device)
    example_inputs = (torch.randn(2, 3, 224, 224, device=device),)

    # min=2 is not a bug and is explained in the 0/1 Specialization Problem
    batch_dim = torch.export.Dim("batch", min=2, max=32)
    exported_program = torch.export.export(
        model,
        example_inputs,
        # Specify the first dimension of the input x as dynamic
        dynamic_shapes={"x": {0: batch_dim}},
    )
    so_path = torch._inductor.aot_compile(
        exported_program.module(),
        example_inputs,
        # Specify the generated shared library path
        options=aot_compile_options
    )
17 changes: 17 additions & 0 deletions pytorch-resnet18-image/gen_resnet18_model.py
@@ -0,0 +1,17 @@
import os
import torch
from torch import jit

with torch.no_grad():
    fake_input = torch.rand(1, 3, 224, 224)
    model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)
    model.eval()
    out1 = model(fake_input).squeeze()

    sm = torch.jit.script(model)
    if not os.path.exists("resnet18.pt"):
        sm.save("resnet18.pt")
    load_sm = jit.load("resnet18.pt")
    out2 = load_sm(fake_input).squeeze()

    print(out1[:5], out2[:5])
File renamed without changes.
File renamed without changes
Binary file added pytorch-resnet18-image/resnet18.pt
Binary file added pytorch-resnet18-image/resnet18_pt2.so
12 changes: 12 additions & 0 deletions pytorch-resnet18-image/run.py
@@ -0,0 +1,12 @@
import os
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model_so_path = os.path.join(os.getcwd(), "resnet18_pt2.so")

model = torch._export.aot_load(model_so_path, device)
example_inputs = (torch.randn(1, 3, 224, 224, device=device),)

with torch.inference_mode():
    output = model(example_inputs)
    print(output)
File renamed without changes.
File renamed without changes.
