> If `WAMR_BUILD_WASI_NN` is enabled, iwasm links against a shared WAMR library instead of a static one, and the WASI-NN backends are loaded dynamically at runtime. Specify the path of the backend library and register it with the iwasm runtime via `--native-lib=<path of backend library>`. All shared libraries must be discoverable through `LD_LIBRARY_PATH`.

#### Compilation options

- `WAMR_BUILD_WASI_NN`: enables WASI-NN support. It cannot be used alone; a backend must also be selected. Follows the legacy WASI-NN specification's naming convention, using `wasi_nn` as the import module name.
- `WAMR_BUILD_WASI_EPHEMERAL_NN`: follows the latest WASI-NN specification's naming convention, using `wasi_ephemeral_nn` as the import module name.
- `WAMR_BUILD_WASI_NN_TFLITE`: selects TensorFlow Lite as the backend.
- `WAMR_BUILD_WASI_NN_OPENVINO`: selects OpenVINO as the backend.
- `WAMR_BUILD_WASI_NN_LLAMACPP`: selects llama.cpp as the backend.

Note that the Wasm application must be recompiled when switching between the two sets of functions.
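The options above are passed to CMake when configuring the iwasm build. As a minimal sketch, a build with the TensorFlow Lite backend might be configured as follows (the source directory path is illustrative and depends on your platform and checkout layout):

```shell
# Illustrative CMake configuration; adjust the source path for your platform.
cmake -S product-mini/platforms/linux -B build \
      -DWAMR_BUILD_WASI_NN=1 \
      -DWAMR_BUILD_WASI_NN_TFLITE=1
cmake --build build
```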

#### OpenVINO installation

If you plan to use the OpenVINO backend, first install OpenVINO on your machine by following the official installation guide: https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-archive-linux.html.
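After installing from the archive, the OpenVINO environment typically needs to be initialized in each shell before running iwasm. A minimal sketch, assuming the default archive install prefix (adjust the path if you unpacked OpenVINO elsewhere):

```shell
# Archive installs of OpenVINO provide a setupvars.sh script that sets up
# the environment (library paths etc.). The prefix below is the documented
# default location; yours may differ.
source /opt/intel/openvino_2024/setupvars.sh
```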
### Testing with WasmEdge-WASINN Examples

To make sure everything is configured properly, refer to the examples provided at [WasmEdge-WASINN-examples](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master). These examples are useful for confirming that the WASI-NN support in WAMR is working correctly.

> Note: The repository contains two types of examples. Some use the [standard wasi-nn](https://github.com/WebAssembly/wasi-nn), while others use [WasmEdge's version of wasi-nn](https://github.com/second-state/wasmedge-wasi-nn), which is enhanced to meet specific customer needs.

The examples test the following machine learning backends:

- OpenVINO
- PyTorch
- TensorFlow Lite

Because each backend has its own set of requirements, we recommend using a Docker container for a hassle-free testing environment.
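As an illustrative sketch of such a setup (the image name and mount paths here are hypothetical placeholders, not an official image):

```shell
# Hypothetical container invocation for an isolated test environment;
# replace the image name with whatever image you build for the backends.
docker run -it --rm \
    -v "$(pwd)":/workspace \
    -w /workspace \
    my-wasi-nn-test:latest \
    bash
```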
The qwen example is used as the default for the llama.cpp backend because it uses a small model and is easy to run.

```bash
- openvino_mobile_image. PASS
- openvino_mobile_raw. PASS
- openvino_road_segmentation_adas. PASS
- wasmedge_ggml_qwen. PASS
```

### Testing with bytecodealliance WASI-NN
For another example, check out [classification-example](https://github.com/bytecodealliance/wasi-nn/tree/main/rust/examples/classification-example), which focuses on OpenVINO. You can run it using the same Docker container mentioned above.
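Assuming a shared-library build as described earlier, running such an example might look like the following sketch (the library and Wasm file names are placeholders; substitute the actual paths produced by your build):

```shell
# Illustrative only: paths and file names are placeholders.
# The backend shared library is registered via --native-lib, and all shared
# libraries must be discoverable through LD_LIBRARY_PATH.
export LD_LIBRARY_PATH=/path/to/wamr/libs:$LD_LIBRARY_PATH
iwasm --native-lib=libwasi_nn_openvino.so \
      --dir=. \
      classification-example.wasm
```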