Adding a benchmark for wasi-nn #201

Merged 6 commits into bytecodealliance:main on Nov 18, 2022

Conversation

brianjjones (Contributor)

This benchmark measures the performance of wasi-nn by timing how long a single inference takes to complete.
@brianjjones (Contributor Author)

brianjjones commented Sep 15, 2022

One issue that needs to be addressed is that this benchmark requires OpenVINO to run, so I think it would be best for it to be an optional benchmark that doesn't get executed automatically by run-all.sh. Thoughts on the best course of action? I already have setup.sh ready to go, which will install OpenVINO for you, but obviously it shouldn't be run when calling run-all.sh.

* Renaming to image-classification to better
  describe what it does.
* Updated setup.sh to unpack a local copy of
  OpenVINO rather than installing.
* Updated the README to inform users how to use
  this benchmark.
* Changed the name of the .wasm file from
  benchmark.wasm to image-classification-benchmark.wasm
  so that it doesn't get picked up by run_all.sh
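
(For context, benchmark discovery presumably works by globbing for the conventional benchmark.wasm name; the loop below is a hypothetical sketch of that convention, not the actual contents of run_all.sh.)

# Hypothetical sketch of how a run-all script discovers benchmarks:
# anything not named benchmark.wasm simply never gets picked up.
for wasm in benchmarks/*/benchmark.wasm; do
    run_benchmark "$wasm"    # run_benchmark is a placeholder, not a real helper
done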
@brianjjones (Contributor Author)

@abrown This PR is ready for review now. It's been updated with the changes we discussed. The one possible issue is that the current build.sh script expects the file created in Docker to be named benchmark.wasm. Perhaps we could change this to look for any file that matches *-benchmark.wasm? This way, when build_all.sh or build.sh is called on image-classification, it names it image-classification-benchmark.wasm instead. This prevents run_all.sh from accidentally picking up the image-classification benchmark.

@jlb6740 requested review from jlb6740 and abrown on October 18, 2022
@abrown (Collaborator) left a comment

I think this mostly looks good, minus the comments I made below. I'm also interested to hear @fitzgen's opinion on this PR.

ADD rust-benchmark rust-benchmark
WORKDIR /usr/src/rust-benchmark
RUN cargo build --release --target=wasm32-wasi
RUN cp target/wasm32-wasi/release/image-classification-benchmark.wasm /benchmark.wasm
Collaborator

Yeah, I agree with your comment above about changing the build.sh to be a more liberal "copier." The problem is that I'm not sure you can use glob patterns like *benchmark.wasm with Docker commands? If you can, then great, we should do that; otherwise we may have to modify build.sh to copy everything from some directory, e.g., /sightglass-output.
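
(As a minimal sketch of that directory-copy alternative, assuming the Dockerfile collects its artifacts under /sightglass-output; the image name and variables here are hypothetical, not what build.sh actually does:)

# Hypothetical "liberal copier": copy a whole output directory out of the
# container instead of a single hard-coded benchmark.wasm.
CONTAINER_ID=$(docker create sightglass-benchmark-image)
docker cp "${CONTAINER_ID}:/sightglass-output/." "${BENCHMARK_DIR}/"
docker rm "${CONTAINER_ID}"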

@abrown (Collaborator) commented Oct 25, 2022

I'm actually surprised by this question: how does the "rebuild" CI not fail? It seems like it should, because this Dockerfile emits a benchmark.wasm but you have checked in an image-classification-benchmark.wasm; that should result in a failure when CI realizes we are not building the same files.

I think the answer is that build-all.sh picks a random subset of benchmarks to build, so this will eventually fail, just not every time. (Rebuilding the benchmarks takes a while so in an effort to speed up CI we just do a subset). You might want to run build.sh on this benchmark and see what happens.

(On a side note, I think at some point there was a Git check to make sure none of the bytes had changed, e.g., git diff --exit-code. That is no longer there so let me open an issue... #210)
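
(A sketch of the kind of check described there, assuming it runs from the repository root; how CI actually wires it up is for #210 to decide:)

# Rebuild one benchmark, then fail if the checked-in .wasm no longer matches
# what the Dockerfile produces.
(cd benchmarks && ./build.sh image-classification/)
git diff --exit-code -- benchmarks/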

@brianjjones (Contributor Author)

@abrown Not sure why the CI didn't fail, but I just made some changes so that build.sh now allows wasm file names other than benchmark.wasm. You can either pass the .wasm filename to build.sh, or it will look up the name in the benchmark being built and use that.
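
(A minimal sketch of that lookup, with assumed variable names and structure rather than the PR's exact build.sh changes:)

# Accept an optional .wasm filename; otherwise default to the conventional name.
BENCHMARK_DIR=$1
WASM_NAME=${2:-benchmark.wasm}
# If the benchmark directory already contains a differently named .wasm file,
# prefer that name so it is preserved (and run_all.sh can keep skipping it).
for existing in "${BENCHMARK_DIR}"/*.wasm; do
    [ -f "$existing" ] && WASM_NAME=$(basename "$existing")
done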

# Download and unpack a local copy of OpenVINO under ${WASI_NN_DIR}/openvino, then load its environment variables.
FILENAME=l_openvino_toolkit_ubuntu20_2022.2.0.7713.af16ea1d79a_x86_64
[ ! -d ${WASI_NN_DIR}/openvino ] && wget -nc -q --no-check-certificate https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.2/linux/${FILENAME}.tgz -O ${WASI_NN_DIR}/${FILENAME}.tgz
[ ! -d ${WASI_NN_DIR}/openvino ] && tar -C ${WASI_NN_DIR} -zxf ${WASI_NN_DIR}/${FILENAME}.tgz && mv ${WASI_NN_DIR}/${FILENAME} ${WASI_NN_DIR}/openvino || echo "OpenVINO is already there, skipping..."
source ${WASI_NN_DIR}/openvino/setupvars.sh
Collaborator

Could this script also retrieve mobilenet.bin, mobilenet.xml, and the test.jpg? The additional weight isn't a lot for those files (i.e., <14MB) but it might be nice to avoid downloading it if users don't want to.

@brianjjones (Contributor Author)

Sure, I guess I was just doing it this way because all the other benchmarks include everything needed to just run the provided benchmark.wasm. But since this one requires running setup.sh regardless, I don't see why they can't be downloaded.
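
(A sketch of what that could look like in setup.sh; MODEL_URL is a placeholder, not the actual hosting location the PR ended up using:)

# Hypothetical addition: fetch the model files and test image only if missing.
MODEL_URL=https://example.com/mobilenet    # placeholder URL
for f in mobilenet.bin mobilenet.xml test.jpg; do
    [ -f "${WASI_NN_DIR}/${f}" ] || wget -q "${MODEL_URL}/${f}" -O "${WASI_NN_DIR}/${f}"
done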

@brianjjones (Contributor Author)

Update: I'm keeping the original test.jpg because the old .bgr file doesn't seem to work with the example.

@@ -66,7 +66,7 @@ fn main() {
// Build the engine library.
section("Building the engine");
exec(
&["cargo", "build", "--release", "-p", "wasmtime-bench-api"],
&["cargo", "build", "--release", "-p", "wasmtime-bench-api", "--features", "wasi-nn"],
Collaborator

Did we figure out if this was actually necessary? I think it would be nicer (but this is a nit, not a real complaint) if it could be removed, either a) because it is unnecessary or b) via an upstream change that builds wasi-nn into the bench-api by default.

@brianjjones (Contributor Author)

Yes this isn't needed, I'll take it out.

// the pixel precision to FP32. The resulting BGR pixel vector is then returned.
fn image_to_tensor(path: String, height: u32, width: u32) -> Vec<u8> {
let pixels = Reader::open(path).unwrap().decode().unwrap();
let dyn_img: DynamicImage = pixels.resize_exact(width, height, image::imageops::Triangle);
Collaborator

Is there an issue anywhere (or TODO) to remind us to replace with the new and improved image2tensor crate?

@brianjjones (Contributor Author)

I've added a TODO.

publish = false

[dependencies]
image = { version = "0.23.14", default-features = false, features = ["gif", "jpeg", "ico", "png", "pnm", "tga", "tiff", "webp", "bmp", "hdr", "dxt", "dds", "farbfeld"] }
Collaborator

I think we're only using "jpeg" here, right?

@brianjjones (Contributor Author)

I suppose there's no reason currently to pull in the other image types.

* Removing the un-needed "--features" flag
* Adding TODO for image2tensor
* Removing un-needed dependencies
* Download the model files
* Changes to build.sh that allow benchmarks to have a different name than
  benchmark.wasm. This allows them to opt out of ./run-all.sh if they choose.
  Users can provide the expected .wasm name if they want; if one isn't
  provided, it will look for the filename in the benchmark being built.
@jlb6740 (Collaborator)

jlb6740 commented Nov 14, 2022

Hi @brianjjones, without being sure of the steps to follow to run this, I did the following:

  1. cd benchmarks; ./build.sh image-classification/
  2. cd image-classification ; "Read the readme" ; ./setup.sh
  3. cd ../../ ; cargo run -- benchmark --engine engines/wasmtime/libengine.so --engine-flags=--wasi-modules=experimental-wasi-nn -- benchmarks/image-classification/ # but no wasm file.

Realized I should run ./build.sh, after which I ran:

  1. cd benchmarks; ./build.sh image-classification/
  2. cd ../ ; cargo run -- benchmark --engine engines/wasmtime/libengine.so --engine-flags=--wasi-modules=experimental-wasi-nn -- benchmarks/image-classification/image-classification-benchmark.wasm

But then I get this error:

 Running `target/debug/sightglass-cli benchmark --engine engines/wasmtime/libengine.so --engine-flags=--wasi-modules=experimental-wasi-nn -- benchmarks/image-classification/image-classification-benchmark.wasm`

Host error: BackendError(BackendAccess(library loading error

Caused by:
system failed to load shared libraries (see https://github.com/intel/openvino-rs/blob/main/crates/openvino-finder): the shared library at /opt/intel/openvino_2021/deployment_tools/inference_engine/lib/intel64/libinference_engine_c_api.so could not be opened: libinference_engine.so: cannot open shared object file: No such file or directory))

What are the correct steps?

@brianjjones (Contributor Author)

@jlb6740 For whatever reason, sourcing the setupvars.sh script from setup.sh doesn't work, so you'll need to call the following:

source benchmarks/image-classification/openvino/setupvars.sh

After that, your cargo run should work.
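
Putting the pieces from this thread together, the full sequence looks roughly like this (assembled from the commands above; the README is the authoritative version):

# From the sightglass repository root:
cd benchmarks && ./build.sh image-classification/
cd image-classification && ./setup.sh && cd ../..
# Sourcing setupvars.sh must happen in the current shell, not inside setup.sh.
source benchmarks/image-classification/openvino/setupvars.sh
cargo run -- benchmark --engine engines/wasmtime/libengine.so --engine-flags=--wasi-modules=experimental-wasi-nn -- benchmarks/image-classification/image-classification-benchmark.wasm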

@brianjjones (Contributor Author)

Also, I'll update the README with clearer instructions.

@brianjjones (Contributor Author)

@jlb6740 I updated the instructions in the README; let me know if that's clearer and works for you.

@jlb6740 (Collaborator)

jlb6740 commented Nov 18, 2022

Looks good to me. Thanks for the updates.

@jlb6740 jlb6740 merged commit 49f6739 into bytecodealliance:main Nov 18, 2022