[Question]About the custom operation lib in Whisper #101

Open
Eric-Edf opened this issue Dec 28, 2024 · 1 comment

@Eric-Edf

Dear authors,

Thank you all for releasing such an amazing project.

I encountered some problems while porting Recognizer.java to C++ code. The core issue is caused by some custom operations in “libonnxruntime-extensions-jni4.so”.

I think this library defines some necessary dependencies, but its source code is not included in the project. May I ask what the custom library contains? And could you release its source code?

Thanks!

@Eric-Edf added the question (Further information is requested) label on Dec 28, 2024
@niedev
Owner

niedev commented Dec 28, 2024

Hi, thank you!
The library you are referring to is onnxruntime-extensions; here are the source code and instructions on how to import it into C++ code, and here is how to import onnxruntime into C/C++.
The source code is not present in the project because the Android versions of these libraries are imported via Gradle here (lines 106 and 107). However, if you want to port the inference to C++, the C/C++ versions of these libraries in the links I provided will work in a similar way and do the same things with the same models.
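
In case it's useful, here is a minimal sketch (a generic assumption, not code taken from this project) of how the two C++ libraries are usually wired together: onnxruntime-extensions exports a RegisterCustomOps entry point that you attach to the session options before creating the session, so the custom operators referenced by the model can be resolved. The model file name below is just a placeholder.

```cpp
// Minimal sketch: register onnxruntime-extensions' custom operators with an
// ONNX Runtime session in C++. Assumes both libraries' headers and binaries
// are on the include/link path; "whisper_model.onnx" is a placeholder path.
#include <onnxruntime_cxx_api.h>
#include <onnxruntime_extensions.h>  // declares RegisterCustomOps

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "whisper");
  Ort::SessionOptions session_options;

  // Attach the extensions' custom ops to the session options so that a model
  // containing them (e.g. audio or tokenization ops) can be loaded.
  Ort::ThrowOnError(RegisterCustomOps(
      static_cast<OrtSessionOptions*>(session_options), OrtGetApiBase()));

  // Create and run the session as usual.
  Ort::Session session(env, "whisper_model.onnx", session_options);
  return 0;
}
```

Linking the extensions and calling RegisterCustomOps is one option; as far as I know, ONNX Runtime's session options can also load a prebuilt custom-ops shared library at runtime instead.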
