The LlmInference model is not closing #5740
Labels
platform:android
Issues with Android as Platform
task:LLM inference
Issues related to MediaPipe LLM Inference Gen AI setup
type:bug
Bug in the Source Code of MediaPipe Solution
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)
Yes
OS Platform and Distribution
Android 14
Mobile device if the issue happens on mobile device
QCOM ADP 8155
Browser and version if the issue happens on browser
No response
Programming Language and version
Kotlin/Java
MediaPipe version
0.10.18
Bazel version
No response
Solution
LLM Inference
Android Studio, NDK, SDK versions (if issue is related to building in Android environment)
Android Studio Koala | 2024.1.1
Xcode & Tulsi version (if issue is related to building for iOS)
No response
Describe the actual behavior
I call the close() function on LlmInference, but the model is not closed immediately.
Describe the expected behaviour
close() should release the LlmInference model and its resources immediately.
Standalone code/steps you may have used to try to get what you need
inferenceModel.close()
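For context, a minimal sketch of the lifecycle around that call, using the MediaPipe LLM Inference Android API. The model path and the surrounding function are assumptions for illustration; the close() call at the end is where the delayed teardown is observed:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical helper showing create -> use -> close.
fun runAndClose(context: Context) {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.bin") // assumed path
        .build()

    val inferenceModel = LlmInference.createFromOptions(context, options)

    val response = inferenceModel.generateResponse("Hello")

    // Expected: native resources are released immediately.
    // Observed in this issue: the model is not torn down right away.
    inferenceModel.close()
}
```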
Other info / Complete Logs