Support running recognize in a separate container (with GPU support) #1061
Comments
Hello 👋 Thank you for taking the time to open this issue with recognize. I know it's frustrating when software doesn't work the way you expect it to. I look forward to working with you on this issue.
Hi @SilleBille, Nextcloud GmbH is planning to make this happen soon-ish :)
+1 upvote to this, especially the integration with AIO.
That would be a good way to improve the Nextcloud ecosystem and allow users to install only the containers that are required.
Any updates on this? Would be great to have the video tagging and performance benefits. I might have a little dev time to contribute if someone could point me in the right direction.
If you're in urgent need, there's this mod I've done: the way I have it set up is very close to, if not exactly, what the requester describes. In my case it runs on a completely separate system, accesses the files over NFS, and stores all data in a shared Postgres DB. Use the nvidia-tensor-based variant if you're fine with some mappings being done there.
Sorry to say, the plans have been scrapped due to lack of engineering time so far. It's still on our list of things that would be nice to have, but it's not scheduled any time soon for now :/
Are you open to community contributions on this one? If so, it would be helpful to have an outline of how you intended to implement it, if one exists. I'm not sure I will personally have the time, but perhaps someone else does.
I'd be open to community contributions on this. My rough plan would be not to deviate too much from how the models are run right now. Instead of the Classifier class executing node.js directly, there would be an option in the settings to call out to the recognize External App instead, or perhaps the external app could be auto-detected. The external app would do the same thing as the Classifier class: execute node.js and return the JSON-line results, so they can be processed in the original recognize app. These are the current docs on how App API / External Apps work: https://cloud-py-api.github.io/app_api/index.html
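To make that shape a bit more concrete, here is a minimal sketch (not an official design) of what the External App side could look like if it exposed a plain HTTP endpoint that spawns the same node.js classifier and streams its JSON-line output back. The /classify route, the port, the request format, and the script path are all assumptions made for this illustration; the real integration would go through App API as linked above.

```typescript
// Hypothetical External App endpoint (sketch only). It does what the Classifier
// class does today, spawn node.js on a classifier script, and streams the
// resulting JSON lines back to the caller. Paths, port and route are made up.
import { createServer } from "node:http";
import { spawn } from "node:child_process";

const CLASSIFIER_SCRIPT = "/app/src/classifier_imagenet.js"; // assumed location

const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/classify") {
    res.writeHead(404).end();
    return;
  }

  // Collect the JSON request body: { "files": ["/path/one.jpg", ...] }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const { files } = JSON.parse(body) as { files: string[] };

    // One node.js process, just like the in-app Classifier: it prints one
    // JSON object per processed file on stdout.
    const child = spawn("node", [CLASSIFIER_SCRIPT, ...files]);

    res.writeHead(200, { "Content-Type": "application/x-ndjson" });
    child.stdout.pipe(res); // pipe() ends the response when the process exits
    child.stderr.pipe(process.stderr); // surface classifier errors in the container log
  });
});

server.listen(8080); // port the Nextcloud side would be configured to call
```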
I'm closing this in favor of #73, which is basically the same thing. Upvote there to make it more likely that I get to work on this :)
Describe the feature you'd like to request
With the nextcloud/all-in-one container that spawns multiple containers, I'd like to run the recognize app in a separate container that can be used by the nextcloud-aio-nextcloud container. This will let the user provide GPU capabilities to that one container without having to provide that functionality to the whole NC ecosystem. It also provides more portability and isolation.
Describe the solution you'd like
A docker container that is exposed through a specific port, so that the main nextcloud-aio-nextcloud container can interact with it over the shared nextcloud-aio bridge network.
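For illustration only, this is roughly how the main container could reach such a classifier container over the shared bridge network, where Docker's embedded DNS resolves the container name. The nextcloud-aio-recognize host name, the port, and the /classify route are invented for this sketch, and it assumes the NDJSON response format from the sketch earlier in the thread; the real recognize app would make this call from its PHP side rather than from TypeScript.

```typescript
// Hypothetical caller running inside the nextcloud-aio-nextcloud container
// (sketch only). "nextcloud-aio-recognize" is an invented service name that
// Docker's DNS would resolve on the shared nextcloud-aio bridge network.
const CLASSIFIER_URL = "http://nextcloud-aio-recognize:8080/classify";

async function classify(files: string[]): Promise<object[]> {
  const res = await fetch(CLASSIFIER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ files }),
  });
  if (!res.ok) {
    throw new Error(`classifier container returned HTTP ${res.status}`);
  }

  // Newline-delimited JSON: one result object per input file.
  const text = (await res.text()).trim();
  if (!text) {
    return [];
  }
  return text.split("\n").map((line) => JSON.parse(line));
}

// Example usage with a placeholder path:
classify(["/mnt/ncdata/admin/files/photo.jpg"]).then((results) => {
  for (const result of results) {
    console.log(result);
  }
});
```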
Describe alternatives you've considered
A docker container that shares all the existing volumes and mounts used by the nextcloud-aio-nextcloud container in ro (read-only) mode. I am also trying to completely build a container with CUDA + cuDNN myself and implement this solution. (I'll keep this issue updated based on the results.)