Hi, I'm a developer of a machine learning system at a startup.
I want to deploy a TrOCR model on a Triton Inference Server.
TrOCR is a Transformer-based encoder-decoder OCR model.
Could you tell me whether it can be deployed using FasterTransformer?
If that is not possible, is there another way to deploy it on the Triton Inference Server?
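For context, the fallback I'm currently considering is wrapping the Hugging Face TrOCR pipeline in Triton's Python backend. A minimal `config.pbtxt` sketch of what I have in mind (model name, tensor names, and the 384x384 input shape of the base TrOCR checkpoints are my assumptions, not from any official example):

```
name: "trocr"            # hypothetical model name
backend: "python"
max_batch_size: 8
input [
  {
    name: "pixel_values" # preprocessed image tensor (assumed name)
    data_type: TYPE_FP32
    dims: [ 3, 384, 384 ]
  }
]
output [
  {
    name: "text"         # decoded OCR string (assumed name)
    data_type: TYPE_STRING
    dims: [ 1 ]
  }
]
instance_group [ { kind: KIND_GPU } ]
```

The Python backend's `model.py` would then run the encoder and the autoregressive decoding loop with `transformers`, which is simple but likely slower than a FasterTransformer kernel, hence my question.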