Add skip_resource_check option #191
base: master
Conversation
If you really need to preserve the file size data, we can do it by starting a GET request stream and terminating it early as soon as the Content-Length header is available.
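A minimal sketch of that approach using Ruby's Net::HTTP (the helper name is illustrative, not part of this library):

```ruby
require "net/http"
require "uri"

# Hypothetical helper: open a GET stream and return as soon as the
# headers arrive, so the body is never downloaded.
def content_length_via_get(url)
  uri = URI(url)
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request_get(uri.request_uri) do |response|
      # Inside this block the headers have been parsed but the body has
      # not been read; returning here closes the connection early.
      return response["Content-Length"]&.to_i
    end
  end
end
```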
I need to create a task that runs over thousands of Amazon S3 videos and updates their metadata; obviously it's better not to download each video. Unfortunately, the HEAD request fails for me, so maybe this is a good direction 👍
@mhluska Are you sure the file size data is lost in a HEAD request? I'm actually not quite sure how ffmpeg gets the metadata from a remote file.
@yoelblum If I remember correctly, with this library you'll ultimately have to download the full file to do any processing on it. If all you need is the Content-Length header and to modify metadata in S3, you probably don't need this library. You can likely get by with the AWS Ruby SDK: https://github.com/aws/aws-sdk-ruby
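For example, a rough sketch with the SDK alone (bucket and key are placeholders):

```ruby
require "aws-sdk-s3" # gem "aws-sdk-s3"

s3 = Aws::S3::Client.new(region: "us-east-1")

# head_object returns the size without transferring the body.
size = s3.head_object(bucket: "my-bucket", key: "videos/clip.mp4").content_length

# S3 object metadata is immutable, so "updating" it means copying the
# object onto itself with a REPLACE metadata directive.
s3.copy_object(
  bucket: "my-bucket",
  key: "videos/clip.mp4",
  copy_source: "my-bucket/videos/clip.mp4",
  metadata: { "reviewed" => "true" },
  metadata_directive: "REPLACE"
)
```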
@mhluska It works fine without downloading the file. The metadata is returned correctly, including duration and file size. I have no idea how it works, but it works! The only issue I had was indeed the HEAD request for Amazon S3.
Huh, interesting. Well, glad you got it working.
I'm having the exact same problem. If the URL is a presigned S3 one, the HEAD request will fail because presigned URLs from S3 are signed for GET requests only; if you want one that supports HEAD, the URL signature will be different.
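For illustration, a sketch with the SDK's presigner (bucket and key are placeholders): the HTTP verb is part of the signature, so each verb needs its own URL.

```ruby
require "aws-sdk-s3"

presigner = Aws::S3::Presigner.new

# A URL presigned for :get_object returns 403 when used for HEAD,
# so a separate URL must be signed for :head_object.
get_url  = presigner.presigned_url(:get_object,  bucket: "my-bucket", key: "videos/clip.mp4")
head_url = presigner.presigned_url(:head_object, bucket: "my-bucket", key: "videos/clip.mp4")
```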
Issue #189
Allows skipping the internal HEAD request, which may be rejected when the target is a pre-signed URL (e.g. Amazon S3).
This helps in environments like Heroku where downloading the file up front is not an option (large files inflate process memory, causing the dyno to be killed).
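A hypothetical usage sketch, assuming the option lands as a keyword argument on FFMPEG::Movie.new (the exact signature depends on the merged patch):

```ruby
require "streamio-ffmpeg" # assuming the streamio-ffmpeg-style Movie API

# Hypothetical: skip the HEAD-based resource check so a GET-only
# pre-signed URL can be probed directly.
movie = FFMPEG::Movie.new(presigned_url, skip_resource_check: true)
movie.duration # metadata comes from ffprobe reading the remote stream
```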