Requesting 50 files at once generally works, but not when the file names happen to be long. In that case the request fails because the URL exceeds ~2048 bytes, which appears to be roughly the longest length supported. It would be good to pin down exactly where this limit comes from (urllib?) and to ensure the generated URL never exceeds it: if 50 files don't fit, the request should be split and run twice, or as many times as needed to process all files in the task. A rough sketch of that batching idea follows below.
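A minimal sketch of the proposed fix, assuming the files are passed as a query parameter; all names here (`batch_files`, the `files` parameter, the example endpoint) are hypothetical and not the project's actual API. The idea is to build batches greedily, starting a new batch whenever adding another file name would push the generated URL past a conservative length limit or past the existing 50-file cap.

```python
from urllib.parse import urlencode

MAX_URL_LENGTH = 2000      # stay safely under the ~2048-byte limit observed
MAX_FILES_PER_REQUEST = 50  # existing per-request cap

def batch_files(base_url, file_names,
                max_url_length=MAX_URL_LENGTH,
                max_files=MAX_FILES_PER_REQUEST):
    """Yield lists of file names whose combined URL fits within the limit."""
    batch = []
    for name in file_names:
        candidate = batch + [name]
        url = base_url + "?" + urlencode({"files": ",".join(candidate)})
        if batch and (len(url) > max_url_length or len(candidate) > max_files):
            yield batch        # current batch is full; start a new one
            batch = [name]
        else:
            batch = candidate
    if batch:
        yield batch

# Usage: issue one request per batch instead of one request for all 50 files.
# for chunk in batch_files("https://example.org/api/files", names):
#     fetch(chunk)             # hypothetical request helper
```

With this approach the number of requests grows only when the names are long; the common case of 50 short names still goes out as a single request.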