Dear SageMaker team,

I am experiencing issues when trying to submit a SageMaker processing job. Job submission fails with the following error:
botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the CreateProcessingJob operation:
Unfortunately, the error message is empty.
After a lot of trial and error, I believe the error is related to the requested volume size. When I request a volume size of 600 GB or below, everything runs smoothly. The issue appears when I request a volume size of 700 GB or above. When I request more than 1024 GB, I receive a different error message:
botocore.errorfactory.ResourceLimitExceeded: An error occurred (ResourceLimitExceeded) when calling the CreateProcessingJob operation: The request delta of 1025 GBs for 'Size of EBS volume for a processing job instance' is greater than the account-level service-limit of 1024 GBs. Please use AWS Service Quotas to request an increase for this quota. If AWS Service Quotas is not available, contact AWS support to request an increase for this quota.
If my account's quota is 1024 GB, then 700 GB should be well within it, so this does not appear to be a quota issue.
Is there a limit that I am not aware of? In any case, I would expect a non-empty error message.
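For what it's worth, here is roughly how I inspected the account-level quota programmatically. This is a minimal sketch, assuming the Service Quotas API is available in the region; it filters by quota name because I do not know the exact quota code:

import boto3

# Sketch: list SageMaker quotas whose names mention processing jobs.
client = boto3.client("service-quotas")
paginator = client.get_paginator("list_service_quotas")
for page in paginator.paginate(ServiceCode="sagemaker"):
    for quota in page["Quotas"]:
        if "processing job" in quota["QuotaName"].lower():
            print(f"{quota['QuotaName']}: {quota['Value']}")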
Code to reproduce
Here is a toy example (with some placeholders).
This is the job to be executed, process.py: a simple Python script that counts the number of JPEG files in a given directory.
import argparse
import json
import os


def find_files_recursively(directory, file_extensions=(".jpeg", ".jpg")):
    """
    Recursively find all the files with the given file extensions.

    :param directory: input directory to search recursively
    :param file_extensions: list of file extensions
    :return: iterator that yields absolute paths
    """
    file_extensions = set(file_extensions)
    for root, dirs, files in os.walk(directory):
        for basename in files:
            ext = os.path.splitext(basename)[1].lower()
            if ext in file_extensions:
                filepath = os.path.join(root, basename)
                yield filepath


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--input_dir", type=str, required=True, help="A local directory")
    parser.add_argument("--output_dir", type=str, help="Output directory", default="/opt/ml/processing/output")
    args = vars(parser.parse_args())

    if not os.path.exists(args["input_dir"]):
        print(f"Given input directory {args['input_dir']} does not exist")
        exit(1)

    all_filepaths = list(find_files_recursively(args["input_dir"]))
    print(f"Number of images in {args['input_dir']}: {len(all_filepaths)}")

    # Store result to a JSON file
    result = {
        "input_dir": args["input_dir"],
        "number_of_files": len(all_filepaths),
    }
    output_file = os.path.join(args["output_dir"], "file_counts.json")
    with open(output_file, "w") as f:
        json.dump(result, f, indent=4)
This is how the job is submitted:
from sagemaker.pytorch import PyTorchProcessor
from sagemaker.processing import ProcessingInput, ProcessingOutput
from sagemaker import Session
import os

# Some constants
BASE_DIR = os.path.dirname(__file__)
ROLE = ...  # sourced from an environment variable
IMAGE_URI = ...  # a custom image

# Bucket for input and output data
bucket = "my-custom-bucket"
sagemaker_session = Session(default_bucket=bucket)

processor = PyTorchProcessor(
    image_uri=IMAGE_URI,
    framework_version=None,
    role=ROLE,
    instance_count=1,
    instance_type="ml.g5.4xlarge",
    volume_size_in_gb=700,
    sagemaker_session=sagemaker_session,
    code_location=f"s3://{bucket}/tmp",  # S3 prefix where the code is uploaded to
    base_job_name="debug-sagemaker-processing-700",
    command=["python"],
    env={
        "PYTHONPATH": "/opt/ml/processing/input/code"
    },
)

# Remote paths must begin with "/opt/ml/processing/".
data_root = "/opt/ml/processing/data"
models_dir = "/opt/ml/processing/models"
output_dir = "/opt/ml/processing/output"

processor.run(
    inputs=[
        ProcessingInput(source=f"s3://{bucket}", destination=data_root),
    ],
    outputs=[
        ProcessingOutput(
            output_name="output",
            source=output_dir,
            destination=f"s3://{bucket}/tmp",
        )
    ],
    code="process.py",
    source_dir=BASE_DIR,
    arguments=[
        "--input_dir", f"{data_root}/data",
        "--output_dir", output_dir,
    ],
    wait=False,
)

print(f"Processing job submitted. Check output in s3://{bucket}/tmp/")
Expected behavior
The submit script should produce the following output:
INFO:sagemaker:Creating processing-job with name debug-sagemaker-processing-700-2024-11-21-14-45-34-336
Processing job submitted. Check output in s3://my-custom-bucket/tmp/
Instead, I am getting the following error message and a stack trace:
botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the CreateProcessingJob operation:
The error message is empty.
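To confirm that the message is genuinely empty rather than swallowed by the SDK, the raw error response can be inspected. A minimal sketch around the processor.run(...) call from above:

import botocore.exceptions

try:
    processor.run(...)  # same arguments as in the example above
except botocore.exceptions.ClientError as e:
    # The raw response contains "Code" and "Message"; here "Message" is empty.
    print(e.response["Error"])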
System information
Python 3.11
boto3 version 1.34.153
Thankfully, a colleague found the problem: ml.g5.4xlarge instances only have 600 GB of local disk space, so a larger volume cannot be requested. One solution is to switch to an instance type with more disk space.
I'll keep this issue here as a reminder to improve the error message.
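Until the error message is improved, a client-side guard can catch this before submission. A minimal sketch, assuming a hand-maintained mapping from instance type to its local storage limit (the 600 GB figure for ml.g5.4xlarge comes from the instance specification):

# Hypothetical pre-submission check; the values must be kept in sync with
# the published AWS instance specs.
MAX_LOCAL_STORAGE_GB = {
    "ml.g5.4xlarge": 600,
}

def check_volume_size(instance_type: str, volume_size_in_gb: int) -> None:
    limit = MAX_LOCAL_STORAGE_GB.get(instance_type)
    if limit is not None and volume_size_in_gb > limit:
        raise ValueError(
            f"{instance_type} supports at most {limit} GB of local storage, "
            f"but {volume_size_in_gb} GB was requested."
        )

check_volume_size("ml.g5.4xlarge", 700)  # raises ValueError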