
How to send a llama_parse job in python async? #513

Open
AyushParikh opened this issue Nov 25, 2024 · 10 comments
Labels
enhancement New feature or request

Comments

@AyushParikh

Hi, I want to start a LlamaParse job in Python without waiting for the response. How can I achieve this?

    parser = LlamaParse(
        webhook_url=webhook_url
    ).load_data(file_path=path)

@AyushParikh added the enhancement label Nov 25, 2024
@logan-markewich
Contributor

await parser.aload_data(...)

@AyushParikh
Author

AyushParikh commented Nov 25, 2024

await parser.aload_data(...)

When I do this, I get

'await parser.aload_data(...)' was never awaited

@AyushParikh
Author

I also tried:

async def start_parser(parser, local_file_path: str):
    await parser.aload_data(file_path=local_file_path)
    
start_parser(parser=parser, local_file_path=local_file_path)
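The attempt above calls an async function without awaiting it, which only creates a coroutine object and is exactly what produces the "was never awaited" warning. A minimal self-contained sketch of the fix (using a stand-in for the real `aload_data`, since the real call needs a configured LlamaParse client), driving the coroutine from synchronous code with `asyncio.run`:

```python
import asyncio

# Stand-in for parser.aload_data so this sketch runs on its own;
# the real call would be `await parser.aload_data(file_path=...)`.
async def aload_data(file_path: str):
    await asyncio.sleep(0)  # placeholder for the actual upload/parse call
    return [f"parsed:{file_path}"]

async def start_parser(file_path: str):
    return await aload_data(file_path)

# start_parser(...) by itself only creates a coroutine object -- that is
# what triggers the "was never awaited" warning. From synchronous code,
# hand the coroutine to an event loop instead:
docs = asyncio.run(start_parser("report.pdf"))
```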

@AyushParikh
Author

@logan-markewich

@AyushParikh
Author

OK, so this works, but the issue is that my Lambda endpoint returns right away, before the job even starts. Any suggestions?

@logan-markewich
Contributor

Sorry, I misunderstood your question. Async is probably not what you want.

You'll probably want to use the raw API: create a job, return the job ID, then use the job ID to check whether the status is done and retrieve the result.
https://docs.cloud.llamaindex.ai/category/API/parsing
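A sketch of that create-then-poll flow with the `requests` library. The endpoint paths follow the LlamaCloud parsing API docs linked above, but the response field names (`id`, `status`, `markdown`) and the `SUCCESS` status value are assumptions to verify against the current API reference:

```python
import os
import time

import requests  # third-party HTTP client, assumed available

BASE = "https://api.cloud.llamaindex.ai/api/parsing"
HEADERS = {"Authorization": f"Bearer {os.environ.get('LLAMA_CLOUD_API_KEY', '')}"}

def create_job(path: str) -> str:
    """Upload the file and return the job ID without waiting for parsing."""
    with open(path, "rb") as f:
        resp = requests.post(f"{BASE}/upload", headers=HEADERS,
                             files={"file": f})
    resp.raise_for_status()
    return resp.json()["id"]  # assumed field name; check the API reference

def fetch_result(job_id: str, interval: float = 2.0) -> str:
    """Poll the job status, then fetch the markdown result once done."""
    while True:
        status = requests.get(f"{BASE}/job/{job_id}", headers=HEADERS).json()
        if status.get("status") == "SUCCESS":  # assumed status value
            break
        time.sleep(interval)
    result = requests.get(f"{BASE}/job/{job_id}/result/markdown",
                          headers=HEADERS)
    result.raise_for_status()
    return result.json()["markdown"]
```

In a Lambda setup, the upload handler would call `create_job` and return the ID immediately; a separate invocation (or the webhook) handles completion.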

@AyushParikh
Author

@logan-markewich I see, but in the docs I don't see how I can provide the webhook URL in the API call:

curl -X 'POST' \
  'https://api.cloud.llamaindex.ai/api/parsing/upload'  \
  -H 'accept: application/json' \
  -H 'Content-Type: multipart/form-data' \
  -H "Authorization: Bearer $LLAMA_CLOUD_API_KEY" \
  -F 'file=@/path/to/your/file.pdf;type=application/pdf'

@AyushParikh
Author

https://docs.cloud.llamaindex.ai/llamaparse/features/webhook @logan-markewich these are the webhook docs for LlamaCloud, but the API example doesn't show how to pass the webhook URL.

@AyushParikh
Author

OK, it works. You have to pass

        data = {
            'webhook_url': webhook_url
        }

in the payload. Maybe the docs should be updated.
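For completeness, a hedged sketch of that upload in Python with `requests`: the `webhook_url` goes in the multipart form body (`data=`) alongside the file, the equivalent of adding `-F 'webhook_url=...'` to the earlier curl command. The webhook URL here is a hypothetical placeholder:

```python
import os

import requests  # third-party HTTP client, assumed available

def upload_with_webhook(path: str, webhook_url: str) -> dict:
    """Upload a PDF for parsing, asking LlamaCloud to call back when done."""
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.cloud.llamaindex.ai/api/parsing/upload",
            headers={"Authorization":
                     f"Bearer {os.environ.get('LLAMA_CLOUD_API_KEY', '')}"},
            files={"file": (os.path.basename(path), f, "application/pdf")},
            # Form field confirmed in this thread; sent like curl -F
            data={"webhook_url": webhook_url},
        )
    resp.raise_for_status()
    return resp.json()

# Usage (hypothetical endpoint):
# job = upload_with_webhook("report.pdf", "https://example.com/hooks/llamaparse")
```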
