How to send a llama_parse job in python async? #513
Comments
When I do this, I get `'parser.aload_data(...)' was never awaited`.
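That warning means the coroutine was created but never scheduled on an event loop. A minimal sketch of the wrong and right patterns, using a stand-in coroutine in place of the real `parser.aload_data` (which needs an API key):

```python
import asyncio

# Stand-in for LlamaParse.aload_data; the real call hits the cloud API.
async def aload_data(path):
    return [f"parsed:{path}"]

# Wrong: calling the coroutine function only creates a coroutine object.
# Nothing runs, and Python warns "coroutine 'aload_data' was never awaited":
# documents = aload_data("file.pdf")

# Right: await it inside a coroutine driven by an event loop.
async def main():
    return await aload_data("file.pdf")

documents = asyncio.run(main())
```

From synchronous code (such as a Lambda handler's top level), `asyncio.run(...)` is the usual way to drive the coroutine to completion.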
I also tried:

OK, so this works, but the issue is that my Lambda endpoint returns right away, before the job even starts. Any suggestions?
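This is the expected behavior of fire-and-forget tasks in a short-lived handler: once the handler returns and the event loop shuts down, any still-pending background task is cancelled. A small stand-in demonstration (no LlamaParse involved):

```python
import asyncio

async def long_job():
    # Stand-in for a slow parse job.
    await asyncio.sleep(1)
    return "done"

async def handler():
    # Fire-and-forget: the task is scheduled but never awaited here.
    asyncio.create_task(long_job())
    return "returned immediately"

result = asyncio.run(handler())
# asyncio.run() shuts the loop down as soon as handler() finishes,
# cancelling the pending long_job task -- the work is lost.
```

This is why a job-ID-plus-polling (or webhook) design, rather than an in-process background task, is the right fit for a Lambda.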
Sorry, I misunderstood your question. Async is probably not what you want; you'll probably want to use the raw API: create a job, return the job ID, then use the job ID to check whether the status is done and retrieve the result.
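The create/poll pattern described above can be sketched against the LlamaParse REST API. Note the base path, endpoint names (`/upload`, `/job/{id}`), and response fields (`id`, `status`) below are assumptions drawn from the public cloud docs, so verify them before relying on this:

```python
import requests

# Assumed base path for the LlamaParse cloud API.
BASE = "https://api.cloud.llamaindex.ai/api/parsing"


def create_job(file_path: str, api_key: str) -> str:
    """Upload a file and return the job ID without waiting for parsing."""
    with open(file_path, "rb") as f:
        resp = requests.post(
            f"{BASE}/upload",
            headers={"Authorization": f"Bearer {api_key}"},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()["id"]  # assumed field name


def job_status_url(job_id: str) -> str:
    return f"{BASE}/job/{job_id}"


def is_done(job_id: str, api_key: str) -> bool:
    """Poll the job once; True when the assumed 'status' field is SUCCESS."""
    resp = requests.get(
        job_status_url(job_id),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    resp.raise_for_status()
    return resp.json().get("status") == "SUCCESS"
```

In a Lambda setup, one endpoint would call `create_job` and return the ID immediately, and a second endpoint (or the client) would call `is_done` until the result is ready.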
@logan-markewich I see, but in the docs I don't see how I can provide a webhook URL in the API call.

https://docs.cloud.llamaindex.ai/llamaparse/features/webhook @logan-markewich these are the webhook docs for LlamaCloud, but the API example doesn't show how to pass the webhook URL.
OK, it works. You have to pass in

in the payload. Maybe the docs should be updated.
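The exact field name is not stated above, so as a hedged illustration only (the name `webhook_url` is a guess to be checked against the linked webhook docs), the extra form data sent with the upload might look like:

```python
def build_upload_payload(webhook: str) -> dict:
    # Extra form fields sent alongside the file in the upload request.
    # "webhook_url" is a hypothetical field name -- confirm it in the
    # LlamaCloud webhook documentation before using.
    return {"webhook_url": webhook}


payload = build_upload_payload("https://example.com/llamaparse-callback")
```

With a webhook configured, the service calls your endpoint when the job finishes, so no polling loop is needed.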
Hi, I want to start a job in Python but not wait for the response. How can I achieve this?