Problem Description
Our pipeline definition (see below) runs into a RecursionError: maximum recursion depth exceeded exception. We're using QdrantDocumentStore and QdrantEmbeddingRetriever from qdrant-haystack, which seem to be causing the error, since running the pipeline with their in-memory counterparts instead succeeds.
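For reference, the retrieval part of the pipeline is assembled roughly as sketched below before being serialized (a sketch only; parameters mirror the pipeline.yml further down, and the USE_QDRANT flag is just for illustration). The working in-memory variant simply swaps the two Qdrant components for Haystack's in-memory counterparts:

from haystack import Pipeline
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.retrievers.qdrant import QdrantEmbeddingRetriever
from haystack_integrations.document_stores.qdrant import QdrantDocumentStore

USE_QDRANT = True  # flipping this to False makes `hayhooks run` succeed

if USE_QDRANT:
    document_store = QdrantDocumentStore(
        url="http://localhost:6333",
        index="Document",
        embedding_dim=768,
        return_embedding=False,
    )
    retriever = QdrantEmbeddingRetriever(document_store=document_store, top_k=3)
else:
    document_store = InMemoryDocumentStore()
    retriever = InMemoryEmbeddingRetriever(document_store=document_store, top_k=3)

pipeline = Pipeline()
pipeline.add_component("retriever", retriever)
# ... embedder, prompt builders, LLMs and connections as in pipeline.yml ...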
Expected Behaviour
The pipeline should run without throwing an exception, correctly handling the Qdrant integration types.
Observed Behavior
$ hayhooks run --pipelines-dir ./pipelines
Stacktrace
INFO: Pipelines dir set to: ./pipelines/retrieval/
File ".venv/bin/hayhooks", line 8, in<module>sys.exit(hayhooks())
^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/cli/run/__init__.py", line 20, in run
uvicorn.run("hayhooks.server:app", host=host, port=port)
File ".venv/lib/python3.12/site-packages/uvicorn/main.py", line 579, in run
server.run()
File ".venv/lib/python3.12/site-packages/uvicorn/server.py", line 65, in run
return asyncio.run(self.serve(sockets=sockets))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/uvicorn/server.py", line 69, in serve
await self._serve(sockets)
File ".venv/lib/python3.12/site-packages/uvicorn/server.py", line 76, in _serve
config.load()
File ".venv/lib/python3.12/site-packages/uvicorn/config.py", line 434, in load
self.loaded_app = import_from_string(self.app)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/uvicorn/importer.py", line 19, in import_from_string
module = importlib.import_module(module_str)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 995, in exec_module
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File ".venv/lib/python3.12/site-packages/hayhooks/server/__init__.py", line 1, in<module>
from hayhooks.server.app import app
File ".venv/lib/python3.12/site-packages/hayhooks/server/app.py", line 32, in<module>
app = create_app()
^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/app.py", line 27, in create_app
deployed_pipeline = deploy_pipeline_def(app, pipeline_defintion)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/deploy_utils.py", line 20, in deploy_pipeline_def
PipelineRunRequest = get_request_model(pipeline_def.name, pipe.inputs())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/pipelines/models.py", line 29, in get_request_model
input_type = handle_unsupported_types(typedef["type"], {DataFrame: dict})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 65, in handle_unsupported_types
return handle_generics(type_)
^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 44, in handle_generics
result = handle_unsupported_types(t, types_mapping)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 61, in handle_unsupported_types
new_type[arg_name] = handle_generics(arg_type)
^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 44, in handle_generics
result = handle_unsupported_types(t, types_mapping)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 61, in handle_unsupported_types
new_type[arg_name] = handle_generics(arg_type)
^^^^^^^^^^^^^^^^^^^^^^^^^
# ... repeated frames truncated ...
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 44, in handle_generics
result = handle_unsupported_types(t, types_mapping)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.12/site-packages/hayhooks/server/utils/create_valid_type.py", line 59, in handle_unsupported_types
for arg_name, arg_type in get_type_hints(type_).items():
^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 2244, in get_type_hints
value = _eval_type(value, base_globals, base_locals)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 414, in _eval_type
return t._evaluate(globalns, localns, recursive_guard)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 929, in _evaluate
self.__forward_value__ = _eval_type(
^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in _eval_type
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in <genexpr>
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in _eval_type
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in <genexpr>
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in _eval_type
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/typing.py", line 428, in <genexpr>
ev_args = tuple(_eval_type(a, globalns, localns, recursive_guard) for a in t.__args__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RecursionError: maximum recursion depth exceeded
pipeline.yml
components:
  embedder:
    init_parameters:
      model: null
      prefix: ''
      suffix: ''
      token:
        env_vars:
        - HF_API_TOKEN
        strict: false
        type: env_var
      url: http://localhost:8080
    type: agrichat.ingestion.components.embedders.HuggingFaceTEITextEmbedder
  list_to_str_adapter:
    init_parameters:
      custom_filters: {}
      output_type: str
      template: '{{ replies[0] }}'
      unsafe: false
    type: haystack.components.converters.output_adapter.OutputAdapter
  llm:
    init_parameters:
      api_base_url: http://localhost:8000/v1
      api_key:
        env_vars:
        - OPENAI_API_KEY
        strict: true
        type: env_var
      generation_kwargs: {}
      model: mistralai/Mistral-Nemo-Instruct-2407
      organization: null
      streaming_callback: null
    type: haystack.components.generators.chat.openai.OpenAIChatGenerator
  memory_joiner:
    init_parameters:
      type_: list[haystack.dataclasses.chat_message.ChatMessage]
    type: haystack.components.joiners.branch.BranchJoiner
  memory_retriever:
    init_parameters:
      last_k: 10
      message_store:
        init_parameters: {}
        type: haystack_experimental.chat_message_stores.in_memory.InMemoryChatMessageStore
    type: haystack_experimental.components.retrievers.chat_message_retriever.ChatMessageRetriever
  memory_writer:
    init_parameters:
      message_store:
        init_parameters: {}
        type: haystack_experimental.chat_message_stores.in_memory.InMemoryChatMessageStore
    type: haystack_experimental.components.writers.chat_message_writer.ChatMessageWriter
  prompt_builder:
    init_parameters:
      required_variables: &id001 !!python/tuple
      - query
      - documents
      - memories
      template: null
      variables: *id001
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
  query_rephrase_llm:
    init_parameters:
      api_base_url: http://localhost:8000/v1
      api_key:
        env_vars:
        - OPENAI_API_KEY
        strict: true
        type: env_var
      generation_kwargs: {}
      model: mistralai/Mistral-Nemo-Instruct-2407
      organization: null
      streaming_callback: null
      system_prompt: null
    type: haystack.components.generators.openai.OpenAIGenerator
  query_rephrase_prompt_builder:
    init_parameters:
      required_variables: null
      template: "\nRewrite the question for semantic search while keeping its meaning\
        \ and key terms intact.\nIf the conversation history is empty, DO NOT change\
        \ the query.\nDo not translate the question.\nUse conversation history only\
        \ if necessary, and avoid extending the query with your own knowledge.\nIf\
        \ no changes are needed, output the current question as is.\n\nConversation\
        \ history:\n{% for memory in memories %}\n {{ memory.content }}\n{% endfor\
        \ %}\n\nUser Query: {{query}}\nRewritten Query:\n"
      variables: null
    type: haystack.components.builders.prompt_builder.PromptBuilder
  retriever:
    init_parameters:
      document_store:
        init_parameters:
          api_key: null
          embedding_dim: 768
          force_disable_check_same_thread: false
          grpc_port: 6334
          hnsw_config: null
          host: null
          https: null
          index: Document
          init_from: null
          location: null
          metadata: {}
          on_disk: false
          on_disk_payload: null
          optimizers_config: null
          path: null
          payload_fields_to_index: null
          port: 6333
          prefer_grpc: false
          prefix: null
          progress_bar: false
          quantization_config: null
          recreate_index: false
          replication_factor: null
          return_embedding: false
          scroll_size: 10000
          shard_number: null
          similarity: cosine
          sparse_idf: false
          timeout: null
          url: http://localhost:6333
          use_sparse_embeddings: false
          wait_result_from_api: true
          wal_config: null
          write_batch_size: 100
          write_consistency_factor: null
        type: haystack_integrations.document_stores.qdrant.document_store.QdrantDocumentStore
      filter_policy: replace
      filters: null
      group_by: null
      group_size: null
      return_embedding: false
      scale_score: false
      score_threshold: null
      top_k: 3
    type: haystack_integrations.components.retrievers.qdrant.retriever.QdrantEmbeddingRetriever
connections:
- receiver: query_rephrase_llm.prompt
  sender: query_rephrase_prompt_builder.prompt
- receiver: list_to_str_adapter.replies
  sender: query_rephrase_llm.replies
- receiver: embedder.text
  sender: list_to_str_adapter.output
- receiver: retriever.query_embedding
  sender: embedder.embedding
- receiver: prompt_builder.documents
  sender: retriever.documents
- receiver: llm.messages
  sender: prompt_builder.prompt
- receiver: memory_joiner.value
  sender: llm.replies
- receiver: query_rephrase_prompt_builder.memories
  sender: memory_retriever.messages
- receiver: prompt_builder.memories
  sender: memory_retriever.messages
- receiver: memory_writer.messages
  sender: memory_joiner.value
max_runs_per_component: 100
metadata: {}
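The YAML above is what hayhooks loads at deploy time. The input discovery it then performs can be reproduced without starting the server; the following is a sketch only (the file path is an example, the custom agrichat embedder must be importable in the same environment, and Pipeline.loads/Pipeline.inputs are the standard Haystack APIs):

from pathlib import Path

from haystack import Pipeline

# Load the serialized pipeline and list the open (unconnected) inputs per
# component -- this is the data hayhooks turns into Pydantic request models.
pipe = Pipeline.loads(Path("pipelines/retrieval/pipeline.yml").read_text())
for component, sockets in pipe.inputs().items():
    for socket_name, socket in sockets.items():
        # For us the Qdrant retriever exposes a `filters` input whose type
        # involves qdrant_client.http.models.models.Filter.
        print(component, socket_name, socket["type"])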
Hypothesis
The recursion happens when handle_unsupported_types processes nested or generic types like qdrant_client.http.models.models.Filter.
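This fits the shape of qdrant_client's models: Filter's must/should/must_not fields are typed in terms of Condition, and Condition is a Union that includes Filter again, so a recursive type walker without a cycle guard never terminates. A simplified stand-in for such a walker (not the hayhooks implementation) already reproduces the error:

from typing import get_args, get_type_hints

from qdrant_client.http import models


def walk(type_):
    # Recurse into generic args and pydantic model fields the way a naive
    # type handler would; there is no guard against revisiting a type.
    args = get_args(type_)
    if args:
        for arg in args:
            walk(arg)
    elif hasattr(type_, "__fields__"):  # pydantic model
        for hint in get_type_hints(type_).values():
            walk(hint)


walk(models.Filter)  # RecursionError: maximum recursion depth exceeded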
To see which arguments actually trigger the recursion inside hayhooks, I've monkey-patched a print statement into handle_unsupported_types:
from types import GenericAlias
from typing import Dict, Union


def handle_unsupported_types(
    type_: type, types_mapping: Dict[type, type], skip_callables: bool = True
) -> Union[GenericAlias, type, None]:
    """Recursively handle types that are not supported by Pydantic by replacing them with the given types mapping."""
    print(type_, types_mapping)  # temporary debug output
    ...
which repeatedly prints the following output before the exception is eventually raised:
Console Prints
It seems like handle_unsupported_types doesn't terminate for certain nested generic types. Manually increasing the recursion depth might work around this (a rough sketch of that workaround is at the end of this issue).
requirements.txt
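For completeness, the blunt workaround mentioned above would look something like the sketch below (untested; a sitecustomize.py on the server's PYTHONPATH is just one way to run code before hayhooks imports anything, and if the type graph is truly cyclic this only postpones the error):

# sitecustomize.py -- picked up automatically at interpreter startup when it
# is on the PYTHONPATH of the environment running `hayhooks run`.
import sys

# Default limit is usually 1000; raise it before hayhooks builds the request
# models. Note: for a genuinely self-referential type such as
# qdrant_client.http.models.models.Filter this can only delay the
# RecursionError, so a cycle guard in handle_unsupported_types would be the
# proper fix.
sys.setrecursionlimit(10_000)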