@GregoireW, which vector store are you using? To fix this issue, you need to provide a custom BatchingStrategy bean in your application. The default TokenCountBatchingStrategy implementation uses the default context-window size of OpenAI's embedding models, 8191 tokens. You need to adjust the max token size when using a different embedding model. Here is an example of overriding this bean:
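A minimal sketch of such an override might look like the following. Note the concrete numbers are assumptions: `20_000` is a placeholder for whatever per-request token limit your embedding model actually has, and `CL100K_BASE` is only a tokenizer used to *estimate* counts, not necessarily the one your model uses:

```java
import com.knuddels.jtokkit.api.EncodingType;
import org.springframework.ai.embedding.BatchingStrategy;
import org.springframework.ai.embedding.TokenCountBatchingStrategy;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EmbeddingConfig {

	@Bean
	public BatchingStrategy batchingStrategy() {
		// Override the default (8191-token) limit with one that matches
		// your embedding model. 20_000 here is a placeholder value.
		return new TokenCountBatchingStrategy(
				EncodingType.CL100K_BASE, // tokenizer used to estimate token counts
				20_000,                   // max input tokens per batch (model-dependent)
				0.1);                     // reserve percentage kept as headroom
	}

}
```

The vector store picks this bean up in place of the default strategy, so documents are batched against the limit you configure rather than the OpenAI default.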
Bug description
I am trying to use Vertex AI embedding and tested it with a big document. I set auto-truncate to true.
This corresponds to this option:
spring-ai/models/spring-ai-vertex-ai-embedding/src/main/java/org/springframework/ai/vertexai/embedding/text/VertexAiTextEmbeddingOptions.java
Line 67 in be0f9fb
But I got an exception from TokenCountBatchingStrategy:
spring-ai/spring-ai-core/src/main/java/org/springframework/ai/embedding/TokenCountBatchingStrategy.java
Line 147 in be0f9fb
What should I do in this situation?
Environment
Spring AI 1.0.0-M4 / jdk21
Steps to reproduce
Use vertex embedding with the "auto truncate" option, and test with a large payload.
Expected behavior
Success, or at least documentation explaining how to make it work.
Minimal Complete Reproducible example
var document = new Document("go ".repeat(50000));
vectorStore.add(List.of(document));