agents.application.prompts.py: prompts_polymarket exceeds the model's maximum context length #24
Labels: bug
Describe the bug
python -m scripts.python.cli ask-polymarket-llm election
This can produce:

BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 16385 tokens. However, your messages resulted in 96817 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
Steps to reproduce
Running the command above with the default gpt-3.5-turbo model results in the error.
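As a quick diagnostic, the prompt size can be checked before the request is sent. This is a minimal sketch assuming the tiktoken package; the count_tokens helper is illustrative and not part of the repo:

    import tiktoken

    def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
        # Count how many tokens the text consumes for the given model.
        encoding = tiktoken.encoding_for_model(model)
        return len(encoding.encode(text))

Applied to the market data passed in as data1, this should report roughly the 96817 tokens from the error above, far past the 16385-token window.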
Expected behavior
Modifying agents/application/executor.py to use gpt-4-1106-preview is a possible workaround, but I would like to split data1 in prompts_polymarket in agents.application.prompts.py, which should resolve the issue properly:

    def prompts_polymarket(
        self, data1: str, data2: str, market_question: str, outcome: str
    ) -> str:
        current_market_data = str(data1)
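A minimal sketch of that split, assuming a tiktoken-based trim; the 12000-token budget and the truncate_to_token_budget helper are assumptions for illustration, not existing code:

    import tiktoken

    def truncate_to_token_budget(
        text: str, budget: int = 12000, model: str = "gpt-3.5-turbo"
    ) -> str:
        # Hypothetical helper: keep only as many tokens as fit the budget.
        encoding = tiktoken.encoding_for_model(model)
        tokens = encoding.encode(text)
        if len(tokens) <= budget:
            return text
        # Decode the kept prefix back to valid text and drop the tail.
        return encoding.decode(tokens[:budget])

    def prompts_polymarket(
        self, data1: str, data2: str, market_question: str, outcome: str
    ) -> str:
        # Trim the market data so the final prompt stays inside
        # gpt-3.5-turbo's 16385-token context window.
        current_market_data = truncate_to_token_budget(str(data1))
        ...

Chunking data1 and summarizing each chunk would preserve more information, but a hard trim like this is enough to avoid the 400 error with the default model.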
Environment
Additional context