Fix BadRequestError in LiteLLM
Error · 35 reports
✅ Solution
`BadRequestError` in LiteLLM usually indicates a problem with the format or content of your API request: incorrect parameters, malformed input data (such as an unterminated string), or arguments that the target model does not support. Carefully review your request, ensure every parameter (such as `stop`, `stop_sequences`, or the input strings themselves) is correctly formatted and supported by the model you are calling, and consult that model's API documentation to confirm compatibility. Also validate your input data for errors such as unterminated strings or characters that may break serialization.
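As a minimal sketch of this kind of defensive handling: the snippet below filters out parameters the provider does not advertise support for, then catches `litellm.BadRequestError` explicitly. The model name, prompt, and stop sequence are placeholder values, and the exact filtering approach is an assumption about how you might guard a call, not the library's prescribed pattern.

```python
import litellm

def safe_completion(model: str, prompt: str, **params):
    """Call litellm.completion, forwarding only parameters the model supports.

    Hypothetical helper for illustration; model and params are placeholders.
    """
    # Ask LiteLLM which OpenAI-style params this model/provider accepts,
    # and drop anything else before sending the request.
    supported = litellm.get_supported_openai_params(model=model)
    if supported is not None:
        params = {k: v for k, v in params.items() if k in supported}

    try:
        return litellm.completion(
            model=model,
            messages=[{"role": "user", "content": prompt}],
            **params,
        )
    except litellm.BadRequestError as exc:
        # Surfaces malformed input (e.g. an unterminated string) or an
        # unsupported argument rejected by the provider.
        print(f"BadRequestError from {model}: {exc}")
        raise

response = safe_completion(
    "claude-3-5-sonnet-20240620",   # placeholder model name
    "Summarize the LiteLLM docs in one sentence.",
    stop=["\n\n"],  # forwarded only if the model supports a `stop` parameter
)
```

LiteLLM also exposes a global `litellm.drop_params = True` setting that silently drops unsupported parameters instead of raising; whether you want that or an explicit error depends on how strictly you need requests validated.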
Related Issues
Real GitHub issues where developers encountered this error:
[Bug]: Opus Thinking Dropped Unpredictably (Jan 11, 2026)
[Bug]: Limit the "stop" and "stop_sequences" arguments based on model (Jan 11, 2026)
[Bug]: litellm.BadRequestError: AnthropicException - Unterminated string starting at: (Jan 11, 2026)
[Bug]: #15988 changes clobber OpenAI passthrough responses route (Jan 9, 2026)
[Bug]: Streaming requests with invalid parameters return SSE format instead of JSON error response (Jan 7, 2026)
Timeline
First reported: Dec 15, 2025
Last reported: Jan 11, 2026