
Fix BadRequestError in MetaGPT

Solution

A `BadRequestError` often arises when `max_tokens` is passed to a model that requires `max_completion_tokens` or does not support `max_tokens` at all. To fix it, replace `max_tokens` with `max_completion_tokens` in your OpenAI API call if the model supports that parameter. If neither parameter works, consult the model's documentation for the accepted parameters, or remove the token limit entirely and rely on the model's default.
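One way to apply this fix is to normalize the request parameters before calling the API. The sketch below is a minimal illustration, not MetaGPT's actual code: the helper name `normalize_token_params` and the model-prefix check are assumptions; verify against your provider's documentation which models reject `max_tokens`.

```python
def normalize_token_params(params: dict, model: str) -> dict:
    """Return a copy of `params` using the token-limit key the model expects."""
    # Assumption: newer reasoning-style models (e.g. "o1", "o3" prefixes)
    # reject `max_tokens` and require `max_completion_tokens` instead.
    # Adjust this predicate to match your provider's documentation.
    requires_completion_tokens = model.startswith(("o1", "o3"))
    fixed = dict(params)
    if requires_completion_tokens and "max_tokens" in fixed:
        # Rename the key; the numeric limit itself is unchanged.
        fixed["max_completion_tokens"] = fixed.pop("max_tokens")
    return fixed

# Example: the same call site can serve both model families.
request = normalize_token_params({"temperature": 0.2, "max_tokens": 512}, "o1-mini")
# `request` now carries `max_completion_tokens` rather than `max_tokens`,
# so it can be splatted into the chat-completions call without a BadRequestError.
```

Centralizing the rename in one helper keeps the rest of the calling code model-agnostic, which is useful when a config file decides which model MetaGPT talks to.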

Timeline

First reported: Feb 21, 2025
Last reported: Sep 3, 2025

Need More Help?

View the full changelog and migration guides for MetaGPT
