
Fix IncompleteOutputException in Instructor

Solution

IncompleteOutputException means the LLM response was cut off before the full structured output could be produced, most often because the completion hit its `max_tokens` limit or a stop sequence fired early. Increase `max_tokens` on the chat-completion request, tighten your prompt or response model so the output fits within the token budget, or relax any custom stop sequences. It also helps to wrap the call in a retry that raises the token budget when truncation occurs, so the generation can be re-attempted automatically.
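The retry idea above can be sketched as a small wrapper that doubles the token budget on each truncation. This is a minimal, self-contained illustration: the exception class here is a local stand-in for `instructor.exceptions.IncompleteOutputException`, and `fake_generate` is a hypothetical placeholder for your real Instructor-backed call, so the names and thresholds are assumptions, not Instructor's API.

```python
class IncompleteOutputException(Exception):
    """Local stand-in for instructor.exceptions.IncompleteOutputException."""


def call_with_growing_budget(generate, max_tokens=256, attempts=3, factor=2):
    """Call `generate`, multiplying max_tokens by `factor` each time the
    output comes back truncated. Re-raises the last exception if every
    attempt is truncated."""
    last_exc = None
    for _ in range(attempts):
        try:
            return generate(max_tokens=max_tokens)
        except IncompleteOutputException as exc:
            last_exc = exc
            max_tokens *= factor
    raise last_exc


# Hypothetical generator that "truncates" until the budget reaches 1024 tokens,
# standing in for a real client.chat.completions.create(...) call.
def fake_generate(max_tokens):
    if max_tokens < 1024:
        raise IncompleteOutputException("finish_reason=length")
    return {"ok": True, "budget": max_tokens}


result = call_with_growing_budget(fake_generate, max_tokens=256)
print(result)  # budget grew 256 -> 512 -> 1024
```

In a real application, `generate` would close over your Instructor client and response model, and you would cap the budget growth so a persistently failing prompt does not escalate costs indefinitely.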


Timeline

First reported: Dec 12, 2025
Last reported: Dec 12, 2025

Need More Help?

View the full changelog and migration guides for Instructor

View Instructor Changelog