Maximum context window length per model #394

Open
bubenkoff opened this issue May 11, 2025 · 1 comment
Labels
documentation Improvements or additions to documentation

Comments

@bubenkoff

Please provide constants for the maximum context length per model.
OpenAI used to publish this in table format, but no longer does: https://platform.openai.com/docs/models.
Keeping track of new models and their limits is therefore hard, especially since there is no API for it either.
A single source of truth would be very helpful.
Thank you!
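As a sketch of what such a single source of truth could look like, one option is a plain mapping from model name to context window size, with a lookup helper. The model names and token limits below are illustrative assumptions for a few OpenAI models, not authoritative values maintained by the library:

```python
# Hypothetical sketch: one mapping from model name to maximum context
# window length, in tokens. The entries are illustrative assumptions,
# not values published or guaranteed by the SDK.
MAX_CONTEXT_WINDOW = {
    "gpt-4o": 128_000,
    "gpt-4o-mini": 128_000,
    "gpt-3.5-turbo": 16_385,
}


def max_context_window(model: str) -> int:
    """Return the assumed maximum context window for a model.

    Raises KeyError for unknown models so callers notice missing entries
    instead of silently using a wrong limit.
    """
    return MAX_CONTEXT_WINDOW[model]
```

For example, `max_context_window("gpt-4o")` would return `128_000` under these assumed entries; a mapping like this would of course need to be updated as new models ship, which is exactly why having it maintained in one place would help.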

@jacobzim-stl jacobzim-stl added the documentation Improvements or additions to documentation label May 19, 2025
@jacobzim-stl
Collaborator

Thanks for the suggestion!

2 participants