Transformers version bump #2029
Conversation
Update: unpinning pydantic allows more flexibility in how DeepSpeed is tested, since DeepSpeed uses some aliases to keep deprecated pydantic features usable, see: deepspeedai/DeepSpeed#4407. Hugging Face also unpinned pydantic. Now the test stumbles on a mismatch for Gemma, as well as some OpenAI client tests.
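For context, the aliasing in the linked DeepSpeed PR refers to pydantic 2.x's built-in `pydantic.v1` compatibility namespace, which lets code written against the v1 API keep running. A minimal sketch of that pattern (the `RuntimeConfig` model is hypothetical, just for illustration):

```python
# Sketch of the v1-compatibility aliasing pattern; assumes pydantic >= 2.
try:
    # pydantic 2.x ships the old API under the pydantic.v1 namespace
    from pydantic.v1 import BaseModel, Field, validator
except ImportError:
    # on an actual pydantic 1.x install, the names live at the top level
    from pydantic import BaseModel, Field, validator


class RuntimeConfig(BaseModel):
    batch_size: int = Field(default=8, gt=0)

    # `validator` is the deprecated v1-style decorator; in pydantic 2
    # it only exists via the pydantic.v1 alias imported above
    @validator("batch_size")
    def check_batch_size(cls, value):
        if value > 4096:
            raise ValueError("batch_size is unreasonably large")
        return value
```

This is why unpinning matters: the same code path resolves correctly whether the environment ends up with pydantic 1.x or 2.x.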
I would appreciate a look at these; perhaps @k223kim for Gemma and @Borda for the OpenAI API tests.
I think this needs to be fixed:
Yes, however this comes from Hugging Face code that calls DeepSpeed. My understanding is that they allow this so that old code can run with an older version of pydantic.
cc: @t-vi
cc: @aniketmaurya
Does this pass now?
It seems like the issue might be related to the pydantic version. Under the GPU test, it somehow appears to fall back to a much older version of pydantic. cc: @aniketmaurya @Borda
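One quick way to confirm which pydantic version the GPU job actually resolves (a hypothetical debugging snippet, not part of this PR):

```python
import pydantic

# VERSION is exposed by both pydantic 1.x and 2.x,
# so this works regardless of which one got installed
print("pydantic version:", pydantic.VERSION)
```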
We made some major updates on master regarding the dev environment, so let's update this branch.
Transformers version bump for more recent model support
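As a sketch, a bump like this usually just needs a guard that the installed transformers meets the new lower bound (the `4.38` minimum below is illustrative, not the version pinned by this PR):

```python
# Illustrative compatibility check; the real pin lives in the requirements file.
from packaging.version import Version

import transformers

assert Version(transformers.__version__) >= Version("4.38"), (
    "transformers is too old for the newer model configs"
)
```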