Hello,
For issue 2, I also did it like you, reassigning |
On page 201:

Issue 1: model.pos_emb.weight.shape is [1024, 768], where 1024 is the context length and 768 is the embedding dimension. So should it be shape[0] instead?
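As a quick standalone check (a sketch with the same sizes as the GPT-2 124M configuration, not the book's model code), the positional-embedding table is an nn.Embedding(context_length, emb_dim), so index 0 of its weight shape is the context length:

```python
import torch

# 1024 positions (context length) x 768 embedding dimensions
pos_emb = torch.nn.Embedding(1024, 768)

print(pos_emb.weight.shape)     # torch.Size([1024, 768])
print(pos_emb.weight.shape[0])  # 1024 -> supported context length
print(pos_emb.weight.shape[1])  # 768  -> embedding dimension
```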
Issue 2: if supported_context_length is smaller than max_length, it makes sense to truncate input_ids to supported_context_length. But in that case, what is the point of padding (expanding) up to max_length? That said, in the later code the call to classify_review uses a max_length of 120 (train_dataset.max_length), which is much smaller than supported_context_length.
So a better way to handle both cases would be something like the sketch below.
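Here is a minimal sketch of that idea (the helper name prepare_input_ids is made up, and only the truncation/padding part of classify_review is shown):

```python
import torch

def prepare_input_ids(text, tokenizer, model, max_length=None, pad_token_id=50256):
    """Hypothetical helper: only the truncation/padding step of classify_review."""
    input_ids = tokenizer.encode(text)

    # Issue 1: index 0 of pos_emb.weight.shape ([1024, 768]) is the context length.
    supported_context_length = model.pos_emb.weight.shape[0]

    # Issue 2: never pad or truncate beyond the supported context length,
    # and fall back to it when max_length is None.
    if max_length is None:
        max_length = supported_context_length
    max_length = min(max_length, supported_context_length)

    input_ids = input_ids[:max_length]                           # truncate if too long
    input_ids += [pad_token_id] * (max_length - len(input_ids))  # pad if too short

    return torch.tensor(input_ids).unsqueeze(0)                  # add batch dimension
```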
It also takes care of the case when max_length is None.