understanding of overfit_batches #11859
Unanswered
talhaanwarch asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
-
I am a bit confused about overfit_batches. When using overfit_batches, if the model's performance is less than 100%, does that mean something is wrong, since we are training and validating on the same batches? Is that correct?
In my case the performance is around 80%, and I am not sure whether that is acceptable with overfit_batches or whether it indicates a problem.
Do we need to track the training metric, the validation metric, or both when using overfit_batches?
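For reference, a minimal sketch of how overfit_batches is typically enabled on the Trainer (MyLightningModule and MyDataModule are placeholders, not code from the actual setup):

```python
import pytorch_lightning as pl

# MyLightningModule / MyDataModule are hypothetical stand-ins; the actual
# model and data are not shown in this post.
model = MyLightningModule()
dm = MyDataModule()

# overfit_batches tells the Trainer to reuse a small, fixed subset of the
# training data every epoch (an int is a number of batches, a float is a
# fraction of the data), so a healthy model/optimizer setup should be able
# to memorize those batches.
trainer = pl.Trainer(max_epochs=100, overfit_batches=10)
trainer.fit(model, datamodule=dm)
```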
-
There might be two reasons for your 80% accuracy:
Also, I would suggest you look at the training loss rather than the training accuracy.
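To make that concrete, here is a minimal, self-contained sketch (toy model and random data, not taken from the original setup) that logs the training loss while overfitting on two fixed batches; if the model and optimization are sound, train_loss should head toward zero:

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class TinyClassifier(pl.LightningModule):
    """Toy model, only to illustrate what to watch when using overfit_batches."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
        self.loss_fn = nn.CrossEntropyLoss()

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self.net(x), y)
        # This is the quantity to watch: with a few fixed batches it should
        # approach zero if the model and optimizer are set up correctly.
        self.log("train_loss", loss, prog_bar=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# Random toy data standing in for the real dataset.
x = torch.randn(256, 16)
y = torch.randint(0, 2, (256,))
train_loader = DataLoader(TensorDataset(x, y), batch_size=32)

# overfit_batches=2 -> train on the same 2 batches every epoch.
trainer = pl.Trainer(max_epochs=200, overfit_batches=2)
trainer.fit(TinyClassifier(), train_loader)
```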