Training loss significantly higher than validation loss, consistently #19603
Unanswered · dempsey-ryan asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule · 0 replies
I'm trying to figure out whether I'm using `validation_step` and `training_step` properly. It seems I am, but maybe the loss isn't accumulating correctly? I've checked for data leakage and there is none. From my LightningModule, I've shared the training and validation loops, a small portion of `__init__` for context on certain variables, and the data split.
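The original snippets from the post are not reproduced here; the following is only a minimal sketch of the kind of LightningModule being described, with a placeholder architecture, loss, and hyperparameters. It shows the usual way `training_step` and `validation_step` log per-batch losses so that Lightning accumulates them into epoch-level metrics.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl


class LitRegressor(pl.LightningModule):
    """Illustrative module; names, architecture, and loss are placeholders."""

    def __init__(self, in_features: int = 16, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.model = nn.Sequential(
            nn.Linear(in_features, 64), nn.ReLU(), nn.Linear(64, 1)
        )
        self.loss_fn = nn.MSELoss()

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self.model(x), y)
        # on_epoch=True makes Lightning average the per-batch values over the epoch
        self.log("train_loss", loss, on_step=False, on_epoch=True, prog_bar=True)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self.model(x), y)
        # validation metrics are accumulated and logged once per epoch by default
        self.log("val_loss", loss, prog_bar=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)
```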
Is it my data split? Perhaps `SubsetRandomSampler` and/or `Subset` don't work the way I'm expecting, or maybe the loops aren't written correctly. A sketch of the split pattern in question follows below.
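Again, this is not the original split code from the post, just a hedged sketch of the two common ways to build a train/validation split with `Subset` or `SubsetRandomSampler`; the dataset, shapes, and batch size are stand-ins.

```python
import torch
from torch.utils.data import DataLoader, Subset, SubsetRandomSampler, TensorDataset

# Toy dataset as a stand-in for the real data (shapes are placeholders).
dataset = TensorDataset(torch.randn(1000, 16), torch.randn(1000, 1))

# Shuffle indices once, then carve out disjoint train/val index sets.
indices = torch.randperm(len(dataset)).tolist()
split = int(0.8 * len(dataset))
train_idx, val_idx = indices[:split], indices[split:]

# Option A: Subset wraps the dataset; the DataLoader shuffles the subset itself.
train_loader = DataLoader(Subset(dataset, train_idx), batch_size=32, shuffle=True)
val_loader = DataLoader(Subset(dataset, val_idx), batch_size=32)

# Option B: keep the full dataset and let SubsetRandomSampler pick the indices.
# Note: combining a Subset with a sampler built from the same absolute indices
# re-indexes an already-remapped subset and silently selects the wrong examples.
train_loader_b = DataLoader(dataset, batch_size=32, sampler=SubsetRandomSampler(train_idx))
val_loader_b = DataLoader(dataset, batch_size=32, sampler=SubsetRandomSampler(val_idx))
```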