Log the same metric when using multiple dataloaders in validation_step() #18542
Unanswered
thangld201
asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
- It's been a while, but did you find a solution?
- Hi, I have a particular use case with multiple validation dataloaders in `validation_step()`. I have metrics that I want to aggregate across these dataloaders for checkpoint monitoring, but when I call `self.log('val/loss', ...)` in `validation_step()`, a suffix is added automatically: e.g. `'val/loss'` becomes `'val/loss/dataloader_idx_0'`, `'val/loss/dataloader_idx_1'`, and so on. This makes it impossible to reduce the metric across datasets. I tried setting `add_dataloader_idx=False`, but that raises another error. How can I do this in PyTorch Lightning? I'm currently using version 1.9.4.