Different train and test loss for same inputs as the course #427
Replies: 2 comments
-
Hi, I can't view your Colab notebook. Could you paste the model class, loss function, optimizer, and training/test loop cells?
-
Hi @aashiswar1989, Don't worry too much if your values aren't 100% the same as the course videos. As PyTorch changes over time, some values may change (even with the same random seed). What's most important is the trend of the values: they should be close, but they don't have to be exactly the same. If you run the notebook here, do you get the same values? Colab link: https://colab.research.google.com/github/mrdbourke/pytorch-deep-learning/blob/main/01_pytorch_workflow.ipynb
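To illustrate the point about seeds: a fixed seed reproduces the same numbers within one version of a library, but a different version may use a different generator and so produce different values from the same seed. A minimal sketch, using Python's standard-library RNG as a stand-in for `torch.manual_seed`:

```python
import random

# Seeding, then drawing: the sequence is fully determined by the seed
# (for a given implementation of the generator).
random.seed(42)
first_run = [random.random() for _ in range(3)]

# Reseed with the same value and draw again: identical numbers.
random.seed(42)
second_run = [random.random() for _ in range(3)]

print(first_run == second_run)  # True: reseeding reproduces the sequence
```

The same reproducibility guarantee holds for `torch.manual_seed(42)` on a fixed PyTorch version, but a release that changes an operator's initialization or RNG consumption can yield different (still internally consistent) values, which is why course videos and your notebook can diverge slightly.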
-
I am on the chapter PyTorch Workflow Fundamentals, where we build a linear regression model. The weight and bias values are generated randomly with a manual seed value of 42.
I replicated the code from the course, but while training the model I get different values.
I observe that the bias value initially increases towards the ideal value (0.3), but later in the training phase it starts decreasing again and stagnates after 150 epochs.
Could someone tell me why this is happening?
Here is the link to my colab notebook:
https://colab.research.google.com/drive/1FASg9BQNMLoA5Bt5V18REoSzpw74elsK#scrollTo=668wFEs1nzIU
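For reference, the setup described above can be sketched without PyTorch as plain gradient descent on a linear model. This is a minimal stand-in, assuming the course's ideal values (weight = 0.7, bias = 0.3); the data range, learning rate, and epoch count here are illustrative, not taken from the course notebook:

```python
import random

# Synthetic data following the "ideal" line y = 0.7*x + 0.3 (course values).
WEIGHT, BIAS = 0.7, 0.3
X = [i / 50 for i in range(50)]
y = [WEIGHT * x + BIAS for x in X]

random.seed(42)
w, b = random.random(), random.random()  # random initial parameters
lr = 0.5                                 # illustrative learning rate

for epoch in range(300):
    # Gradients of mean squared error L = mean((w*x + b - y)^2)
    grad_w = sum(2 * (w * xi + b - yi) * xi for xi, yi in zip(X, y)) / len(X)
    grad_b = sum(2 * (w * xi + b - yi) for xi, yi in zip(X, y)) / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # both converge toward 0.7 and 0.3
```

Note that because weight and bias are updated jointly, the bias can temporarily overshoot or move away from 0.3 while the weight is still adjusting; apparent stagnation usually means the loss gradient has become small, not that training is broken.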