Replies: 1 comment
-
The correct way to use the Sigmoid activation module is:
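A minimal sketch of that pattern (the reply's original snippet is not preserved here; the class name and layer sizes below are illustrative, not from the tutorial): instantiate `nn.Sigmoid()` once in `__init__`, then call it between layers in `forward`.

```python
import torch
from torch import nn

class TinyNet(nn.Module):
    """Illustrative two-layer net; layer sizes are made up for this example."""
    def __init__(self):
        super().__init__()  # note the double underscores on __init__
        self.layer_1 = nn.Linear(2, 5)
        self.layer_2 = nn.Linear(5, 1)
        self.sigmoid = nn.Sigmoid()  # create the activation module once here

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ...then apply it between the linear layers
        return self.layer_2(self.sigmoid(self.layer_1(x)))

model = TinyNet()
out = model(torch.randn(4, 2))
print(out.shape)  # torch.Size([4, 1])
```

Equivalently, you can skip the module and call `torch.sigmoid(x)` directly in `forward`; the module form is convenient when you want the activation to show up in `model.children()` or in an `nn.Sequential`.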
-
I tried using sigmoid instead of ReLU inside the model, since sigmoid is also a non-linear activation function. I kept everything else the same as in the tutorial, but my model is not learning anything. When I tried the same setup in the TensorFlow Playground, it appeared to work. Could someone explain this?
Code -
import torch
from torch import nn

class CircleModelV3(nn.Module):
    def __init__(self):  # double underscores were lost in the original post's formatting
        super().__init__()
        # layer definitions were missing from the post; sizes below assume
        # the tutorial's circle-classification setup (2 -> 10 -> 10 -> 1)
        self.layer_1 = nn.Linear(in_features=2, out_features=10)
        self.layer_2 = nn.Linear(in_features=10, out_features=10)
        self.layer_3 = nn.Linear(in_features=10, out_features=1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor):
        return self.layer_3(self.sigmoid(self.layer_2(self.sigmoid(self.layer_1(x)))))