01_PyTorch Workflow (Runtime Error: Expected all tensors to be on the same device) #929

Closed Answered by allen-ajith
allen-ajith asked this question in Q&A


Solved!

It was a typo in the forward() method when creating the model: it referenced the uppercase name X (a global tensor) instead of the method's parameter x.

def forward(self, x: torch.Tensor) -> torch.Tensor:
    return self.linear_layer(X)

The correction is:

def forward(self, x: torch.Tensor) -> torch.Tensor:
    return self.linear_layer(x)
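For context on why this typo produced a device error rather than a NameError: X presumably existed as a global tensor (e.g. the training data, still on the CPU), so Python silently resolved it, and the model, after being moved to the GPU, then mixed devices inside forward(). A minimal sketch of the corrected setup, using a hypothetical LinearRegressionModel that mirrors the one in the question:

```python
import torch
from torch import nn

class LinearRegressionModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear_layer = nn.Linear(in_features=1, out_features=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Use the argument `x` — writing `X` here would resolve to a
        # global tensor, which may live on a different device than the
        # model and trigger "Expected all tensors to be on the same device".
        return self.linear_layer(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = LinearRegressionModel().to(device)

X = torch.randn(8, 1)            # data starts on the CPU
out = model(X.to(device))        # move the input explicitly before calling
print(out.shape)                 # torch.Size([8, 1])
```

Because forward() now only touches its own argument, moving the input with .to(device) at the call site is enough to keep every tensor on the same device.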

Replies: 1 comment

Answer selected by allen-ajith