
code error #2

Open
Jacky-Android opened this issue Oct 15, 2023 · 5 comments

Comments

@Jacky-Android

unet line 181 should be written as:

if self.mode != 'ori':
    c2 = torch.cat([c2, self.ife1(c2)], dim=1)
    c3 = torch.cat([c3, self.ife2(c3)], dim=1)

@Jacky-Android
Author

Also, def histc_fork(self, ij) should not have self.

@fulinlin3855

In futures: List[torch.jit.Future[torch.Tensor]] = [], what is List?
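For context, `List` in that annotation is presumably the generic list type from Python's `typing` module, which the TorchScript `fork`/`wait` pattern uses to annotate the container of pending futures. A minimal sketch of that pattern (`parallel_sum` is an illustrative name, not from this repo):

```python
from typing import List  # the `List` in the annotation comes from here
import torch

def parallel_sum(xs: List[torch.Tensor]) -> torch.Tensor:
    # torch.jit.fork launches each call asynchronously and returns a Future;
    # the annotation records the type the Futures will resolve to
    futures: List[torch.jit.Future[torch.Tensor]] = []
    for x in xs:
        futures.append(torch.jit.fork(torch.sum, x))
    # torch.jit.wait blocks until each forked task finishes
    total = torch.zeros(1)
    for f in futures:
        total = total + torch.jit.wait(f)
    return total
```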

@fulinlin3855

unet line 181 should be written as: if self.mode != 'ori': c2 = torch.cat([c2, self.ife1(c2)], dim=1) c3 = torch.cat([c3, self.ife2(c3)], dim=1)

When doing the cat, the ratio causes the dimensions to be inconsistent. How should the concatenation be done?

@Jacky-Android
Author

unet line 181 should be written as: if self.mode != 'ori': c2 = torch.cat([c2, self.ife1(c2)], dim=1) c3 = torch.cat([c3, self.ife2(c3)], dim=1)

When doing the cat, the ratio causes the dimensions to be inconsistent. How should the concatenation be done?

I haven't encountered that. You need to ensure the input tensor dimensions are consistent: before calling torch.cat, make sure the tensors to be concatenated have the same shape in every dimension except the concatenation dimension. You can use functions such as torch.unsqueeze or torch.reshape to adjust the tensors until their shapes match.

Or use torch.stack: if you want to keep the tensors separate along a new axis, torch.stack creates a new dimension and stacks the inputs along it. Note that torch.stack requires all input tensors to have identical shapes.

Also mind the dim parameter of torch.cat: always specify dim to indicate which dimension to join along, so that you concatenate along the intended axis.
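The points above can be sketched in a few lines. This assumes two feature maps whose channel counts differ but whose batch and spatial dimensions match, as torch.cat requires:

```python
import torch

# Two feature maps that agree in every dimension except channels (dim=1)
c2 = torch.randn(1, 64, 32, 32)
extra = torch.randn(1, 16, 32, 32)

# torch.cat joins along an existing dimension; all other dims must match
merged = torch.cat([c2, extra], dim=1)
assert merged.shape == (1, 80, 32, 32)

# torch.stack instead requires identically shaped tensors
# and creates a brand-new dimension at the given position
a = torch.randn(32, 32)
b = torch.randn(32, 32)
stacked = torch.stack([a, b], dim=0)
assert stacked.shape == (2, 32, 32)
```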

@fulinlin3855

Ok, thank you very much!
