
Question for the author: when fine-tuning the vit_b+rvsa+upernet model, I get an error that the module layer_decay_optimizer_constructor_vitae is missing; looking at the code, this file does not seem to exist? #19

Open
funny000 opened this issue Apr 15, 2024 · 5 comments

Comments

@funny000

No description provided.

@DotWang
Collaborator

DotWang commented Apr 15, 2024

@funny000 That module isn't needed; this time only ViT is used, not ViTAE. Just comment it out. I may not have cleaned it up completely.

@funny000
Author

Thanks for the reply, @DotWang. The problem above is solved. Now, running the vit_b+rvsa+upernet model, I hit this error: Expected more than 1 value per channel when training, got input size torch.Size([1, 768, 1, 1]). The dataset is Potsdam that I cropped myself, with size 512*512. Setting a breakpoint shows the error is raised inside `for ppm in self`, at `out = ppm(x)`. Is this because the dataset was not processed correctly?

@DotWang
Collaborator

DotWang commented Apr 16, 2024

@funny000 The PPM contains BN layers, so the batch size cannot be 1.
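The failure mode can be reproduced in isolation: the global-pooling branch of the PPM head collapses each feature map to 1x1, so with batch size 1 BatchNorm gets exactly one value per channel and cannot compute batch statistics. A minimal sketch in plain PyTorch:

```python
import torch
import torch.nn as nn

# The PPM's global-pooling branch reduces the feature map to 1x1, so with
# batch size 1 BatchNorm sees a single value per channel in training mode:
bn = nn.BatchNorm2d(768)
x = torch.randn(1, 768, 1, 1)

bn.train()
try:
    bn(x)  # batch statistics are undefined over a single value per channel
except ValueError as e:
    print(e)  # Expected more than 1 value per channel when training, ...

# With batch size >= 2 the batch statistics are well defined and BN works:
out = bn(torch.randn(2, 768, 1, 1))
print(out.shape)  # torch.Size([2, 768, 1, 1])
```

So the dataset cropping is fine; increasing `samples_per_gpu`/batch size above 1 avoids the error (eval mode is unaffected, since it uses running statistics).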

@funny000
Author

Thank you very much, @DotWang. The SAMRS models now run, but when fine-tuning MTP, for example with the RVSA_MTP module in vit_rvsa_mtp.py, do I need to reinstall the mmseg package? Running this model now reports that the RVSA_MTP module is not registered.

@funny000
Author


This problem is solved: copy the py file above to the corresponding location in mmseg, and add an import of the unregistered module in __init__.py.
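The reason the import fixes it: mmseg-style registries only learn about a class when the decorated module is actually imported. A simplified sketch of the pattern (not mmseg's real code, and `RVSA_MTP` here is a stub standing in for the real backbone):

```python
# Simplified stand-in for mmseg's model registry; the real project registers
# backbones via mmseg's MODELS/BACKBONES registry when the defining file is
# imported from the package's __init__.py.
class Registry:
    def __init__(self, name):
        self.name = name
        self._module_dict = {}

    def register_module(self):
        def _register(cls):
            self._module_dict[cls.__name__] = cls
            return cls
        return _register

    def get(self, key):
        return self._module_dict.get(key)

BACKBONES = Registry('backbone')

# In vit_rvsa_mtp.py the decorator registers the class by name, but the
# decorator only runs if the file is imported somewhere. Without the import
# in __init__.py, building from a config fails with
# "RVSA_MTP is not in the backbone registry".
@BACKBONES.register_module()
class RVSA_MTP:
    pass

print(BACKBONES.get('RVSA_MTP'))
```

In recent mmengine-based versions, an alternative to editing the installed package is pointing the config at the file, e.g. `custom_imports = dict(imports=['vit_rvsa_mtp'], allow_failed_imports=False)`, which triggers the same import at build time.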
