Hello. My understanding of P-tuning is that the backbone large language model is frozen and only the prompt embedding model in front of it is trained. However, in your implementation (the optimizer section of https://github.com/THUDM/P-tuning/blob/main/PT-Fewshot/pet/wrapper.py), the backbone language model's parameters are also fine-tuned. Am I misunderstanding something here?
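For context, the distinction being asked about can be sketched as follows. This is a minimal illustrative PyTorch example of the "frozen backbone, trainable prompt" setup the questioner describes; `backbone` and `prompt_embedding` are hypothetical stand-ins, not the actual classes in THUDM/P-tuning, and the repo's optimizer in `wrapper.py` does in fact include the backbone parameters as well.

```python
import torch
from torch import nn

# Hypothetical stand-ins: `backbone` plays the role of the pretrained LM,
# `prompt_embedding` the trainable continuous prompt. Neither is the real
# P-tuning code; this only illustrates the "frozen backbone" variant.
backbone = nn.Linear(8, 8)             # pretrained model (stand-in)
prompt_embedding = nn.Embedding(4, 8)  # trainable prompt encoder (stand-in)

# Freeze the backbone so the optimizer never updates its weights.
for p in backbone.parameters():
    p.requires_grad = False

# Only the prompt parameters are handed to the optimizer.
optimizer = torch.optim.Adam(prompt_embedding.parameters(), lr=1e-3)

# One toy training step: embed prompt token ids, pass through the frozen
# backbone, backpropagate, and update only the prompt embedding.
ids = torch.tensor([0, 1, 2, 3])
loss = backbone(prompt_embedding(ids)).sum()
loss.backward()
optimizer.step()

# The frozen backbone accumulates no gradients; the prompt embedding does.
assert all(p.grad is None for p in backbone.parameters())
assert prompt_embedding.weight.grad is not None
```

By contrast, the optimizer in the repo's `wrapper.py` receives the backbone's parameters too, which is what prompted the question above.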
My understanding is also that the whole model needs to be fine-tuned.
You can refer to an earlier issue: #4