Question about model parallel in minillm/dpkd #295

Open
DiaQusNet opened this issue Dec 27, 2024 · 1 comment

Comments

@DiaQusNet

Hi, I noticed that the implementation of minillm and dpkd requires installing your customized version of the transformers library. After reviewing the code, I found that your modifications mainly involve model parallelism. Can I instead use the official transformers library with model parallelism disabled? Would that allow me to run your full suite of code (including training and evaluation)?

@t1101675
Contributor

t1101675 commented Jan 4, 2025

We also modified the transformers library to implement teacher-mixed sampling (mixing the probabilities of the teacher and student models during decoding). The modified lines are wrapped with:

# ### MiniLLM BEGIN ###
... SOME NEW CODES ...
# ### MiniLLM END ###

This feature is required for MiniLLM training.
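
For reference, a minimal sketch of the idea behind teacher-mixed sampling, not the repository's actual implementation: at each decoding step, the next-token distribution is a weighted mixture of the student's and teacher's probabilities. The function name and the mixing weight alpha below are illustrative assumptions, not names from the modified transformers code.

import torch
import torch.nn.functional as F

def sample_mixed_token(student_logits, teacher_logits, alpha=0.2):
    # Mix the two next-token distributions:
    #   p_mix = (1 - alpha) * p_student + alpha * p_teacher
    # (alpha is a hypothetical mixing weight, not the repo's actual name)
    p_student = F.softmax(student_logits, dim=-1)
    p_teacher = F.softmax(teacher_logits, dim=-1)
    p_mix = (1.0 - alpha) * p_student + alpha * p_teacher
    # Sample one token id per sequence from the mixed distribution
    return torch.multinomial(p_mix, num_samples=1)

Because this mixing happens inside the generation loop of transformers itself, it is implemented by patching the library rather than in the training scripts, which is why the customized version is required for MiniLLM training.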
