About the pretrained model #5
Training epochs can be a bit confusing, I agree. In Incremental Network Quantization you have two kinds of iterations. The first is the number of times you do a new weight partitioning, where you determine which weights get fixed and which will still be trained. The second is the number of training epochs you run within each of these quantization iterations (see the sketch below). The only thing you should have to change in the code is the path to ImageNet; the rest is already set up to quantize resnet18.
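To make the two levels concrete, here is a minimal, self-contained sketch of how the nested loops fit together. Everything in it (the `partition_and_quantize` and `train_one_epoch` helpers, the toy model, the 50/75/87.5/100% accumulated-portion schedule) is an illustrative assumption and a simplification, not this repository's actual API; in particular, real INQ only re-partitions among the weights that are still unquantized.

```python
# Illustrative sketch of INQ's two nested iteration levels (hypothetical helpers,
# not this repo's API). Outer loop = quantization iterations with a new weight
# partitioning; inner loop = ordinary training epochs (the `epochs = 4` setting).
import torch
import torch.nn as nn

model = nn.Linear(8, 2)                      # stand-in for resnet18
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
# mask == 1 means the weight is still trainable, 0 means it is frozen/quantized
masks = {n: torch.ones_like(p) for n, p in model.named_parameters()}

def partition_and_quantize(fraction):
    """Fix the largest-magnitude `fraction` of each layer's weights to
    power-of-two values and mark them as frozen (simplified partitioning)."""
    for name, p in model.named_parameters():
        k = int(fraction * p.numel())
        if k == 0:
            continue
        idx = torch.topk(p.detach().abs().flatten(), k).indices
        masks[name].view(-1)[idx] = 0.0       # these weights are no longer trained
        vals = p.data.view(-1)[idx]
        quant = torch.sign(vals) * 2.0 ** torch.round(
            torch.log2(vals.abs().clamp(min=1e-8)))
        p.data.view(-1)[idx] = quant          # snap them to powers of two

def train_one_epoch():
    """One ordinary training epoch on dummy data; gradients of frozen
    weights are masked out so only the remaining weights are updated."""
    x, y = torch.randn(32, 8), torch.randint(0, 2, (32,))
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    for name, p in model.named_parameters():
        p.grad *= masks[name]                 # keep quantized weights fixed
    optimizer.step()

# Outer loop: quantization iterations, each with a new weight partitioning.
for fraction in [0.5, 0.75, 0.875, 1.0]:
    partition_and_quantize(fraction)
    # Inner loop: the training epochs run within each quantization iteration.
    for epoch in range(4):
        train_one_epoch()
```

So `epochs = 4` in the example script refers to the inner loop only: after every re-partitioning step, the still-unquantized weights are retrained for that many epochs before the next portion of weights gets quantized and frozen.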
I changed nothing about the code except the data path, but I found that when I began to quantize the model, the loss was large. Is that correct? My environment is Python 3.6 and PyTorch 1.0.1. => using pre-trained model 'resnet18'
Hey, I just reran the example file in a Docker container, only modifying the data path. My output looks like this:
So the loss should be way lower and the accuracy higher. I also noticed you have fewer epochs. Are you sure you have the correct dataset? (The fact that your validation error is correct makes me think that you do have the right dataset, however.) On how many GPUs are you running the code (not that it should matter)? -- In case someone else stumbles upon this issue and has run the code, could you let me know if everything works fine?
I find this line a bit weird:
Hello, I am confused about the training epochs. In your code you set the epochs to 4; if I want to quantize resnet18, do I need to change it? And do you have a quantized model of resnet18 at bit widths other than 5? Thank you!