entropy coding #6
Comments
Hi again,
Directly train without any augmentation.
The split between training and validation was 0.95 / 0.05?
Is the Owlii dataset quantized? In our training, the coordinates of the Owlii dataset are quantized to 10 bits. An example quantization script is https://github.com/ftyaaa/quantizer.git. Also, can you describe the overfitting problem in detail?
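For readers following along, here is a minimal sketch of what 10-bit coordinate quantization can look like. This is not the linked quantizer.git script; the use of Open3D for PLY I/O and the file names are assumptions.

```python
import numpy as np
import open3d as o3d  # assumption: Open3D is available for PLY I/O


def quantize_ply_10bit(in_path, out_path, bit_depth=10):
    """Quantize point coordinates onto a 2**bit_depth integer voxel grid."""
    pcd = o3d.io.read_point_cloud(in_path)
    pts = np.asarray(pcd.points)

    # Shift to the origin and scale uniformly so the largest extent fits in [0, 2**bit_depth - 1].
    mins = pts.min(axis=0)
    scale = (2 ** bit_depth - 1) / (pts - mins).max()
    q = np.round((pts - mins) * scale).astype(np.int32)

    # Rounding can map several points to the same voxel; keep each voxel once.
    q = np.unique(q, axis=0)

    out = o3d.geometry.PointCloud()
    out.points = o3d.utility.Vector3dVector(q.astype(np.float64))
    o3d.io.write_point_cloud(out_path, out, write_ascii=True)


if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    quantize_ply_10bit("owlii_frame_0001.ply", "owlii_frame_0001_vox10.ply")
```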
I didn't quantize the data; I just changed the scaling factor to 8 to reduce the computation.
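If "scaling factor" here means dividing the raw coordinates by 8 before voxelization, that step might look like the sketch below (the function name and the deduplication choice are assumptions, not the repository's code):

```python
import numpy as np


def downscale_coords(points, factor=8):
    """Divide raw coordinates by `factor`, round to the voxel grid, and drop duplicate voxels."""
    voxels = np.round(np.asarray(points, dtype=np.float64) / factor).astype(np.int32)
    return np.unique(voxels, axis=0)
```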
Can you share the complete project files, the trained model, and the commands for running training and testing? Please send an e-mail to [email protected]
Hi, I want to implement the algorithm from the article exactly.
1. How do I use .npy files for the Owlii dataset? Where did you convert the dataset format? (See the conversion sketch after this list.)
2. Where did you use G-PCC for lossless compression? There is no reference to G-PCC in the LosslessCompressor() definition.
3. Why don't you use factorized_entropy_coding() during training?
I'd be grateful if you could answer these.
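Regarding question 1, one possible way to convert Owlii .ply frames to .npy is sketched below. This is only a guess at the preprocessing, not the repository's actual script; the use of Open3D and the directory names are assumptions.

```python
import glob
import os

import numpy as np
import open3d as o3d  # assumption: Open3D is used for reading .ply files


def ply_dir_to_npy(ply_dir, npy_dir):
    """Read every .ply frame and save its Nx3 coordinate array as a .npy file."""
    os.makedirs(npy_dir, exist_ok=True)
    for ply_path in sorted(glob.glob(os.path.join(ply_dir, "*.ply"))):
        pts = np.asarray(o3d.io.read_point_cloud(ply_path).points, dtype=np.float32)
        name = os.path.splitext(os.path.basename(ply_path))[0] + ".npy"
        np.save(os.path.join(npy_dir, name), pts)


if __name__ == "__main__":
    # Hypothetical directory names for illustration only.
    ply_dir_to_npy("owlii_ply", "owlii_npy")
```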