[QST] Why does the fp8 conversion only have a float2fp8 function without PTX? #1564

Open
WtDMaO opened this issue May 31, 2024 · 3 comments
Comments

WtDMaO commented May 31, 2024

What is your question?
Why does the CUDA Toolkit only provide an implementation of double2fp8 for the conversion to FP8, while CUTLASS only provides float2fp8?
For FP16 and FP32 inputs, the CUDA Toolkit converts to FP8 by widening the bit width step by step up to double. Is this completely equivalent to a direct conversion?
Why is there no fp162fp8 implementation for the non-PTX path?
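
For concreteness, below is a minimal host-side sketch of the two entry points being contrasted: a float2fp8-style conversion and a double2fp8-style conversion. It assumes the public `cuda_fp8.h` intrinsics `__nv_cvt_float_to_fp8` and `__nv_cvt_double_to_fp8` (introduced around CUDA 11.8) with a `(value, saturation, interpretation)` argument order; it is only an illustration of the conversion paths in question, not the Toolkit's or CUTLASS's internal code.

```cpp
// Minimal host-side sketch (not CUTLASS's or the CUDA Toolkit's internal code).
// Assumes the cuda_fp8.h intrinsics __nv_cvt_float_to_fp8 / __nv_cvt_double_to_fp8
// with (value, saturation, interpretation) arguments; check your toolkit's header
// if the signatures differ.
#include <cuda_fp8.h>
#include <cstdio>

int main() {
  const float values[] = {0.3f, 1.0f, 448.0f, 1000.0f, -0.17f};
  for (float f : values) {
    // The float -> e4m3 entry point (what the question calls float2fp8).
    __nv_fp8_storage_t via_float =
        __nv_cvt_float_to_fp8(f, __NV_SATFINITE, __NV_E4M3);
    // The double -> e4m3 entry point at the top of the widening chain
    // (what the question calls double2fp8).
    __nv_fp8_storage_t via_double =
        __nv_cvt_double_to_fp8(static_cast<double>(f), __NV_SATFINITE, __NV_E4M3);
    printf("%10.4f -> float path 0x%02x, double path 0x%02x%s\n",
           f, (unsigned)via_float, (unsigned)via_double,
           via_float == via_double ? "" : "  (differ!)");
  }
  return 0;
}
```

Compiled with `nvcc`, both columns should print the same E4M3 byte for each input, since a float is exactly representable as a double and each path rounds only once, at the final narrowing to FP8.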


WtDMaO commented May 31, 2024

I speculate that this is because there is no performance requirement in non-PTX scenarios, and since widening the bit width is completely equivalent, only conversions from double (or other higher-bit-width inputs) are provided.
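
The equivalence argument can be made precise: every FP16 and FP32 value is exactly representable in FP64, so the widening steps introduce no rounding at all, and rounding happens exactly once, at the final narrowing to FP8. Under that reasoning the widened route must match a direct conversion. The sketch below checks this exhaustively over all 65536 FP16 bit patterns; it assumes the `cuda_fp8.h`/`cuda_fp16.h` host-side intrinsics named in the code (`__nv_cvt_halfraw_to_fp8`, `__nv_cvt_double_to_fp8`, `__half2float`), so treat it as an illustration rather than the library's own test.

```cpp
// Exhaustive check of the "widening is equivalent" claim (a sketch; assumes the
// cuda_fp8.h / cuda_fp16.h host-side intrinsics named below exist in your toolkit).
#include <cuda_fp16.h>
#include <cuda_fp8.h>
#include <cstdio>

int main() {
  unsigned mismatches = 0;
  for (unsigned bits = 0; bits < 0x10000u; ++bits) {
    __half_raw hr;
    hr.x = static_cast<unsigned short>(bits);

    // Route 1: fp16 -> fp8 directly from the raw half.
    __nv_fp8_storage_t direct =
        __nv_cvt_halfraw_to_fp8(hr, __NV_SATFINITE, __NV_E4M3);

    // Route 2: fp16 -> fp32 -> fp64 -> fp8. Both widenings are exact, so the
    // only rounding step is the final narrowing to fp8.
    float f = __half2float(__half(hr));
    __nv_fp8_storage_t widened =
        __nv_cvt_double_to_fp8(static_cast<double>(f), __NV_SATFINITE, __NV_E4M3);

    if (direct != widened) ++mismatches;
  }
  printf("mismatching fp16 inputs: %u of 65536\n", mismatches);
  return 0;
}
```

If the widening really is equivalent, this should report zero mismatches (NaN payload and sign details aside).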


This issue has been labeled inactive-30d due to no recent activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed. This issue will be labeled inactive-90d if there is no activity in the next 60 days.


This issue has been labeled inactive-90d due to no recent activity in the past 90 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed.
