
Vulkan: Fix float16 use on devices without float16 support + fix subgroup_size_control validation error #11161

Merged 2 commits into master from 0cc4m/vulkan-fix-float16-validation on Jan 10, 2025

Conversation

0cc4m (Collaborator) commented on Jan 9, 2025:

@netrunnereve This is gonna cause a conflict with #11081. It might also have performance implications on devices that support float16, but if we do want to use float16 in mul_mat_vec shaders, then it has to be split up into a float32 and a float16 variant, similar to mul_mm.
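For context, a minimal sketch of the kind of split described above, on the host side. The names (`vk_device_caps`, `select_mul_mat_vec_pipeline`, the pipeline variables) are illustrative, not the actual ggml-vulkan symbols: the backend would compile both a float32 and a float16 shader variant and pick one per device, the way mul_mm already does.

```cpp
// Hypothetical sketch: choose between a float32 and a float16 mul_mat_vec
// pipeline variant based on the device's reported float16 shader support.
struct vk_device_caps {
    bool shader_float16; // e.g. VkPhysicalDeviceShaderFloat16Int8Features::shaderFloat16
};

struct vk_pipeline; // opaque handle to a compiled compute pipeline

vk_pipeline * select_mul_mat_vec_pipeline(const vk_device_caps & caps,
                                           vk_pipeline * pipeline_f32,
                                           vk_pipeline * pipeline_f16) {
    // Devices without 16-bit float arithmetic must fall back to the
    // float32 shader; devices with support can use the float16 variant.
    return caps.shader_float16 ? pipeline_f16 : pipeline_f32;
}
```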

0cc4m requested a review from netrunnereve on January 9, 2025.
The github-actions bot added the Vulkan (Issues specific to the Vulkan backend) and ggml (changes relating to the ggml tensor library for machine learning) labels on January 9, 2025.
jeffbolznv (Collaborator) commented:

LGTM. I didn't perf test it because the changes look harmless. I think the f16vec2s might have been my fault?

netrunnereve (Collaborator) left a review:

Thanks for the heads up. This looks good and I don't see any slowdowns on my system, which makes sense as it only supports FP32.
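For reference, a minimal sketch (standard Vulkan API calls, not the exact ggml-vulkan code) of how a backend can query whether a device supports 16-bit float arithmetic in shaders. On a device where this returns false, such as the reviewer's FP32-only GPU, only the float32 path is used, which is why no slowdown is expected there.

```cpp
#include <vulkan/vulkan.h>

// Query whether the physical device supports float16 arithmetic in shaders
// (Vulkan 1.2 feature, also available via VK_KHR_shader_float16_int8).
bool device_supports_shader_float16(VkPhysicalDevice phys_dev) {
    VkPhysicalDeviceShaderFloat16Int8Features fp16_features = {};
    fp16_features.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SHADER_FLOAT16_INT8_FEATURES;

    VkPhysicalDeviceFeatures2 features2 = {};
    features2.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2;
    features2.pNext = &fp16_features;

    vkGetPhysicalDeviceFeatures2(phys_dev, &features2);
    return fp16_features.shaderFloat16 == VK_TRUE;
}
```

The subgroup_size_control half of the fix presumably rests on a similar check of the device's reported subgroup-size limits before requesting a required subgroup size; that query is not shown here.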

0cc4m merged commit c3f9d25 into master on Jan 10, 2025.
48 checks passed
0cc4m deleted the 0cc4m/vulkan-fix-float16-validation branch on January 10, 2025.
Labels: ggml (changes relating to the ggml tensor library for machine learning), Vulkan (Issues specific to the Vulkan backend)