Fix crash when loading textures from gltf. #7005
Conversation
…exture format from gltf
@nicopap I've tested it against all the models I have without any problems. What now? Should I wait for more test results, or for a code review?
Why remove support
```rust
    data = i.into_raw();
}
DynamicImage::ImageLuma16(i) => {
```
Could you explain why you removed support for the following image formats: Luma16, LumaA16, Rgb16, Rgba16, and the 32F variants?
Sorry for not providing enough context and information the first time.
I'm creating a 3D asset management tool based on Bevy. It needs to load and preview glTF 2.0 models at runtime, so stability and compatibility are the first priority.
I've tested compatibility with all kinds of glTF files, including every model from https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0 plus models I collected that trigger the crash in #7005. The current code provides maximum stability and compatibility, meaning all of those textures can be loaded and rendered without any problem.
I think we should build a solid, crash-free foundation first and extend compatibility case by case later, rather than trying to support as much as possible up front.
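The shape of the approach being discussed can be sketched as follows. This is a std-only illustration with stand-in types (`PixelSource`, `pick_format` are hypothetical names, not Bevy or `image` crate API): map only the variants known to work, and reject anything else at load time instead of crashing in the renderer.

```rust
// Stand-in for the subset of image::DynamicImage variants discussed above.
#[non_exhaustive]
enum PixelSource {
    Luma8(Vec<u8>),
    Rgba8(Vec<u8>),
    Rgba16(Vec<u16>),
}

// Stand-in for wgpu::TextureFormat.
#[derive(Debug, PartialEq)]
enum TextureFormat {
    R8Unorm,
    Rgba8Unorm,
    Rgba16Unorm,
}

// Map only the variants known to work with the default pipeline; everything
// else is rejected with an error instead of producing a texture that
// crashes at render time.
fn pick_format(src: &PixelSource) -> Result<TextureFormat, String> {
    match src {
        PixelSource::Luma8(_) => Ok(TextureFormat::R8Unorm),
        PixelSource::Rgba8(_) => Ok(TextureFormat::Rgba8Unorm),
        PixelSource::Rgba16(_) => Ok(TextureFormat::Rgba16Unorm),
        // The #[non_exhaustive] enum forces a catch-all arm: future
        // variants fail loudly at load time rather than panicking later.
        _ => Err("unsupported image format".to_string()),
    }
}

fn main() {
    let img = PixelSource::Rgba16(vec![0u16; 16]);
    assert_eq!(pick_format(&img), Ok(TextureFormat::Rgba16Unorm));
}
```

The trade-off raised below still applies: a reject-by-default policy buys stability at the cost of flexibility for user-defined shaders.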
What about custom shaders? Bevy users might want to load arbitrary image formats for their own shaders, even if they don't work with the default PBR shader.
Any updates on this? If I understood @AllenDang's answer correctly, the version he proposes only keeps de jure support for the formats Bevy de facto supports right now. That seems reasonable to me, does it not?
@janhohenheim Could you drop a review? The changes are quite trivial, and if you understand the implications of the bug, your input will be greatly appreciated. I personally barely understand them and feel like I don't have the authority to review it.
To me it seems reasonable to remove handling of image formats that invariably crash Bevy. I'm on edge because of Chesterton's fence: a lot of the removed code seems to do duplicate or useless work, but who knows whether it actually serves a purpose?
I'm worried about flexibility, though. Maybe some texture formats don't work with PBR, but they might work with user-defined shaders.
```rust
}
// DynamicImage is now non exhaustive, catch future variants and convert them
```
Why remove this line? This is still true as far as I am aware.
@nicopap sure, I can review this tomorrow :)
Any progress?
@AllenDang thanks for the reminder, I forgot to get back to you! How about documenting the limitation with something like:

```rust
/// "This type is currently not compatible with [`StandardMaterial`]"
```
In any case, thanks for the PR so far and for fixing some crashes I had :)
@janhohenheim The crash happens in a critical, major workflow of 3D game making. I had a glb file, I previewed it with glTF's official viewer, and everything seemed fine. If some image types are known to be incompatible with StandardMaterial and lead to a runtime crash, why are we still trying to use a universal image loader? Anyone could add a new image type for some reason and introduce a runtime crash at any time. My suggestion is not just to add some comments, but to create a new image loader dedicated to StandardMaterial.
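One possible shape for such a dedicated loader (a std-only sketch with hypothetical names, not Bevy API): rather than rejecting unrecognized formats, normalize everything to 8-bit RGBA, which the default PBR pipeline is known to accept.

```rust
// Hypothetical decoded-image variants; real code would match on
// image::DynamicImage instead.
enum Decoded {
    Rgba8(Vec<u8>),
    Luma16(Vec<u16>),
}

// A loader dedicated to StandardMaterial: whatever comes in, what comes
// out is always 8-bit RGBA, so the material can never see an
// incompatible texture format.
fn load_for_standard_material(decoded: Decoded) -> Vec<u8> {
    match decoded {
        Decoded::Rgba8(data) => data,
        Decoded::Luma16(data) => data
            .into_iter()
            .flat_map(|g| {
                let g8 = (g >> 8) as u8; // keep the high byte of each sample
                [g8, g8, g8, u8::MAX]    // replicate gray into RGB, opaque alpha
            })
            .collect(),
    }
}

fn main() {
    let rgba = load_for_standard_material(Decoded::Luma16(vec![0xFF00]));
    assert_eq!(rgba, vec![0xFF, 0xFF, 0xFF, 0xFF]);
}
```

Narrowing to 8 bits loses precision, which is exactly the flexibility concern raised for custom shaders above.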
@AllenDang I see what you mean. How would you go about passing the data to other shaders that use these formats?
@janhohenheim On second thought, I think we need to go back and find the root cause of the crash... I'm not familiar with WGSL or WebGPU; do you have any clue?
I'm not knowledgeable enough, sorry. Maybe a rendering expert can help out: summoning @superdump, @robtfm
This surfaced recently: a discussion with a user resulted in finding out that
this seems to work:

```diff
diff --git a/crates/bevy_render/src/texture/image_texture_conversion.rs b/crates/bevy_render/src/texture/image_texture_conversion.rs
index 71eeff23e..5c372995d 100644
--- a/crates/bevy_render/src/texture/image_texture_conversion.rs
+++ b/crates/bevy_render/src/texture/image_texture_conversion.rs
@@ -82,7 +82,7 @@ impl Image {
             DynamicImage::ImageRgb16(image) => {
                 width = image.width();
                 height = image.height();
-                format = TextureFormat::Rgba16Uint;
+                format = TextureFormat::Rgba16Unorm;

                 let mut local_data =
                     Vec::with_capacity(width as usize * height as usize * format.pixel_size());
@@ -106,7 +106,7 @@ impl Image {
             DynamicImage::ImageRgba16(i) => {
                 width = i.width();
                 height = i.height();
-                format = TextureFormat::Rgba16Uint;
+                format = TextureFormat::Rgba16Unorm;

                 let raw_data = i.into_raw();
```

It seems like currently, when loading images, we choose the format somewhat arbitrarily: 8bpp formats are treated as Unorm and 16bpp formats are treated as Uint. Given that there isn't any further info available at the point the image is built, we need to decide whether they should be treated as Unorm [0.0, 1.0], Snorm [-1.0, 1.0], or Uint [0, 65535] by default. I think choosing Unorm is likely fine since it is consistent with the 8bpp formats. @superdump, thoughts?
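The difference between those three interpretations comes down to how a shader decodes the same raw 16-bit texel. A std-only sketch (the function names are illustrative, not wgpu API):

```rust
// A 16-bit texel channel holds a raw u16. How a shader reads it depends
// on the declared texture format:
//   Rgba16Uint  -> the integer value itself, in [0, 65535]
//   Rgba16Unorm -> the value normalized to [0.0, 1.0] (v / 65535)
//   Rgba16Snorm -> reinterpreted as i16, normalized to [-1.0, 1.0]
fn as_uint(raw: u16) -> u32 {
    raw as u32
}

fn as_unorm(raw: u16) -> f32 {
    raw as f32 / u16::MAX as f32
}

fn as_snorm(raw: u16) -> f32 {
    // clamp so that both -32767 and -32768 map to -1.0
    ((raw as i16) as f32 / i16::MAX as f32).max(-1.0)
}

fn main() {
    let raw: u16 = 0xFFFF; // a "fully white" 16-bit channel
    assert_eq!(as_uint(raw), 65535);
    assert!((as_unorm(raw) - 1.0).abs() < 1e-6);
    // Declaring the texture as Uint when the shader expects [0.0, 1.0]
    // colors is why 16-bit images misbehave with the PBR pipeline:
    // Uint formats are also not filterable, unlike Unorm.
    println!("unorm = {}", as_unorm(raw));
}
```

This also shows why Unorm is the consistent default: an 8bpp `0xFF` and a 16bpp `0xFFFF` both decode to 1.0.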
Merge upstream
Merge from upstream
…9611)

# Objective

Fix bevyengine#8185, bevyengine#6710. Replace bevyengine#7005 (closed). rgb and rgba 16-bit textures currently default to `Rgba16Uint`; the more common use is `Rgba16Unorm`, which also matches the default type of rgb8 and rgba8 textures.

## Solution

Change the default to `Rgba16Unorm`.
Refine image texture conversion to fix crash when loading specified textures from gltf.
Objective
Fixes #6710
Solution
Unify format conversion by using `ConvertBuffer` from the `image` crate and set the correct format.
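A rough illustration of what such a unifying conversion does (a std-only sketch; the actual PR calls the `image` crate's `ConvertBuffer::convert()` rather than a hand-rolled loop like this):

```rust
// Expand a single-channel 16-bit (Luma16) buffer into an RGBA16 buffer,
// so every DynamicImage variant can funnel into one GPU-friendly layout
// (e.g. Rgba16Unorm). Gray is replicated into R, G, B; alpha is opaque.
fn luma16_to_rgba16(luma: &[u16]) -> Vec<u16> {
    let mut rgba = Vec::with_capacity(luma.len() * 4);
    for &g in luma {
        rgba.extend_from_slice(&[g, g, g, u16::MAX]);
    }
    rgba
}

fn main() {
    let pixels = luma16_to_rgba16(&[0, 0x8000, 0xFFFF]);
    // 3 input texels -> 3 * 4 = 12 output channels
    assert_eq!(pixels.len(), 12);
    assert_eq!(&pixels[4..8], &[0x8000, 0x8000, 0x8000, 0xFFFF]);
}
```

Funneling everything through one pixel-type conversion is what removes the per-variant special cases that caused the crash in the first place.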