Fix deferred lighting pass values not all working on M1 in WebGL2 (#10304)
# Objective

- On macOS M1 with WebGL2, the deferred lighting ID depth comparison fails for some values (including 1, the default). Note: this issue only occurs with WebGL2; native macOS M1 works with current bevy main.

## Solution

- Use `Depth16Unorm` for the lighting pass ID depth format. This format aliases the ID to the same value consistently (in [copy_deferred_lighting_id](https://github.com/bevyengine/bevy/blob/main/crates/bevy_core_pipeline/src/deferred/copy_deferred_lighting_id.wgsl#L15) and [deferred_lighting](https://github.com/bevyengine/bevy/blob/main/crates/bevy_pbr/src/deferred/deferred_lighting.wgsl#L39)) on macOS M1 WebGL2, and appears to be supported across WebGL2, WebGPU, DX12, OpenGL 3.3, and Vulkan.

Successfully tested all 256 IDs on:

- macOS M1, native and WebGL2
- Windows RTX 3060, Vulkan/DX12/WebGL2
- Windows Intel UHD Graphics 630 IGP, DX12/WebGL2 (bevy with Vulkan doesn't work on this IGP in general: https://github.com/bevyengine/bevy/issues/8037)
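To illustrate why a 16-bit normalized depth format keeps the comparison stable, here is a minimal Rust sketch. It is not Bevy code: it assumes the ID is written as `id / 255.0` (as the linked shaders suggest) and models a `Depth16Unorm` attachment as simple round-to-nearest 16-bit quantization; the helper names are made up for the example.

```rust
use std::collections::HashSet;

/// Quantize a depth value in [0.0, 1.0] to a 16-bit unorm texel,
/// modeling what a `Depth16Unorm` attachment would store.
fn to_unorm16(depth: f32) -> u16 {
    (depth.clamp(0.0, 1.0) * f32::from(u16::MAX)).round() as u16
}

/// Convert the stored texel back to the float value the depth test sees.
fn from_unorm16(texel: u16) -> f32 {
    f32::from(texel) / f32::from(u16::MAX)
}

fn main() {
    let mut seen = HashSet::new();
    for id in 0u32..=255 {
        // Depth value written by the copy pass (id normalized to [0, 1]).
        let written = id as f32 / 255.0;
        // Texel actually held in the Depth16Unorm attachment.
        let stored = to_unorm16(written);
        // Float reconstructed from that texel when the attachment is read.
        let read_back = from_unorm16(stored);
        // Re-quantizing the reconstructed value lands on the same texel,
        // so writing and comparing the id aliases consistently.
        assert_eq!(to_unorm16(read_back), stored);
        // Adjacent ids are ~257 unorm16 steps apart, so no two ids collide.
        assert!(seen.insert(stored), "id {id} collided with another id");
    }
    println!("all 256 ids round-trip to stable, distinct Depth16Unorm values");
}
```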
parent: e9a0d6ccc5
commit: d3e41e2ff7
```diff
@@ -12,7 +12,7 @@ use bevy_utils::{nonmax::NonMaxU32, FloatOrd};
 
 pub const DEFERRED_PREPASS_FORMAT: TextureFormat = TextureFormat::Rgba32Uint;
 pub const DEFERRED_LIGHTING_PASS_ID_FORMAT: TextureFormat = TextureFormat::R8Uint;
-pub const DEFERRED_LIGHTING_PASS_ID_DEPTH_FORMAT: TextureFormat = TextureFormat::Depth32Float;
+pub const DEFERRED_LIGHTING_PASS_ID_DEPTH_FORMAT: TextureFormat = TextureFormat::Depth16Unorm;
 
 /// Opaque phase of the 3D Deferred pass.
 ///
```
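For context, the sketch below shows how a depth attachment for the lighting pass ID could be described with the changed constant, using wgpu descriptor types. It is hypothetical: the helper name, label, and usage flags are assumptions for illustration, not Bevy's actual prepass texture setup.

```rust
use wgpu::{Extent3d, TextureDescriptor, TextureDimension, TextureFormat, TextureUsages};

// Mirrors the constant changed in this commit.
pub const DEFERRED_LIGHTING_PASS_ID_DEPTH_FORMAT: TextureFormat = TextureFormat::Depth16Unorm;

/// Hypothetical helper: describe a full-screen depth attachment holding the
/// deferred lighting pass ID (written by the copy pass, then compared by the
/// lighting pass with a depth-equal test).
fn lighting_pass_id_depth_descriptor(width: u32, height: u32) -> TextureDescriptor<'static> {
    TextureDescriptor {
        label: Some("deferred_lighting_pass_id_depth"),
        size: Extent3d {
            width,
            height,
            depth_or_array_layers: 1,
        },
        mip_level_count: 1,
        sample_count: 1,
        dimension: TextureDimension::D2,
        // Depth16Unorm quantizes the 256 possible ids consistently on M1 WebGL2
        // and is available on WebGL2, WebGPU, DX12, OpenGL 3.3, and Vulkan.
        format: DEFERRED_LIGHTING_PASS_ID_DEPTH_FORMAT,
        usage: TextureUsages::RENDER_ATTACHMENT,
        view_formats: &[],
    }
}

fn main() {
    // Print the descriptor for a 1080p target as a quick sanity check.
    println!("{:?}", lighting_pass_id_depth_descriptor(1920, 1080));
}
```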