
# Bevy Solari <img src="https://github.com/user-attachments/assets/94061fc8-01cf-4208-b72a-8eecad610d76" width="100" />

## Preface

- See release notes.
- Please talk to me in #rendering-dev on discord or open a github discussion if you have questions about the long term plan, and keep discussion in this PR limited to the contents of the PR :)

## Connections

- Works towards #639, #16408.
- Spawned https://github.com/bevyengine/bevy/issues/18993.
- Need to fix RT stuff in naga_oil first: https://github.com/bevyengine/naga_oil/pull/116.

## This PR

After nearly two years, I've revived the raytraced lighting effort I first started in https://github.com/bevyengine/bevy/pull/10000. Unlike that PR, which included realtime techniques, I've limited this PR to:

* `RaytracingScenePlugin` - BLAS and TLAS building, geometry and texture binding, sampling functions.
* `PathtracingPlugin` - A non-realtime path tracer intended to serve as a testbed and reference.

## What's implemented?

* BLAS building on mesh load
* Emissive lights
* Directional lights with soft shadows
* Diffuse (Lambert, not Bevy's diffuse BRDF) and emissive materials
* A reference path tracer with:
  * Antialiasing
  * Direct light sampling (next event estimation) with 0/1 MIS weights
  * Importance-sampled BRDF bounces
  * Russian roulette

## What's _not_ implemented?

* Anything realtime, including a real-time denoiser
* Integration with Bevy's rasterized gbuffer
* Specular materials
* Non-opaque geometry
* Any sort of CPU or GPU optimizations
* BLAS compaction, proper bindless, and further RT APIs are things that we need wgpu to add
* PointLights, SpotLights, or skyboxes / environment lighting
* Support for materials other than StandardMaterial (and only a subset of properties are supported)
* Skinned/morphed or otherwise animating/deformed meshes
* Mipmaps
* Adaptive self-intersection ray bias
* A good way for developers to detect whether the user's GPU supports RT or not, and fall back to baked lighting
* Documentation and actual finalized APIs (literally everything is subject to change)

## End-user Usage

* Have a GPU that supports RT with inline ray queries.
* Add `SolariPlugin` to your app.
* Ensure any `Mesh` asset you want to use for raytracing has `enable_raytracing: true` (defaults to true), and that it uses the standard uncompressed position/normal/uv_0/tangent vertex attribute set, triangle list topology, and 32-bit indices.
* If you don't want to build a BLAS and use the mesh for RT, set `enable_raytracing` to false.
* Add the `RaytracingMesh3d` component to your entity (separate from `Mesh3d` or `MeshletMesh3d`).

A minimal sketch of these steps is shown below, ahead of the full `solari.rs` example.

## Testing

- Did you test these changes? If so, how?
  - Ran the solari example.
- Are there any parts that need more testing?
  - Other test scenes, probably. Normal mapping would be good to test.
- How can other people (reviewers) test your changes? Is there anything specific they need to know?
  - See the solari.rs example for how to set up raytracing.
- If relevant, what platforms did you test these changes on, and are there any important ones you can't test?
  - Windows 11, NVIDIA RTX 3080.

---------

Co-authored-by: atlv <email@atlasdostal.com>
Co-authored-by: IceSentry <IceSentry@users.noreply.github.com>
Co-authored-by: Carter Anderson <mcanders1@gmail.com>
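
For illustration, here is a minimal sketch of the End-user Usage steps above. This is not code from the PR: the asset path, mesh, and material are placeholders, and it assumes the material is supplied via `MeshMaterial3d<StandardMaterial>` on the same entity, as in the example further down. To actually see output you also need a camera set up for the path tracer, as shown in the full `solari.rs` example below.

```rust
use bevy::prelude::*;
use bevy::solari::prelude::{RaytracingMesh3d, SolariPlugin};

fn main() {
    App::new()
        // SolariPlugin requires a GPU that supports inline ray queries.
        .add_plugins((DefaultPlugins, SolariPlugin))
        .add_systems(Startup, setup)
        .run();
}

fn setup(
    mut commands: Commands,
    asset_server: Res<AssetServer>,
    mut materials: ResMut<Assets<StandardMaterial>>,
) {
    // Placeholder mesh asset. It must keep `enable_raytracing: true` (the default) and use
    // the standard position/normal/uv_0/tangent attributes, triangle list topology, and
    // 32-bit indices.
    let mesh: Handle<Mesh> = asset_server.load(
        GltfAssetLabel::Primitive { mesh: 0, primitive: 0 }.from_asset("models/MyMesh.glb"),
    );

    commands.spawn((
        // `RaytracingMesh3d` is a separate component from `Mesh3d` / `MeshletMesh3d`.
        RaytracingMesh3d(mesh),
        MeshMaterial3d(materials.add(StandardMaterial::default())),
        Transform::default(),
    ));
}
```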

The `solari.rs` example from this PR:

```rust
//! Demonstrates realtime dynamic global illumination rendering using Bevy Solari.

#[path = "../helpers/camera_controller.rs"]
mod camera_controller;

use bevy::{
    prelude::*,
    render::{camera::CameraMainTextureUsages, mesh::Indices, render_resource::TextureUsages},
    scene::SceneInstanceReady,
    solari::{
        pathtracer::Pathtracer,
        prelude::{RaytracingMesh3d, SolariPlugin},
    },
};
use camera_controller::{CameraController, CameraControllerPlugin};
use std::f32::consts::PI;

fn main() {
    App::new()
        .add_plugins((DefaultPlugins, SolariPlugin, CameraControllerPlugin))
        .add_systems(Startup, setup)
        .run();
}

fn setup(mut commands: Commands, asset_server: Res<AssetServer>) {
    // Load the Cornell Box scene, then swap its meshes over to raytracing once it's ready.
    commands
        .spawn(SceneRoot(asset_server.load(
            GltfAssetLabel::Scene(0).from_asset("models/CornellBox/CornellBox.glb"),
        )))
        .observe(add_raytracing_meshes_on_scene_load);

    commands.spawn((
        DirectionalLight {
            illuminance: light_consts::lux::FULL_DAYLIGHT,
            shadows_enabled: true,
            ..default()
        },
        Transform::from_rotation(Quat::from_euler(EulerRot::XYZ, PI * -0.43, PI * -0.08, 0.0)),
    ));

    commands.spawn((
        Camera3d::default(),
        Camera {
            clear_color: ClearColorConfig::Custom(Color::BLACK),
            ..default()
        },
        CameraController {
            walk_speed: 500.0,
            run_speed: 1500.0,
            ..Default::default()
        },
        Pathtracer::default(),
        // The path tracer writes its output to the camera's main texture via a storage binding.
        CameraMainTextureUsages::default().with(TextureUsages::STORAGE_BINDING),
        Transform::from_xyz(-278.0, 273.0, 800.0),
    ));
}

fn add_raytracing_meshes_on_scene_load(
    trigger: On<SceneInstanceReady>,
    children: Query<&Children>,
    mesh: Query<&Mesh3d>,
    mut meshes: ResMut<Assets<Mesh>>,
    mut commands: Commands,
) {
    // Ensure meshes are bevy_solari compatible: drop the second UV set, generate tangents,
    // and convert any 16-bit indices to the required 32-bit indices.
    for (_, mesh) in meshes.iter_mut() {
        mesh.remove_attribute(Mesh::ATTRIBUTE_UV_1.id);
        mesh.generate_tangents().unwrap();

        if let Some(indices) = mesh.indices_mut() {
            if let Indices::U16(u16_indices) = indices {
                *indices = Indices::U32(u16_indices.iter().map(|i| *i as u32).collect());
            }
        }
    }

    // Replace Mesh3d with RaytracingMesh3d on every mesh entity in the loaded scene.
    for descendant in children.iter_descendants(trigger.target().unwrap()) {
        if let Ok(mesh) = mesh.get(descendant) {
            commands
                .entity(descendant)
                .insert(RaytracingMesh3d(mesh.0.clone()))
                .remove::<Mesh3d>();
        }
    }
}
```