
This PR adds support for *mixed lighting* to Bevy, whereby some parts of the scene are lightmapped while others take part in real-time lighting. (Here *real-time lighting* means lighting computed at runtime via the PBR shader, as opposed to light precomputed into lightmaps.) It does so by adding a new field, `affects_lightmapped_meshes`, to `IrradianceVolume` and `AmbientLight`, and a corresponding field, `affects_lightmapped_mesh_diffuse`, to `DirectionalLight`, `PointLight`, `SpotLight`, and `EnvironmentMapLight`. By default, these fields are set to true; when set to false, the light contributes nothing to the diffuse irradiance of meshes with lightmaps. Note that specular light is unaffected. This is because the correct way to bake specular lighting is *directional lightmaps*, which we have no support for yet.

There are two general ways I expect these fields to be used:

1. When diffuse indirect light is baked into lightmaps, irradiance volumes and reflection probes shouldn't contribute any diffuse light to the static geometry that has a lightmap. That's because the baking tool should have already accounted for it, and in a higher-quality fashion, as lightmaps typically offer a higher effective texture resolution than the light probe does.

2. When direct diffuse light is baked into a lightmap, punctual lights shouldn't contribute any diffuse light to static geometry with a lightmap, to avoid double-counting. It may seem odd to bake *direct* light into a lightmap, as opposed to indirect light. But there is a use case: in a scene with many lights, avoiding light leaks requires shadow mapping, which quickly becomes prohibitive when many lights are involved. Baking lightmaps allows light leaks to be eliminated on static geometry.

A new example, `mixed_lighting`, has been added. It demonstrates a sofa (model from the [glTF Sample Assets]) that has been lightmapped offline using [Bakery]. It has four modes:

1. In *baked* mode, all objects are locked in place, and all the diffuse direct and indirect light has been calculated ahead of time. Note that the bottom of the sphere has a red tint from the sofa, illustrating that the baking tool captured indirect light for it.

2. In *mixed direct* mode, lightmaps capturing diffuse direct and indirect light have been pre-calculated for the static objects, but the dynamic sphere has real-time lighting. Note that, because the diffuse lighting has been entirely pre-calculated for the scenery, the dynamic sphere casts no shadow. In a real app, you would typically use real-time lighting for the most important light, so that dynamic objects can shadow the scenery, and relegate baked lighting to the less important lights for which shadows aren't as important. Also note that there is no red tint on the sphere, because no global illumination is applied to it. In an actual game, you could fix this problem by supplementing the lightmapped objects with an irradiance volume.

3. In *mixed indirect* mode, all direct light is calculated in real-time, and the static objects have pre-calculated indirect lighting. This corresponds to the mode that most applications are expected to use. Because direct light on the scenery is computed dynamically, shadows are fully supported. As in mixed direct mode, there is no global illumination on the sphere; in a real application, irradiance volumes could be used to supplement the lightmaps.

4. In *real-time* mode, no lightmaps are used at all, and all punctual lights are rendered in real-time. No global illumination exists.

In the example, you can click around to move the sphere, unless you're in baked mode, in which case the sphere must be locked in place to be lit correctly.
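As a rough sketch of how these fields might be wired up for the mixed-indirect case described above (the field names come from this PR; the surrounding component shapes are assumed from Bevy's existing APIs and may not match the example's exact code):

```rust
use bevy::prelude::*;

fn setup_mixed_indirect(mut commands: Commands) {
    // The ambient light stands in for baked indirect light here, so it should
    // not also contribute diffuse light to meshes that carry a lightmap.
    commands.insert_resource(AmbientLight {
        brightness: 500.0,
        affects_lightmapped_meshes: false,
        ..default()
    });

    // Direct light stays real-time in mixed-indirect mode, so the default
    // (`affects_lightmapped_mesh_diffuse: true`) is what we want: the
    // lightmaps contain only indirect light, and this light must still
    // illuminate and shadow the lightmapped scenery.
    commands.spawn(DirectionalLight {
        shadows_enabled: true,
        ..default()
    });
}
```

In mixed-direct mode the flag would flip the other way: punctual lights whose diffuse contribution is baked would set `affects_lightmapped_mesh_diffuse: false` instead.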
## Showcase

Baked mode:

Mixed direct mode:

Mixed indirect mode (default):

Real-time mode:

## Migration guide

* The `AmbientLight` resource and the `IrradianceVolume` component now have `affects_lightmapped_meshes` fields. If you don't need to use that field (for example, if you aren't using lightmaps), you can safely set it to true.

* `DirectionalLight`, `PointLight`, `SpotLight`, and `EnvironmentMapLight` now have `affects_lightmapped_mesh_diffuse` fields. If you don't need to use that field (for example, if you aren't using lightmaps), you can safely set it to true.

[glTF Sample Assets]: https://github.com/KhronosGroup/glTF-Sample-Assets/tree/main
[Bakery]: https://geom.io/bakery/wiki/index.php?title=Bakery_-_GPU_Lightmapper
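For code that constructs these types with explicit struct literals rather than `..default()`, migrating is just a matter of adding the new field. A sketch (the other field values are illustrative, not from this PR):

```rust
use bevy::prelude::*;

fn migrated_point_light() -> PointLight {
    PointLight {
        intensity: 100_000.0,
        shadows_enabled: true,
        // New field: keep `true` (the default) unless this light's diffuse
        // contribution is already baked into your lightmaps.
        affects_lightmapped_mesh_diffuse: true,
        ..default()
    }
}
```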
//! Plays animations from a skinned glTF.

use std::{f32::consts::PI, time::Duration};

use bevy::{
    animation::{AnimationTargetId, RepeatAnimation},
    color::palettes::css::WHITE,
    pbr::CascadeShadowConfigBuilder,
    prelude::*,
};
use rand::{Rng, SeedableRng};
use rand_chacha::ChaCha8Rng;

const FOX_PATH: &str = "models/animated/Fox.glb";

fn main() {
    App::new()
        .insert_resource(AmbientLight {
            color: Color::WHITE,
            brightness: 2000.,
            ..default()
        })
        .add_plugins(DefaultPlugins)
        .init_resource::<ParticleAssets>()
        .init_resource::<FoxFeetTargets>()
        .add_systems(Startup, setup)
        .add_systems(Update, setup_scene_once_loaded)
        .add_systems(Update, (keyboard_animation_control, simulate_particles))
        .add_observer(observe_on_step)
        .run();
}

#[derive(Resource)]
struct SeededRng(ChaCha8Rng);

#[derive(Resource)]
struct Animations {
    animations: Vec<AnimationNodeIndex>,
    graph: Handle<AnimationGraph>,
}

#[derive(Event, Reflect, Clone)]
struct OnStep;

fn observe_on_step(
    trigger: Trigger<OnStep>,
    particle: Res<ParticleAssets>,
    mut commands: Commands,
    transforms: Query<&GlobalTransform>,
    mut seeded_rng: ResMut<SeededRng>,
) {
    let translation = transforms.get(trigger.target()).unwrap().translation();
    // Spawn a bunch of particles.
    for _ in 0..14 {
        let horizontal = seeded_rng.0.gen::<Dir2>() * seeded_rng.0.gen_range(8.0..12.0);
        let vertical = seeded_rng.0.gen_range(0.0..4.0);
        let size = seeded_rng.0.gen_range(0.2..1.0);
        commands.queue(spawn_particle(
            particle.mesh.clone(),
            particle.material.clone(),
            translation.reject_from_normalized(Vec3::Y),
            seeded_rng.0.gen_range(0.2..0.6),
            size,
            Vec3::new(horizontal.x, vertical, horizontal.y) * 10.0,
        ));
    }
}

fn setup(
    mut commands: Commands,
    asset_server: Res<AssetServer>,
    mut meshes: ResMut<Assets<Mesh>>,
    mut materials: ResMut<Assets<StandardMaterial>>,
    mut graphs: ResMut<Assets<AnimationGraph>>,
) {
    // Build the animation graph
    let (graph, node_indices) = AnimationGraph::from_clips([
        asset_server.load(GltfAssetLabel::Animation(2).from_asset(FOX_PATH)),
        asset_server.load(GltfAssetLabel::Animation(1).from_asset(FOX_PATH)),
        asset_server.load(GltfAssetLabel::Animation(0).from_asset(FOX_PATH)),
    ]);

    // Insert a resource with the current scene information
    let graph_handle = graphs.add(graph);
    commands.insert_resource(Animations {
        animations: node_indices,
        graph: graph_handle,
    });

    // Camera
    commands.spawn((
        Camera3d::default(),
        Transform::from_xyz(100.0, 100.0, 150.0).looking_at(Vec3::new(0.0, 20.0, 0.0), Vec3::Y),
    ));

    // Plane
    commands.spawn((
        Mesh3d(meshes.add(Plane3d::default().mesh().size(500000.0, 500000.0))),
        MeshMaterial3d(materials.add(Color::srgb(0.3, 0.5, 0.3))),
    ));

    // Light
    commands.spawn((
        Transform::from_rotation(Quat::from_euler(EulerRot::ZYX, 0.0, 1.0, -PI / 4.)),
        DirectionalLight {
            shadows_enabled: true,
            ..default()
        },
        CascadeShadowConfigBuilder {
            first_cascade_far_bound: 200.0,
            maximum_distance: 400.0,
            ..default()
        }
        .build(),
    ));

    // Fox
    commands.spawn(SceneRoot(
        asset_server.load(GltfAssetLabel::Scene(0).from_asset(FOX_PATH)),
    ));

    println!("Animation controls:");
    println!(" - spacebar: play / pause");
    println!(" - arrow up / down: speed up / slow down animation playback");
    println!(" - arrow left / right: seek backward / forward");
    println!(" - digit 1 / 3 / 5: play the animation <digit> times");
    println!(" - L: loop the animation forever");
    println!(" - return: change animation");

    // We're seeding the PRNG here to make this example deterministic for testing purposes.
    // This isn't strictly required in practical use unless you need your app to be deterministic.
    let seeded_rng = ChaCha8Rng::seed_from_u64(19878367467712);
    commands.insert_resource(SeededRng(seeded_rng));
}

// An `AnimationPlayer` is automatically added to the scene when it's ready.
// When the player is added, start the animation.
fn setup_scene_once_loaded(
    mut commands: Commands,
    animations: Res<Animations>,
    feet: Res<FoxFeetTargets>,
    graphs: Res<Assets<AnimationGraph>>,
    mut clips: ResMut<Assets<AnimationClip>>,
    mut players: Query<(Entity, &mut AnimationPlayer), Added<AnimationPlayer>>,
) {
    fn get_clip<'a>(
        node: AnimationNodeIndex,
        graph: &AnimationGraph,
        clips: &'a mut Assets<AnimationClip>,
    ) -> &'a mut AnimationClip {
        let node = graph.get(node).unwrap();
        let clip = match &node.node_type {
            AnimationNodeType::Clip(handle) => clips.get_mut(handle),
            _ => unreachable!(),
        };
        clip.unwrap()
    }

    for (entity, mut player) in &mut players {
        let graph = graphs.get(&animations.graph).unwrap();

        // Send `OnStep` events when the fox's feet hit the ground in the running animation.
        let running_animation = get_clip(animations.animations[0], graph, &mut clips);
        // You can determine the time an event should trigger if you know which frame it occurs on
        // and the frame rate of the animation. Let's say we want to trigger an event at frame 15,
        // and the animation has a frame rate of 24 fps, then time = 15 / 24 = 0.625.
        running_animation.add_event_to_target(feet.front_left, 0.625, OnStep);
        running_animation.add_event_to_target(feet.front_right, 0.5, OnStep);
        running_animation.add_event_to_target(feet.back_left, 0.0, OnStep);
        running_animation.add_event_to_target(feet.back_right, 0.125, OnStep);

        let mut transitions = AnimationTransitions::new();

        // Make sure to start the animation via the `AnimationTransitions`
        // component. The `AnimationTransitions` component wants to manage all
        // the animations and will get confused if the animations are started
        // directly via the `AnimationPlayer`.
        transitions
            .play(&mut player, animations.animations[0], Duration::ZERO)
            .repeat();

        commands
            .entity(entity)
            .insert(AnimationGraphHandle(animations.graph.clone()))
            .insert(transitions);
    }
}

fn keyboard_animation_control(
    keyboard_input: Res<ButtonInput<KeyCode>>,
    mut animation_players: Query<(&mut AnimationPlayer, &mut AnimationTransitions)>,
    animations: Res<Animations>,
    mut current_animation: Local<usize>,
) {
    for (mut player, mut transitions) in &mut animation_players {
        let Some((&playing_animation_index, _)) = player.playing_animations().next() else {
            continue;
        };

        if keyboard_input.just_pressed(KeyCode::Space) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            if playing_animation.is_paused() {
                playing_animation.resume();
            } else {
                playing_animation.pause();
            }
        }

        if keyboard_input.just_pressed(KeyCode::ArrowUp) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            let speed = playing_animation.speed();
            playing_animation.set_speed(speed * 1.2);
        }

        if keyboard_input.just_pressed(KeyCode::ArrowDown) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            let speed = playing_animation.speed();
            playing_animation.set_speed(speed * 0.8);
        }

        if keyboard_input.just_pressed(KeyCode::ArrowLeft) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            let elapsed = playing_animation.seek_time();
            playing_animation.seek_to(elapsed - 0.1);
        }

        if keyboard_input.just_pressed(KeyCode::ArrowRight) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            let elapsed = playing_animation.seek_time();
            playing_animation.seek_to(elapsed + 0.1);
        }

        if keyboard_input.just_pressed(KeyCode::Enter) {
            *current_animation = (*current_animation + 1) % animations.animations.len();

            transitions
                .play(
                    &mut player,
                    animations.animations[*current_animation],
                    Duration::from_millis(250),
                )
                .repeat();
        }

        if keyboard_input.just_pressed(KeyCode::Digit1) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            playing_animation
                .set_repeat(RepeatAnimation::Count(1))
                .replay();
        }

        if keyboard_input.just_pressed(KeyCode::Digit3) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            playing_animation
                .set_repeat(RepeatAnimation::Count(3))
                .replay();
        }

        if keyboard_input.just_pressed(KeyCode::Digit5) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            playing_animation
                .set_repeat(RepeatAnimation::Count(5))
                .replay();
        }

        if keyboard_input.just_pressed(KeyCode::KeyL) {
            let playing_animation = player.animation_mut(playing_animation_index).unwrap();
            playing_animation.set_repeat(RepeatAnimation::Forever);
        }
    }
}

fn simulate_particles(
    mut commands: Commands,
    mut query: Query<(Entity, &mut Transform, &mut Particle)>,
    time: Res<Time>,
) {
    for (entity, mut transform, mut particle) in &mut query {
        if particle.lifetime_timer.tick(time.delta()).just_finished() {
            commands.entity(entity).despawn();
        } else {
            transform.translation += particle.velocity * time.delta_secs();
            transform.scale =
                Vec3::splat(particle.size.lerp(0.0, particle.lifetime_timer.fraction()));
            particle
                .velocity
                .smooth_nudge(&Vec3::ZERO, 4.0, time.delta_secs());
        }
    }
}

fn spawn_particle<M: Material>(
    mesh: Handle<Mesh>,
    material: Handle<M>,
    translation: Vec3,
    lifetime: f32,
    size: f32,
    velocity: Vec3,
) -> impl Command {
    move |world: &mut World| {
        world.spawn((
            Particle {
                lifetime_timer: Timer::from_seconds(lifetime, TimerMode::Once),
                size,
                velocity,
            },
            Mesh3d(mesh),
            MeshMaterial3d(material),
            Transform {
                translation,
                scale: Vec3::splat(size),
                ..Default::default()
            },
        ));
    }
}

#[derive(Component)]
struct Particle {
    lifetime_timer: Timer,
    size: f32,
    velocity: Vec3,
}

#[derive(Resource)]
struct ParticleAssets {
    mesh: Handle<Mesh>,
    material: Handle<StandardMaterial>,
}

impl FromWorld for ParticleAssets {
    fn from_world(world: &mut World) -> Self {
        Self {
            mesh: world.resource_mut::<Assets<Mesh>>().add(Sphere::new(10.0)),
            material: world
                .resource_mut::<Assets<StandardMaterial>>()
                .add(StandardMaterial {
                    base_color: WHITE.into(),
                    ..Default::default()
                }),
        }
    }
}

#[derive(Resource)]
struct FoxFeetTargets {
    front_right: AnimationTargetId,
    front_left: AnimationTargetId,
    back_left: AnimationTargetId,
    back_right: AnimationTargetId,
}

impl Default for FoxFeetTargets {
    fn default() -> Self {
        // Get the IDs of the feet and store them in a resource.
        let hip_node = ["root", "_rootJoint", "b_Root_00", "b_Hip_01"];
        let front_left_foot = hip_node.iter().chain(
            [
                "b_Spine01_02",
                "b_Spine02_03",
                "b_LeftUpperArm_09",
                "b_LeftForeArm_010",
                "b_LeftHand_011",
            ]
            .iter(),
        );
        let front_right_foot = hip_node.iter().chain(
            [
                "b_Spine01_02",
                "b_Spine02_03",
                "b_RightUpperArm_06",
                "b_RightForeArm_07",
                "b_RightHand_08",
            ]
            .iter(),
        );
        let back_left_foot = hip_node.iter().chain(
            [
                "b_LeftLeg01_015",
                "b_LeftLeg02_016",
                "b_LeftFoot01_017",
                "b_LeftFoot02_018",
            ]
            .iter(),
        );
        let back_right_foot = hip_node.iter().chain(
            [
                "b_RightLeg01_019",
                "b_RightLeg02_020",
                "b_RightFoot01_021",
                "b_RightFoot02_022",
            ]
            .iter(),
        );
        Self {
            front_left: AnimationTargetId::from_iter(front_left_foot),
            front_right: AnimationTargetId::from_iter(front_right_foot),
            back_left: AnimationTargetId::from_iter(back_left_foot),
            back_right: AnimationTargetId::from_iter(back_right_foot),
        }
    }
}