# Objective
Picking was changed in the UI transform PR to walk up the tree
recursively to check if an interaction was on a clipped node.
`OverrideClip` only affects a node's local clipping rect, so valid
interactions can be ignored if a node has clipped ancestors.
## Solution
Add a `Without<OverrideClip>` query filter to the picking systems'
`child_of_query`s.
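A minimal sketch of the kind of ancestor walk this filter affects; the names and signatures here are illustrative, not `bevy_ui`'s actual internals:
```rust
use bevy::prelude::*;
use bevy::ui::{CalculatedClip, OverrideClip};

// Because `child_of_query` filters on `Without<OverrideClip>`, the very first
// `get` fails for a node that has `OverrideClip`, so none of its ancestors'
// clipping rects are consulted and the interaction is kept.
fn point_is_clipped_by_ancestors(
    point: Vec2,
    mut entity: Entity,
    child_of_query: &Query<&ChildOf, Without<OverrideClip>>,
    clip_query: &Query<&CalculatedClip>,
) -> bool {
    while let Ok(child_of) = child_of_query.get(entity) {
        entity = child_of.parent();
        if clip_query
            .get(entity)
            .is_ok_and(|clip| !clip.clip.contains(point))
        {
            return true;
        }
    }
    false
}
```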
## Testing
This modified `button` example can be used to test the change:
```rust
//! This example illustrates how to create a button that changes color and text based on its
//! interaction state.
use bevy::{color::palettes::basic::*, input_focus::InputFocus, prelude::*, winit::WinitSettings};
fn main() {
App::new()
.add_plugins(DefaultPlugins)
// Only run the app when there is user input. This will significantly reduce CPU/GPU use.
.insert_resource(WinitSettings::desktop_app())
// `InputFocus` must be set for accessibility to recognize the button.
.init_resource::<InputFocus>()
.add_systems(Startup, setup)
.add_systems(Update, button_system)
.run();
}
const NORMAL_BUTTON: Color = Color::srgb(0.15, 0.15, 0.15);
const HOVERED_BUTTON: Color = Color::srgb(0.25, 0.25, 0.25);
const PRESSED_BUTTON: Color = Color::srgb(0.35, 0.75, 0.35);
fn button_system(
mut input_focus: ResMut<InputFocus>,
mut interaction_query: Query<
(
Entity,
&Interaction,
&mut BackgroundColor,
&mut BorderColor,
&mut Button,
&Children,
),
Changed<Interaction>,
>,
mut text_query: Query<&mut Text>,
) {
for (entity, interaction, mut color, mut border_color, mut button, children) in
&mut interaction_query
{
let mut text = text_query.get_mut(children[0]).unwrap();
match *interaction {
Interaction::Pressed => {
input_focus.set(entity);
**text = "Press".to_string();
*color = PRESSED_BUTTON.into();
*border_color = BorderColor::all(RED.into());
// The accessibility system only updates the button's state when the `Button` component is marked as changed.
button.set_changed();
}
Interaction::Hovered => {
input_focus.set(entity);
**text = "Hover".to_string();
*color = HOVERED_BUTTON.into();
*border_color = BorderColor::all(Color::WHITE);
button.set_changed();
}
Interaction::None => {
input_focus.clear();
**text = "Button".to_string();
*color = NORMAL_BUTTON.into();
*border_color = BorderColor::all(Color::BLACK);
}
}
}
}
fn setup(mut commands: Commands, assets: Res<AssetServer>) {
// ui camera
commands.spawn(Camera2d);
commands.spawn(button(&assets));
}
fn button(asset_server: &AssetServer) -> impl Bundle + use<> {
(
Node {
width: Val::Percent(100.0),
height: Val::Percent(100.0),
align_items: AlignItems::Center,
justify_content: JustifyContent::Center,
..default()
},
children![(
Node {
width: Val::Px(0.),
height: Val::Px(0.),
overflow: Overflow::clip(),
..default()
},
children![(
//OverrideClip,
Button,
Node {
position_type: PositionType::Absolute,
width: Val::Px(150.0),
height: Val::Px(65.0),
border: UiRect::all(Val::Px(5.0)),
// horizontally center child text
justify_content: JustifyContent::Center,
// vertically center child text
align_items: AlignItems::Center,
..default()
},
BorderColor::all(Color::WHITE),
BorderRadius::MAX,
BackgroundColor(Color::BLACK),
children![(
Text::new("Button"),
TextFont {
font: asset_server.load("fonts/FiraSans-Bold.ttf"),
font_size: 33.0,
..default()
},
TextColor(Color::srgb(0.9, 0.9, 0.9)),
TextShadow::default(),
)]
)],
)],
)
}
```
On main the button ignores interactions; with this PR it should respond correctly.
---------
Co-authored-by: Alice Cecile <alice.i.cecile@gmail.com>
# Objective
- get closer to being able to load gltfs without using bevy_render
## Solution
- Split bevy_camera out of bevy_render
- Builds on #19943
- I'm sorry for the big diff; I tried to minimize it as much as I could by using re-exports. This also prevents most breaking changes, but there are still a couple.
## Testing
- 3d_scene looks good
# Objective
- `bevy_ui` has errors and warnings when building independently
## Solution
- properly use the `bevy_ui_picking_backend` feature
## Testing
`cargo build -p bevy_ui`
# Objective
Add specialized UI transform `Component`s and fix some related problems:
* Animating UI elements by modifying the `Transform` component of UI nodes doesn't work very well because `ui_layout_system` overwrites the translations each frame. The `overflow_debug` example works around this with an ugly hack that copies the transform into the node's position, which will likely cause a panic if users naively copy it.
* Picking ignores rotation and scaling and assumes UI nodes are always
axis aligned.
* The clipping geometry stored in `CalculatedClip` is wrong for rotated
and scaled elements.
* Transform propagation is unnecessary for the UI, the transforms can be
updated during layout updates.
* The UI internals use both object-centered and top-left-corner-based coordinate systems for UI nodes. Depending on the context, you sometimes have to add or subtract the half-size before transforming between coordinate spaces. We should just use one system consistently so that the transform can always be directly applied.
* `Transform` doesn't support responsive coordinates.
## Solution
* Unrequire `Transform` from `Node`.
* New components `UiTransform`, `UiGlobalTransform`:
- `Node` requires `UiTransform`, `UiTransform` requires
`UiGlobalTransform`
- `UiTransform` is a 2d-only equivalent of `Transform` with a
translation in `Val`s.
- `UiGlobalTransform` newtypes `Affine2` and is updated in
`ui_layout_system`.
* New helper functions on `ComputedNode` for mapping between viewport
and local node space.
* The cursor position is transformed to local node space during picking
so that it respects rotations and scalings.
* To check if the cursor hovers a node recursively walk up the tree to
the root checking if any of the ancestor nodes clip the point at the
cursor. If the point is clipped the interaction is ignored.
* Use object-centered coordinates for UI nodes.
* `RelativeCursorPosition`'s coordinates are now object-centered with
(0,0) at the center of the node and the corners at (±0.5, ±0.5).
* Replaced the `normalized_visible_node_rect: Rect` field of
`RelativeCursorPosition` with `cursor_over: bool`, which is set to true
when the cursor is over an unclipped point on the node. The visible area
of the node is not necessarily a rectangle, so the previous
implementation didn't work.
This should fix all the logical bugs with non-axis aligned interactions
and clipping. Rendering still needs changes but they are far outside the
scope of this PR.
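For illustration, here is a hedged sketch of animating a node with the new component; the exact `UiTransform` field names (e.g. `rotation: Rot2`) are assumptions based on the description above:
```rust
use bevy::prelude::*;
use bevy::ui::UiTransform;

// Rotating buttons every frame. Unlike the old `Transform` approach,
// `ui_layout_system` no longer overwrites these values.
fn spin_buttons(time: Res<Time>, mut query: Query<&mut UiTransform, With<Button>>) {
    for mut transform in &mut query {
        transform.rotation = Rot2::radians(time.elapsed_secs());
    }
}
```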
Tried and abandoned two other approaches:
* New `transform` field on `Node`, require `GlobalTransform` on `Node`,
and unrequire `Transform` on `Node`. Unrequiring `Transform` opts out of
transform propagation so there is then no conflict with updating the
`GlobalTransform` in `ui_layout_system`. This was a nice change in its
simplicity but potentially confusing for users, I think: all the `GlobalTransform` docs mention `Transform`, and having special rules for how it's updated just for the UI is unpleasantly surprising.
* New `transform` field on `Node`. Unrequire `Transform` on `Node`. New
`transform: Affine2` field on `ComputedNode`.
This was okay, but I think most users want separate specialized UI transform components. The fat `ComputedNode` doesn't work well with change detection.
Fixes #18929, #18930
## Testing
There is an example you can look at:
```
cargo run --example ui_transform
```
Sometimes in the example, if you press the rotate button a couple of times, the first glyph of the top label disappears. I'm not sure what's causing it yet, but I don't think it's related to this PR.
## Migration Guide
New specialized 2D UI transform components `UiTransform` and
`UiGlobalTransform`. `UiTransform` is a 2d-only equivalent of
`Transform` with a translation in `Val`s. `UiGlobalTransform` newtypes
`Affine2` and is updated in `ui_layout_system`.
`Node` now requires `UiTransform` instead of `Transform`. `UiTransform`
requires `UiGlobalTransform`.
In previous versions of Bevy `ui_layout_system` would overwrite UI
node's `Transform::translation` each frame. `UiTransform`s aren't
overwritten and there is no longer any need for systems that cache and
rewrite the transform for translated UI elements.
`RelativeCursorPosition`'s coordinates are now object-centered with (0,0) at the center of the node and the corners at (±0.5, ±0.5). Its
`normalized_visible_node_rect` field has been removed and replaced with
a new `cursor_over: bool` field which is set to true when the cursor is
hovering an unclipped area of the UI node.
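A hedged sketch of reading the reworked component; it assumes `normalized` remains an `Option<Vec2>`, now measured from the node's center:
```rust
use bevy::prelude::*;
use bevy::ui::RelativeCursorPosition;

fn report_hover(query: Query<&RelativeCursorPosition, With<Button>>) {
    for relative_cursor in &query {
        // `cursor_over` replaces the old `normalized_visible_node_rect` check.
        if relative_cursor.cursor_over {
            if let Some(position) = relative_cursor.normalized {
                // (0, 0) is the node's center; the corners are at (±0.5, ±0.5).
                info!("cursor is {position} from the node's center");
            }
        }
    }
}
```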
---------
Co-authored-by: Alice Cecile <alice.i.cecile@gmail.com>
# Objective
The goal of `bevy_platform_support` is to provide a set of platform
agnostic APIs, alongside platform-specific functionality. This is a high
traffic crate (providing things like HashMap and Instant). Especially in
light of https://github.com/bevyengine/bevy/discussions/18799, it
deserves a friendlier / shorter name.
Given that it hasn't had a full release yet, getting this change in
before Bevy 0.16 makes sense.
## Solution
- Rename `bevy_platform_support` to `bevy_platform`.
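For most users this is a one-line import change; a sketch using `HashMap`, assuming it stays under `collections` in the renamed crate:
```rust
// Before
use bevy_platform_support::collections::HashMap;
// After
use bevy_platform::collections::HashMap;
```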
# Objective
Fixes #9367.
Yet another follow-up to #16547.
These traits were initially based on `Borrow<Entity>` because that trait
was what they were replacing, and it felt close enough in meaning.
However, they ultimately don't quite match: `borrow` always returns
references, whereas `EntityBorrow` always returns a plain `Entity`.
Additionally, `EntityBorrow` can imply that we are borrowing an `Entity`
from the ECS, which is not what it does.
Due to its safety contract, `TrustedEntityBorrow` is an important and widely used trait for `EntitySet` functionality.
In contrast, the safe `EntityBorrow` does not see much use, because even
outside of `EntitySet`-related functionality, it is a better idea to
accept `TrustedEntityBorrow` over `EntityBorrow`.
Furthermore, as #9367 points out, abstracting over returning `Entity`
from pointers/structs that contain it can skip some ergonomic friction.
On top of that, there are aspects of #18319 and #18408 that are relevant
to naming:
We've run into the issue that relying on a type default can switch
generic order. This is livable in some contexts, but unacceptable in
others.
To remedy that, we'd need to switch to a type alias approach:
The "defaulted" `Entity` case becomes a
`UniqueEntity*`/`Entity*Map`/`Entity*Set` alias, and the base type
receives a more general name. `TrustedEntityBorrow` does not mesh
clearly with sensible base type names.
## Solution
Replace any `EntityBorrow` bounds with `TrustedEntityBorrow`.
+
Rename them as such:
`EntityBorrow` -> `ContainsEntity`
`TrustedEntityBorrow` -> `EntityEquivalent`
For `EntityBorrow` we produce a change in meaning; We designate it for
types that aren't necessarily strict wrappers around `Entity` or some
pointer to `Entity`, but rather any of the myriad of types that contain
a single associated `Entity`.
This pattern can already be seen in the common `entity`/`id` methods
across the engine.
We do not mean for `ContainsEntity` to be a trait that abstracts input
API (like how `AsRef<T>` is often used, f.e.), because eliding
`entity()` would be too implicit in the general case.
We prefix "Contains" to match the intuition of a struct with an `Entity`
field, like some contain a `length` or `capacity`.
It gives the impression of structure, which avoids the implication of a
relationship to the `ECS`.
`HasEntity` f.e. could be interpreted as "a currently live entity",
As an input trait for APIs like #9367 envisioned, `TrustedEntityBorrow` is a better fit, because it *does* restrict itself to strict wrappers and pointers, which is why we replace any `EntityBorrow`/`ContainsEntity` bounds with `TrustedEntityBorrow`/`EntityEquivalent`.
Here, the name `EntityEquivalent` is a lot closer to its actual meaning,
which is "A type that is both equivalent to an `Entity`, and forms the
same total order when compared".
Prior art for this is the
[`Equivalent`](https://docs.rs/hashbrown/latest/hashbrown/trait.Equivalent.html)
trait in `hashbrown`, which utilizes both `Borrow` and `Eq` for its one
blanket impl!
Given that we lose the `Borrow` moniker, and `Equivalent` can carry
various meanings, we expand on the safety comment of `EntityEquivalent`
somewhat. That should help prevent the confusion we saw in
[#18408](https://github.com/bevyengine/bevy/pull/18408#issuecomment-2742094176).
The new name meshes a lot better with the type aliasing approach in
#18408, by aligning with the base name `EntityEquivalentHashMap`.
For consistency among all set types, we can use this scheme for the `UniqueEntity*` wrapper types as well!
This allows us to undo the switched generic order that was introduced to
`UniqueEntityArray` by its `Entity` default.
Even without the type aliases, I think these renames are worth doing!
## Migration Guide
Any use of `EntityBorrow` becomes `ContainsEntity`.
Any use of `TrustedEntityBorrow` becomes `EntityEquivalent`.
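A hedged sketch of what the rename looks like at a use site, assuming a generic helper bounded on the safe trait (import path assumed):
```rust
use bevy_ecs::entity::{ContainsEntity, Entity};

// Before: fn first_id<T: EntityBorrow>(items: &[T]) -> Option<Entity>
fn first_id<T: ContainsEntity>(items: &[T]) -> Option<Entity> {
    items.first().map(|item| item.entity())
}
```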
# Objective
Now that #13432 has been merged, it's important we update our reflected
types to properly opt into this feature. If we do not, then this could
cause issues for users downstream who want to make use of
reflection-based cloning.
## Solution
This PR is broken into 4 commits:
1. Add `#[reflect(Clone)]` on all types marked `#[reflect(opaque)]` that are also `Clone`. This is mandatory, as these types would otherwise cause the cloning operation to fail for any type that contains them at any depth (see the sketch after this list).
2. Update the reflection example to suggest adding `#[reflect(Clone)]`
on opaque types.
3. Add `#[reflect(clone)]` attributes on all fields marked
`#[reflect(ignore)]` that are also `Clone`. This prevents the ignored
field from causing the cloning operation to fail.
Note that some of the types that contain these fields are also `Clone`,
and thus can be marked `#[reflect(Clone)]`. This makes the
`#[reflect(clone)]` attribute redundant. However, I think it's safer to
keep it marked in the case that the `Clone` impl/derive is ever removed.
I'm open to removing them, though, if people disagree.
4. Finally, I added `#[reflect(Clone)]` on all types that are also
`Clone`. While not strictly necessary, it enables us to reduce the
generated output since we can just call `Clone::clone` directly instead
of calling `PartialReflect::reflect_clone` on each variant/field. It
also means we benefit from any optimizations or customizations made in
the `Clone` impl, including directly dereferencing `Copy` values and
increasing reference counters.
Along with that change I also took the liberty of adding any missing
registrations that I saw could be applied to the type as well, such as
`Default`, `PartialEq`, and `Hash`. There were hundreds of these to
edit, though, so it's possible I missed quite a few.
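A hedged sketch of the attribute placements from commits 1 and 3, using hypothetical types:
```rust
use bevy_reflect::prelude::*;

// Commit 1: an opaque type that is `Clone` opts into reflection-based cloning.
#[derive(Reflect, Clone)]
#[reflect(opaque)]
#[reflect(Clone)]
struct OpaqueHandle(u64);

// Commit 3: an ignored field that is `Clone` can still be cloned via `#[reflect(clone)]`.
#[derive(Reflect)]
struct Widget {
    size: f32,
    #[reflect(ignore, clone)]
    cache: Vec<u64>,
}
```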
That last commit is **_massive_**. There were nearly 700 types to
update. So it's recommended to review the first three before moving onto
that last one.
Additionally, I can break the last commit off into its own PR or into
smaller PRs, but I figured this would be the easiest way of doing it
(and in a timely manner since I unfortunately don't have as much time as
I used to for code contributions).
## Testing
You can test locally with a `cargo check`:
```
cargo check --workspace --all-features
```
# Objective
It's difficult, particularly for new contributors, to understand or make changes to the UI systems because each system needs to individually track changes to scale factor, windows, and camera targets in local hashmaps. Any major change inevitably introduces new scale factor bugs.
Instead of per-system resolution we can resolve the camera target info
for all UI nodes in a system at the start of `PostUpdate` and then store
it per-node in components that can be queried with change detection.
Fixes #17578. Fixes #15143.
## Solution
Store the UI render target's data locally per node in a component that
is updated in `PostUpdate` before any other UI systems run.
This component can then be queried with change detection so that UI
systems no longer need to have knowledge of cameras and windows and
don't require fragile custom change detection solutions using local
hashmaps.
## Showcase
Compare `measure_text_system` from main (which has a bug that causes it to use the wrong scale factor when a node's camera target changes):
```rust
pub fn measure_text_system(
mut scale_factors_buffer: Local<EntityHashMap<f32>>,
mut last_scale_factors: Local<EntityHashMap<f32>>,
fonts: Res<Assets<Font>>,
camera_query: Query<(Entity, &Camera)>,
default_ui_camera: DefaultUiCamera,
ui_scale: Res<UiScale>,
mut text_query: Query<
(
Entity,
Ref<TextLayout>,
&mut ContentSize,
&mut TextNodeFlags,
&mut ComputedTextBlock,
Option<&UiTargetCamera>,
),
With<Node>,
>,
mut text_reader: TextUiReader,
mut text_pipeline: ResMut<TextPipeline>,
mut font_system: ResMut<CosmicFontSystem>,
) {
scale_factors_buffer.clear();
let default_camera_entity = default_ui_camera.get();
for (entity, block, content_size, text_flags, computed, maybe_camera) in &mut text_query {
let Some(camera_entity) = maybe_camera
.map(UiTargetCamera::entity)
.or(default_camera_entity)
else {
continue;
};
let scale_factor = match scale_factors_buffer.entry(camera_entity) {
Entry::Occupied(entry) => *entry.get(),
Entry::Vacant(entry) => *entry.insert(
camera_query
.get(camera_entity)
.ok()
.and_then(|(_, c)| c.target_scaling_factor())
.unwrap_or(1.0)
* ui_scale.0,
),
};
if last_scale_factors.get(&camera_entity) != Some(&scale_factor)
|| computed.needs_rerender()
|| text_flags.needs_measure_fn
|| content_size.is_added()
{
create_text_measure(
entity,
&fonts,
scale_factor.into(),
text_reader.iter(entity),
block,
&mut text_pipeline,
content_size,
text_flags,
computed,
&mut font_system,
);
}
}
core::mem::swap(&mut *last_scale_factors, &mut *scale_factors_buffer);
}
```
with `measure_text_system` from this PR (which always uses the correct
scale factor):
```rust
pub fn measure_text_system(
fonts: Res<Assets<Font>>,
mut text_query: Query<
(
Entity,
Ref<TextLayout>,
&mut ContentSize,
&mut TextNodeFlags,
&mut ComputedTextBlock,
Ref<ComputedNodeTarget>,
),
With<Node>,
>,
mut text_reader: TextUiReader,
mut text_pipeline: ResMut<TextPipeline>,
mut font_system: ResMut<CosmicFontSystem>,
) {
for (entity, block, content_size, text_flags, computed, computed_target) in &mut text_query {
// Note: the ComputedTextBlock::needs_rerender bool is cleared in create_text_measure().
if computed_target.is_changed()
|| computed.needs_rerender()
|| text_flags.needs_measure_fn
|| content_size.is_added()
{
create_text_measure(
entity,
&fonts,
computed_target.scale_factor.into(),
text_reader.iter(entity),
block,
&mut text_pipeline,
content_size,
text_flags,
computed,
&mut font_system,
);
}
}
}
```
## Testing
I removed an alarming number of tests from the `layout` module but they
were mostly to do with the deleted camera synchronisation logic. The
remaining tests should all pass now.
The most relevant examples are `multiple_windows` and `split_screen`,
the behaviour of both should be unchanged from main.
---------
Co-authored-by: UkoeHB <37489173+UkoeHB@users.noreply.github.com>
Co-authored-by: Alice Cecile <alice.i.cecile@gmail.com>
# Objective
- Contributes to #16877
## Solution
- Moved `hashbrown`, `foldhash`, and related types out of `bevy_utils`
and into `bevy_platform_support`
- Refactored the above to match the layout of these types in `std`.
- Updated crates as required.
## Testing
- CI
---
## Migration Guide
- The following items were moved out of `bevy_utils` and into
`bevy_platform_support::hash`:
- `FixedState`
- `DefaultHasher`
- `RandomState`
- `FixedHasher`
- `Hashed`
- `PassHash`
- `PassHasher`
- `NoOpHash`
- The following items were moved out of `bevy_utils` and into
`bevy_platform_support::collections`:
- `HashMap`
- `HashSet`
- `bevy_utils::hashbrown` has been removed. Instead, import from
`bevy_platform_support::collections` _or_ take a dependency on
`hashbrown` directly.
- `bevy_utils::Entry` has been removed. Instead, import from
`bevy_platform_support::collections::hash_map` or
`bevy_platform_support::collections::hash_set` as appropriate.
- All of the above equally apply to `bevy::utils` and
`bevy::platform_support`.
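An import sketch of the moves listed above:
```rust
// Before
use bevy_utils::{Entry, HashMap, HashSet};
// After
use bevy_platform_support::collections::hash_map::Entry;
use bevy_platform_support::collections::{HashMap, HashSet};
```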
## Notes
- I left `PreHashMap`, `PreHashMapExt`, and `TypeIdMap` in `bevy_utils`
as they might be candidates for micro-crating. They can always be moved
into `bevy_platform_support` at a later date if desired.
# Objective
The UI can only target a single view and doesn't support `RenderLayers`,
so there doesn't seem to be any need for UI nodes to require
`ViewVisibility` and `VisibilityClass`.
Fixes #17400
## Solution
Remove the `ViewVisibility` and `VisibilityClass` component requires
from `Node` and change the visibility queries to only query for
`InheritedVisibility`.
## Testing
```
cargo run --example many_buttons --release --features "trace_tracy"
```
Yellow is this PR, red is main.
`bevy_render::view::visibility::reset_view_visibility`
<img width="531" alt="reset-view" src="https://github.com/user-attachments/assets/a44b215d-96bf-43ec-8669-31530ff98eae" />
`bevy_render::view::visibility::check_visibility`
<img width="445" alt="view_visibility" src="https://github.com/user-attachments/assets/fa111757-da91-434d-88e4-80bdfa29374f" />
# Objective
It's not immediately obvious that `TargetCamera` only works with UI node
entities. It's natural to assume from looking at something like the
`multiple_windows` example that it will work with everything.
## Solution
Rename `TargetCamera` to `UiTargetCamera`.
## Migration Guide
`TargetCamera` has been renamed to `UiTargetCamera`.
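A hedged sketch of the rename at a spawn site, assuming `UiTargetCamera` keeps the same tuple-struct shape as `TargetCamera`:
```rust
use bevy::prelude::*;
use bevy::ui::UiTargetCamera;

fn spawn_ui_root(commands: &mut Commands, camera: Entity) {
    // Before: (Node::default(), TargetCamera(camera))
    commands.spawn((Node::default(), UiTargetCamera(camera)));
}
```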
# Objective
Many instances of `clippy::too_many_arguments` linting happen to be on
systems - functions which we don't call manually, and thus there's not
much reason to worry about the argument count.
## Solution
Allow `clippy::too_many_arguments` globally, and remove all lint
attributes related to it.
# Objective
- https://github.com/bevyengine/bevy/issues/17111
## Solution
Set the `clippy::allow_attributes` and
`clippy::allow_attributes_without_reason` lints to `deny`, and bring
`bevy_ui` in line with the new restrictions.
## Testing
`cargo clippy --tests` and `cargo test --package bevy_ui` were run, and
no errors were encountered.
# Objective
Found more excessive `DefaultUiCamera` queries outside of extraction.
The default UI camera lookup only needs to be done once. Do it first,
not per node.
---------
Co-authored-by: MichiRecRoom <1008889+LikeLakers2@users.noreply.github.com>
# Objective
Some types like `RenderEntity` and `MainEntity` are just wrappers around
`Entity`, so they should be able to implement
`EntityBorrow`/`TrustedEntityBorrow`. This allows using them with
`EntitySet` functionality.
The `EntityRef` family are more than direct wrappers around `Entity`,
but can still benefit from being unique in a collection.
## Solution
Implement `EntityBorrow` and `TrustedEntityBorrow` for simple `Entity`
newtypes and `EntityRef` types.
These impls are an explicit decision to have the `EntityRef` types
compare like just `Entity`.
`EntityWorldMut` is omitted from this impl, because it explicitly
contains a `&mut World` as well, and we do not ever use more than one at
a time.
Add `EntityBorrow` to the `bevy_ecs` prelude.
## Migration Guide
`NormalizedWindowRef::entity` has been replaced with an
`EntityBorrow::entity` impl.
# Objective
We switch back and forth between logical and physical coordinates all over the place. Systems have to query for cameras and the `UiScale` when they shouldn't need to. It's confusing and fragile, and new scale factor bugs get found constantly.
## Solution
* Use physical coordinates wherever possible in `bevy_ui`.
* Store physical coords in `ComputedNode` and tear out all the unneeded
scale factor calculations and queries.
* Add an `inverse_scale_factor` field to `ComputedNode` and set nodes
changed when their scale factor changes.
## Migration Guide
`ComputedNode`'s fields and methods now use physical coordinates.
`ComputedNode` has a new field `inverse_scale_factor`. Multiplying the
physical coordinates by the `inverse_scale_factor` will give the logical
values.
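A hedged sketch of recovering logical values from the now-physical fields:
```rust
use bevy::math::Vec2;
use bevy::ui::ComputedNode;

// Physical size times the inverse scale factor gives the logical size.
fn logical_size(computed: &ComputedNode) -> Vec2 {
    computed.size() * computed.inverse_scale_factor
}
```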
---------
Co-authored-by: atlv <email@atlasdostal.com>
# Objective
Continue improving the user experience of our UI Node API in the
direction specified by [Bevy's Next Generation Scene / UI
System](https://github.com/bevyengine/bevy/discussions/14437)
## Solution
As specified in the document above, merge `Style` fields into `Node`,
and move "computed Node fields" into `ComputedNode` (I chose this name
over something like `ComputedNodeLayout` because it currently contains
more than just layout info. If we want to break this up / rename these
concepts, lets do that in a separate PR). `Style` has been removed.
This accomplishes a number of goals:
## Ergonomics wins
Specifying both `Node` and `Style` is now no longer required for
non-default styles
Before:
```rust
commands.spawn((
Node::default(),
Style {
width: Val::Px(100.),
..default()
},
));
```
After:
```rust
commands.spawn(Node {
width: Val::Px(100.),
..default()
});
```
## Conceptual clarity
`Style` was never a comprehensive "style sheet". It only defined "core"
style properties that all `Nodes` shared. Any "styled property" that
couldn't fit that mold had to be in a separate component. A "real" style
system would style properties _across_ components (`Node`, `Button`,
etc). We have plans to build a true style system (see the doc linked
above).
By moving the `Style` fields to `Node`, we fully embrace `Node` as the
driving concept and remove the "style system" confusion.
## Next Steps
* Consider identifying and splitting out "style properties that aren't
core to Node". This should not happen for Bevy 0.15.
---
## Migration Guide
Move any fields set on `Style` into `Node` and replace all `Style`
component usage with `Node`.
Before:
```rust
commands.spawn((
Node::default(),
Style {
width: Val::Px(100.),
..default()
},
));
```
After:
```rust
commands.spawn(Node {
width: Val::Px(100.),
..default()
});
```
For any usage of the "computed node properties" that used to live on
`Node`, use `ComputedNode` instead:
Before:
```rust
fn system(nodes: Query<&Node>) {
for node in &nodes {
let computed_size = node.size();
}
}
```
After:
```rust
fn system(computed_nodes: Query<&ComputedNode>) {
for computed_node in &computed_nodes {
let computed_size = computed_node.size();
}
}
```
# Objective
Fixes #15142
## Solution
* Moved all the UI border geometry calculations that were scattered
through the UI extraction functions into `ui_layout_system`.
* Added a `border: BorderRect` field to `Node` to store the border size
computed by `ui_layout_system`.
* Use the border values returned from Taffy rather than calculate them
ourselves during extraction.
* Removed the `logical_rect` and `physical_rect` methods from `Node`; their descriptions and names are deceptive, and it's better to create the rects manually instead.
* Added a method `outline_radius` to `Node` that calculates the border
radius of outlines.
* For border values `ExtractedUiNode` takes `BorderRect` and
`ResolvedBorderRadius` now instead of raw `[f32; 4]` values and converts
them in `prepare_uinodes`.
* Removed some unnecessary scaling and clamping of border values
(#15142).
* Added a `BorderRect::ZERO` constant.
* Added an `outlined_node_size` method to `Node`.
## Testing
Added some non-uniform borders to the border example. Everything seems
to be in order:
<img width="626" alt="nub"
src="https://github.com/user-attachments/assets/258ed8b5-1a9e-4ac5-99c2-6bf25c0ef31c">
## Migration Guide
The `logical_rect` and `physical_rect` methods have been removed from
`Node`. Use `Rect::from_center_size` with the translation and node size
instead.
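A hedged sketch of such a replacement, assuming you have the node's `GlobalTransform` and computed size at hand:
```rust
use bevy::prelude::*;

// Roughly what the removed `logical_rect` helper computed.
fn node_rect(node: &Node, transform: &GlobalTransform) -> Rect {
    Rect::from_center_size(transform.translation().truncate(), node.size())
}
```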
The types of the `border` and `border_radius` fields of `ExtractedUiNode` have been changed to `BorderRect` and `ResolvedBorderRadius` respectively.
---------
Co-authored-by: UkoeHB <37489173+UkoeHB@users.noreply.github.com>
Co-authored-by: akimakinai <105044389+akimakinai@users.noreply.github.com>
# Objective
Simplify the multiple `if` statements in `pick_rounded_rect` to make it more readable and efficient ([Godbolt link](https://godbolt.org/z/W5vPEvT5c)).
Co-authored-by: WX\shixi <shixi1@cnwxsoft.com>
# Objective
#14957 added the `pick_rounded_rect` function to `bevy_ui` in the
`picking_backend` module, which is gated behind the `bevy_picking`
feature. This function is used in that module, as well as in the `focus`
module. The latter usage is not gated behind the `bevy_picking` feature,
causing a compile error when the feature is disabled.
## Solution
Move the `pick_rounded_rect` function out of the `picking_backend`
module, as it does not depend on anything defined in that module. I put
it in `lib.rs` but it could reasonably be moved somewhere else instead.
## Testing
Encountered this compile error in a project and confirmed that this
patch fixes it.
# Objective
Make bevy_utils less of a compilation bottleneck. Tackle #11478.
## Solution
* Move all of the directly reexported dependencies and move them to
where they're actually used.
* Remove the UUID utilities that have gone unused since `TypePath` took
over for `TypeUuid`.
* There was also an extraneous bytemuck dependency on `bevy_core` that has not been used for a long time (since `encase` became the primary way to prepare GPU buffers).
* Remove the `all_tuples` macro reexport from bevy_ecs since it's
accessible from `bevy_utils`.
---
## Changelog
Removed: Many of the reexports from bevy_utils (petgraph, uuid, nonmax,
smallvec, and thiserror).
Removed: bevy_core's reexports of bytemuck.
## Migration Guide
bevy_utils' reexports of petgraph, uuid, nonmax, smallvec, and thiserror
have been removed.
bevy_core's reexports of bytemuck's types have been removed.
Add them as dependencies in your own crate instead.
# Objective
- There are multiple instances of `let Some(x) = ... else { return None; };` throughout the project.
- Because `Option<T>` implements
[`Try`](https://doc.rust-lang.org/stable/std/ops/trait.Try.html), it can
use the question mark `?` operator.
## Solution
- Use the question mark `?` operator instead of `let Some(x) = ... else { return None; };`.
---
There was another PR that did a similar thing a few weeks ago, but I
couldn't find it.
# Objective
- (Partially) Fixes #9904
- Acts on #9910
## Solution
- Deprecated the relevant methods from `Query`, cascading changes as
required across Bevy.
---
## Changelog
- Deprecated `QueryState::get_component_unchecked_mut` method
- Deprecated `Query::get_component` method
- Deprecated `Query::get_component_mut` method
- Deprecated `Query::component` method
- Deprecated `Query::component_mut` method
- Deprecated `Query::get_component_unchecked_mut` method
## Migration Guide
### `QueryState::get_component_unchecked_mut`
Use `QueryState::get_unchecked_manual` and select for the exact
component based on the structure of the exact query as required.
### `Query::(get_)component(_unchecked)(_mut)`
Use `Query::get` and select for the exact component based on the
structure of the exact query as required.
- For mutable access (`_mut`), use `Query::get_mut`
- For unchecked access (`_unchecked`), use `Query::get_unchecked`
- For panic variants (non-`get_`), add `.unwrap()`
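A hedged before/after sketch with a hypothetical `Health` component, where the query data already includes `&Health`:
```rust
use bevy::prelude::*;

#[derive(Component)]
struct Health(f32);

// Before
fn health_of_old(query: &Query<(&Health, &Transform)>, entity: Entity) -> Option<f32> {
    query.get_component::<Health>(entity).ok().map(|health| health.0)
}

// After
fn health_of(query: &Query<(&Health, &Transform)>, entity: Entity) -> Option<f32> {
    query.get(entity).ok().map(|(health, _)| health.0)
}
```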
## Notes
- `QueryComponentError` can be removed once these deprecated methods are
also removed. Due to an interaction with `thiserror`'s derive macro, it
is not marked as deprecated.
# Objective
Add support for presenting each UI tree on a specific window and
viewport, while making as few breaking changes as possible.
This PR is meant to resolve the following issues at once, since they're
all related.
- Fixes #5622
- Fixes #5570
- Fixes #5621
Adopted #5892, but started over since the current codebase diverged significantly from the original PR branch. Also, I made a decision to propagate the component to children instead of recursively iterating over nodes in search of the root.
## Solution
Add a new optional component that can be inserted to UI root nodes and
propagate to children to specify which camera it should render onto.
This is then used to get the render target and the viewport for that UI
tree. Since this component is optional, the default behavior should be
to render onto the single camera (if only one exist) and warn of
ambiguity if multiple cameras exist. This reduces the complexity for
users with just one camera, while giving control in contexts where it
matters.
## Changelog
- Adds a `TargetCamera(Entity)` component to specify which camera a node tree should be rendered to. If only one camera exists, this component is optional.
- Adds an example of rendering UI to a texture and using it as a
material in a 3D world.
- Fixes recalculation of physical viewport size when target scale factor
changes. This can happen when the window is moved between displays with
different DPI.
- Changes examples to demonstrate assigning UI to different viewports
and windows and make interactions in an offset viewport testable.
- Removes `UiCameraConfig`. UI visibility now can be controlled via
combination of explicit `TargetCamera` and `Visibility` on the root
nodes.
---------
Co-authored-by: davier <bricedavier@gmail.com>
Co-authored-by: Alice Cecile <alice.i.cecile@gmail.com>
Co-authored-by: Alice Cecile <alice.i.cecil@gmail.com>
# Objective
- Fixes #11119
## Solution
- Creation of the serialize feature to ui
---
## Changelog
### Changed
- Changed all the structs that implement `Serialize` and `Deserialize` to only implement them when the `serialize` feature is enabled
## Migration Guide
- If you want to serialize and deserialize types from `bevy_ui`, you need to enable the `serialize` feature in your `Cargo.toml`:
```toml
[dependencies.bevy]
features = ["serialize"]
```
Matches versioning & features from other Cargo.toml files in the
project.
# Objective
Resolves #10932
## Solution
Added smallvec to the bevy_utils `Cargo.toml` and added a line to re-export the crate. The target version and features are set to match what's used in the other bevy crates.
# Objective
- Finish the work done in #8942 .
## Solution
- Rebase the changes made in #8942 and fix the issues stopping it from
being merged earlier
---------
Co-authored-by: Thomas <1234328+thmsgntz@users.noreply.github.com>
# Objective
- Resolves #10853
## Solution
- ~~Changed the name of the `Input` struct to `PressableInput`.~~
- Changed the name of the `Input` struct to `ButtonInput`.
## Migration Guide
- Breaking Change: Users need to rename `Input` to `ButtonInput` in
their projects.
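The rename is mechanical; a minimal sketch:
```rust
use bevy::prelude::*;

// Before: fn jump(keys: Res<Input<KeyCode>>)
fn jump(keys: Res<ButtonInput<KeyCode>>) {
    if keys.just_pressed(KeyCode::Space) {
        info!("jump!");
    }
}
```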
# Objective
- Fixes #7680
- This is an update of https://github.com/bevyengine/bevy/pull/8899, which had the same objective but fell a long way behind the latest changes
## Solution
The traits `WorldQueryData: WorldQuery` and `WorldQueryFilter: WorldQuery` have been added and some of the types and functions from `WorldQuery` have been moved into them.
`ReadOnlyWorldQuery` has been replaced with `ReadOnlyWorldQueryData`.
`WorldQueryFilter` is safe (as long as `WorldQuery` is implemented
safely).
`WorldQueryData` is unsafe - safely implementing it requires that
`Self::ReadOnly` is a readonly version of `Self` (this used to be a
safety requirement of `WorldQuery`)
The type parameters `Q` and `F` of `Query` must now implement
`WorldQueryData` and `WorldQueryFilter` respectively.
This makes it impossible to accidentally use a filter in the data position or vice versa, which was something that could lead to bugs.
~~Compile failure tests have been added to check this.~~
It was previously sometimes useful to use `Option<With<T>>` in the data
position. Use `Has<T>` instead in these cases.
The `WorldQuery` derive macro has been split into separate derive macros
for `WorldQueryData` and `WorldQueryFilter`.
Previously it was possible to derive `WorldQuery` for a struct that had a mixture of data and filter items. This would not work correctly in
some cases but could be a useful pattern in others. *This is no longer
possible.*
---
## Notes
- The changes outside of `bevy_ecs` are all changing type parameters to
the new types, updating the macro use, or replacing `Option<With<T>>`
with `Has<T>`.
- All `WorldQueryData` types always returned `true` for `IS_ARCHETYPAL`
so I moved it to `WorldQueryFilter` and
replaced all calls to it with `true`. That should be the only logic
change outside of the macro generation code.
- `Changed<T>` and `Added<T>` were being generated by a macro that I
have expanded. Happy to revert that if desired.
- The two derive macros share some functions for implementing
`WorldQuery` but the tidiest way I could find to implement them was to
give them a ton of arguments and ask clippy to ignore that.
## Changelog
### Changed
- Split `WorldQuery` into `WorldQueryData` and `WorldQueryFilter` which
now have separate derive macros. It is not possible to derive both for
the same type.
- `Query` now requires that the first type argument implements
`WorldQueryData` and the second implements `WorldQueryFilter`
## Migration Guide
- Update derives
```rust
// old
#[derive(WorldQuery)]
#[world_query(mutable, derive(Debug))]
struct CustomQuery {
entity: Entity,
a: &'static mut ComponentA
}
#[derive(WorldQuery)]
struct QueryFilter {
_c: With<ComponentC>
}
// new
#[derive(WorldQueryData)]
#[world_query_data(mutable, derive(Debug))]
struct CustomQuery {
entity: Entity,
a: &'static mut ComponentA,
}
#[derive(WorldQueryFilter)]
struct QueryFilter {
_c: With<ComponentC>
}
```
- Replace `Option<With<T>>` with `Has<T>`
```rust
/// old
fn my_system(query: Query<(Entity, Option<With<ComponentA>>)>)
{
for (entity, has_a_option) in query.iter(){
let has_a:bool = has_a_option.is_some();
//todo!()
}
}
/// new
fn my_system(query: Query<(Entity, Has<ComponentA>)>)
{
for (entity, has_a) in query.iter(){
//todo!()
}
}
```
- Fix queries which had filters in the data position or vice versa.
```rust
// old
fn my_system(query: Query<(Entity, With<ComponentA>)>)
{
for (entity, _) in query.iter(){
//todo!()
}
}
// new
fn my_system(query: Query<Entity, With<ComponentA>>)
{
for entity in query.iter(){
//todo!()
}
}
// old
fn my_system(query: Query<AnyOf<(&ComponentA, With<ComponentB>)>>)
{
for (entity, _) in query.iter(){
//todo!()
}
}
// new
fn my_system(query: Query<Option<&ComponentA>, Or<(With<ComponentA>, With<ComponentB>)>>)
{
for maybe_a in query.iter(){
//todo!()
}
}
```
---------
Co-authored-by: Alice Cecile <alice.i.cecile@gmail.com>
# Objective
Problems:
* The clipped, non-visible regions of UI nodes are interactive.
* `RelativeCursorPosition` is set relative to the visible part of the node. It should be relative to the whole node.
* The `RelativeCursorPosition::mouse_over` method returns `true` when the mouse is over a clipped part of a node.
Fixes #10470
## Solution
Intersect a node's bounding rect with its clipping rect before checking
if it contains the cursor.
Added the field `normalized_visible_node_rect` to
`RelativeCursorPosition`. This is set to the bounds of the unclipped
area of the node rect by `ui_focus_system` expressed in normalized
coordinates relative to the entire node.
Instead of checking if the normalized cursor position lies within a unit
square, it instead checks if it is contained by
`normalized_visible_node_rect`.
Added outlines to the `overflow` example that appear when the cursor is
over the visible part of the images, but not the clipped area.
---
## Changelog
* `ui_focus_system` intersects a node's bounding rect with its clipping
rect before checking if mouse over.
* Added the field `normalized_visible_node_rect` to
`RelativeCursorPosition`. This is set to the bounds of the unclipped
area of the node rect by `ui_focus_system` expressed in normalized
coordinates relative to the entire node.
* `RelativeCursorPosition` is calculated relative to the whole node's position and size, not only the visible part.
* `RelativeCursorPosition::mouse_over` only returns true when the mouse
is over an unclipped region of the UI node.
* Removed the `Deref` and `DerefMut` derives from
`RelativeCursorPosition` as it is no longer a single field struct.
* Added some outlines to the `overflow` example that respond to
`Interaction` changes.
## Migration Guide
The clipped areas of UI nodes are no longer interactive.
`RelativeCursorPosition` is now calculated relative to the whole node's
position and size, not only the visible part. Its `mouse_over` method
only returns true when the cursor is over an unclipped part of the node.
`RelativeCursorPosition` no longer implements `Deref` and `DerefMut`.
# Objective
Replace instances of
```rust
for x in collection.iter{_mut}() {
```
with
```rust
for x in &{mut} collection {
```
This also changes CI to no longer suppress this lint. Note that since
this lint only shows up when using clippy in pedantic mode, it was
probably unnecessary to suppress this lint in the first place.
# Objective
Fixes #8267.
Fixes half of #7840.
The `ComputedVisibility` component contains two flags: hierarchy visibility, and view visibility (whether it's visible to any cameras).
Due to the modular and open-ended way that view visibility is computed,
it triggers change detection every single frame, even when the value
does not change. Since hierarchy visibility is stored in the same
component as view visibility, this means that change detection for
inherited visibility is completely broken.
At the company I work for, this has become a real issue. We are using
change detection to only re-render scenes when necessary. The broken
state of change detection for computed visibility means that we have to rely on the non-inherited `Visibility` component for now. This is
workable in the early stages of our project, but since we will
inevitably want to use the hierarchy, we will have to either:
1. Roll our own solution for computed visibility.
2. Fix the issue for everyone.
## Solution
Split the `ComputedVisibility` component into two: `InheritedVisibility`
and `ViewVisibility`.
This allows change detection to behave properly for
`InheritedVisibility`.
View visibility is still erratic, although it is less useful to be able to detect changes for this flavor of visibility.
Overall, this actually simplifies the API. Since the visibility system
consists of
self-explaining components, it is much easier to document the behavior
and usage.
This approach is more modular and "ECS-like" -- one could
strip out the `ViewVisibility` component entirely if it's not needed,
and rely only on inherited visibility.
---
## Changelog
- `ComputedVisibility` has been removed in favor of: `InheritedVisibility` and `ViewVisibility`.
## Migration Guide
The `ComputedVisibility` component has been split into `InheritedVisibility` and `ViewVisibility`. Replace any usages of
`ComputedVisibility::is_visible_in_hierarchy`
with `InheritedVisibility::get`, and replace
`ComputedVisibility::is_visible_in_view`
with `ViewVisibility::get`.
```rust
// Before:
commands.spawn(VisibilityBundle {
visibility: Visibility::Inherited,
computed_visibility: ComputedVisibility::default(),
});
// After:
commands.spawn(VisibilityBundle {
visibility: Visibility::Inherited,
inherited_visibility: InheritedVisibility::default(),
view_visibility: ViewVisibility::default(),
});
```
```rust
// Before:
fn my_system(q: Query<&ComputedVisibility>) {
for vis in &q {
if vis.is_visible_in_hierarchy() {
// After:
fn my_system(q: Query<&InheritedVisibility>) {
for inherited_visibility in &q {
if inherited_visibility.get() {
```
```rust
// Before:
fn my_system(q: Query<&ComputedVisibility>) {
for vis in &q {
if vis.is_visible_in_view() {
// After:
fn my_system(q: Query<&ViewVisibility>) {
for view_visibility in &q {
if view_visibility.get() {
```
```rust
// Before:
fn my_system(mut q: Query<&mut ComputedVisibility>) {
for vis in &mut q {
vis.set_visible_in_view();
// After:
fn my_system(mut q: Query<&mut ViewVisibility>) {
for view_visibility in &mut q {
view_visibility.set();
```
---------
Co-authored-by: Robert Swain <robert.swain@gmail.com>
# Objective
Inconvenient initialization of `UiScale`
## Solution
Change `UiScale` to a tuple struct
## Migration Guide
Replace initialization of `UiScale` like ```UiScale { scale: 1.0 }```
with ```UiScale(1.0)```
# Objective
After the UI layout is computed, when the coordinates are converted back from physical to logical coordinates, the `UiScale` is ignored. This results in a confusing situation where we have two different systems of logical coordinates.
Example:
```rust
use bevy::prelude::*;
fn main() {
App::new()
.add_plugins(DefaultPlugins)
.add_systems(Startup, setup)
.add_systems(Update, update)
.run();
}
fn setup(mut commands: Commands, mut ui_scale: ResMut<UiScale>) {
ui_scale.scale = 4.;
commands.spawn(Camera2dBundle::default());
commands.spawn(NodeBundle {
style: Style {
align_items: AlignItems::Center,
justify_content: JustifyContent::Center,
width: Val::Percent(100.),
..Default::default()
},
..Default::default()
})
.with_children(|builder| {
builder.spawn(NodeBundle {
style: Style {
width: Val::Px(100.),
height: Val::Px(100.),
..Default::default()
},
background_color: Color::MAROON.into(),
..Default::default()
}).with_children(|builder| {
builder.spawn(TextBundle::from_section("", TextStyle::default()));
});
});
}
fn update(
mut text_query: Query<(&mut Text, &Parent)>,
node_query: Query<Ref<Node>>,
) {
for (mut text, parent) in text_query.iter_mut() {
let node = node_query.get(parent.get()).unwrap();
if node.is_changed() {
text.sections[0].value = format!("size: {}", node.size());
}
}
}
```
result:

We asked for a 100x100 UI node but the Node's size is multiplied by the
value of `UiScale` to give a logical size of 400x400.
## Solution
Divide the output physical coordinates by `UiScale` in
`ui_layout_system` and multiply the logical viewport size by `UiScale`
when creating the projection matrix for the UI's `ExtractedView` in
`extract_default_ui_camera_view`.
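A hedged sketch of the conversion described above, with the scale factors passed in explicitly:
```rust
use bevy::math::Vec2;

// logical = physical / (window_scale_factor * ui_scale)
fn physical_to_logical(physical: Vec2, window_scale_factor: f32, ui_scale: f32) -> Vec2 {
    physical / (window_scale_factor * ui_scale)
}
```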
---
## Changelog
* The UI layout's physical coordinates are divided by both the window
scale factor and `UiScale` when converting them back to logical
coordinates. The logical size of UI nodes now matches the values given
to their size constraints.
* Multiply the logical viewport size by `UiScale` before creating the
projection matrix for the UI's `ExtractedView` in
`extract_default_ui_camera_view`.
* In `ui_focus_system` the cursor position returned from `Window` is
divided by `UiScale`.
* Added a scale factor parameter to `Node::physical_size` and
`Node::physical_rect`.
* The example `viewport_debug` now uses a `UiScale` of 2. to ensure that
viewport coordinates are working correctly with a non-unit `UiScale`.
## Migration Guide
Physical UI coordinates are now divided by both the `UiScale` and the
window's scale factor to compute the logical sizes and positions of UI
nodes.
This ensures that UI Node size and position values, held by the `Node`
and `GlobalTransform` components, conform to the same logical coordinate
system as the style constraints from which they are derived,
irrespective of the current `scale_factor` and `UiScale`.
---------
Co-authored-by: Carter Anderson <mcanders1@gmail.com>
# Objective
**This implementation is based on
https://github.com/bevyengine/rfcs/pull/59.**
---
Resolves #4597
Full details and motivation can be found in the RFC, but here's a brief
summary.
`FromReflect` is a very powerful and important trait within the
reflection API. It allows Dynamic types (e.g., `DynamicList`, etc.) to
be formed into Real ones (e.g., `Vec<i32>`, etc.).
This mainly comes into play concerning deserialization, where the
reflection deserializers both return a `Box<dyn Reflect>` that almost
always contain one of these Dynamic representations of a Real type. To
convert this to our Real type, we need to use `FromReflect`.
It also sneaks up in other ways. For example, it's a required bound for
`T` in `Vec<T>` so that `Vec<T>` as a whole can be made `FromReflect`.
It's also required by all fields of an enum as it's used as part of the
`Reflect::apply` implementation.
So in other words, much like `GetTypeRegistration` and `Typed`, it is
very much a core reflection trait.
The problem is that it is not currently treated like a core trait and is
not automatically derived alongside `Reflect`. This makes using it a bit
cumbersome and easy to forget.
## Solution
Automatically derive `FromReflect` when deriving `Reflect`.
Users can then choose to opt-out if needed using the
`#[reflect(from_reflect = false)]` attribute.
```rust
#[derive(Reflect)]
struct Foo;
#[derive(Reflect)]
#[reflect(from_reflect = false)]
struct Bar;
fn test<T: FromReflect>(value: T) {}
test(Foo); // <-- OK
test(Bar); // <-- Panic! Bar does not implement trait `FromReflect`
```
#### `ReflectFromReflect`
This PR also automatically adds the `ReflectFromReflect` (introduced in
#6245) registration to the derived `GetTypeRegistration` impl— if the
type hasn't opted out of `FromReflect` of course.
<details>
<summary><h4>Improved Deserialization</h4></summary>
> **Warning**
> This section includes changes that have since been descoped from this
PR. They will likely be implemented again in a followup PR. I am mainly
leaving these details in for archival purposes, as well as for reference
when implementing this logic again.
And since we can do all the above, we might as well improve
deserialization. We can now choose to deserialize into a Dynamic type or
automatically convert it using `FromReflect` under the hood.
`[Un]TypedReflectDeserializer::new` will now perform the conversion and
return the `Box`'d Real type.
`[Un]TypedReflectDeserializer::new_dynamic` will work like what we have
now and simply return the `Box`'d Dynamic type.
```rust
// Returns the Real type
let reflect_deserializer = UntypedReflectDeserializer::new(&registry);
let mut deserializer = ron::de::Deserializer::from_str(input)?;
let output: SomeStruct = reflect_deserializer.deserialize(&mut deserializer)?.take()?;
// Returns the Dynamic type
let reflect_deserializer = UntypedReflectDeserializer::new_dynamic(&registry);
let mut deserializer = ron::de::Deserializer::from_str(input)?;
let output: DynamicStruct = reflect_deserializer.deserialize(&mut deserializer)?.take()?;
```
</details>
---
## Changelog
* `FromReflect` is now automatically derived within the `Reflect` derive
macro
* This includes auto-registering `ReflectFromReflect` in the derived
`GetTypeRegistration` impl
* ~~Renamed `TypedReflectDeserializer::new` and
`UntypedReflectDeserializer::new` to
`TypedReflectDeserializer::new_dynamic` and
`UntypedReflectDeserializer::new_dynamic`, respectively~~ **Descoped**
* ~~Changed `TypedReflectDeserializer::new` and
`UntypedReflectDeserializer::new` to automatically convert the
deserialized output using `FromReflect`~~ **Descoped**
## Migration Guide
* `FromReflect` is now automatically derived within the `Reflect` derive
macro. Items with both derives will need to remove the `FromReflect`
one.
```rust
// OLD
#[derive(Reflect, FromReflect)]
struct Foo;
// NEW
#[derive(Reflect)]
struct Foo;
```
If using a manual implementation of `FromReflect` and the `Reflect`
derive, users will need to opt-out of the automatic implementation.
```rust
// OLD
#[derive(Reflect)]
struct Foo;
impl FromReflect for Foo {/* ... */}
// NEW
#[derive(Reflect)]
#[reflect(from_reflect = false)]
struct Foo;
impl FromReflect for Foo {/* ... */}
```
<details>
<summary><h4>Removed Migrations</h4></summary>
> **Warning**
> This section includes changes that have since been descoped from this
PR. They will likely be implemented again in a followup PR. I am mainly
leaving these details in for archival purposes, as well as for reference
when implementing this logic again.
* The reflect deserializers now perform a `FromReflect` conversion
internally. The expected output of `TypedReflectDeserializer::new` and
`UntypedReflectDeserializer::new` is no longer a Dynamic (e.g.,
`DynamicList`), but its Real counterpart (e.g., `Vec<i32>`).
```rust
let reflect_deserializer = UntypedReflectDeserializer::new(&registry);
let mut deserializer = ron::de::Deserializer::from_str(input)?;
// OLD
let output: DynamicStruct = reflect_deserializer.deserialize(&mut deserializer)?.take()?;
// NEW
let output: SomeStruct = reflect_deserializer.deserialize(&mut deserializer)?.take()?;
```
Alternatively, if this behavior isn't desired, use the
`TypedReflectDeserializer::new_dynamic` and
`UntypedReflectDeserializer::new_dynamic` methods instead:
```rust
// OLD
let reflect_deserializer = UntypedReflectDeserializer::new(&registry);
// NEW
let reflect_deserializer = UntypedReflectDeserializer::new_dynamic(&registry);
```
</details>
---------
Co-authored-by: Carter Anderson <mcanders1@gmail.com>
# Objective
This calculation is performed componentwise but all the values are
vectors so it should be using vector operations.
Works correctly with the `relative_cursor_position` example.
# Objective
A lot of items in `bevy_ui` could be `FromReflect` but aren't. This
prevents users and library authors from being able to convert from a
`dyn Reflect` to one of these items.
## Solution
Derive `FromReflect` where possible. Also register the
`ReflectFromReflect` type data.
# Objective
Make the coordinate systems of screen-space items (cursor position, UI,
viewports, etc.) consistent.
## Solution
Remove the weird double inversion of the cursor position's Y origin: once in bevy_winit (flipping to the bottom) and then again in bevy_ui (flipping back to the top).
This leaves the origin at the top left like it is in every other popular
app framework.
Update the `world_to_viewport`, `viewport_to_world`, and
`viewport_to_world_2d` methods to flip the Y origin (as they should
since the viewport coordinates were always relative to the top left).
## Migration Guide
`Window::cursor_position` now returns the position of the cursor
relative to the top left instead of the bottom left.
This now matches other screen-space coordinates like
`RelativeCursorPosition`, UI, and viewports.
The `world_to_viewport`, `viewport_to_world`, and `viewport_to_world_2d`
methods on `Camera` now return/take the viewport position relative to
the top left instead of the bottom left.
If you were using `world_to_viewport` to position a UI node the returned
`y` value should now be passed into the `top` field on `Style` instead
of the `bottom` field.
Note that this might shift the position of the UI node as it is now
anchored at the top.
If you were passing `Window::cursor_position` to `viewport_to_world` or
`viewport_to_world_2d` no change is necessary.
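A hedged sketch of that last migration step, assuming `world_to_viewport` still returns `Option<Vec2>` here:
```rust
use bevy::prelude::*;

fn place_marker(
    style: &mut Style,
    camera: &Camera,
    camera_transform: &GlobalTransform,
    world_pos: Vec3,
) {
    if let Some(viewport_pos) = camera.world_to_viewport(camera_transform, world_pos) {
        style.left = Val::Px(viewport_pos.x);
        // Viewport y is now measured from the top, so it feeds `top`, not `bottom`.
        style.top = Val::Px(viewport_pos.y);
    }
}
```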