Per comments in util/box.h, the width and height of a wlr_box are
exclusive; that is, for a 100x100 box at (0,0), the point (99,99) is
inside it while the point (100,100) is outside it.
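
For illustration, those half-open semantics amount to a containment
test along these lines (a sketch only; box_contains_point is a
hypothetical name, not the util/box.h API):

    #include <stdbool.h>
    #include <wlr/util/box.h>

    /* Width and height are exclusive: the valid coordinates on each
     * axis form the half-open range [box.x, box.x + box.width). */
    static bool box_contains_point(const struct wlr_box *box,
            double x, double y) {
        return x >= box->x && x < box->x + box->width &&
            y >= box->y && y < box->y + box->height;
    }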
Mathematically, then, there is no single point that is closest to the
bottom-right corner of the box while remaining inside it. You can
construct an infinite sequence approaching the limit, such as {(99,99),
(99.9,99.9), (99.99,99.99), ...}, but because the intervals are
half-open, there is no "last" point.
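
Stated in one dimension (a brief aside for clarity): the x-range of the
example box is the half-open interval [0, 100), and

    \sup\,[0, 100) = 100, \qquad 100 \notin [0, 100),

so the supremum exists but is never attained; the interval has no
maximum element, and likewise the box has no interior point closest to
its bottom-right corner.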
wlr_box_closest_point() must therefore define an arbitrary "closest"
point. For points below and to the right of the box, the current
implementation returns (box.x + width - 1, box.y + height - 1). Let's
continue to do this.
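
A sketch of that rule for the bottom/right side (hypothetical helper
name and code, not the actual wlroots implementation):

    #include <wlr/util/box.h>

    /* A point already inside the half-open range is returned
     * unchanged; a point at or past the edge is clamped to the last
     * integer coordinate still inside, i.e. x + width - 1. */
    static void closest_point_bottom_right(const struct wlr_box *box,
            double x, double y, double *dest_x, double *dest_y) {
        *dest_x = x < box->x + box->width ? x : box->x + box->width - 1;
        *dest_y = y < box->y + box->height ? y : box->y + box->height - 1;
    }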
However, the current implementation is discontinuous: with the example
100x100 box, it returns the input point (99.9,99.9) unchanged, but for
the input point (100.1,100.1) the result jumps back to (99.0,99.0).
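
Using the hypothetical helper sketched above with the 100x100 box, the
jump looks like this:

    struct wlr_box box = { .x = 0, .y = 0, .width = 100, .height = 100 };
    double out_x, out_y;

    closest_point_bottom_right(&box, 99.9, 99.9, &out_x, &out_y);
    /* out == (99.9, 99.9): the inside point is returned unchanged. */

    closest_point_bottom_right(&box, 100.1, 100.1, &out_x, &out_y);
    /* out == (99.0, 99.0): a 0.2 step in the input moves the output
     * by 0.9, so the mapping jumps at the box edge. */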
In practice, this discontinuity causes strange behavior when driving
the mouse cursor into a screen corner. On a 1920x1080 display, for
example, moving the cursor quickly into the bottom-left corner puts it
at exactly (0,1079), but continuing to slowly nudge the cursor downward
makes the position jump between (0,1079) and other, fractional
coordinates such as (0,1079.88).
The fractional coordinates expose some client/toolkit-side bugs (which,
to be clear, should be fixed on the client side), but IMHO the wlroots
behavior is also inconsistent and wrong -- when I drive the mouse cursor
into the corner of the screen, it should come to a stop at a fixed
position, not jitter around.