This is needed to support CHERI, and thus Arm's experimental Morello
prototype, where pointers are implemented using unforgeable capabilities
that include bounds and permissions metadata to provide fine-grained
spatial and referential memory safety, as well as revocation by sweeping
memory to provide heap temporal memory safety.
The C standard does not guarantee that if two pointers compare equal
they are the same pointer, as C pointers have a notion of provenance,
and compilers have been known to exploit this during optimisation. For
CHERI, this becomes even more important, as in-place expansion can
result in realloc returning a capability to the same address but with
increased capability bounds, and so reusing the old capability will trap
trying to access outside the bounds of the original allocation.
In the case that ptr == mem, memdiff and ptrdiff should still be equal,
so the only overhead is a small amount of pointer arithmetic and a store
of the new pointer (which is required per the C standard in order to not
be undefined behaviour when next loaded).
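As a rough sketch of that pattern (illustrative helper, not the actual
allocator code; grow_block and its parameters are made up for the example):

    #include <stdlib.h>

    /* Re-derive the user pointer from realloc's return value even when it
     * compares equal to the old block, so the (possibly widened) CHERI
     * capability is the one that gets stored and reused. */
    static void *grow_block(void *mem, void *ptr, size_t newsize, void **newptr)
    {
        size_t ptrdiff = (size_t)((char *)ptr - (char *)mem); /* offset of user data */
        void *newmem = realloc(mem, newsize);
        if (newmem == NULL) {
            return NULL;
        }
        /* Reusing the old capability here could trap if in-place expansion
         * increased the bounds; always go through newmem. */
        *newptr = (char *)newmem + ptrdiff;
        return newmem;
    }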
This also fixes the calculation of oldmem to use uintptr_t rather than
size_t as casting the pointer to size_t on CHERI will strip the
capability metadata, including the validity tag, with the subsequent
cast back to void * resulting in a null-derived capability whose
validity tag is clear and thus cannot be dereferenced without trapping.
On most systems (anything with a flat memory hierarchy rather than using
segment-based addressing), size_t and uintptr_t are the same type.
However, on CHERI, size_t is just an integer offset, whereas uintptr_t
is still a capability as described above. Casting a pointer to size_t
will strip the metadata and validity tag, and casting from size_t to a
pointer will result in a null-derived capability whose validity tag is
not set, and thus cannot be dereferenced without faulting.
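A small illustration of the difference, assuming a CHERI C compiler (the
helper below is hypothetical, not SDL code):

    #include <stdint.h>

    /* On CHERI only the uintptr_t round trip preserves the capability
     * metadata and validity tag. */
    static void *round_trip(void *p)
    {
        uintptr_t keep = (uintptr_t)p;  /* bounds, permissions and tag survive */

        /* By contrast, (size_t)p would keep only the address; casting that
         * back to void * yields a null-derived capability that faults when
         * dereferenced. */
        return (void *)keep;            /* still dereferenceable */
    }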
The audio and cursor casts were harmless as they intend to stuff an
integer into a pointer, but using uintptr_t is the idiomatic way to do
that and silences our compiler warnings (which our build tool makes
fatal by default as they often indicate real problems). The iconv and
egl casts were true positives as SDL_iconv_t and iconv_t are pointer
types, as is NativeDisplayType on most OSes, so this would have trapped
at run time when using the round-tripped pointers. The gles2 casts were
also harmless; the OpenGL API defines this argument to be a pointer type
(and uses the argument name "pointer"), but it in fact represents an
integer offset, so like audio and cursor the additional idiomatic cast
is needed to silence the warning.
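The idiom in question looks roughly like this (hypothetical helper names,
shown only to make the double cast concrete):

    #include <stdint.h>

    /* Stuffing an integer into a pointer-sized slot and recovering it later:
     * going through uintptr_t in both directions keeps CHERI happy and
     * silences pointer/integer size warnings. */
    static void *index_to_handle(int device_index)
    {
        return (void *)(uintptr_t)device_index;
    }

    static int handle_to_index(void *handle)
    {
        return (int)(uintptr_t)handle;
    }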
When choosing an X11 Visual for a window based on its GLX capabilities, first
try glXChooseFBConfig (if available) before falling back to glXChooseVisual.
This normally does not make a difference because most GLX drivers create a
Visual for every GLXFBConfig, exposing all of the same capabilities.
For GLX render offload configurations (also known as "PRIME") where one GPU is
providing GLX rendering support for windows on an X screen running on a
different GPU, the GPU doing the offloading needs to use the Visuals that were
created by the host GPU's driver rather than being able to add its own. This
means that there may be fewer Visuals available for all of the GLXFBConfigs the
guest driver wants to expose. In order to handle that situation, the NVIDIA GLX
driver creates many GLXFBConfigs that map to the same Visual when running in a
render offload configuration.
This can result in a glXChooseVisual request failing to find a supported Visual
when there is a GLXFBConfig for that configuration that would have worked. For
example, when the game "Unnamed SDVX Clone" [1] tries to create a configuration
with multisample, glXChooseVisual fails because the Visual assigned to the
multisample GLXFBConfigs is shared with the GLXFBConfigs without multisample.
Avoid this problem by using glXChooseFBConfig, when available, to find a
GLXFBConfig with the requested capabilities and then using
glXGetVisualFromFBConfig to find the corresponding X11 Visual. This allows the
game to run, although it doesn't make me any better at actually playing it...
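In outline, the preferred path looks something like the sketch below
(simplified, not the actual SDL implementation; note that the two entry
points take differently formatted attribute lists, so real code builds one
list per path):

    #include <GL/glx.h>

    /* Prefer glXChooseFBConfig + glXGetVisualFromFBConfig, falling back to
     * glXChooseVisual only when no FBConfig matches. */
    static XVisualInfo *choose_visual(Display *dpy, int screen,
                                      const int *fb_attribs, int *legacy_attribs)
    {
        int count = 0;
        GLXFBConfig *configs = glXChooseFBConfig(dpy, screen, fb_attribs, &count);
        if (configs && count > 0) {
            XVisualInfo *vinfo = glXGetVisualFromFBConfig(dpy, configs[0]);
            XFree(configs);
            if (vinfo) {
                return vinfo;
            }
        } else if (configs) {
            XFree(configs);
        }
        return glXChooseVisual(dpy, screen, legacy_attribs);
    }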
Signed-off-by: Aaron Plattner <aplattner@nvidia.com>
Fixes: https://forums.developer.nvidia.com/t/prime-run-cannot-create-window-x-glxcreatecontext/180214
[1] https://github.com/Drewol/unnamed-sdvx-clone
As of [1], SDL now compiles with a warning in SDL_waylandevents.c on
32-bit systems under gcc 10.3.0:
/tmp/SDL/src/video/wayland/SDL_waylandevents.c: In function 'seat_handle_capabilities':
/tmp/SDL/src/video/wayland/SDL_waylandevents.c:958:22: warning: cast from pointer to integer of different size [-Wpointer-to-int-cast]
958 | SDL_AddTouch((SDL_TouchID)seat, SDL_TOUCH_DEVICE_DIRECT, "wayland_touch");
| ^
/tmp/SDL/src/video/wayland/SDL_waylandevents.c:964:22: warning: cast from pointer to integer of different size [-Wpointer-to-int-cast]
964 | SDL_DelTouch((SDL_TouchID)seat);
| ^
This is due to SDL_TouchID always being 64-bit, but seat being a pointer
which is (obviously) only 32-bit on 32-bit systems. The conversion is
therefore harmless, so silence it with an extra cast via intptr_t.
This is what the cocoa backend does (and is similar to what the Win32
backend does, except with size_t).
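The silenced calls then carry an extra intermediate cast, roughly:

    /* Widen the pointer through intptr_t first, then convert to the 64-bit
     * SDL_TouchID, so the size change is explicit. */
    SDL_AddTouch((SDL_TouchID)(intptr_t)seat, SDL_TOUCH_DEVICE_DIRECT, "wayland_touch");
    SDL_DelTouch((SDL_TouchID)(intptr_t)seat);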
Fixes: 03c19efbd1 ("Added support for multiple seats with touch input on Wayland")
[1]: 03c19efbd1
When wayland is not dynamically loaded (--enable-wayland-shared=no)
libdecor.h is not included unless SDL_VIDEO_DRIVER_WAYLAND_DYNAMIC
is set, so it fails to build. We can't simply move the libdecor.h
include above the #ifdef SDL_VIDEO_DRIVER_WAYLAND_DYNAMIC block, as
libdecor.h itself #includes wayland headers we need to replace with
#defines. Instead, duplicate the #include.
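Schematically, the header ends up shaped something like this (heavily
simplified; the HAVE_LIBDECOR_H guard is an assumption about the build
system's config macro):

    #ifdef SDL_VIDEO_DRIVER_WAYLAND_DYNAMIC

    /* Dynamic-loading build: the wayland #defines and headers live here,
     * and libdecor.h is included as part of that block. */
    #ifdef HAVE_LIBDECOR_H
    #include <libdecor.h>
    #endif

    #else /* !SDL_VIDEO_DRIVER_WAYLAND_DYNAMIC */

    /* Directly linked build: the block above is skipped entirely, so
     * libdecor.h has to be included here too -- hence the duplicate. */
    #ifdef HAVE_LIBDECOR_H
    #include <libdecor.h>
    #endif

    #endif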
Fixes https://github.com/libsdl-org/SDL/issues/4543
Note that this doesn't fix any of the underlying issues of libdecor
being treated as part of wayland, it just fixes the build. A better
solution would probably be to decouple the wayland dynamic loading
from the libdecor dynamic loading completely, though that is a lot
more work...
Each window can have at most one zxdg toplevel decoration, but as of
[1], we accidentally create two (if libdecor is not in use). This
causes wayland windows with server-side decorations (e.g. on KDE/KWin)
to crash with the message:
zxdg_decoration_manager_v1@7: error 1: decoration has been already constructed
This extra zxdg_decoration_manager_v1.get_toplevel_decoration() call was
introduced while removing wl-shell and xdg-shell-unstable-v6 support [1],
and possibly was a bad interaction with [2], which moved the decoration
creation around.
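One way to picture the invariant (illustrative variable names, not the exact
SDL fields): the decoration object should be created at most once per
toplevel.

    /* Create the server-side decoration at most once per window
     * (wind, decoration_manager and xdg_toplevel are illustrative names). */
    if (wind->server_decoration == NULL) {
        wind->server_decoration = zxdg_decoration_manager_v1_get_toplevel_decoration(
                                      decoration_manager, xdg_toplevel);
    }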
Fixes: 6aae5b44f8 ("Remove wl-shell and xdg-shell-unstable-v6 support (#4323)")
[1]: https://github.com/libsdl-org/SDL/pull/4323
[2]: https://github.com/libsdl-org/SDL/pull/4374
WASAPI_WaitDevice is used for audio playback and capture, but needs to
behave slightly differently.
For playback `GetCurrentPadding` returns the padding which is already
queued, so WaitDevice should return when buffer length falls below the
buffer threshold (`maxpadding`).
For capture `GetCurrentPadding` returns the available data which can be
read, so WaitDevice can return as soon as any data is available.
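Sketched as code (illustrative; client, iscapture, maxpadding, and ready
stand in for the real device state), the two conditions differ like this:

    UINT32 padding = 0;
    int ready = 0;
    if (SUCCEEDED(IAudioClient_GetCurrentPadding(client, &padding))) {
        if (iscapture) {
            ready = (padding > 0);           /* any captured frames can be read */
        } else {
            ready = (padding <= maxpadding); /* room to queue more audio        */
        }
    }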
In the old implementation, WaitDevice could suddenly hang. On many capture
devices the buffer (`padding`) isn't filled fast enough to surpass
`maxpadding`, so the shared wait condition happened to work. But if at some
point (due to unlucky timing) more than `maxpadding` frames were available,
WaitDevice would never return, because nothing drains the buffer while it
keeps waiting.
Issue #3234 is probably related to this.
On modern CPUs there's no penalty for using the unaligned instruction on
aligned memory, and now the code can vectorize unaligned data too, which,
even if not optimal, is still faster than the scalar fallback.
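A minimal SSE2 illustration of the idea (not the SDL code):

    #include <emmintrin.h>
    #include <stddef.h>
    #include <stdint.h>

    /* _mm_loadu_si128/_mm_storeu_si128 work for any alignment and, on modern
     * CPUs, cost the same as the aligned forms when the data happens to be
     * aligned -- so one vector loop covers both cases. */
    static void copy_bytes_sse2(uint8_t *dst, const uint8_t *src, size_t len)
    {
        size_t i = 0;
        for (; i + 16 <= len; i += 16) {
            const __m128i v = _mm_loadu_si128((const __m128i *)(src + i));
            _mm_storeu_si128((__m128i *)(dst + i), v);
        }
        for (; i < len; i++) {   /* scalar tail */
            dst[i] = src[i];
        }
    }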
Fixes #4532.
The Game Controller Kit doesn't show the controllers at startup, so the HIDAPI driver sees them first and therefore gets preference when a controller is supported by both drivers.
This fixes bug https://github.com/libsdl-org/SDL/issues/4209
This prevents an assertion that was triggered when
LINUX_JoystickGetGamepadMapping opened the joystick temporarily and messed
with global state by doing so. Now the global state is only set in
LINUX_JoystickOpen, but the common code is shared by both interfaces.
Fixes #4198.
The Wayland video subsystem uses a mix of libc and SDL functions. This patch
switches the libc functions to the SDL ones and fixes a mismatch in the
memory allocation/deallocation of SDL_Cursor in SDL_waylandmouse.c (calloc on
line 201 and SDL_free on line 313), which caused memory corruption if a
custom memory allocator was provided to SDL.
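The rule being enforced, in miniature (inside SDL's own source, where struct
SDL_Cursor is a complete type):

    /* Allocate and free through the same allocator so a custom allocator
     * installed with SDL_SetMemoryFunctions() never sees foreign pointers. */
    SDL_Cursor *cursor = (SDL_Cursor *)SDL_calloc(1, sizeof(*cursor));
    if (cursor) {
        /* ... fill in and use the cursor ... */
        SDL_free(cursor);   /* matches SDL_calloc, not libc calloc/free */
    }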
As written, these structures contain undefined stack contents, which in
practice causes crashes/hangs and/or triggers the validation layers (they
complain about `pNext` not being NULL and `flags` not being zero).
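A hedged example of the fix pattern (the struct choice is arbitrary; real
code might use a project-specific zeroing helper instead of memset):

    #include <string.h>
    #include <vulkan/vulkan.h>

    /* Zero the whole struct first so pNext, flags, and every field we don't
     * set explicitly is NULL/0 rather than undefined stack contents. */
    static VkApplicationInfo make_app_info(void)
    {
        VkApplicationInfo info;
        memset(&info, 0, sizeof(info));
        info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        info.pApplicationName = "example";
        info.apiVersion = VK_API_VERSION_1_0;
        return info;
    }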
When the hint SDL_HINT_OPENGL_ES_DRIVER is set to "1" (e.g. for ANGLE
support), an assertion due to !_this->gl_config.driver_loaded can be
triggered even though EGL is available.
When relative mode is enabled and not using warp mode, the cursor is
being clipped to the window. Therefore there is no reason to restore the
cursor position to the center.
Avoiding the warp to center simplifies mouse position event flow, as we
are no longer potentially receiving mouse events for the automated
movement of the cursor and can be (mostly) assured that an incoming
event from the windowing system comes from external input.
The implementation of clip logic for relative mode seemed to
unnecessarily limit the usable area to the middle of the window, in a
2x2 pixel region. This has the adverse side effect of moving the
operating system cursor to that location, even if it was already at a
valid location in the window.
While in most scenarios this is handled correctly (by storing the
original position of the cursor in the window and restoring when leaving
relative mode), there are edge cases where this clip operation can cause
WM_MOUSEMOVE to fire at a point in time where it counts as a relative
delta from SDL's perspective.
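A rough Win32 sketch of clipping to the whole client area instead (not the
exact SDL code; clip_to_client_area is a made-up helper):

    #include <windows.h>

    /* Clip the cursor to the full client area instead of a 2x2 region at the
     * window center, so the OS cursor is never yanked to the middle. */
    static void clip_to_client_area(HWND hwnd)
    {
        RECT rect;
        if (GetClientRect(hwnd, &rect) && !IsRectEmpty(&rect)) {
            ClientToScreen(hwnd, (LPPOINT)&rect);      /* top-left corner     */
            ClientToScreen(hwnd, (LPPOINT)&rect + 1);  /* bottom-right corner */
            ClipCursor(&rect);
        }
    }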
X11_SetDisplayMode currently calls X11_XRRSetCrtcConfig alone. This changes
the monitor's viewport, but the underlying screen dimensions stay the same.
The spec indicates that RRSetCrtcConfig only changes the crtc mode and has no effect
on the screen dimensions, only mentioning that the new crtc must fit entirely within the
screen size. For the size to change, RRSetScreenSize also needs to be called.
This affects Metro Exodus on Linux: when changing the resolution in the
in-game settings, Metro gets stuck in a loop waiting for the size of its
Vulkan surface to change. Because XRRSetScreenSize is not called, the screen
size never changes, the Vulkan surface dimensions do not change, and Metro
hangs forever watching for a surface size update that will never come.
This change disables the CRTC, calls XRRSetScreenSize, and then updates the
CRTC configuration. This fixes changing the resolution from the Metro settings.
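In outline (illustrative function and parameters, error handling omitted),
the sequence is roughly:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    /* Disable the CRTC first (the old CRTC might not fit in the new screen),
     * resize the screen, then bring the CRTC back with the new mode. */
    static void set_mode(Display *dpy, Window root, XRRScreenResources *res,
                         RRCrtc crtc, XRRCrtcInfo *info, RRMode mode,
                         int w, int h, int mm_w, int mm_h)
    {
        XRRSetCrtcConfig(dpy, res, crtc, CurrentTime, 0, 0, None,
                         RR_Rotate_0, NULL, 0);
        XRRSetScreenSize(dpy, root, w, h, mm_w, mm_h);
        XRRSetCrtcConfig(dpy, res, crtc, CurrentTime, info->x, info->y,
                         mode, info->rotation, info->outputs, info->noutput);
    }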
Tested with:
Metro Exodus, Portal 2
To enter Bluetooth pairing mode, hold the B and Action (button with circle)
buttons for 3 seconds.
It works via the usual HIDAPI if the special filter driver is not installed:
https://www.amazon.com/gp/help/customer/display.html?nodeId=GZCT4CTFHXLHEB9T
With that driver installed it mimics an Xbox One controller and works via
XInput under Windows. Under DInput this controller is not usable at all.