I still haven't figured out why my application is being minimized when I try to raise it, but my previous workaround is causing issues.
For now, the correct way to raise and/or restore the window is as follows:
if ( !(SDL_GetWindowFlags( window ) & SDL_WINDOW_MINIMIZED) )
{
    /* The window is not minimized, so it is safe to raise it. */
    SDL_RaiseWindow( window );
}
if ( SDL_GetWindowFlags( window ) & SDL_WINDOW_MINIMIZED )
{
    /* The window is minimized; restore it rather than raising it. */
    SDL_RestoreWindow( window );
}
I will investigate the window state change rules more fully in the future.
CR: Alfred Reynolds
When we got a kCGEventTapDisabledByTimeout or
kCGEventTapDisabledByUserInput event, the event tap would perform an invalid
memory access. void pointers are so fun.
This code only runs if you explicitly build with SDL_MAC_NO_SANDBOX.
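A minimal sketch of the kind of guard that prevents this, using the public Quartz event tap callback signature; the userInfo handling below is an assumption for illustration, not SDL's actual (more involved) tap code:

#include <ApplicationServices/ApplicationServices.h>

/* Sketch of an event tap callback that handles the "disabled" notifications
   before treating the event as real input data. When the tap is disabled by
   timeout or user input, the CGEventRef must not be interpreted as a normal
   event; re-enable the tap and bail out instead. */
static CGEventRef
eventTapCallback(CGEventTapProxy proxy, CGEventType type,
                 CGEventRef event, void *userInfo)
{
    CFMachPortRef tap = (CFMachPortRef) userInfo;  /* assumed: userInfo holds the tap */

    if (type == kCGEventTapDisabledByTimeout ||
        type == kCGEventTapDisabledByUserInput) {
        CGEventTapEnable(tap, true);  /* turn the tap back on */
        return event;                 /* do not touch the event payload here */
    }

    /* ... normal event handling goes here ... */
    return event;
}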
Patrice Mandin
I encountered a problem trying to load an 8-bit paletted BMP file using SDL. The file was generated with GIMP 2.8. It has a large BITMAPINFOHEADER (0x6c bytes in its biSize field), and thus the palette is incorrectly set up.
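As a rough illustration of the fix, the reader has to honor biSize when locating the palette rather than assuming the 40-byte header. The sketch below is a standalone example with made-up names, not SDL's BMP loader; it assumes a little-endian host and omits error checks:

#include <stdint.h>
#include <stdio.h>

/* Sketch: locate the palette by honoring biSize instead of assuming the
   40-byte BITMAPINFOHEADER. GIMP 2.8 writes a 108-byte (0x6c) info header,
   so a reader that hard-codes 40 finds the palette at the wrong offset. */
static long palette_offset(FILE *f)
{
    uint32_t biSize;

    /* The info header starts right after the 14-byte file header. */
    fseek(f, 14, SEEK_SET);
    fread(&biSize, sizeof(biSize), 1, f);   /* little-endian host assumed */

    /* For a paletted BMP, the palette follows the full info header,
       whatever its size. */
    return 14 + (long) biSize;
}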
Lorenzo Pistone
This patch adds support for the horizontal wheel on Windows. It is shamelessly copied from the vertical wheel code, but I suppose there is added value in the consistency.
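In rough terms, the Win32 side mirrors the vertical wheel handling; the window procedure below is a simplified, self-contained sketch rather than SDL's actual message handling:

#include <windows.h>

#ifndef WM_MOUSEHWHEEL
#define WM_MOUSEHWHEEL 0x020E   /* not defined by pre-Vista SDK headers */
#endif

/* Sketch: the horizontal wheel (WM_MOUSEHWHEEL) is handled exactly like the
   vertical wheel (WM_MOUSEWHEEL), just forwarded on the other axis. */
static LRESULT CALLBACK
WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg) {
    case WM_MOUSEWHEEL: {
        int delta = GET_WHEEL_DELTA_WPARAM(wParam) / WHEEL_DELTA;
        /* ... forward delta as vertical scroll ... */
        (void) delta;
        return 0;
    }
    case WM_MOUSEHWHEEL: {
        int delta = GET_WHEEL_DELTA_WPARAM(wParam) / WHEEL_DELTA;
        /* ... forward delta as horizontal scroll ... */
        (void) delta;
        return 0;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}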
Thanks to Denis Bernard!
Also, changed the Android manifest so the app doesn't quit with orientation
changes, and made testgles.c exit properly on Android.
Mai Lavelle
I've recently tried to create multiple windows and process key events for them, and found that key events weren't being generated for most of the windows. After some investigating I've observed the following effects. All but the most recently created window experience these effects...
- a focus lost event is generated immediately after the focus gained event, even though the window still has focus
- key events report window id 0 rather than the id of the window which has focus; SDL thinks no window has focus?
- giving focus to a non-SDL window and then selecting an SDL window causes events to be generated as expected, but only until focus changes again
Focus change events are queued and delayed (200 ticks) before they are dispatched. The problem occurs when a focus out and focus in event are received on the same tick. When these delayed events are dispatched they will be sent in the order determined by the window list rather than the order in which they are received.
The focus out dispatch is implemented by calling SDL_SetKeyboardFocus(NULL). This will remove focus from any window, regardless of whether it is the one originally targeted by the X11 event.
Since SDL_SetKeyboardFocus() will always dispatch a focus lost event as needed, the easiest solution is simply to only call SDL_SetKeyboardFocus(NULL) when SDL_GetKeyboardFocus() matches the target window.
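A minimal sketch of that guard, assuming the focus-out handler knows which SDL_Window the X11 event targeted; SDL_SetKeyboardFocus() is an SDL-internal function, so the declaration below is copied in for illustration:

#include "SDL.h"

/* SDL-internal, declared in src/events/SDL_keyboard_c.h. */
extern int SDL_SetKeyboardFocus(SDL_Window *window);

/* Sketch: only clear keyboard focus when the window losing focus is the one
   SDL currently considers focused; otherwise a stale focus-out would steal
   focus from whichever window just gained it. */
static void
DispatchFocusOut(SDL_Window *window)
{
    if (SDL_GetKeyboardFocus() == window) {
        SDL_SetKeyboardFocus(NULL);
    }
}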
Nitz
I was going through the SDL_IntersectRectAndLine function and was surprised to see the ComputeOutCode function implementation.
The problem in this algorithm is that the x and y coordinates are checked against 0, which is wrong; they should be checked against the rectangle's x and y bounds.
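For illustration, an outcode computation clipped against the rectangle's own bounds could look like the sketch below; the bit constants and function shape follow the usual Cohen-Sutherland pattern and are not necessarily identical to SDL's internals:

#include "SDL.h"

#define CODE_BOTTOM 1
#define CODE_TOP    2
#define CODE_LEFT   4
#define CODE_RIGHT  8

/* Sketch: Cohen-Sutherland outcode computed against the rectangle's own
   bounds instead of against 0. A point left of rect->x gets CODE_LEFT,
   above rect->y gets CODE_TOP, and so on. */
static int
ComputeOutCode(const SDL_Rect *rect, int x, int y)
{
    int code = 0;
    if (y < rect->y) {
        code |= CODE_TOP;
    } else if (y >= rect->y + rect->h) {
        code |= CODE_BOTTOM;
    }
    if (x < rect->x) {
        code |= CODE_LEFT;
    } else if (x >= rect->x + rect->w) {
        code |= CODE_RIGHT;
    }
    return code;
}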
Andreas Ertelt
The problem in question is caused by changeset 7771 (http://hg.libsdl.org/SDL/rev/5486e579872e / https://bugzilla.libsdl.org/show_bug.cgi?id=2121)
The redefinition of __inline__ (introduced by the addition of "|| __STRICT_ANSI__" at begin_code.h:128) results in MinGW's gcc emitting multiple
warning: always_inline function might not be inlinable [-Wattributes]
as well as a whole bunch of redefinitions of MinGW internals, which break linking of projects that include the SDL2 headers.
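Purely as an illustration of why this hurts: __inline__ is already a keyword for GCC-family compilers, so any compatibility mapping has to exclude them. A hypothetical guard would look something like the lines below; this is not the actual begin_code.h logic, whose real fix was to drop the __STRICT_ANSI__ clause:

/* Hypothetical sketch only: map __inline__ for compilers that lack it,
   but never for GCC/MinGW, where it is a built-in keyword and redefining
   it as a macro breaks the system headers. */
#if defined(__STRICT_ANSI__) && !defined(__GNUC__)
#define __inline__ __inline
#endif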
This reverts http://hg.libsdl.org/SDL/rev/7cdeb64faa72 and fixes it in
the correct way. If you call SDL_SetWindowPosition on a fullscreen
window, it would update the x & y variables for the window, but not
actually move the window (since it was fullscreen). That would make the
internal state of the SDL_Window incorrect, causing
SDL_WarpMouseInWindow to offset incorrectly.
This makes it so SDL_SetWindowPosition updates the `windowed' x & y
coordinates, which take effect when you revert from fullscreen.
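A sketch of the intended behavior, assuming an internal per-window struct that keeps a separate windowed-mode rectangle; the field names below only illustrate the idea and are not guaranteed to match SDL's internals:

/* Sketch: when the window is fullscreen, remember the requested position in
   the windowed-mode rect instead of moving the (fullscreen) window, so both
   the later restore and SDL_WarpMouseInWindow see consistent coordinates.
   SDL_Window here stands for the internal struct, not the public opaque type. */
void
SetWindowPosition(SDL_Window *window, int x, int y)
{
    if (window->flags & SDL_WINDOW_FULLSCREEN) {
        window->windowed.x = x;       /* applied when leaving fullscreen */
        window->windowed.y = y;
    } else {
        window->x = x;
        window->y = y;
        /* ... ask the backend to actually move the window ... */
    }
}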
On Android, available touch devices are now added during video initialization (like
the keyboard). This fixes SDL_GetNumTouchDevices() returning 0 before any touch
events have happened, even though a touch screen is available. Adding touch
devices after a touch event is received remains active, to allow devices connected
later (if that is possible) and to provide a fallback in case the new initialization
does not work for some reason. JNI was used for the implementation, and API level 9
is required. There seems to be nothing in the Android NDK's input header (input.h)
that would allow implementing all of this on the C side without talking to the Java side.
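Roughly, the Java-side query boils down to asking android.view.InputDevice for its device ids and keeping the touch screens. The JNI sketch below is a simplified, hypothetical helper (not SDL's actual glue code) and omits error checking:

#include <jni.h>
#include <android/input.h>   /* AINPUT_SOURCE_TOUCHSCREEN */

/* Sketch: enumerate touch screens via android.view.InputDevice (API level 9+).
   Writes device ids to out[] and returns how many touch devices were found. */
static int
GetTouchDeviceIds(JNIEnv *env, int *out, int max)
{
    jclass cls = (*env)->FindClass(env, "android/view/InputDevice");
    jmethodID getIds = (*env)->GetStaticMethodID(env, cls, "getDeviceIds", "()[I");
    jmethodID getDev = (*env)->GetStaticMethodID(env, cls, "getDevice",
                                                 "(I)Landroid/view/InputDevice;");
    jmethodID getSrc = (*env)->GetMethodID(env, cls, "getSources", "()I");

    jintArray ids = (jintArray)(*env)->CallStaticObjectMethod(env, cls, getIds);
    jsize n = (*env)->GetArrayLength(env, ids);
    jint *elems = (*env)->GetIntArrayElements(env, ids, NULL);
    int count = 0;

    for (jsize i = 0; i < n && count < max; ++i) {
        jobject dev = (*env)->CallStaticObjectMethod(env, cls, getDev, elems[i]);
        if (dev) {
            jint sources = (*env)->CallIntMethod(env, dev, getSrc);
            if (sources & AINPUT_SOURCE_TOUCHSCREEN) {
                out[count++] = elems[i];
            }
            (*env)->DeleteLocalRef(env, dev);
        }
    }
    (*env)->ReleaseIntArrayElements(env, ids, elems, JNI_ABORT);
    return count;
}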