Andreas Falkenhahn
SDL_RenderReadPixels() doesn't seem to work when reading pixels from a texture that was created with SDL_TEXTUREACCESS_TARGET and selected as the render target via SDL_SetRenderTarget().
I am attaching a small program that demonstrates the issue. I get the following result here:
READ PIXEL RETURN: 0 --- COLOR CHECK: ff000000
But it should be:
READ PIXEL RETURN: 0 --- COLOR CHECK: ffff0000
Tested with SDL 2.0.3 on Windows 7.
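A minimal sketch of that kind of reproducer (the window size and names are illustrative, not the attached program itself): the target texture is cleared to opaque red, so reading back one ARGB8888 pixel should yield ffff0000 (opaque red), not ff000000 (opaque black).

    #include <stdio.h>
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        SDL_Window *window;
        SDL_Renderer *renderer;
        SDL_Texture *target;
        SDL_Rect rect = { 0, 0, 1, 1 };
        Uint32 pixel = 0;
        int ret;

        SDL_Init(SDL_INIT_VIDEO);
        window = SDL_CreateWindow("readpixels", SDL_WINDOWPOS_UNDEFINED,
                                  SDL_WINDOWPOS_UNDEFINED, 640, 480, 0);
        renderer = SDL_CreateRenderer(window, -1, 0);

        /* Render-target texture that we will read back from */
        target = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                                   SDL_TEXTUREACCESS_TARGET, 640, 480);
        SDL_SetRenderTarget(renderer, target);

        /* Fill the target with opaque red */
        SDL_SetRenderDrawColor(renderer, 255, 0, 0, 255);
        SDL_RenderClear(renderer);

        /* Read back one pixel; expected 0xffff0000 (ARGB) */
        ret = SDL_RenderReadPixels(renderer, &rect, SDL_PIXELFORMAT_ARGB8888,
                                   &pixel, sizeof(pixel));
        printf("READ PIXEL RETURN: %d --- COLOR CHECK: %08x\n",
               ret, (unsigned int)pixel);

        SDL_Quit();
        return 0;
    }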
Nitz
In SDL_CreateTextureFromSurface:
SDL_PixelFormat *dst_fmt;
/* Set up a destination surface for the texture update */
dst_fmt = SDL_AllocFormat(format);
temp = SDL_ConvertSurface(surface, dst_fmt, 0);
A NULL check is needed for dst_fmt here, because SDL_AllocFormat(format) can return NULL.
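A sketch of the suggested guard (the early return mirrors how SDL_CreateTextureFromSurface reports its other failures; SDL_AllocFormat() sets the error on failure):

    dst_fmt = SDL_AllocFormat(format);
    if (!dst_fmt) {
        /* SDL_AllocFormat() already set the error (e.g. out of memory) */
        return NULL;
    }
    temp = SDL_ConvertSurface(surface, dst_fmt, 0);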
Tobias Himmer
This patch adds a check to the CMake build script to detect whether the VideoCore API is available.
If it is found, it enables SDL_VIDEO_DRIVER_RPI and will also add the needed include/library directory flags to CMAKE_C_FLAGS so the subsequent check for GLES succeeds in picking up the headers.
Seems to work fine on Raspbian.
Patch from Benoit Pierre:
video: fix clipping handling in SDL_UpperBlitScaled
- honor destination clipping rectangle
- update both destination and source rectangles when clipping source
rectangle to source surface and destination rectangle to destination
clip rectangle
- don't change scaling factors when clipping
N.B.:
- when no scaling is involved (source and destination width/height are
the same), SDL_UpperBlit is used (so SDL_BlitScaled behaves like
SDL_BlitSurface)
- the final destination rectangle after all clipping is performed is
saved back to dstrect (like for SDL_UpperBlit)
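A sketch of the fixed behavior (surface sizes and rectangles are illustrative):

    /* 32x32 source scaled into a 128x128 destination with a clip rect set */
    SDL_Surface *src = SDL_CreateRGBSurface(0, 32, 32, 32, 0, 0, 0, 0);
    SDL_Surface *dst = SDL_CreateRGBSurface(0, 128, 128, 32, 0, 0, 0, 0);
    SDL_Rect clip = { 0, 0, 64, 64 };
    SDL_Rect dstrect = { 32, 32, 64, 64 };  /* extends past the clip rect */

    SDL_SetClipRect(dst, &clip);
    SDL_BlitScaled(src, NULL, dst, &dstrect);

    /* The destination clip rect is now honored, the scaling factors are
       unchanged, and dstrect holds the final rectangle actually written:
       { 32, 32, 32, 32 }. */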
There was a misconception that Linux's saturation and deadband parameters -
on which the corresponding SDL parameters were based - use only half of the
possible range.
Thanks, Elias!
Partially fixes Bugzilla #2686.
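For illustration, a sketch of filling in a condition effect using the full Uint16 range (the values themselves are arbitrary):

    SDL_HapticEffect effect;
    SDL_memset(&effect, 0, sizeof(effect));
    effect.type = SDL_HAPTIC_SPRING;
    effect.condition.length = 5000;            /* effect duration in ms */
    effect.condition.right_sat[0] = 0xFFFF;    /* full range, not 0x7FFF */
    effect.condition.left_sat[0] = 0xFFFF;
    effect.condition.deadband[0] = 0x0400;
    effect.condition.right_coeff[0] = 0x4000;
    effect.condition.left_coeff[0] = 0x4000;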
The _xgetbv intrinsic was being used in ARM builds of SDL/WinRT, which was
leading to linker errors. This commit limits _xgetbv use to the platforms on
which it is available, x86 and x64.
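A sketch of the kind of guard involved (the HAVE_XGETBV macro name is illustrative; _xgetbv is the compiler intrinsic declared by <immintrin.h> for MSVC x86/x64 targets):

    #include "SDL_stdinc.h"    /* for Uint64 */

    #if defined(_M_IX86) || defined(_M_X64)
    #include <immintrin.h>     /* declares _xgetbv on x86/x64 only */
    #define HAVE_XGETBV 1      /* illustrative macro name */
    #endif

    static Uint64 read_xcr0(void)
    {
    #ifdef HAVE_XGETBV
        return _xgetbv(0);  /* XCR0 reports OS-enabled XSAVE/AVX state */
    #else
        return 0;           /* ARM and other targets have no XGETBV */
    #endif
    }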
This will (eventually) make SDL_GetQueuedAudioSize() more accurate, and thus
reduce latency. Right now this isn't implemented anywhere, so we assume data
fed to the audio callback is consumed by the hardware and immediately played
to completion.
For versions of XAudio2 with an IXAudio2SourceVoice::GetState() that offers a
flags parameter, we can use XAUDIO2_VOICE_NOSAMPLESPLAYED, since we don't
need this information in our current calls. According to MSDN, this makes the
call about 3x faster.
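A sketch of such a call, assuming the Windows 8-era xaudio2.h where GetState() takes a flags argument (source_voice and the queue-depth check are illustrative; shown in the C COM-macro form):

    XAUDIO2_VOICE_STATE state;

    /* Skip the expensive SamplesPlayed bookkeeping; we only need
       BuffersQueued here */
    IXAudio2SourceVoice_GetState(source_voice, &state,
                                 XAUDIO2_VOICE_NOSAMPLESPLAYED);

    if (state.BuffersQueued < 2) {
        /* submit more audio ... */
    }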
1) Moves all READMEs to docs/
2) Renames them to *.md, adds some Markdown with the idea to add a lot more
3) Moves the doxyfile config to docs/ and makes it parse the headers at ../include as well as the .md files in docs/
4) Skips SDL_opengl*.h headers from the docs
5) Minor fixes to doxyfile