UMU
#define SDL_COMPOSE_ERROR(str) SDL_STRINGIFY_ARG(__FUNCTION__) ", " str
I think SDL_STRINGIFY_ARG should be removed: __FUNCTION__ is a predefined identifier, not a preprocessor macro, so stringifying it produces the literal text "__FUNCTION__" instead of the function name.
#define SDL_COMPOSE_ERROR(str) __FUNCTION__ ", " str
(verified with Visual Studio 2019)
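For illustration, a minimal demo of the difference (SDL_STRINGIFY_ARG is SDL's #arg stringification macro; this relies on MSVC letting __FUNCTION__ participate in string literal concatenation, which GCC does not):

#include <stdio.h>

#define SDL_STRINGIFY_ARG(arg) #arg

/* Broken: # stringifies the token itself, because __FUNCTION__ is not a
   preprocessor macro. */
#define COMPOSE_ERROR_OLD(str) SDL_STRINGIFY_ARG(__FUNCTION__) ", " str
/* Fixed: __FUNCTION__ already yields a string literal under MSVC. */
#define COMPOSE_ERROR_NEW(str) __FUNCTION__ ", " str

int main(void)
{
    printf("%s\n", COMPOSE_ERROR_OLD("failed"));  /* __FUNCTION__, failed */
    printf("%s\n", COMPOSE_ERROR_NEW("failed"));  /* main, failed */
    return 0;
}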
Konrad
It appears that I cannot use SDL_RenderReadPixels on a bound framebuffer (an SDL_Texture set as the render target); it simply returns gibberish data. Drawing that framebuffer onto the default target (the window surface) does render it correctly, however, and the other backends (OpenGL, software, Direct3D) work fine.
It looks to me like D3D11_RenderReadPixels just reads from the swap chain's backbuffer rather than from the currently bound render target.
Here is a patch which actually fetches the current render target and its underlying ID3D11Resource, which is an ID3D11Texture2D.
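A sketch of the approach, assuming d3dContext is the renderer's ID3D11DeviceContext (an illustration of the idea, not the patch verbatim):

#define COBJMACROS
#include <d3d11.h>

static ID3D11Resource *GetCurrentTargetResource(ID3D11DeviceContext *d3dContext)
{
    ID3D11RenderTargetView *rtView = NULL;
    ID3D11Resource *resource = NULL;

    /* Ask D3D11 for the currently bound render target instead of always
       grabbing the swap chain backbuffer. */
    ID3D11DeviceContext_OMGetRenderTargets(d3dContext, 1, &rtView, NULL);
    if (rtView) {
        /* The resource behind the view is the ID3D11Texture2D to read
           pixels from: the SDL_Texture set as render target, or the
           backbuffer when no target is set. */
        ID3D11RenderTargetView_GetResource(rtView, &resource);
        ID3D11RenderTargetView_Release(rtView);
    }
    return resource; /* the caller is responsible for releasing it */
}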
Konrad
This kind of blending is quite useful and in my opinion should be available for all renderers. I need it myself, but since I didn't want to use a custom blend mode which is supported only by certain renderers (e.g. not by the software renderer, which is quite important for me), I wrote an implementation of SDL_BLENDMODE_MUL for all renderers.
SDL_BLENDMODE_MUL implements the following equations:
dstRGB = (srcRGB * dstRGB) + (dstRGB * (1-srcA))
dstA = (srcA * dstA) + (dstA * (1-srcA))
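For comparison, the same equations can be written as a custom blend mode via SDL_ComposeCustomBlendMode, which is exactly the path I wanted to avoid since e.g. the software renderer does not support it (texture here is an existing SDL_Texture):

SDL_BlendMode mul = SDL_ComposeCustomBlendMode(
    SDL_BLENDFACTOR_DST_COLOR,           /* srcColorFactor: srcRGB * dstRGB */
    SDL_BLENDFACTOR_ONE_MINUS_SRC_ALPHA, /* dstColorFactor: dstRGB * (1-srcA) */
    SDL_BLENDOPERATION_ADD,
    SDL_BLENDFACTOR_DST_ALPHA,           /* srcAlphaFactor: srcA * dstA */
    SDL_BLENDFACTOR_ONE_MINUS_SRC_ALPHA, /* dstAlphaFactor: dstA * (1-srcA) */
    SDL_BLENDOPERATION_ADD);
SDL_SetTextureBlendMode(texture, mul);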
Background:
https://i.imgur.com/UsYhydP.png
Blended texture:
https://i.imgur.com/0juXQcV.png
Result for SDL_BLENDMODE_MOD:
https://i.imgur.com/wgNSgUl.png
Result for SDL_BLENDMODE_MUL:
https://i.imgur.com/Veokzim.png
I think I covered all the possibilities in the included patch, but I didn't write any tests for SDL_BLENDMODE_MUL, so it would be lovely if someone could do that.
Konrad
This was rather trivial to add, but it has been asked for at least several times before (I googled it as well).
It should be possible to change the scaling mode of a texture dynamically. It is actually a trivial task, but until now it was only possible with a hint set before creating the texture.
I needed it for my game as well, so I took the liberty of writing it myself.
This patch adds the following functions:
SDL_SetTextureScaleMode(SDL_Texture * texture, SDL_ScaleMode scaleMode);
SDL_GetTextureScaleMode(SDL_Texture * texture, SDL_ScaleMode *scaleMode);
That way you can change texture scaling on the fly.
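A quick usage sketch (texture is an existing SDL_Texture; I assume the usual SDL convention of returning 0 on success, and SDL_ScaleMode values named after the existing hint values):

SDL_ScaleMode mode;

if (SDL_SetTextureScaleMode(texture, SDL_ScaleModeNearest) != 0) {
    SDL_Log("SDL_SetTextureScaleMode failed: %s", SDL_GetError());
}
if (SDL_GetTextureScaleMode(texture, &mode) == 0) {
    SDL_Log("texture now uses scale mode %d", (int)mode);
}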
This means it doesn't have to block while the current frame finishes using the
vertex buffer; it just moves on to the next, probably-not-in-use buffer.
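A minimal sketch of the round-robin idea (the names are illustrative, not SDL's actual fields):

#define NUM_VERTEX_BUFFERS 8

typedef struct {
    void *buffers[NUM_VERTEX_BUFFERS]; /* API-specific buffer handles */
    int current;
} VertexBufferCycle;

static void *NextVertexBuffer(VertexBufferCycle *cycle)
{
    /* Advance to the next buffer instead of waiting on the one the GPU
       may still be reading; by the time we wrap back around to a buffer,
       the GPU has almost certainly finished with it. */
    cycle->current = (cycle->current + 1) % NUM_VERTEX_BUFFERS;
    return cycle->buffers[cycle->current];
}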
Olli-Samuli Lehmus
If one creates a window with the SDL_WINDOW_FULLSCREEN_DESKTOP flag, creates a render target while SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear") is set, and afterwards sets SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest"), then after minimizing the window the scale quality hint is lost on the render target. Textures, however, do keep their interpolation modes.
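A hypothetical repro sketch of that sequence (renderer is an existing SDL_Renderer; the window was created with SDL_WINDOW_FULLSCREEN_DESKTOP):

SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");
SDL_Texture *target = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
                                        SDL_TEXTUREACCESS_TARGET, 640, 480);
SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest");
/* ...minimize and restore the window here: per the report, the render
   target is now sampled as "nearest" rather than the "linear" mode it
   was created with, while ordinary textures keep their filtering. */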
SDL now builds with gcc 7.2 with the following command line options:
-Wall -pedantic-errors -Wno-deprecated-declarations -Wno-overlength-strings --std=c99
New functions get and set the YUV colorspace conversion mode:
SDL_SetYUVConversionMode()
SDL_GetYUVConversionMode()
SDL_GetYUVConversionModeForResolution()
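A brief usage sketch (assuming the conversion-mode enum follows these names):

SDL_SetYUVConversionMode(SDL_YUV_CONVERSION_JPEG);     /* full-range JPEG */
SDL_YUV_CONVERSION_MODE mode = SDL_GetYUVConversionMode();

/* With SDL_YUV_CONVERSION_AUTOMATIC set, the mode actually used depends
   on the video resolution: */
mode = SDL_GetYUVConversionModeForResolution(1920, 1080);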
SDL_ConvertPixels() converts between all supported RGB and YUV formats, with SSE acceleration for converting from planar YUV formats (YV12, NV12, etc) to common RGB/RGBA formats.
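For example, a YV12 frame can be converted to ARGB8888 in one call (for planar YUV formats the pitch passed is the Y-plane pitch in bytes, here the width):

static Uint8 yv12[320 * 240 * 3 / 2]; /* Y plane, then V and U planes */
static Uint32 argb[320 * 240];

/* ...fill yv12 with frame data... */
if (SDL_ConvertPixels(320, 240,
                      SDL_PIXELFORMAT_YV12, yv12, 320,
                      SDL_PIXELFORMAT_ARGB8888, argb, 320 * 4) != 0) {
    SDL_Log("SDL_ConvertPixels failed: %s", SDL_GetError());
}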
Added a new test program, testyuv, to verify correctness and speed of YUV conversion functionality.