Mirror of https://github.com/yuzu-emu/mbedtls.git (synced 2024-12-24 12:25:40 +00:00)
Introduce compile-time option to always flush X.509 CRT caches
This commit introduces a compile-time option MBEDTLS_X509_ALWAYS_FLUSH which controls whether releasing CRT frames or public key contexts associated with X.509 CRTs (or, in the future, other cached parsed X.509 structures) frees those structures immediately.

Enabling this alongside MBEDTLS_X509_ON_DEMAND_PARSING leads to a significant reduction in the average RAM consumption of Mbed TLS.

The option is enabled by default to reduce the permanent RAM overhead of MBEDTLS_X509_ON_DEMAND_PARSING in case the latter is *disabled* (the default). (Note that enabling MBEDTLS_X509_ALWAYS_FLUSH carries very little performance penalty when MBEDTLS_X509_ON_DEMAND_PARSING is disabled, because hardly any parsing needs to be done to set up a CRT frame / PK context from the legacy `mbedtls_x509_crt` structure.)
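As an illustration only, the sketch below shows how the two options might be combined in a user configuration to trade X.509 performance for lower average RAM usage. Only the two MBEDTLS_X509_* macro names come from this commit; placing them in a custom configuration header, and the header itself, are assumptions.

    /* Hypothetical user configuration sketch: enable on-demand parsing of
     * X.509 CRTs and flush the resulting caches immediately after use.
     * Where these defines live (e.g. a custom config header) is an
     * assumption; only the macro names are taken from this change. */
    #define MBEDTLS_X509_ON_DEMAND_PARSING
    #define MBEDTLS_X509_ALWAYS_FLUSH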
Parent: c6d1c3ed1c
Commit: ffcd8c39a4
@@ -1789,6 +1789,22 @@
  */
 //#define MBEDTLS_X509_ON_DEMAND_PARSING
 
+/**
+ * \def MBEDTLS_X509_ALWAYS_FLUSH
+ *
+ * Save RAM by having Mbed TLS always flush caches for parsed X.509
+ * structures after use: This means, firstly, that caches of X.509
+ * structures used by an API call are flushed when the call returns,
+ * but it also encompasses immediate flushing of caches when Mbed TLS uses
+ * multiple structures in succession, thereby reducing the peak RAM usage.
+ * Setting this option leads to minimal RAM usage of the X.509 module at
+ * the cost of performance penalties when using X.509 structures multiple
+ * times (such as trusted CRTs on systems serving many connections).
+ *
+ * Uncomment this to always flush caches for unused X.509 structures.
+ */
+#define MBEDTLS_X509_ALWAYS_FLUSH
+
 /**
  * \def MBEDTLS_X509_ALLOW_EXTENSIONS_NON_V3
  *
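To make the intended semantics concrete, here is a hypothetical sketch of how a cache-release path could honor the new option. The type x509_crt_cache_t and the helper below are illustrative assumptions, not the library's actual internals; only the macro MBEDTLS_X509_ALWAYS_FLUSH is taken from this commit.

    #include <stdlib.h>

    /* Hypothetical sketch (not the library's actual code): releasing a
     * cached CRT frame frees it immediately when MBEDTLS_X509_ALWAYS_FLUSH
     * is set, and keeps it cached for later reuse otherwise. */
    typedef struct
    {
        void *frame;   /* parsed CRT frame, NULL if not currently cached  */
        void *pk;      /* parsed public key context, NULL if not cached   */
    } x509_crt_cache_t;

    static void x509_crt_cache_frame_release( x509_crt_cache_t *cache )
    {
    #if defined(MBEDTLS_X509_ALWAYS_FLUSH)
        /* Minimal RAM: drop the frame as soon as the caller is done with it. */
        free( cache->frame );
        cache->frame = NULL;
    #else
        /* Better performance: keep the frame cached for subsequent calls. */
        ((void) cache);
    #endif
    }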