In pk_sign_verify, if mbedtls_pk_sign() failed, sig_len was passed to
mbedtls_pk_verify_restartable() without having been initialized. This
worked only because, in the only test case where signing is expected to
fail, the verify implementation doesn't look at sig_len before failing
for the expected reason.
The value of sig_len if sign() fails is undefined, so set sig_len to
something sensible.
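As a minimal sketch of the fix (illustrative parameter names, not the actual
test code, and assuming the 2.x-era mbedtls_pk_sign() signature without a
separate signature buffer size parameter):
#include "mbedtls/pk.h"
/* Make sure sig_len has a defined value even when signing fails, before
 * it is passed on to mbedtls_pk_verify_restartable(). */
static int sign_with_defined_length(mbedtls_pk_context *pk,
                                    mbedtls_md_type_t md_alg,
                                    const unsigned char *hash, size_t hash_len,
                                    unsigned char *sig, size_t *sig_len,
                                    int (*f_rng)(void *, unsigned char *, size_t),
                                    void *p_rng)
{
    int ret = mbedtls_pk_sign(pk, md_alg, hash, hash_len,
                              sig, sig_len, f_rng, p_rng);
    if (ret != 0)
        *sig_len = 0;   /* the value left behind by a failed sign() is undefined */
    return ret;
}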
Add pk_write test cases where the ASN.1 INTEGER encoding of the
private value would not have the mandatory size for the OCTET STRING
that contains the value.
ec_256_long_prv.pem is a random secp256r1 private key, selected so
that the private value is >= 2^255, i.e. the top bit of the first byte
is set (which would cause the INTEGER encoding to have an extra
leading 0 byte).
ec_521_short_prv.pem is a random secp521r1 private key, selected so
that the private value is < 2^519, i.e. the first byte is 0 and the
top bit of the second byte is 0 (which would cause the INTEGER
encoding to have one less 0 byte at the start).
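As a hedged illustration of the underlying DER rule (this is not the pk_write
test code; it only shows why these private values are interesting): a value
whose leading byte has the top bit set gains an extra 0x00 byte when written
as an ASN.1 INTEGER, whereas the SEC1 OCTET STRING for an EC private value
always uses the fixed, curve-sized length.
#include "mbedtls/asn1write.h"
#include "mbedtls/bignum.h"
int integer_encoding_length(void)
{
    mbedtls_mpi X;
    unsigned char buf[8];
    unsigned char *p = buf + sizeof(buf);
    int len;
    mbedtls_mpi_init(&X);
    mbedtls_mpi_read_string(&X, 16, "F0");      /* top bit of the first byte is set */
    len = mbedtls_asn1_write_mpi(&p, buf, &X);  /* emits 02 02 00 F0 */
    mbedtls_mpi_free(&X);
    return len;                                 /* 4: a leading 0x00 was added */
}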
Whether the default entropy nonce length is zero or nonzero depends on
the desired security strength and the entropy length.
The implementation calculates the actual entropy nonce length from the
actual entropy length, and therefore it doesn't need a constant that
indicates the default entropy nonce length. A portable application may
be interested in this constant, however. And our test code could
definitely use it.
Define a constant MBEDTLS_CTR_DRBG_ENTROPY_NONCE_LEN and use it in
test code. Previously, test_suite_ctr_drbg had the default entropy
nonce length hard-coded, and test_suite_psa_crypto_init failed. Now
both use MBEDTLS_CTR_DRBG_ENTROPY_NONCE_LEN.
This change means that the test ctr_drbg_entropy_usage no longer
validates that the default entropy nonce length is sensible. So add a
new test that checks that the default entropy length and the default
entropy nonce length are sufficient to ensure the expected security
strength.
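A hedged sketch of the kind of check such a test can make (the actual test
body is not reproduced here; using MBEDTLS_CTR_DRBG_KEYSIZE as the security
strength in bytes is this sketch's assumption):
#include <stddef.h>
#include "mbedtls/ctr_drbg.h"
/* SP 800-90A wants the entropy input to cover the security strength and
 * the nonce to cover half of it, so the initial seeding material should
 * total at least 3/2 times the strength. */
int default_seeding_is_strong_enough(void)
{
    size_t strength = MBEDTLS_CTR_DRBG_KEYSIZE;   /* 16 or 32 bytes */
    size_t total = MBEDTLS_CTR_DRBG_ENTROPY_LEN +
                   MBEDTLS_CTR_DRBG_ENTROPY_NONCE_LEN;
    return total >= strength + strength / 2;
}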
Change the default entropy nonce length to be nonzero in some cases.
Specifically, the default nonce length is now set in such a way that
the entropy input during the initial seeding always contains enough
entropy to achieve the maximum possible security strength per
NIST SP 800-90A given the key size and entropy length.
If MBEDTLS_CTR_DRBG_ENTROPY_LEN is kept to its default value,
mbedtls_ctr_drbg_seed() now grabs extra entropy for a nonce if
MBEDTLS_CTR_DRBG_USE_128_BIT_KEY is disabled and either
MBEDTLS_ENTROPY_FORCE_SHA256 is enabled or MBEDTLS_SHA512_C is
disabled. If MBEDTLS_CTR_DRBG_USE_128_BIT_KEY is enabled, or if
the entropy module uses SHA-512, then the default value of
MBEDTLS_CTR_DRBG_ENTROPY_LEN does not require a second call to the
entropy function to achieve the maximum security strength.
This choice of default nonce size guarantees NIST compliance with the
maximum security strength while keeping backward compatibility and
performance high: in configurations that do not require grabbing more
entropy, the code will not grab more entropy than before.
mbedtls_ctr_drbg_seed() always set the entropy length to the default,
so a call to mbedtls_ctr_drbg_set_entropy_len() before seed() had no
effect. Change this to the more intuitive behavior: set_entropy_len()
sets the entropy length, seed() respects it, and seed() only falls back
to the default entropy length if set_entropy_len() was not called.
This removes the need for the test-only function
mbedtls_ctr_drbg_seed_entropy_len(): just call
mbedtls_ctr_drbg_set_entropy_len() followed by
mbedtls_ctr_drbg_seed(), which now works as expected.
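A hedged usage sketch (error handling omitted; the entropy length of 48 bytes
is just an example value):
#include "mbedtls/ctr_drbg.h"
#include "mbedtls/entropy.h"
int seed_with_custom_entropy_len(mbedtls_ctr_drbg_context *ctr_drbg,
                                 mbedtls_entropy_context *entropy)
{
    mbedtls_ctr_drbg_init(ctr_drbg);
    mbedtls_entropy_init(entropy);
    /* Request 48 bytes of entropy per (re)seed instead of the default. */
    mbedtls_ctr_drbg_set_entropy_len(ctr_drbg, 48);
    /* seed() now honors the length set above. */
    return mbedtls_ctr_drbg_seed(ctr_drbg, mbedtls_entropy_func, entropy,
                                 NULL, 0);
}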
Consolidate the invalid-handle tests from test_suite_psa_crypto and
test_suite_psa_crypto_slot_management. Start with the code in
test_suite_psa_crypto_slot_management and adapt it to test one invalid
handle value per run of the test function.
mbedtls_asn1_get_int() and mbedtls_asn1_get_mpi() behave differently
on negative INTEGERs. Don't change the library behavior for now
because this might break interoperability in some applications. Change
the test function to match the library behavior.
Fix the test data with negative INTEGERs. These test cases were
previously not run (they were introduced but deliberately deactivated
in 27d806fab4). The test data was
actually wrong: ASN.1 uses two's complement, which has no negative 0,
and some of the encodings were invalid. Now the tests have correct data, and
the test code rectifies the expected data to match the library
behavior.
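A hedged sketch that exercises the difference (the snippet only prints the
two return codes; the exact values are whatever the library returns, which is
the behavior the test function now mirrors):
#include <stdio.h>
#include "mbedtls/asn1.h"
#include "mbedtls/bignum.h"
int main(void)
{
    unsigned char der[] = { 0x02, 0x01, 0x80 };   /* DER INTEGER -128 */
    unsigned char *p;
    int val;
    mbedtls_mpi X;
    mbedtls_mpi_init(&X);
    p = der;
    printf("get_int: %d\n", mbedtls_asn1_get_int(&p, der + sizeof(der), &val));
    p = der;
    printf("get_mpi: %d\n", mbedtls_asn1_get_mpi(&p, der + sizeof(der), &X));
    mbedtls_mpi_free(&X);
    return 0;
}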
mbedtls_asn1_get_int() and mbedtls_asn1_get_mpi() behave differently
on an empty INTEGER (0200). Don't change the library behavior for now
because this might break interoperability in some applications. Write
a test function that matches the library behavior.
When the asn1parse module is enabled but the bignum module is
disabled, the asn1parse test suite did not work. Fix this.
* Fix a syntax error in get_integer() (label immediately followed by a
closing brace; see the sketch after this list).
* Fix an unused variable in get_integer().
* Fix `TEST_ASSERT( *p == q );` in nested_parse() failing because `*p`
was not set.
* Fix nested_parse() not outputting the length of what it parsed.
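As a hedged illustration of the label fix from the first bullet (a minimal
stand-in function, not the real get_integer() body): in C, a label must be
followed by a statement, so a label right before the closing brace does not
compile, and an empty statement after the label fixes it.
void label_fix_sketch(void)
{
    goto exit;
exit:
    ;   /* empty statement, so the label is not immediately followed by '}' */
}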
Add some ECDSA test cases where the hash is shorter or longer than the
key length, to check that the API doesn't enforce a relationship
between the two.
For the sign_deterministic tests, the keys are
tests/data_files/ec_256_prv.pem and tests/data_files/ec_384_prv.pem
and the signatures were obtained with Python Cryptodome:
from binascii import hexlify, unhexlify
from Crypto.Hash import SHA256, SHA384
from Crypto.PublicKey import ECC
from Crypto.Signature import DSS
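# k2 is the P-256 key (ec_256_prv.pem); it signs a SHA-384 hash, longer than the 32-byte key.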
k2 = ECC.import_key(unhexlify("3077020101042049c9a8c18c4b885638c431cf1df1c994131609b580d4fd43a0cab17db2f13eeea00a06082a8648ce3d030107a144034200047772656f814b399279d5e1f1781fac6f099a3c5ca1b0e35351834b08b65e0b572590cdaf8f769361bcf34acfc11e5e074e8426bdde04be6e653945449617de45"))
SHA384.new(b'hello').hexdigest()
hexlify(DSS.new(k2, 'deterministic-rfc6979').sign(SHA384.new(b'hello')))
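# k3 is the P-384 key (ec_384_prv.pem); it signs a SHA-256 hash, shorter than the 48-byte key.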
k3 = ECC.import_key(unhexlify("3081a402010104303f5d8d9be280b5696cc5cc9f94cf8af7e6b61dd6592b2ab2b3a4c607450417ec327dcdcaed7c10053d719a0574f0a76aa00706052b81040022a16403620004d9c662b50ba29ca47990450e043aeaf4f0c69b15676d112f622a71c93059af999691c5680d2b44d111579db12f4a413a2ed5c45fcfb67b5b63e00b91ebe59d09a6b1ac2c0c4282aa12317ed5914f999bc488bb132e8342cc36f2ca5e3379c747"))
SHA256.new(b'hello').hexdigest()
hexlify(DSS.new(k3, 'deterministic-rfc6979').sign(SHA256.new(b'hello')))
Add invasive checks that peek at the stored persistent data after some
successful import, generation or destruction operations and after
reinitialization to ensure that the persistent data in storage has the
expected content.
Add a parameter to the p_validate_slot_number method to allow the
driver to modify the persistent data.
With the current structure of the core, the persistent data is already
updated. All it took was adding a way to modify it.
When registering a key in a secure element, go through the transaction
mechanism. This makes the code simpler, at the expense of a few extra
storage operations. Given that registering a key is typically very
rare over the lifetime of a device, this is an acceptable loss.
Drivers must now have a p_validate_slot_number method, otherwise
registering a key is not possible. This reduces the risk that due to a
mistake during the integration of a device, an application might claim
a slot in a way that is not supported by the driver.
If none of the inputs to a key derivation is a
PSA_KEY_DERIVATION_INPUT_SECRET passed with
psa_key_derivation_input_key(), forbid
psa_key_derivation_output_key(). It usually doesn't make sense to
derive a key object if the secret isn't itself a proper key.
After passing some inputs, try getting one byte of output, just to
check that this succeeds (for a valid sequence of inputs) or fails
with BAD_STATE (for an invalid sequence of inputs). Either output a
1-byte key or a 1-byte buffer depending on the test data.
The test data was expanded as follows:
* Output key type (or not a key): same as the SECRET input if success
is expected, otherwise NONE.
* Expected status: PSA_SUCCESS after valid inputs, BAD_STATE after any
invalid input.
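A hedged sketch of the restriction (the handle-based
psa_key_derivation_output_key() signature of that era is assumed, attribute
setup is omitted, and the exact error code is not stated above;
PSA_ERROR_NOT_PERMITTED is this sketch's assumption):
#include <psa/crypto.h>
void output_key_needs_a_key_secret(const uint8_t *secret, size_t secret_len,
                                   const psa_key_attributes_t *attributes)
{
    psa_key_derivation_operation_t op = PSA_KEY_DERIVATION_OPERATION_INIT;
    psa_key_handle_t handle;
    uint8_t out[1];
    psa_key_derivation_setup(&op, PSA_ALG_HKDF(PSA_ALG_SHA_256));
    /* SECRET fed as raw bytes, not via psa_key_derivation_input_key(). */
    psa_key_derivation_input_bytes(&op, PSA_KEY_DERIVATION_INPUT_SECRET,
                                   secret, secret_len);
    psa_key_derivation_input_bytes(&op, PSA_KEY_DERIVATION_INPUT_INFO,
                                   NULL, 0);
    /* Reading raw output is still allowed... */
    psa_key_derivation_output_bytes(&op, out, sizeof(out));
    /* ...but deriving a key object is now rejected
     * (expected: PSA_ERROR_NOT_PERMITTED). */
    psa_key_derivation_output_key(attributes, &op, &handle);
    psa_key_derivation_abort(&op);
}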
Allow a direct input as the SECRET input step in a key derivation, in
addition to allowing DERIVE keys. This makes it easier for
applications to run a key derivation where the "secret" input is
obtained from somewhere else. It also makes it possible for the "secret"
input to be empty (keys cannot be empty), which some protocols require
(for example, the IV derivation in EAP-TLS).
Conversely, allow a RAW_DATA key as the INFO/LABEL/SALT/SEED input to a key
derivation, in addition to allowing direct inputs. This doesn't
improve security, but removes a step when a personalization parameter
is stored in the key store, and allows this personalization parameter
to remain opaque.
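A hedged sketch of the two relaxations together (handle-based API of that era
assumed; creation and usage policy of the stored salt key are omitted):
#include <psa/crypto.h>
void flexible_derivation_inputs(psa_key_handle_t raw_data_salt_key)
{
    psa_key_derivation_operation_t op = PSA_KEY_DERIVATION_OPERATION_INIT;
    psa_key_derivation_setup(&op, PSA_ALG_HKDF(PSA_ALG_SHA_256));
    /* SALT taken from a PSA_KEY_TYPE_RAW_DATA key kept in the key store. */
    psa_key_derivation_input_key(&op, PSA_KEY_DERIVATION_INPUT_SALT,
                                 raw_data_salt_key);
    /* SECRET provided directly; it may even be empty, as in EAP-TLS. */
    psa_key_derivation_input_bytes(&op, PSA_KEY_DERIVATION_INPUT_SECRET,
                                   NULL, 0);
    psa_key_derivation_input_bytes(&op, PSA_KEY_DERIVATION_INPUT_INFO,
                                   NULL, 0);
    /* ... read the derived output with psa_key_derivation_output_bytes() ... */
    psa_key_derivation_abort(&op);
}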
Add test cases that explore step/key-type-and-keyhood combinations.
This commit only makes derive_input more flexible so that the key
derivation API can be tested with different key types and raw data for
each input step. The behavior of the test cases remains the same.
In the tests, the uint32 is given as a big-endian stream; however, the
char buffer that collected the stream was used as-is, without any
conversion. Add a temporary buffer, call `greentea_getc()` 8 times to
fill it, and then put the bytes in the correct endianness before passing
them to `unhexify()`.
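One way to achieve this (a hedged sketch, not the exact code: it assumes the
usual greentea_getc()/unhexify() declarations are in scope, and the function
name receive_uint32() is only illustrative):
#include <stddef.h>
#include <stdint.h>
static uint32_t receive_uint32(void)
{
    char hex[9];
    uint8_t bytes[4];
    size_t i;
    /* Collect the 8 hex characters of the big-endian uint32 first. */
    for (i = 0; i < 8; i++)
        hex[i] = (char) greentea_getc();
    hex[8] = '\0';
    unhexify(bytes, hex);
    /* Assemble the value explicitly so the big-endian stream is
     * interpreted correctly regardless of the target's endianness. */
    return ((uint32_t) bytes[0] << 24) | ((uint32_t) bytes[1] << 16) |
           ((uint32_t) bytes[2] << 8)  |  (uint32_t) bytes[3];
}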