
To quote Ilari on the CFRG mailing list:

> e.g. the vile mess that is ECDSA with SHA-3

I have some ideas about what he might mean, but I couldn't find the discussion. So what is that mess?

Elias

1 Answer


The issue seems to be that ECDSA uses bit strings in several places, while SHA-3 uses a surprising bit-string-to-octet-string conversion. This is exacerbated by the fact that the original ANSI X9.62 ECDSA spec is not freely available.

The strangeness of SHA-3 seems to be its bit ordering within bytes: FIPS 202 defines the input as a bit string, and its conversion places the first bit of the message in the *least* significant bit of the first byte, the opposite of the MSB-first convention used pretty much everywhere else. Messages are defined as bits, but computers deal in bytes, so a bit-level message converted under the wrong convention does not merely have a different endianness; the bits within every byte come out reversed.
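The difference is easy to see in a small sketch. The two packing helpers below are hypothetical illustrations (not taken from any spec): one packs a bit string MSB-first, the other LSB-first in the style of FIPS 202's hex-to-bit-string convention, and the resulting byte strings, and hence their SHA3-256 digests, differ:

```python
import hashlib

# A 16-bit message, written as bits in the order they appear "on paper".
bits = [1, 0, 1, 1, 0, 0, 1, 0,  0, 1, 1, 1, 1, 1, 1, 1]

def pack_msb_first(bits):
    # Conventional packing: the first bit becomes the MOST significant
    # bit of byte 0 (as in SHA-2 and most other specs).
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

def pack_lsb_first(bits):
    # SHA-3-style packing: the first bit becomes the LEAST significant
    # bit of byte 0, so each byte's bits are reversed relative to above.
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for j, b in enumerate(bits[i:i + 8]):
            byte |= b << j
        out.append(byte)
    return bytes(out)

m1 = pack_msb_first(bits)   # b'\xb2\x7f'
m2 = pack_lsb_first(bits)   # b'\x4d\xfe'
assert m1 != m2
# Different byte strings give different digests, even though both came
# from the same abstract bit string.
assert hashlib.sha3_256(m1).digest() != hashlib.sha3_256(m2).digest()
```

For byte-oriented messages (the common case) none of this matters: the same input bytes always produce the same output bytes. The ambiguity only bites when a protocol defines its messages at the bit level.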

I cannot easily verify this, but it is worth being aware of when implementing, so I am posting it anyway.

Elias
    The modern ECDSA spec is available in SEC1. Standard curves are in SEC2. What would be parameters or circumstances where the problem that you mention happens? AFAIK the only thing that ECDSA hashes is the message, and as long as these are bytes, there won't be a problem at the input. So I see no link between whatever peculiarity SHA-3 has on input, and ECDSA. – fgrieu Aug 31 '22 at 17:33
  • 3
    Yeah, same objection worded differently: as long as the input & output in bytes is specified by the algorithm, it doesn't matter if it is big or little endian internally. To me it was a mistake to require the SHA-3 algorithms to be specified in little endian even though x86 is little endian and ARM almost always runs in little endian mode. However, in the end that doesn't make a difference as long as the input & output is binary & canonical. The same input in bytes will always generate the same output in bytes, indexed zero to the output size. – Maarten Bodewes Feb 26 '23 at 12:40