
I am currently storing hashed passwords for my website. Passwords are salted and then hashed with SHA3-512. I understand that the security level of SHA3-512, as for all Keccak-based algorithms, is half the capacity (as explained here), which is 512 bits in this case.

My question is: how does re-hashing the resulting hash n times affect the security? Is there a way to compute that effectively? In the scenario of someone stealing the database, what difference would it make to have the passwords hashed, say, 500 times versus just once?

I have seen a similar discussion about entropy after re-hashing here, but they don't address security.

KurtWegner

1 Answer


By using more iterations you increase the work factor an attacker must pay to conduct a brute-force attack. This is a good thing: more is generally better, as long as it doesn't hurt the responsiveness users experience at login. 10,000 iterations would be considered a low number at this point; 50,000 or 100,000 would be preferable.
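To make the work-factor idea concrete, here is a minimal sketch in Python of what re-hashing n times looks like (the `iterate_hash` helper and parameter values are hypothetical, chosen for illustration; this shows the cost scaling, not a recommendation over a dedicated password hash):

```python
import hashlib
import os

def iterate_hash(password: bytes, salt: bytes, iterations: int) -> bytes:
    """Naive iterated SHA3-512: each guess costs the attacker
    the same `iterations` hash calls it costs the defender."""
    digest = password + salt
    for _ in range(iterations):
        digest = hashlib.sha3_512(digest).digest()
    return digest

salt = os.urandom(16)
# 100,000 hash calls per verification -- and per attacker guess
stored = iterate_hash(b"hunter2", salt, 100_000)
```

Doubling `iterations` doubles the attacker's per-guess cost, which is exactly the linear work-factor effect described above.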

Indeed, one should consider the usual advice for password hashing:

  • A salt, stored in the DB, to defeat rainbow-table attacks.
  • A pepper, stored on the application server, so that a DB-only compromise fails.
  • A work factor, to increase the cost of brute-forcing users' passwords.
  • Large memory requirements, to resist massively parallel attacks such as those run on ASICs.
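One common way to apply the salt-plus-pepper advice above is to use the pepper as an HMAC key that never touches the database; the `PEPPER` value and `peppered_hash` helper here are hypothetical, a sketch of the idea rather than a complete scheme (a real deployment would still add a work factor on top):

```python
import hashlib
import hmac
import os

# Hypothetical pepper: kept in app-server config, never stored in the DB.
PEPPER = b"app-server-secret"

def peppered_hash(password: bytes, salt: bytes) -> bytes:
    # HMAC keyed with the pepper: a stolen DB (hashes + salts)
    # alone is not enough to verify password guesses offline.
    return hmac.new(PEPPER, password + salt, hashlib.sha3_512).digest()

salt = os.urandom(16)       # per-user salt, stored in the DB
stored = peppered_hash(b"hunter2", salt)
```

An attacker with only the database must also breach the application server to obtain `PEPPER` before any brute forcing can start.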

Good candidates for password hashing are PBKDF2, scrypt, and Argon2id; the last was the winner of the Password Hashing Competition and should be preferred whenever available.
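For reference, two of these are available directly in Python's standard library (Argon2id requires a third-party package such as `argon2-cffi`); the parameter choices below are illustrative only, not tuned recommendations:

```python
import hashlib
import os

salt = os.urandom(16)

# PBKDF2-HMAC-SHA256: the iteration count is the work factor.
dk_pbkdf2 = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100_000)

# scrypt: n sets the CPU/memory cost, r the block size, p the parallelism,
# so it also carries the memory-hardness property mentioned above.
dk_scrypt = hashlib.scrypt(b"hunter2", salt=salt, n=2**14, r=8, p=1)
```

Both calls are deterministic for a given password and salt, so verification is just re-deriving the key and comparing it (in constant time) with the stored value.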

kelalaka
Swashbuckler
  • I see. So it's the "same security strength" but it'll make it more time consuming for the attacker, so the point is to try and make it unfeasible. I'll upvote as soon as I get enough reputation. – KurtWegner Mar 30 '20 at 19:12