>Hmm. I am reminded of Knuth's famous dictum: "never generate random
numbers with a method chosen at random". Is there any actual theory
behind that algorithm, and if so what is it? The combination of
shifting with addition (not xor) seems more likely to lead to weird
cancellations than any improvement in the hash behavior.
For what it's worth: the standard way in Java is to multiply by 31, and multiplying by 31 amounts to a 5-bit shift and a subtraction.
In general a prime number is recommended, though one would wish Knuth had done some analysis here; it all seems mostly empirical.
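As a minimal sketch of that 31-multiplier scheme (the class and method names here are mine, not the JDK's):

```java
// Hypothetical reimplementation of Java's String.hashCode() polynomial:
// h = 31*h + c for each character, written with the shift-and-subtract
// form that 31*h reduces to.
public class Hash31 {
    static int hash(String s) {
        int h = 0;
        for (int i = 0; i < s.length(); i++) {
            // (h << 5) - h == 31 * h
            h = (h << 5) - h + s.charAt(i);
        }
        return h;
    }

    public static void main(String[] args) {
        // Agrees with the JDK's own value for this input
        System.out.println(hash("hello") == "hello".hashCode());  // true
    }
}
```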
Perhaps the 5-bit shift is related to the spread of ASCII code points, as that hashCode() implementation was mainly tested and implemented for String. But now it is also suggested for general objects, and it is even specified for the standard collections: for example