I am studying number theory for cryptography and I got confused by the log notation used.
In one video, Prof. Jonathan Katz refers to the length of a number as the number of bits needed to represent it. The length is denoted by two vertical bars surrounding the number, and it is considered equal to the logarithm of the number's magnitude. See the expression below:
$$\|a\| = O(\log a), \qquad a = 2^{\|a\|}$$
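As I understand it, this relation can be checked numerically: for a positive integer $a$, the bit length is $\lfloor \log_2 a \rfloor + 1$, which is $O(\log a)$. A small sketch (using Python's built-in `int.bit_length`, which I assume matches the notion of length in the lectures):

```python
import math

# Sketch: the "length" ||a|| of a positive integer a is its bit count,
# i.e. ||a|| = floor(log2(a)) + 1, so ||a|| = O(log a).
for a in [1, 2, 255, 256, 10**6]:
    length = a.bit_length()  # number of bits needed to represent a
    assert length == math.floor(math.log2(a)) + 1
    print(a, length)  # e.g. 255 -> 8, 256 -> 9
```

This also shows why $a = 2^{\|a\|}$ only holds up to rounding: $2^{\|a\|-1} \le a < 2^{\|a\|}$.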
My question is about the notation used for that logarithm. I had always thought that a log appearing without a subscript referred to $\log_{10}$; however, in the example above it refers to $\log_2$. Then I googled and found the following:
- $x = \log y$ often means $x = \log_e y$ in mathematics texts.
- $x = \log y$ often means $x = \log_{10} y$ in science and engineering texts.
- $x = \log y$ often means $x = \log_2 y$ in computer science texts.
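Part of what confuses me is that, by the change-of-base formula $\log_b x = \ln x / \ln b$, these conventions only differ by a constant factor, which I believe is why the base is irrelevant inside big-O notation. A quick check:

```python
import math

# Sketch: logarithms in different bases differ only by a constant
# multiplicative factor (change of base: log_b x = ln x / ln b).
x = 1000.0
ln, log10, log2 = math.log(x), math.log10(x), math.log2(x)
print(ln, log10, log2)

# The ratios are constants independent of x:
assert math.isclose(log2 / ln, 1 / math.log(2))
assert math.isclose(log10 / ln, 1 / math.log(10))
```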
How consistently are these conventions followed in practice, and why do different fields use different notations?