
Measuring complexity

Many algorithms take an integer n, or two integers m and n, as input, e.g. addition, multiplication, exponentiation, factorisation and primality testing. When we want to describe or analyse the `easiness' or `hardness' of such an algorithm, we want to measure its `running time' as a function of the `size' of the input value(s).

If the input value is n, then it is usual to use the number of (decimal) digits, or bits (binary digits), required to store n as a measure of the size of n.

Given input n, the number of decimal digits in n is given by

$\lfloor \log_{10} n \rfloor + 1,$

where $\lfloor x \rfloor$, pronounced `floor of x', denotes the greatest integer less than or equal to x.
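For instance, n = 4096 has $\lfloor \log_{10} 4096 \rfloor + 1 = 3 + 1 = 4$ decimal digits. A minimal Python sketch of this formula (the helper name decimal_digits is ours, not from the source); note that for very large n, floating-point rounding in math.log10 can be off by one near powers of 10, whereas len(str(n)) is always exact:

    import math

    def decimal_digits(n: int) -> int:
        # floor(log10 n) + 1 decimal digits for a positive integer n
        return math.floor(math.log10(n)) + 1

    # Cross-check against the exact digit count:
    assert decimal_digits(4096) == 4 == len(str(4096))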

The number of binary digits, or bits, is similarly given by

$\lfloor \lg n \rfloor + 1,$

where we use the abbreviation $\lg x$ for $\log_2 x$ (this notation is common, but not completely standard).
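An analogous sketch for bits (again, the name binary_digits is ours); Python's built-in int.bit_length() returns exactly $\lfloor \lg n \rfloor + 1$ for positive n, so it serves as an exact cross-check:

    import math

    def binary_digits(n: int) -> int:
        # floor(lg n) + 1 bits for a positive integer n
        return math.floor(math.log2(n)) + 1

    # e.g. 512 <= 1000 < 1024, so 1000 needs floor(lg 1000) + 1 = 10 bits
    assert binary_digits(1000) == 10 == (1000).bit_length()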
