What is the complexity of converting a very large n-bit number to a decimal representation?
My thought is that the elementary algorithm of repeated integer division, taking the remainder at each step to get each digit, would have O(M(n) log n) complexity, where M(n) is the complexity of multiplying two n-bit numbers. However, each division here is not between two n-bit numbers but between one n-bit number and a small constant, so it seems to me the complexity could be smaller.
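For concreteness, here is a minimal sketch of the elementary repeated-division approach described above, relying on Python's built-in arbitrary-precision integers (the divisor is the small constant 10, the base of the target representation):

```python
def to_decimal(x: int) -> str:
    """Convert a non-negative integer to its decimal string by repeatedly
    dividing by 10 and collecting the remainders (least significant digit first)."""
    if x == 0:
        return "0"
    digits = []
    while x > 0:
        x, r = divmod(x, 10)              # one division by the small constant 10
        digits.append(chr(ord('0') + r))  # remainder is the next decimal digit
    return ''.join(reversed(digits))      # remainders come out in reverse order

# Example: convert a large (1001-bit) number
print(to_decimal(2**1000))
```

Each pass of the loop divides an O(n)-bit number by a constant, and the loop runs once per decimal digit, i.e. Θ(n) times, which is what makes the cost of constant-divisor division the interesting part of the question.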