
I am a bit confused about calculating complexities.

Below is a C++ program that converts a char array to an int, increments the value, and converts it back to a char array.

#include <iostream>

int main() {
    char number[] = "431";                   // null-terminated so cout << number is safe
    const int n = (int)sizeof(number) - 1;   // digit count, excluding the '\0'
    int num = 0;

    // char to int conversion
    for (int i = 0; i < n; i++) {
        num += number[i] - '0';
        num *= 10;
    }
    num /= 10;  // undo the one extra multiplication from the last iteration

    // incrementation
    num++;

    // int to char conversion
    for (int i = n - 1; i >= 0; i--) {
        number[i] = '0' + num % 10;
        num /= 10;
    }

    // printing the result
    std::cout << number << std::endl;
    return 0;
}
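As an aside: the first loop multiplies once too often, which is why the trailing num /= 10; is needed afterwards. A minimal sketch of the more common Horner-style accumulation, which avoids that correction (assuming the same number and n as in the program above, it replaces the first loop together with the num /= 10; line):

    int num = 0;
    for (int i = 0; i < n; i++) {
        // shift the digits read so far one decimal place left, then add the new digit
        num = num * 10 + (number[i] - '0');
    }

Either way it is a single pass over the n digits, so the complexity question below is unaffected.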

Now let's say the digit count (3 here) is n. In that case I would say the complexity is O(n + n), which is O(2n). However, I've heard that O(2n) is actually O(n), but I could not find an actual source for this. What is the time complexity of this program?


1 Answer


The way $O(f(n))$ is defined, $O(k \cdot f(n)) = O(f(n))$ for any positive constant $k$; therefore $O(2n) = O(n)$. The reason this works is that $f(n) = O(g(n))$ if and only if there exist constants $c > 0$ and $n_0$ such that $f(n) \leq c \cdot g(n)$ for all $n \geq n_0$. We can therefore show that $k \cdot f(n) = O(f(n))$, since $k \cdot f(n) \leq c \cdot f(n)$ holds for any $c \geq k$.
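As a concrete worked instance for your program: each of the two loops runs $n$ times, so the program performs on the order of $2n$ constant-time steps. Taking $f(n) = 2n$ and $g(n) = n$, the constants $c = 2$ and $n_0 = 1$ witness the definition:

$$2n \leq 2 \cdot n \quad \text{for all } n \geq 1,$$

so $2n = O(n)$, and the program runs in $O(n)$ time.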
