I am a bit confused about calculating complexities.
Below is a C++ program that converts a char array into an int, increments the value, and converts it back into a char array.
#include <iostream>

int main() {
    char number[] = {'4', '3', '1', '\0'}; // null-terminated so it can be printed as a C string
    const int n = (int)sizeof(number) - 1; // number of digits (3)
    int num = 0;
    // char to int conversion
    for (int i = 0; i < n; i++) {
        num += number[i] - '0';
        num *= 10;
    }
    num /= 10;
    // incrementation
    num++;
    // int to char conversion
    for (int i = n - 1; i >= 0; i--) {
        number[i] = '0' + num % 10;
        num /= 10;
    }
    // printing the result
    std::cout << number << std::endl;
    return 0;
}
Now let's say the array size (3 in this case) is n. In that case I would say the complexity is O(n + n), which is O(2n). However, I've heard that O(2n) is actually the same as O(n), but I could not find an actual source explaining why. What is the time complexity of this program?
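To make my counting explicit, here is a stripped-down sketch that just tallies the loop iterations instead of doing the conversion (assuming each iteration counts as one constant-time step, which may not be the right way to measure it):

#include <iostream>

int main() {
    const int n = 3;   // the array size from my program
    long steps = 0;    // how many loop iterations run in total

    // char to int conversion loop: runs n times
    for (int i = 0; i < n; i++)
        steps++;

    // int to char conversion loop: also runs n times
    for (int i = n - 1; i >= 0; i--)
        steps++;

    std::cout << steps << std::endl; // prints 6, i.e. 2 * n for n = 3
    return 0;
}

This is only meant to illustrate where my 2n figure comes from; I'm not sure whether that is the correct way to count the work.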