I'll assume $k<n/2$, for simplicity (otherwise replace $k$ by $n-k$).
One way to calculate it is via
$${n \choose k} = {n (n-1)(n-2) \cdots (n-k+1) \over k!}.$$
This can be calculated using about $2k$ multiplications and one division. So, if we counted each multiplication/division/addition as $O(1)$ time, this would be $O(k)$ time.
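For concreteness, here is a minimal Python sketch of this direct approach (the function name `binom` is my own; Python's built-in arbitrary-precision integers silently absorb the big-number cost discussed below):

```python
from math import factorial

def binom(n, k):
    """C(n, k) computed exactly via the formula above: a falling factorial over k!."""
    if k > n - k:
        k = n - k                # use the symmetry C(n, k) = C(n, n-k)
    numerator = 1
    for i in range(k):           # about k multiplications build n(n-1)...(n-k+1)
        numerator *= n - i
    return numerator // factorial(k)   # k! costs roughly k more multiplications, then one division
```

For example, `binom(52, 5)` returns 2598960.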
However, that is misleading. The size of the numbers grows dramatically. So, if you want to compute this exactly (as a rational number), you need to operate on very large numbers, which takes more than $O(1)$ time per operation. In particular, the intermediate numbers can grow to about $k \lg n$ bits long (the numerator is at most $n^k$), so each multiplication or division might take $O((k \lg n)^2)$ time [*]. So, the running time might be something like $O((k \lg n)^2 \cdot k) = O(k^3 \log^2 n)$ bit operations. If you are a bit cleverer about the order in which you do the multiplications and divisions (multiplying small numbers first, using a binary tree structure to minimize the number of large numbers you have to deal with) you can get this down to something like $O(k^2 \log^2 n)$ bit operations.
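Here is a rough sketch of what that cleverer ordering might look like (again, `product_tree` and `binom_tree` are hypothetical names of my own): each product is built up pairwise in a balanced binary tree, so the expensive full-size multiplications are confined to the last few levels.

```python
def product_tree(factors):
    """Multiply a list of integers pairwise, level by level, as a balanced binary tree."""
    while len(factors) > 1:
        # pair up adjacent entries; an odd leftover is carried to the next level
        paired = [factors[i] * factors[i + 1] for i in range(0, len(factors) - 1, 2)]
        if len(factors) % 2 == 1:
            paired.append(factors[-1])
        factors = paired
    return factors[0] if factors else 1

def binom_tree(n, k):
    """C(n, k) with numerator and denominator each computed via a product tree."""
    if k > n - k:
        k = n - k                                              # symmetry C(n, k) = C(n, n-k)
    numerator = product_tree(list(range(n - k + 1, n + 1)))    # n(n-1)...(n-k+1)
    denominator = product_tree(list(range(2, k + 1)))          # k!
    return numerator // denominator
```

Note that this reorders the same multiplications rather than reducing their number; the savings come from keeping most operands small.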
Footnote *: I am ignoring sub-quadratic multiplication algorithms. There are algorithms that are asymptotically faster, but they tend to pay off only when the numbers are extremely large, so for simplicity of analysis, I'm ignoring them.