A threshold function is a function $f: \{0,1\}^n \to \{0,1\}$ defined by $n$ integer weights $w_1, w_2, \ldots, w_n$ and an integer threshold $w_0$. It works as follows: $$f(x_1, x_2, \ldots, x_n) = \begin{cases}0 & \text{if } \sum_{i=1}^n w_i x_i \ge w_0, \\ 1 & \text{otherwise.}\end{cases}$$
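To make the evaluation concrete, here is a minimal sketch of the straightforward algorithm (in Python; the function and parameter names are my own, not from any source):

```python
def evaluate_threshold(weights, w0, x):
    """Evaluate the threshold function on input x.

    weights: list of n integers w_1, ..., w_n
    w0:      integer threshold
    x:       list of n bits (each 0 or 1)
    Returns 0 if sum(w_i * x_i) >= w0, else 1, per the definition above.
    """
    total = sum(w * xi for w, xi in zip(weights, x))
    return 0 if total >= w0 else 1
```

The loop performs $n$ multiplications and additions, but each arithmetic operation acts on integers whose bit-length is determined by the weights, which is exactly what my question is about.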
I'm confused about the running time of evaluating $f(x)$ for an input $x \in \{0,1\}^n$. If we measure the running time as a function of $n$ (the number of input bits $f$ receives), does it also depend on the number of bits needed to represent $w_0, w_1, \ldots, w_n$?