
A threshold function is a function $f: \{0,1\}^n \to \{0,1\}$ defined by $n$ integer-valued weights $w_1, w_2, \ldots, w_n$ and an integer-valued threshold $w_0$. It works as follows: $$f(x_1, x_2, \ldots, x_n) = \begin{cases}0 & \text{if } \sum_{i=1}^n w_i x_i \ge w_0, \\ 1 & \text{otherwise.} \end{cases}$$

I'm confused about the running time of evaluating $f(x)$ for some input $x \in \{0,1\}^n$. If we measure the running time as a function of $n$ (the number of input bits $f$ receives), does it also depend on the number of bits needed to represent $w_0, w_1, \ldots, w_n$?
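For concreteness, here is a minimal sketch of how I would evaluate $f$ naively (the function and variable names are my own, not from any library). With arbitrary-precision integers, each addition costs time proportional to the bit length of the operands rather than $O(1)$, which is exactly what makes me unsure about the overall running time:

```python
def threshold(w0, w, x):
    """Evaluate the threshold function given by weights w and threshold w0.

    Follows the convention in the question: return 0 if sum_i w_i * x_i >= w0,
    and 1 otherwise.
    """
    total = 0
    for wi, xi in zip(w, x):
        if xi:              # x_i is 0 or 1, so w_i * x_i is either 0 or w_i
            total += wi     # each addition costs O(bit length of the operands)
    return 0 if total >= w0 else 1

# Example: n = 3, weights 5, -2, 7, threshold 6
print(threshold(6, [5, -2, 7], [1, 0, 1]))  # sum = 12 >= 6, so f(x) = 0
```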

  • It depends on your model of computation. It is common to assume you can do multiplication and addition in constant time, so the running time would be $O(n)$, but if the numbers $w_i$ are very large this is not realistic. In that case, the running time would depend on the number of bits in the representation. – Tom van der Zanden Jul 15 '15 at 22:00
  • It's also known that every threshold function is equivalent to one in which all coefficients have bit-length $O(n\log n)$; so the weights don't have to be too ridiculous. – Yuval Filmus Jul 15 '15 at 23:26
  • Duplicate of this one? Community votes, please! – Raphael Jul 16 '15 at 08:14
  • Bringomial, you can always comment on your own posts, provided you use the same login. – Raphael Jul 16 '15 at 08:15

0 Answers