
There are of course $n \choose k$ monotone disjunctions, which bounds the VC dimension by $\log_2 {n \choose k}$. I'm wondering whether this is in turn bounded by $k \log_2 n$. (Possibly this follows from combinatorial identities.)
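As a quick sanity check, the inequality does seem to hold numerically; here is a minimal Python sketch (it assumes Python 3.8+ for `math.comb`):

```python
import math
from itertools import combinations

n, k = 10, 3

# A monotone k-disjunction is determined by which k of the n variables
# it contains, so there are exactly C(n, k) of them.
assert sum(1 for _ in combinations(range(n), k)) == math.comb(n, k)

# Compare the two candidate bounds on the VC dimension.
print(math.log2(math.comb(n, k)))  # log2 C(10, 3) ~ 6.91
print(k * math.log2(n))            # 3 * log2(10) ~ 9.97
```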

More generally, I'm looking at the claim in *An algorithmic theory of learning: Robust concepts and random projection*: "Further, it is NP-hard to learn a disjunction of $k$ variables as a disjunction of fewer than $k \log n$ variables." The author states this without proof; I'm assuming it's well known or obvious, but I'm trying to figure out why, or where it is proved.

djechlin

1 Answer


Yes. The bound follows from a bit of straightforward manipulation:

$${n \choose k} = \frac{n \times (n-1) \times \dots \times (n-k+1)}{k!} \le \frac{n \times n \times \dots \times n}{1} = n^k,$$

since each of the $k$ factors in the numerator is at most $n$ and $k! \ge 1$.

Now taking the logarithm of both sides, we see that

$$ \lg {n \choose k} \le \lg(n^k) = k \lg n.$$
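As a quick illustration, here is a tiny brute-force check of the intermediate inequality ${n \choose k} \le n^k$ (a Python sketch; `math.comb` requires Python 3.8+):

```python
import math

# Exhaustively confirm C(n, k) <= n^k over a small grid of parameters.
for n in range(1, 51):
    for k in range(n + 1):
        assert math.comb(n, k) <= n ** k, (n, k)

print("C(n, k) <= n^k verified for 1 <= n <= 50, 0 <= k <= n")
```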

The bound can also be derived from bounds listed in Wikipedia: it follows from the fact that ${n \choose k} \le n^k/k!$. Check out Wikipedia in the future -- it's often helpful!

D.W.
  • Thanks, yeah it was easy. I don't see the full claim following like I thought it would, so I posted a question on the full claim here. – djechlin Dec 17 '15 at 20:24
  • I see why I was confused... this gives an upper bound, I needed lower ... :/ – djechlin Dec 17 '15 at 20:26
  • @djechlin, You can obtain an asymptotic lower bound from inequalities listed on that same Wikipedia page. – D.W. Dec 17 '15 at 21:11