
I have been reading some books on theoretical machine learning.

They often derive expressions involving the relationship between $E(Y \mid X_1, X_2, \ldots, X_n)$ and $E(Y \mid X_1, X_2, \ldots, X_n, X_{n+1})$ in order to prove concentration inequalities (for example, McDiarmid's inequality).
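
For concreteness, the kind of step I mean is the Doob martingale construction (this is my own sketch of it, in case I am misstating something): writing $Z_k = E(Y \mid X_1, \ldots, X_k)$, the tower property gives
$$E(Z_{k+1} \mid X_1, \ldots, X_k) = E\big(E(Y \mid X_1, \ldots, X_{k+1}) \mid X_1, \ldots, X_k\big) = E(Y \mid X_1, \ldots, X_k) = Z_k,$$
so $(Z_k)_k$ is a martingale, and McDiarmid's inequality is obtained by bounding its increments $Z_{k+1} - Z_k$ under the bounded-differences condition.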

Which book(s) would you suggest for this topic?

  • This is a fundamental topic in probability, so this question is pretty similar to just asking for a probability book recommendation. There are some threads that discuss this, such as: https://math.stackexchange.com/questions/31838/what-is-the-best-book-to-learn-probability – littleO Mar 31 '18 at 05:48
  • I like "Introduction to Probability" by Bertsekas. – NicNic8 Mar 31 '18 at 05:48
  • Any probability theory book should do, but you won't get to the fun stuff until you take real analysis and measure theory :). – Wolfy Mar 31 '18 at 07:39
  • Some books on probability do not cover conditional expectation at all; they only tell you some basic things about $E(X)$. However, other books are very hard because they involve measure theory. That's why I want to find a suitable book that is enough for me to understand the derivations in machine learning theory. – user295106 Mar 31 '18 at 15:16
  • Given your disinterest in picking up measure theoretic prob. (which I may be misreading, apologies if so), you should pick up the notes/book for any first graduate course in stochastic processes aimed at engineers. For instance, Hajek has a nice free book here. Note that you likely do not need everything in that book (e.g. Wiener filtering). Also note that you might, at times, miss some subtleties that come from the more formal development. But this should generally do. – stochasticboy321 Apr 06 '18 at 22:59

0 Answers