Are the $X_i$'s supposed to represent win/loss?
Anyway, if $X_1$ were the result of a coin toss, we would have $\sigma(X_1) = \{\emptyset, \Omega, \{T\}, \{H\}\}$, so $\sigma(X_1) \neq \{\emptyset, \{-1,1\}, \{-1\}, \{1\}\}$: the elements of $\sigma(X_1)$ are events (subsets of $\Omega$), not values.
Recall that $\sigma(X_1) = \{X_1^{-1}(B) \mid B \in \mathscr{B}\}$ is the collection of preimages of Borel sets under $X_1$ (I like to think of it as the set of events that determine the value of $X_1$, together with $\emptyset$ and $\Omega$).
Thus, $\sigma(X_1) = \{\{\text{win}\},\{\text{loss}\},\Omega,\emptyset\}$.
Since $C_1 = 100 + 2a_1(X_0)X_1$, I think the events on which the value of $C_1$ depends are the same as those on which the value of $X_1$ depends. I guess $X_0$ is zero or some other constant (so that $\sigma(X_0)$ is the trivial sigma-algebra).
Similarly, $C_2 = 100 + 2a_1(X_0)X_1 + 2a_2(X_0,X_1)X_2$ (if $X_0$ is constant, you can omit it). The value of $C_2$ depends on $X_1$ and $X_2$.
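To make the "depends only on $X_1, X_2$" claim concrete, here is a small Python sketch that enumerates the four outcomes of two tosses and groups them by the value of $C_2$; the bet sizes `a1` and `a2` are made-up stand-ins (assumptions, not from the question). Each preimage of a $C_2$ value is an event determined by $(X_1, X_2)$, which is the containment $\sigma(C_2) \subseteq \sigma(X_1, X_2)$ in miniature:

```python
from itertools import product

# Hypothetical bet sizes (assumptions for illustration only):
# a_1 is a constant; a_2 may depend on the first toss.
def a1():
    return 5.0

def a2(x1):
    return 10.0 if x1 == 1 else 2.0  # e.g. bet more after a win

# Outcome space for two tosses: omega = (x1, x2) with each x in {-1, +1}.
omegas = list(product([-1, 1], repeat=2))

def C1(omega):
    return 100 + 2 * a1() * omega[0]

def C2(omega):
    return C1(omega) + 2 * a2(omega[0]) * omega[1]

# Group outcomes by their C2 value: these groups are the atoms of sigma(C2).
atoms = {}
for omega in omegas:
    atoms.setdefault(C2(omega), []).append(omega)

for value, group in sorted(atoms.items()):
    print(f"C2 = {value}: preimage {group}")
# Every atom is a set of (x1, x2) pairs, i.e. an event determined by
# X1 and X2, so sigma(C2) is contained in sigma(X1, X2).
```

With these particular bet sizes the four $C_2$ values are distinct, so the containment is actually an equality; if some bets were zero, $\sigma(C_2)$ would be strictly smaller.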
To prove this formally, I guess you could argue as follows:
For $n=1$,
$C_1 = 100 + 2a_1(X_0)(1)$ for a win and $= 100 + 2a_1(X_0)(-1)$ for a loss.
If a Borel set $B$ contains $100 + 2a_1(X_0)(1)$ but not $100 + 2a_1(X_0)(-1)$, then $C_1^{-1}(B) = \{\text{win}\}$.
If $B$ contains $100 + 2a_1(X_0)(-1)$ but not $100 + 2a_1(X_0)(1)$, then $C_1^{-1}(B) = \{\text{loss}\}$.
If $B$ contains both, $C_1^{-1}(B) = \Omega$.
If $B$ contains neither, $C_1^{-1}(B) = \emptyset$.
Thus, $\sigma(C_1) = \{C_1^{-1}(B) \mid B \in \mathscr{B}\} = \{\{\text{win}\},\{\text{loss}\},\Omega,\emptyset\}$.
In fact, $\sigma(C_1) = \sigma(X_1)$ (assuming $a_1(X_0) \neq 0$; otherwise $C_1$ is constant and $\sigma(C_1)$ is trivial). Either way, $\sigma(C_1) \subseteq \sigma(X_1)$.
Can you do the proof for $\sigma(C_2)$ using the definition of the sigma-algebra generated by two random variables, then $\sigma(C_3)$, and so on, until you see the pattern for general $\sigma(C_n)$ (or set it up as an induction or something along those lines)?
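One possible shape for that induction, sketched in LaTeX (it uses the standard fact that a Borel function of measurable random variables is measurable, and assumes each $a_n$ is Borel):

```latex
\textbf{Claim.} $\sigma(C_n) \subseteq \sigma(X_1,\dots,X_n)$ for all $n$.

\textbf{Base case.} $C_1 = 100 + 2a_1(X_0)X_1$ is a Borel function of $X_1$
(recall $X_0$ is constant), hence $\sigma(X_1)$-measurable.

\textbf{Inductive step.} Suppose $C_{n-1}$ is
$\sigma(X_1,\dots,X_{n-1})$-measurable. Since
\[
  C_n = C_{n-1} + 2a_n(X_0, X_1,\dots,X_{n-1})X_n,
\]
$C_n$ is a Borel function of $(X_1,\dots,X_n)$, hence
$\sigma(X_1,\dots,X_n)$-measurable, which is exactly
$\sigma(C_n) \subseteq \sigma(X_1,\dots,X_n)$. $\square$
```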
To prove the martingale property (which by itself is not enough to prove $(C_n)_{n \in \mathbb{N}}$ is a martingale), we must show that:
$E(C_n \mid X_1, \ldots, X_s) = C_s \quad \forall\, s < n$ and, since this is a discrete-time random process, $s \in \mathbb{N}$.
Btw, $E(C_n|X_1, ..., X_s) = E(C_n|\sigma(X_1, ..., X_s))$.
Anyway,
$E(C_n \mid X_1, \ldots, X_s)$
$= E\!\left(100+\sum_{k=1}^{n} 2a_k(X_1,\ldots,X_{k-1})X_k \,\middle|\, X_1, \ldots, X_s\right)$
$= 100 + E\!\left(\sum_{k=1}^{n} 2a_k(X_1,\ldots,X_{k-1})X_k \,\middle|\, X_1, \ldots, X_s\right)$
Split the sum into $k = 1,\ldots,s$ and $k = s+1,\ldots,n$, and use linearity of conditional expectation:
$= 100 + E\!\left(\sum_{k=1}^{s} 2a_kX_k \,\middle|\, X_1, \ldots, X_s\right) + E\!\left(\sum_{k=s+1}^{n} 2a_kX_k \,\middle|\, X_1, \ldots, X_s\right)$,
writing $a_k$ for $a_k(X_1,\ldots,X_{k-1})$.
For the part from $k = 1$ to $s$, we have
$E\!\left(\sum_{k=1}^{s} 2a_kX_k \,\middle|\, X_1, \ldots, X_s\right) = \sum_{k=1}^{s} 2a_kX_k = C_s - 100$ (yeah, the conditional expectation just drops)
since each summand is $\sigma(X_1, \ldots, X_s)$-measurable: for $k \le s$, both $a_k(X_1,\ldots,X_{k-1})$ and $X_k$ are functions of $X_1, \ldots, X_s$.
For the part from $k = s+1$ to $n$, independence alone works for $k = s+1$, but for larger $k$ the coefficient $a_k(X_1,\ldots,X_{k-1})$ involves tosses after time $s$, so first condition on the finer sigma-algebra (tower property):
$E(2a_kX_k \mid X_1,\ldots,X_s) = E\big(E(2a_kX_k \mid X_1,\ldots,X_{k-1}) \,\big|\, X_1,\ldots,X_s\big) = E\big(2a_k\,E(X_k) \,\big|\, X_1,\ldots,X_s\big) = 0$,
using that $X_k$ is independent of $X_1,\ldots,X_{k-1}$ and, as you said, $E(X_{\text{anything}}) = 0$.
Putting the pieces together: $E(C_n \mid X_1,\ldots,X_s) = 100 + (C_s - 100) + 0 = C_s$.
The above holds whether or not $a_k$ is constant.
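If you want a numerical sanity check of the martingale property, here is a quick Monte Carlo sketch in Python. The doubling-after-a-loss bet rule is a made-up stand-in for $a_k(X_1,\ldots,X_{k-1})$ (an assumption, not from the question); the point is that averaging $C_n$ over random continuations of a fixed prefix should recover $C_s$:

```python
import random

random.seed(0)

# Hypothetical bet-size rule (an assumption for illustration): start at 5
# units and double after each loss, standing in for a_k(X_1, ..., X_{k-1}).
def bet(history):
    b = 5.0
    for x in history:
        if x == -1:
            b *= 2
    return b

def capital(tosses):
    """C_n = 100 + sum over k of 2 a_k X_k along a given toss sequence."""
    c = 100.0
    for k in range(len(tosses)):
        c += 2 * bet(tosses[:k]) * tosses[k]
    return c

# Fix the first s tosses, then average C_n over random continuations:
# this estimates E(C_n | X_1, ..., X_s), which should be close to C_s.
first = [1, -1, -1]          # an arbitrary fixed prefix (s = 3)
s, n, trials = len(first), 8, 200_000
total = 0.0
for _ in range(trials):
    cont = [random.choice([-1, 1]) for _ in range(n - s)]
    total += capital(first + cont)
estimate = total / trials
print(f"C_s        = {capital(first):.2f}")
print(f"E(C_n|F_s) ~ {estimate:.2f}")
```

Note that the measurable part and the independent part of the proof show up here too: the prefix contributes $C_s$ exactly, while the continuation terms average out to zero.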
Usually whenever you prove things are martingales, there's a measurable part and an independent part.
If you want, you can check out some of my martingale-related questions last year:
Prove $A_t := W_t^3-3t W_t$ a martingale
Prove X is a martingale
https://quant.stackexchange.com/questions/14955/determine-ew-p-w-q-w-r
https://quant.stackexchange.com/questions/14956/show-that-eb-t-mathscrf-s-b-s
Just show that $\sigma(C_1) \subseteq \sigma(X_1)$, $\sigma(C_2) \subseteq \sigma(X_1, X_2)$, etc. It seems pretty obvious: from the definition of $C_n$, $C_n$ depends only on $X_1, \ldots, X_n$.
– BCLC May 14 '15 at 15:00