How to define a quantum calculation
Quantum operations are linear, meaning they are fully defined by their action on the individual basis states (i.e., states in which each qubit is definitely either 0 or 1).
For example, an ${\rm XOR}$ operator acting on two input qubits and an output qubit may be defined as follows:
$$
{\rm XOR} \left[|00⟩|0⟩\right] = |00⟩|0⟩ \\
{\rm XOR} \left[|01⟩|0⟩\right] = |01⟩|1⟩ \\
{\rm XOR} \left[|10⟩|0⟩\right] = |10⟩|1⟩ \\
{\rm XOR} \left[|11⟩|0⟩\right] = |11⟩|0⟩
$$
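The table above can be sketched numerically. The snippet below (a minimal illustration, not tied to any particular quantum library) builds ${\rm XOR}$ as an $8\times 8$ permutation matrix on the three-qubit basis $|x_1 x_2⟩|o⟩$, mapping $(x_1, x_2, o) \mapsto (x_1, x_2, o \oplus x_1 \oplus x_2)$; the `basis` helper is an assumed name for convenience.

```python
import numpy as np

# Build the 8x8 matrix for XOR on |x1 x2>|out>:
# it sends basis state (x1, x2, o) to (x1, x2, o XOR x1 XOR x2).
XOR = np.zeros((8, 8))
for x1 in (0, 1):
    for x2 in (0, 1):
        for o in (0, 1):
            src = (x1 << 2) | (x2 << 1) | o
            dst = (x1 << 2) | (x2 << 1) | (o ^ x1 ^ x2)
            XOR[dst, src] = 1.0

def basis(x1, x2, o):
    """Column vector for the basis state |x1 x2>|o>."""
    v = np.zeros(8)
    v[(x1 << 2) | (x2 << 1) | o] = 1.0
    return v

# Check one line of the table: XOR[|01>|0>] = |01>|1>
out = XOR @ basis(0, 1, 0)
print(np.argmax(out))  # index 0b011 = 3, i.e. the state |01>|1>
```

Because every basis state maps to exactly one basis state, the matrix is a permutation and hence unitary, which is exactly the reversibility requirement mentioned below.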
(This covers the usual case where your output qubit starts out as 0. You'll also need to define actions on $|x_1 x_2⟩|1⟩$ such that the $\rm XOR$ operation is reversible, but I think that's beyond the scope of your question.)
In order to actually get information out of the quantum operation, you need to measure the output qubit. As long as your input qubits are in a basis state (i.e., each qubit is definitely either 0 or 1), measuring the output qubit will deterministically give either 0 or 1, exactly as listed above.
Considering superposition
Now let's consider the case where the input qubits take advantage of their quantum nature and represent both 0 and 1 at the same time; i.e., they are in a quantum superposition.
For example, say $|ψ_{\rm in}⟩ = a|00⟩ + b|01⟩ + c|10⟩ + d|11⟩$. Because the $\rm XOR$ operator is linear, we simply apply ${\rm XOR}$ to each individual term and add up the result:
$$ {\rm XOR}\left[|ψ_{\rm in}⟩|0⟩\right] = \left[ a|00⟩ + d|11⟩ \right] |0⟩ + \left[ b|01⟩ + c|10⟩ \right] |1⟩ $$
Now the output qubit is in a superposition (in fact it is entangled with the input qubits, but that doesn't actually matter at the moment). When you measure it, you may get both 0 and 1. That is, any single measurement will only ever give one or the other, but repeat the experiment many times and you'll get a mix. Most quantum algorithms are interested in this mixture, rather than the result (0 or 1) from a single experiment.
Precisely what mixture you get depends on the numerical values of $a$, $b$, $c$, and $d$. You could calculate it ahead of time, but I suppose you might say the point of quantum computing is that we don't have to. ^_^
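That calculation can be sketched directly with a state vector. In this minimal illustration (the example amplitudes are arbitrary assumptions, normalized before use), we apply the basis-state rule $(x_1, x_2, o) \mapsto (x_1, x_2, o \oplus x_1 \oplus x_2)$ term by term and then sum probabilities over the output qubit:

```python
import numpy as np

# Arbitrary example amplitudes a, b, c, d for |00>, |01>, |10>, |11>;
# normalized so the probabilities sum to 1.
a, b, c, d = 1.0, 2.0, 3.0, 4.0
amps = np.array([a, b, c, d]) / np.linalg.norm([a, b, c, d])

# Build |psi_in>|0> as an 8-amplitude vector. The output qubit is the
# least-significant bit, so the |x1 x2>|0> terms sit at even indices.
state = np.zeros(8)
state[0::2] = amps  # indices 0,2,4,6 = |00>|0>, |01>|0>, |10>|0>, |11>|0>

# Apply XOR linearly: each term (x1, x2, o) -> (x1, x2, o ^ x1 ^ x2)
out = np.zeros(8)
for i, amp in enumerate(state):
    x1, x2, o = (i >> 2) & 1, (i >> 1) & 1, i & 1
    out[(x1 << 2) | (x2 << 1) | (o ^ x1 ^ x2)] += amp

# Probability of measuring the output qubit as 0 or 1:
p0 = np.sum(out[0::2] ** 2)  # the a|00> and d|11> terms
p1 = np.sum(out[1::2] ** 2)  # the b|01> and c|10> terms
print(p0, p1)  # |a|^2 + |d|^2  and  |b|^2 + |c|^2
```

Running many simulated measurements against these probabilities would reproduce the "mixture" described above.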
Note on addition
I used ${\rm XOR}$ (a.k.a. binary addition modulo 2) as my example here because its output is always 0 or 1, so we only need one output qubit. In "normal" addition, adding two bits may give 0, 1, or 2. Even classically, you can't fit 2 into a single bit! The solution is to add a second output bit, usually called the carry bit. Please check out the question linked in @Mark S's comment for more details on a quantum implementation.
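As a purely classical sketch of that idea (no quantum machinery involved), a half adder produces the sum bit via XOR and the carry bit via AND:

```python
def half_adder(x1, x2):
    """Classical half adder: sum bit is XOR, carry bit is AND."""
    return x1 ^ x2, x1 & x2

for x1 in (0, 1):
    for x2 in (0, 1):
        s, c = half_adder(x1, x2)
        print(f"{x1} + {x2} = carry {c}, sum {s}")
```

A reversible quantum version needs one output qubit per classical output bit, which is why two-bit addition requires the extra carry qubit discussed in the linked question.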