4

If we have $P(a)=0.6$ and $P(b)=0.7$, can we say that they are not mutually exclusive, without any further information?

For example, is it possible that $b$ depends on some event $c$, as in $P(b \mid c)$? Right now we only have $P(b)=0.7$ and $P(a)=0.6$; can we say that $a$ and $b$ are not mutually exclusive because $P(a)+P(b)>1$?

If events $a$ and $b$ are totally unrelated, can we still add them? For example, suppose $P(b)$ is the probability that we go to jail if we rob a bank, and $P(a)$ is the probability that Jack eats an apple today; can we still say they are not mutually exclusive?

I understand the math here, but since the jail probability $P(b)$ seems to depend on the probability of robbing the bank, is it still right to add $P(b)$ to $P(a)$ to say they are not mutually exclusive? Or is the example I made totally wrong?

Raymond
  • 3
    Right, the two events are not mutually exclusive, because their probabilities add to more than $1$. – peterwhy Aug 27 '22 at 00:11
  • 3
    One step deeper into the explanation,

    $$\begin{align} P(A\cup B) &= P(A) + P(B) - P(A\cap B)\\ &= 1.3-P(A\cap B) \end{align}$$

    The LHS $P(A\cup B) \le 1$, so $P(A\cap B)> 0$, i.e. events $A$ and $B$ are not mutually exclusive.

    – peterwhy Aug 27 '22 at 00:17
  • Thanks peterwhy, but my question is: if events $a$ and $b$ are totally unrelated, can we still add them? For example, $P(b)$ is the probability that we go to jail if we rob a bank, and $P(a)$ is the probability that Jack eats an apple today; can we still say they are not mutually exclusive? – Raymond Aug 27 '22 at 02:13
  • The two probabilities $P(a)$ and $P(b)$ are numbers, so you can add them to get a number. Whether the sum is a meaningful probability is another question, and in this case the sum is greater than $1$ and so is not a probability. If, on the contrary, we assume that events $a$ and $b$ are mutually exclusive, then $P(a\cup b) = P(a)+P(b)$ and so the sum would have some meaning as a probability. – peterwhy Aug 27 '22 at 02:35
  • I understand the math here, but since the jail probability $P(b)$ seems to depend on the probability of robbing the bank, is it right to add $P(b)$ to $P(a)$ to say they are not mutually exclusive? Or is the example I made totally wrong? – Raymond Aug 27 '22 at 02:42
  • Mutual exclusivity is about events; what are the events $a$ and $b$ in your example? For example, $b$ is the event that you go to jail. There might be different conditional probabilities $P(b\mid\text{you rob a bank})$ and $P(b\mid\overline{\text{you rob a bank}})$; after weighting these two cases by $P(\text{you rob a bank})$ and $P(\overline{\text{you rob a bank}})$, you find the weighted sum $P(b)=0.7$. Then event $b$ is definitely not mutually exclusive with $a = \text{Jack eats an apple today}$, which has probability $P(a) = 0.6$. – peterwhy Aug 27 '22 at 02:46
  • Yeah, that's the part I am confused about, because I think if we only talk about the conditional probability $P(b\mid\text{you rob a bank})$, that is not an event, right? So you are saying the event $b$ is the one whose weighted sum gives $P(b)=0.7$. If that's the event $b$, doesn't that make $b$ mutually exclusive with $a$, since there is no $a\cap b$? – Raymond Aug 27 '22 at 02:56
  • If $b$ is the event that you go to jail and $a$ is the event that Jack eats an apple today, then $a\cap b$ is the event that both $a$ and $b$ happen. Maybe there's no causal relation between $a$ and $b$, but still there's a non-zero probability that $a$ and $b$ happen together: $P(a\cap b)>0$. This is not saying $a$ and $b$ will certainly happen together: $P(a\cap b) \le P(a) = 0.6$. – peterwhy Aug 27 '22 at 03:04
  • Hmm, actually, I think I'm getting your point now: weighting $P(b\mid\text{you rob a bank})$ and $P(b\mid\text{you do not rob a bank})$ gives the event $b$ a total probability of $0.7$. Then $a=$ 'Jack eats an apple' is another event, and the probabilities add up to $1.3$; that's why they are not mutually exclusive. Is my understanding correct? And are $a$ and $b$ independent events? – Raymond Aug 27 '22 at 03:06
  • Events $a$ and $b$ are not mutually exclusive, and $P(a\cap b) \ge 0.3$. Whether $a$ and $b$ are independent, there's no given information to tell, but regardless, events $a$ and $b$ are still not mutually exclusive. – peterwhy Aug 27 '22 at 03:12
  • I see, so normally we need more information to tell whether $a$ and $b$ are independent, right? Like the coin-flip example, where we can tell they are independent; but for this example it's hard to tell. Also, because $P(a\cap b)\ge 0.3$, we know $a$ and $b$ can happen together, but we're not sure whether they are dependent or independent. – Raymond Aug 27 '22 at 03:24
  • For a different independent example: I will flip a fair coin here, let $c$ be the event that my result is Head. You can roll a fair D6 die on your side, let $d$ be the event that your result is $3$ or above. Events $c$ and $d$ are certainly not mutually exclusive, with $P(c) + P(d) = \frac 76 > 1$. And since $c$ and $d$ are independent, I can tell exactly that $P(c\cap d) = P(c)\cdot P(d) = \frac13 > 0$. – peterwhy Aug 27 '22 at 03:35
  • Hi Peter, I see, so if the equation $P(A\cap B) = P(A) \cdot P(B)$ holds, the events are independent. I think I get it now. Thank you so much! I have been thinking about this for so long; I really appreciate it! Would you mind copying one of your comments into an answer? I will accept your answer. – Raymond Aug 27 '22 at 03:54
  • Hi ryang, sure, I would love to do it, but this is the first time I've posted a question and I'm not exactly sure what you want me to do. Do I copy all my questions from the comments into my question? – Raymond Aug 27 '22 at 03:57
  • No, not necessarily, because your understanding has evolved since your first comment above. The edit that you've made (I didn't notice your comment earlier because you didn't ping me by writing "@ryang") is what I meant. But your 4th paragraph (in relation to your 3rd paragraph) isn't making sense: are you trying to compare (1) jail & robbery, or (2) jail/robbery & apple? – ryang Aug 28 '22 at 14:49
  • Hi @ryang, yes, it helped! Thanks a lot, ryang. Since I can only pick one answer as the accepted answer, and Peter was the first one to help me: do you know if there is any way I can also mark your answer to repay your help? – Raymond Sep 01 '22 at 00:11

3 Answers

3

Right, the two events $a$ and $b$ are not mutually exclusive, because their probabilities add to more than $1$.

One step deeper into the explanation,

$$\begin{align*} P(a\cup b) &= P(a) + P(b) - P(a\cap b)\\ &= 1.3-P(a\cap b) \end{align*}$$

The LHS $P(a\cup b) \le 1$, so $P(a\cap b)> 0$, i.e. events $a$ and $b$ are not mutually exclusive.
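
Rearranging, and using only the given $P(a) = 0.6$ and $P(b) = 0.7$, this pins down how large the overlap must be:

$$P(a\cap b) = P(a) + P(b) - P(a\cup b) \ge 1.3 - 1 = 0.3 > 0.$$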

Conditional probabilities $P(b\mid c)$?

Regarding the events $a$, $b$ and $c$ in your question, where you considered $P(b\mid c)$: maybe $b$ and $c$ are dependent, so there might be different conditional probabilities $P(b\mid\text{you rob a bank})$ and $P(b\mid\overline{\text{you rob a bank}})$. After weighting these two cases by $P(\text{you rob a bank})$ and $P(\overline{\text{you rob a bank}})$, you find the total probability $P(b) = 0.7$. This $0.7$ is the $P(b)$ that one should consider and add to disprove mutual exclusivity, even if event $b$ depends on event $c$.

Then $a\cap b$ is the event that both $a$ and $b$ happen. Maybe there's no causal relation between $a$ and $b$, but still there's a non-zero probability that $a$ and $b$ happen together: $P(a\cap b)>0$. This is not saying $a$ and $b$ will certainly happen together: $P(a\cap b) \le P(a) = 0.6$.
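
Putting the two bounds together (both follow from the given numbers alone, without knowing anything about how $a$ and $b$ are related):

$$0.3 \le P(a\cap b) \le 0.6,$$

so the overlap is guaranteed to exist, even though its exact value cannot be determined from the given information.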

Independent or dependent events $a$ and $b$?

When adding $P(a)$ and $P(b)$, if the sum is greater than $1$ then $a$ and $b$ are not mutually exclusive. As for whether $a$ and $b$ are independent, the given information cannot tell us; but regardless, events $a$ and $b$ are still not mutually exclusive. For your example, maybe $a$ and $b$ are independent, but that would rest on additional real-world assumptions.

And a side note: if two events $X$ and $Y$ each have positive probability and are independent, then they are definitely not mutually exclusive, because

$$P(X\cap Y) = P(X)\cdot P(Y) > 0$$
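
A concrete instance is the coin-and-die example from the comments, assuming a fair coin and a fair six-sided die: let $X$ be the event that the coin shows heads and $Y$ the event that the die shows $3$ or above. Then

$$P(X\cap Y) = P(X)\cdot P(Y) = \frac12\cdot\frac23 = \frac13 > 0,$$

so $X$ and $Y$ are not mutually exclusive, even though they come from completely unrelated experiments.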

peterwhy
1

If we have $P(a)=0.6, P(b)=0.7$, can we say they are not mutually exclusive?

Yes, because $P(a\cup b)=P(a)+P(b)-P(a\cap b)$ and $0\leq P(a\cup b)\leq1$, so in your case $P(a\cap b)\neq0$; in fact $\bf P(a\cap b)\geq0.3$.

Generally we check whether the two events have anything in common to decide mutual exclusiveness, but if we already know that their probabilities add up to more than $1$, then we know right away that they can't be mutually exclusive.

So yes, you are right when you say "because $\bf P(a)+P(b)>1$".

If events $a$ and $b$ are totally unrelated, can we still add them?

Nope, not unless you know that the events are mutually exclusive. If the events $a$ and $b$ are not related, then we say that they are independent, and that does not mean that $\bf P(a\cap b) =0$ (which is true for mutually exclusive events) but instead that $P(a\cap b)=P(a)\cdot P(b)$.
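
In symbols: by the inclusion-exclusion identity $P(a\cup b) = P(a) + P(b) - P(a\cap b)$ used at the start of this answer, the bare sum equals the probability of the union precisely when the overlap vanishes:

$$P(a\cup b) = P(a) + P(b) \iff P(a\cap b) = 0.$$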

For example, suppose $P(b)$ is the probability that we go to jail if we rob a bank, and $P(a)$ is the probability that Jack eats an apple today; can we still say they are not mutually exclusive?

Let me instead take a more familiar example to make things clearer.

event $\bf a$: A passes the test and, as you have said, $P(a)=0.6$
event $\bf b$: B passes the test and, as you have said, $P(b)=0.7$

This is, in essence, the same as your example: A and B are different students, they take different exams, and they aren't related in any way.

So here, $a$ and $b$ are not mutually exclusive, which is established by the fact that their probabilities sum to more than $1$, or, as we'll see shortly, by the fact that $P(a\cap b)=0.42 \neq 0$. But there's yet another way to establish this.

Here's another definition of mutually exclusive events to help you understand them:

If there is a set of events such that if any one of them occurs, none of the others can occur, the events are said to be mutually exclusive.

Example: in a coin toss, the event of getting heads and the event of getting tails are mutually exclusive, as when one occurs the other can't occur.
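
For a fair coin this is the extreme case: the two events cannot overlap, and their probabilities add up to exactly $1$:

$$P(\text{heads}\cap\text{tails}) = 0, \qquad P(\text{heads}) + P(\text{tails}) = \tfrac12 + \tfrac12 = 1.$$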

So we now have another perspective on why (even in your example) events $a$ and $b$ are not mutually exclusive: the happening of one does not rule out the possibility of the other happening.

Let's come to the next thing:

In our example regarding passing the tests, we know events $a$ and $b$ are independent by common sense (not by the given mathematics), so $P(a\cap b)=P(a)\cdot P(b)=0.42$.

Note: by definition, when two events $a$ and $b$ are independent, $P(a\cap b)=P(a)\cdot P(b)$.

In your example too, it seems intuitive that the events are independent, so we'll assume they are.
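
Under that independence assumption (which is an assumption about the real-world setup, not something the given probabilities force), the numbers are consistent with what we found earlier:

$$P(a\cup b) = P(a) + P(b) - P(a)\,P(b) = 0.6 + 0.7 - 0.42 = 0.88 \le 1,$$

and the overlap $P(a\cap b) = 0.42$ does lie in the range $0.3 \le P(a\cap b) \le 0.6$ that any pair of events with these probabilities must satisfy.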

but since the jail probability $P(b)$ seems to depend on the robbery probability, is it right to add $P(b)$ to $P(a)$ to say they are not mutually exclusive?

So, they are not dependent and your intuition is correct that they are indeed independent (and not mutually exclusive).

or is the example I made totally wrong?

When the events aren't deterministic, we can always turn to probability.


For the reasons above, I hope you are no longer confused between independent and mutually exclusive events.

  • [repost; I've edited that sandbox corresponding to your typo edit] Hi there, your answer is problematic. Too long to comment, so I sandboxed a fuller explanation here. If the link doesn't work properly, just search within the page for "A counterexample to both". – ryang Aug 29 '22 at 13:26
1

Adding to peterwhy's comments & Answer:

If events $a$ and $b$ are totally unrelated, can we still add them?

The probabilities of any two events—even from different probability experiments—are numbers, and can surely be added.

On the other hand, events are not numbers, so they cannot be added. And in what follows, it is events, not probabilities, that can be independent or mutually exclusive.

  1. $P(b)$ is the probability that we go to jail if we rob a bank, and $P(a)$ is the probability that Jack eats an apple today; can we still say they are not mutually exclusive?

  2. If the jail probability $P(b)$ is dependent on the robbery probability, is it right to add $P(b)$ to $P(a)$ to check whether they are mutually exclusive?

Go back to the relevant definitions:

  1. Events $A$ and $B$ are mutually exclusive iff $$P(A∩B)=0.$$

    If the sample space is finite, then ‘mutually exclusive’ and ‘having no common outcome’ are synonymous. My elaboration is in the link.

  2. Events $A$ and $B$ are independent iff $$P(A∩B)=P(A)P(B).$$

    If $P(A)\neq0,$ then $A$ and $B$ being independent means precisely that knowing that $A$ happens doesn't change $B$'s probability. My elaboration is in the link.

From these definitions, you can infer that if events $A$ and $B$ have nonzero probabilities and are mutually exclusive, then they must be dependent. Beyond that, the following example shows that naively inferring one property from the other is invalid:

[Figure: the three sample spaces $U_1$, $U_2$, $U_3$ referenced in the table below]

$$\begin{array}{c|c|c|c}
 & U_1 & U_2 & U_3 \\\hline
P(X\cap Y) & 0 & \frac14 & \frac14 \\\hline
P(X)\,P(Y) & \frac14\times\frac12=\frac18 & \frac14\times\frac34=\frac38 & \frac12\times\frac12=\frac14 \\\hline
X\text{ and }Y\text{ are}\ldots & \textbf{dependent} & \textbf{dependent} & \textbf{independent} \\\hline
X\text{ and }Y\text{ are}\ldots & \textbf{mutually exclusive} & \textbf{not ME} & \textbf{not ME}
\end{array}$$
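
Reading off two of the columns, for instance:

$$U_1:\ P(X\cap Y) = 0 \ne \tfrac18 = P(X)P(Y), \qquad U_3:\ P(X\cap Y) = \tfrac14 = P(X)P(Y),$$

so in $U_1$ the events $X$ and $Y$ are mutually exclusive yet dependent, while in $U_3$ they are independent yet not mutually exclusive.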

I gave a few more visual examples here.

Finally, here's an illustration to clarify the terminology (elaboration here). Being careful with these terms (above and below) makes it easier to develop the correct framework and understanding, and wards against misconceptions solidifying:

  1. Randomly choosing one ball from each of four bags, each containing one black and one white ball, each equally likely to be chosen, is a $4$-trial probability experiment with $16$ outcomes.
  2. So, this experiment's sample space, which comprises the experiment outcomes, is $$\{BBBB,BBBW,BBWB,BBWW,\\BWBB,BWBW,BWWB,BWWW,\\WBBB,WBBW,WBWB,WBWW,\\WWBB,WWBW,WWWB,WWWW\}.$$
  3. An event is simply some subset of the sample space.
  4. So, this experiment has $2^{16}=65536$ possible events, including the empty set (i.e., an impossible event, e.g., ‘choosing a yellow ball’), the sample space itself (i.e., a certain event, e.g., ‘choosing four balls’), and any combination of the $16$ outcomes, e.g., $\{BWWW,WBWW,WWBW,WWWB\}=$‘choosing exactly one black ball’.
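
The count in item 4 comes from the fact that an event either contains or omits each of the $16$ outcomes:

$$\#\{\text{events}\} = 2^{\#\{\text{outcomes}\}} = 2^{16} = 65536.$$
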
ryang