$ (A \cap B) $ and $ (A \cap \lnot B) $ are disjoint events, thus
$ A = (A \cap B) \cup (A \cap \lnot B) \Rightarrow P(A) = P(A, B) + P(A, \lnot B) $
Thus, applying the same decomposition to the sure event $ S $ (with $ S $ in place of $ A $ and $ A $ in place of $ B $), and noting that $ P(S, A) = P(A) $ since $ A \subseteq S $:
$ P(S) = P(S, A) + P(S, \lnot A) = 1 \Rightarrow \boxed{ P(A) + P(\lnot A) = 1 } $
If $ B_i, \quad i = 1, 2, \dots, n $, is a set of mutually exclusive and exhaustive events (a partition of the sample space), then
$$ \boxed{ P(A) = \sum_i P(A, B_i) = \sum_i P(A \mid B_i) P(B_i) } $$
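As a minimal numeric sketch of this formula (the two-urn setup and all numbers below are illustrative assumptions, not from the source): pick one of two urns $ B_1, B_2 $ at random, then draw a ball, with $ A $ the event of drawing a red one.

```python
# Law of total probability: P(A) = sum_i P(A | B_i) P(B_i),
# where the B_i are mutually exclusive and exhaustive.
p_B = [0.5, 0.5]            # P(B_1), P(B_2): which urn is picked
p_A_given_B = [0.3, 0.8]    # P(A | B_i): fraction of red balls in each urn

p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 0.55
```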
$$ \boxed{ P(A \mid B) = \frac{P(A, B)}{P(B)} } $$
$$ \boxed{ P(A \mid K) = \sum_i P(A \mid B_i, K) P(B_i \mid K) } $$
$ A \perp B $ if
$$ P(A \mid B) = P(A) $$
$ A \perp B \mid C $ if
$$ P(A \mid B, C) = P(A \mid C) $$
$$ \boxed{ P(E_1, E_2, \dots, E_n) = P(E_n \mid E_{n - 1}, \dots, E_2, E_1) \dots P(E_2 \mid E_1) P(E_1) } $$
$$ \boxed{ P(H \mid e) = \frac{P(e \mid H) P(H)}{P(e)} } $$
Example: a person at the next gambling table declares the outcome “twelve”.
Goal: determine whether he was rolling a pair of dice or spinning a roulette wheel.
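A small sketch of the Bayesian update for this example. The likelihoods are standard: $ P(\text{twelve} \mid \text{dice}) = 1/36 $ for a pair of fair dice, and $ 1/38 $ assuming an American roulette wheel with 38 slots; the uniform prior is an assumption for illustration.

```python
from fractions import Fraction

# Hypotheses H and assumed prior P(H).
prior = {"dice": Fraction(1, 2), "roulette": Fraction(1, 2)}
# Likelihoods P(e | H) for the evidence e = "twelve".
likelihood = {
    "dice": Fraction(1, 36),      # the single roll (6, 6) out of 36
    "roulette": Fraction(1, 38),  # one slot out of 38 (American wheel)
}

# P(e) = sum_H P(e | H) P(H)   (law of total probability)
p_e = sum(likelihood[h] * prior[h] for h in prior)

# P(H | e) = P(e | H) P(H) / P(e)   (Bayes' theorem)
posterior = {h: likelihood[h] * prior[h] / p_e for h in prior}
print({h: float(p) for h, p in posterior.items()})
# {'dice': 0.5135..., 'roulette': 0.4864...} -- dice is slightly more likely
```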
$$ (X \perp Y \mid Z) \, \text{ iff } \, P(x \mid y, z) = P(x \mid z) \, \text{ whenever } \, P(y, z) > 0 $$
Learning the value of $ Y $ does not provide additional information about $ X $, once we know $ Z $.
$$ (X \perp Y \mid \emptyset) \, \text{ iff } \, P(x \mid y) = P(x) \, \text{ whenever } \, P(y) > 0 $$
Consider 2 independent fair coin tosses:
$$ H_1 = \{ \text{ 1st toss is H } \} = \{ (H, H), (H, T) \} $$
$$ H_2 = \{ \text{ 2nd toss is H } \} = \{ (H, H), (T, H) \} $$
$$ D = \{ \text{ 2 tosses have different result } \} = \{ (H, T), (T, H) \} $$
Each pair of these events is independent: $ P(H_1, H_2) = P(H_1, D) = P(H_2, D) = \frac{1}{4} $, which matches the product of the corresponding marginals ($ \frac{1}{2} \cdot \frac{1}{2} $). On the other hand, $$ P(D, H_1, H_2) = 0 \not= \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = P(H_1) P(H_2) P(D) $$ so the three events are pairwise independent but not mutually independent.
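A brute-force verification of this example, enumerating the four equally likely outcomes (a sketch; the event names mirror the definitions above):

```python
from itertools import product

outcomes = set(product("HT", repeat=2))      # the 4 equally likely outcomes

def prob(event):                             # P(E) = |E| / |Omega|
    return len(event) / len(outcomes)

H1 = {o for o in outcomes if o[0] == "H"}    # 1st toss is H
H2 = {o for o in outcomes if o[1] == "H"}    # 2nd toss is H
D  = {o for o in outcomes if o[0] != o[1]}   # tosses differ

# Pairwise independence holds: every intersection has probability 1/4.
for E, F in [(H1, H2), (H1, D), (H2, D)]:
    assert prob(E & F) == prob(E) * prob(F)

# Mutual independence fails: the triple intersection is empty.
print(prob(H1 & H2 & D), prob(H1) * prob(H2) * prob(D))  # 0.0 vs 0.125
```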
| Aspect | Bayesian Network (BN) | Causal Bayesian Network (CBN) |
|---|---|---|
| Edges | Statistical dependencies (conditional independence) | Causal relationships (interventions) |
| Purpose | Encode a joint probability distribution | Predict the effects of interventions |
| Use of DAG | Represents a factorization of the joint distribution | Also encodes causal mechanisms |
| Can answer “What if X happens?” | Not directly | Yes (via intervention calculus) |
| Ordering of nodes | Any order of variables, as long as the conditional independencies hold | Must respect causal (temporal or mechanistic) ordering |
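To make the “What if X happens?” row concrete, here is a hand-rolled sketch contrasting observational conditioning $ P(Y \mid X = 1) $ with the interventional quantity $ P(Y \mid do(X = 1)) $. The three-node network $ Z \to X $, $ Z \to Y $, $ X \to Y $ and its conditional probability tables are illustrative assumptions.

```python
# Binary network: Z -> X, Z -> Y, X -> Y (Z confounds X and Y).
p_z = {0: 0.7, 1: 0.3}                      # P(Z = z)
p_x1_given_z = {0: 0.2, 1: 0.9}             # P(X = 1 | Z = z)
p_y1_given_xz = {(1, 0): 0.7, (1, 1): 0.9,  # P(Y = 1 | X = x, Z = z)
                 (0, 0): 0.1, (0, 1): 0.6}

# Observational P(Y = 1 | X = 1): observing X = 1 updates our belief about Z.
p_x1 = sum(p_x1_given_z[z] * p_z[z] for z in (0, 1))
p_z_given_x1 = {z: p_x1_given_z[z] * p_z[z] / p_x1 for z in (0, 1)}
p_y_obs = sum(p_y1_given_xz[(1, z)] * p_z_given_x1[z] for z in (0, 1))

# Interventional P(Y = 1 | do(X = 1)): setting X severs the Z -> X edge,
# so Z keeps its prior (truncated-factorization / adjustment formula).
p_y_do = sum(p_y1_given_xz[(1, z)] * p_z[z] for z in (0, 1))

print(round(p_y_obs, 4), round(p_y_do, 4))  # 0.8317 vs 0.76 -- they differ
```

A plain BN supports only the first computation; the CBN reading of the same DAG licenses the second.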
probabilistic parameter: any quantity that is defined in terms of a joint probability function.
statistical parameter: any quantity that is defined in terms of a joint probability distribution of observed variables, making no assumption whatsoever regarding the existence or nonexistence of unobserved variables.