Alex is given the choice between two games. In Game 1, a fair coin is flipped: if it comes up heads, Alex receives \(\$100\); if it comes up tails, Alex receives nothing. In Game 2, a fair coin is flipped twice: for each flip that comes up heads, Alex receives \(\$50\), and for each flip that comes up tails, Alex receives nothing. Assuming that Alex has a monotonically increasing utility function for money in the range \([\$0, \$100]\), show mathematically that if Alex prefers Game 2 to Game 1, then Alex is risk averse (at least with respect to this range of monetary amounts).
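One way to set up the comparison (a sketch of the setup only, writing \(U\) for Alex's utility function, a symbol introduced here rather than in the exercise): the two games have expected utilities
\[
\begin{aligned}
EU(\text{Game 1}) &= \tfrac{1}{2}\,U(\$100) + \tfrac{1}{2}\,U(\$0),\\
EU(\text{Game 2}) &= \tfrac{1}{4}\,U(\$100) + \tfrac{1}{2}\,U(\$50) + \tfrac{1}{4}\,U(\$0),
\end{aligned}
\]
so \(EU(\text{Game 2}) > EU(\text{Game 1})\) rearranges to \(U(\$50) > \tfrac{1}{2}\,U(\$100) + \tfrac{1}{2}\,U(\$0)\); that is, the utility of the expected monetary value of the 50-50 lottery over \(\$0\) and \(\$100\) exceeds that lottery's expected utility, which is the defining condition of risk aversion at this point.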
Show that if \(X_1\) and \(X_2\) are preferentially independent of \(X_3\), and \(X_2\) and \(X_3\) are preferentially independent of \(X_1\), then \(X_3\) and \(X_1\) are preferentially independent of \(X_2\).
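For reference, a sketch of the definition the proof rests on (stated in the standard multi-attribute notation; the exercise itself does not fix notation): a pair of attributes, say \(X_1\) and \(X_2\), is preferentially independent of \(X_3\) if preferences between outcomes that differ only in their \(X_1\) and \(X_2\) values do not depend on the fixed value of \(X_3\), i.e.
\[
(x_1, x_2, x_3) \succ (x_1', x_2', x_3) \ \text{for some value } x_3
\quad\Longrightarrow\quad
(x_1, x_2, x_3') \succ (x_1', x_2', x_3') \ \text{for every value } x_3'.
\]
One way to approach the exercise is to chain the two given independence statements so that any preference between outcomes differing only in their \(X_3\) and \(X_1\) values can be shown not to depend on the value of \(X_2\).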