
Posts

Random walk by the cliff

You are one step away from falling down a cliff. In each step, you have a 1/3 chance of moving towards the cliff and a 2/3 chance of moving away from it. What is the probability that you fall down the cliff?

Solution

It is 1/2, and there are 3 ways to approach this. Let's say we are at position 0, and the cliff is at position -1. We go +1 with probability p, and -1 with probability 1-p.

Approach 1. Catalan numbers: every path that ends with falling down the cliff can be broken into two parts: 0 -> ... -> 0 and 0 -> -1. The path 0 -> ... -> 0 is Catalan; if we fix its length to 2n, the number of such paths is C_n. What we want to compute is the total probability \( (1-p) \sum_n C_n (p(1-p))^n \). Using the generating function for the Catalan numbers, we arrive at: if p <= 1/2, the probability is 1; if p > 1/2, the probability is 1/p - 1.

Approach 2. Let P(i) be the probability of reaching -1 from position i. P(0) = 1-p + pP(1). Also, P(1) can be derived as follows: can we r
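Not part of the original post, but here is a small Python sanity check comparing the closed form, a truncated version of the Catalan sum, and a Monte Carlo estimate for p = 2/3. The function names, truncation length, and step limit are arbitrary illustrative choices.

```python
import random

def fall_prob_closed_form(p):
    """Closed form from the post: 1 if p <= 1/2, else 1/p - 1,
    where p is the probability of stepping AWAY from the cliff."""
    return 1.0 if p <= 0.5 else 1.0 / p - 1.0

def fall_prob_catalan(p, terms=5000):
    """Truncated Catalan sum (1 - p) * sum_n C_n * (p(1-p))^n.
    Terms are accumulated incrementally via C_{n+1} = C_n * 2(2n+1)/(n+2)."""
    x = p * (1.0 - p)
    term, total = 1.0, 1.0          # the n = 0 term: C_0 * x^0 = 1
    for n in range(terms - 1):
        term *= x * 2 * (2 * n + 1) / (n + 2)
        total += term
    return (1.0 - p) * total

def fall_prob_mc(p, trials=10_000, max_steps=1_000):
    """Monte Carlo estimate; walks that survive max_steps are treated as escaped."""
    falls = 0
    for _ in range(trials):
        pos = 0
        for _ in range(max_steps):
            pos += 1 if random.random() < p else -1
            if pos < 0:
                falls += 1
                break
    return falls / trials

if __name__ == "__main__":
    p = 2.0 / 3.0                   # step away from the cliff with probability 2/3
    print("closed form:", fall_prob_closed_form(p))
    print("catalan sum:", fall_prob_catalan(p))
    print("monte carlo:", fall_prob_mc(p))
```

All three should land near 1/2 for p = 2/3.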
Recent posts

Birthdays and Celebrations (Quick Notes)

Given N people, what is the number of birthday celebrations on average in a year?

Solution

The quick solution: the probability of having a celebration on a particular day is 1 - (364/365)^N, so the average is 365(1 - (364/365)^N). (Linearity of expectation.)

I wonder if someone would try to solve this problem directly, i.e. compute the number of ways to assign birthdays to N people so that there are exactly k celebrations, call this B(N, k); then the average number of celebrations = (1/365^N) sum_k k B(N, k). It turns out B(N, k) is very similar to the Stirling number of the second kind, which counts the number of ways to partition N labelled objects into k non-empty unlabelled groups. One can come up with the recurrence relation quite easily, but computing it directly is interesting and relies on the principle of inclusion and exclusion. The PIE for B(N, k) goes like this: let A_d be the set containing all assignments of birthdays where birthday d is unused. Then the answer is k^N - | A_1 union A_2 union ... u
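A minimal numeric check of the linearity-of-expectation formula (not from the post; the sample sizes and trial count are arbitrary):

```python
import random

def expected_celebrations(n_people, days=365):
    """E[# distinct days with at least one birthday] = days * (1 - ((days-1)/days)^N)."""
    return days * (1 - ((days - 1) / days) ** n_people)

def simulated_celebrations(n_people, days=365, trials=20_000):
    """Monte Carlo check: assign uniform birthdays and count the distinct days used."""
    total = 0
    for _ in range(trials):
        total += len({random.randrange(days) for _ in range(n_people)})
    return total / trials

if __name__ == "__main__":
    for n in (1, 23, 100, 365):
        print(n, round(expected_celebrations(n), 3), round(simulated_celebrations(n), 3))
```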

Fermat Pascal Dice Game Fairness

Imagine you play a dice-rolling game with a friend. If you roll a '6' within 8 rolls, you win the entire stake; otherwise you lose. Suppose you have rolled 3 times and haven't won yet. Your friend now suggests skipping the 4th roll, but he will compensate you fairly for it. You will still get to make your 5th, 6th, 7th, and 8th rolls. As compensation for skipping the 4th roll, he will take a fair amount out of the stake and give it to you, keeping the rest of the stake at play. What would be considered a fair compensation?

Discussion

I talked to a genius and she gave me the following argument: "I think of it as two options, to skip or not to skip. I will only skip if that option gives me an expectation that is higher than, or at least the same as, if I don't skip."

Expectation if I don't skip: E1 = 1/6 * S + 5/6 * (1 - (5/6)^4) * S, where S is the stake.

Expectation if I skip: E2 = c + (1 - (5/6)^4) * (S - c), where c is the compensation.

We want E1 <= E2. We
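A small numeric sketch (not from the post) that sets E1 = E2 and solves for c, with the stake normalised to S = 1; it suggests the fair compensation is S/6:

```python
# Solve E1 = E2 for the compensation c, with the stake normalised to S = 1.
S = 1.0
q = (5 / 6) ** 4                      # probability the remaining 4 rolls all miss

# E1: keep the 4th roll -> win now with prob 1/6, else win in rolls 5-8 with prob 1 - q
E1 = (1 / 6) * S + (5 / 6) * (1 - q) * S

# E2 = c + (1 - q) * (S - c) = c*q + (1 - q)*S is linear in c, so solve E1 = E2 directly
c = (E1 - (1 - q) * S) / q

print("fair compensation as a fraction of the stake:", c)   # 0.1666... = 1/6
```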

Derive Poisson Distribution

Let's say we are observing an event that to us seems random, e.g. the number of cars passing through a road. Suppose we learn that on average there are m cars passing the road within 1 hour.

Goal: Can we make a model that suggests the probability of seeing exactly r cars passing the road in an hour?

Let's represent one hour as a line, with marks on the line for the times at which a car passes. Let's suppose there are m marks, representing the average case of seeing m cars on the road in an hour. Let's divide the 1-hour line into n uniform time segments, with n large enough that each time segment contains at most 1 car. Out of the n time segments, m of them contain a car. Then it is reasonable to say that the probability of a time segment containing a car is m/n.

Let's analyse what happens as n goes to infinity. We can say P(seeing exactly r cars in an hour) = P(exactly r of the n segments contain a car) = \( \binom{n}{r} \left( \frac{m}{n} \right)^r \left( 1 - \frac{m}{n} \right)^{n-r} \)
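To see the limit numerically, here is a short sketch (not from the post; m = 4 is an arbitrary illustrative rate) comparing the binomial probability above for increasing n with the Poisson probability \( e^{-m} m^r / r! \):

```python
from math import comb, exp, factorial

def binomial_pmf(r, n, m):
    """P(exactly r of the n segments contain a car), each segment with probability m/n."""
    p = m / n
    return comb(n, r) * p**r * (1 - p) ** (n - r)

def poisson_pmf(r, m):
    """The n -> infinity limit: e^{-m} * m^r / r!."""
    return exp(-m) * m**r / factorial(r)

if __name__ == "__main__":
    m = 4.0                                        # average cars per hour (illustrative)
    for n in (10, 100, 10_000):
        print(f"n={n:>6}", [round(binomial_pmf(r, n, m), 5) for r in range(6)])
    print("poisson ", [round(poisson_pmf(r, m), 5) for r in range(6)])
```

The binomial rows approach the Poisson row as n grows.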

Derive e

Prove that \( \left( 1 + \frac{1}{n} \right)^n \) converges.

Solution

Let \( G(n) = \left( 1 + \frac{1}{n} \right)^n \). By binomial expansion, \( G(n) = \sum_{k = 0}^{n} \frac{ \binom{n}{k} }{n^k} \). By a simple inequality argument, \( \frac{ \binom{n}{k} }{n^k} \le \frac{1}{k!} \). So if we let \( F(n) = \sum_{k = 0}^{n} \frac{1}{k!} \), we have \( G(n) \le F(n) \). By another simple inequality argument, \( F(n) \le 1 + \sum_{k = 1}^{n} \frac{1}{2^{k-1}} < 3 \). So G and F are bounded above.

By AM-GM, \( \frac{G(n-1)}{G(n)} = \left( \frac{ n^2 }{ n^2 - 1} \right)^{n-1} \frac{ n}{n+1} < \left( \frac{ \frac{ n^2 }{ n^2 - 1} (n-1) + \frac{ n}{n+1} }{n} \right)^n = 1 \). So \( G(n-1) < G(n) \). Since G is monotonically increasing and bounded above, G converges. Similarly F converges.

Let's also prove that G and F converge to the same limit. By studying individual terms in \( F(n) - G(n) \) and making simple inequality arguments, we can show that for each \( \epsilon > 0 \) we can
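A quick numeric illustration (not from the post) of both sequences approaching the same limit, compared against math.e; the sample values of n are arbitrary:

```python
from math import e

def G(n):
    """(1 + 1/n)^n"""
    return (1 + 1 / n) ** n

def F(n):
    """Partial sum 1/0! + 1/1! + ... + 1/n!, accumulated with a running term."""
    total, term = 1.0, 1.0
    for k in range(1, n + 1):
        term /= k
        total += term
    return total

if __name__ == "__main__":
    for n in (1, 10, 100, 10_000):
        print(f"n={n:>6}  G(n)={G(n):.8f}  F(n)={F(n):.8f}")
    print("math.e =", e)
```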

Random chord on circle

Problem: Given a circle of radius 1, pick a chord at random. What is the chance that the length of the chord is less than the radius?

Solution: Depending on how you pick it at random, the probability can be different!

1. If you pick it by randomly choosing 2 points on the circle's circumference, then the chance is 1/3.

2. Every chord is determined by its midpoint (the radius through the midpoint is the chord's perpendicular bisector). If we pick this midpoint uniformly at random within the circle, the chord length will be less than the radius exactly when the midpoint lies outside a smaller circle of radius \( \frac{\sqrt{3}}{2} \). So the chance is 1/4, based on the ratio of areas.

3. From the centre, randomly choose a radial direction and randomly pick a point along this radius, then draw the chord through this point perpendicular to the radius. Then the chance is \( 1 - \frac{\sqrt{3}}{2} \).

Surprising, huh. In fact, you can compute the distribution of the chord length given the picking method. Is there a way to pick a chord &
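A Monte Carlo sketch of the three picking methods (not from the post; this is the classic Bertrand-paradox setup, the trial counts are arbitrary, and the first method fixes one endpoint by symmetry):

```python
import math
import random

RADIUS = 1.0   # the chord counts as "short" if its length is below this

def endpoints_method(trials=200_000):
    """Fix one endpoint, pick the other uniformly on the circumference.
    The chord length is 2*sin(theta/2), theta being the angle between the endpoints."""
    hits = sum(2 * math.sin(random.uniform(0, 2 * math.pi) / 2) < RADIUS
               for _ in range(trials))
    return hits / trials

def midpoint_method(trials=200_000):
    """Pick the chord's midpoint uniformly in the disc (rejection sampling).
    A midpoint at distance d from the centre gives a chord of length 2*sqrt(1 - d^2)."""
    hits = 0
    for _ in range(trials):
        while True:
            x, y = random.uniform(-1, 1), random.uniform(-1, 1)
            if x * x + y * y <= 1:
                break
        d = math.hypot(x, y)
        hits += 2 * math.sqrt(1 - d * d) < RADIUS
    return hits / trials

def radius_method(trials=200_000):
    """Pick a uniform distance d along a random radius; the chord is perpendicular there."""
    hits = sum(2 * math.sqrt(1 - d * d) < RADIUS
               for d in (random.uniform(0, 1) for _ in range(trials)))
    return hits / trials

if __name__ == "__main__":
    print("two endpoints :", endpoints_method(), "(expect 1/3)")
    print("midpoint      :", midpoint_method(), "(expect 1/4)")
    print("radial point  :", radius_method(), "(expect 1 - sqrt(3)/2, about 0.134)")
```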

Coin toss game

Problem

You and your friend play a game involving coin flips. Each time the coin lands heads, your friend pays you $1. Each time the coin lands tails, you pay your friend $1. The game goes on until either you win $100 or you lose $50. What is the probability of you winning?

Solution

There are 3 solutions: two algebraic, one argumentative.

Algebraic Solution 1

Let \( p_i \) be the probability of winning $100 given that we have already accumulated \( i \) dollars (which can be negative, meaning we are losing money). Then we have \( p_i = \frac{1}{2} p_{i-1} + \frac{1}{2} p_{i+1} \), with boundary values \( p_{100} = 1 \) and \( p_{-50} = 0 \). It can be shown that \( p_i = \frac{i + 50}{150} \) is the only solution that satisfies this system of linear equations, so the answer is \( p_0 = \frac{1}{3} \). This approach can be extended to a more general problem: what is the probability of winning if we stop at either winning $n or losing $m? It should be \( \frac{m}{n+m} \).

Algebraic Solution 2

This approach is inspired by the
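A short verification sketch (not from the post): it checks the closed form m/(n+m) against a fair-coin simulation of the stopping game. The trial count is kept small since each game runs for roughly n*m flips on average.

```python
import random

def win_probability(n_win, m_lose):
    """Fair-coin gambler's ruin: probability of reaching +n before -m, starting from 0."""
    return m_lose / (n_win + m_lose)

def simulate(n_win, m_lose, trials=5_000):
    """Monte Carlo check: play the game until one of the two stopping levels is hit."""
    wins = 0
    for _ in range(trials):
        pos = 0
        while -m_lose < pos < n_win:
            pos += 1 if random.random() < 0.5 else -1
        wins += pos == n_win
    return wins / trials

if __name__ == "__main__":
    print("closed form:", win_probability(100, 50))    # 50/150 = 1/3
    print("simulation :", simulate(100, 50))
```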