# Probability (for Computer Science)

## General information

• Lectures
  • Omar Fawzi
  • Mondays 13:30-15:30
  • Amphi B
• Tutorials
  • Elena Krishanova and Bertrand Simon
  • ??
The topics we are planning to cover include:
• Probability formalism (illustrated with fingerprinting, the min-cut algorithm, and more)
• Concentration bounds (Chernoff, Azuma-Hoeffding)
• Random graphs
• The probabilistic method
• Markov chains
• Continuous distributions
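
One of the opening illustrations listed above, verifying polynomial identities, can be sketched in a few lines. This is a minimal illustration written for this page (the function name and constants are mine, not from the course): two polynomials are compared by evaluating them at random points modulo a prime, and by the Schwartz-Zippel argument each trial wrongly reports "equal" with probability at most deg/p.

```python
import random

def polys_probably_equal(p, q, trials=20, prime=10**9 + 7):
    """Randomized identity test for polynomials given as coefficient
    lists (lowest degree first).  Each trial evaluates both at a random
    point mod `prime`; a false "equal" happens with probability at most
    deg/prime per trial, so `trials` independent trials make an error
    astronomically unlikely."""
    def eval_at(coeffs, x):
        acc = 0
        for c in reversed(coeffs):   # Horner's rule, mod prime
            acc = (acc * x + c) % prime
        return acc
    return all(eval_at(p, r) == eval_at(q, r)
               for r in (random.randrange(prime) for _ in range(trials)))

# (x + 1)^2 and x^2 + 2x + 1 are the same polynomial
print(polys_probably_equal([1, 2, 1], [1, 2, 1]))  # True
```

Note the one-sided error: "not equal" is always correct, while "equal" holds only with high probability, which is the Monte Carlo pattern revisited later in the course.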

### Grading

The grading scheme will take into account:
• A final exam
• One midterm exam
• Homeworks (every 1 or 2 weeks)

## Schedule

| Date | Topic | References |
|------|-------|------------|
| Jan 30th | Probability spaces, events, independence, conditioning. Examples: verifying polynomial identities and matrix multiplication. | Ch 1 of [MU], with slightly more formal definitions for probability spaces |
| Feb 1st | Examples of probability spaces, Borel sets, Lebesgue measure. Random variables and expectation. Linearity of expectation. | Beginning of Ch 2 of [MU], with more formal definitions. See also lecture notes 1, 2, 3 from this course |
| Feb 8th | Geometric and binomial distributions. Coupon collector problem. Randomized quicksort. Markov's inequality. Variance. Variance of a sum of independent random variables. Chebyshev's inequality. | Continuing Ch 2 and beginning of Ch 3 of [MU] |
| Feb 27th | Chernoff bounds for Binomial(n, 1/2) and for sums of independent 0/1 random variables. Sampling application of Chernoff. Load balancing (Chernoff + union bound). | Ch 4 of [MU]. See also [O'Do] on Chernoff bounds and load balancing |
| Mar 5th | Random processes. Balls and bins: birthday problem, maximum load (ln n)/(ln ln n) for n = m. Poisson approximation (Bin(n, c/n) is close to Poi(c)). Random graphs: isolated vertices. Branching processes (definition and analysis of the expected size of generation n). Defined conditional expectation (for discrete random variables). | Ch 5 of [MU]. Lecture notes on random graphs (which also include a chapter on branching processes) |
| Mar 13th | Finished branching processes: extinction probability. The probabilistic method. Lower bound on Ramsey numbers. Using an expectation argument: maximum satisfiability. Sample and modify: graphs with large girth. Lovász local lemma: a sufficient condition for a formula to be satisfiable. | Ch 6 of [MU] |
| Mar 20th | Randomized algorithms: Monte Carlo and Las Vegas. Examples: min-cut (contractions), Miller-Rabin primality testing, and finding an assignment satisfying 7/8 of the clauses. Started continuous distributions. | Ch 1 of [MU] for min-cut; [O'Do] for Miller-Rabin |
| Mar 28th | Continuous distributions. Jointly continuous distributions. Buffon's needle example. Minimum of n uniforms. Examples: uniform, exponential (with the memoryless property), (multivariate) normal distribution. Application: Johnson-Lindenstrauss for dimension reduction (without proof). | Ch 8 of [MU] |
| Apr 3rd | Convergence of random variables: almost sure and in probability. Almost sure convergence implies convergence in probability, but the converse is not true. Weak and strong laws of large numbers (on the way, Borel-Cantelli). | See lectures 4 and 5 of this course |
| Apr 10th | Convergence in distribution, central limit theorem (proof sketch only). Started Markov chains: definition, representations (matrix, graph). Example with the analysis of a 2-SAT algorithm. | See lecture 6 of this course and Ch 7 of [MU] |
| Apr 24th | Finished the analysis of the 2-SAT algorithm and discussed 3-SAT. Notions of recurrent and transient states and their different characterizations. Random walks on Z^d (recurrent for d <= 2, transient for d > 2). Irreducible Markov chains. | Ch 7 of [MU] |
| May 2nd | For an irreducible MC, if one state is recurrent then for all i, j the probability of reaching j starting from i is 1. Definition of a stationary distribution. Obstacles to convergence: reducibility and periodicity. For an irreducible MC with a stationary distribution, all states are recurrent. For an irreducible and aperiodic MC, for any states i, j there is an n0 such that P_{i,j}^{n} > 0 for all n > n0. Main convergence theorem: for an irreducible and aperiodic MC with stationary distribution \pi, P_{i,j}^n -> \pi_j. Sufficient condition for a stationary distribution: finite expected return time h_{i,i}; then \pi_i = 1/h_{i,i} is a stationary distribution. For a finite irreducible MC, h_{i,i} is always finite. | Ch 7 of [MU] |
| May 9th | An important class of MCs: random walks on graphs. Expected return time to i is 2\|E\|/deg(i). The cover time of any graph is upper bounded by 4\|V\|\|E\|. Introduction to statistics. | Ch 7 of [MU] |
| May 15th | Maximum likelihood estimator. Hypothesis testing, p-value. | Ch 1 of the notes and parts from other chapters |
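
The Mar 20th lecture covers the min-cut contraction algorithm as a Monte Carlo example; the following is a rough sketch of that idea, written for this page rather than taken from the course (the function name and trial count are my choices). Each trial contracts uniformly random edges until two super-vertices remain; a single trial returns a minimum cut with probability at least 2/(n(n-1)), so independent repetitions boost the success probability.

```python
import random

def karger_min_cut(edges, trials=200):
    """Repeated runs of the contraction algorithm on an undirected
    multigraph given as a list of (u, v) edges.  Returns the smallest
    cut size seen over `trials` independent contractions."""
    best = float("inf")
    for _ in range(trials):
        parent = {}                      # union-find over super-vertices

        def find(x):
            while parent.setdefault(x, x) != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        remaining = len({v for e in edges for v in e})
        while remaining > 2:
            u, v = random.choice(edges)  # pick a uniformly random edge
            ru, rv = find(u), find(v)
            if ru != rv:                 # contract it (skip self-loops)
                parent[ru] = rv
                remaining -= 1
        # edges crossing the two remaining super-vertices form a cut
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best

# square 0-1-2-3 with diagonal 0-2: the minimum cut isolates a
# degree-2 corner, so its size is 2
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(karger_min_cut(edges))  # prints 2 with overwhelming probability
```

With 200 trials on this 4-vertex graph, each trial succeeds with probability at least 1/6, so the chance of missing the minimum cut is at most (5/6)^200, which is negligible; this is exactly the amplification argument from the lecture.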