Introduction to Mathematical Probability and Statistics - CSUSM

Math 440, Spring 2012

Course Information

Instructor:    Olaf Hansen
Email:         [email protected]
Office:        Science 2, Room 229
Office phone:  760–750–8005
Office hours:  Monday 14:00–15:30, Thursday 14:00–15:30
Website:       faculty.csusm.edu/ohansen

Lecture:       Tuesday, Thursday, Science II 308, 8:00–9:50
Prerequisite:  Math 260
Textbook:      D.D. Wackerly, W. Mendenhall, R.L. Scheaffer, Mathematical Statistics with Applications, 7th edition

Course description:

Statistics is concerned with methods to make inferences based on observed data. For example, how can we design an experiment, e.g., tossing a die several times, in order to conclude with a given certainty that the die is not loaded? In this example we already see that statistics is interwoven with the idea of randomness. We will not try to define randomness in this lecture, but we start with the mathematical definition of a probability measure on a sample space. Once we have defined the properties of a probability measure, we study different kinds of sample spaces (discrete, R, R²) and various well-known examples of probability distributions (binomial, Poisson, normal, gamma, beta) and their properties. We introduce random variables and study properties like expected value, variance, and correlation, as well as the probability distributions of functions of random variables. We conclude the study of random variables with the Central Limit Theorem. Once we have developed the tools to formulate statistical questions in our mathematical framework, we develop some elementary statistical theory. We study point and interval estimators and introduce notions like bias, consistency, and sufficiency to describe the properties of estimators. The method of moments and the maximum likelihood method are two methods to construct estimators. We conclude the lecture with an introduction to test statistics. If time allows, we will study the Neyman–Pearson Lemma and the likelihood ratio test as our final topics.
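
To make the die example concrete, here is a standard illustration (the sample size and the 5% level are example choices, and the underlying chi-square test is only developed later in the course): toss the die n = 600 times and record the observed count O_i of each face i = 1, ..., 6. If the die is fair, the expected count of each face is E_i = 100, and the statistic

X² = (O₁ − E₁)²/E₁ + ... + (O₆ − E₆)²/E₆

has approximately a chi-square distribution with 5 degrees of freedom. For a fair die a value of X² above roughly 11.07 occurs with probability less than 5%, so observing such a value would let us conclude, with that degree of certainty, that the die is loaded.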


Course goals:

The course is an introduction to the mathematical theory of probability and statistics. After the course students should be familiar with

• sample spaces and probability measures; students should be able to construct a suitable sample space for simple experiments and find the corresponding probability (see the short example after this list)
• common probability distributions, like the binomial, hypergeometric, Poisson, uniform, normal, gamma, ...
• conditional probability and its use to calculate the probability of events
• random variables, their properties, and their use as statistical estimators
• point and interval estimators and their properties
• test statistics and their properties
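
As a short example of the first goal (a standard illustration, not tied to a particular exercise in the textbook): for the experiment of tossing a fair coin twice, a suitable sample space is S = {HH, HT, TH, TT}, where each outcome is assigned probability 1/4. The probability of the event "at least one head" = {HH, HT, TH} is then 1/4 + 1/4 + 1/4 = 3/4.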

Homework:

Homework will be assigned and collected regularly. Your 2500-word writing requirement will be met by these assignments.

General policy:

It is important that students attend class. It is expected that students read the relevant sections of the textbook before they come to class. When a class is missed, the student is responsible for obtaining the class notes from another student. All work on homework and exams must be your own. No cheating is tolerated. Anyone who cheats during an exam will receive an F for that test. Incidents of Academic Dishonesty will be reported to the Dean of Students; sanctions at the University level may include suspension or expulsion from the University (see also pages 94–95, University Catalog 2010–2012). If a student misses an exam and can present a legitimate, documented excuse, the exam will not be counted and the final exam will receive more weight in the calculation of the overall grade. Examples of legitimate excuses are: illness (in which case I would like to get a note from the doctor), University activities (in which case I would like to get a note from the instructor), serious family emergencies, ... Students who have a disability that may require some modification of seating, testing, or other class requirements are encouraged to contact me after class or during office hours.


Grading:

Grading will be based on the homework, two midterm exams, and one final exam.

               Weight   Date
Exam 1:        25 %     March 8
Exam 2:        25 %     April 12
Final Exam:    30 %     May 15, 7:00–9:00
Homework:      20 %

There will be a grade for every exam, and the grade for the homework is given by the following table (the grade will include plus and minus):

Grade   Percentage
F       0%–39%
D       40%–54%
C       55%–69%
B       70%–84%
A       85%–100%
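
As an illustration of how the weights combine (the scores below are made up, not actual data): a student with 90% on the homework, 80% and 75% on the two midterm exams, and 85% on the final exam would have an overall score of 0.20 · 90 + 0.25 · 80 + 0.25 · 75 + 0.30 · 85 = 82.25%, which falls in the B range.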

Tentative Course Outline:

1/23–1/27 (Sections 2.4, 2.5, 2.6)
We define a sample space and discuss the mathematical definition of a probability measure, the Kolmogorov axioms. We construct our first probability measures. We start to study counting methods.

1/30–2/3 (Sections 2.6, 2.7, 2.8)
We continue with counting methods to derive probabilities and introduce conditional probabilities and the independence of events for discrete sample spaces.

2/6–2/10 (Sections 2.9, 2.10, 2.11, 3.2, 3.3)
We cover Bayes' rule and introduce random variables. Then we start to study the probability distribution of discrete random variables.

2/13–2/17 (Sections 3.3, 3.11, 3.4, 3.5)
We define the expected value of a random variable. We continue with the variance of random variables, and Tschebyscheff's Theorem connects expected value and variance. The binomial distribution is one of the most important discrete probability distributions. Then we cover the geometric distribution.

2/20–2/24 (Sections 3.7, 3.8, 3.9, 4.2)
We continue with the hypergeometric and Poisson distributions. Then we look at the moments of a discrete random variable and define the moment generating function. Then we see the first continuous random variable.

2/27–3/2 (Sections 4.2, 4.3, 4.10, 4.4, 4.5)
We continue to look at continuous random variables and repeat the construction from Chapter 3: expected value, variance, Tschebyscheff's Theorem. Our first examples of continuous distributions are the uniform and the normal distribution.

3/5–3/9 (Sections 4.6, 4.7)
We continue with the gamma and beta distributions. Exam 1.

3/12–3/16 (Sections 5.2, 5.3)
We continue with distributions in several dimensions. For distributions in several dimensions we can define marginal distributions and again conditional distributions and densities.

3/19–3/23
Spring break.

3/26–3/30 (Sections 5.4, 5.5, 5.6, 5.7)
The notion of conditional densities leads to the definition of independent random variables. Then we study functions of random variables, where the covariance is an example.

4/2–4/6 (Sections 5.8, 6.2, 6.5)
Finally we study how these functions, like expected value and covariance, behave under linear transformations. We learn how to find the density of a function of random variables. The moment generating function is again helpful to find the distribution of a function of a random variable.

4/9–4/13 (Sections 7.2, 7.3)
We look at special distributions like the χ² distribution. We will only quote the Central Limit Theorem. Exam 2.

4/16–4/20 (Sections 8.2, 8.3, 8.4, 8.5, 8.6)
We introduce point estimators and learn what it means for an estimator to be biased. We learn one method to evaluate the quality of an estimator. We construct confidence intervals. A special topic is confidence intervals for large sample sizes.

4/23–4/27 (Sections 8.6, 8.7, 9.2, 9.3)
We turn to the question of how to choose the sample size. Two more properties of estimators are efficiency and sufficiency.

4/30–5/4 (Sections 9.6, 9.7)
Two methods to construct estimators are the method of moments and the maximum likelihood method.

5/7–5/11 (Sections 10.2, 10.3, 10.4, (10.10))
We introduce statistical tests and define type I and type II errors. We look at examples of tests and calculate type II errors. If time allows we will learn about the power of a test.
