Elements of Distribution Theory
T. A. Severini
Preface

Distribution theory lies at the interface of probability and statistics. It is closely related to probability theory; however, it differs in its focus on the calculation and approximation of probability distributions and associated quantities such as moments and cumulants. Although distribution theory plays a central role in the development of statistical methodology, distribution theory itself does not deal with issues of statistical inference.

Many standard texts on mathematical statistics and statistical inference contain either a few chapters or an appendix on basic distribution theory. I have found that such treatments are generally too brief, often ignoring such important concepts as characteristic functions or cumulants. On the other hand, the discussion in books on probability theory is often too abstract for readers whose primary interest is in statistical methodology.

The purpose of this book is to provide a detailed introduction to the central results of distribution theory, in particular, those results needed to understand statistical methodology, without requiring an extensive background in mathematics. Chapters 1 to 4 cover basic topics such as random variables, distribution and density functions, expectation, conditioning, characteristic functions, moments, and cumulants. Chapter 5 covers parametric families of distributions, including exponential families, hierarchical models, and models with a group structure. Chapter 6 contains an introduction to stochastic processes.

Chapter 7 covers distribution theory for functions of random variables and Chapter 8 covers distribution theory associated with the normal distribution. Chapters 9 and 10 are more specialized, covering asymptotic approximations to integrals and orthogonal polynomials, respectively. Although these are classical topics in mathematics, they are often overlooked in statistics texts, despite the fact that the results are often used in statistics. For instance, Watson's lemma and Laplace's method are general, useful tools for approximating the integrals that arise in statistics, and orthogonal polynomials are used in areas ranging from nonparametric function estimation to experimental design.

Chapters 11 to 14 cover large-sample approximations to probability distributions. Chapter 11 covers the basic ideas of convergence in distribution and Chapter 12 contains several versions of the central limit theorem. Chapter 13 considers the problem of approximating the distribution of statistics that are more general than sample means, such as nonlinear functions of sample means and U-statistics. Higher-order asymptotic approximations such as Edgeworth series approximations and saddlepoint approximations are presented in Chapter 14.

I have attempted to keep each chapter as self-contained as possible, but some dependencies are inevitable. Chapter 1 and Sections 2.1–2.4, 3.1–3.2, and 4.1–4.4 contain core topics that are used throughout the book; the material covered in these sections will most likely be familiar to readers who have taken a course in basic probability theory. Chapter 12 requires Chapter 11, and Chapters 13 and 14 require Chapter 12; in addition, Sections 13.3 and 13.5 use material from Sections 7.5 and 7.6.

The mathematical prerequisites for this book are modest. Good backgrounds in calculus and linear algebra are important, and a course in elementary mathematical analysis at the level of Rudin (1976) is useful, but not required. Appendix 3 gives a detailed summary of the mathematical definitions and results that are used in the book.

Although many results from elementary probability theory are presented in Chapters 1 to 4, it is assumed that readers have had some previous exposure to basic probability theory. Measure theory, however, is not needed and is not used in the book. Thus, although measurability is briefly discussed in Chapter 1, throughout the book all subsets of a given sample space are implicitly assumed to be measurable. The main drawback of this is that it is not possible to rigorously define an integral with respect to a distribution function and to establish commonly used properties of this integral. Although, ideally, readers will have had previous exposure to integration theory, it is possible to use these results without fully understanding their proofs; to help in this regard, Appendix 1 contains a brief summary of the integration theory needed, along with important properties of the integral.

Proofs are given for nearly every result stated. The main exceptions are results requiring measure theory, although there are surprisingly few results of this type. In these cases, I have tried to outline the basic ideas of the proof and to give an indication of why more sophisticated mathematical results are needed. The other exceptions are a few cases in which a proof is given for real-valued random variables and the extension to random vectors is omitted, and a number of cases in which the proof is left as an exercise. I have not attempted to state results under the weakest possible conditions; on the contrary, I have often imposed relatively strong conditions if that allows a simpler and more transparent proof.