The False Discovery Rate

Its Meaning, Interpretation and Application in Data Science

Nicholas W Galwey (author); Geert Molenberghs (editor)

Format: Hardback

Publisher: John Wiley and Sons Ltd

Publishing: 30th Jan '25

£74.99

This title is due to be published on 30th January, and will be despatched as soon as possible.


An essential tool for statisticians and data scientists seeking to interpret the vast troves of data that increasingly power our world

First developed in the 1990s, the False Discovery Rate (FDR) describes the expected proportion of false positives among the null hypotheses that a testing procedure rejects. It has since become an essential tool for interpreting large datasets. In recent years, as datasets have become ever larger, and as the importance of ‘big data’ to scientific research has grown, the significance of the FDR has grown correspondingly.

The False Discovery Rate provides an analysis of the FDR's value as a tool, including why it should generally be preferred to the Bonferroni correction and other methods of accounting for multiplicity. It offers a systematic overview of the FDR, its core claims, and its applications.
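To illustrate the contrast drawn here, the sketch below (not taken from the book) compares the Benjamini–Hochberg step-up procedure, which controls the FDR, with the Bonferroni correction, which controls the stricter family-wise error rate, on a small set of illustrative p-values:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean rejection
    decision for each p-value, controlling the FDR at level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Find the largest rank k with p_(k) <= (k/m) * q ...
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k_max = rank
    # ... then reject the k_max smallest p-values.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

# Illustrative p-values (hypothetical, for demonstration only).
pvals = [0.001, 0.008, 0.012, 0.041, 0.2]

bh = benjamini_hochberg(pvals, q=0.05)
bonferroni = [p <= 0.05 / len(pvals) for p in pvals]

print(sum(bh))          # BH rejects 3 hypotheses
print(sum(bonferroni))  # Bonferroni rejects only 2
```

Because BH compares each ordered p-value against a threshold that grows with its rank, it typically declares more discoveries than Bonferroni's single threshold of 0.05/m, at the cost of tolerating a controlled proportion of false positives.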

Readers of The False Discovery Rate will also find:

  • Case studies throughout, rooted in real and simulated data sets
  • Detailed discussion of topics including representation of the FDR on a Q-Q plot, consequences of non-monotonicity, and many more
  • Wide-ranging analysis suited for a broad readership

The False Discovery Rate is ideal for Statistics and Data Science courses, and for short courses associated with conferences. It is also useful as supplementary reading in courses in other disciplines that require the statistical interpretation of “big data”. The book will also be of great value to statisticians and researchers looking to learn more about the FDR.

ISBN: 9781119889779


316 pages