Are Changing Constituencies Driving Rising Polarization in the U.S. House of Representatives?
Jesse Sussell
RAND Corporation, 2015
This report addresses two questions: first, whether the spatial distribution of the American electorate has become more geographically clustered over the last 40 years with respect to party voting and socioeconomic attributes; and second, whether this clustering process has contributed to rising polarization in the U.S. House of Representatives.

The Broken Dice, and Other Mathematical Tales of Chance
Ivar Ekeland
University of Chicago Press, 1993
Ivar Ekeland extends his consideration of the catastrophe theory of the universe begun in his widely acclaimed Mathematics and the Unexpected, by drawing on rich literary sources, particularly the Norse saga of Saint Olaf, and such current topics as chaos theory, information theory, and particle physics.

"Ivar Ekeland gained a large and enthusiastic following with Mathematics and the Unexpected, a brilliant and charming exposition of fundamental new discoveries in the theory of dynamical systems. The Broken Dice continues the same theme, and in the same elegant, seemingly effortless style, but focuses more closely on the implications of those discoveries for the rest of human culture. What are chance and probability? How has our thinking about them been changed by the discovery of chaos? What are all of these concepts good for? . . . Ah, but, I mustn't give the game away, any more than I should if I were reviewing a detective novel. And this is just as gripping a tale. . . . Beg, borrow, or preferably buy a copy. . . . I guarantee you won't be disappointed."—Ian Stewart, Science

The Chicago Guide to Writing about Multivariate Analysis, Second Edition
Jane E. Miller
University of Chicago Press, 2013
Many different people, from social scientists to government agencies to business professionals, depend on the results of multivariate models to inform their decisions.  Researchers use these advanced statistical techniques to analyze relationships among multiple variables, such as how exercise and weight relate to the risk of heart disease, or how unemployment and interest rates affect economic growth. Yet, despite the widespread need to plainly and effectively explain the results of multivariate analyses to varied audiences, few are properly taught this critical skill.

The Chicago Guide to Writing about Multivariate Analysis is the book researchers turn to when looking for guidance on how to clearly present statistical results and break through the jargon that often clouds writing about applications of statistical analysis. This new edition features even more topics and real-world examples, making it the must-have resource for anyone who needs to communicate complex research results.

For this second edition, Jane E. Miller includes four new chapters that cover writing about interactions, writing about event history analysis, writing about multilevel models, and the “Goldilocks principle” for choosing the right size contrast for interpreting results for different variables. In addition, she has updated or added numerous examples, while retaining her clear voice and focus on writers thinking critically about their intended audience and objective. Online podcasts, templates, and an updated study guide will help readers apply skills from the book to their own projects and courses.

This continues to be the only book that brings together all of the steps involved in communicating findings based on multivariate analysis—finding data, creating variables, estimating statistical models, calculating overall effects, organizing ideas, designing tables and charts, and writing prose—in a single volume. When aligned with Miller’s twelve fundamental principles for quantitative writing, this approach will empower readers—whether students or experienced researchers—to communicate their findings clearly and effectively.

The Cult of Statistical Significance
How the Standard Error Costs Us Jobs, Justice, and Lives
Stephen T. Ziliak and Deirdre N. McCloskey
University of Michigan Press, 2010

“McCloskey and Ziliak have been pushing this very elementary, very correct, very important argument through several articles over several years and for reasons I cannot fathom it is still resisted. If it takes a book to get it across, I hope this book will do it. It ought to.”

—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics

“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”

—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health

The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.

Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).


Error and the Growth of Experimental Knowledge
Deborah G. Mayo
University of Chicago Press, 1996
We may learn from our mistakes, but Deborah Mayo argues that, where experimental knowledge is concerned, we haven't begun to learn enough. Error and the Growth of Experimental Knowledge launches a vigorous critique of the subjective Bayesian view of statistical inference, and proposes Mayo's own error-statistical approach as a more robust framework for the epistemology of experiment. Mayo genuinely addresses the needs of researchers who work with statistical analysis, and simultaneously engages the basic philosophical problems of objectivity and rationality.

Mayo has long argued for an account of learning from error that goes far beyond detecting logical inconsistencies. In this book, she presents her complete program for how we learn about the world by being "shrewd inquisitors of error, white gloves off." Her tough, practical approach will be important to philosophers, historians, and sociologists of science, and will be welcomed by researchers in the physical, biological, and social sciences whose work depends upon statistical analysis.

Fact or Fluke?
A Critical Look at Statistical Evidence
Ronald Meester
Amsterdam University Press

Get in the Game
An Interactive Introduction to Sports Analytics
Tim Chartier
University of Chicago Press, 2022
An award-winning math popularizer, who has advised the US Olympic Committee, NFL, and NBA, offers sports fans a new way to understand truly improbable feats in their favorite games.
 
In 2013, NBA point guard Steph Curry wowed crowds when he sank 11 out of 13 three-pointers for a game total of 54 points—only seven other players, including Michael Jordan and Kobe Bryant, had scored more in a game at Madison Square Garden. Four years later, the University of Connecticut women’s basketball team won its hundredth straight game, defeating South Carolina 66–55. And in 2010, one forecaster—an octopus named Paul—correctly predicted the outcome of all of Germany’s matches in the FIFA World Cup. These are surprising events—but are they truly improbable?
 
In Get in the Game, mathematician and sports analytics expert Tim Chartier helps us answer that question—condensing complex mathematics down to coin tosses and dice throws to give readers both an introduction to statistics and a new way to enjoy sporting events. With these accessible tools, Chartier leads us through modeling experiments that develop our intuitive sense of the improbable. For example, to see how likely you are to beat Curry’s three-pointer feat, consider his 45.3 percent three-point shooting average in 2012–13. Take a coin and assume heads is making the shot (slightly better than Curry at a fifty percent chance). Can you imagine getting heads eleven out of thirteen times? With engaging exercises and fun, comic book–style illustrations by Ansley Earle, Chartier’s book encourages all readers—including those who have never encountered formal statistics or data simulations, or even heard of sports analytics, but who enjoy watching sports—to get in the game.
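To see the kind of calculation the coin-toss experiment stands in for, here is a minimal Python sketch (ours, not Chartier’s) that computes the chance of making at least 11 of 13 attempts, both for the fair-coin stand-in and for Curry’s actual 45.3 percent rate, and checks the exact answer with a simulation; the function names and trial count are illustrative.

```python
import random
from math import comb

def prob_at_least(k, n, p):
    """Exact binomial probability of making at least k of n shots."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def simulate_at_least(k, n, p, trials=100_000):
    """Estimate the same probability by repeated simulated 'coin flips'."""
    hits = 0
    for _ in range(trials):
        made = sum(random.random() < p for _ in range(n))
        if made >= k:
            hits += 1
    return hits / trials

# Fair coin (the stand-in described above) versus Curry's 2012-13 three-point rate.
print(prob_at_least(11, 13, 0.50))      # about 0.011, roughly 1 in 90
print(prob_at_least(11, 13, 0.453))     # about 0.004, roughly 1 in 230
print(simulate_at_least(11, 13, 0.50))  # Monte Carlo estimate, close to 0.011
```

Either way the feat is rare: even a fair coin "makes" 11 of 13 only about once in 90 tries.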

Good Thinking
The Foundations of Probability and Its Applications
I.J. Good
University of Minnesota Press, 1983
Good Thinking was first published in 1983. It is a representative sampling of I. J. Good’s writing on a wide range of questions about the foundations of statistical inference, especially where induction intersects with philosophy. Good believes that clear reasoning about many important practical and philosophical questions is impossible except in terms of probability. This book collects 23 of Good’s articles from various published sources, with an emphasis on the more philosophical than the mathematical.

He covers such topics as rational decisions, randomness, operational research, measurement of knowledge, mathematical discovery, artificial intelligence, cognitive psychology, chess, and the nature of probability itself. In spite of the wide variety of topics covered, Good Thinking is based on a unified philosophy which makes it more than the sum of its parts. The papers are organized into five sections: Bayesian Rationality; Probability; Corroboration, Hypothesis Testing, and Simplicity; Information and Surprise; and Causality and Explanation. The numerous references, an extensive index, and a bibliography guide the reader to related modern and historic literature.

This collection makes available to a wide audience, for the first time, the most accessible work of a very creative thinker. Philosophers of science, mathematicians, scientists, and, in Good’s words, anyone who wants “to understand understanding, to reason about reasoning, to explain explanation, to think about thought, and to decide how to decide” will find Good Thinking a stimulating and provocative look at probability.

Handbook of Quantitative Ecology
Justin Kitzes
University of Chicago Press, 2022
An essential guide to quantitative research methods in ecology and conservation biology, accessible for even the most math-averse student or professional.

Quantitative research techniques have become increasingly important in ecology and conservation biology, but the sheer breadth of methods that must be understood—from population modeling and probabilistic thinking to modern statistics, simulation, and data science—and a lack of computational or mathematics training have hindered quantitative literacy in these fields. In this book, ecologist Justin Kitzes addresses those challenges for students and practicing scientists alike.

Requiring only basic algebra and the ability to use a spreadsheet, Handbook of Quantitative Ecology is designed to provide a practical, intuitive, and integrated introduction to widely used quantitative methods. Kitzes builds each chapter around a specific ecological problem and arrives, step by step, at a general principle through the process of solving that problem. Grouped into five broad categories—difference equations, probability, matrix models, likelihood statistics, and other numerical methods—the book introduces basic concepts, starting with exponential and logistic growth, and helps readers to understand the field’s more advanced subjects, such as bootstrapping, stochastic optimization, and cellular automata. Complete with online solutions to all numerical problems, Kitzes’s Handbook of Quantitative Ecology is an ideal coursebook for both undergraduate and graduate students of ecology, as well as a useful and necessary resource for mathematically out-of-practice scientists.
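For a flavor of the difference-equation material the book starts from, here is a minimal sketch (our illustration, not Kitzes’s code) of discrete logistic growth, in which a population grows quickly at first and then levels off at a carrying capacity; the parameter values are arbitrary.

```python
def logistic_step(n, r=0.5, k=1000.0):
    """One step of the discrete logistic model: N[t+1] = N[t] + r*N[t]*(1 - N[t]/K)."""
    return n + r * n * (1 - n / k)

population = 10.0
trajectory = []
for _ in range(25):
    population = logistic_step(population)
    trajectory.append(round(population))

print(trajectory)  # rapid early growth that levels off near the carrying capacity K = 1000
```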

The Hidden Game of Baseball
A Revolutionary Approach to Baseball and Its Statistics
John Thorn, Pete Palmer, with David Reuther
University of Chicago Press, 2015
Long before Moneyball became a sensation or Nate Silver turned the knowledge he’d honed on baseball into electoral gold, John Thorn and Pete Palmer were using statistics to shake the foundations of the game. First published in 1984, The Hidden Game of Baseball ushered in the sabermetric revolution by demonstrating that we were thinking about baseball stats—and thus the game itself—all wrong. Instead of praising sluggers for gaudy RBI totals or pitchers for wins, Thorn and Palmer argued in favor of more subtle measurements that correlated much more closely to the ultimate goal: winning baseball games.

The new gospel promulgated by Thorn and Palmer opened the door for a flood of new questions, such as how a ballpark’s layout helps or hinders offense or whether a strikeout really is worse than another kind of out. Taking questions like these seriously—and backing up the answers with data—launched a new era, showing fans, journalists, scouts, executives, and even players themselves a new, better way to look at the game.

This brand-new edition retains the body of the original, with its rich, accessible analysis rooted in a deep love of baseball, while adding a new introduction by the authors tracing the book’s influence over the years. A foreword by ESPN’s lead baseball analyst, Keith Law, details The Hidden Game’s central role in the transformation of baseball coverage and team management and shows how teams continue to reap the benefits of Thorn and Palmer’s insights today. Thirty years after its original publication, The Hidden Game is still bringing the high heat—a true classic of baseball literature.

The Hidden Game of Football
A Revolutionary Approach to the Game and Its Statistics
Bob Carroll, Pete Palmer, and John Thorn
University of Chicago Press, 2023
The 1988 cult classic behind football’s data analytics revolution, now back in print with a new foreword and preface.

Data analytics have revolutionized football. With play sheets informed by advanced statistical analysis, today’s coaches pass more, kick less, and go for more two-point or fourth-down conversions than ever before. In 1988, sportswriters Bob Carroll, Pete Palmer, and John Thorn proposed just this style of play in The Hidden Game of Football, but at the time baffled readers scoffed at such a heartless approach to the game. Football was the ultimate team sport and unlike baseball could not be reduced to pure probabilities. Nevertheless, the book developed a cult following among analysts who, inspired by its unorthodox methods, went on to develop the core metrics of football analytics used today: win probability, expected points, QBR, and more. With a new preface by Thorn and Palmer and a new foreword by Football Outsiders’ Aaron Schatz, The Hidden Game of Football remains an essential resource for armchair coaches, fantasy managers, and fans of all stripes.

The History of Statistics
The Measurement of Uncertainty before 1900
Stephen M. Stigler
Harvard University Press, 1986

This magnificent book is the first comprehensive history of statistics from its beginnings around 1700 to its emergence as a distinct and mature discipline around 1900. Stephen M. Stigler shows how statistics arose from the interplay of mathematical concepts and the needs of several applied sciences including astronomy, geodesy, experimental psychology, genetics, and sociology. He addresses many intriguing questions: How did scientists learn to combine measurements made under different conditions? And how were they led to use probability theory to measure the accuracy of the result? Why were statistical methods used successfully in astronomy long before they began to play a significant role in the social sciences? How could the introduction of least squares predate the discovery of regression by more than eighty years? On what grounds can the major works of men such as Bernoulli, De Moivre, Bayes, Quetelet, and Lexis be considered partial failures, while those of Laplace, Galton, Edgeworth, Pearson, and Yule are counted as successes? How did Galton’s probability machine (the quincunx) provide him with the key to the major advance of the last half of the nineteenth century?

Stigler’s emphasis is upon how, when, and where the methods of probability theory were developed for measuring uncertainty in experimental and observational science, for reducing uncertainty, and as a conceptual framework for quantitative studies in the social sciences. He describes with care the scientific context in which the different methods evolved and identifies the problems (conceptual or mathematical) that retarded the growth of mathematical statistics and the conceptual developments that permitted major breakthroughs.

Statisticians, historians of science, and social and behavioral scientists will gain from this book a deeper understanding of the use of statistical methods and a better grasp of the promise and limitations of such techniques. The product of ten years of research, The History of Statistics will appeal to all who are interested in the humanistic study of science.


How Our Days Became Numbered
Risk and the Rise of the Statistical Individual
Dan Bouk
University of Chicago Press, 2015
Long before the age of "Big Data" or the rise of today's "self-quantifiers," American capitalism embraced "risk"--and proceeded to number our days. Life insurers led the way, developing numerical practices for measuring individuals and groups, predicting their fates, and intervening in their futures. Emanating from the gilded boardrooms of Lower Manhattan and making their way into drawing rooms and tenement apartments across the nation, these practices soon came to change the futures they purported to divine.

How Our Days Became Numbered tells a story of corporate culture remaking American culture--a story of intellectuals and professionals in and around insurance companies who reimagined Americans' lives through numbers and taught ordinary Americans to do the same. Making individuals statistical did not happen easily. Legislative battles raged over the propriety of discriminating by race or of smoothing away the effects of capitalism's fluctuations on individuals. Meanwhile, debates within companies set doctors against actuaries and agents, resulting in elaborate, secretive systems of surveillance and calculation.

Dan Bouk reveals how, in a little over half a century, insurers laid the groundwork for the much-quantified, risk-infused world that we live in today. To understand how the financial world shapes modern bodies, how risk assessments can perpetuate inequalities of race or sex, and how the quantification and claims of risk on each of us continue to grow, we must take seriously the history of those who view our lives as a series of probabilities to be managed.

Infinite-Dimensional Optimization and Convexity
Ivar Ekeland and Thomas Turnbull
University of Chicago Press, 1983
In this volume, Ekeland and Turnbull are mainly concerned with existence theory. They seek to determine whether, when given an optimization problem consisting of minimizing a functional over some feasible set, an optimal solution—a minimizer—may be found.

An Introduction to Mathematical Statistics
Fetsje Bijma, Marianne Jonker, and Aad van der Vaart
Amsterdam University Press, 2017
Statistics is the science that focuses on drawing conclusions from data, by modeling and analyzing the data using probabilistic models. In An Introduction to Mathematical Statistics the authors describe key concepts from statistics and give a mathematical basis for important statistical methods. Much attention is paid to the sound application of those methods to data. The three main topics in statistics are estimators, tests, and confidence regions. The authors illustrate these in many examples, with a separate chapter on regression models, including linear regression and analysis of variance. They also discuss the optimality of estimators and tests, as well as the selection of the best-fitting model. Each chapter ends with a case study in which the described statistical methods are applied. This book assumes a basic knowledge of probability theory, calculus, and linear algebra. Several supplementary annexes to the book are available online.

The Logic of Decision
Richard C. Jeffrey
University of Chicago Press, 1990
"[This book] proposes new foundations for the Bayesian principle of rational action, and goes on to develop a new logic of desirability and probabtility."—Frederic Schick, Journal of Philosophy

Modern Factor Analysis
Harry H. Harman
University of Chicago Press, 1976
This thoroughly revised third edition of Harry H. Harman's authoritative text incorporates the many new advances made in computer science and technology over the last ten years. The author gives full coverage to both theoretical and applied aspects of factor analysis from its foundations through the most advanced techniques. This highly readable text will be welcomed by researchers and students working in psychology, statistics, economics, and related disciplines.

Modern Sampling Methods
Theory, Experimentation, Application
Palmer Johnson
University of Minnesota Press, 1959
Modern Sampling Methods: Theory, Experimentation, Application was first published in 1959. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; these books are published unaltered from the original University of Minnesota Press editions.

Of both theoretical and practical use to statisticians and research workers using sampling techniques, this book describes five new multi-stage sampling models. The models are described, compared, and evaluated through a skillfully designed experiment. The number of stages in all five models is the same; the manner in which they differ is in the particular sampling technique applied at each of the several stages. Recommendations are given on the choice of the most suitable model for a given practical situation. A mathematical appendix presents two lemmas that are useful for derivation of sampling formulas in multi-stage sampling.

The Nature of Scientific Evidence
Statistical, Philosophical, and Empirical Considerations
Edited by Mark L. Taper and Subhash R. Lele
University of Chicago Press, 2004
An exploration of the statistical foundations of scientific inference, The Nature of Scientific Evidence asks what constitutes scientific evidence and whether scientific evidence can be quantified statistically. Mark Taper, Subhash Lele, and an esteemed group of contributors explore the relationships among hypotheses, models, data, and inference on which scientific progress rests in an attempt to develop a new quantitative framework for evidence. Informed by interdisciplinary discussions among scientists, philosophers, and statisticians, they propose a new "evidential" approach, which may be more in keeping with the scientific method. The Nature of Scientific Evidence persuasively argues that all scientists should care more about the fine points of statistical philosophy because therein lies the connection between theory and data.

Though the book uses ecology as an exemplary science, the interdisciplinary evaluation of the use of statistics in empirical research will be of interest to any reader engaged in the quantification and evaluation of data.

Observation and Experiment
An Introduction to Causal Inference
Paul R. Rosenbaum
Harvard University Press, 2017

A daily glass of wine prolongs life—yet alcohol can cause life-threatening cancer. Some say raising the minimum wage will decrease inequality while others say it increases unemployment. Scientists once confidently claimed that hormone replacement therapy reduced the risk of heart disease but now they equally confidently claim it raises that risk. What should we make of this endless barrage of conflicting claims?

Observation and Experiment is an introduction to causal inference by one of the field’s leading scholars. An award-winning professor at Wharton, Paul Rosenbaum explains key concepts and methods through lively examples that make abstract principles accessible. He draws his examples from clinical medicine, economics, public health, epidemiology, clinical psychology, and psychiatry to explain how randomized controlled trials are conceived and designed, how they differ from observational studies, and what techniques are available to mitigate the biases of observational studies.

“Carefully and precisely written…reflecting superb statistical understanding, all communicated with the skill of a master teacher.”
—Stephen M. Stigler, author of The Seven Pillars of Statistical Wisdom

“An excellent introduction…Well-written and thoughtful…from one of causal inference’s noted experts.”
Journal of the American Statistical Association

“Rosenbaum is a gifted expositor…an outstanding introduction to the topic for anyone who is interested in understanding the basic ideas and approaches to causal inference.”
Psychometrika

“A very valuable contribution…Highly recommended.”
International Statistical Review


Prediction and Regulation by Linear Least-Square Methods
Peter Whittle
University of Minnesota Press, 1963
Prediction and Regulation by Linear Least-Square Methods was first published in 1963; this revised second edition was issued in 1983. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; these books are published unaltered from the original University of Minnesota Press editions.

During the past two decades, statistical theories of prediction and control have assumed an increasing importance in all fields of scientific research. To understand a phenomenon is to be able to predict it and to influence it in predictable ways. First published in 1963 and long out of print, Prediction and Regulation by Linear Least-Square Methods offers important tools for constructing models of dynamic phenomena. This elegantly written book has been a basic reference for researchers in many applied sciences who seek practical information about the representation and manipulation of stationary stochastic processes. Peter Whittle’s text has a devoted group of readers and users, especially among economists. This edition contains the unchanged text of the original and adds new works by the author and a foreword by economist Thomas J. Sargent.

A Primer of Probability Logic
Ernest W. Adams
CSLI, 1996

This book is meant to be a primer, that is, an introduction, to probability logic, a subject that appears to be in its infancy. Probability logic is a subject envisioned by Hans Reichenbach and largely created by Adams. It treats conditionals as bearers of conditional probabilities and discusses an appropriate sense of validity for arguments that have such conditionals, as well as ordinary statements, as premisses.

This is a clear, well-written text on the subject of probability logic, suitable for advanced undergraduates or graduates, but also of interest to professional philosophers. There are well-thought-out exercises, and a number of advanced topics are treated in appendices, while others are brought up in exercises or alluded to only in footnotes. By this means, it is hoped that the reader will at least be made aware of most of the important ramifications of the subject and its tie-ins with current research, and will have some indication of the recent and relevant literature.


Probably Overthinking It
How to Use Data to Answer Questions, Avoid Statistical Traps, and Make Better Decisions
Allen B. Downey
University of Chicago Press, 2023
An essential guide to the ways data can improve decision making.
 
Statistics are everywhere: in news reports, at the doctor’s office, and in every sort of forecast, from the stock market to the weather. Blogger, teacher, and computer scientist Allen B. Downey knows well that people have an innate ability both to understand statistics and to be fooled by them. As he makes clear in this accessible introduction to statistical thinking, the stakes are big. Simple misunderstandings have led to incorrect medical prognoses, underestimated the likelihood of large earthquakes, hindered social justice efforts, and resulted in dubious policy decisions. There are right and wrong ways to look at numbers, and Downey will help you see which are which.
 
Probably Overthinking It uses real data to delve into real examples with real consequences, drawing on cases from health campaigns, political movements, chess rankings, and more. Downey lays out common pitfalls—like the base rate fallacy, length-biased sampling, and Simpson’s paradox—and shines a light on what we learn when we interpret data correctly, and what goes wrong when we don’t. Using data visualizations instead of equations, he builds understanding from the basics to help you recognize errors, whether in your own thinking or in media reports. Even if you have never studied statistics—or if you have but have forgotten everything you learned—this book will offer new insight into the methods and measurements that help us understand the world.

Proximity and Preference
Problems in the Multidimensional Analysis of Large Data Sets
Reginald Golledge
University of Minnesota Press, 1982
Proximity and Preference was first published in 1982.

How does one design experiments for collecting large volumes of data such as those needed for marketing surveys, studies of travel patterns, and public opinion polls? This is a common problem for social and behavioral scientists. The papers in this collection address the problems of working with large data sets primarily from the perspectives of geography and psychology, two fields that share a common quantitative research methodology.

After an introductory paper on substantive and methodological aspects of the interface between geography and psychology, the book is divided into three sections: experimental design and measurement problems; preference functions and choice behavior; and special problems of analyzing large data sets with multidimensional methods. Each paper is directed toward some fundamental problem, such as those relating to experimental design, data reliability, and the selection of analytical methods which are appropriate for data sets of various sizes, completeness, and reliability.

Randomness
Deborah J. Bennett
Harvard University Press, 1998

From the ancients’ first readings of the innards of birds to your neighbor’s last bout with the state lottery, humankind has put itself into the hands of chance. Today life itself may be at stake when probability comes into play—in the chance of a false negative in a medical test, in the reliability of DNA findings as legal evidence, or in the likelihood of passing on a deadly congenital disease—yet as few people as ever understand the odds. This book takes aim at the trouble we have in trying to learn about probability. A story of the misconceptions and difficulties civilization overcame in progressing toward probabilistic thinking, Randomness is also a skillful account of what makes the science of probability so daunting in our own day.

To acquire a (correct) intuition of chance is not easy to begin with, and moving from an intuitive sense to a formal notion of probability presents further problems. Author Deborah Bennett traces the path this process takes in an individual trying to come to grips with concepts of uncertainty and fairness, and also charts the parallel path by which societies have developed ideas about chance. Why, from ancient to modern times, have people resorted to chance in making decisions? Is a decision made by random choice “fair”? What role has gambling played in our understanding of chance? Why do some individuals and societies refuse to accept randomness at all? If understanding randomness is so important to probabilistic thinking, why do the experts disagree about what it really is? And why are our intuitions about chance almost always dead wrong?

Anyone who has puzzled over a probability conundrum is struck by the paradoxes and counterintuitive results that occur at a relatively simple level. Why this should be, and how it has been the case through the ages, for bumblers and brilliant mathematicians alike, is the entertaining and enlightening lesson of Randomness.


Ratio Correlation
A Manual for Students of Petrology and Geochemistry
Felix Chayes
University of Chicago Press, 1971

Risk Quantification and Allocation Methods for Practitioners
Jaume Belles-Sampera, Montserrat Guillén, and Miguel Santolino
Amsterdam University Press, 2017
Risk Quantification and Allocation Methods for Practitioners offers a practical approach to risk management in the financial industry. This in-depth study provides quantitative tools to better describe qualitative issues, as well as clear explanations of how to transform recent theoretical developments into computational practice, and key tools for dealing with the issues of risk measurement and capital allocation.

The Seven Pillars of Statistical Wisdom
Stephen M. Stigler
Harvard University Press, 2016

What gives statistics its unity as a science? Stephen Stigler sets forth the seven foundational ideas of statistics—a scientific discipline related to but distinct from mathematics and computer science.

Even the most basic idea—aggregation, exemplified by averaging—is counterintuitive. It allows one to gain information by discarding information, namely, the individuality of the observations. Stigler’s second pillar, information measurement, challenges the importance of “big data” by noting that observations are not all equally important: the amount of information in a data set is often proportional to only the square root of the number of observations, not the absolute number. The third idea is likelihood, the calibration of inferences with the use of probability. Intercomparison is the principle that statistical comparisons do not need to be made with respect to an external standard. The fifth pillar is regression, both a paradox (tall parents on average produce shorter children; tall children on average have shorter parents) and the basis of inference, including Bayesian inference and causal reasoning. The sixth concept captures the importance of experimental design—for example, by recognizing the gains to be had from a combinatorial approach with rigorous randomization. The seventh idea is the residual: the notion that a complicated phenomenon can be simplified by subtracting the effect of known causes, leaving a residual phenomenon that can be explained more easily.
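Stigler’s square-root point can be illustrated with a small simulation (ours, not the book’s): the spread of a sample mean shrinks in proportion to the square root of the sample size, so quadrupling the number of observations only halves the uncertainty. The sketch below assumes normally distributed observations, and the distribution and parameter choices are arbitrary.

```python
import random
import statistics

def sample_mean(n, mu=0.0, sigma=1.0):
    """Mean of n independent draws from a normal distribution."""
    return statistics.fmean(random.gauss(mu, sigma) for _ in range(n))

def spread_of_means(n, reps=2000):
    """Empirical standard deviation of the sample mean over many repeated samples."""
    return statistics.stdev(sample_mean(n) for _ in range(reps))

for n in (100, 400, 1600):
    # Theory says sigma / sqrt(n): about 0.100, 0.050, 0.025 here.
    print(n, round(spread_of_means(n), 3))
```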

The Seven Pillars of Statistical Wisdom presents an original, unified account of statistical science that will fascinate the interested layperson and engage the professional statistician.


Statistics for Public Policy
A Practical Guide to Being Mostly Right (or at Least Respectably Wrong)
Jeremy G. Weber
University of Chicago Press, 2024

A long-overdue guide on how to use statistics to bring clarity, not confusion, to policy work.

Statistics are an essential tool for making, evaluating, and improving public policy. Statistics for Public Policy is a crash course in wielding these unruly tools to bring maximum clarity to policy work. Former White House economist Jeremy G. Weber offers an accessible voice of experience for the challenges of this work, focusing on seven core practices: 

  • Thinking big-picture about the role of data in decisions
  • Critically engaging with data by focusing on its origins, purpose, and generalizability
  • Understanding the strengths and limits of the simple statistics that dominate most policy discussions
  • Developing reasons for considering a number to be practically small or large  
  • Distinguishing correlation from causation and minor causes from major causes
  • Communicating statistics so that they are seen, understood, and believed
  • Maintaining credibility by being right (or at least respectably wrong) in every setting
Statistics for Public Policy dispenses with the opacity and technical language that have long made this space impenetrable; instead, Weber offers an essential resource for all students and professionals working at the intersections of data and policy interventions. This book is all signal, no noise.

Statistics on the Table
The History of Statistical Concepts and Methods
Stephen M. Stigler
Harvard University Press, 2002
This lively collection of essays examines in witty detail the history of some of the concepts involved in bringing statistical argument "to the table," and some of the pitfalls that have been encountered. The topics range from seventeenth-century medicine and the circulation of blood, to the cause of the Great Depression and the effect of the California gold discoveries of 1848 upon price levels, to the determinations of the shape of the Earth and the speed of light, to the meter of Virgil's poetry and the prediction of the Second Coming of Christ. The title essay tells how the statistician Karl Pearson came to issue the challenge to put "statistics on the table" to the economists Marshall, Keynes, and Pigou in 1911. The 1911 dispute involved the effect of parental alcoholism upon children, but the challenge is general and timeless: important arguments require evidence, and quantitative evidence requires statistical evaluation. Some essays examine deep and subtle statistical ideas such as the aggregation and regression paradoxes; others tell of the origin of the Average Man and the evaluation of fingerprints as a forerunner of the use of DNA in forensic science. Several of the essays are entirely nontechnical; all examine statistical ideas with an ironic eye for their essence and what their history can tell us about current disputes.

Thinking Through Statistics
John Levi Martin
University of Chicago Press, 2018
Simply put, Thinking Through Statistics is a primer on how to maintain rigorous data standards in social science work, and one that makes a strong case for revising the way that we try to use statistics to support our theories. But don’t let that daunt you. With clever examples and witty takeaways, John Levi Martin proves himself to be a most affable tour guide through these scholarly waters.

Martin argues that the task of social statistics isn't to estimate parameters, but to reject false theory. He illustrates common pitfalls that can keep researchers from doing just that using a combination of visualizations, re-analyses, and simulations. Thinking Through Statistics gives social science practitioners accessible insight into troves of wisdom that would normally have to be earned through arduous trial and error, and it does so with a lighthearted approach that ensures this field guide is anything but stodgy.
 

Torsion-Free Modules
Eben Matlis
University of Chicago Press, 1973
The subject of torsion-free modules over an arbitrary integral domain arises naturally as a generalization of torsion-free abelian groups. In this volume, Eben Matlis brings together his research on torsion-free modules that has appeared in a number of mathematical journals. Professor Matlis has reworked many of the proofs so that only an elementary knowledge of homological algebra and commutative ring theory is necessary for an understanding of the theory.

The first eight chapters of the book are a general introduction to the theory of torsion-free modules. This part of the book is suitable for a self-contained basic course on the subject. More specialized problems of finding all integrally closed D-rings are examined in the last seven chapters, where material covered in the first eight chapters is applied.

An integral domain is said to be a D-ring if every torsion-free module of finite rank decomposes into a direct sum of modules of rank 1. After much investigation, Professor Matlis found that an integrally closed domain is a D-ring if, and only if, it is the intersection of at most two maximal valuation rings.

The Total Survey Error Approach
A Guide to the New Science of Survey Research
Herbert F. Weisberg
University of Chicago Press, 2005
In 1939, George Gallup's American Institute of Public Opinion published a pamphlet optimistically titled The New Science of Public Opinion Measurement. At the time, though, survey research was in its infancy, and only now, six decades later, can public opinion measurement be appropriately called a science, based in part on the development of the total survey error approach.

Herbert F. Weisberg's handbook presents a unified method for conducting good survey research centered on the various types of errors that can occur in surveys—from measurement and nonresponse error to coverage and sampling error. Each chapter is built on theoretical elements drawn from specific disciplines, such as social psychology and statistics, and follows through with detailed treatments of the specific types of error and their potential solutions. Throughout, Weisberg is attentive to survey constraints, including time and ethical considerations, as well as controversies within the field and the effects of new technology on the survey process—from Internet surveys to those completed by phone, by mail, and in person. Practitioners and students will find this comprehensive guide particularly useful now that survey research has assumed a primary place in both public and academic circles.

Unifying Political Methodology
The Likelihood Theory of Statistical Inference
Gary King
University of Michigan Press, 1998
One of the hallmarks of the development of political science as a discipline has been the creation of new methodologies by scholars within the discipline--methodologies that are well-suited to the analysis of political data. Gary King has been a leader in the development of these new approaches to the analysis of political data. In his book, Unifying Political Methodology, King shows how the likelihood theory of inference offers a unified approach to statistical modeling for political research and thus enables us to better analyze the enormous amount of data political scientists have collected over the years. Newly reissued, this book is a landmark in the development of political methodology and continues to challenge scholars and spark controversy.
"Gary King's Unifying Political Methodology is at once an introduction to the likelihood theory of statistical inference and an evangelist's call for us to change our ways of doing political methodology. One need not accept the altar call to benefit enormously from the book, but the intellectual debate over the call for reformation is likely to be the enduring contribution of the work."
--Charles Franklin, American Political Science Review
"King's book is one of the only existing books which deal with political methodology in a clear and consistent framework. The material in it is now and will continue to be essential reading for all serious students and researchers in political methodology." --R. Michael Alvarez, California Institute of Tech-nology
Gary King is Professor of Government, Harvard University. One of the leading thinkers in political methodology, he is the author of A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data and other books and articles.

