From random security checks at airports to the use of risk assessment in sentencing, actuarial methods are being used more than ever to determine whom law enforcement officials target and punish. And with the exception of racial profiling on our highways and streets, most people favor these methods because they believe they’re a more cost-effective way to fight crime.
In Against Prediction, Bernard E. Harcourt challenges this growing reliance on actuarial methods. These prediction tools, he demonstrates, may in fact increase the overall amount of crime in society, depending on the relative responsiveness of the profiled populations to heightened security. They may also aggravate the difficulties that minorities already have obtaining work, education, and a better quality of life—thus perpetuating the pattern of criminal behavior. Ultimately, Harcourt shows how the perceived success of actuarial methods has begun to distort our very conception of just punishment and to obscure alternate visions of social order. In place of the actuarial, he proposes a turn to randomization in punishment and policing. The presumption, Harcourt concludes, should be against prediction.
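Harcourt's responsiveness argument can be made concrete with a toy calculation. In the sketch below, all numbers and the linear response function are hypothetical illustrations, not figures from the book: if policing shifts toward a group that responds *less* to enforcement, total offending can rise.

```python
# Toy model (hypothetical numbers): offending falls linearly with
# policing intensity, at a group-specific responsiveness (elasticity).
def offending(base_rate, elasticity, policing):
    """Offending rate under a given policing intensity."""
    return base_rate * (1 - elasticity * policing)

# Group A is profiled (heavily policed) but is the LESS responsive group;
# group B is more responsive but sees reduced policing.
equal = offending(100, 0.2, 0.5) + offending(100, 0.6, 0.5)      # even policing
profiled = offending(100, 0.2, 0.8) + offending(100, 0.6, 0.2)   # profiling
print(equal, profiled)  # 160.0 172.0 -- profiling yields MORE total crime
```

The point is not the particular numbers but the comparative statics: whether profiling reduces total crime depends entirely on the relative elasticities of the two groups.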
“McCloskey and Ziliak have been pushing this very elementary, very correct, very important argument through several articles over several years and for reasons I cannot fathom it is still resisted. If it takes a book to get it across, I hope this book will do it. It ought to.”
—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics
“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”
—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health
The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.
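The book's central complaint, that a "significant" result can be scientifically trivial, is easy to illustrate numerically. The sketch below is not from the book; it uses a standard two-sample z statistic with invented numbers to show that a negligible effect becomes "significant" once the sample is large enough.

```python
import math

# Toy two-sample z statistic: difference in means divided by its
# standard error, assuming equal group sizes and a common SD of 1.
def z_statistic(mean_diff, sd, n_per_group):
    se = sd * math.sqrt(2.0 / n_per_group)
    return mean_diff / se

tiny_effect = 0.01  # 1% of a standard deviation: negligible in magnitude
z_small_n = z_statistic(tiny_effect, sd=1.0, n_per_group=100)
z_huge_n  = z_statistic(tiny_effect, sd=1.0, n_per_group=1_000_000)
print(round(z_small_n, 3), round(z_huge_n, 3))  # 0.071 7.071
```

The effect size never changes, yet the verdict of the significance test flips from "nothing here" to "highly significant" purely as a function of sample size, which is exactly why Ziliak and McCloskey insist on asking about magnitudes.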
Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).
Essential Demographic Methods brings to readers the full range of ideas and skills of demographic analysis that lie at the core of social sciences and public health. Classroom tested over many years, filled with fresh data and examples, this approachable text is tailored to the needs of beginners, advanced students, and researchers alike. An award-winning teacher and eminent demographer, Kenneth Wachter uses themes from the individual lifecourse, history, and global change to convey the meaning of concepts such as exponential growth, cohorts and periods, lifetables, population projection, proportional hazards, parity, marity, migration flows, and stable populations. The presentation is carefully paced and accessible to readers with knowledge of high-school algebra. Each chapter contains original problem sets and worked examples.
From the most basic concepts and measures to developments in spatial demography and hazard modeling at the research frontier, Essential Demographic Methods brings out the wider appeal of demography in its connections across the sciences and humanities. It is a lively, compact guide for understanding quantitative population analysis in the social and biological world.
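One of the book's foundational concepts, exponential growth, can be sketched with the standard identity P(t) = P(0)·e^(rt). The population figures below are hypothetical, not drawn from the text:

```python
import math

# Exponential-growth identity: P(t) = P(0) * exp(r * t),
# so the average annual growth rate is r = ln(P(t)/P(0)) / t.
def growth_rate(p0, pt, years):
    return math.log(pt / p0) / years

def doubling_time(r):
    """Years for a population to double at constant rate r."""
    return math.log(2) / r

# Hypothetical population growing from 1.0M to 1.2M over a decade.
r = growth_rate(1_000_000, 1_200_000, 10)
print(round(r, 4), round(doubling_time(r), 1))  # 0.0182 38.0
```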
This book provides a language and a set of tools for finding bounds on the predictions that social and behavioral scientists can logically make from nonexperimental and experimental data. The economist Charles Manski draws on examples from criminology, demography, epidemiology, social psychology, and sociology as well as economics to illustrate this language and to demonstrate the broad usefulness of the tools.
There are many traditional ways to present identification problems in econometrics, sociology, and psychometrics. Some of these are primarily statistical in nature, using concepts such as flat likelihood functions and nondistinct parameter estimates. Manski's strategy is to divorce identification from purely statistical concepts and to present the logic of identification analysis in ways that are accessible to a wide audience in the social and behavioral sciences. In each case, problems are motivated by real examples with real policy importance, the mathematics is kept to a minimum, and the deductions on identifiability are derived in ways that yield fresh insights.
Manski begins with the conceptual problem of extrapolating predictions from one population to some new population or to the future. He then analyzes in depth the fundamental selection problem that arises whenever a scientist tries to predict the effects of treatments on outcomes. He carefully specifies assumptions and develops his nonparametric methods of bounding predictions. Manski shows how these tools should be used to investigate common problems such as predicting the effect of family structure on children's outcomes and the effect of policing on crime rates.
Successive chapters deal with topics ranging from the use of experiments to evaluate social programs, to the use of case-control sampling by epidemiologists studying the association of risk factors and disease, to the use of intentions data by demographers seeking to predict future fertility. The book closes by examining two central identification problems in the analysis of social interactions: the classical simultaneity problem of econometrics and the reflection problem faced in analyses of neighborhood and contextual effects.
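The flavor of Manski's bounding approach can be conveyed with the simplest case: a worst-case bound on a population mean when some outcomes are unobserved. The code below is an illustrative sketch with invented data, not the book's notation; it only assumes outcomes are known to lie in a bounded interval.

```python
# Worst-case (Manski-style) bounds on a mean with missing outcomes:
# the true mean must lie between the scenarios where every missing
# outcome takes its lowest or its highest possible value.
def mean_bounds(observed, n_missing, y_min, y_max):
    n = len(observed) + n_missing
    s = sum(observed)
    lower = (s + n_missing * y_min) / n
    upper = (s + n_missing * y_max) / n
    return lower, upper

# Hypothetical data: binary outcome observed for 80 of 100 units,
# with 48 successes among the observed.
lo, hi = mean_bounds(observed=[1] * 48 + [0] * 32, n_missing=20, y_min=0, y_max=1)
print(lo, hi)  # 0.48 0.68
```

No assumption about *why* the data are missing is needed; the price of that honesty is an interval rather than a point, which is precisely the trade-off the book explores.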
Measured Language: Quantitative Studies of Acquisition, Assessment, and Variation focuses on ways in which various aspects of language can be quantified and how measurement informs and advances our understanding of language. The metaphors and operationalizations of quantification serve as an important lingua franca for seemingly disparate areas of linguistic research, allowing methods and constructs to be translated from one area of linguistic investigation to another.
Measured Language includes forms of measurement and quantitative analysis current in diverse areas of linguistic research from language assessment to language change, from generative linguistics to experimental psycholinguistics, and from longitudinal studies to classroom research. Contributors demonstrate how to operationalize a construct, develop a reliable way to measure it, and finally validate that measurement—and share the relevance of their perspectives and findings to other areas of linguistic inquiry. The range and clarity of the research collected here ensures that even linguists who would not traditionally use quantitative methods will find this volume useful.
Social scientists study complex phenomena about which they often propose intricate hypotheses tested with linear-interactive or multiplicative terms. While interaction terms are hardly new to social science research, researchers have yet to develop a common methodology for using and interpreting them. Modeling and Interpreting Interactive Hypotheses in Regression Analysis provides step-by-step guidance on how to connect substantive theories to statistical models and how to interpret and present the results.
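The core interpretive point about multiplicative terms can be shown in a few lines. In a model y = b0 + b1·x + b2·z + b3·(x·z), the marginal effect of x is not b1 alone but b1 + b3·z, so it must be evaluated at substantively meaningful values of z. The coefficients below are hypothetical, chosen only for illustration:

```python
# Hypothetical fitted coefficients from y = b0 + b1*x + b2*z + b3*(x*z).
b1, b3 = 2.0, -0.5

def marginal_effect_of_x(z):
    """Effect of a one-unit change in x, conditional on z."""
    return b1 + b3 * z

for z in (0, 2, 6):
    print(z, marginal_effect_of_x(z))
# The effect of x shrinks with z (2.0, 1.0) and reverses sign (-1.0):
# reporting b1 alone would describe only the z = 0 case.
```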
"Kam and Franzese is a must-have for all empirical social scientists interested in teasing out the complexities of their data."
---Janet M. Box-Steffensmeier, Ohio State University
"Kam and Franzese have written what will become the definitive source on dealing with interaction terms and testing interactive hypotheses. It will serve as the standard reference for political scientists and will be one of those books that everyone will turn to when helping our students or doing our work. But more than that, this book is the best text I have seen for getting students to really think about the importance of careful specification and testing of their hypotheses."
---David A. M. Peterson, Texas A&M University
"Kam and Franzese have given scholars and teachers of regression models something they've needed for years: a clear, concise guide to understanding multiplicative interactions. Motivated by real substantive examples and packed with valuable examples and graphs, their book belongs on the shelf of every working social scientist."
---Christopher Zorn, University of South Carolina
"Kam and Franzese make it easy to model what good researchers have known for a long time: many important and interesting causal effects depend on the presence of other conditions. Their book shows how to explore interactive hypotheses in your own research and how to present your results. The book is straightforward yet technically sophisticated. There are no more excuses for misunderstanding, misrepresenting, or simply missing out on interaction effects!"
---Andrew Gould, University of Notre Dame
Cindy D. Kam is Assistant Professor, Department of Political Science, University of California, Davis.
Robert J. Franzese Jr. is Associate Professor, Department of Political Science, University of Michigan, and Research Associate Professor, Center for Political Studies, Institute for Social Research, University of Michigan.
For datasets, syntax, and worksheets to help readers work through the examples covered in the book, visit: www.press.umich.edu/KamFranzese/Interactions.html
From two leading experts, a revolutionary new way to think about and measure aging.
Aging is a complex phenomenon. We usually think of chronological age as a benchmark, but it is actually a backward way of defining lifespan. It tells us how long we’ve lived so far, but what about the rest of our lives?
In this pathbreaking book, Warren C. Sanderson and Sergei Scherbov provide a new way to measure individual and population aging. Instead of counting how many years we’ve lived, we should think about the number of years we have left, our “prospective age.” Two people who share the same chronological age probably have different prospective ages, because one will outlive the other. Combining their forward-thinking measure of our remaining years with other health metrics, Sanderson and Scherbov show how we can generate better demographic estimates, which inform better policies. Measuring prospective age helps make sense of observed patterns of survival, reorients understanding of health in old age, and clarifies the burden of old-age dependency. The metric also brings valuable data to debates over equitable intergenerational pensions.
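The prospective-age idea can be sketched as a lookup: a person's prospective age is the chronological age in a reference population whose remaining life expectancy matches theirs. The life-table values below are invented for illustration, not Sanderson and Scherbov's estimates.

```python
# Hypothetical reference life table: chronological age -> remaining years.
reference = {50: 32.0, 55: 27.5, 60: 23.0, 65: 18.5, 70: 14.5}

def prospective_age(years_left, table):
    """Reference-population age whose remaining life expectancy is
    closest to the given number of expected years left."""
    return min(table, key=lambda age: abs(table[age] - years_left))

# A 70-year-old who can expect 18.5 more years has the prospective age
# of a 65-year-old in the reference population.
print(prospective_age(18.5, reference))  # 65
```

Two people with the same birthday can thus have different prospective ages, and as life expectancy rises over time, a fixed chronological age corresponds to an ever-younger prospective age.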
Sanderson and Scherbov’s pioneering model has already been adopted by the United Nations. Prospective Longevity offers us all an opportunity to rethink aging, so that we can make the right choices for our societal and economic health.
A comprehensive and accessible guide to learning and successfully applying QCA
Social phenomena can rarely be attributed to single causes—instead, they typically stem from a myriad of interwoven factors that are often difficult to untangle. Drawing on set theory and the language of necessary and sufficient conditions, qualitative comparative analysis (QCA) is ideally suited to capturing this causal complexity. A case-based research method, QCA regards cases as combinations of conditions and compares the conditions of each case in a structured way to identify the necessary and sufficient conditions for an outcome.
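The set-theoretic logic behind QCA can be sketched in its simplest, crisp-set form. The case data below are invented; the two ratios are the standard consistency measures for sufficiency and necessity.

```python
# Crisp-set sketch of QCA's core logic: condition X is sufficient for
# outcome Y to the extent that cases with X also show Y, and necessary
# to the extent that cases with Y also show X. Cases are hypothetical.
cases_with_X = {"c1", "c2", "c3", "c4"}
cases_with_Y = {"c1", "c2", "c3", "c5"}

overlap = cases_with_X & cases_with_Y
sufficiency = len(overlap) / len(cases_with_X)  # consistency of X -> Y
necessity   = len(overlap) / len(cases_with_Y)  # consistency of Y -> X
print(sufficiency, necessity)  # 0.75 0.75
```

Real QCA goes well beyond this, combining multiple conditions into configurations and minimizing them via truth tables, but the subset relations above are the building blocks.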
Qualitative Comparative Analysis: An Introduction to Research Design and Application is a comprehensive guide to QCA. As QCA becomes increasingly popular across the social sciences, this textbook teaches students, scholars, and self-learners the fundamentals of the method, research design, interpretation of results, and how to communicate findings.
Following an ideal-typical research cycle, the book’s ten chapters cover the methodological basis and analytical routine of QCA, as well as matters of research design, causation and causal complexity, QCA variants, and the method’s reception in the social sciences. A comprehensive glossary helps to clarify the meaning of frequently used terms. The book is complemented by an accessible online R manual that helps new users practice QCA’s analytical steps on sample data and then apply them to their own research. This hands-on textbook is an essential resource for students and researchers looking for a complete and up-to-date introduction to QCA.
A long-overdue guide on how to use statistics to bring clarity, not confusion, to policy work.
Statistics are an essential tool for making, evaluating, and improving public policy. Statistics for Public Policy is a crash course in wielding these unruly tools to bring maximum clarity to policy work. Former White House economist Jeremy G. Weber offers an accessible voice of experience for the challenges of this work, focusing on seven core practices.
A hard-hitting investigation of the racist uses of statistics—now in paperback!
Tukufu Zuberi offers a concise account of the historical connections between the development of the idea of race and the birth of social statistics. Zuberi describes how race-differentiated data are misinterpreted in the social sciences and asks searching questions about the ways racial statistics are used. He argues that statistical analysis can and must be deracialized, and that this deracialization is essential to the goal of achieving social justice for all.
Place has become both a major field of criminological study and an important area for policy development. Capturing state-of-the-art crime and place research methods and analysis, Understanding Crime and Place is a comprehensive Handbook focused on the specific skills researchers need.
The editors and contributors are scholars who have been fundamental in introducing or developing a particular method for crime and place research. Understanding Crime and Place is organized around the scientific process, introducing major crime and place theories and concepts, discussions of data and data collection, core spatial data concepts, as well as statistical and computational techniques for analyzing spatial data and place-based evaluation. The lessons in the book are supplemented by additional instructions, examples, problems, and datasets available for download.
Conducting place-based research is an emerging field that requires a wide range of cutting-edge methods and analysis techniques that are only beginning to be widely taught in criminology. Understanding Crime and Place bridges that gap, formalizes the discipline, and promotes an even greater use of place-based research.
Contributors: Martin A. Andresen, Matthew P J Ashby, Eric Beauregard, Wim Bernasco, Daniel Birks, Hervé Borrion, Kate Bowers, Anthony A. Braga, Tom Brenneman, David Buil-Gil, Meagan Cahill, Stefano Caneppele, Julien Chopin, Jeffrey E. Clutter, Toby Davies, Hashem Dehghanniri, Jillian Shafer Desmond, Beidi Dong, John E. Eck, Miriam Esteve, Timothy C. Hart, Georgia Hassall, David N. Hatten, Julie Hibdon, James Hunter, Shane D. Johnson, Samuel Langton, YongJei Lee, Ned Levine, Brian Lockwood, Dominique Lord, Nick Malleson, Dennis Mares, David Mazeika, Lorraine Mazerolle, Asier Moneva, Andrew Newton, Bradley J. O’Guinn, Ajima Olaghere, Graham C. Ousey, Ken Pease, Eric L. Piza, Jerry Ratcliffe, Caterina G. Roman, Stijn Ruiter, Reka Solymosi, Evan T. Sorg, Wouter Steenbeek, Hannah Steinman, Ralph B. Taylor, Marie Skubak Tillyer, Lisa Tompson, Brandon Turchan, David Weisburd, Brandon C. Welsh, Clair White, Douglas J. Wiebe, Pamela Wilcox, David B. Wilson, Alese Wooditch, Kathryn Wuschke, Sue-Ming Yang, and the editors.
BiblioVault ® 2001 - 2024
The University of Chicago Press