front cover of Science and an African Logic
Science and an African Logic
Helen Verran
University of Chicago Press, 2001
Does 2 + 2 = 4? Ask almost anyone and they will unequivocally answer yes. A basic equation such as this seems the very definition of certainty, but is it?

In this captivating book, Helen Verran addresses precisely that question by looking at how science, mathematics, and logic come to life in Yoruba primary schools. Drawing on her experience as a teacher in Nigeria, Verran describes how she went from the radical conclusion that logic and math are culturally relative, to determining what Westerners find so disconcerting about Yoruba logic, to a new understanding of all generalizing logic. She reveals that in contrast to the one-to-many model found in Western number systems, Yoruba thinking operates by figuring things as wholes and their parts. Quantity is not absolute but always relational. Certainty is derived not from abstract logic, but from cultural practices and associations.

A powerful story of how one woman's investigation of this everyday situation led to extraordinary conclusions about the nature of numbers, generalization, and certainty, this book will be a signal contribution to philosophy, anthropology of science, and education.

front cover of Selected Papers on Analysis of Algorithms
Selected Papers on Analysis of Algorithms
Donald E. Knuth
CSLI, 2000
Analysis of Algorithms is the fourth in a series of collected works by world-renowned computer scientist Donald Knuth. This volume is devoted to an important subfield of computer science that Knuth founded in the 1960s and still considers his main life's work. This field, to which he gave the name Analysis of Algorithms, deals with quantitative studies of computer techniques, leading to methods for understanding and predicting the efficiency of computer programs. Analysis of Algorithms, which has grown to be a thriving international discipline, is the unifying theme underlying Knuth's well-known book The Art of Computer Programming. More than 30 of the fundamental papers that helped to shape this field are reprinted and updated in the present collection, together with historical material that has not previously been published. Although many ideas come and go in the rapidly changing world of computer science, the basic concepts and techniques of algorithmic analysis will remain important as long as computers are used.

front cover of Selected Papers on Fun and Games
Selected Papers on Fun and Games
Donald E. Knuth
CSLI, 2011

Donald E. Knuth’s influence in computer science ranges from the invention of methods for translating and defining programming languages to the creation of the TeX and METAFONT systems for desktop publishing. His award-winning textbooks have become classics that are often given credit for shaping the field, and his scientific papers are widely referenced and stand as milestones of development over a wide variety of topics. The present volume is the eighth in a series of his collected papers.


logo for University of Chicago Press
Selected Papers, Volume 6
The Mathematical Theory of Black Holes and of Colliding Plane Waves
S. Chandrasekhar
University of Chicago Press, 1991
This is the first of six volumes collecting significant papers of the distinguished astrophysicist and Nobel laureate S. Chandrasekhar. His work is notable for its breadth as well as for its brilliance; his practice has been to change his focus from time to time to pursue new areas of research. The result has been a prolific career full of discoveries and insights, some of which are only now being fully appreciated.

Chandrasekhar has selected papers that trace the development of his ideas and that present aspects of his work not fully covered in the books he has periodically published to summarize his research in each area.

front cover of Selected Topics on Polynomials
Selected Topics on Polynomials
Andrzej Schinzel
University of Michigan Press, 1982
Complete proofs of both new results and original work on polynomials and Diophantine equations are presented here for the first time in book form. Although the results are technical, they will be of interest to algebraists and those interested in algebraic number theory.

front cover of Self-Reference
Self-Reference
Thomas Bolander, Vincent F. Hendricks, and Stig Andur Pedersen
CSLI, 2006
An anthology of previously unpublished essays from some of the most outstanding scholars working in philosophy, mathematics, and computer science today, Self-Reference reexamines the latest theories of self-reference, including those that attempt to explain and resolve the semantic and set-theoretic paradoxes. With a thorough introduction that contextualizes the subject for students, this book will be important reading for anyone interested in the general area of self-reference and philosophy.

front cover of Semantic Properties of Diagrams and Their Cognitive Potentials
Semantic Properties of Diagrams and Their Cognitive Potentials
Atsushi Shimojima
CSLI, 2015
Why are diagrams sometimes so useful, facilitating our understanding and thinking, while at other times they can be unhelpful and even misleading? Drawing on a comprehensive survey of modern research in philosophy, logic, artificial intelligence, cognitive psychology, and graphic design, Semantic Properties of Diagrams and Their Cognitive Potentials reveals the systematic reasons for this dichotomy, showing that the cognitive functions of diagrams are rooted in the characteristic ways they carry information. In analyzing the logical mechanisms behind the relative efficacy of diagrammatic representation, Atsushi Shimojima provides deep insight into the crucial question: What makes a diagram a diagram?

front cover of The Seven Pillars of Statistical Wisdom
The Seven Pillars of Statistical Wisdom
Stephen M. Stigler
Harvard University Press, 2016

What gives statistics its unity as a science? Stephen Stigler sets forth the seven foundational ideas of statistics—a scientific discipline related to but distinct from mathematics and computer science.

Even the most basic idea—aggregation, exemplified by averaging—is counterintuitive. It allows one to gain information by discarding information, namely, the individuality of the observations. Stigler’s second pillar, information measurement, challenges the importance of “big data” by noting that observations are not all equally important: the amount of information in a data set is often proportional to only the square root of the number of observations, not the absolute number. The third idea is likelihood, the calibration of inferences with the use of probability. Intercomparison is the principle that statistical comparisons do not need to be made with respect to an external standard. The fifth pillar is regression, both a paradox (tall parents on average have children shorter than themselves; tall children on average have parents shorter than themselves) and the basis of inference, including Bayesian inference and causal reasoning. The sixth concept captures the importance of experimental design—for example, by recognizing the gains to be had from a combinatorial approach with rigorous randomization. The seventh idea is the residual: the notion that a complicated phenomenon can be simplified by subtracting the effect of known causes, leaving a residual phenomenon that can be explained more easily.
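
The information-measurement pillar is easy to see in a small simulation (a minimal sketch, not drawn from Stigler's book; the sample sizes, noise model, and trial count are arbitrary assumptions): quadrupling the number of observations only about halves the typical error of their average, so the information gained grows like the square root of the sample size.

    import random
    import statistics

    random.seed(0)

    def mean_abs_error(n, trials=2000):
        # Average absolute error of the sample mean of n observations drawn
        # from a standard normal distribution around a true value of 0.
        errors = []
        for _ in range(trials):
            sample = [random.gauss(0, 1) for _ in range(n)]
            errors.append(abs(statistics.mean(sample)))
        return statistics.mean(errors)

    # Quadrupling the sample size roughly halves the error, i.e. the
    # information in the data grows like the square root of n.
    for n in (25, 100, 400):
        print(n, round(mean_abs_error(n), 4))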

The Seven Pillars of Statistical Wisdom presents an original, unified account of statistical science that will fascinate the interested layperson and engage the professional statistician.


front cover of Several Complex Variables
Several Complex Variables
Raghavan Narasimhan
University of Chicago Press, 1971
Drawn from lectures given by Raghavan Narasimhan at the University of Geneva and the University of Chicago, this book presents the part of the theory of several complex variables pertaining to unramified domains over C. Topics discussed are Hartogs' theory, domains of holomorphy, and automorphisms of bounded domains.

front cover of The Shaping of American Liberalism
The Shaping of American Liberalism
The Debates over Ratification, Nullification, and Slavery
David F. Ericson
University of Chicago Press, 1993
In The Liberal Tradition in America (1955), Louis Hartz first put forth his thesis that the American political tradition derives essentially from consensual liberal principles. The many detractors of this thesis include Bernard Bailyn, who argued that preliberal, republican values initially held sway in eighteenth-century American politics. In The Shaping of American Liberalism, David Ericson offers an innovative reinterpretation of both positions by redefining the terms of the argument.

Focusing on three critical debates in American history—the debate between Anti-Federalists and Federalists over the ratification of the Constitution; the debate between the national republicans and the states-rights republicans over the nullification of the tariff; and the Lincoln-Douglas debates over slavery and pluralist democracy—Ericson shows that republicanism, rather than being opposed to liberalism, is in fact an offshoot of it. His descriptions of republicanism and pluralism represent the poles of an evolving tradition of liberal ideas in America: the former championing the claims of the public sphere, general welfare, and civic virtue; the latter protecting the rights of the individual to liberty, property, and privacy.

Republicanism and pluralism are therefore more properly understood as two sets of competing ideas that evolved from common roots. Ericson concludes that although republican themes persist in American politics, the profound transformations brought about by the Civil War made the ascendancy of pluralism virtually inevitable.

This highly original discussion of the relation between liberalism and republicanism—the central concern of much of the recent scholarship in American political thought—will be important reading for those interested in American politics, history, and culture.

front cover of Simplicial Objects in Algebraic Topology
Simplicial Objects in Algebraic Topology
J. P. May
University of Chicago Press, 1967
Simplicial sets are discrete analogs of topological spaces. They have played a central role in algebraic topology ever since their introduction in the late 1940s, and they also play an important role in other areas such as geometric topology and algebraic geometry. On a formal level, the homotopy theory of simplicial sets is equivalent to the homotopy theory of topological spaces. In view of this equivalence, one can apply discrete, algebraic techniques to perform basic topological constructions. These techniques are particularly appropriate in the theory of localization and completion of topological spaces, which was developed in the early 1970s.

Since it was first published in 1967, Simplicial Objects in Algebraic Topology has been the standard reference for the theory of simplicial sets and their relationship to the homotopy theory of topological spaces. J. Peter May gives a lucid account of the basic homotopy theory of simplicial sets, together with the equivalence of homotopy theories alluded to above. The central theme is the simplicial approach to the theory of fibrations and bundles, and especially the algebraization of fibration and bundle theory in terms of "twisted Cartesian products." The Serre spectral sequence is described in terms of this algebraization. Other topics treated in detail include Eilenberg-MacLane complexes, Postnikov systems, simplicial groups, classifying complexes, simplicial Abelian groups, and acyclic models.
 
"Simplicial Objects in Algebraic Topology presents much of the elementary material of algebraic topology from the semi-simplicial viewpoint. It should prove very valuable to anyone wishing to learn semi-simplicial topology. [May] has included detailed proofs, and he has succeeded very well in the task of organizing a large body of previously scattered material."—Mathematical Review


front cover of The Situation in Logic
The Situation in Logic
Jon Barwise
CSLI, 1989
Situation theory and situation semantics are recent approaches to language and information, approaches first formulated by Jon Barwise and John Perry in Situations and Attitudes (1983). The present volume collects some of Barwise's papers written since then, those directly concerned with relations between logic, situation theory, and situation semantics. Several papers appear here for the first time.

JON BARWISE is director of the Symbolic Systems Program and professor of philosophy at Stanford University and a researcher at CSLI.

front cover of Solomon Maimon
Solomon Maimon
Monism, Skepticism, and Mathematics
Meir Buzaglo
University of Pittsburgh Press, 2002
The philosophy of Solomon Maimon (1753–1800) is usually considered an important link between Kant’s transcendental philosophy and German idealism. Highly praised during his lifetime, over the past two centuries Maimon’s genius has been poorly understood and often ignored. Meir Buzaglo offers a reconstruction of Maimon’s philosophy, revealing that its true nature becomes apparent only when viewed in light of his philosophy of mathematics.

This provides the key to understanding Maimon’s solution to Kant’s quid juris question concerning the connection between intuition and concept in mathematics. Maimon’s original approach avoids dispensing with intuition (as in some versions of logicism and formalism) while reducing the reliance on intuition in its Kantian sense. As Buzaglo demonstrates, this led Maimon to question Kant’s ultimate rejection of the possibility of metaphysics and, simultaneously, to suggest a unique type of skepticism.

front cover of Sound Authorities
Sound Authorities
Scientific and Musical Knowledge in Nineteenth-Century Britain
Edward J. Gillin
University of Chicago Press, 2021

Sound Authorities shows how experiences of music and sound played a crucial role in nineteenth-century scientific inquiry in Britain.

In Sound Authorities, Edward J. Gillin focuses on hearing and aurality in Victorian Britain, claiming that the development of the natural sciences in this era cannot be understood without attending to the study of sound and music.

During this time, scientific practitioners attempted to fashion themselves as authorities on sonorous phenomena, coming into conflict with traditional musical elites as well as religious bodies. Gillin pays attention to sound in both musical and nonmusical contexts, specifically the cacophony of British industrialization. Sound Authorities begins with the place of acoustics in early nineteenth-century London, examining scientific exhibitions, lectures, spectacles, workshops, laboratories, and showrooms. He goes on to explore how mathematicians mobilized sound in their understanding of natural laws and their vision of a harmonious ordered universe. In closing, Gillin delves into the era’s religious and metaphysical debates over the place of music (and humanity) in nature, the relationship between music and the divine, and the tensions between spiritualist understandings of sound and scientific ones.


logo for Harvard University Press
A Source Book in Classical Analysis
Garrett Birkhoff
Harvard University Press, 1973

An understanding of the developments in classical analysis during the nineteenth century is vital to a full appreciation of the history of twentieth-century mathematical thought. It was during the nineteenth century that the diverse mathematical formulae of the eighteenth century were systematized and the properties of functions of real and complex variables clearly distinguished; and it was then that the calculus matured into the rigorous discipline of today, becoming in the process a dominant influence on mathematics and mathematical physics.

This Source Book, a sequel to D. J. Struik’s Source Book in Mathematics, 1200–1800, draws together more than eighty selections from the writings of the most influential mathematicians of the period. Thirteen chapters, each with an introduction by the editor, highlight the major developments in mathematical thinking over the century. All material is in English, and great care has been taken to maintain a high standard of accuracy both in translation and in transcription. Of particular value to historians and philosophers of science, the Source Book should serve as a vital reference to anyone seeking to understand the roots of twentieth-century mathematical thought.


logo for Harvard University Press
A Source Book in Mathematics, 1200-1800
D. J. Struik
Harvard University Press

The Source Book contains 75 excerpts from the writings of Western mathematicians from the thirteenth to the end of the eighteenth century. The selection has been confined to pure mathematics or to those fields of applied mathematics that had a direct bearing on the development of pure mathematics.

The authors range from Al-Khwarizmi (a Latin translation of whose work was much used in Europe), Viète, and Oresme, to Newton, Euler, and Lagrange. The selections are grouped in chapters on arithmetic, algebra, geometry, and analysis. All the excerpts are translated into English. Some of the translations have been newly made by Mr. and Mrs. Struik; if a translation was already available it has been used, but in every such case it has been checked against the original and amended or corrected where it seemed necessary. The editor has taken considerable pains to put each selection in context by means of introductory comments and has explained obscure or doubtful points in footnotes wherever necessary.

The Source Book should be particularly valuable to historians of science, but all who are concerned with the origins and growth of mathematics will find it interesting and useful.


logo for University of Minnesota Press
Springs of Scientific Creativity
Essays on Founders of Modern Science
Claire Aris
University of Minnesota Press, 1983
Springs of Scientific Creativity was first published in 1983.

Mathematician Henri Poincaré was boarding a bus when he realized that the transformations of non-Euclidean geometry were just those he needed in his research on the theory of functions. He did not have to interrupt his conversation, still less to verify the equation in detail; his insight was complete at that point. Poincaré’s insight into his own creativity, his awareness that preliminary cogitation and the working of the subconscious had prepared his mind for an intuitive flash of recognition, is just one of many possible analyses of scientific creativity, a subject as fascinating as it is elusive.

The authors of this book have chosen to search for the springs of scientific creativity by examining the lives and work of a dozen innovative thinkers in the fields of mathematics, physics, and chemistry from the seventeenth down to the mid-twentieth century. First prepared for delivery in a lecture series held at the University of Minnesota, these essays delve into the social, psychological, and intellectual factors that fostered creativity in the lives of Galileo Galilei, Isaac Newton, J. P. Joule, James Clerk Maxwell, Josiah Willard Gibbs, Lord Rayleigh, Elmer Sperry and Adrian Leverkühn, Walter Nernst, Albert Einstein, Erwin Schrödinger, Michael Polanyi, and John von Neumann.

The contributors are Thomas B. Settle, Richard S. Westfall, Donald S. L. Cardwell, C. W. F. Everitt, Martin J. Klein, John N. Howard, Thomas P. Hughes, Erwin N. Hiebert, Stanley Goldberg, Linda Wessels, William T. Scott, and Herman H. Goldstine.

front cover of Squaring the Circle
Squaring the Circle
The War between Hobbes and Wallis
Douglas M. Jesseph
University of Chicago Press, 1999
In 1655, the philosopher Thomas Hobbes claimed he had solved the centuries-old problem of "squaring of the circle" (constructing a square equal in area to a given circle). With a scathing rebuttal to Hobbes's claims, the mathematician John Wallis began one of the longest and most intense intellectual disputes of all time. Squaring the Circle is a detailed account of this controversy, from the core mathematics to the broader philosophical, political, and religious issues at stake.

Hobbes believed that by recasting geometry in a materialist mold, he could solve any geometric problem and thereby demonstrate the power of his materialist metaphysics. Wallis, a prominent Presbyterian divine as well as an eminent mathematician, refuted Hobbes's geometry as a means of discrediting his philosophy, which Wallis saw as a dangerous mix of atheism and pernicious political theory.

Hobbes and Wallis's "battle of the books" illuminates the intimate relationship between science and crucial seventeenth-century debates over the limits of sovereign power and the existence of God.

front cover of Statistics for Public Policy
Statistics for Public Policy
A Practical Guide to Being Mostly Right (or at Least Respectably Wrong)
Jeremy G. Weber
University of Chicago Press, 2024

A long-overdue guide on how to use statistics to bring clarity, not confusion, to policy work.

Statistics are an essential tool for making, evaluating, and improving public policy. Statistics for Public Policy is a crash course in wielding these unruly tools to bring maximum clarity to policy work. Former White House economist Jeremy G. Weber offers an accessible voice of experience for the challenges of this work, focusing on seven core practices: 

  • Thinking big-picture about the role of data in decisions
  • Critically engaging with data by focusing on its origins, purpose, and generalizability
  • Understanding the strengths and limits of the simple statistics that dominate most policy discussions
  • Developing reasons for considering a number to be practically small or large  
  • Distinguishing correlation from causation and minor causes from major causes
  • Communicating statistics so that they are seen, understood, and believed
  • Maintaining credibility by being right (or at least respectably wrong) in every setting
Statistics for Public Policy dispenses with the opacity and technical language that have long made this space impenetrable; instead, Weber offers an essential resource for all students and professionals working at the intersections of data and policy interventions. This book is all signal, no noise.

front cover of Statistics on the Table
Statistics on the Table
The History of Statistical Concepts and Methods
Stephen M. Stigler
Harvard University Press, 2002
This lively collection of essays examines in witty detail the history of some of the concepts involved in bringing statistical argument "to the table," and some of the pitfalls that have been encountered. The topics range from seventeenth-century medicine and the circulation of blood, to the cause of the Great Depression and the effect of the California gold discoveries of 1848 upon price levels, to the determinations of the shape of the Earth and the speed of light, to the meter of Virgil's poetry and the prediction of the Second Coming of Christ. The title essay tells how the statistician Karl Pearson came to issue the challenge to put "statistics on the table" to the economists Marshall, Keynes, and Pigou in 1911. The 1911 dispute involved the effect of parental alcoholism upon children, but the challenge is general and timeless: important arguments require evidence, and quantitative evidence requires statistical evaluation. Some essays examine deep and subtle statistical ideas such as the aggregation and regression paradoxes; others tell of the origin of the Average Man and the evaluation of fingerprints as a forerunner of the use of DNA in forensic science. Several of the essays are entirely nontechnical; all examine statistical ideas with an ironic eye for their essence and what their history can tell us about current disputes.

front cover of Studies in Weak Arithmetics, Volume 1
Studies in Weak Arithmetics, Volume 1
Edited by Patrick Cégielski
CSLI, 2009

The field of weak arithmetics is an application of logical methods to number theory that was developed by mathematicians, philosophers, and theoretical computer scientists. In this volume, after a general presentation of weak arithmetics, the following topics are studied: the properties of integers of a real closed field equipped with exponentiation; conservation results for the induction schema restricted to first-order formulas with a finite number of alternations of quantifiers; a survey on a class of tools called pebble games; the fact that the reals e and pi have approximations expressed by first-order formulas using bounded quantifiers; properties of infinite pictures depending on the universe of sets used; a language that simulates in a sufficiently nice manner all  algorithms of a certain restricted class; the logical complexity of the axiom of infinity in some variants of set theory without the axiom of  foundation; and the complexity to determine whether a trace is included in another one.


front cover of Studies in Weak Arithmetics, Volume 2
Studies in Weak Arithmetics, Volume 2
Edited by Patrick Cégielski, Charalampos Cornaros, and Costas Dimitracopoulos
CSLI, 2013
The field of weak arithmetics is an application of logical methods to number theory that was developed by mathematicians, philosophers, and theoretical computer scientists. New Studies in Weak Arithmetics is dedicated to the late Australian mathematician Alan Robert Woods (1953-2011), whose seminal thesis is published here for the first time. This volume also contains the unpublished but significant thesis of Hamid Lesan (1951-2006) as well as other original papers on topics addressed in Woods’s thesis and life’s work that were first presented at the 31st Journées sur les Arithmétiques Faibles meeting held in Samos, Greece, in 2012.

front cover of Studies in Weak Arithmetics, Volume 3
Studies in Weak Arithmetics, Volume 3
Edited by Patrick Cegielski, Ali Enayat, and Roman Kossak
CSLI, 2013
The field of weak arithmetics is an application of logical methods to number theory that was developed by mathematicians, philosophers, and theoretical computer scientists. This third volume in the weak arithmetics collection contains nine substantive papers based on lectures delivered during the last two meetings of the conference series Journées sur les Arithmétiques Faibles, held in 2014 at the University of Gothenburg, Sweden, and in 2015 at the City University of New York Graduate Center.

front cover of Studies on Divergent Series and Summability
Studies on Divergent Series and Summability
Walter Burton Ford, Ph.D.
University of Michigan Press, 1916
A publication of the University of Michigan’s Science Series, Studies on Divergent Series and Summability is based on lectures and courses given by Walter Burton Ford at the University of Michigan about infinite series and divergent series. According to Ford, the study of divergent series can be divided into two parts, the first regarding asymptotic series and the second regarding the theory of summability, both of which are discussed within this volume.

logo for University of Chicago Press
The Subject Matters
Classroom Activity in Math and Social Studies
Susan S. Stodolsky
University of Chicago Press, 1988
To achieve quality education in American schools, we need a better understanding of the way classroom instruction works. Susan S. Stodolsky addresses this need with her pioneering analysis of the interrelations between forms of instruction, levels of student involvement, and subject matter. Her intensive observation of fifth-grade math and social studies classes reveals that subject matter, a variable overlooked in recent research, has a profound effect on instructional practice.

Stodolsky presents a challenge to educational research. She shows that classroom activities are coherent actions shaped by the instructional context—especially what is taught. Stodolsky contradicts the received view of both teaching and learning as uniform and consistent. Individual teachers arrange instruction very differently, depending on what they are teaching, and students respond to instruction very differently, depending on the structure and demands of the lesson.

The instructional forms used in math classes, a "basic" subject, and social studies classes, an "enrichment" subject, differ even when the same teacher conducts both classes. Social studies classes show more diversity in activities, while math classes are very similar to one another. Greater variety is found in social studies within a given teacher's class and when different teachers' classes are compared. Nevertheless, in the classrooms Stodolsky studied, the range of instructional arrangements is very constricted.

Challenging the "back to basics" movement, Stodolsky's study indicates that, regardless of subject matter, students are more responsive to instruction that requires a higher degree of intellectual complexity and performance, to learning situations that involve them in interaction with their peers, and to active modes of learning. Stodolsky also argues that students develop ideas about how to learn a school subject, such as math, by participating in particular activities tied to instruction in the subject. These conceptions about learning are unplanned but enduring and significant consequences of schooling.

The Subject Matters has important implications for instructional practice and the training, education, and supervision of teachers. Here is a new way of understanding the dynamics of teaching and learning that will transform how we think about schools and how we study them.

front cover of Sum of the Parts
Sum of the Parts
The Mathematics and Politics of Region, Place, and Writing
Kent C. Ryden
University of Iowa Press, 2011
Proponents of the new regional history understand that regional identities are constructed and contested, multifarious and not monolithic, that they involve questions of dominance and power, and that their nature is inherently political. In this lively new book, writing in the spirit of these understandings, Kent Ryden engagingly examines works of American regional writing to show us how literary partisans of place create and recreate, attack and defend, argue over and dramatize the meaning and identity of their regions in the pages of their books.
 
Cleverly drawing upon mathematical models that complement his ideas and focusing on both classic and contemporary literary regionalists, Ryden demonstrates that regionalism, in the cultural sense, retains a great deal of power as a framework for literary interpretation. For New England he examines such writers as Robert Frost and Hayden Carruth, Mary E. Wilkins Freeman and Edith Wharton, and Carolyn Chute and Russell Banks to demonstrate that today’s regionalists inspire closer, more democratic readings of life and landscape. For the West and South, he describes Wallace Stegner’s and William Faulkner’s use of region to, respectively, exclude and evade or confront and indict. For the Midwest, he focuses on C. J. Hribal, William Least Heat-Moon, Paul Gruchow, and others to demonstrate that midwesterners continually construct the past anew from the materials at hand, filling the seemingly empty midlands with history and significance.
 
Ryden reveals that there are many Wests, many New Englands, many Souths, and many Midwests, all raising similar issues about the cultural politics of region and place. Writing with appealing freshness and a sense of adventure, he shows us that place, and the stories that emerge from and define place, can be a source of subversive energy that blunts the homogenizing force of region, inscribing marginal places and people back onto the imaginative surface of the landscape when we read it on a place-by-place, landscape-by-landscape, book-by-book basis.

front cover of Surfaces and Superposition
Surfaces and Superposition
Field Notes on some Geometrical Excavations
Ernest W. Adams
CSLI, 2001
Buildings appear to rest on top of the earth's surface, yet the surface is actually permeated by the buildings' foundations, which lie out of view. If a foundation's blueprints are unavailable, as in archaeology, excavation would be needed to discover what actually supports a specific building. Analogously, the fields of geometry and topology have easily observable concepts resting on the surface of theoretical underpinnings that have not been completely discovered, unearthed, or understood. Moreover, geometrical and topological principles of superposition provide insight into probing the connections between accessible superstructures and their hidden underpinnings. This book develops and applies these insights broadly, from physics to mathematics to philosophy. Even analogies and abstractions can now be seen as foundational superpositions.

This book examines the dimensionality of surfaces and how superpositions can make stable frameworks, and it gives a quasi-Leibnizian account of the relative "spaces" that are defined by these frameworks. Concluding chapters deal with problems concerning the spatio-temporal frameworks of physical theories and implications for theories of visual geometry. The numerous illustrations, while surprisingly simple, are satisfyingly clear.

front cover of Symbols and Things
Symbols and Things
Material Mathematics in the Eighteenth and Nineteenth Centuries
Kevin Lambert
University of Pittsburgh Press, 2021

In the steam-powered mechanical age of the eighteenth and nineteenth centuries, the work of late Georgian and early Victorian mathematicians depended on far more than the properties of number. British mathematicians came to rely on industrialized paper and pen manufacture, railways and mail, and the print industries of the book, disciplinary journal, magazine, and newspaper. Though not always physically present with one another, the characters central to this book—from George Green to William Rowan Hamilton—relied heavily on communication technologies as they developed their theories in consort with colleagues. The letters they exchanged, together with the equations, diagrams, tables, or pictures that filled their manuscripts and publications, were all tangible traces of abstract ideas that extended mathematicians into their social and material environment. Each chapter of this book explores a thing, or assembling of things, mathematicians needed to do their work—whether a textbook, museum, journal, library, diagram, notebook, or letter—all characteristic of the mid-nineteenth-century British taskscape, but also representative of great change to a discipline brought about by an industrialized world in motion.


logo for University of Chicago Press
Systems of Linear Inequalities
A. S. Solodovnikov
University of Chicago Press, 1980
This volume describes the relationship between systems of linear inequalities and the geometry of convex polygons, examines solution sets for systems of linear inequalities in two and three unknowns (extension of the processes introduced to systems in any number of unknowns is quite simple), and examines questions of the consistency or inconsistency of such systems. Finally, it discusses the field of linear programming, one of the principal applications of the theory of systems of linear inequalities. A proof of the duality theorem of linear programming is presented in the last section.

front cover of Systems with Small Dissipation
Systems with Small Dissipation
V. B. Braginsky, V. P. Mitrofanov, and V. I. Panov
University of Chicago Press, 1985
Electromagnetic and mechanical oscillators are crucial in such diverse fields as electrical engineering, microwave technology, optical technology, and experimental physics. For example, such oscillators are the key elements in instruments for detecting extremely weak mechanical forces and electromagnetic signals, and they are essential to highly stable standards of time and frequency. The central problem in developing such instruments is to construct oscillators that are as perfectly simple harmonic as possible; the largest obstacle is the oscillator's dissipation and the fluctuating forces associated with it.

This book, first published in Russian in 1981 and updated with new data for this English edition, is a treatise on the sources of dissipation and other defects in mechanical and electromagnetic oscillators and on practical techniques for minimizing such defects. Written by a team of researchers from Moscow State University who are leading experts in the field, the book is a virtual encyclopedia of theoretical formulas, experimental techniques, and practical lore derived from twenty-five years of experience. Intended for the experimenter who wishes to construct near-perfect instrumentation, the book provides information on everything from the role of phonon-phonon scattering as a fundamental source of dissipation to the effectiveness of a thin film of pork fat in reducing the friction between a support wire and a mechanically oscillating sapphire crystal.

The researchers that V. B. Braginsky has led since the mid-1960s are best known in the West for their contributions to the technology of gravitational-wave detection, their experimental search for quarks, their test of the equivalence principle, and their invention of new experimental techniques for high-precision measurement, including "quantum nondemolition measurements." Here, for the first time, they provide a thorough overview of the practical knowledge and experimental methods that have earned them a worldwide reputation for ingenuity, talent, and successful technique.

