front cover of Across Space and Time
Across Space and Time
Papers from the 41st Conference on Computer Applications and Quantitative Methods in Archaeology, Perth, 25-28 March 2013
Edited by Arianna Traviglia
Amsterdam University Press, 2015
This volume presents a selection of the best papers presented at the forty-first annual Conference on Computer Applications and Quantitative Methods in Archaeology. The theme for the conference was “Across Space and Time” and the papers explore a multitude of topics related to that concept, including databases, the semantic Web, geographical information systems, data collection and management, and more.

logo for The Institution of Engineering and Technology
Algorithmic and Knowledge-based CAD for VLSI
Gaynor Taylor
The Institution of Engineering and Technology, 1992
The continuing growth in the size and complexity of VLSI devices requires a parallel development of well-designed, efficient CAD tools. The majority of commercially available tools are based on an algorithmic approach to the problem and there is a continuing research effort aimed at improving these. The sheer complexity of the problem has, however, led to an interest in examining the applicability of expert systems and other knowledge-based techniques to certain problems in the area, and a number of results are becoming available. The aim of this book is to sample the present state of the art in CAD for VLSI, and it covers both newly developed algorithms and applications of techniques from the artificial intelligence community. The editors believe it will prove of interest to all engineers concerned with the design and testing of integrated circuits and systems.

front cover of Algorithms of Education
Algorithms of Education
How Datafication and Artificial Intelligence Shape Policy
Kalervo N. Gulson
University of Minnesota Press, 2022

A critique of what lies behind the use of data in contemporary education policy 

While the science fiction tales of artificial intelligence eclipsing humanity are still very much fantasies, in Algorithms of Education the authors tell real stories of how algorithms and machines are transforming education governance, providing a fascinating discussion and critique of data and its role in education policy.

Algorithms of Education explores how, for policy makers, today’s ever-growing amount of data creates the illusion of greater control over the educational futures of students and the work of school leaders and teachers. In fact, the increased datafication of education, the authors argue, offers less and less control, as algorithms and artificial intelligence further abstract the educational experience and distance policy makers from teaching and learning. Focusing on the changing conditions for education policy and governance, Algorithms of Education proposes that schools and governments are increasingly turning to “synthetic governance”—a governance where what is human and machine becomes less clear—as a strategy for optimizing education.

Exploring case studies of data infrastructures, facial recognition, and the growing use of data science in education, Algorithms of Education draws on a wide variety of fields—from critical theory and media studies to science and technology studies and education policy studies—mapping the political and methodological directions for engaging with datafication and artificial intelligence in education governance. According to the authors, we must go beyond the debates that separate humans and machines in order to develop new strategies for, and a new politics of, education.


front cover of The Analysis of Animal Bones from Archeological Sites
The Analysis of Animal Bones from Archeological Sites
Richard G. Klein and Kathryn Cruz-Uribe
University of Chicago Press, 1984
In growing numbers, archeologists are specializing in the analysis of excavated animal bones as clues to the environment and behavior of ancient peoples. This pathbreaking work provides a detailed discussion of the outstanding issues and methods of bone studies that will interest zooarcheologists as well as paleontologists who focus on reconstructing ecologies from bones. Because large samples of bones from archeological sites require tedious and time-consuming analysis, the authors also offer a set of computer programs that will greatly simplify the bone specialist's job.

After setting forth the interpretive framework that governs their use of numbers in faunal analysis, Richard G. Klein and Kathryn Cruz-Uribe survey various measures of taxonomic abundance, review methods for estimating the sex and age composition of a fossil species sample, and then give examples to show how these measures and sex/age profiles can provide useful information about the past. In the second part of their book, the authors present the computer programs used to calculate and analyze each numerical measure or count discussed in the earlier chapters. These elegant and original programs, written in BASIC, can easily be used by anyone with a microcomputer or with access to large mainframe computers.

front cover of Arabic Computational Linguistics
Arabic Computational Linguistics
Edited by Ali Farghaly
CSLI, 2010
Arabic is an exciting—yet challenging—language for scholars because many of its linguistic properties have not been fully described. Arabic Computational Linguistics documents the recent work of researchers in both academia and industry who have taken up the challenge of solving the real-life problems posed by an understudied language.

This comprehensive volume explores new Arabic machine translation systems, innovations in speech recognition and mention detection, tree banks, and linguistic corpora. Arabic Computational Linguistics will be an indispensable reference for language researchers and practitioners alike.

logo for The Institution of Engineering and Technology
Artificial Intelligence Techniques in Power Systems
Kevin Warwick
The Institution of Engineering and Technology, 1997
Research in artificial intelligence has developed many techniques and methodologies that can be either adapted or used directly to solve complex power system problems. A variety of such problems are covered in this book including reactive power control, alarm analysis, fault diagnosis, protection systems and load forecasting. Methods such as knowledge-based (expert) systems, fuzzy logic, neural networks and genetic algorithms are all first introduced and then investigated in terms of their applicability in the power systems field. The book, therefore, serves as both an introduction to the use of artificial intelligence techniques for those from a power systems background and as an overview of the power systems implementation area for those from an artificial intelligence computing or control background. It is structured so that it is suitable for various levels of reader, covering basic principles as well as applications and case studies. The most popular methods and the most fruitful application fields are considered in more detail. The book contains contributions from top international authors and will be an extremely useful text for all those with an interest in the field.

front cover of Augmented Exploitation
Augmented Exploitation
Artificial Intelligence, Automation and Work
Phoebe Moore
Pluto Press, 2021
Artificial Intelligence is a seemingly neutral technology, but it is increasingly used to manage workforces and make decisions to hire and fire employees. Its proliferation in the workplace gives the impression of a fairer, more efficient system of management. A machine can't discriminate, after all. Augmented Exploitation explores the reality of the impact of AI on workers' lives. While the consensus is that AI is a completely new way of managing a workplace, the authors show that, on the contrary, AI is used as most technologies are used under capitalism: as a smokescreen that hides the deep exploitation of workers. Going beyond platform work and the gig economy, the authors explore emerging forms of algorithmic governance and AI-augmented apps that have been developed to utilise innovative ways to collect data about workers and consumers, as well as to keep wages and worker representation under control. They also show that workers are not taking this lying down, providing case studies of new and exciting forms of resistance that are springing up across the globe.

front cover of Automaton Theories of Human Sentence Comprehension
Automaton Theories of Human Sentence Comprehension
John T. Hale
CSLI, 2014
By relating grammar to cognitive architecture, John T. Hale shows step-by-step how incremental parsing works in models of perceptual processing and how specific learning rules might lead to frequency-sensitive preferences. Along the way, Hale reconsiders garden-pathing, the parallel/serial distinction, and information-theoretical complexity metrics, such as surprisal. This book is a must for cognitive scientists of language.

front cover of Big Books in Times of Big Data
Big Books in Times of Big Data
Inge Van de Ven
Leiden University Press, 2019
This book explores the aesthetics, medial affordances, and cultural economics of monumental literary works of the digital age and offers a comparative and cross-cultural perspective on a wide range of contemporary writers. Using an international archive of hefty tomes by authors such as Mark Z. Danielewski, Roberto Bolaño, Elena Ferrante, Karl Ove Knausgård, George R.R. Martin, Jonathan Franzen, and William T. Vollmann, van de Ven investigates multiple strands of bigness that speak to the tenuous position of print literature in the present but also to the robust stature of literary discourse within our age of proliferating digital media. Her study makes a case for the cultural agency of the big book—as a material object and a discursive phenomenon, entangled in complex ways with questions of canonicity, materiality, gender, and power. Van de Ven takes us into a contested terrain beyond the 1,000-page mark, where issues of scale and reader comprehension clash with authorial aggrandizement and the pleasures of binge reading and serial consumption.

front cover of Big Data for Twenty-First-Century Economic Statistics
Big Data for Twenty-First-Century Economic Statistics
Edited by Katharine G. Abraham, Ron S. Jarmin, Brian C. Moyer, and Matthew D. Shapiro
University of Chicago Press, 2022
The papers in this volume analyze the deployment of Big Data to solve both existing and novel challenges in economic measurement. 

The existing infrastructure for the production of key economic statistics relies heavily on data collected through sample surveys and periodic censuses, together with administrative records generated in connection with tax administration. The increasing difficulty of obtaining survey and census responses threatens the viability of existing data collection approaches. The growing availability of new sources of Big Data—such as scanner data on purchases, credit card transaction records, payroll information, and prices of various goods scraped from the websites of online sellers—has changed the data landscape. These new sources of data hold the promise of allowing the statistical agencies to produce more accurate, more disaggregated, and more timely economic data to meet the needs of policymakers and other data users. This volume documents progress made toward that goal and the challenges to be overcome to realize the full potential of Big Data in the production of economic statistics. It describes the deployment of Big Data to solve both existing and novel challenges in economic measurement, and it will be of interest to statistical agency staff, academic researchers, and serious users of economic statistics.

logo for University of Chicago Press
Chicago Guide to Preparing Electronic Manuscripts
The University of Chicago Press
University of Chicago Press, 1987
This guide to preparing manuscripts on computer offers authors and publishers practical assistance on how to use authors' disks or tapes for typesetting. When the thirteenth edition of The Chicago Manual of Style was published in 1982, the impact of personal computers on the publishing process had just begun to be felt. This new book supplements information in the Chicago Manual by covering the rapidly changing subject of electronic manuscripts. Since the early 1980s more and more authors have been producing manuscripts on computers and expecting their publishers to make use of the electronic version. For a number of reasons, including the proliferation of incompatible machines and software, however, publishers have not always found it easy to work with electronic manuscripts. The University of Chicago Press has been doing so since 1981, and in this book passes on the results of six years' experience with preparing such manuscripts and converting them to type.

front cover of Coding Streams of Language
Coding Streams of Language
Techniques for the Systematic Coding of Text, Talk, and Other Verbal Data
Cheryl Geisler
University Press of Colorado, 2020
Coding Streams of Language is a systematic and practical research guide to coding verbal data in all its forms—written texts and oral talk, in print or online, gathered through surveys and interviews, database searches, or audio or video recordings. The thoughtful, detailed advice found in this book will help readers carry out analyses of language that are both systematic and complex. Situating themselves in the relatively new line of mixed methods research, the authors provide guidance on combining context-based inquiry with quantitative tools for examining big-picture patterns, an approach that acknowledges the complexity of language use. Throughout Coding Streams of Language, exercises, points for discussion, and milestones help guide readers through an analytic project. As a supplement to the book, YouTube videos demonstrate tools and techniques.

front cover of Collecting Experiments
Collecting Experiments
Making Big Data Biology
Bruno J. Strasser
University of Chicago Press, 2019
Databases have revolutionized nearly every aspect of our lives. Information of all sorts is being collected on a massive scale, from Google to Facebook and well beyond. But as the amount of information in databases explodes, we are forced to reassess our ideas about what knowledge is, how it is produced, to whom it belongs, and who can be credited for producing it.
Every scientist working today draws on databases to produce scientific knowledge. Databases have become more common than microscopes, voltmeters, and test tubes, and the increasing amount of data has led to major changes in research practices and profound reflections on the proper professional roles of data producers, collectors, curators, and analysts.
Collecting Experiments traces the development and use of data collections, especially in the experimental life sciences, from the early twentieth century to the present. It shows that the current revolution is best understood as the coming together of two older ways of knowing—collecting and experimenting, the museum and the laboratory. Ultimately, Bruno J. Strasser argues that by serving as knowledge repositories, as well as indispensable tools for producing new knowledge, these databases function as digital museums for the twenty-first century.

front cover of Collecting Lives
Collecting Lives
Critical Data Narrative as Modernist Aesthetic in Early Twentieth-Century U.S. Literatures
Elizabeth Rodrigues
University of Michigan Press, 2022
On a near-daily basis, data is being used to narrate our lives. Categorizing algorithms draw from amassed personal data to assign narrative destinies to individuals at crucial junctures, simultaneously predicting and shaping the paths of our lives. Data is commonly assumed to bring us closer to objectivity, but the narrative paths these algorithms assign seem, more often than not, to replicate biases about who an individual is and could become.

While the social effects of such algorithmic logics seem new and newly urgent to consider, Collecting Lives looks to the late nineteenth- and early twentieth-century U.S. to provide an instructive prehistory to the underlying question of the relationship between data, life, and narrative. Rodrigues contextualizes the application of data collection to human selfhood in order to uncover a modernist aesthetic of data that offers an alternative to the algorithmic logic pervading our sense of data’s revelatory potential. Examining the work of W. E. B. Du Bois, Henry Adams, Gertrude Stein, and Ida B. Wells-Barnett, Rodrigues asks how each of these authors draws on their work in sociology, history, psychology, and journalism to formulate a critical data aesthetic as they attempt to answer questions of identity around race, gender, and nation both in their research and their life writing. These data-driven modernists not only tell different life stories with data, they tell life stories differently because of data.

front cover of Composing Media Composing Embodiment
Composing Media Composing Embodiment
Kristin L. Arola and Anne Frances Wysocki
Utah State University Press, 2012

“What any body is—and is able to do—cannot be disentangled from the media we use to consume and produce texts.” ---from the Introduction.

Kristin Arola and Anne Wysocki argue that composing in new media is composing the body—is embodiment. In Composing (Media) = Composing (Embodiment), they have brought together a powerful set of essays that agree on the need for compositionists—and their students—to engage with a wide range of new media texts. These chapters explore how texts of all varieties mediate and thereby contribute to the human experiences of communication, of self, the body, and composing. Sample assignments and activities exemplify how this exploration might proceed in the writing classroom.

Contributors here articulate ways to understand how writing enables the experience of our bodies as selves, and at the same time to see the work of (our) writing in mediating selves to make them accessible to institutional perceptions and constraints. These writers argue that what a body does, and can do, cannot be disentangled from the media we use, nor from the times and cultures and technologies with which we engage.

To the discipline of composition, this is an important discussion because it clarifies the impacts of literacy on citizens, freedoms, and societies. To the classroom, it is important because it helps compositionists to support their students as they enact, learn, and reflect upon their own embodied and embodying writing.


logo for The Institution of Engineering and Technology
Computer Control of Real-Time Processes
S. Bennett
The Institution of Engineering and Technology, 1990
This book provides an introduction to many aspects of computer control. It covers techniques for control algorithm design and tuning of controllers; computer communications; parallel processing; and software design and implementation. The theoretical material is supported by case studies covering power systems control, robot manipulators, liquid natural gas vaporisers, batch control of chemical processes, and active control of aircraft.

front cover of Computers in Education
Computers in Education
A Half-Century of Innovation
Robert Smith
CSLI, 2015
Described by the New York Times as a visionary “pioneer in computerized learning,” Patrick Suppes (1922-2014) and his many collaborators at Stanford University conducted research on the development, commercialization, and use of computers in education from 1963 to 2013. Computers in Education synthesizes this wealth of scholarship into a single succinct volume that highlights the profound interconnections of technology in education. By capturing the great breadth and depth of this research, this book offers an accessible introduction to Suppes’s striking work.

front cover of The Core and the Periphery
The Core and the Periphery
Data-Driven Perspectives on Syntax Inspired by Ivan A. Sag
Edited by Philip Hofmeister and Elisabeth Norcliffe
CSLI, 2013
The Core and the Periphery is a collection of papers inspired by the linguistics career of Ivan A. Sag (1949-2013), written to commemorate his many contributions to the field. Sag was professor of linguistics at Stanford University from 1979 to 2013; served as the director of the Symbolic Systems Program from 2005 to 2009; authored, co-authored, or edited fifteen volumes on linguistics; and was at the forefront of non-transformational approaches to syntax. Reflecting the breadth of Sag’s theoretical interests and approaches to linguistic problems, the papers collected here tackle a range of grammar-related issues using corpora, intuitions, and laboratory experiments. They are united by their use of and commitment to rich datasets and share the perspective that the best theories of grammar attempt to account for the full diversity and complexity of language data.

logo for Assoc of College & Research Libraries
Curating Research Data Volume One
Practical Strategies for Your Digital Repository
Lisa R. Johnston
Assoc of College & Research Libraries, 2017

logo for Assoc of College & Research Libraries
Curating Research Data Volume Two
A Handbook of Current Practice
Lisa R. Johnston
Assoc of College & Research Libraries, 2017

logo for American Library Association
Data Management for Libraries
Carly A. Strasser
American Library Association, 2013

front cover of Data Privacy During Pandemics
Data Privacy During Pandemics
A Scorecard Approach for Evaluating the Privacy Implications of COVID-19 Mobile Phone Surveillance Programs
Benjamin Boudreaux
RAND Corporation, 2020
As part of the response to the COVID-19 pandemic, governments worldwide have deployed mobile phone surveillance programs to augment public health interventions. However, these programs raise privacy concerns. The authors of this report examine whether two goals can be achieved concurrently: the use of mobile phones as public health surveillance tools to help manage COVID‑19 and future crises, and the protection of privacy and civil liberties.

front cover of Database Aesthetics
Database Aesthetics
Art in the Age of Information Overflow
Victoria Vesna
University of Minnesota Press, 2007

Database Aesthetics examines the database as cultural and aesthetic form, explaining how artists have participated in network culture by creating data art. The essays in this collection look at how an aesthetic emerges when artists use the vast amounts of available information as their medium. Here, the ways information is ordered and organized become artistic choices, and artists have an essential role in influencing and critiquing the digitization of daily life.

Contributors: Sharon Daniel, U of California, Santa Cruz; Steve Deitz, Carleton College; Lynn Hershman Leeson, U of California, Davis; George Legrady, U of California, Santa Barbara; Eduardo Kac, School of the Art Institute of Chicago; Norman Klein, California Institute of the Arts; John Klima; Lev Manovich, U of California, San Diego; Robert F. Nideffer, U of California, Irvine; Nancy Paterson, Ontario College of Art and Design; Christiane Paul, School of Visual Arts in New York; Marko Peljhan, U of California, Santa Barbara; Warren Sack, U of California, Santa Cruz; Bill Seaman, Rhode Island School of Design; Grahame Weinbren, School of Visual Arts, New York.

Victoria Vesna is a media artist, and professor and chair of the Department of Design and Media Arts at the University of California, Los Angeles.


logo for Assoc of College & Research Libraries
The Academic Data Librarian in Theory and Practice
Lynda Kellam
Assoc of College & Research Libraries, 2016

front cover of Data-Centric Biology
Data-Centric Biology
A Philosophical Study
Sabina Leonelli
University of Chicago Press, 2016
In recent decades, there has been a major shift in the way researchers process and understand scientific data. Digital access to data has revolutionized ways of doing science in the biological and biomedical fields, leading to a data-intensive approach to research that uses innovative methods to produce, store, distribute, and interpret huge amounts of data. In Data-Centric Biology, Sabina Leonelli probes the implications of these advancements and confronts the questions they pose. Are we witnessing the rise of an entirely new scientific epistemology? If so, how does that alter the way we study and understand life—including ourselves?

 Leonelli is the first scholar to use a study of contemporary data-intensive science to provide a philosophical analysis of the epistemology of data. In analyzing the rise, internal dynamics, and potential impact of data-centric biology, she draws on scholarship across diverse fields of science and the humanities—as well as her own original empirical material—to pinpoint the conditions under which digitally available data can further our understanding of life. Bridging the divide between historians, sociologists, and philosophers of science, Data-Centric Biology offers a nuanced account of an issue that is of fundamental importance to our understanding of contemporary scientific practices.

front cover of Data-Driven Modeling, Filtering and Control
Data-Driven Modeling, Filtering and Control
Methods and applications
Carlo Novara
The Institution of Engineering and Technology, 2019
The scientific research in many engineering fields has been shifting from traditional first-principle-based to data-driven or evidence-based theories. The latter methods may enable better system design, based on more accurate and verifiable information.

front cover of Defense Resource Planning Under Uncertainty
Defense Resource Planning Under Uncertainty
An Application of Robust Decision Making to Munitions Mix Planning
Robert J. Lempert
RAND Corporation, 2016
Defense planning faces significant uncertainties. This report applies robust decision making (RDM) to the air-delivered munitions mix challenge. RDM is a quantitative decision support methodology designed to inform decisions under conditions of deep uncertainty and complexity. This proof-of-concept demonstration suggests that RDM could help defense planners make plans more robust to a wide range of hard-to-predict futures.

front cover of Democracy’s Detectives
Democracy’s Detectives
The Economics of Investigative Journalism
James T. Hamilton
Harvard University Press, 2016

Winner of the Goldsmith Book Prize, Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School of Government
Winner of the Tankard Book Award, Association for Education in Journalism and Mass Communication
Winner of the Frank Luther Mott–Kappa Tau Alpha Journalism & Mass Communication Research Award

In democratic societies, investigative journalism holds government and private institutions accountable to the public. From firings and resignations to changes in budgets and laws, the impact of this reporting can be significant—but so too are the costs. As newspapers confront shrinking subscriptions and advertising revenue, who is footing the bill for journalists to carry out their essential work? Democracy’s Detectives puts investigative journalism under a magnifying glass to clarify the challenges and opportunities facing news organizations today.

“Hamilton’s book presents a thoughtful and detailed case for the indispensability of investigative journalism—and just at the time when we needed it. Now more than ever, reporters can play an essential role as society’s watchdogs, working to expose corruption, greed, and injustice in the years to come. For this reason, Democracy’s Detectives should be taken as both a call to arms and a bracing reminder, for readers and journalists alike, of the importance of the profession.”
—Anya Schiffrin, The Nation

“A highly original look at exactly what the subtitle promises…Has this topic ever been more important than this year?”
—Tyler Cowen, Marginal Revolution


front cover of Dialectical Rhetoric
Dialectical Rhetoric
Bruce McComiskey
Utah State University Press, 2015

In Dialectical Rhetoric, Bruce McComiskey argues that the historical conflict between rhetoric and dialectic can be overcome in ways useful to both composition theory and the composition classroom.

Historically, dialectic has taken two forms in relation to rhetoric. First, it has been the logical development of linear propositions leading to necessary conclusions, a one-dimensional form that was the counterpart of rhetorics in which philosophical, metaphysical, and scientific truths were conveyed with as little cognitive interference from language as possible. Second, dialectic has been the topical development of opposed arguments on controversial issues and the judgment of their relative strengths and weaknesses, usually in political and legal contexts, a two-dimensional form that was the counterpart of rhetorics in which verbal battles over competing probabilities in public institutions revealed distinct winners and losers.

The discipline of writing studies is on the brink of developing a new relationship between dialectic and rhetoric, one in which dialectics and rhetorics mediate and negotiate different arguments and orientations that are engaged in any rhetorical situation. This new relationship consists of a three-dimensional hybrid art called “dialectical rhetoric,” whose method is based on five topoi: deconstruction, dialogue, identification, critique, and juxtaposition. Three-dimensional dialectical rhetorics function effectively in a wide variety of discursive contexts, including digital environments, since they can invoke contrasts in stagnant contexts and promote associations in chaotic contexts. Dialectical Rhetoric calls for more attention to three-dimensional rhetorics within the rhetoric and composition community.


front cover of Digital Critical Editions
Digital Critical Editions
Edited by Daniel Apollon, Claire Belisle, and Philippe Regnier
University of Illinois Press, 2014

Provocative yet sober, Digital Critical Editions examines how transitioning from print to a digital milieu deeply affects how scholars deal with the work of editing critical texts. On one hand, forces like changing technology and evolving reader expectations lead to the development of specific editorial products, while on the other hand, they threaten traditional forms of knowledge and methods of textual scholarship.

Using the experiences of philologists, text critics, text encoders, scientific editors, and media analysts, Digital Critical Editions ranges from philology in ancient Alexandria to the vision of user-supported online critical editing, from peer-directed texts distributed to a few to community-edited products shaped by the many. The authors discuss the production and accessibility of documents, the emergence of tools used in scholarly work, new editing regimes, and how the readers' expectations evolve as they navigate digital texts. The goal: exploring questions such as, What kind of text is produced? Why is it produced in this particular way?

Digital Critical Editions provides digital editors, researchers, readers, and technological actors with insights for addressing disruptions that arise from the clash of traditional and digital cultures, while also offering a practical roadmap for processing traditional texts and collections with today's state-of-the-art editing and research techniques, thereby addressing readers' emerging reading habits.


logo for Assoc of College & Research Libraries
Digital Humanities in the Library
Challenges and Opportunities for Subject Specialists
Arianne Hartsell-Gundy
Assoc of College & Research Libraries, 2015

logo for Intellect Books
Digital Magazine Design
With Case Studies
Daniel Carpenter and Paul Honeywill
Intellect Books, 2000
Publishers of contemporary magazines invest more and more money in developing innovative design for an increasingly design-literate reader. Innovation, however, must always be grounded in the underlying conventions of legibility to ensure loyal readership and economic success.

Digital Magazine Design provides detailed descriptions of all the necessary rules of design, and uses these rules to cast a critical eye over a selection of contemporary high-street magazines.

The second part of this volume, written by publishing students, demonstrates how the tools of design can be applied to the analysis and practice of contemporary magazine design.

Through an understanding of the relationship between text, image and design, and the ability to make informed judgements, the student is able to critically evaluate all publishable material.

front cover of Digital Poetics
Digital Poetics
Hypertext, Visual-Kinetic Text and Writing in Programmable Media
Loss Pequeño Glazier
University of Alabama Press, 2001
Glazier investigates the ways in which computer technology has influenced and transformed the writing and dissemination of poetry
In Digital Poetics, Loss Pequeño Glazier argues that the increase in computer technology and accessibility, specifically the World Wide Web, has created a new and viable place for the writing and dissemination of poetry. Glazier’s work not only introduces the reader to the current state of electronic writing but also outlines the historical and technical contexts out of which electronic poetry has emerged and demonstrates some of the possibilities of the new medium.
Glazier examines three principal forms of electronic textuality: hypertext, visual/kinetic text, and works in programmable media. He considers avant-garde poetics and its relationship to the on-line age, the relationship between web “pages” and book technology, and the way in which certain kinds of web constructions are in and of themselves a type of writing. With convincing alacrity, Glazier argues that the materiality of electronic writing has changed the idea of writing itself. He concludes that electronic space is the true home of poetry and, in the 20th century, has become the ultimate “space of poesis.”
Digital Poetics will attract a readership of scholars and students interested in contemporary creative writing and the potential of electronic media for imaginative expression.

front cover of Digital Rhetoric
Digital Rhetoric
Theory, Method, Practice
Douglas Eyman
University of Michigan Press, 2015
What is “digital rhetoric”? This book aims to answer that question by looking at a number of interrelated histories, as well as evaluating a wide range of methods and practices from fields in the humanities, social sciences, and information sciences to determine what might constitute the work and the world of digital rhetoric. The advent of digital and networked communication technologies prompts renewed interest in basic questions: What counts as a text? Can traditional rhetoric operate in digital spheres, or will it need to be revised? Or will we need to invent new rhetorical practices altogether?

Through examples and consideration of digital rhetoric theories, methods for both researching and making in digital rhetoric fields, and examples of digital rhetoric pedagogy, scholarship, and public performance, this book delivers a broad overview of digital rhetoric. In addition, Douglas Eyman provides historical context by investigating the histories and boundaries that arise from mapping this emerging field and by focusing on the theories that have been taken up and revised by digital rhetoric scholars and practitioners. Both traditional and new methods are examined for the tools they provide, tools that can be used both to study digital rhetoric and to make new forms that draw on digital rhetoric for their persuasive power.


front cover of Digital Typography
Digital Typography
Donald E. Knuth
CSLI, 1998
In this collection, the second in the series, Knuth explores the relationship between computers and typography. The present volume is, in the author's words, a legacy of all the work he has done on typography. As is well known, what Knuth thought would be a few years' leave from his main work on the art of computer programming turned into a typographic detour lasting more than a decade. The type designers, punch cutters, typographers, book historians, and scholars who visited Stanford during this period gave the university what some consider its golden age of digital typography. By the author's own admission, this is one of the most difficult books he has prepared. It is truly a work that only Knuth himself could have produced.

front cover of Distant Readings of Disciplinarity
Distant Readings of Disciplinarity
Knowing and Doing in Composition/Rhetoric Dissertations
by Benjamin Miller
Utah State University Press, 2022
In Distant Readings of Disciplinarity, Benjamin Miller brings a big data approach to the study of disciplinarity in rhetoric, composition, and writing studies (RCWS) by developing scalable maps of the methods and topics of several thousand RCWS dissertations from 2001 to 2015. Combining charts and figures with engaging and even playful prose, Miller offers an accessible model of how large-scale data-driven research can advance disciplinary understanding—both answering and amplifying the call to add replicable data analysis and visualization to the mix of methods regularly employed in the field.

Writing studies has long been marked by a multitude of methods and interlocking purposes, partaking of not just humanities approaches but also social scientific ones, with data drawn from interviews and surveys alongside historical and philosophical arguments and with corpus analytics in large-scale collections jostling against small-scale case studies of individuals. These areas of study aren’t always cleanly separable; shifting modes mark the discipline as open and welcoming to many different angles of research. The field needs to embrace that vantage point and generate new degrees of familiarity with methods beyond those of any individual scholar.
Not only a training genre and not only a knowledge-making genre, the dissertation is also a discipline-producing genre. Illustrating what the field has been studying, and how, Distant Readings of Disciplinarity supports more fruitful collaborations within and across research areas and methods.

front cover of The End of Books--or Books Without End?
The End of Books--or Books Without End?
Reading Interactive Narratives
J. Yellowlees Douglas
University of Michigan Press, 2001
Of all developments surrounding hypermedia, none has been as hotly or frequently debated as the conjunction of fiction and digital technology. J. Yellowlees Douglas considers the implications of this union. She looks at the new light that interactive narratives may shed on theories of reading and interpretation and the possibilities for hypertext novels, World Wide Web-based short stories, and cinematic, interactive narratives on CD-ROM. She confronts questions that are at the center of the current debate: Does an interactive story demand too much from readers? Does the concept of readerly choice destroy the integrity of an author's vision? Does interactivity turn reading fiction from "play" into "work"--too much work? Will hypertext fiction overtake the novel as a form of art or entertainment? And what might future interactive books look like?
The book examines criticism on interactive fiction from both proponents and skeptics and examines similarities and differences between print and hypertext fiction. It looks closely at critically acclaimed interactive works, including Stuart Moulthrop's Victory Garden and Michael Joyce's Afternoon: A Story, illuminating how these hypertext narratives "work." While she sees this as a still-evolving technology and medium, the author identifies possible developments for the future of storytelling from outstanding examples of Web-based fiction and CD-ROM narratives, possibilities that will enable narratives both to portray the world with greater realism and to transcend the boundaries of novels and films, character and plot alike.
This lively, accessibly written volume will appeal to those interested in technology and cyberculture, as well as to readers familiar with literary criticism and modern fiction.
J. Yellowlees Douglas is the Director of the William and Grace Dial Center for Written and Oral Communication, University of Florida. She is the author of numerous articles and essays on the subject of hypertext and interactive literature.

front cover of Enumerations
Enumerations
Data and Literary Study
Andrew Piper
University of Chicago Press, 2018
For well over a century, academic disciplines have studied human behavior using quantitative information. Until recently, however, the humanities have remained largely immune to the use of data—or vigorously resisted it. Thanks to new developments in computer science and natural language processing, literary scholars have embraced the quantitative study of literary works and have helped make Digital Humanities a rapidly growing field. But these developments raise a fundamental, and as yet unanswered question: what is the meaning of literary quantity?
In Enumerations, Andrew Piper answers that question across a variety of domains fundamental to the study of literature. He focuses on the elementary particles of literature, from the role of punctuation in poetry and the matter of plot in novels to the study of topoi, the behavior of characters, the nature of fictional language, and the shape of a poet’s career. How does quantity affect our understanding of these categories? What happens when we look at 3,388,230 punctuation marks, 1.4 billion words, or 650,000 fictional characters? Does this change how we think about poetry, the novel, fictionality, character, the commonplace, or the writer’s career? In the course of answering such questions, Piper introduces readers to the analytical building blocks of computational text analysis and brings them to bear on fundamental concerns of literary scholarship. This book will be essential reading for anyone interested in Digital Humanities and the future of literary study.

front cover of Finite-State Morphology
Finite-State Morphology
Kenneth R. Beesley and Lauri Karttunen
CSLI, 2003
The finite-state paradigm of computer science has provided a basis for natural-language applications that are efficient, elegant, and robust. This volume is a practical guide to finite-state theory and the affiliated programming languages lexc and xfst. Readers will learn how to write tokenizers, spelling checkers, and especially morphological analyzer/generators for words in English, French, Finnish, Hungarian, and other languages.

Included are graded introductions, examples, and exercises suitable for individual study as well as formal courses. These take advantage of widely-tested lexc and xfst applications that are just becoming available for noncommercial use via the Internet.

front cover of Flexible Semantics for Reinterpretation Phenomena
Flexible Semantics for Reinterpretation Phenomena
Markus Egg
CSLI, 2005
Deriving the correct meaning of such colloquial expressions as "I am parked out back" requires a unique interaction of knowledge about the world with a person's natural language tools. In this volume, Markus Egg examines how natural language rules and knowledge of the world work together to produce correct understandings of expressions that cannot be fully understood through literal reading. An in-depth and exciting work on semantics and natural language, this volume will be essential reading for scholars in computational linguistics.

front cover of A Grammar Writer's Cookbook
A Grammar Writer's Cookbook
Edited by Miriam Butt, Tracy Holloway King, María-Eugenia Niño, and Frédérique S
CSLI, 1999
A Grammar Writer's Cookbook is an introduction to the issues involved in the writing and design of computational grammars, reporting on experiences and analyses within the ParGram parallel grammar development project. Using the Lexical Functional Grammar (LFG) framework, this project implemented grammars for German, French, and English to cover parallel corpora.

front cover of A Guide to MATLAB® Object-Oriented Programming
A Guide to MATLAB® Object-Oriented Programming
Andy H. Register
The Institution of Engineering and Technology, 2007
A Guide to MATLAB® Object-Oriented Programming is the first book to deliver broad coverage of the documented and undocumented object-oriented features of MATLAB®. Unlike the typical approach of other resources, this guide explains why each feature is important, demonstrates how each feature is used, and promotes an understanding of the interactions between features.

front cover of The Iconic Page in Manuscript, Print, and Digital Culture
The Iconic Page in Manuscript, Print, and Digital Culture
George Bornstein and Theresa Tinkle, Editors
University of Michigan Press, 1998
Most readers think of a written work as producing its meaning through the words it contains. But what is the significance of the detailed and beautiful illuminations on a medieval manuscript? Of the deliberately chosen typefaces in a book of poems by Yeats? Of the design and layout of text in an electronic format? How does the material form of a work shape its understanding in a particular historical moment, in a particular culture?
The material features of texts as physical artifacts--their "bibliographic codes"--have over the last decade excited increasing interest in a variety of disciplines. The Iconic Page in Manuscript, Print, and Digital Culture gathers essays by an extraordinarily distinguished group of scholars to offer the most comprehensive examination of these issues yet, drawing on examples from literature, history, the fine arts, and philosophy.
Fittingly, the volume contains over two dozen illustrations that display the iconic features of the works analyzed--from Alfred the Great's Boethius through medieval manuscripts to the philosophy of C. S. Peirce and the dustjackets on works by F. Scott Fitzgerald and William Styron.
The Iconic Page in Manuscript, Print, and Digital Culture will be groundbreaking reading for scholars in a wide range of fields.
George Bornstein is C. A. Patrides Professor of English, University of Michigan. Theresa Tinkle is Associate Professor of English, University of Michigan.

front cover of Interdisciplining Digital Humanities
Interdisciplining Digital Humanities
Boundary Work in an Emerging Field
Julie Thompson Klein
University of Michigan Press, 2015
Interdisciplining Digital Humanities sorts through definitions and patterns of practice over roughly sixty-five years of work, providing an overview for specialists and a general audience alike. It is the only book that tests the widespread claim that Digital Humanities is interdisciplinary. By examining the boundary work of constructing, expanding, and sustaining a new field, it depicts both the ways this new field is being situated within individual domains and dynamic cross-fertilizations that are fostering new relationships across academic boundaries. It also accounts for digital reinvigorations of “public humanities” in cultural heritage institutions of museums, archives, libraries, and community forums.

front cover of Language at Work
Language at Work
Analyzing Communication Breakdown in the Workplace to Inform Systems Design
Keith Devlin and Duska Rosenberg
CSLI, 1996
People are very creative in their use of language. This observation was made convincingly by Chomsky in the 1950s and is generally accepted in the scientific communities concerned with the study of language. Computers, on the other hand, are neither creative, flexible, nor adaptable. This is in spite of the fact that their ability to process language is based largely on the grammars developed by linguists and computer scientists. Thus, there is a mismatch between the observed human creativity and our ability as theorists to explain it. Language at Work examines grammars and other descriptions of language by combining the scientific and the practical. The scientific motivation is to unite distinct intellectual traditions, mathematics and descriptive social science, each of which has tried, to no avail, to provide an adequate explanation of language and its use on its own. This volume argues that Situation Theory, a theory of information couched in mathematics, has provided a uniform framework for the investigation of the creative aspects of language use. The application of Situation Theory in the study of language use in everyday communication to improve human/computer interaction is explored and espoused.

front cover of Least Cost Analysis of Social Landscapes
Least Cost Analysis of Social Landscapes
Archaeological Case Studies
Devin A. White
University of Utah Press, 2012

A growing number of archaeologists are applying Geographic Information Science (GIS) technologies to their research problems and questions. Advances in GIS and its use across disciplines allows for collaboration and enables archaeologists to ask ever more sophisticated questions and develop increasingly elaborate models on numerous aspects of past human behavior. Least cost analysis (LCA) is one such avenue of inquiry. While least cost studies are not new to the social sciences in general, LCA is relatively new to archaeology; until now, there has been no systematic exploration of its use within the field.

This edited volume presents a series of case studies illustrating the intersection of archaeology and LCA modeling at the practical, methodological, and theoretical levels. Designed to be a guidebook for archaeologists interested in using LCA in their own research, it presents a wide cross-section of practical examples for both novices and experts. The contributors to the volume showcase the richness and diversity of LCA’s application to archaeological questions, demonstrate that even simple applications can be used to explore sophisticated research questions, and highlight the challenges that come with injecting geospatial technologies into the archaeological research process.


front cover of Machine Scoring of Student Essays
Machine Scoring of Student Essays
Truth and Consequences
edited by Patricia Freitag Ericsson & Rich Haswell
Utah State University Press, 2006

The current trend toward machine-scoring of student work, Ericsson and Haswell argue, has created an emerging issue with implications for higher education across the disciplines, but with particular importance for those in English departments and in administration. The academic community has been silent on the issue—some would say excluded from it—while the commercial entities who develop essay-scoring software have been very active.

Machine Scoring of Student Essays is the first volume to seriously consider the educational mechanisms and consequences of this trend, and it offers important discussions from some of the leading scholars in writing assessment.

Reading and evaluating student writing is a time-consuming process, yet it is a vital part of both student placement and coursework at post-secondary institutions. In recent years, commercial computer-evaluation programs have been developed to score student essays in both of these contexts. Two-year colleges have been especially drawn to these programs, but four-year institutions are moving to them as well, because of the cost-savings they promise. Unfortunately, to a large extent, the programs have been written, and institutions are installing them, without attention to their instructional validity or adequacy.

Since the education software companies are moving so rapidly into what they perceive as a promising new market, a wider discussion of machine-scoring is vital if scholars hope to influence development and/or implementation of the programs being created. What is needed, then, is a critical resource to help teachers and administrators evaluate programs they might be considering, and to more fully envision the instructional consequences of adopting them. And this is the resource that Ericsson and Haswell are providing here.


front cover of Macroanalysis
Macroanalysis
Digital Methods and Literary History
Matthew L. Jockers
University of Illinois Press, 2013
In this volume, Matthew L. Jockers introduces readers to large-scale literary computing and the revolutionary potential of macroanalysis--a new approach to the study of the literary record designed for probing the digital-textual world as it exists today, in digital form and in large quantities. Using computational analysis to retrieve key words, phrases, and linguistic patterns across thousands of texts in digital libraries, researchers can draw conclusions based on quantifiable evidence regarding how literary trends are employed over time, across periods, within regions, or within demographic groups, as well as how cultural, historical, and societal linkages may bind individual authors, texts, and genres into an aggregate literary culture.
Moving beyond the limitations of literary interpretation based on the "close-reading" of individual works, Jockers describes how this new method of studying large collections of digital material can help us to better understand and contextualize the individual works within those collections.


front cover of MATLAB® for Electrical and Computer Engineering Students and Professionals
MATLAB® for Electrical and Computer Engineering Students and Professionals
With Simulink®
Roland Priemer
The Institution of Engineering and Technology, 2013
This book combines the teaching of the MATLAB® programming language with the presentation and development of carefully selected electrical and computer engineering (ECE) fundamentals. This is what distinguishes it from other books concerned with MATLAB®: it is directed specifically to ECE concerns. Students will see, quite explicitly, how and why MATLAB® is well suited to solve practical ECE problems.

front cover of Multiliteracies for a Digital Age
Multiliteracies for a Digital Age
Stuart A. Selber
Southern Illinois University Press, 2004

Just as the majority of books about computer literacy deal more with technological issues than with literacy issues, most computer literacy programs overemphasize technical skills and fail to adequately prepare students for the writing and communications tasks in a technology-driven era. Multiliteracies for a Digital Age serves as a guide for composition teachers to develop effective, full-scale computer literacy programs that are also professionally responsible by emphasizing different kinds of literacies and proposing methods for helping students move among them in strategic ways.

Defining computer literacy as a domain of writing and communication, Stuart A. Selber addresses the questions that few other computer literacy texts consider: What should a computer literate student be able to do? What is required of literacy teachers to educate such a student? How can functional computer literacy fit within the values of teaching writing and communication as a profession? Reimagining functional literacy in ways that speak to teachers of writing and communication, he builds a framework for computer literacy instruction that blends functional, critical, and rhetorical concerns in the interest of social action and change.

Multiliteracies for a Digital Age reviews the extensive literature on computer literacy and critiques it from a humanistic perspective. This approach, which will remain useful as new versions of computer hardware and software inevitably replace old versions, helps to usher students into an understanding of the biases, belief systems, and politics inherent in technological contexts. Selber redefines rhetoric at the nexus of technology and literacy and argues that students should be prepared as authors of twenty-first-century texts that defy the established purview of English departments. The result is a rich portrait of the ideal multiliterate student in a digital age and a social approach to computer literacy envisioned with the requirements for systemic change in mind.


front cover of Natures of Data
Natures of Data
A Discussion between Biologists, Artists and Science Scholars
Philipp Fischer, Gabriele Gramelsberger, Christoph Hoffmann, Hans Hofmann, Hans-Jörg Rheinberger, and Hannes Rickli
Diaphanes, 2020
Computer-based technologies for the production and analysis of data have been an integral part of biological research since the 1990s at the latest. This not only applies to genomics and its offshoots but also to less conspicuous subsections such as ecology. But little consideration has been given to how this new technology has changed research practically. How and when do data become questionable? To what extent does necessary infrastructure influence the research process? What status is given to software and algorithms in the production and analysis of data? These questions are discussed by the biologists Philipp Fischer and Hans Hofmann, the philosopher Gabriele Gramelsberger, the historian of science and biology Hans-Jörg Rheinberger, the science theorist Christoph Hoffmann, and the artist Hannes Rickli. The conditions of experimentation in the digital sphere are examined in four chapters—“Data,” “Software,” “Infrastructure,” and “in silico”—in which the different perspectives of the discussion partners complement one another. Rather than confirming any particular point of view, Natures of Data deepens understanding of the contemporary basis of biological research.

front cover of Network Medicine
Network Medicine
Complex Systems in Human Disease and Therapeutics
Joseph Loscalzo
Harvard University Press, 2017

Big data, genomics, and quantitative approaches to network-based analysis are combining to advance the frontiers of medicine as never before. Network Medicine introduces this rapidly evolving field of medical research, which promises to revolutionize the diagnosis and treatment of human diseases. With contributions from leading experts that highlight the necessity of a team-based approach in network medicine, this definitive volume provides readers with a state-of-the-art synthesis of the progress being made and the challenges that remain.

Medical researchers have long sought to identify single molecular defects that cause diseases, with the goal of developing silver-bullet therapies to treat them. But this paradigm overlooks the inherent complexity of human diseases and has often led to treatments that are inadequate or fraught with adverse side effects. Rather than trying to force disease pathogenesis into a reductionist model, network medicine embraces the complexity of multiple influences on disease and relies on many different types of networks: from the cellular-molecular level of protein-protein interactions to correlational studies of gene expression in biological samples. The authors offer a systematic approach to understanding complex diseases while explaining network medicine’s unique features, including the application of modern genomics technologies, biostatistics and bioinformatics, and dynamic systems analysis of complex molecular networks in an integrative context.

By developing techniques and technologies that comprehensively assess genetic variation, cellular metabolism, and protein function, network medicine is opening up new vistas for uncovering causes and identifying cures of disease.


front cover of New Technologies and Renaissance Studies
New Technologies and Renaissance Studies
Edited by William R. Bowen and Raymond G. Siemens
Iter Press, 2008
Near the forefront of any examination of disciplinary pursuits in the academy today, among the many important issues being addressed is the role of computing and its integration into, and perhaps revolutionizing of, central methodological approaches. The series New Technologies in Medieval and Renaissance Studies addresses this context from both broad and narrow perspectives, with anticipated discussions rooted in areas including literature, art history, musicology, and culture in the medieval and Renaissance periods.

The first volume of the series, New Technologies and Renaissance Studies, presents a collection of contributions to one ongoing forum for the dialogue which lies at the heart of the book series, the annual "conference within a conference" of the same name which takes place during the Renaissance Society of America gathering, dedicated specifically to the intersection of computational methods and Renaissance studies. Papers in this volume exemplify those fruitful and productive exchanges, from their inception at the 2001 meeting in Chicago to the 2005 meeting in Cambridge.

front cover of New Technologies and Renaissance Studies II
New Technologies and Renaissance Studies II
Edited by Tassie Gniady, Kris McAbee, Jessica Murphy
Iter Press, 2014
Near the forefront of any examination of disciplinary pursuits in the academy today, among the many important issues being addressed is the role of computing and its integration into, and perhaps revolutionizing of, central methodological approaches. The series New Technologies in Medieval and Renaissance Studies addresses this context from both broad and narrow perspectives, with anticipated discussions rooted in areas including literature, art history, musicology, and culture in the medieval and Renaissance periods.

In the fourth volume of the New Technologies in Medieval and Renaissance Studies series, volume editors Tassie Gniady, Kris McAbee, and Jessica Murphy bring together some of the best work from the New Technologies in Medieval and Renaissance Studies panels at the Renaissance Society of America (RSA) annual meetings for the years 2004–2010. These essays demonstrate a dedication to grounding the use of “newest” practices in the theories of the early modern period. At the same time, the essays are interested in the moment—the needs of scholars then, the theories of media that informed current understanding, and the tools used to conduct studies.

front cover of New Technologies and Renaissance Studies III
New Technologies and Renaissance Studies III
Edited by Matthew Evan Davis and Colin Wilder
Iter Press, 2022
These essays explore problems with digital approaches to analog objects and offer digital methods to study networks of production, dissemination, and collection. Further, they reflect on the limitations of those methods and speak to a central truth of digital projects: unlike traditional scholarship, digital scholarship is often the result of collective networks not only of disciplinary scholars but also of library professionals, other technical and professional staff, and students.

logo for The Institution of Engineering and Technology
Numerical Methods for Engineering
An introduction using MATLAB® and computational electromagnetics examples
Karl F. Warnick
The Institution of Engineering and Technology, 2011
This textbook teaches students to create computer codes used to engineer antennas, microwave circuits, and other critical technologies for wireless communications and other applications of electromagnetic fields and waves. Worked code examples are provided for MATLAB technical computing software. It is the only textbook on numerical methods that begins at the undergraduate engineering student level but brings students to the state-of-the-art by the end of the book. It focuses on the most important and popular numerical methods, going into depth with examples and problem sets of escalating complexity. This book requires only one core course of electromagnetics, allowing it to be useful both at the senior and beginning graduate levels. Developing and using numerical methods is a powerful tool for students to learn the principles of intermediate and advanced electromagnetics. This book fills a gap left by current textbooks, which either lack depth on key topics (particularly integral equations and the method of moments) or treat them in a way that is not accessible to students without an advanced theory course. Important topics include: Method of Moments; Finite Difference Time Domain Method; Finite Element Method; Finite Element Method-Boundary Element Method; Numerical Optimization; and Inverse Scattering.

front cover of Open-Domain Question Answering from Large Text Collections
Open-Domain Question Answering from Large Text Collections
Marius Pasca
CSLI, 2003
Many books have indexes, but most textual media have none. Newspapers, legal transcripts, conference proceedings, correspondence, video subtitles, and web pages are increasingly accessible with computers, yet are still without indexes or other sophisticated means of finding the excerpts most relevant to a reader's question.

Better than an index, and much better than a keyword search, are the high-precision computerized question-answering systems explored in this book. Marius Pasca presents novel and robust methods for capturing the semantics of natural language questions and for finding the most relevant portions of texts. This research has led to a fully implemented and rigorously evaluated architecture that has produced experimental results showing great promise for the future of internet search technology.

front cover of Othermindedness
Othermindedness
The Emergence of Network Culture
Michael Joyce
University of Michigan Press, 2001

Michael Joyce's new collection continues to examine the connections between the poles of art and instruction, writing and teaching in the form of what Joyce has called theoretical narratives, pieces that are both narratives of theory and texts in which theory often takes the form of narrative. His concerns include hypertext and interactive fiction, the geography of cyberspace, and interactive film, and Joyce here searches out the emergence of network culture in spaces ranging from the shifting nature of the library to MOOs and other virtual spaces to life along a river.

While in this collection Joyce continues to be one of our most lyrical, wide-ranging, and informed cultural critics and theorists of new media, his essays exhibit an evolving distrust of unconsidered claims for newness in the midst of what Joyce calls "the blizzard of the next," as well as a recurrent insistence upon grounding our experience of the emergence of network culture in the body.

Michael Joyce is Associate Professor of English, Vassar College. He is author of a number of hypertext fictions on the web and on disk, most notably Afternoon: A Story.

His previous books are Of Two Minds: Hypertext Pedagogy and Poetics and Moral Tale and Meditations: Technological Parables and Refractions.


front cover of Pascal Programming for Music Research
Pascal Programming for Music Research
Alexander R. Brinkman
University of Chicago Press, 1990
Pascal Programming for Music Research addresses those who wish to develop the programming skills necessary for doing computer-assisted music research, particularly in the fields of music theory and musicology. Many of the programming techniques are also applicable to computer-assisted instruction (CAI), composition, and music synthesis. The programs and techniques can be implemented on personal computers or larger computer systems using standard Pascal compilers and will be valuable to anyone in the humanities creating databases.

Among its useful features are:
-complete programs, from simple illustrations to substantial applications;
-beginning programming through such advanced topics as linked data structures, recursive algorithms, DARMS translation, score processing;
-bibliographic references at the end of each chapter to pertinent sources in music theory, computer science, and computer applications in music;
-exercises which explore and extend topics discussed in the text;
-appendices which include a DARMS translator and a library of procedures for building and manipulating a linked representation of scores;
-algorithms and techniques, most of which translate easily from Pascal to other computer languages.

Beginning, as well as advanced, programmers and anyone interested in programming music applications will find this book to be an invaluable resource.

front cover of Passions Pedagogies and 21st Century Technologies
Passions Pedagogies and 21st Century Technologies
edited by Gail E. Hawisher & Cynthia L. Selfe
Utah State University Press, 1999

Gail Hawisher and Cynthia Selfe created a volume that set the agenda in the field of computers and composition scholarship for a decade. The technology changes that scholars of composition studies faced as the new century opened couldn't have been more deserving of passionate study. While we have always used technologies (e.g., the pencil) to communicate with each other, the electronic technologies we now use have changed the world in ways that we have yet to identify or appreciate fully. Likewise, the study of language and literate exchange, even our understanding of terms like literacy, text, and visual, has changed beyond recognition, challenging even our capacity to articulate them.

As Hawisher, Selfe, and their contributors engage these challenges and explore their importance, they "find themselves engaged in the messy, contradictory, and fascinating work of understanding how to live in a new world and a new century." The result is a broad, deep, and rewarding anthology of work still among the standard works of computers and composition study.


front cover of Post-Digital Rhetoric and the New Aesthetic
Post-Digital Rhetoric and the New Aesthetic
Justin Hodgson
The Ohio State University Press, 2019
The proliferation of smart devices, digital media, and network technologies has led to everyday people experiencing everyday things increasingly on and through the screen. In fact, much of the world has become so saturated by digital mediations that many individuals have adopted digitally inflected sensibilities. This gestures not simply toward posthumanism, but more fundamentally toward an altogether post-digital condition—one in which the boundaries between the “real” and the “digital” have become blurred and technology has fundamentally reconfigured how we make sense of the world.
Post-Digital Rhetoric and the New Aesthetic takes stock of these reconfigurations and their implications for rhetorical studies by taking up the New Aesthetic—a movement introduced by artist/digital futurist James Bridle that was meant to capture something of a digital way of seeing by identifying aesthetic values that could not exist without computational and digital technologies. Bringing together work in rhetoric, art, and digital media studies, Hodgson treats the New Aesthetic as a rhetorical ecology rather than simply an aesthetic movement, allowing him to provide operative guides for the knowing, doing, and making of rhetoric in a post-digital culture.

front cover of Postverbal Behavior
Postverbal Behavior
Thomas Wasow
CSLI, 2002
Compared to many languages, English has relatively fixed word order, but the ordering among phrases following the verb exhibits a good deal of variation. This monograph explores factors that influence the choice among possible orders of postverbal elements, testing hypotheses using a combination of corpus studies and psycholinguistic experiments. Wasow's final chapters explore how studies of language use bear on issues in linguistic theory, with attention to the roles of quantitative data and Chomsky's arguments against the use of statistics and probability in linguistics.

front cover of The Problem with Education Technology (Hint: It's Not the Technology)
The Problem with Education Technology (Hint: It's Not the Technology)
Ben Fink and Robin Brown
Utah State University Press, 2016

Education is in crisis—at least, so we hear. And at the center of this crisis is technology. New technologies like computer-based classroom instruction, online K–12 schools, MOOCs (massive open online courses), and automated essay scoring may be our last great hope—or the greatest threat we have ever faced.

In The Problem with Education Technology, Ben Fink and Robin Brown look behind the hype to explain the problems—and potential—of these technologies. Focusing on the case of automated essay scoring, they explain the technology, how it works, and what it does and doesn’t do. They explain its origins, its evolution (both in the classroom and in our culture), and the controversy that surrounds it. Most significantly, they expose the real problem—the complicity of teachers and curriculum-builders in creating an education system so mechanical that machines can in fact often replace humans—and how teachers, students, and other citizens can work together to solve it.

Offering a new perspective on the change that educators can hope, organize, and lobby for, The Problem with Education Technology challenges teachers and activists on “our side,” even as it provides new evidence to counter the profit-making, labor-saving logics that drive the current push for technology in the classroom.


front cover of Putting Linguistics into Speech Recognition
Putting Linguistics into Speech Recognition
The Regulus Grammar Compiler
Manny Rayner, Beth Ann Hockey, and Pierrette Bouillon
CSLI, 2006
Most computer programs that analyze spoken dialogue use a spoken command grammar, which limits what the user can say when talking to the system. To make this process simpler, more automated, and effective for command grammars even at initial stages of a project, the Regulus grammar compiler was developed by a consortium of experts—including NASA scientists. This book presents a complete description of both the practical and theoretical aspects of Regulus and will be extremely helpful for students and scholars working in computational linguistics as well as software engineering.

front cover of The Quality of the Archaeological Record
The Quality of the Archaeological Record
Charles Perreault
University of Chicago Press, 2019
Paleobiology struggled for decades to influence our understanding of evolution and the history of life because it was stymied by a focus on microevolution and an incredibly patchy fossil record. But in the 1970s, the field took a radical turn, as paleobiologists began to investigate processes that could only be recognized in the fossil record across larger scales of time and space. That turn led to a new wave of macroevolutionary investigations, novel insights into the evolution of species, and a growing prominence for the field among the biological sciences.

In The Quality of the Archaeological Record, Charles Perreault shows that archaeology not only faces a parallel problem, but may also find a model in the rise of paleobiology for a shift in the science and theory of the field. To get there, he proposes a more macroscale approach to making sense of the archaeological record, an approach that reveals patterns and processes not visible within the span of a human lifetime, but rather across an observation window thousands of years long and thousands of kilometers wide. Just as with the fossil record, the archaeological record has the scope necessary to detect macroscale cultural phenomena because it can provide samples that are large enough to cancel out the noise generated by micro-scale events. By recalibrating their research to the quality of the archaeological record and developing a true macroarchaeology program, Perreault argues, archaeologists can finally unleash the full contributive value of their discipline.

front cover of Reading Machines
Reading Machines
Toward an Algorithmic Criticism
Stephen Ramsay
University of Illinois Press, 2011
Besides familiar and now-commonplace tasks that computers do all the time, what else are they capable of? Stephen Ramsay's intriguing study of computational text analysis examines how computers can be used as "reading machines" to open up entirely new possibilities for literary critics. Computer-based text analysis has been employed for the past several decades as a way of searching, collating, and indexing texts. Despite this, the digital revolution has not penetrated the core activity of literary studies: interpretive analysis of written texts.
Computers can handle vast amounts of data, allowing for the comparison of texts in ways that were previously too overwhelming for individuals, but they may also assist in enhancing the entirely necessary role of subjectivity in critical interpretation. Reading Machines discusses the importance of this new form of text analysis conducted with the assistance of computers. Ramsay suggests that the rigidity of computation can be enlisted in the project of intuition, subjectivity, and play.

front cover of Representation and Inference for Natural Language
Representation and Inference for Natural Language
A First Course in Computational Semantics
Patrick Blackburn and Johan Bos
CSLI, 2005
How can computers distinguish the coherent from the unintelligible, recognize new information in a sentence, or draw inferences from a natural language passage? Computational semantics is an exciting new field that seeks answers to these questions, and this volume is the first textbook wholly devoted to this growing subdiscipline. The book explains the underlying theoretical issues and fundamental techniques for computing semantic representations for fragments of natural language. This volume will be an essential text for computer scientists, linguists, and anyone interested in the development of computational semantics.

front cover of Rhetorical Code Studies
Rhetorical Code Studies
Discovering Arguments in and around Code
Kevin Brock
University of Michigan Press, 2019
Winner of the 2017 Sweetland Digital Rhetoric Collaborative Book Prize
Software developers work rhetorically to make meaning through the code they write. In some ways, writing code is like any other form of communication; in others, it proves to be new, exciting, and unique. In Rhetorical Code Studies, Kevin Brock explores how software code serves as meaningful communication through which software developers construct arguments that are made up of logical procedures and express both implicit and explicit claims as to how a given program operates.

Building on current scholarly work in digital rhetoric, software studies, and technical communication, Brock connects and continues ongoing conversations among rhetoricians, technical communicators, software studies scholars, and programming practitioners to demonstrate how software code and its surrounding discourse are highly rhetorical forms of communication. He considers examples ranging from large, well-known projects like Mozilla Firefox to small-scale programs like the “FizzBuzz” test common in many programming job interviews. Undertaking specific examinations of code texts as well as the contexts surrounding their composition, Brock illuminates the variety and depth of rhetorical activity taking place in and around code, from individual differences in style to changes in large-scale organizational and community norms.

Rhetorical Code Studies holds significant implications for digital communication, multimodal composition, and the cultural analysis of software and its creation. It will interest academics and students of writing, rhetoric, and software engineering as well as technical communicators and developers of all types of software.

front cover of Rhetorical Delivery as Technological Discourse
Rhetorical Delivery as Technological Discourse
A Cross-Historical Study
Ben McCorkle
Southern Illinois University Press, 2012

According to Ben McCorkle, the rhetorical canon of delivery—traditionally seen as the aspect of oratory pertaining to vocal tone, inflection, and physical gesture—has undergone a period of renewal within the last few decades to include the array of typefaces, color palettes, graphics, and other design elements used to convey a message to a chosen audience. McCorkle posits that this redefinition, while a noteworthy moment of modern rhetorical theory, is just the latest instance in a historical pattern of interaction between rhetoric and technology. In Rhetorical Delivery as Technological Discourse: A Cross-Historical Study, McCorkle explores the symbiotic relationship between delivery and technologies of writing and communication. Aiming to enhance historical understanding by demonstrating how changes in writing technology have altered our conception of delivery, McCorkle reveals the ways in which oratory and the tools of written expression have directly affected one another throughout the ages.

To make his argument, the author examines case studies from significant historical moments in the Western rhetorical tradition. Beginning with the ancient Greeks, McCorkle illustrates how the increasingly literate Greeks developed rhetorical theories intended for oratory that incorporated “writerly” tendencies, diminishing delivery’s once-prime status in the process. Also explored is the near-eradication of rhetorical delivery in the mid-fifteenth century—the period of transition from late manuscript to early print culture—and the implications of the burgeoning print culture during the nineteenth century.

McCorkle then investigates the declining interest in delivery as technology designed to replace the human voice and gesture became prominent at the beginning of the 1900s. Situating scholarship on delivery within a broader  postmodern structure, he moves on to a discussion of the characteristics of contemporary hypertextual and digital communication and its role in reviving the canon, while also anticipating the future of communication technologies, the likely shifts in attitude toward delivery, and the implications of both on the future of teaching rhetoric.

Rhetorical Delivery as Technological Discourse traces a long-view perspective of rhetorical history to present readers a productive reading of the volatile treatment of delivery alongside the parallel history of writing and communication technologies. This rereading will expand knowledge of the canon by not only offering the most thorough treatment of the history of rhetorical delivery available but also inviting conversation about the reciprocal impacts of rhetorical theory and written communication on each other throughout this history.


front cover of Rhetorical Machines
Rhetorical Machines
Writing, Code, and Computational Ethics
Edited by John Jones and Lavinia Hirsu
University of Alabama Press, 2019
A landmark volume that explores the interconnected nature of technologies and rhetorical practice
Rhetorical Machines addresses new approaches to studying computational processes within the growing field of digital rhetoric. While computational code is often seen as value-neutral and mechanical, this volume explores the underlying, and often unexamined, modes of persuasion this code engages. In so doing, it argues that computation is in fact rife with the values of those who create it and thus has powerful ethical and moral implications. From Socrates’s critique of writing in Plato’s Phaedrus to emerging new media and internet culture, the scholars assembled here provide insight into how computation and rhetoric work together to produce social and cultural effects.
This multidisciplinary volume features contributions from scholar-practitioners across the fields of rhetoric, computer science, and writing studies. It is divided into four main sections: “Emergent Machines” examines how technologies and algorithms are framed and entangled in rhetorical processes; “Operational Codes” explores how computational processes are used to achieve rhetorical ends; “Ethical Decisions and Moral Protocols” considers the ethical implications involved in designing software and that software’s impact on computational culture; and the final section includes two scholars’ responses to the preceding chapters. Three of the sections are prefaced by brief conversations with chatbots (autonomous computational agents) addressing some of the primary questions raised in each section.
At the heart of these essays is a call for emerging and established scholars in a vast array of fields to reach interdisciplinary understandings of human-machine interactions. This innovative work will be valuable to scholars and students in a variety of disciplines, including but not limited to rhetoric, computer science, writing studies, and the digital humanities.

front cover of Rhetorics of the Digital Nonhumanities
Rhetorics of the Digital Nonhumanities
Alex Reid
Southern Illinois University Press, 2021
Redefining writing and communication in the digital cosmology
In Rhetorics of the Digital Nonhumanities, author Alex Reid fashions a potent vocabulary from new materialist theory, media theory, postmodern theory, and digital rhetoric to rethink the connections between humans and digital media. Addressed are the familiar concerns that scholars have with digital culture: how technologies affect attention spans, how digital media are used to compose, and how digital rhetoric is taught. 
Rhetoric is now regularly defined as including human and nonhuman actors. Each actor influences the thoughts, arguments, and sentiments that journey through systems of processors, algorithms, humans, air, and metal. The author’s arguments, even though they are unnerving, orient rhetorical practices to a more open, deliberate, and attentive awareness of what we are truly capable of and how we become capable. This volume moves beyond viewing digital media as an expression of human agency. Humans, formed into new collectives of user populations, must negotiate rather than command their way through digital media ecologies. 
Chapters centralize the most pressing questions: How do social media algorithms affect our judgment? How do smartphones shape our attention? These questions demand scholarly practices for attending to the world around us, exploring attention and deliberation in order to embrace digital nonhuman composition. Once we see this brave new world, Reid argues, we are compelled to experiment.

front cover of Scanner Data and Price Indexes
Scanner Data and Price Indexes
Edited by Robert C. Feenstra and Matthew D. Shapiro
University of Chicago Press, 2003
Every time you buy a can of tuna or a new television, its bar code is scanned to record its price and other information. These "scanner data" offer a number of attractive features for economists and statisticians, because they are collected continuously, are available quickly, and record prices for all items sold, not just a statistical sample. But scanner data also present a number of difficulties for current statistical systems.

Scanner Data and Price Indexes assesses both the promise and the challenges of using scanner data to produce economic statistics. Three papers present the results of work in progress at statistical agencies in the U.S., United Kingdom, and Canada, including a project at the U.S. Bureau of Labor Statistics to investigate the feasibility of incorporating scanner data into the monthly Consumer Price Index. Other papers demonstrate the enormous potential of using scanner data to test economic theories and estimate the parameters of economic models, and provide solutions for some of the problems that arise when using scanner data, such as dealing with missing data.

front cover of Science in the Age of Computer Simulation
Science in the Age of Computer Simulation
Eric Winsberg
University of Chicago Press, 2010

Computer simulation was first pioneered as a scientific tool in meteorology and nuclear physics in the period following World War II, but it has grown rapidly to become indispensible in a wide variety of scientific disciplines, including astrophysics, high-energy physics, climate science, engineering, ecology, and economics. Digital computer simulation helps study phenomena of great complexity, but how much do we know about the limits and possibilities of this new scientific practice? How do simulations compare to traditional experiments? And are they reliable? Eric Winsberg seeks to answer these questions in Science in the Age of Computer Simulation.

Scrutinizing these issues through a philosophical lens, Winsberg explores the impact of simulation on such issues as the nature of scientific evidence; the role of values in science; the nature and role of fictions in science; and the relationship between simulation and experiment, theories and data, and theories at different levels of description. Science in the Age of Computer Simulation will transform many of the core issues in philosophy of science, as well as our basic understanding of the role of the digital computer in the sciences.


front cover of The Seductions of Quantification
The Seductions of Quantification
Measuring Human Rights, Gender Violence, and Sex Trafficking
Sally Engle Merry
University of Chicago Press, 2016
We live in a world where seemingly everything can be measured. We rely on indicators to translate social phenomena into simple, quantified terms, which in turn can be used to guide individuals, organizations, and governments in establishing policy. Yet counting things requires finding a way to make them comparable. And in the process of translating the confusion of social life into neat categories, we inevitably strip it of context and meaning—and risk hiding or distorting as much as we reveal.

With The Seductions of Quantification, leading legal anthropologist Sally Engle Merry investigates the techniques by which information is gathered and analyzed in the production of global indicators on human rights, gender violence, and sex trafficking. Although such numbers convey an aura of objective truth and scientific validity, Merry argues persuasively that measurement systems constitute a form of power by incorporating theories about social change in their design but rarely explicitly acknowledging them. For instance, the US State Department’s Trafficking in Persons Report, which ranks countries in terms of their compliance with antitrafficking activities, assumes that prosecuting traffickers as criminals is an effective corrective strategy—overlooking cultures where women and children are frequently sold by their own families. As Merry shows, indicators are indeed seductive in their promise of providing concrete knowledge about how the world works, but they are implemented most successfully when paired with context-rich qualitative accounts grounded in local knowledge.

front cover of Seeing Like a Rover
Seeing Like a Rover
How Robots, Teams, and Images Craft Knowledge of Mars
Janet Vertesi
University of Chicago Press, 2015
In the years since the Mars Exploration Rover Spirit and Opportunity first began transmitting images from the surface of Mars, we have become familiar with the harsh, rocky, rusty-red Martian landscape. But those images are much less straightforward than they may seem to a layperson: each one is the result of a complicated set of decisions and processes involving the large team behind the Rovers.

With Seeing Like a Rover, Janet Vertesi takes us behind the scenes to reveal the work that goes into creating our knowledge of Mars. Every photograph that the Rovers take, she shows, must be processed, manipulated, and interpreted—and all that comes after team members negotiate with each other about what they should even be taking photographs of in the first place. Vertesi’s account of the inspiringly successful Rover project reveals science in action, a world where digital processing uncovers scientific truths, where images are used to craft consensus, and where team members develop an uncanny intimacy with the sensory apparatus of a robot that is millions of miles away. Ultimately, Vertesi shows, every image taken by the Mars Rovers is not merely a picture of Mars—it’s a portrait of the whole Rover team, as well.

front cover of Social Knowledge Creation in the Humanities
Social Knowledge Creation in the Humanities
Volume 1
Edited by Alyssa Arbuckle, Aaron Mauro, and Daniel Powell
Iter Press, 2017

The ubiquity of social media has transformed the scope and scale of scholarly communication in the arts and humanities. This new participatory and collaborative environment has allowed for fresh approaches to communicating humanities research. Social Knowledge Creation takes up the norms and customs of online life to reorient, redistribute, and oftentimes flatten traditional academic hierarchies. This book discusses the implications of how humanists communicate with the world and looks to how social media shapes research methods. This volume addresses peer review, open-access publishing, tenure and promotion, mentorship, teaching, collaboration, and interdisciplinarity as a comprehensive introduction to these rapidly changing trends in scholarly communication, digital pedagogy, and educational technology. Collaborative structures are rapidly augmenting the disciplinary focus of the humanities curriculum and the public impact of humanities research teams with new organizational and disciplinary thinking. Social Knowledge Creation represents a particularly dynamic and growing field in which the humanities seek new ways to communicate the legacy and traditions of humanities-based inquiry in a twenty-first-century context.

New Technologies in Medieval and Renaissance Studies Volume 7.


front cover of Spatial Patterns in Landscape Archaeology
Spatial Patterns in Landscape Archaeology
A GIS Procedure to Study Settlement Organization in Early Roman Colonial Territories
Anita Casarotto
Leiden University Press, 2018
This 43rd volume of the ASLU series presents a useful GIS procedure to study settlement patterns in landscape archaeology. In several Mediterranean regions, archaeological sites have been mapped by fieldwalking surveys, producing large amounts of data. These legacy site-based survey data represent an important resource to study ancient settlement organization. Methodological procedures are necessary to cope with the limits of these data, and more importantly with the distortions on data patterns caused by biasing factors.
This book develops and applies a GIS procedure to use legacy survey data in settlement pattern analysis. It consists of two parts. One part regards the assessment of biases that can affect the spatial patterns exhibited by survey data. The other part aims to shed light on the location preferences and settlement strategies of ancient communities underlying site patterns. In this book, a case study shows how the method works in practice. As part of the research by the Landscapes of Early Roman Colonization project (NWO, Leiden University, KNIR), site-based datasets produced by survey projects in central-southern Italy are examined in a comparative framework to investigate settlement patterns in the early Roman colonial period (3rd century B.C.).

front cover of SpecLab
SpecLab
Digital Aesthetics and Projects in Speculative Computing
Johanna Drucker
University of Chicago Press, 2009

Nearly a decade ago, Johanna Drucker cofounded the University of Virginia’s SpecLab, a digital humanities laboratory dedicated to risky projects with serious aims. In SpecLab she explores the implications of these radical efforts to use critical practices and aesthetic principles against the authority of technology based on analytic models of knowledge.

Inspired by the imaginative frontiers of graphic arts and experimental literature and the technical possibilities of computation and information management, the projects Drucker engages range from Subjective Meteorology to Artists’ Books Online to the as yet unrealized ’Patacritical Demon, an interactive tool for exposing the structures that underlie our interpretations of text. Illuminating the kind of future such experiments could enable, SpecLab functions as more than a set of case studies at the intersection of computers and humanistic inquiry. It also exemplifies Drucker’s contention that humanists must play a role in designing models of knowledge for the digital age—models that will determine how our culture will function in years to come.


logo for American Library Association
Systems Librarian
Designing Roles, Defining Skills
American Library Association
American Library Association, 1998

logo for American Library Association
Technology for Small and One-Person Libraries
A LITA Guide
Rene J. Erlandson
American Library Association, 2013

front cover of Theater as Data
Theater as Data
Computational Journeys into Theater Research
Miguel Escobar Varela
University of Michigan Press, 2021
In Theater as Data, Miguel Escobar Varela explores the use of computational methods and digital data in theater research. He considers the implications of these new approaches and explains the roles that statistics and visualizations play. Reflecting on recent debates in the humanities, the author suggests that there are two ways of using data, both of which have a place in theater research: data-driven methods are closer to the pursuit of verifiable results common in the sciences, while data-assisted methods are closer to the interpretive traditions of the humanities. The book surveys four major areas within theater scholarship: texts (not only playscripts but also theater reviews and program booklets); relationships (both the links between fictional characters and the collaborative networks of artists and producers); motion (the movement of performers and objects on stage); and locations (the coordinates of performance events, venues, and touring circuits). Theater as Data examines important contributions that similar computational research has made to theater studies, including studies of classical French drama, collaboration networks in Australian theater, contemporary Portuguese choreography, and global productions of Ibsen. This overview is complemented by short descriptions of the author’s own work in the computational analysis of theater practices in Singapore and Indonesia. The author ends by considering the future of computational theater research, underlining the importance of open data and digital sustainability practices, and encouraging readers to consider the benefits of learning to code. A web companion offers illustrative data, programming tutorials, and videos.

front cover of Thinking Globally, Composing Locally
Thinking Globally, Composing Locally
Rethinking Online Writing in the Age of the Global Internet
Rich Rice
Utah State University Press, 2018

Thinking Globally, Composing Locally explores how writing and its pedagogy should adapt to the ever-expanding environment of international online communication. Communication to a global audience presents a number of new challenges; writers seeking to connect with individuals from many different cultures must rethink their concept of audience. They must also prepare to address friction that may arise from cross-cultural rhetorical situations, variation in available technology and in access between interlocutors, and disparate legal environments.

The volume offers a pedagogical framework that addresses three interconnected and overarching objectives: using online media to contact audiences from other cultures to share ideas; presenting ideas in a manner that invites audiences from other cultures to recognize, understand, and convey or act upon them; and composing ideas to connect with global audiences to engage in ongoing and meaningful exchanges via online media. Chapters explore a diverse range of pedagogical techniques, including digital notebooks designed to create a space for active dialogic and multicultural inquiry, experience mapping to identify communication disruption points in international customer service, and online forums used in global distance education.

Thinking Globally, Composing Locally will prove an invaluable resource for instructors seeking to address the many exigencies of online writing situations in global environments.

Contributors: Suzanne Blum Malley, Katherine Bridgman, Maury Elizabeth Brown, Kaitlin Clinnin, Cynthia Davidson, Susan Delagrange, Scott Lloyd Dewitt, Amber Engelson, Kay Halasek, Lavinia Hirsu, Daniel Hocutt, Vassiliki Kourbani, Tika Lamsal, Liz Lane, Ben Lauren, J. C. Lee, Ben McCorkle, Jen Michaels, Minh-Tam Nguyen, Beau S. Pihlaja, Mª Pilar Milagros, Cynthia L. Selfe, Heather Turner, Don Unger, Josephine Walwema


front cover of Twining
Twining
Critical and Creative Approaches to Hypertext Narratives
Anastasia Salter
Amherst College Press, 2021
Hypertext is now commonplace: links and linking structure nearly all of our experiences online. Yet the literary, as opposed to commercial, potential of hypertext has receded. One of the few tools still focused on hypertext as a means for digital storytelling is Twine, a platform for building choice-driven stories without relying heavily on code. In Twining, Anastasia Salter and Stuart Moulthrop lead readers on a journey at once technical, critical, contextual, and personal. The book’s chapters alternate careful, stepwise discussion of adaptable Twine projects with commentary on exemplary Twine works and accounts of Twine’s technological and cultural background. Beyond telling the story of Twine and how to make Twine stories, Twining reflects on the ongoing process of making.

"While there have certainly been attempts to study Twine historically and theoretically... no single publication has provided such a detailed account of it. And no publication has even attempted to situate Twine amongst its many different conversations and traditions, something this book does masterfully." —James Brown, Rutgers University, Camden

logo for American Library Association
Understanding Data and Information Systems for Recordkeeping
Philip C. Bantin
American Library Association, 2008

logo for American Library Association
Using Digital Analytics for Smart Assessment
Tabatha Farney
American Library Association, 2017

front cover of VALU, AVX and GPU Acceleration Techniques for Parallel FDTD Methods
VALU, AVX and GPU Acceleration Techniques for Parallel FDTD Methods
Wenhua Yu
The Institution of Engineering and Technology, 2014
Development of computer science techniques has significantly enhanced computational electromagnetic methods in recent years. Multi-core CPU computers and multiple-CPU workstations are popular today for scientific research and engineering computing. How to achieve the best performance on existing hardware platforms, however, is a major challenge. In addition to multi-core computers and multiple-CPU workstations, distributed computing has become a primary trend due to the low cost of the hardware and the high performance of network systems. In this book we introduce a general hardware acceleration technique that can significantly speed up FDTD simulations and their applications to engineering problems without requiring any additional hardware devices.

front cover of Voices in the Code
Voices in the Code
A Story about People, Their Values, and the Algorithm They Made
David G. Robinson
Russell Sage Foundation, 2022
Algorithms—rules written into software—shape key moments in our lives: from who gets hired or admitted to a top public school, to who should go to jail or receive scarce public benefits. Such decisions are both technical and moral. Today, the logic of high-stakes software is rarely open to scrutiny, and central moral questions are often left for the technical experts to answer. Policymakers and scholars are seeking better ways to share the moral decision-making within high-stakes software—exploring ideas like public participation, transparency, forecasting, and algorithmic audits. But there are few real examples of those techniques in use.
In Voices in the Code, scholar David G. Robinson tells the story of how one community built a life-and-death algorithm in an inclusive, accountable way. Between 2004 and 2014, a diverse group of patients, surgeons, clinicians, data scientists, public officials, and advocates collaborated and compromised to build a new kidney transplant matching algorithm—a system to offer donated kidneys to particular patients from the U.S. national waiting list. Drawing on interviews with key stakeholders, unpublished archives, and a wide scholarly literature, Robinson shows how this new Kidney Allocation System emerged and evolved over time, as participants gradually built a shared understanding both of what was possible and of what would be fair. Robinson finds much to criticize, but also much to admire, in this story. It ultimately illustrates both the promise and the limits of participation, transparency, forecasting, and auditing of high-stakes software. The book’s final chapter draws out lessons for the broader struggle to build technology in a democratic and accountable way.

logo for American Library Association
Web 2.0 Tools and Strategies for Archives and Local History Collections
American Library Association
American Library Association, 2010

front cover of Wiring The Writing Center
Wiring The Writing Center
Eric Hobson
Utah State University Press, 1998
Published in 1998, Wiring the Writing Center was one of the first few books to address the theory and application of electronics in the college writing center. Many of the contributors explore particular features of their own "wired" centers, discussing theoretical foundations, pragmatic choices, and practical strengths. Others review a range of centers for the approaches they represent. A strong annotated bibliography of signal work in the area is also included.

front cover of A World of Fiction
A World of Fiction
Digital Collections and the Future of Literary History
Katherine Bode
University of Michigan Press, 2018

During the 19th century, throughout the Anglophone world, most fiction was first published in periodicals. In Australia, newspapers were not only the main source of periodical fiction, but the main source of fiction in general. Because of their importance as fiction publishers, and because they provided Australian readers with access to stories from around the world—from Britain, America and Australia, as well as Austria, Canada, France, Germany, New Zealand, Russia, South Africa, and beyond—Australian newspapers represent an important record of the transnational circulation and reception of fiction in this period.

Investigating almost 10,000 works of fiction in the world’s largest collection of mass-digitized historical newspapers (the National Library of Australia’s Trove database), A World of Fiction reconceptualizes how fiction traveled globally, and was received and understood locally, in the 19th century. Katherine Bode’s innovative approach to the new digital collections that are transforming research in the humanities is a model of how digital tools can transform how we understand such collections and interpret the literatures of the past.


front cover of Writing History in the Digital Age
Writing History in the Digital Age
Jack Dougherty and Kristen Nawrotzki, editors
University of Michigan Press, 2013

Writing History in the Digital Age began as a “what-if” experiment by posing a question: How have Internet technologies influenced how historians think, teach, author, and publish? To illustrate their answer, the contributors agreed to share the stages of their book-in-progress as it was constructed on the public web.

To facilitate this innovative volume, editors Jack Dougherty and Kristen Nawrotzki designed a born-digital, open-access, and open peer review process to capture commentary from appointed experts and general readers. A customized WordPress plug-in allowed audiences to add page- and paragraph-level comments to the manuscript, transforming it into a socially networked text. The initial six-week proposal phase generated over 250 comments, and the subsequent eight-week public review of full drafts drew 942 additional comments from readers across different parts of the globe.

The finished product now presents 20 essays from a wide array of notable scholars, each examining (and then breaking apart and reexamining) if and how digital and emergent technologies have changed the historical profession.


front cover of Writing New Media
Writing New Media
Theory and Applications for Expanding the Teaching of Composition
Anne Frances Wysocki, Johndan Johnson-Eilola, Cynthia L. Selfe, & Geoffrey Sirc
Utah State University Press, 2004

As new media mature, the changes they bring to writing in college are many and suggest implications not only for the tools of writing, but also for the contexts, personae, and conventions of writing. An especially visible change has been the increase of visual elements, from typographic flexibility to the easy use and manipulation of color and images. Another has been in the scenes of writing: websites, presentation "slides," email, online conferencing and coursework, even help files all reflect non-traditional venues that new media have brought to writing. By one logic, we must reconsider traditional views even of what counts as writing; a database, for example, could be a new form of written work.

The authors of Writing New Media bring these ideas and the changes they imply for writing instruction to the audience of rhetoric/composition scholars. Their aim is to expand the college writing teacher's understanding of new media and to help teachers prepare students to write effectively with new media beyond the classroom. Each chapter in the volume includes a lengthy discussion of rhetorical and technological background, and then follows with classroom-tested assignments from the authors' own teaching.

