Wednesday, March 25

College of Letters & Science, University of Wisconsin-Madison


The use of technology and data continues to expand at a rapid pace, prompting serious conversations about the ethics of some of the ways in which they're being used. How, for instance, are algorithmic tools affecting race and class inequalities, the criminal justice system and child welfare systems? Social scientists like Martin Eiermann are working to answer questions like these, using a combination of data analysis tools, archival research and on-the-ground observations to understand the integration of machine-generated knowledge with human expertise.

Martin Eiermann

Eiermann, an assistant professor of sociology at UW–Madison, was born and raised in Germany but has now spent half of his life stateside. After leaving Germany to pursue an undergraduate degree at Harvard and a PhD at the University of California, Berkeley, Eiermann stayed in the United States for a postdoctoral fellowship at Duke University before joining UW–Madison's Department of Sociology.

“This really feels like home, and hopefully an intellectual home, as well,” Eiermann explains.

Eiermann's research has focused on the politics of personal data amid the rapid evolution of technology. His work is particularly timely given the increasing use of algorithmic and computational tools, such as artificial intelligence, in public administration.

“I would often describe myself as a historical social scientist, and I’m interested in trying to understand how we have ended up in this particular world we inhabit,” Eiermann says.

His most recent book, The Limiting Principle: How Privacy Became a Public Issue, explores the politics of personal data in an increasingly information- and data-driven world.

Based on Eiermann's PhD dissertation, the book looks at a period of rapid technological transformation and urbanization in the United States between the 1880s and 1930s, an earlier Information Age. Specifically, it explores the impact this era had on the concept of privacy.

During this period of rapid transformation, privacy shifted from an informal set of cultural norms to something that Eiermann describes as legally codified and salient in politics, meaning that a more robust legal framework was created to regulate the use of personal data.

“Privacy becomes more expansive and institutionalized,” Eiermann says. “It becomes a thing that really affects the experience of millions of people who participate in social life, as citizens and as consumers.”

Through Eiermann’s extensive research into the past and the transformation of privacy over decades, we can also gain insight into contemporary issues.

“You often hear that privacy is dead,” Eiermann says of the modern era. “The more advances you have in the technological capacity to collect and analyze personal data, the less privacy you have, right? The more one goes up, the more the other comes down.”

But Eiermann pushes back against these common misconceptions about privacy. He argues that privacy is shaped not by technological advancement alone but also by political and institutional choices.

For example, a patient’s right to privacy in terms of their medical information is harder to define than it seems, because several sectoral laws — laws that pertain to certain industries or sectors — govern whether that information is private or not. There is no single, overarching law when it comes to medical information; laws are established on the federal level, such as HIPAA, but there are also state-level regulations impacting medical privacy. In cases like these, it’s political and policy decisions that are driving the level of privacy an individual can expect.

Eiermann’s work investigating the intersection of privacy and data also has real contemporary relevance, as more governing agencies and corporations use data to inform decision making.

For example, agencies use computational tools to make predictions about certain outcomes, such as criminal recidivism, child maltreatment and foster care placement. These tools are starting to be implemented in the criminal justice system and the child welfare system, areas on which Eiermann has focused.

The computed predictions are reported as risk scores, often color-coded by severity. The scores are then used to target state interventions or inform placement decisions.
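As a rough illustration (not drawn from any specific agency's system; the features, weights and thresholds below are entirely invented), such a tool might combine weighted case features into a numeric score and then bucket it into color-coded tiers:

```python
# Hypothetical sketch of a risk-scoring tool: weighted case features
# are combined into a 0-100 score, then mapped onto color-coded tiers
# like those described above. All weights and features are invented.

def risk_score(features, weights):
    """Weighted sum of case features, clipped to the 0-100 range."""
    raw = sum(weights[name] * value for name, value in features.items())
    return max(0, min(100, round(raw)))

def risk_tier(score):
    """Map a numeric score onto a color-coded tier."""
    if score >= 70:
        return "red"      # high risk: prioritized for intervention
    if score >= 40:
        return "yellow"   # medium risk: flagged for review
    return "green"        # low risk

# Invented example case and weights, purely for demonstration.
weights = {"prior_referrals": 12, "household_size": 3, "prior_placements": 20}
case = {"prior_referrals": 2, "household_size": 4, "prior_placements": 1}

score = risk_score(case, weights)
print(score, risk_tier(score))  # prints: 56 yellow
```

The key point is that the tier boundaries and feature weights are design choices, which is exactly where the political and institutional decisions Eiermann studies enter the picture.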

“Caseworkers and child welfare workers are overworked and underpaid, so there’s a clear rationale for why you may want to use tools to augment human decision making,” Eiermann explains. “The research that I do is trying to understand how those tools are playing out and what the effects of having that information available are on, for example, racial disparities.”

Humans, in general, have biases that can unjustly impact one community over another. If those biases are baked into automated rating systems, as a growing body of research suggests, the algorithms that predict certain outcomes can inherit that bias, creating a wide swath of unintended consequences.
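A toy sketch can make this dynamic concrete. The numbers below are entirely invented: if one group was flagged more often in the historical records used as training labels, a model that learns flag rates from those records will carry the disparity forward.

```python
# Toy illustration of bias propagation: historical records in which
# group B was flagged at twice the rate of group A become training
# labels, so the learned risk estimates reproduce the disparity.
# All data here is invented for demonstration.
from collections import defaultdict

# Historical records as (group, flagged) pairs.
history = ([("A", 1)] * 10 + [("A", 0)] * 30 +
           [("B", 1)] * 20 + [("B", 0)] * 20)

# Simplest possible "model": predicted risk = past flag rate per group.
totals, flags = defaultdict(int), defaultdict(int)
for group, flagged in history:
    totals[group] += 1
    flags[group] += flagged

predicted_risk = {g: flags[g] / totals[g] for g in totals}
print(predicted_risk)  # prints: {'A': 0.25, 'B': 0.5}
```

A real predictive model is far more complex, but the mechanism is the same: biased labels in, biased predictions out.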

To study the use of these tools in social welfare settings, Eiermann employs what he refers to as a "mixed methods" approach.

For example, Eiermann may use computational methods to identify larger patterns and archival research to look at how these tools have evolved, while also conducting in-person observations to see how people are working and developing these tools.

In teaching his first UW–Madison classes this semester — "Surveillance Culture" and the introductory methods course for sociology majors — Eiermann's goal is for his students to understand what it means to think scientifically about social issues. He also wants to expose students early on to computational tools, like Python and R, which can be very powerful instruments for research.

As data use and technology continue to speed ahead, Eiermann's research is more important than ever for understanding the impact on privacy and how it could transform social and political conversations.

“Informational residue is left behind, as we go about our increasingly digital lives, so there’s a real shift away from data collected through forms and surveys and toward the real-time collection and analysis of behavioral data,” Eiermann says. “The challenge is to keep those laws up to speed with the realities of the online economy and ensure you get more than symbolic compliance, so that there’s a substantive and robust protection.”


