Future Hindsight is a weekly podcast that takes big ideas in civic life and democracy and turns them into action items for everyday citizens.

Sep 7, 2019

Fourth Amendment

The Fourth Amendment protects people from unlawful searches and seizures. For example, in the 1970s the Supreme Court ruled that a warrant is necessary to listen in on telephone conversations, but not to collect the phone numbers dialed. This is the precedent that allows big data to collect a vast amount of information about people on the internet. Further, the Foreign Intelligence Surveillance Court has determined that the legal analysis under the Fourth Amendment is the same whether the right is applied to millions of people or to just one.

Data privacy and literacy

The issue with collecting data at scale is that it becomes granular and social. At that point, the data is no longer innocuous but invasive of privacy. It turns out that our everyday, seemingly trivial interactions matter profoundly in the aggregate, and our habit of almost blindly agreeing to arcane privacy policies on the internet is misguided. We need new forms of transparency that actually tell us how our data is being used and how it affects our online profile, as well as a collective effort to prioritize data and technological literacy. We also need to have a conversation about what kinds of analyses are and are not allowed.

Technological Determinism

Technological determinism is a vision of history in which technology leads the way, pushing a narrative that certain technological changes are inevitable, to the point of altering people's expectations. It's also a reminder that decisions are always being made along the way, whether consciously or not, that yield the current system. We now accept the model of advertising services based on the surveillance of users' everyday interactions, but there were technological developments in the 1990s that would have made cash transactions largely anonymous. The internet could have developed differently.

Find out more:

Matthew L. Jones is the James R. Barker Professor of Contemporary Civilization at Columbia University. He studies the history of science and technology, focusing on early modern Europe and on recent information technologies.

A Guggenheim Fellow for 2012-13 and a Mellon New Directions Fellow for 2012-15, he is writing a book on computing and state surveillance of communications, and is working on Data Mining: The Critique of Artificial Reason, 1963-2005, a historical and ethnographic account of "big data," its relation to statistics and machine learning, and its growth as a fundamental new form of technical expertise in business and scientific research. He was also a Data & Society Fellow for 2017-2018 and has authored numerous other papers.

Follow Matthew L. Jones on Twitter @nescioquid