In 2018, users of a fitness tracking app discovered that mapping their casual runs could be used for something far more sinister than pinpointing the most popular trails in a city: the app's data could reveal the location of secret military bases around the world, where service members were logging their daily exercise routines. The same year, the "Golden State Killer" was arrested after authorities matched decades-old DNA from a crime scene to genetic information that distant relatives of his had uploaded to a public genealogy website.
These are just two examples of how seemingly innocuous data can be used in ways for which it was not originally intended. What if your smart refrigerator sends data on your diet to your health insurance company? What if a menstrual tracking app makes data on women's periods or pregnancies available to prosecutors in states that have banned abortion?
"Today is just the first day of a very long life for your data," says Aram Sinnreich, a professor and chair of communication studies in the School of Communication. "Whatever your data is being collected for right now, many other uses and abuses are going to happen downstream."
For several decades, Sinnreich has studied the intersection of technology, law, culture, and media. In 2019, he and longtime friend and collaborator Jesse Gilbert published an academic paper proposing a new theoretical idea: because of the exponential growth of computational technologies, there is no limit to how much knowledge can ultimately be produced from any object, event, or interaction.
Sinnreich and Gilbert knew, though, that few people outside their small research field would read their paper. They began to grow concerned that the general public and mainstream media sources were missing important parts of the story about data privacy in our interconnected, always-online society.
"We wanted people to start thinking more about the second- and third-order consequences of sharing their data," Sinnreich says. "It's not only about whether you're comfortable sharing your biometric data with Apple or sharing pictures of your face with Facebook; it's whether you are OK with where that data ultimately ends up after its initial use."
So the duo began interviewing technology, law, ethics, privacy, and public health experts around the world and writing a book for popular consumption. The Secret Life of Data: Navigating Hype and Uncertainty in the Age of Algorithmic Surveillance was published on April 30, a few days after Sinnreich and Gilbert spoke with an audience at American University about the book.
"Data have no expiration date," they told students, faculty, and other attendees at the event. "This book isn't necessarily [about] the initial uses of data, but [rather] what happens after those applications, and what happens when the data becomes added to a universal network that gets correlated, corresponded, and co-analyzed with other forms of data."
Gilbert, an interdisciplinary artist who uses software and technology to design installations that change in real time, says that most technology is "a double-edged sword."
"Maybe one of your parents uploads their genetic information to a database and is found to have some kind of disease predisposition," Gilbert says. "That information could save their life, but it also could affect your insurance coverage."
In The Secret Life of Data, Sinnreich and Gilbert encourage readers to have a healthy dose of caution when it comes to sharing data, but they admit that it is nearly impossible to be completely cut off from the interconnected, ever-present world of data that we live in.
As they write in the book's introduction: "Whatever we think we're sharing when we upload a selfie, write an email, shop online, stream a video, look up driving directions, track our sleep, 'like' a post, write a book, or spit into a test tube, that's only the tip of the proverbial iceberg. Both the artifacts we produce intentionally and the data traces we leave in our wake as we go about our daily lives can, and likely will, be recorded, archived, analyzed, combined, and cross-referenced with other data and used to generate new forms of knowledge without our awareness or consent."
Rather than give up all technology, the authors want readers to be cognizant of what they're sharing and to what end. That means reading the fine print on apps and websites and being aware of the ways, both positive and negative, that data can be used. It also means gaining a better understanding of how personal decisions about data can impact people around you.
"You might be OK with the idea of a voice-activated sensor in the privacy of your own home, but are all your visitors OK with it? We need to become more comfortable with developing an ethos of transparency around things like this," Gilbert says.
A common refrain is that if you have nothing nefarious to hide, there is no reason to worry about digital surveillance. But Sinnreich says that view is shortsighted because of how hard it is to predict the future of data.
"The idea that you have nothing to hide is contextual to the current moment and to a technological and legal framework that could change in the future," he says.
A woman using a fertility tracking app in Texas in 2022 had nothing to hide; by 2023, the data from that app could be used to prove an illegal abortion. When the US military collected biometric data, including fingerprints, facial IDs, and retinal scans, from 25 million Afghans, it did so to help support aid programs. But when that database was seized by the Taliban, it became a dangerous tool for exposing citizens who had aided the American war effort.
During the process of researching and writing their book, Sinnreich and Gilbert say they were both surprised at how rarely experts in different fields communicate about these issues. Technology developers often don't think about the long-term regulations surrounding the products they are making, for instance, and regulators struggle to incorporate thinking about culture and society into their legal frameworks.
"It just seems like nobody is really thinking about the higher-level impact of their contribution to the system," Sinnreich says. "And if nobody, even the biggest decision makers, is thinking about what could go wrong, then it's almost inevitable that everything will go wrong."
That warning is part of what they want readers to take away, especially those who work in the tech industry or the policy arena.
Long term, preventing data misuse and invasions of privacy can't be solely in the hands of consumers exercising caution, they say. Policymakers and companies must take responsibility for shaping the future of technology.
"We don't currently have adequate regulation in this country, which has to do with the influence of corporations in the political process, as well as this unfettered growth mentality; we want tech companies to keep being on the leading edge of growth," says Gilbert. "I think there is also a lack of ethical training within fields like computer science."
In the years they spent researching the book, and in the months since they finalized the manuscript, technology has continued to evolve at a dizzying pace. But Sinnreich and Gilbert say they don't expect their ideas to become outdated any time soon; even as new technologies and devices hit the market, the issues surrounding data privacy and sharing remain largely the same.
At American University, Sinnreich tries to promote the same kind of critical thinking about technology that he endorses in The Secret Life of Data. He has hosted cybersecurity speakers and mentors graduate students who work on issues surrounding digital privacy and data security. He also tries to model technological caution in his everyday life: he doesn't have an Alexa device in his house, he canceled his Dropbox cloud storage account after learning that his documents could be used to train AI systems, and he doesn't use networked printers at the university because he couldn't verify how data sent to the printers was protected.
"Since the day I came to American University, I've been involved in trying to further the public's understanding of data," he says. "This book is one more way to do that."
How much data is collected from internet users each year?
Says Sinnreich: "It's literally incalculable, but a reasonable figure is probably hundreds of zettabytes. A hundred zettabytes is 100 sextillion bytes, or 10²³ (100,000,000,000,000,000,000,000) bytes. But as our book argues, even if this number was accurate, it wouldn't tell the whole story because it's the connections between data that give tech its social power. The number of connections between 10²³ bytes is so high, we probably don't have a word for it."
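For a rough sense of that scale, here is a quick back-of-envelope calculation (an illustrative sketch, not a figure from the book): 100 zettabytes works out to 10²³ bytes, and even counting only the pairwise connections among that many bytes yields a number on the order of 5 × 10⁴⁵.

```python
# Back-of-envelope arithmetic for the scale Sinnreich describes (illustrative only).
ZETTABYTE = 10**21                # 1 zettabyte = 10^21 bytes
total_bytes = 100 * ZETTABYTE     # 100 zettabytes, a round stand-in for "hundreds"

# Counting only the pairwise connections (n choose 2) among that many bytes:
pairwise_connections = total_bytes * (total_bytes - 1) // 2

print(f"100 zettabytes  = {total_bytes:.0e} bytes")       # 1e+23
print(f"pairwise links ~= {pairwise_connections:.1e}")    # ~5.0e+45
```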