We are being watched. Our search histories are logged; every tap and swipe is monitored. From our religious views to our porn preferences, every facet of our identity is digitally observed and archived. We can rarely escape modern data tracking from governments and corporations alike — as recently published internal Facebook documents made abundantly clear.
You might not be visibly affected if Google infers your sexuality from your searches or if the U.S. government has your DNA, but not everyone has that luxury. Through my work with Ethical Tech, which explores the intersection of ethics and technology, I’ve found that people are surprisingly ambivalent about how their information can be used against them.
So here’s your New Year’s Resolution: Start thinking about how privacy violations harm the most vulnerable among us. And then call your elected representatives and demand they think about it as well. And while you’re at it, scroll through the privacy settings on your smartphone and make sure you’re comfortable with the permissions you’re granting your apps.
There is a color to surveillance. After 9/11, the U.S. government worked with the New York Police Department to aggressively surveil ethnic communities, monitoring daily life in bookstores and coffee shops. They even used so-called “mosque crawlers” to track sermons. During the Cold War, the federal government spied upon civil rights activists like Cesar Chavez and Martin Luther King Jr. (as well as civil rights movements broadly), using the information to undermine their rhetoric. Before Japanese Americans were interned during World War II, they were surveilled through so-called custodial detention lists that identified who should be arrested if war broke out. And in the early 1900s, police heavily monitored leaders of the women’s suffrage movement, who were subsequently targeted for harassment and arrest.
From sex workers on the border to homosexuals in the military to women visiting abortion clinics to slaves on plantations (tracked through so-called plantation ledgers), privacy violations have long had the worst impact on marginalized groups.
It’s not just history, or just the U.S. Look at the Uighurs in China’s Xinjiang region: a religious minority thrown into “re-education camps” by the hundreds of thousands, an act of horrific ethnic cleansing and part of the state’s broader persecution of Islam. Outside the camps, millions of other Uighurs have been subjected to “arbitrary arrest, draconian surveillance or systemic discrimination.” Systems that track people’s every move enable these human rights abuses.
Within the U.S., the government is collecting the DNA of immigrant children and their alleged parents at the border for paternity tests. The only assurance that this data will not be used for other ends, either by the government or by the for-profit entities doing the testing, is verbal promises. Genetic information is systematically collected, processed, and stored — today, in the service of decisions about a child’s life; tomorrow, in the service of the unknown.
There’s much that can be done. As recommended in a recent report by the AI Now Institute at New York University, the government should audit algorithms used in public institutions like courts, looking for issues of bias and fairness. Governments at all levels should tell citizens how their data is used by private and public institutions alike. Legislators should evaluate current methods of privacy invasion, from facial recognition technology on city streets that results in racist policing to sexist hiring algorithms that deny certain groups job opportunities.
There are certainly millions of white, middle-class American citizens who might use illegal drugs or have unprotected sex. For them, privacy might not matter. But take the case of the homeless woman whose drug use was discovered and used to deny her access to life-saving resources. Or the parent whose monitored decisions lead to his family being denied welfare benefits.
A belief that data privacy protections “might not matter” is really just a lack of fear that the information will be used against you. That’s privilege — and it is precisely why we need more accountability, transparency, and oversight of how information is processed and used to make decisions.
Justin Sherman is a junior studying computer science and political science at Duke University. He is the co-founder and vice president of the nonpartisan initiative Ethical Tech. Contact him at Justin.sherman@duke.edu.