Meet the Scholars 2021: Marissa Gerchick

A few months ago, I went to pick up some furniture I had bought online. When I met the person I was buying a table and chair from, I noticed they were wearing a green sweater with a cartoon drawing of a dinosaur across the front. I took note of the sweater at the time, thinking it was unique -- unlike anything I’d seen before.

The next day, as I was scrolling through news stories interspersed with online advertisements on my phone, I saw an ad that made me pause: an advertisement for (you guessed it!) the exact green sweater the person who had sold me the furniture had been wearing the day before. The sweater was from a brand I had never heard of, and I couldn’t recall ever having seen that advertisement before.

I set down my phone and reflected. After a while, I came up with a possible explanation for this too-coincidental coincidence. I guessed that advertisers had decided to show me the green sweater ad by combining my location data, the other person’s location data, our purchasing histories, and our demographic data -- surveillance we might both have agreed to via largely inscrutable terms of service.

While this experience was unnerving for me in some ways, the stakes of seeing a targeted clothing ad were arguably low. The stakes are much higher when we consider the many ways in which technology, including targeted advertising, mediates our access to opportunity. For example, algorithmic systems related to the ones that showed me the sweater ad increasingly govern the ways in which we receive health care, access housing and employment, and interact with financial institutions.

And while technology can certainly open new doors and increase access to opportunity, many of the algorithmic systems used in these high-stakes settings have been shown to be deeply flawed, disproportionately harming Black communities and other historically marginalized groups. The communities forced to endure these harms are often the same ones locked out of the positive opportunities technology presents, and these harms will continue unless we take action to end them.

I believe that harnessing the benefits technology presents while mitigating these very real harms will be crucial to ensuring a future where technology works for everyone. As a TechCongress fellow, my goals are to listen to the communities affected by issues like ineffectual data privacy protections and algorithmic discrimination, learn from the many experts shaping this field, and hopefully contribute to stronger protections.