
This column reflects the opinion of the writer.


Commentary: Facebook helped police with an abortion investigation in Nebraska. That’s troubling news everywhere

Reports emerged this month that Facebook handed over to law enforcement messages between a Nebraska teenager and her mother about the teen’s pregnancy loss. (Olivier Douliery/AFP/Getty Images/TNS)

After reports revealed this month that Facebook handed over to law enforcement messages between a Nebraska teenager and her mother about the teen’s pregnancy loss, the company released a statement claiming ignorance about the nature of the investigation. “The warrants did not mention abortion at all. Court documents indicate that police were at that time investigating the alleged illegal burning and burial of a stillborn infant,” the company wrote.

But that argument would set a disturbing precedent. Tech companies cannot be allowed to hide behind the omission of the word “abortion” to abandon their responsibility to protect people’s sensitive data.

With federal protections for reproductive rights rolled back, data privacy protections are more important than ever for health care and abortion access. Facebook and other tech companies routinely cooperate with police demands for information they collect, including messages and keyword searches. For the Nebraska case, Facebook claimed it wasn’t aware that the police were seeking information relevant to a person’s abortion. That raises the question: What would Facebook have done if the warrant included the word “abortion”?

On its “Information for Law Enforcement Authorities” page, Facebook promises to “conduct a careful review of each law enforcement request to disclose user data for consistency with international human rights standards.” That standard should guide reviews whether or not the request specifies “abortion.” Given that multiple international human rights bodies have agreed that criminalizing abortion violates human rights – including the rights to sexual and reproductive health, privacy, and freedom from cruel, inhuman or degrading treatment or punishment – Facebook’s own guidelines recommend a different response than the one the company gave in the Nebraska case.

Facebook reportedly provided the requested information within two days of local law enforcement serving the warrant. The messages between mother and daughter that Facebook produced referenced pills used for abortion – something the company presumably would have known if it reviewed the messages before handing them over to law enforcement.

Particularly with police making this request more than a month after the Dobbs opinion was leaked, laying the groundwork for harsher criminalization of abortion nationwide, the messages’ contents should have been enough to alert the company that the end of a teenager’s pregnancy was being investigated. Readily handing over this private information does not demonstrate a careful review or regard for human rights standards.

When Facebook and other tech companies receive a warrant, they must look beyond the specific statute being investigated and determine whether the allegations involve criminalizing conduct that is protected by international human rights standards. The affidavit supporting the June search warrant, for example, specifies that law enforcement was looking into whether the baby was stillborn or asphyxiated, making clear that a teen’s pregnancy was under investigation. If the companies find that cooperating with law enforcement would punish a person for exercising reproductive rights, they must fight the request using every legal means at their disposal.

Online platforms are well aware that law enforcement’s demands often come in camouflage and won’t outright state a mission to thwart bodily autonomy. It’s critical for big tech to exercise close scrutiny in these cases because the statutes invoked in a warrant and its supporting materials won’t necessarily be abortion laws. Organizations such as National Advocates for Pregnant Women, If/When/How and the National Association of Criminal Defense Lawyers have closely documented how police and prosecutors wield laws and data that don’t “mention abortion at all” to punish pregnant people for their reproductive decisions. Frequently cited laws include homicide statutes, for example, or “Prohibited Acts with Skeletal Remains,” as referenced in the Nebraska search warrant.

Since the Supreme Court officially toppled Roe v. Wade in late June, law enforcement is now even more empowered to investigate and criminalize reproductive decisions than it was when local officials launched the Nebraska investigation. Companies should accordingly reduce the potentially sensitive data they collect and keep. That means offering default end-to-end encryption across all services, including internet searches, to reduce the amount of private information collected in the first place and purging whatever data remains that can be weaponized in court against people seeking criminalized health care.

Sheryl Sandberg, Meta’s former chief operating officer, said the Supreme Court’s decision to reverse Roe “threatens to undo the progress women have made in the workplace and to strip women of economic power.” Facebook’s porous privacy policies are helping to reconstruct the barriers women face in achieving bodily autonomy, digital autonomy and economic power.

Meanwhile, the mother in Nebraska accused of helping her teen daughter obtain an abortion lost her job. Both mother and daughter have been charged with at least one felony and multiple misdemeanors. And the daughter, 17 at the time of the alleged abortion, is being tried as an adult.

Cynthia Conti-Cook is a civil rights lawyer and tech fellow at the Ford Foundation. Kate Bertash is the director of the Digital Defense Fund.
