The following letter -- "Boycott collaboration with police" -- was sent to the Notices of the American Mathematical Society on 15 June 2020 and has now been published here.

Here is the list of signatories (pdf) as of 28 September 2020.

Media coverage of the letter:

- Nature.
- Inside Higher Ed.
- Deutschlandfunk.
- U. Washington Daily.
- Popular Mechanics.
- Salon.
- New Hampshire Union Leader.
- Daily Kos.
- Collectif opposé à la brutalité policière.


The letter:

To the Mathematics Community,

In light of the extrajudicial murders by police of George Floyd, Breonna Taylor, Tony McDade and numerous others before them, and the subsequent brutality of the police response to protests, we call on the mathematics community to boycott working with police departments.

This is not an abstract call. Many of our colleagues can and do work with police departments, providing modeling and data analysis. For example, ICERM sponsored a workshop on Predictive Policing (https://icerm.brown.edu/topical_workshops/tw16-7-pp/), which included ride-alongs with the Providence Police Department.

One of the organizers of this workshop is the founder of PredPol (https://www.theverge.com/2018/4/26/17285058/predictive-policing-predpol-pentagon-ai-racial-bias), a for-profit company which has lucrative contracts with police departments across the country, providing software that claims, among other things, to predict where crimes occur and when they are gang-related. Another organizer is an investor in PredPol (https://dailybruin.com/2019/05/20/letter-to-the-editor-public-perception-of-predictive-policing-is-wrong-it-can-help-reduce-crime).

An excellent summary of the feedback loops created by PredPol, and the racist consequences, can be found here: https://www.vice.com/en_us/article/xwbag4/academics-confirm-major-predictive-policing-algorithm-is-fundamentally-flawed

There are also deep concerns about the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression. See, for example:

- https://www.technologyreview.com/2019/12/20/79/ai-face-recognition-racist-us-government-nist-study/
- https://www.nytimes.com/2019/07/10/opinion/facial-recognition-race.html

Given the structural racism and brutality in US policing, we do not believe that mathematicians should be collaborating with police departments in this manner. It is simply too easy to create a "scientific" veneer for racism. Please join us in committing to not collaborating with police. It is, at this moment, the very least we can do as a community.

We demand that any algorithm with the potential for high impact face a public audit. For those who’d like to do more, participating in this audit process is potentially a proactive way to use mathematical expertise to prevent abuses of power. We also encourage mathematicians to work with community groups, oversight boards, and other organizations dedicated to developing alternatives to oppressive and racist practices. Examples of data science organizations to work with include Data 4 Black Lives (http://d4bl.org/) and Black in AI (https://blackinai.github.io/).

Finally, we call on departments with data science courses to implement learning outcomes that address the ethical, legal, and social implications of these tools.

A full list of signatures to this letter is available at https://www.math-boycotts-police.net/. Most of the signatories hold, or are working towards, a PhD in mathematics. All signatories are invested in the ethical practices of the mathematics community.

Tarik Aougab (Haverford College)
Federico Ardila (San Francisco State University)
Jayadev Athreya (University of Washington)
Edray Goins (Pomona College)
Christopher Hoffman (University of Washington)
Autumn Kent (University of Wisconsin)
Lily Khadjavi (Loyola Marymount University)
Cathy O'Neil (CEO, ORCAA)
Priyam Patel (University of Utah)
Katrin Wehrheim (University of California, Berkeley)