Angwin, Julia, and Ariana Tobin. “Facebook (Still) Letting Housing Advertisers Exclude….” ProPublica, November 21, 2017.

Angwin, Julia, et al. “Machine Bias.” ProPublica, May 23, 2016. article/machine-bias-risk-assessments-in-criminal-sentencing.

Angwin, Julia, Noam Scheiber, and Ariana Tobin. “Facebook Job Ads Raise Concerns About Age Discrimination.” The New York Times, December 20, 2017, sec. Business Day. facebook-job-ads.html.

Friedman, Batya, Peter H. Kahn Jr., and Alan Borning. “Value Sensitive Design and Information Systems.” In Human-Computer Interaction and Management Information Systems: Foundations, edited by Ping Zhang and Dennis F. Galletta, 348–372. Abingdon: Routledge, 2006.

Blomberg, Thomas, William Bales, Karen Mann, Ryan Meldrum, and Joe Nedelec. “Validation of the COMPAS Risk Assessment Classification Instrument.” College of Criminology and Criminal Justice, Florida State University, Tallahassee, FL, 2010. wp-content/uploads/Validation-of-the-COMPAS-Risk-Assessment-Classification-Instrument.pdf.

Brennan, Tim, William Dieterich, and Beate Ehret. “Research Synthesis: Reliability and Validity of COMPAS.” Northpointe Inc., September 2007.

Brennan, Tim, William Dieterich, and Beate Ehret. “Evaluating the Predictive Validity of the COMPAS Risk and Needs Assessment System.” Criminal Justice and Behavior 36, no. 1 (January 2009): 21–40.

Chouldechova, Alexandra. “Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments.” arXiv preprint arXiv:1610.07524, 2016. abs/1610.07524.

Christin, Angèle, Alex Rosenblat, and danah boyd. “Courts and Predictive Algorithms.” Criminal Justice Policy Program 38 (2015). Courts_and_Predictive_Algorithms.pdf.

Elish, Madeleine, and Tim Hwang. “When Your Self-Driving Car Crashes, You Could Still Be the One Who Gets Sued.” Quartz, July 25, 2015.

Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press, 2018.

Facebook. “An Update on Our Plans to Restrict Data Access on Facebook.” Facebook Newsroom, April 4, 2018.

Fairness, Accountability, and Transparency in Machine Learning (FAT/ML). “Principles for Accountable Algorithms and a Social Impact Statement for Algorithms.” Accessed April 11, 2018.

Friedman, Batya, and Helen Nissenbaum. “Bias in Computer Systems.” ACM Transactions on Information Systems 14, no. 3 (1996): 330–347.

Gillespie, Tarleton. “The Relevance of Algorithms.” In Media Technologies: Essays on Communication, Materiality, and Society, edited by Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, 167–194. Cambridge, MA: MIT Press, 2014.

Huet, Ellen. “Server and Protect: Predictive Policing Firm PredPol Promises to Map Crime Before It Happens.” Forbes, 2015. Accessed April 10, 2018.

Lartey, Jamiles. “Predictive Policing Practices Labeled as ‘Flawed’ by Civil Rights Coalition.” The Guardian, August 31, 2016.

Milner, Yeshimabeit. “An Open Letter to Facebook from the Data for Black Lives Movement.” Medium (blog), April 4, 2018.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press, 2018.

O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown, 2016.

Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press, 2015.

Saurwein, Florian, Natascha Just, and Michael Latzer. “Governance of Algorithms: Options and Limitations.” Info 17, no. 6 (September 14, 2015): 35–49.

Simpson, Scott. “Muslim Advocates and Color Of Change Demand Independent Civil Rights Audit of Facebook.” Muslim Advocates, April 3, 2018. muslim-advocates-and-color-of-change-demand-independent-civil-rights-audit-of-facebook/.

“Statement on Algorithmic Transparency and Accountability.” ACM US Public Policy Council, January 12, 2017.

The New York City Council. “File #: Int 1696-2017.” 2017.

Venkatasubramanian, Suresh. “When an Algorithm Isn’t.” Medium, October 1, 2015.