As language technologies have become increasingly prevalent, there is a growing awareness that the decisions we make about our data, methods, and tools are often tied up with their impact on people and societies. This course introduces students to real-world applications of language technologies and the potential ethical implications associated with them. We discuss philosophical foundations of ethical research along with advanced state-of-the-art techniques. Discussion topics include:
The schedule is subject to change. Sign in to Google Drive with your Andrew ID to access the videos.
| Week | Date | Topic | Description | Readings | Notes |
|------|------|-------|-------------|----------|-------|
| 1 | 1/14 | Introduction | Motivation, requirements, and overview [slides] [video] | Hovy & Spruit (2016), Barocas & Selbst (2016) | |
| | 1/16 | Project Introductions | Introduction to possible projects [slides] | | |
| 2 | 1/21 | Foundations | Philosophical foundations, history: medical and psychological experiments, IRB and human subjects [slides] [video] | | |
| | 1/23 | Foundations | Philosophical foundations, history: medical and psychological experiments, IRB and human subjects [slides] [video] | | Project pre-proposal due |
| 3 | 1/28 | Foundations | Invited talk by David Danks, Professor of Philosophy [video] | | |
| | 1/30 | Objectivity and Bias | Stereotypes, prejudice, and discrimination: background [slides] [video] | Buolamwini & Gebru (2018), Stanovsky et al. (2019), Jurgens et al. (2017) | HW 1 released |
| 4 | 2/4 | Objectivity and Bias | Bias detection and debiasing of data, algorithms, and models [slides] [video] | Sap et al. (2019), Zhao et al. (2017), Sun et al. (2019) | |
| | 2/6 | Objectivity and Bias | Bias detection and debiasing of data, algorithms, and models [slides] [video] | Chouldechova (2017), Geva et al. (2019), Bender & Friedman (2018), Field et al. (2019) | |
| 5 | 2/11 | Objectivity and Bias | Invited talk on fairness by Maria De-Arteaga [slides] [video] | Romanov et al. (2019), De-Arteaga et al. (2019), Swinger/De-Arteaga et al. (2019) | HW 1 due |
| | 2/13 | Objectivity and Bias | Invited talk on psychological approaches by Geoff Kaufman [slides] [video] | | |
| 6 | 2/18 | Civility in Communication | Identification of trolling, hate speech, abusive language, toxic comments [slides] [video] | Warner & Hirschberg (2012), Schmidt & Wiegand (2017), Jurgens et al. (2019) | HW 2 released |
| | 2/20 | Civility in Communication | Identification of trolling, hate speech, abusive language, toxic comments [slides] [slides on toxic language in chatbots] [video] | Zhang et al. (2018), Breitfeller et al. (2019), Fessler (2017), Henderson et al. (2017), Metz (2017) | |
| 7 | 2/25 | Language of Manipulation | Computational propaganda [slides] [video] | | Project proposal due |
| | 2/27 | Language of Manipulation | Computational solutions [slides] [video] | King et al. (2017), Field et al. (2018), Starbird (2018) | |
| 8 | 3/3 | Language of Manipulation | Targeted ads, fake news, US elections [slides] [video] | Zellers et al. (2019), Schuster et al. (2019), Nakamura et al. (2019) | HW 2 due |
| | 3/5 | Language of Manipulation | Invited talk by Rayid Ghani [video] | | |
| 9 | 3/10 | No class (Spring Break) | | | |
| | 3/12 | No class (Spring Break) | | | |
| 10 | 3/17 | No class | University-wide cancellation, preparation for online instruction | | |
| | 3/19 | Privacy, Profiling, Security | Privacy and anonymity. Writer profiling. Transition to online course [slides] [video] | Jurgens et al. (2017), Coavoux et al. (2018), PrivateNLP | |
| 11 | 3/24 | Student Presentations | Midterm project presentations | | |
| | 3/26 | Student Presentations | Midterm project presentations | | HW 3 released |
| 12 | 3/31 | Privacy, Profiling, Security | Privacy and anonymity. COVID-19 privacy and Cambridge Analytica discussion. [slides] [video] | The Guardian (2017); Brexit: The Uncivil War (2019, dramatization); The Great Hack (2019, documentary) | |
| | 4/2 | Privacy, Profiling, Security | Privacy and anonymity. Speech: speaker identification and de-identification. [slides] [video] | | |
| 13 | 4/7 | Privacy, Profiling, Security | Demographic and persona inference techniques [slides] [video part 1] [video part 2] | Bamman et al. (2014), Wang and Jurgens (2018), Kang et al. (2019) | HW 3 due, HW 4 released |
| | 4/9 | NLP for Social Good | Endangered languages [slides] [video] | | |
| 14 | 4/14 | NLP for Social Good | Invited talk by Alex Hauptmann. Computational tools for human rights and public safety [video] | | |
| | 4/16 | Intellectual Property | Plagiarism and plagiarism detection. Patents. [slides] [video] | | HW 4 due |
| 15 | 4/21 | Class Discussion | Code of Ethics [slides] [video] | | |
| 16 | 4/28 | Final Project Presentations | | | |
| | 4/30 | Final Project Presentations | | | |
| | 5/8 | Final Report Due | | | |
Homework assignments. (4 assignments; 15% each) Each assignment will correspond to a lecture unit: Objectivity and Bias, Civility in Communication, Privacy and Obfuscation, and Computational Social Science.
Each assignment contains a combination of coding, analysis, and discussion. For each assignment, completing the baseline requirements earns a passing (B-range) grade; A-range grades can be obtained by completing the open-ended “Advanced Analysis” part of the assignment. Assignments are not necessarily designed to focus on technical solutions; instead, they encourage students to think critically about the course material and understand how to approach ethical problems in NLP, while also allowing for exploration of various methodologies.
Project. (30%) A semester-long team project, normally carried out in groups of three (see below).
Participation in class. (10%) Classes will include discussions of reading assignments. Students are expected to read the relevant papers and participate in discussions.
A major component of this course is a team project: a substantial research effort carried out by teams of students (expected group size = 3; 2–4 is acceptable). See this document for possible project ideas. All assignments are due at 11:59pm on the specified date. The components of the project are:
Late policy. Students will have 4 late days that may be used for HW assignments at any point during the semester. Late days may not be used for project benchmarks. After all late days have been used, no credit will be given for homework submitted late.
Academic honesty. Homework assignments are to be completed individually. Verbal collaboration on homework assignments is acceptable, as well as re-implementation of relevant algorithms from research papers, but everything you turn in must be your own work, and you must note the names of anyone you collaborated with on each problem and cite resources that you used to learn about the problem. The project is to be completed by a team. You are encouraged to use existing NLP components in your project; you must acknowledge these appropriately in the documentation. Suspected violations of academic integrity rules will be handled in accordance with the CMU guidelines on collaboration and cheating.
Take care of yourself! As a student, you may experience a range of challenges that can interfere with learning, such as strained relationships, increased anxiety, substance use, feeling down, difficulty concentrating and/or lack of motivation. All of us benefit from support during times of struggle. There are many helpful resources available on campus and an important part of having a healthy life is learning how to ask for help. Asking for support sooner rather than later is almost always helpful. CMU services are available, and treatment does work. You can learn more about confidential mental health services available on campus at: http://www.cmu.edu/counseling/. Support is always available (24/7) from Counseling and Psychological Services: 412-268-2922.
Accommodations for Students with Disabilities:
If you have a disability and have an accommodations letter from the Disability Resources office, I encourage you to discuss your accommodations and needs with me as early in the semester as possible. I will work with you to ensure that accommodations are provided as appropriate. If you suspect that you may have a disability and would benefit from accommodations but are not yet registered with the Office of Disability Resources, I encourage you to contact them at firstname.lastname@example.org.