
External Resources
Tracking changes in GenAI policy and impacts
Legal Case Trackers
ChatGPT is Eating the World, tracks pending legal cases regarding AI
Database of AI Litigation (DAIL), a database from the George Washington University Law School that tracks ongoing and completed litigation involving artificial intelligence, including machine learning
WIRED Copyright Case Tracker, tracks every copyright battle involving the AI industry, with visualizations updated as the cases progress
European Union AI Law Tracker, summarizes the EU's proposed regulatory framework for AI; although still evolving, it may affect any production that uses AI or sources AI services from the EU
Standards
BBC Standards for AI Transparency
Partnership on AI (PAI) recommendations and 10-step guide
Standard Setting Bodies, a list highlighting both technical and ethical standards (via Starling Lab)
Artificial Intelligence (AI) and Your National History Day (NHD) Project
SAG-AFTRA — Artificial Intelligence Resource Center
UK Government Digital Service AI Guide
Data Provenance
Content Authenticity Initiative, works to create a cross-industry system for media transparency that enables better evaluation of content
Starling Lab, an academic research lab innovating cryptographic methods to meet the technical and ethical challenges of establishing trust in our most sensitive digital records
Project Origin, a collaboration between media organizations to combat disinformation by verifying the source authenticity of video and audio
Related Organizations
Knowing Machines, a research project tracing the histories, practices, and politics of how machine learning (AI) systems are trained to interpret the world
Partnership on AI, a consortium of tech companies, nonprofits, and media organizations that develops best practices and frameworks for responsible AI
Witness Media Lab, an organization that develops, models, and supports innovative approaches to sourcing, verifying, and contextualizing eyewitness videos, and works to ensure that footage taken by average citizens can serve as an effective tool for human rights
The Synthesis, a monthly column from the International Documentary Association exploring the intersection of artificial intelligence and documentary practice
Ada Lovelace Institute, which publishes an explainer for anyone who wants to learn more about foundation models, also known as 'general-purpose artificial intelligence' or 'GPAI'
Impacts of Generative AI
General Reading
The Unbelievable Scale of AI’s Pirated-Books Problem (The Atlantic, March 2025)
Inside the Race to Protect Artists from Artificial Intelligence (Scientific American, June 2024)
How to Stop Your Data From Being Used to Train AI (Wired, October 2024)
The Tricky Truth About How Generative AI Uses Your Data (Vox, July 2023)
Mapping the issues and uses of AI for journalism (Report by Public Service Medias: Alliance for Facts)
Algorithmic Bias
OpenAI's Sora is Plagued by Sexist, Racist and Ableist Biases (Wired, March 2025)
How AI reduces the world to stereotypes (Rest of the World, October 2023)
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis (The Verge, February 2024)
AI and the Environment
Explained: Generative AI’s environmental impact (MIT News, January 2025)
The Climate and Sustainability Implications of Generative AI (MIT, March 2024)
AI and Labor
How GenAI is Already Impacting the Labor Market (Harvard Business Review, November 2024)
Beyond the USA
Spain to impose fines for not labeling AI-generated content (Reuters, March 2025)