Inspecting Algorithms in Social Media Platforms: Bridging Regulators and Independent Auditors
New legislation is paving the way for regulatory inspection of social media platforms, including the UK's Online Harms Bill and the EU's Digital Services Act. A regulator with audit powers will fundamentally reshape the inspection ecosystem and create new opportunities for external auditors – such as civil society organizations, journalists, and academic researchers (many represented in the FAccT community) – to access platform data in reliable, legal ways. This may include the need for regulators to inspect the underlying algorithms that power these systems, such as recommendation engines, ad delivery systems, and automated content moderation systems. To date, no regulator has conducted an algorithm inspection of this kind, which raises a number of questions:

- What could a regulatory algorithm inspection look like?
- How could it be informed by lessons from external, independent actors who have conducted these types of inspections?
- Moving forward, how could a regulator further support these external actors?

Through these questions, the workshop will begin to develop requirements for a regulatory algorithm inspection. Following expert lightning talks, participants will respond to a case study and a set of prompts in interactive breakout groups. Participants will draw on their own experiences to identify the specifications for – and hurdles to – a robust regulatory algorithm inspection.