YouTube is quietly rewriting its rules, allowing more videos that break its policies to remain online if they're deemed "in the public interest"—marking a major shift in how the platform polices political and controversial speech, reports the New York Times. This shift, introduced in mid-December training materials reviewed by the Times but not publicly announced, comes as major social platforms face political pressure to reduce policing of online speech, especially from Republicans following President Trump's return to office. Moderators are now instructed to leave up videos with potential rule violations as long as at least half of the content complies with YouTube's rules. Previously, videos were taken down if a quarter of their content broke the guidelines.
YouTube's exceptions now extend to topics like elections, social issues, and public figures, echoing moves by Meta and X, which have shifted content policing to users. YouTube says adjustments are made to "protect free expression" and keep policies in line with evolving definitions of public interest. Still, the relaxed approach has prompted criticism. Experts warn that scaling back moderation may accelerate the spread of misinformation and hate speech, citing examples where false or inflammatory content has reached millions. "What we're seeing is a rapid race to the bottom," Imran Ahmed, head of the Center for Countering Digital Hate, tells the Times. "This is not about free speech. It's about advertising, amplification, and ultimately profits."