AI’s dark side… Explicit AI-generated images of Taylor Swift went viral last week, with one post on X getting 45M views and 24K reposts before being taken down. To rein in the images’ reach, X said it had temporarily blocked all searches for “Taylor Swift” and announced it was hiring more moderators (Musk had laid off most of the trust-and-safety team when he took over). The images were first spotted on Telegram, a social platform with 800M+ users that’s now being criticized for its notoriously lax moderation.
Deeper than fake: Deepfakes are AI-generated images, video, or audio created to resemble a real person.
Real-life examples: Fake videos of celebs like MrBeast and Tom Hanks pitching scams have been plaguing TikTok and YouTube, and recent fake audio of President Biden telling people not to vote has raised concerns about misinfo in this election year.
High stakes: The White House said it was alarmed by the Swift deepfakes and asked Congress to take legislative action.
Frankensteinian crisis… Tech cos are scrambling to control their AI creations. Microsoft added extra guardrails to its genAI tool, Designer, which 404 Media reported was used to create the Swift images (Microsoft said it’s investigating the claim). Meanwhile, a bill introduced in Congress earlier this month would effectively ban AI content that nonconsensually imitates any real person. Deepfake porn was recently found at the top of Google and Bing search results, and AI-generated nonconsensual nude photos have been used to bully teens.
Playing catch-up is a losing game… While the dangers of AI (and deepfakes specifically) have been predicted for years, legislators and tech cos are struggling to keep the tech in check. Last year, 1K+ tech leaders including Musk signed a letter calling for a pause in advanced AI development until potential risks could be studied… but that pause didn’t happen. As more problems arise, federal and state laws could open the door to more legal action against those who create fake content, and against the platforms used to make and distribute it.