In the hours following the shooting of 37-year-old Renee Nicole Good by a masked federal agent in Minneapolis, social media exploded with AI-manipulated images purporting to “unmask” the officer, falsely revealing his identity. The federal agent was later identified by Department of Homeland Security spokesperson Tricia McLaughlin as an officer from Immigration and Customs Enforcement (ICE). The incident took place on Wednesday morning, with social media footage showing two masked agents approaching an SUV parked on the road in a suburb south of downtown Minneapolis. One officer appeared to ask the driver to exit the vehicle before attempting to open the door. The driver then appeared to reverse and pull forward. A third masked officer, positioned near the front of the SUV, drew a weapon and fired, killing Good.
In the immediate aftermath of the shooting, none of the videos circulating on social media showed the agents unmasked. Yet altered images purporting to show the agent’s bare face quickly emerged online. These images appeared to be screenshots from the footage but had been modified with AI tools to fabricate a version of the officer’s face.
WIRED examined numerous AI-altered images that popped up across major social media platforms, including X, Facebook, Threads, Instagram, Bluesky, and TikTok. “We need his name,” said Claude Taylor, founder of the anti-Trump Mad Dog PAC, in a post on X that featured an AI-altered image of the agent. The post garnered over 1.2 million views. Taylor did not respond to requests for comment.
On Threads, an account named “Influencer_Queeen” shared an AI-altered image of the officer, stating, “Let’s get his address. But only focus on HIM. Not his kids.” This post received nearly 3,500 likes.
Hany Farid, a professor at UC Berkeley who has studied AI’s capabilities in enhancing facial images, noted, “AI-powered enhancement tends to hallucinate facial details, resulting in an image that may look clear but lacks accuracy regarding biometric identification. In this case, where part of the face is covered, I believe AI or any method cannot accurately reconstruct the individual’s identity.”
Some users who shared these images also falsely claimed to have identified the agent, posting names of real individuals and sometimes linking to their social media accounts. WIRED verified that two names circulating online don’t appear connected to anyone from ICE. While many posts sharing these altered images received limited interaction, some gained considerable traction.
One of the names incorrectly tied to the shooting is Steve Grove, the CEO and publisher of the Minnesota Star Tribune, who previously served in Minnesota Governor Tim Walz’s administration. Chris Iles, the Star Tribune’s vice president of communications, stated, “We are currently monitoring a coordinated online disinformation campaign incorrectly identifying the ICE agent involved in yesterday’s shooting. To be clear, the ICE agent has no known affiliation with the Minnesota Star Tribune and is certainly not our publisher and CEO Steve Grove.”
This incident isn’t unique; AI-altered images have caused similar confusion after past shootings. A comparable situation arose in September, when the killing of Charlie Kirk prompted the widespread sharing of an AI-altered image of the shooter that bore no resemblance to the individual ultimately apprehended and charged with Kirk’s murder.
