Fake babies, real horror: Deepfakes from the Gaza war increase fears about AI’s power to mislead
Date: 2025-04-14 00:30:14
WASHINGTON (AP) — Among images of the bombed-out homes and ravaged streets of Gaza, some stood out for their utter horror: bloodied, abandoned infants.
Viewed millions of times online since the war began, these images are deepfakes created using artificial intelligence. If you look closely, you can see clues: fingers that curl oddly, or eyes that shimmer with an unnatural light — all telltale signs of digital deception.
The outrage the images were created to provoke, however, is all too real.
Pictures from the Israel-Hamas war have vividly and painfully illustrated AI’s potential as a propaganda tool, used to create lifelike images of carnage. Since the war began last month, digitally altered images spread on social media have been used to make false claims about responsibility for casualties or to deceive people about atrocities that never happened.
While most of the false claims circulating online about the war didn’t require AI to create and came from more conventional sources, technological advances are coming with increasing frequency and little oversight. That’s made the potential of AI to become another form of weapon starkly apparent, and offered a glimpse of what’s to come during future conflicts, elections and other big events.
“It’s going to get worse — a lot worse — before it gets better,” said Jean-Claude Goldenstein, CEO of CREOpoint, a tech company based in San Francisco and Paris that uses AI to assess the validity of online claims. The company has created a database of the most viral deepfakes to emerge from Gaza. “Pictures, video and audio: with generative AI it’s going to be an escalation you haven’t seen.”
In some cases, photos from other conflicts or disasters have been repurposed and passed off as new. In others, generative AI programs have been used to create images from scratch, such as one of a baby crying amidst bombing wreckage that went viral in the conflict’s earliest days.
Other examples of AI-generated images include videos showing supposed Israeli missile strikes, or tanks rolling through ruined neighborhoods, or families combing through rubble for survivors.
In many cases, the fakes seem designed to evoke a strong emotional reaction by including the bodies of babies, children or families. In the bloody first days of the war, supporters of both Israel and Hamas alleged the other side had victimized children and babies; deepfake images of wailing infants offered photographic ‘evidence’ that was quickly held up as proof.
The propagandists who create such images are skilled at targeting people’s deepest impulses and anxieties, said Imran Ahmed, CEO of the Center for Countering Digital Hate, a nonprofit that has tracked disinformation from the war. Whether it’s a deepfake baby, or an actual image of an infant from another conflict, the emotional impact on the viewer is the same.
The more abhorrent the image, the more likely a user is to remember it and to share it, unwittingly spreading the disinformation further.
“People are being told right now: Look at this picture of a baby,” Ahmed said. “The disinformation is designed to make you engage with it.”
Similarly deceptive AI-generated content began to spread after Russia invaded Ukraine in 2022. One altered video appeared to show Ukrainian President Volodymyr Zelenskyy ordering Ukrainians to surrender. Such claims have continued to circulate as recently as last week, showing just how persistent even easily debunked misinformation can be.
Each new conflict, or election season, provides new opportunities for disinformation peddlers to demonstrate the latest AI advances. That has many AI experts and political scientists warning of the risks next year, when several countries hold major elections, including the U.S., India, Pakistan, Ukraine, Taiwan, Indonesia and Mexico.
The risk that AI and social media could be used to spread lies to U.S. voters has alarmed lawmakers from both parties in Washington. At a recent hearing on the dangers of deepfake technology, U.S. Rep. Gerry Connolly, Democrat of Virginia, said the U.S. must invest in funding the development of AI tools designed to counter other AI.
“We as a nation need to get this right,” Connolly said.
Around the world a number of startup tech firms are working on new programs that can sniff out deepfakes, affix watermarks to images to prove their origin, or scan text to flag specious claims that may have been inserted by AI.
“The next wave of AI will be: How can we verify the content that is out there? How can you detect misinformation? How can you analyze text to determine if it is trustworthy?” said Maria Amelie, co-founder of Factiverse, a Norwegian company that has created an AI program that can scan content for inaccuracies or bias introduced by other AI programs.
Such programs would be of immediate interest to educators, journalists, financial analysts and others interested in rooting out falsehoods, plagiarism or fraud. Similar programs are being designed to sniff out doctored photos or video.
While this technology shows promise, those using AI to lie are often a step ahead, according to David Doermann, a computer scientist who led an effort at the Defense Advanced Research Projects Agency to respond to the national security threats posed by AI-manipulated images.
Doermann, who is now a professor at the University at Buffalo, said effectively responding to the political and social challenges posed by AI disinformation will require better technology, better regulations, voluntary industry standards and extensive investments in digital literacy programs to help internet users tell truth from fantasy.
“Every time we release a tool that detects this, our adversaries can use AI to cover up that trace evidence,” said Doermann. “Detection and trying to pull this stuff down is no longer the solution. We need to have a much bigger solution.”