A 7.7-magnitude earthquake struck Myanmar on March 28, killing thousands and injuring another 4,500, according to recent estimates. At least 440 people are still missing, and authorities expect the casualty figures to rise. Though the epicenter was in central Myanmar, the powerful quake also rocked neighboring Thailand and was even felt in several other countries in the region.
In Myanmar and Thailand, the overall scale of destruction is massive. Online, users have been sharing images of the devastation and of the people affected. While some images are real, there is also a lot of AI-generated content circulating, as well as old images that have been decontextualized, and bizarre theories about what caused the earthquake. DW Fact check looked into some of the more viral claims.
Does this video show a skyscraper in Thailand?
Claim: This TikTok video, which had over 780,000 views at the time of writing, claims to show a skyscraper in Thailand that was badly damaged by the earthquake.
DW Fact check: False.
The video is not real; it's AI-generated. There are many clues that give this away.
First, the damage looks too clean. We do not see any debris on the ground, and although the exterior of the building appears severely damaged, its interior — visible through a gaping hole in the facade — looks strangely intact.
Also, the movement in the video looks artificial: The cars move unrealistically smoothly, and seem to drive right through pedestrians on the street. Halfway through the video, a white line, supposedly marking a lane on the road, starts to shift.
In breaking-news situations like a devastating natural disaster, it's common for AI-generated videos about the event to appear online shortly afterward. Mostly, the aim is to generate clicks or to confuse people. Many videos circulating online can easily be identified as AI-generated, as users frequently point out in the comment sections (like here or here).
Others look quite realistic. This AI-generated video, for example, was shared on X by a university professor who does not appear to have questioned its accuracy. It has circulated widely and was shared in several languages on different social media platforms (like TikTok, Instagram, Facebook).
We can conclude that the video of the skyscraper in Thailand is AI-generated because the people and cars depicted do not move in a natural, realistic manner, and the road markings are not static. The damage to the building is also unrealistic.
Decontextualization of real images
When natural disasters occur, alongside a rise in AI-generated images, we often see real images removed from their original context and recycled in new stories. This changes their original meaning, which is why it is important to verify closely whether image descriptions match what the images actually show.
Claim: This post on X claims to show people being washed away by water flooding a street in Bangkok, Thailand, following the earthquake.
DW Fact check: Misleading.
Using the search engine TinEye, a reverse image search of a screenshot from the video points to the account of BNO News Live as one of the first to post the video. This news organization, based in the Netherlands, posted the video on March 28 and stated that it shows the Yunnan province in China.
The same search also revealed other early postings of the video, including an article by the British tabloid The Daily Mail. Here, we also found other material that depicts the same incident from various perspectives.
Most of them mention the online video marketplace Newsflare as the copyright holder. A simple Google keyword search including the terms "Newsflare" and "Yunnan China" brought us to a page on its website that states the video was captured on March 28 on Mengmao Road, presumably in the Dehong Dai and Jingpo Autonomous Prefecture, China.
We could not verify this exact location, but by aligning visual cues from other videos on Newsflare, we were able to identify a surveillance camera that probably recorded the video in question, located at the Ruili Fortune Plaza in China's Yunnan province.
In this case, the video does show the aftermath of the March 28 earthquake, but users have incorrectly claimed the footage stems from Thailand rather than China.
Was a secret US weapon behind the disaster?
Claim: A post, followed by a thread on the social media platform X, argues that the High-frequency Active Auroral Research Program (HAARP) of the University of Alaska Fairbanks is behind the earthquakes in Myanmar and Thailand.
DW Fact check: False.
This isn't the first time conspiracy theories about HAARP or similar technologies triggering natural disasters have appeared online. The DW Fact check team debunked similar assertions in previous years, including 2023, when other natural disasters were making headlines.
Still, some conspiracy theorists believe HAARP is a weather-controlling weapon, capable of causing natural disasters like the recent earthquake in Myanmar. Jeffrey Hughes, a former professor of astronomy at Boston University, told the press agency Agence France-Presse (AFP) that there is "no way" a system like HAARP could be used to create such an effect.
HAARP is a high-frequency transmitter that was designed in the US in the 1990s. Located in Alaska, the system consists of 180 antennas that send radio waves into the ionosphere, a layer of the Earth's atmosphere, to study its behavior.
Earthquakes are caused by the movement of tectonic plates. When these plates interact, they can collide, slide past each other, or pull apart, creating stress and strain in the Earth's crust. This process is common and has happened for ages, but the intensity of such interactions can vary.
So, no, HAARP did not cause the devastating earthquakes in Myanmar, Thailand and China. It was a natural disaster, caused by the Earth’s shifting tectonic plates.
Edited by: Kathrin Wesolowski, Rachel Baig
The post Fact check: False content on Myanmar, Thailand earthquake appeared first on Deutsche Welle.