TrueMedia.Org

Is this real?
Summary
TrueMedia.Org detected substantial evidence of manipulation.
File type and size: .jpg, 129 KB
AI tools queried: 8 (partners and in-house)
Processing time (time to analyze media): 5492m 33s
First analyzed on: Fri, Mar 1, 2024
Disclaimer: TrueMedia.Org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
AI manipulation analysis
AI Generated Image Detection: Highly Suspicious (100% confidence)
The model was trained on a large dataset comprising millions of artificially generated images and human-created images such as photographs, digital and traditional art, and memes sourced from across the web.
Diffusion: Highly Suspicious (100% confidence)
Detects the signature of images created by Stable Diffusion.
Visual Noise Analysis: Highly Suspicious (98% confidence)
Tests for random variation of brightness or color information in the pixel values of an image.
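TrueMedia.Org does not publish its implementation of this test; a minimal sketch of the underlying idea, extracting a high-pass noise residual by subtracting a local 3x3 box blur in pure NumPy (the function name and the statistics it reports are illustrative), might look like:

```python
import numpy as np

def noise_residual_stats(img: np.ndarray) -> dict:
    """Estimate per-pixel noise in a 2-D grayscale image (values in [0, 1])
    by subtracting a 3x3 box blur and summarizing what is left over."""
    # Edge-pad so the blur is defined at the borders.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # 3x3 box blur built from shifted slices (no SciPy dependency).
    blur = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    residual = img - blur
    return {"mean": float(residual.mean()), "std": float(residual.std())}
```

A perfectly flat image yields a zero residual, while sensor noise or generator artifacts raise the residual's standard deviation; real detectors compare such statistics against distributions learned from known-real and known-generated images.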
Image Generator Analysis: Highly Suspicious (97% confidence)
Analyzes the image for indications that it was generated by a popular AI image generator, such as MidJourney, DALL·E, Stable Diffusion, or thispersondoesnotexist.com.
Diffusion-Generated Image Detection: No Evidence
Evaluates the discrepancy between the original image and a version reconstructed by pre-trained diffusion models, which tend to reproduce the visual noise characteristic of the diffusion process.
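Detectors in this family score an image by how closely a pre-trained diffusion model can reconstruct it, since diffusion-generated images tend to be reconstructed almost exactly. The reconstruction step itself requires a diffusion model, but the scoring can be sketched as follows (the function names and the 0.05 threshold are hypothetical placeholders, not TrueMedia.Org's published values):

```python
import numpy as np

def reconstruction_discrepancy(original: np.ndarray,
                               reconstructed: np.ndarray) -> float:
    """Mean absolute per-pixel difference between an image and its
    diffusion-model reconstruction (both arrays scaled to [0, 1])."""
    original = np.asarray(original, dtype=np.float64)
    reconstructed = np.asarray(reconstructed, dtype=np.float64)
    return float(np.mean(np.abs(original - reconstructed)))

def looks_diffusion_generated(original, reconstructed,
                              threshold: float = 0.05) -> bool:
    # A small discrepancy means the diffusion model "recognizes" the image;
    # the 0.05 cutoff is illustrative, not a published operating point.
    return reconstruction_discrepancy(original, reconstructed) < threshold
```

In practice the threshold (or a classifier over the discrepancy map) would be calibrated on labeled real and generated images rather than fixed by hand.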
GANs: No Evidence
Detects the signature of images created by Generative Adversarial Networks.
Faceswaps: No Evidence
Face swapping transfers the identity of one face to another while preserving other details, such as the expression.
AI Generated Image Detection: No Evidence
Detects AI-generated photo-realistic images, created for example by Generative Adversarial Networks and diffusion models such as Stable Diffusion, MidJourney, DALL·E 2, and others.