Is this real?
TrueMedia.org verdict: substantial evidence of manipulation.
Verified by human analyst
Analysis
Results
Visual Noise: Uncertain
Generative AI: Little Evidence
Disclaimer: TrueMedia.org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
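As an illustration of how a per-model confidence score might map onto the evidence labels used in this report, here is a minimal sketch. The thresholds are assumptions chosen only to be consistent with the scores shown below (95% appears as Substantial Evidence, 25% as Little Evidence); they are not TrueMedia.org's actual cut-offs, and the aggregation across multiple detectors may work differently.

```python
# Illustrative only: map a 0-100 confidence score to a coarse evidence label.
# Threshold values are assumptions, not TrueMedia.org's published cut-offs.

def evidence_label(confidence: float) -> str:
    """Return a coarse evidence label for a detector confidence score."""
    if confidence >= 75:
        return "Substantial Evidence"
    if confidence >= 50:
        return "Uncertain"
    return "Little Evidence"

print(evidence_label(95))  # Substantial Evidence
print(evidence_label(25))  # Little Evidence
```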
Visual Noise (1 analysis)
Variations in pixels and color
Substantial Evidence
95% confidence
Diffusion-Generated Image Detection
Evaluates the discrepancy between the original image and a version of it reconstructed by pre-trained diffusion models. Such models tend to reproduce the visual noise associated with the diffusion process, so images that were themselves diffusion-generated are typically reconstructed with a smaller discrepancy.
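A minimal sketch of how such a reconstruction-discrepancy check could look. The reconstruct_with_diffusion helper is a hypothetical stand-in for running the image through a pre-trained diffusion model, and the threshold is illustrative; this is not TrueMedia.org's implementation.

```python
# Sketch of reconstruction-discrepancy scoring in the spirit of the description
# above. `reconstruct_with_diffusion` is a placeholder (assumption), standing in
# for inverting the image to noise and re-generating it with a pre-trained
# diffusion model.
import numpy as np
from PIL import Image


def reconstruct_with_diffusion(image: np.ndarray) -> np.ndarray:
    """Placeholder: return the image as reconstructed by a pre-trained
    diffusion model."""
    raise NotImplementedError("plug in a diffusion reconstruction here")


def reconstruction_discrepancy(path: str) -> float:
    """Mean absolute pixel difference between the image and its reconstruction."""
    original = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    reconstructed = reconstruct_with_diffusion(original)
    return float(np.mean(np.abs(original - reconstructed)))


# Diffusion-generated images tend to be reconstructed more faithfully, so a
# *small* discrepancy counts as evidence of diffusion generation.
# score = reconstruction_discrepancy("suspect.webp")
# likely_diffusion_generated = score < 0.05  # illustrative threshold
```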
Generative AI (4 analyses)
Detects signatures of GenAI tools
Little Evidence
25% confidence
Image Generator Analysis
Analyzes the image for indications that it was generated by popular AI image generators such as Midjourney, DALL·E, Stable Diffusion, and thispersondoesnotexist.com.
Little Evidence
AI Generated Image Detection
The model was trained on a large dataset comprising millions of artificially generated images and human-created images such as photographs, digital and traditional art, and memes sourced from across the web.
Little Evidence
AI Generated Image Detection
Detects photo-realistic AI-generated images created, for example, by Generative Adversarial Networks and diffusion models such as Stable Diffusion, Midjourney, and DALL·E 2.
Little Evidence
Multi-Modal AI
Analyzes the image for indications it was generated by AI or otherwise digitally manipulated.
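The classifier-based detectors above follow a common pattern: a model trained on AI-generated and human-created images assigns a label and score to the submitted file. A minimal sketch of that pattern using an off-the-shelf image-classification pipeline is shown below; the model id is a placeholder, and the scores it returns are unrelated to the figures in this report.

```python
# Sketch of classifier-based detection. The model id is a placeholder
# (assumption), not any vendor's actual model used by TrueMedia.org.
from transformers import pipeline

detector = pipeline(
    "image-classification",
    model="your-org/ai-image-detector",  # placeholder model id (assumption)
)

# Run the classifier on the suspect image and print label/score pairs,
# e.g. [{"label": "artificial", "score": 0.93}, {"label": "human", "score": 0.07}]
results = detector("suspect.webp")
for prediction in results:
    print(f"{prediction['label']}: {prediction['score']:.0%}")
```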
Details
File type and size: .webp, 79KB
Processing time: > 90m
Analyzed on: Mon, Mar 11, 2024