Is this real?
TrueMedia.org verdict: substantial evidence of manipulation.
Verified by human analyst
Analysis
Results
Generative AI: Substantial Evidence
Visual Noise: Uncertain
Disclaimer: TrueMedia.org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
Generative AI (4 detectors)
Detects signatures of GenAI tools
AI Generated Image Detection: Substantial Evidence (100% confidence)
Likely source: MidJourney
The model was trained on a large dataset comprising millions of artificially generated images and human-created images such as photographs, digital and traditional art, and memes sourced from across the web.
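As a rough illustration of this kind of detector, the sketch below fine-tunes a standard image backbone on a labeled set of generated and human-created images. The directory layout, backbone, and hyperparameters are assumptions for illustration, not TrueMedia.org's actual pipeline.

```python
# Minimal sketch of a binary "AI-generated vs. human-created" image classifier.
# Dataset layout, backbone, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: train/human/*.jpg and train/generated/*.png
train_set = datasets.ImageFolder("train", transform=transform)
loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=4)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: human vs. generated
model = model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```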
Universal Fake Detector Analysis: Substantial Evidence (95% confidence)
Using the feature space of a large, pretrained vision-language model, this detector analyzes images to determine whether they were generated by any of a variety of popular generative and autoregressive models.
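The core idea, feature-space detection, can be sketched as follows: embed images with a frozen pretrained vision-language model and fit a lightweight classifier on the embeddings. CLIP is assumed as the embedding model here, and the checkpoint, example file names, and classifier choice are assumptions, not the vendor's actual configuration.

```python
# Sketch of a feature-space fake detector: frozen CLIP embeddings plus a
# lightweight linear classifier. Checkpoint and data paths are hypothetical.
import torch
from PIL import Image
from sklearn.linear_model import LogisticRegression
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-large-patch14")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-large-patch14")
model.eval()

def embed(paths):
    """Return frozen CLIP image embeddings for a list of image paths."""
    images = [Image.open(p).convert("RGB") for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats.numpy()

# Hypothetical labeled examples: 1 = AI-generated, 0 = human-created.
train_paths = ["gen_0.png", "gen_1.png", "real_0.jpg", "real_1.jpg"]
train_labels = [1, 1, 0, 0]

clf = LogisticRegression(max_iter=1000).fit(embed(train_paths), train_labels)

# Probability that a new image is generated.
print(clf.predict_proba(embed(["query.jpg"]))[:, 1])
```

Keeping the embedding model frozen and training only the small classifier is what lets this style of detector generalize across many image generators rather than overfitting to one.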
AI Generated Image Detection: Substantial Evidence (76% confidence)
Detects AI-generated photo-realistic images created, for example, by Generative Adversarial Networks and diffusion models such as Stable Diffusion, MidJourney, and DALL·E 2, among others.
Image Generator Analysis: Little Evidence (24% confidence)
Analyzes the image for indications that it was generated by a popular AI image generator such as MidJourney, DALL·E, Stable Diffusion, or thispersondoesnotexist.com.
Visual Noise (1 detector)
Variations in pixels and color
Diffusion-Generated Image Detection: Substantial Evidence (51% confidence)
Evaluates the discrepancy between the original image and a version reconstructed by pre-trained diffusion models. Because such models tend to capture the visual noise characteristic of the diffusion process, images that were themselves diffusion-generated are typically reconstructed more faithfully than real photographs.
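A rough sketch of the reconstruction-discrepancy idea: lightly noise the image, let a pre-trained diffusion model denoise it back, and measure how much it changes. The pipeline, checkpoint, strength, and the interpretation of the error value below are assumptions for illustration; published detectors of this type (e.g. DIRE) use an exact DDIM inversion rather than this img2img approximation.

```python
# Approximate reconstruction-error check with a pre-trained diffusion model.
# Checkpoint, strength, and thresholds are illustrative assumptions.
import numpy as np
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = Image.open("query.jpg").convert("RGB").resize((512, 512))

# Lightly noise the image and let the diffusion model denoise it back.
recon = pipe(prompt="", image=image, strength=0.25, guidance_scale=1.0).images[0]

# Mean absolute pixel discrepancy between original and reconstruction.
error = np.abs(
    np.asarray(image, dtype=np.float32) - np.asarray(recon, dtype=np.float32)
).mean()
print(f"reconstruction error: {error:.2f}")
# Lower error means the diffusion model reproduces the image easily, which is
# (weak) evidence the image was diffusion-generated; any cutoff is an assumption.
```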
Details
File type and size: .jpeg, 41KB
Processing time: > 90m
Analyzed on: Fri, Mar 1, 2024