TrueMedia.org

Is this real?
TrueMedia.org detected substantial evidence of manipulation.
Analysis Results
Faces: Highly Suspicious
Voices: Little Evidence
Disclaimer: TrueMedia.org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
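The qualitative verdicts above ("Highly Suspicious", "Little Evidence") appear to correspond to bands of detector confidence. A minimal sketch of such a mapping, assuming illustrative thresholds; the actual cutoffs are not published:

```python
# Map a 0-100 confidence score to a qualitative verdict label.
# The threshold values here are assumptions for illustration only.

def verdict(confidence):
    """Return a label for a detector confidence score (0-100)."""
    if confidence >= 60:
        return "Highly Suspicious"
    if confidence >= 30:
        return "Uncertain"
    return "Little Evidence"

print(verdict(86))  # Highly Suspicious
print(verdict(2))   # Little Evidence
```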
Faces (5 detectors)
Generation or manipulation of faces
Generative Convolutional Vision Transformer: Highly Suspicious (86% confidence)
Analyzes video for visual artifacts and uses the latent data distribution to detect AI manipulation.
Face Manipulation: Highly Suspicious (74% confidence)
Detects potential AI manipulation of faces in images and videos, such as face swaps and face reenactment.
Face Blending: Highly Suspicious (67% confidence)
Face blending is a process in which AI combines two or more images into a new composite facial image.
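The compositing idea behind face blending can be illustrated as a pixel-wise convex combination of two images. Real blending systems operate in a learned latent space rather than on raw pixels, so this is only a toy sketch:

```python
# Toy sketch of face blending: a convex combination of two equally
# sized grayscale images, represented as lists of pixel rows.
# Production systems blend in a model's latent space, not raw pixels.

def blend_faces(face_a, face_b, alpha=0.5):
    """Return alpha * face_a + (1 - alpha) * face_b, pixel by pixel."""
    return [
        [alpha * a + (1 - alpha) * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(face_a, face_b)
    ]

a = [[0, 255], [255, 0]]
b = [[255, 0], [0, 255]]
print(blend_faces(a, b))  # every pixel becomes the 50/50 average, 127.5
```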
Universal: Highly Suspicious (63% confidence)
Distinguishes real faces from deepfake faces.
Deepfake Face Detection: Little Evidence (14% confidence)
Visual detection models localize an object of interest in a video by returning a bounding box around it. Any faces found in the query are then passed through an additional classification step to determine whether they are deepfakes.
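The detect-then-classify pipeline described above can be sketched as follows. Both models here are hypothetical stubs standing in for trained networks; the bounding box and score values are placeholders:

```python
# Two-stage sketch: a face detector returns bounding boxes, and each
# face crop is passed to a deepfake classifier. Both stages are stubs;
# a real system would plug in trained detection and classification models.
from dataclasses import dataclass

@dataclass
class Box:
    x: int
    y: int
    w: int
    h: int

def detect_faces(frame):
    """Stub detector: pretend the frame contains one face."""
    return [Box(x=10, y=10, w=64, h=64)]  # placeholder box

def classify_crop(frame, box):
    """Stub classifier: return a deepfake probability for the crop."""
    return 0.86  # placeholder score

def analyze_frame(frame):
    """Run detection, then classify every detected face."""
    return [(box, classify_crop(frame, box)) for box in detect_faces(frame)]

for box, score in analyze_frame(frame="frame-0"):
    print(box, f"deepfake probability {score:.0%}")
```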
Voices (3 detectors)
Voice cloning or generation
AI Generated Audio Detection: Little Evidence (2% confidence)
Detects AI-generated audio.
Foundational Features: Not Applicable
Detects AI-synthesized audio.
Advanced Foundational Features: Not Applicable
Detects AI-synthesized audio using a larger foundation model trained on more diverse data than the foundational audio model.
Details
File type and size: .mp4, 920KB
Processing time: 3m 22s
Analyzed on: Mon, Mar 18, 2024