Is this real?
TrueMedia.Org detected substantial evidence of manipulation.
File type and size
AI tools queried
Partners and in-house
Processing time
Time to analyze media
21m 57s
First analyzed on
Thu, Feb 29, 2024
Disclaimer: TrueMedia.Org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
AI manipulation analysis
Highly Suspicious
97% confidence
Advanced Foundational Features
Detects AI-synthesized audio using a larger foundation model trained on more diverse data than the Foundational Features audio model.
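A foundation-model audio detector like the one described is typically structured as a large pretrained encoder that maps the waveform to an embedding, followed by a small classifier head that maps the embedding to a synthetic-speech probability. The sketch below is illustrative only, not TrueMedia's implementation: the encoder is stubbed with trivial waveform statistics, and the head weights are made-up placeholders.

```python
import math

def encode_audio(samples):
    """Stand-in for a pretrained audio encoder: summarizes the waveform
    into a tiny fixed-size "embedding" (mean, energy, zero-crossing rate).
    A real foundation model would output a learned high-dimensional vector."""
    n = len(samples)
    mean = sum(samples) / n
    energy = sum(s * s for s in samples) / n
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / n
    return [mean, energy, crossings]

def classifier_head(embedding, weights, bias):
    """Logistic-regression head: embedding -> P(audio is AI-synthesized)."""
    z = sum(w * e for w, e in zip(weights, embedding)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def detect_synthetic_audio(samples, weights=(0.0, 2.0, -1.0), bias=-0.5):
    # The weights here are arbitrary placeholders; in practice the head
    # is trained on labeled real and synthetic speech.
    return classifier_head(encode_audio(samples), weights, bias)
```

The design reflects why diverse training data matters: the encoder is frozen and generic, so the coverage of the head's training set largely determines which synthesis methods the detector catches.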
Highly Suspicious
88% confidence
Face Manipulation
Detects potential AI manipulation of faces present in images and videos, as in the case of face swaps and face reenactment.
Highly Suspicious
81% confidence
Face Blending
Detects face blending, a process in which AI combines two or more images into a new composite facial image.
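The composite operation at the heart of face blending can be sketched as a per-pixel weighted average of two aligned face images. This is a minimal illustration of the idea only; production pipelines align facial landmarks first and usually use generative models rather than raw pixel averaging.

```python
def blend_faces(img_a, img_b, alpha=0.5):
    """Naive composite of two same-sized grayscale images (lists of rows):
    each output pixel is alpha * a + (1 - alpha) * b."""
    return [
        [alpha * a + (1 - alpha) * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]
```

Detectors target the statistical traces such compositing leaves behind, e.g. inconsistent noise or frequency patterns between blended regions.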
Highly Suspicious
80% confidence
Deepfake Face Detection
Visual detection models localize an object of interest in a video by returning a box that bounds that object. Each face found in the query is then passed through an additional classification step to determine whether it is a deepfake.
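The two-stage pipeline described above (localize faces with bounding boxes, then classify each face crop) can be sketched as follows. All names and both stage implementations are hypothetical stand-ins, not TrueMedia's API: the detector returns fixed boxes and the classifier scores by mean intensity, purely to show the control flow.

```python
from dataclasses import dataclass, field

@dataclass
class FaceBox:
    x: int
    y: int
    w: int
    h: int
    fake_score: float = field(default=0.0)  # filled in by stage 2

def detect_faces(frame):
    """Stage 1: localize faces. Stubbed with fixed boxes; a real system
    would run a trained face-detection model here."""
    return [FaceBox(10, 10, 64, 64), FaceBox(120, 40, 64, 64)]

def classify_face(frame, box):
    """Stage 2: score a cropped face as deepfake (1.0) vs real (0.0).
    Stubbed with a mean-intensity heuristic; a real system would run a
    trained deepfake classifier on the crop."""
    crop = [row[box.x:box.x + box.w] for row in frame[box.y:box.y + box.h]]
    total = sum(sum(row) for row in crop)
    count = sum(len(row) for row in crop)
    return min(1.0, (total / max(1, count)) / 255.0)

def analyze_frame(frame):
    """Run both stages on one frame (a list of pixel rows)."""
    boxes = detect_faces(frame)
    for box in boxes:
        box.fake_score = classify_face(frame, box)
    return boxes
```

Splitting detection from classification lets the second stage focus its capacity on face regions instead of whole frames, which is why this structure is common in deepfake-detection systems.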
Highly Suspicious
72% confidence
Foundational Features
Detects AI-synthesized audio.
Little Evidence
29% confidence
Generative Convolutional Vision Transformer
Analyzes video for visual artifacts and uses latent data distribution to detect AI manipulation.
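One common way to use a latent data distribution for artifact detection, as this description suggests, is to compare the query's features against a reference distribution fitted on real media and flag large deviations. The sketch below is an assumption-laden simplification: patch mean intensities stand in for the transformer's learned patch embeddings, and a plain z-score stands in for a proper distributional distance.

```python
def patch_means(frame, patch=8):
    """Split a frame (list of pixel rows) into patch x patch blocks and
    return each block's mean intensity -- a crude stand-in for learned
    patch embeddings from a vision transformer."""
    feats = []
    for y in range(0, len(frame), patch):
        rows = frame[y:y + patch]
        for x in range(0, len(frame[0]), patch):
            vals = [v for row in rows for v in row[x:x + patch]]
            feats.append(sum(vals) / len(vals))
    return feats

def artifact_score(frame, ref_mean, ref_std, patch=8):
    """Average absolute z-score of the query's patch features against a
    reference ("real") latent distribution; higher values suggest
    synthesis artifacts. ref_mean/ref_std would be fitted on real data."""
    feats = patch_means(frame, patch)
    return sum(abs(f - ref_mean) / ref_std for f in feats) / len(feats)
```

A frame whose patch statistics match the reference scores near zero, while a frame containing an anomalous region scores higher; the real model performs this comparison in a learned latent space rather than on raw intensities.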
Little Evidence
12% confidence
Distinguishes real faces from deepfake faces.