Is this real?
TrueMedia.Org detected substantial evidence of manipulation.
Analysis Results
Faces (generation or manipulation of faces): Highly Suspicious
Voices (voice cloning or generation)
Disclaimer: TrueMedia.Org uses both leading vendors and state-of-the-art academic AI methods. However, errors can occur.
Faces (5 detectors)
Generation or manipulation of faces
Generative Convolutional Vision Transformer: Highly Suspicious (86% confidence)
Analyzes video for visual artifacts and uses the latent data distribution to detect AI manipulation.
Face Manipulation: Highly Suspicious (74% confidence)
Detects potential AI manipulation of faces in images and videos, such as face swaps and face reenactment.
Face Blending: Highly Suspicious (67% confidence)
Face blending is a process in which AI combines two or more images to create a new composite facial image.
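As a rough illustration of what face blending produces (not TrueMedia's detector, and assuming the input faces are already landmark-aligned, which real pipelines handle first), a composite can be sketched as a weighted per-pixel average of two face images:

```python
import numpy as np

def blend_faces(face_a, face_b, alpha=0.5):
    """Create a composite face as a weighted per-pixel average of two
    aligned face images. Toy sketch: real face-blending first aligns
    facial landmarks; here the inputs are assumed pre-aligned and the
    same shape."""
    if face_a.shape != face_b.shape:
        raise ValueError("faces must be aligned to the same shape")
    # Blend in float space to avoid uint8 overflow, then convert back.
    blended = alpha * face_a.astype(np.float64) + (1.0 - alpha) * face_b.astype(np.float64)
    return blended.astype(np.uint8)
```

Detectors in this category look for the statistical artifacts such averaging leaves behind, e.g. softened texture and inconsistent noise patterns across the blended region.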
Universal: Highly Suspicious (63% confidence)
Distinguishes real faces from deepfake faces.
Deepfake Face Detection: Little Evidence (14% confidence)
Visual detection models localize an object of interest in a video by returning a bounding box around that object. Any faces found in the query are then passed through an additional classification step to determine whether or not they are deepfakes.
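The localize-then-classify pipeline described above can be sketched as follows. Both `detect_faces` and `classify_deepfake` are hypothetical stand-ins for trained models, not TrueMedia's actual components:

```python
import numpy as np

def detect_faces(frame):
    # Stand-in localizer: returns bounding boxes as (x, y, w, h) tuples.
    # A real system would run an object-detection model on the frame.
    h, w = frame.shape[:2]
    return [(0, 0, w // 2, h // 2)]  # hypothetical single detection

def classify_deepfake(face_crop):
    # Stand-in classifier: returns a deepfake score in [0, 1].
    # A real system would run a trained network on the cropped face.
    return float(face_crop.mean() / 255.0)

def analyze_frame(frame):
    """Two-stage pipeline: localize faces, then classify each crop."""
    results = []
    for (x, y, w, h) in detect_faces(frame):
        crop = frame[y:y + h, x:x + w]
        results.append({"box": (x, y, w, h),
                        "deepfake_score": classify_deepfake(crop)})
    return results
```

The design keeps detection and classification separate, so either stage can be swapped for a stronger model without changing the other.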
Voices (3 detectors)
Voice cloning or generation
AI Generated Audio Detection: Little Evidence (2% confidence)
The model detects AI-generated audio.
Foundational Features: Unknown
Detects AI-synthesized audio.
Advanced Foundational Features: Unknown
Detects AI-synthesized audio using a larger foundation model trained on more diverse data than the foundational audio model.
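The report maps each detector's confidence score to a label such as "Highly Suspicious" or "Little Evidence". TrueMedia's actual cut-offs are not published here; the thresholds below are an assumption inferred only from the scores visible in this report (63% and above shown as Highly Suspicious, 14% and below as Little Evidence, detectors without a score as Unknown):

```python
def verdict_label(confidence):
    """Map a detector confidence in [0, 1] to a report label.
    The 0.5 threshold is a hypothetical guess consistent with this
    report's scores, not TrueMedia's documented cut-off."""
    if confidence is None:
        return "Unknown"  # detector produced no score
    if confidence >= 0.5:
        return "Highly Suspicious"
    return "Little Evidence"
```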