Can You Trust What You See? The Rise of Deepfakes and What It Means for Justice


Copyright 2026, Steve Burgess

I’ve been working with digital evidence since 1985, and I’ve seen a lot of changes. Back then, the biggest challenge was recovering data from a 10 MB hard drive—yes, megabytes, for you youngsters who think gigabytes are small. But nothing compares to what’s happening right now with artificial intelligence.

Here’s something that should concern everyone: we’ve reached a point where anyone with a smartphone can create fake videos, photos, and audio recordings that look and sound completely real. Not “pretty good” fakes that experts can spot. I’m talking about fakes so convincing that even professionals like me have trouble telling the difference. And trust me, after 40 years of staring at computer screens, my ability to spot weird pixels is legendary.

This technology is called “deepfake” AI, and it’s already being used in courtrooms, whether judges and juries realize it or not.

What Are Deepfakes?

Think of deepfakes as Photoshop on steroids. Photoshop that went to Gold’s Gym, got a PhD in computer science from the University of Phoenix, and decided to cause chaos. But instead of just touching up a photo, this technology can:

  • Put your face on someone else’s body in a video
  • Make you appear to say things you never said
  • Create recordings of your voice saying anything
  • Generate completely fake images that look like real photographs

The scary part? This used to require expensive equipment and expert knowledge. Now there are apps that can do it on your phone in minutes. The same phone you use to watch cat videos and argue with strangers on the Internet.

Why This Matters in Court

Imagine you’re on a jury. The prosecutor shows you a video of the defendant at the crime scene. It looks real. It sounds real. The metadata (the hidden information in the file) says it was recorded at the right time and place.

But what if that video was created by AI? What if the defendant was never actually there?

Or flip it around: what if you’re accused of something, and there’s a fake video showing you doing it? How do you prove it’s not real when it looks so convincing?

This isn’t hypothetical. These situations are already happening in courtrooms. Welcome to the future, where “the camera doesn’t lie” is officially retired as a saying.

The Old Rules Don’t Work Anymore

For decades, we’ve had pretty good ways to tell if photos and videos were tampered with. We could look at the file’s internal structure, check when and where it was created, and spot the digital fingerprints left behind when someone edits an image.
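For the curious, here’s roughly what one of those old-school checks looks like. This is a minimal Python sketch using the Pillow imaging library to dump a photo’s embedded metadata; the file name is made up, and a real exam involves far more than this, but it shows the general idea.

from PIL import Image
from PIL.ExifTags import TAGS

def dump_exif(path):
    """Print the EXIF metadata embedded in an image file, if there is any."""
    exif = Image.open(path).getexif()
    if not exif:
        print("No EXIF metadata found, which is itself worth noting for a 'camera original'.")
        return
    for tag_id, value in exif.items():
        name = TAGS.get(tag_id, tag_id)  # translate numeric tag IDs into readable names
        print(f"{name}: {value}")

dump_exif("suspect_photo.jpg")  # placeholder file; expect fields like DateTime, Make, Model, Software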

But AI-generated fakes don’t leave those fingerprints. They’re not edited versions of real photos or videos—they’re created from scratch. There’s nothing to compare them to because there was never an original.

It’s like trying to prove a painting is a forgery when there’s no original painting to compare it to.

What’s Being Done About It

The good news is that courts are starting to take this seriously.

In 2025, new federal rules were put in place that require stricter verification of digital evidence that might involve AI. Judges can now demand that experts test evidence with multiple detection tools before it’s allowed in court.

Some tech companies are also developing authentication systems. Think of it like a tamper-proof seal on a medicine bottle, but for digital files. These systems create a kind of digital signature the moment a photo or video is captured, proving it’s authentic.
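To make that less abstract, here’s a toy Python sketch of the “seal” idea. It’s not how any particular vendor’s system actually works, and the hash value and file name below are invented, but it shows the basic move: record a fingerprint of the file when it’s created, then check later whether the file still matches.

import hashlib

def fingerprint(path):
    """Return the SHA-256 fingerprint (hash) of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Invented example values: the fingerprint supposedly recorded at capture,
# and the file that later shows up as evidence.
recorded_at_capture = "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
received_as_evidence = fingerprint("exhibit_12_video.mp4")  # placeholder file name

if received_as_evidence == recorded_at_capture:
    print("Seal intact: the file matches what was recorded at capture.")
else:
    print("The file has changed since capture (or was never sealed to begin with).")

As I understand it, the systems being developed go a step further: the fingerprint gets cryptographically signed by the capturing device itself, so you can tell not just whether the file has changed, but whose camera vouched for it in the first place.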

The problem? Most phones and cameras don’t have this technology yet, and it could be years before it becomes standard. So, we’re in that awkward phase where the bad guys have a bunch of cool toys and the good guys are still waiting for their equipment to ship.

How Do Experts Catch Fakes?

I wish I could tell you we have a foolproof method, but we don’t. If I had one, I’d be drinking mojitos on a beach somewhere instead of writing this article. What we do is use multiple approaches:

Run detection software. There are programs specifically designed to spot AI-generated content. They’re not perfect, but they can catch many fakes.

Look for impossible things. Sometimes AI makes mistakes—lighting that doesn’t match, shadows falling the wrong way, people with eleven fingers, or physics that just doesn’t work in the real world. AI is smart, but it still occasionally forgets how gravity works.

Check the story. Where did this evidence come from? Who had access to it? Does it make sense that this file exists?

Compare with other evidence. If you have ten photos from an event and nine look normal but one looks suspicious, that’s a red flag.
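Here’s a rough Python sketch of that last idea: pull a couple of metadata fields from every photo in a folder and flag the odd one out. The folder name is a placeholder and real comparisons go far deeper than three EXIF fields, but this is the flavor of it.

from collections import Counter
from pathlib import Path
from PIL import Image
from PIL.ExifTags import TAGS

def camera_info(path):
    """Return a few EXIF fields that ought to be consistent across one device's photos."""
    exif = Image.open(path).getexif()
    named = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return (named.get("Make"), named.get("Model"), named.get("Software"))

# "event_photos" is a placeholder folder of JPEGs supposedly shot on the same phone.
infos = {p.name: camera_info(p) for p in sorted(Path("event_photos").glob("*.jpg"))}

if infos:
    typical = Counter(infos.values()).most_common(1)[0][0]  # what most of the set reports
    for name, info in infos.items():
        if info != typical:
            print(f"Red flag: {name} reports {info}; the rest report {typical}")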

The truth is, detecting deepfakes is getting harder as the generation technology improves. Detection tools are getting better too, but it’s an arms race, and right now the fakers are ahead. It’s like playing whack-a-mole, except the moles have PhDs and keep getting smarter.

What This Means for You

You might be thinking, “I’m not a lawyer or a forensics expert. Why should I care?”

Here’s why: we’re all going to be affected by this technology.

Maybe you’ll serve on a jury and need to decide if evidence is real. Maybe you’ll see a video of a public figure saying something outrageous (that they never actually said). Maybe someone will use this technology to fake evidence against you or someone you love. Or maybe you’ll just want to know if that video of your uncle “dancing” at the wedding is real or if someone’s playing a prank.

We’re entering an era where “seeing is believing” is no longer true. That’s a fundamental shift in how we understand reality and truth.

What You Can Do

Be skeptical. Just because a video looks real doesn’t mean it is. This is especially important for dramatic claims or shocking content. If it seems too crazy to be true, it might be—but then again, it’s 2026, so who knows anymore.

Check sources. Before believing or sharing something, try to verify where it came from. Is it from a credible source?

Understand the stakes. In legal situations—whether it’s a court case, an insurance claim, or a business dispute—insist on proper verification of digital evidence.

Stay informed. This technology is evolving rapidly. What’s true today might be outdated in six months.

Looking to the Future

I’ve been doing this work for over 40 years, and I’ve learned that technology always creates new challenges. But we adapt. We develop new tools, new methods, new ways of finding the truth. It’s what keeps forensic experts like me from having to learn how to do something sensible, like accounting, or racing cars.

Courts are updating their rules. Researchers are building better detection tools. Tech companies are working on authentication systems. It’s going to take time, but we’ll figure this out.

In the meantime, we all need to be more careful about what we believe and more demanding about proof. The old saying “don’t believe everything you see on the Internet” has never been more important.

Because in 2026, sometimes you can’t believe your own lyin’ eyes. Which is a little ironic, because your eyes aren’t the ones lyin’ – it’s the computers. But try explaining that to a jury.

Steve Burgess has been working in digital forensics since 1985 and has examined evidence in over 20,000 cases. If you have questions about digital evidence or think you might be dealing with AI-generated content, Burgess Forensics can help. We also promise not to create any deepfakes of you. That’s in our mission statement.
steve@burgessforensics.com     

 
