The Consequences of Artificial Intelligence Deepfakes
The dangers of deepfake technology and its impact on society
Long-time readers of this Substack know that artificial intelligence (AI) is a topic that both fascinates and scares me. On one hand, it's a fantastic resource that can help make our lives easier. But on the other hand, it can also be used to hurt others.
This problem isn't unique to AI, of course. Technology is neutral. It all comes down to how it's used. For example, let’s look at social media.
On the one hand, social media can connect like-minded people from around the world who would otherwise never cross paths. On the other hand, it can also be used to bully people.
Instead of writing letters and sending them through the mail, we can now instantly communicate with whoever we want, no matter where we are. The downside is that we are inundated with messages all day, every day, which can get overwhelming at times.
What concerns me about AI is its potential to create realistic deepfakes, such as videos, recordings, or pictures of people doing and saying things that never happened. This can lead to severe embarrassment, shattered trust in relationships, or even the complete destruction of careers.
Deepfakes have been around for a while, but they have often been used for comedic purposes or to show what could have been in a movie or TV show. One of my favorite deepfakes was in the season 2 finale of The Mandalorian.
The last scene shows Luke Skywalker, but they used deepfake technology to make Mark Hamill look and sound younger, as though the scene had been shot around the time the original trilogy was released.
Before his death, James Earl Jones signed over the rights to his Darth Vader voice so Lucasfilm could use AI to recreate it for future Star Wars projects. I wouldn't be surprised if more actors and actresses start doing the same.
While it may be cool to see Mark Hamill "de-aged" or an exciting prospect to think that James Earl Jones' legacy as Darth Vader will continue even after he's gone, deepfake technology powered by AI is another double-edged sword that can and will be used for nefarious purposes. For instance, it can be used to create fake news, manipulate politics, or even frame innocent individuals for crimes they didn't commit.
A high school principal in Baltimore found this out the hard way when he was framed for saying something he didn't say. The deepfake audio was so convincing that it led to his suspension and a public outcry before the truth was revealed.
CNN has the story: "A school principal faced threats after being accused of offensive language on a recording. Now police say it was a deepfake."
"The recording went viral in January, provoking rage in suburban Baltimore. It seemed that Pikesville High School Principal Eric Eiswert had been caught making racist and antisemitic comments. Angry phone calls overwhelmed the front desk. Employees felt afraid. Security was tightened.
Eiswert was placed on administrative leave pending an investigation. He received various threats of violence. A police report said one person told him the “world would be a better place if you were on the other side of the dirt.”
All along, Eiswert denied making the offensive remarks. He said that wasn’t his voice on the recording. He believed it was an AI deepfake. And on Thursday, law-enforcement authorities announced they believe he was right.
The recording was indeed a fake, according to the Baltimore County Police Department. And the man accused of making it — a school employee who had clashed with the principal — was arrested on charges that included disturbing the operation of a school.
“Today, we are relieved to have some closure on the origins of this audio,” Baltimore County Executive Johnny Olszewski said at a news conference on Thursday that hinted at the disturbing possibilities of artificial intelligence. “However, it is clear that we are also entering a new, deeply concerning frontier.”
This story is a reminder of the uncharted territory we find ourselves in. We are entering a period where AI can create realistic recordings, photos, and even videos, yet there are few legal guardrails specific to the technology.
It’s true that the man who allegedly created and distributed the fake recording of the principal was charged. What he allegedly did was immoral and defamatory. But as far as I can tell, creating a deepfake for nefarious purposes isn't itself a crime yet, which is why he wasn't charged for using AI; the charges covered the harm it caused, like disturbing the operation of a school. The law hasn’t caught up to the technology, and I hope it does before things get out of hand.
Technology is wonderful. I couldn't produce this Substack efficiently and cheaply, or work an office job, without modern computers and Internet access. Throughout history, new technologies have propelled our society forward.
But nearly every new technology we invent also gets used in harmful ways. It feels like we are on the brink of a 1984-style dystopia, where you can’t believe what you see or hear anymore.
“The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.”
What do you think about AI-generated deepfakes?
Let me know in the comments below👇
About Dad Think
My name is Joe. I’m a husband, dad, writer, and podcaster. If you're new here, welcome! If you are a returning reader, it's great to have you back!
Dad Think is my personal corner of the internet. Here, I explore how technology is reshaping society and culture from a father’s point of view—and try to make sense of it all. From artificial intelligence to social media and digital life, I break down how the tools shaping our world affect parents, families, and future generations. And sometimes, I also share funny and meaningful moments of raising two young kids.
Subscribe to this newsletter and follow me on X, Facebook, and Instagram for more.
Read the Latest From Dad Think with Joe Ludwig
If you enjoyed reading this, then check out my other work.
Technology
Society and Culture
Parenting Stories
Thank you so much for reading!
Until next time,
Joe