Five Signs to Watch for When It Comes to Deepfakes

October 2, 2025

Imagine you're at your desk when a coworker video calls you. They're not the type to reach out unexpectedly, but you answer, just in case it's something important.

Right away, something feels off. They're sitting in a dimly lit room, their voice sounds slightly different, they're not responding to your humor, and they ask you to urgently send over important company documents. Without questioning the request, you send the files. The call ends abruptly.

Unfortunately, you've just fallen victim to a deepfake scam, a growing cyber threat that every housing organization should understand and be prepared to respond to. Keep reading to learn more about this evolving form of cyberattack and practical tips your agency can use to protect employees and sensitive data.

What is a deepfake?

A deepfake is a type of synthetic media that uses artificial intelligence to alter someone’s appearance, voice, or actions in a way that looks real. This technology is advancing rapidly, and cybercriminals are already using it to impersonate trusted colleagues or leaders in order to trick people into revealing sensitive information or transferring funds.

As Mike Konopka, Manager of Information Security at HAI Group, explains: “Deepfake technology has already developed to an advanced stage and soon may not be detectable. In a virtual setting, it’s more important than ever to be alert to context clues and ask, ‘What is being asked of me, and why?’ Our own humanity is the final line of defense.”

While spotting manipulated media is becoming more difficult, there are still some common signs that may give it away. Here are five to watch for:

1. Unnatural facial expressions or gestures

One of the biggest giveaways is movement that feels slightly off. Look out for expressions that do not match the tone of voice, facial movements that seem too stiff or exaggerated, or gestures that appear out of sync with what the person is saying. These small inconsistencies can signal that you are not looking at a genuine video.

2. Disproportionate head and body composition

Artificial intelligence often struggles with proportion. A head that looks too large or small compared to the body, shoulders that do not quite align, or an awkward posture may indicate the video has been manipulated. If something about the person’s physical appearance does not look natural, take a closer look before acting on the message.

3. Mismatched voice and lip movement

Even advanced forgeries can have trouble perfectly syncing audio and visual elements. If the speaker's lip movements do not line up with their words, or the voice sounds slightly robotic or unnatural, it could be a sign of tampering. Pay close attention to the speaker and check that what you hear matches what you see.

4. Inconsistent lighting, image quality, or complexion

Lighting and skin tone are some of the hardest elements for AI to perfect. Watch for shadows that do not align correctly, patchy or blurred skin tones, or sudden changes in image quality. These visual glitches often suggest that the video has been artificially created.

5. Video motion “glitches”

Finally, look out for unexpected glitches in video motion. A person’s face may blur briefly, parts of the image may flicker, or the background may shift unnaturally. These technical flaws can appear for only a moment, but they are a strong clue that what you see is not authentic.

Bonus: Watch HAI Group's deepfake awareness video

HAI Group Online Training created a short scenario to help raise awareness about the risks of deepfakes in professional settings. As you'll see, even a friendly gesture can become dangerous. This video, also available on our YouTube channel, is designed to help housing professionals understand the risks of deepfakes and think twice before sharing sensitive data.

Staying vigilant

These kinds of scams are designed to manipulate and often use urgency or emotional triggers to push you into quick action. If you receive a request that feels rushed, unusual, or guilt-inducing, pause before responding. Verify the request through another trusted channel, such as a phone call or direct message.

By staying alert to these signs and asking yourself, “What is being asked of me, and why?” you can help protect yourself and your organization from falling victim to deepfake scams.

To increase awareness, HAI Group has created a dedicated Cybersecurity Center that provides resources, training, and tools to help housing organizations strengthen their defenses.


This article is for general information only. HAI Group® makes no representation or warranty about the accuracy or applicability of this information for any particular use or circumstance. Your use of this information is at your own discretion and risk. HAI Group® and any author or contributor identified herein assume no responsibility for your use of this information. You should consult with your attorney or subject matter advisor before adopting any risk management strategy or policy. 

HAI Group® is a marketing name used to refer to insurers, a producer, and related service providers affiliated through a common mission, management, and governance. Property-casualty insurance and related services are written or provided by Housing Authority Property Insurance, A Mutual Company; Housing Enterprise Insurance Company, Inc.; Housing Specialty Insurance Company, Inc.; Housing Investment Group, Inc.; and Housing Insurance Services (DBA Housing Insurance Agency Services in NY and MI).
