
‘An endless battle’: How real estate pros can stay ahead of deepfake fraudsters

April 10, 2026 at 3:24 PM · Jonathan Delozier · HousingWire

A vacant land transaction in Maryland nearly became the latest victim of deepfake fraud last week when artificial intelligence (AI) was used to impersonate the property owner during a live video session.

That roughly $100,000 deepfake fraud attempt was stopped by identity verification and transaction security platform Proof during a remote notarization, the company said.

The incident highlights a continued vulnerability in real estate transactions, particularly those involving out-of-state sellers or vacant land where in-person verification is impossible.

“Real estate fraud isn’t always a high-volume issue, but it’s an incredibly high-impact issue,” Proof CEO and co-founder Pat Kinsel told HousingWire. “A lot of title companies will say, ‘It hasn’t happened to me,’ and then when it does happen to them, it’s devastating.

“I know a title agent that had a million-dollar loss, and they ended up having to personally cover this.”

Tech-enabled fraud reached $13.7 billion in 2024, according to the FBI’s Internet Crime Complaint Center. Meanwhile, deepfake-related scams are rising quickly — jumping 40% year-over-year, per Entrust’s 2026 Identity Fraud Report.

How deepfakes evade human detection

In a demonstration, Kurt Ernst, product manager at Proof, showed how easily commercially available software can create convincing deepfake videos.

Ernst placed a deepfake face over his own in real time, noting the setup took just 15 minutes using off-the-shelf technology.

“I don’t have a supercomputer sitting in my closet here running the latest in video drivers,” Ernst said. “We set this up very quickly. It’s using commercial, off-the-shelf software that you can get, or your fraudsters, your friendly neighborhood fraudsters, can get as well.”

A recent study by Deloitte estimated fraud losses tied to AI-generated deepfakes could reach billions annually, with financial services and real estate among the most exposed sectors.

Ernst also cited common flaws in deepfake videos, including hand warping when crossing the face, imperfections in facial hair rendering and inconsistencies in mouth movement.

“You can see how there’s warping there. You can see my face over my fingers,” he said. “These different types of technologies are terrible with fingers and hands, and they’ll look kind of creepy.”

Proof’s technology flagged the deepfake video as fraudulent within seconds during the demonstration.

The system scans video frames in real time while also analyzing device information, location data and email addresses, Ernst added.

Multilayered detection approach

The Maryland transaction involved a vacant lot where the seller claimed to be local but, as is often the case, was operating from elsewhere.

“They don’t always have the exact details on their ID correctly,” Ernst said of scammers. “Their email address — that’s a classic one. You can spin up a new email address in five seconds. So their email address will have never been seen by anyone.”

Detection systems can autonomously flag suspicious transactions and surface red flags, without waiting for a human to notice them.

“We can say, ‘We think this one needs to be reviewed,’ even if the notary doesn’t see it,” Ernst said. “That can happen almost instantaneously.”
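To make the multilayered idea concrete, here is a toy sketch of how signals like a brand-new email address, an unfamiliar device, a location mismatch and a frame-analysis score might be combined into a single review flag. The signal names, weights and threshold are all invented for illustration; Proof's actual system is not public and is certainly far more sophisticated.

```python
from dataclasses import dataclass

# Illustrative only: combining several fraud signals into one risk score.
# All weights and the 0.5 threshold are hypothetical, not Proof's.

@dataclass
class TransactionSignals:
    email_age_days: int             # how long the email address has existed
    device_seen_before: bool        # has this device appeared in prior sessions?
    ip_region_matches_claim: bool   # does IP geolocation match the claimed location?
    deepfake_score: float           # 0.0-1.0 output of a (hypothetical) video model

def risk_score(s: TransactionSignals) -> float:
    """Return a 0.0-1.0 risk estimate from weighted red flags."""
    score = 0.0
    if s.email_age_days < 30:          # "spun up in five seconds" email
        score += 0.25
    if not s.device_seen_before:       # unfamiliar device
        score += 0.15
    if not s.ip_region_matches_claim:  # claims local, connects from elsewhere
        score += 0.25
    score += 0.35 * s.deepfake_score   # visual-analysis signal weighted heaviest
    return min(score, 1.0)

def needs_review(s: TransactionSignals, threshold: float = 0.5) -> bool:
    """Flag the session for human review, even if the notary sees nothing odd."""
    return risk_score(s) >= threshold

# A session resembling the Maryland attempt: new email, unknown device,
# location mismatch, high deepfake score.
suspicious = TransactionSignals(
    email_age_days=2, device_seen_before=False,
    ip_region_matches_claim=False, deepfake_score=0.9)
print(needs_review(suspicious))  # True
```

The point of the layered design is that no single signal has to be conclusive: a new email address alone stays below the threshold, but several weak signals together trip the review flag.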

Best practices for professionals

Kinsel recommended that real estate teams and title offices add identity verification at every step of the transaction process — even in channels once considered relics of the past.

“It’s been proven that fraud is returning to paper channels,” Kinsel said. “You can provide a better customer experience and a more secure experience by actually securing the credential.”

The company has invested in deepfake detection for four years, training its models on synthetic videos gathered by monitoring the dark web and platforms like Telegram.

“It’s going to be an endless battle, but we think it’s really core to our mission as a company,” Kinsel said.

Proof’s long-term strategy for staying ahead of evolving fraud technology involves persistent digital identities secured by cryptographic keys, he added.

“Every single time that someone is enrolled or the identity is verified represents the opportunity for fraud,” Kinsel said. “Every single time when you go from the Realtor to the title company, to the mortgage lender, these handoffs are an opportunity for someone to steal your identity.”

Originally reported by HousingWire.