In a recent scam (CNN Story), an employee at a major company in Hong Kong was tricked into transferring $25 million to criminals. The fraudsters used deepfake video to look and sound like the company’s CFO during a fake online meeting. Hong Kong police say this is just one of many scams in which criminals use the technology to trick people out of money, and they have arrested six people in connection with these schemes. The problem is getting worse, and even celebrities like Taylor Swift have been targeted with fake images generated this way. Police are working to stop these crimes and protect people from such deception.
This is just the beginning. Here’s what CFOs and InfoSec Teams can do about it:
Use Strong Security Checks: Add extra steps to verify who is requesting a large money transfer, such as one-time codes sent to a registered phone and approval from more than one person or team.
Teach Staff About Deepfake Videos: Run training sessions to help employees understand deepfake technology and how to spot when a call or message might not be genuine.
Stick to Secure Comms Tools: Make sure everyone uses company-approved channels to communicate and knows how to verify that the person they’re talking to is who they claim to be.
Have Rules for Urgent or Confidential Payments: Define a clear procedure to follow when someone asks for money quickly or in secret, including direct confirmation from the people involved via a separate, known channel.
Watch Out for Odd Behaviour: Set up monitoring for unusual activity and have a team ready to investigate it. Also, make it easy for people to report anything dodgy without fear of consequences.
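The dual-approval idea behind the first and fourth points above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not a real payments API: the `Transfer` class, the threshold value, and the `may_execute` rule are all assumptions chosen to show the policy shape, namely that a large transfer needs two distinct approvers (neither of them the requester) plus a verified out-of-band code.

```python
from dataclasses import dataclass, field

# Assumed policy threshold for a "large" transfer, in dollars (illustrative).
LARGE_TRANSFER_THRESHOLD = 100_000

@dataclass
class Transfer:
    amount: int
    requested_by: str
    approvals: set = field(default_factory=set)  # user IDs who have approved
    otp_verified: bool = False                   # out-of-band code confirmed

def may_execute(t: Transfer) -> bool:
    """Large transfers need two distinct approvers (excluding the
    requester) plus a verified out-of-band code; small ones need one."""
    approvers = t.approvals - {t.requested_by}
    if t.amount >= LARGE_TRANSFER_THRESHOLD:
        return len(approvers) >= 2 and t.otp_verified
    return len(approvers) >= 1
```

The key design point is that the requester can never count as their own approver, so a single deepfaked "CFO" on a video call cannot authorise the payment alone.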
I work with organisations to help them understand the reality of working with AI. If you’re looking for a dynamic keynote speaker or workshop facilitator to build further understanding of the opportunities and risks, get in touch!