Hackers leveraging AI: a vulnerability for law firms

Byline: Don Nokes

Hackers are upping their game, moving beyond familiar email tactics. Attacks are no longer confined to email account takeover, ransomware threats, or stealthier schemes that gain access through trusted third parties such as title companies and payroll vendors.

Bad actors are now using artificial intelligence to clone a person's actual voice and deploy it in voice messages. Even a few seconds of video content can give scammers what they need to recreate someone's voice.

This is a form of social engineering known as spoofing: hackers impersonate someone else to perpetrate a scam. (Social engineering relies on psychological manipulation and human error rather than on exploiting technical or digital system vulnerabilities.)

Sadly, we've all heard news reports of scammers using AI to clone children's voices, often from a snippet on social media, and persuading parents to pay large sums over a fabricated medical emergency or kidnapping. These nefarious strategies, however, extend beyond one's personal life to businesses, and more specifically, to law firms.

Communicating with a trusted source

One of the key ingredients of a successful social engineering hack is establishing confidence that the request for private information or a directive to send funds is coming from a trusted, appropriate source.

Imagine you are a legal assistant, paralegal, bookkeeper or other law firm employee authorized to deal with money. You look down at your phone and you see a text coming in from your boss. You open the text message and can see the entire thread you've had with your boss in the past. It's easy to believe that you are communicating with your boss.

In these situations, when a request comes in from an email address you recognize, or a text message arrives and the caller ID indicates a trusted source, victims tend to let their guard down and respond to the malicious request. Tools are readily available today to spoof the phone number from which a call or text appears to originate.

Here's an example of how the scam works using AI: once the bad actors learn (possibly by first hacking a firm's email) that a financial transaction is taking place, they send an AI-generated voice message confirming where to send the funds. The employee transferring the funds hears the familiar voice confirming the transaction and sends the money.

...
