Cybercriminals are Using AI in Scams. Organizations Need to Prepare for Additional AI Cybersecurity Risks.

 

Cybercriminals are using AI to make their scams more convincing and sophisticated than ever. The result is a slew of new AI cybersecurity threats to both businesses and consumers.

In January, Forbes reported that several major hacking communities were using ChatGPT to develop malware that could spy on users’ keyboards. Fraudsters are also using AI tools to simulate voices in imposter scams; NBC News reported an incident in March in which a man received a phone call from his daughter, who said she was being held hostage. The daughter’s voice was actually a replication created by AI.

While ChatGPT and other AI tools do pose cybersecurity risks for businesses and consumers, they also offer many benefits, including increased efficiency, improved data monitoring, and reduced human error. Companies see the potential and are exploring how to integrate these tools into their business practices while balancing the benefits and risks of the technology.

As cybercriminals use AI to enhance their scams, organizations want to know how to prepare for bad actors and for the potential misuse of AI tools. Doriel Abrahams, U.S. Head of Risk at Forter, shares his thoughts on the AI cybersecurity trends likely to play out this year.

 

Doriel’s Thoughts

“With the rise of ChatGPT and its new iteration, GPT-4, there are a lot of interesting and important conversations to be had surrounding generative AI. One of those is about the potential use of these tools in fraud. The reality is that online crime has one weakest link, and that link is human intervention.

ChatGPT, and other tools like it, can manipulate and prey on human emotion and can become part of an ongoing fraud war. They can be used to make scams more convincing. Think about romance scams, where fraudsters try to befriend you. Or business email compromise scams, where people send emails to finance teams, trying to get them to change the bank accounts from which vendors are paid. How do you reroute funds to other accounts, and how do you convince people to do that? The power of AI can make those emails look a lot more authentic.

We already saw a huge manipulation problem this past holiday season, and those scams were very scalable. When you think about the ability to use these tools to get even more scale, it could be a big thing and a game changer for the online crime community.

These tools have already been used to create all sorts of different materials and broadcasts, and it’s likely going to get even worse. They can significantly impact crime as a service; all the things that are being sold on the dark web. You can buy data, or buy access to data obtained through simple AI-driven manipulation of people. These tools can be used to get people’s addresses, payment information, and Social Security numbers. If the scam is good enough, you might unknowingly expose your entire identity and private details to those fraudsters.

[Generative AI] has so much potential to help us evolve, and we see new and exciting applications for these tools day to day. But when you think about it, there’s also a lot of fraud you can commit with those same tools.

Organizations need to be prepared, because it is coming. You know what happens when the criminal experts start exploring the ways they can use generative AI. Are there reasons to panic? Maybe, maybe not. What’s most important is that we know what this tool is capable of, and how to think differently when we encounter things that might look like a scam. It’s a big deal, for better but also for worse, so we just need to be ready.”

 

Article written by Angela Thoma.
