Cybercriminals are Using AI in Scams. Organizations Need to Prepare for Additional AI Cybersecurity Risks.

 

Cybercriminals are using AI to make their scams more convincing and sophisticated than ever. The result is a slew of new AI cybersecurity threats to both businesses and consumers.

In January, Forbes reported that several major hacking communities were using ChatGPT to develop malware that could spy on users’ keystrokes. Fraudsters are also using AI tools to simulate voices in imposter scams; NBC News reported an incident in March in which a man received a phone call from his daughter saying she was being held hostage. The daughter’s voice was actually a replication created by AI.

While ChatGPT and other AI tools do pose cybersecurity risks for businesses and consumers, they also offer many benefits, including increased efficiency, improved data monitoring, and a reduction in human error. Companies see the potential and are exploring both how to integrate these tools into their business practices and how to balance the technology’s benefits against its risks.

As cybercriminals use AI to enhance their scams, organizations want to know how to prepare for bad actors and for the potential malicious uses of AI tools. Doriel Abrahams, U.S. Head of Risk at Forter, shares his thoughts on the AI cybersecurity trends he expects to play out this year.

 

Doriel’s Thoughts

“With the rise of ChatGPT and its new iteration, GPT-4, there are a lot of interesting and important conversations to be had surrounding generative AI. One of those is about the potential use of these tools in fraud. The reality is that online crime has one weakest link, and that link is the human element.

ChatGPT and other tools can manipulate and prey on human emotion and can become part of the ongoing fraud war. They can be used to make scams more convincing. Think about romance scams, where people try to befriend you. Or business email compromise scams, where people send emails to finance teams, trying to get them to change the bank accounts from which vendors are paid. How do you get funds rerouted to other accounts, and how do you convince people to do that? The power of AI can make those emails look a lot more authentic.

We already saw a huge, massive manipulation problem over the past holiday season, and those scams were very scalable. Just thinking about the ability to use these tools to gain even more scale, it could be a big thing and a game changer for the online crime community.

It has already been used to create all sorts of different materials and broadcasts, and it’s likely going to get even worse. It can significantly impact the crime-as-a-service industry: all the things that are being sold on the dark web. You can buy data, or buy access to data that is obtained through simple AI manipulation of people. It can be used to get people’s addresses. It can be used to get people’s payment information and Social Security numbers. Potentially, if the scam is good enough, you might unknowingly expose your entire identity and private details to those fraudsters.

[Generative AI] has so much potential to help people evolve, with all the new, cool things we’re seeing day to day that these types of tools can be applied to. But when you think about it, there’s also a lot of fraud that you can do with those tools at the same time.

Organizations need to be prepared, because it is coming. You know what happens when criminal experts start exploring the ways they can use generative AI. Are there reasons to panic? Maybe, and maybe not. What’s most important is that we know what this tool is capable of and how to think differently when we encounter things that might look like a scam. It’s a big deal. It’s a big deal for better, but also for worse, so we just need to be ready.”

 

Article written by Angela Thoma.

