It’s Impossible to Regulate AI in a Vacuum. It’s Inextricably Linked to People and Their Existing Laws.

 

Is it possible to regulate AI and put limits on this seemingly boundless technology? Senate Majority Leader Chuck Schumer has announced nine “AI Insight Forums” this fall as a first step toward understanding such an overwhelming and revolutionary technology. The initiative aims to educate members of Congress on topics ranging from copyright to national security to the impact of AI on democracy. The objective is clear: to create regulations that prevent undesirable outcomes and promote beneficial ones. This is a complex task that requires regulators to be well-versed in the technology themselves, rather than relying solely on industry experts who may have vested interests.

The existing legal framework provides a starting point, but legislators and industry leaders must also identify and address the most pressing needs, particularly those that could cause harm. For example, one of Schumer’s major concerns is the potential for deepfakes to undermine democracy.

Ultimately, it’s crucial to remember that this technology is not an isolated entity; it’s deeply intertwined with people, organizations, and governments. The challenge of regulating AI isn’t just about understanding the technology, but also about comprehending its societal implications and potential risks.

Nick White, Data Strategy Director at Kin + Carta, shares his perspective on bridging the gap between technology and policy as regulators navigate this uncharted territory.

Nick’s Thoughts:

“As regulators think about how they are going to regulate AI and create a sustainable framework, it has to start with them understanding the technology and how it relates to people and process and things that are very important.

So first, they need to have an understanding. They cannot rely on industry experts that could, you know, gain something from the regulations that get made. From there, what are the objectives? Be very clear on these are the outcomes we want. These are the outcomes we don’t want. And make sure that those are guiding stars. Another thing to think about is this stuff is, it is a technology, but it involves people, organizations, government, involves everybody. So look at the laws that exist. How do these relate to the existing laws? And really work from there.

And then, of course, where do you start? Like, what are the most pressing needs? Start with things that are going to cause harm to people. That is the most important. From there, understanding the risks and where there is low risk and high risk, you will ultimately start creating regulations and start creating laws that actually have a positive impact and contain this and enhance people’s lives like AI should.”

