It’s Impossible to Regulate AI in a Vacuum. It’s Inextricably Linked to People and Their Existing Laws.


Is it possible to regulate AI and put limits on this seemingly boundless technology? Senate Majority Leader Chuck Schumer has announced nine “AI Insight Forums” this fall as a first step toward understanding an overwhelming and revolutionary technology. The initiative aims to educate members of Congress on topics ranging from copyright to national security to AI’s impact on democracy. The objective is clear: to create regulations that prevent undesirable outcomes and promote beneficial ones. This is a complex task, one that requires regulators to be well-versed in the technology rather than relying solely on industry experts who may have vested interests.

The existing legal framework provides a starting point, but legislators and industry leaders must also identify and address the most pressing needs, particularly those that could cause harm. One of Schumer’s major concerns, for example, is the potential for deepfakes to undermine democracy.

Ultimately, it’s crucial to remember that this technology is not an isolated entity; it’s deeply intertwined with people, organizations, and governments. The challenge of regulating AI isn’t just about understanding the technology, but also about comprehending its societal implications and potential risks.

Nick White, Data Strategy Director at Kin + Carta, helps bridge the gap between technology and policy as regulators navigate this uncharted territory.

Nick’s Thoughts:

“As regulators think about how they are going to regulate AI and create a sustainable framework, it has to start with them understanding the technology and how it relates to people and process and things that are very important.

So first, they need to have an understanding. They cannot rely on industry experts that could, you know, gain something from the regulations that get made. From there, what are the objectives? Be very clear on these are the outcomes we want. These are the outcomes we don’t want. And make sure that those are guiding stars. Another thing to think about is this stuff is, it is a technology, but it involves people, organizations, government, involves everybody. So look at the laws that exist. How do these relate to the existing laws? And really work from there.

And then, of course, where do you start? Like, what are the most pressing needs? Start with things that are going to cause harm to people. That is the most important. From there, understanding the risks and where there is low risk and high risk, you will ultimately start creating regulations and start creating laws that actually have a positive impact and contain this and enhance people’s lives like AI should.”
