The Need for Radar and Visual Cameras for Autonomous Driving

 

The promise of fully autonomous driving, brought into view by the rise of Tesla, may have hit a speed bump.

The FCC granted Tesla approval to use a new radar device in its vehicles, and Tesla recently filed for an extension of the confidential treatment covering that approval. But didn’t Tesla remove radar from some of its cars earlier this year? That change, known as Tesla Vision, dropped radar from the Model 3 and Model Y in favor of a camera-only approach. The new radar device is expected to hit the market in mid-January to provide an additional level of passenger safety.

These moves call into question Tesla’s approach to autonomous driving, and whether radar and visual cameras are still needed for safe self-driving until technology advances catch up with Tesla’s long-term strategy.

Chuck Gershman, Co-Founder & CEO at Owl Autonomous Imaging, wonders what the right approach is to keep forward momentum on autonomous driving.

Chuck’s Thoughts

“Hi, this is Chuck Gershman, co-founder and CEO at Owl Autonomous Imaging. Previously, we endorsed Tesla adding radar sensors as a complement to their vision-only camera sensor suite for safe vehicle operation. However, at the time I did mention that we do not believe that radar and visual cameras are the final step.

So, the question has now been raised, with Tesla oscillating on their sensor hardware plans: what type of sensors are truly required to ensure safe hands-free ADAS operation and beyond? A little background: simply matching the level of safety of today’s driving public is not a reasonable goal. Robotic safety systems must be dramatically better than humans.

To put this in perspective: annually, 1.3 million people are killed in automobile accidents, with more than half of those fatalities being people not in the vehicle. Moreover, when it comes to nighttime operation, we humans really degrade. Of the approximately 700,000 pedestrians and cyclists killed each year, 76% of them are killed at night.

To be safer than humans, sensors must deliver robust operation across a very broad set of environmental operating conditions. The true challenge is ensuring that no object of interest or vulnerable road user goes unidentified under any driving condition.

In an instant, sensors must be able to answer the following questions:

Is there something out there? What is it? Where is it? Where’s it going?

Furthermore, the system must be modality redundant. No one sensor modality may be relied on to answer any of these questions alone. Until recently, three modalities were considered sufficient: the fusion of visual cameras, which provide rich detail and color; radar, which can detect and locate objects under all environmental conditions; and lidar, which provides more detail than radar and adds very accurate range.

Those in the know, and we at Owl, concur that an additional camera modality is required: 3D thermal imaging. This modality passively delivers rich detail by day and by night, operates in degraded visual environments, and thereby dramatically supplements visual cameras.

It also delivers location and range information, thereby supplementing lidar and improving radar fusion. So, in summary, the fusion of visual cameras, lidar, radar, and 3D thermal cameras is the required sensor suite to save lives.”
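To make the modality-redundancy idea concrete, here is a minimal, hypothetical sketch in Python. It is not Owl’s or Tesla’s actual software; the `Detection` class, thresholds, and nighttime values are illustrative assumptions. It simply encodes the rule Gershman describes: a detection is trusted only when independent sensor modalities agree, so a weak camera reading at night can still be confirmed by thermal and radar.

```python
from dataclasses import dataclass

# Illustrative sketch of "modality redundancy": no single sensor modality
# may be the sole basis for deciding that an object is present.

@dataclass
class Detection:
    modality: str      # e.g. "visual", "radar", "lidar", "thermal"
    label: str         # what the sensor believes it sees
    range_m: float     # estimated distance to the object, in meters
    confidence: float  # 0.0 - 1.0

def confirmed(detections: list[Detection],
              min_modalities: int = 2,
              min_confidence: float = 0.5) -> bool:
    """An object counts as confirmed only when enough independent
    modalities report it with sufficient confidence."""
    agreeing = {d.modality for d in detections if d.confidence >= min_confidence}
    return len(agreeing) >= min_modalities

# Hypothetical night scene: the visual camera is unsure, but the thermal
# camera and radar both see the pedestrian, so the detection stands.
night_scene = [
    Detection("visual",  "pedestrian", 42.0, 0.30),
    Detection("thermal", "pedestrian", 41.5, 0.92),
    Detection("radar",   "object",     41.8, 0.80),
]

print(confirmed(night_scene))  # True: two modalities agree despite the weak camera
```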

