The Need for Radar and Visual Cameras for Autonomous Driving

 

The promise of fully autonomous driving, glimpsed in the rise of Tesla, may have hit a speed bump.

The FCC granted approval for Tesla to use a new radar in its vehicles, and Tesla recently filed for an extension to the confidential treatment of that filing. But didn’t Tesla remove radar from some of its cars earlier this year? That change, known as Tesla Vision, dropped radar from the Model 3 and Model Y in favor of a camera-only approach. The new radar device is slated to hit the market in mid-January to provide an additional level of passenger safety.

The moves call into question Tesla’s approach to autonomous driving, and they highlight the current need for both radar and visual cameras for safe self-driving until technology advances catch up with Tesla’s long-term strategy.

Chuck Gershman, Co-Founder & CEO at Owl Autonomous Imaging, wonders what the right approach is to keep forward momentum on autonomous driving.

Chuck’s Thoughts

“Hi, this is Chuck Gershman, co-founder and CEO at Owl Autonomous Imaging. Previously, we endorsed Tesla adding radar sensors as a complement to their vision-only camera sensor suite for safe vehicle operation. However, at the time I did mention that we do not believe that radar and visual cameras are the final step.

So, with Tesla oscillating on its sensor hardware plans, the question has now been raised: what types of sensors are truly required to ensure safe hands-free ADAS operation and beyond? A little background: simply matching the level of safety of today’s driving public is not a reasonable goal. Robotic safety systems must be dramatically better than humans.

To put this in perspective, 1.3 million people are killed in automobile accidents annually, with more than half of the fatalities being people outside the vehicle. Moreover, when it comes to nighttime operation, we humans really degrade. Of the approximately 700,000 pedestrians and cyclists killed each year, 76% are killed at night.

To be safer than humans, sensors must deliver robust operation under a very broad set of environmental operating conditions, with the true challenge being to ensure that no object of interest or vulnerable road user goes unidentified under any driving condition.

Sensors must, in an instant, be able to answer the following questions:

Is there something out there? What is it? Where is it? Where’s it going?

Furthermore, the system must be modality-redundant: no one sensor modality may be relied on to answer any of these questions alone. Until recently, three modalities were considered sufficient: the fusion of vision cameras, which provide rich detail and color; radar, which provides the ability to detect and locate objects under all environmental conditions; and lidar, which provides more detail than radar and adds very accurate range.

Those in the know, and we at Owl, concur that an additional camera modality is required, known as 3D thermal imaging. This modality delivers passive, rich detail both day and night, with operation in degraded visual environments, thereby dramatically supplementing visual cameras.

It also delivers location and range information, thereby supplementing lidar and improving radar fusion. So, in summary, the fusion of visual cameras, lidar, radar, and 3D thermal cameras is the required sensor suite to save lives.”
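To make the modality-redundancy idea concrete, here is a minimal sketch of a late-fusion confirmation gate. It is an illustration only, not Owl’s or any production pipeline: the Detection record, the confirmed function, the confidence and distance thresholds, and the night-scene numbers are all hypothetical, and the vehicle frame is simplified to 2D.

```python
from dataclasses import dataclass

# Hypothetical detection record: one entry per sensor modality, answering the
# four questions from the transcript -- is something there, what is it,
# where is it, and where is it going.
@dataclass
class Detection:
    modality: str        # "camera", "radar", "lidar", or "thermal"
    label: str           # what is it? (e.g. "pedestrian")
    position_m: tuple    # where is it? (x, y) in metres, vehicle frame
    velocity_mps: tuple  # where is it going? (vx, vy) in metres/second
    confidence: float    # detector confidence in [0, 1]

def confirmed(detections, min_modalities=2, max_gap_m=1.5):
    """Accept an object only when at least `min_modalities` independent
    modalities report it within `max_gap_m` of one another -- i.e. no
    single modality is trusted to answer the questions alone."""
    agreeing = {
        d.modality for d in detections
        if d.confidence >= 0.5
        and all(
            abs(d.position_m[0] - e.position_m[0]) <= max_gap_m
            and abs(d.position_m[1] - e.position_m[1]) <= max_gap_m
            for e in detections
        )
    }
    return len(agreeing) >= min_modalities

# Example: a pedestrian at night. The visual camera misses entirely, but
# thermal, radar, and lidar still agree, so the object is confirmed.
night_scene = [
    Detection("thermal", "pedestrian", (18.0, 2.1), (-0.5, 1.2), 0.93),
    Detection("radar",   "pedestrian", (18.4, 2.3), (-0.4, 1.1), 0.81),
    Detection("lidar",   "pedestrian", (18.2, 2.0), (-0.5, 1.2), 0.88),
]
print(confirmed(night_scene))  # True: three modalities agree
```

The point of this toy gate is that a camera dropout at night does not blind the system: the remaining modalities can still out-vote the missing one, which is the essence of Gershman’s argument for a four-modality suite.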


