Connected World: Toward a Sustainable Future for Mobility and EV Architectures, Part 1 of 3

There’s a lot to be excited about in the world of autonomous vehicles. Still, there are many challenges ahead, including misconceptions to clear up and technology hurdles to climb before EVs go mainstream. Ralf Klaedtke, chief technical officer for Transportation Solutions at TE Connectivity, shared his insights on those challenges and on the trends shaping the industry as it pushes toward full autonomy. With so much ground to cover, Klaedtke joined Connected World’s Tyler Kern for the first of three episodes on the topic.

Klaedtke told Kern it wasn’t long ago that people began dreaming of the day when autonomous driving would be the norm. But that day isn’t here yet. So, what’s the hold-up?

“A lot of development has taken place,” Klaedtke said. “A lot of energy and time has gone into that, but I think issues like weather, snow, mixed traffic have been much stronger than most of us thought, and therefore the whole process is safety-oriented and taking longer than expected.”

There are many reasons the world is eager for autonomous vehicles to hit the mainstream. One of the primary beneficiaries will be the supply chain, which is in desperate need of solutions to the present-day shortage of qualified truck drivers.

And yes, there are autonomous vehicles on the road today, but what is primarily available falls within Levels 1 through 3, each of which still demands a different degree of involvement from a human driver. Getting to full autonomy requires the technology to progress to Levels 4 and 5. Several barriers, from the technical and logistical to liability, must be overcome to achieve those levels. And if there is an accident, who assumes that liability is something car manufacturers still need to work out.
