For the last year and a half, two hacked white Tesla Model 3 sedans, outfitted with five additional cameras and a palm-sized supercomputer, have been cruising around San Francisco. In a city rife with questions about artificial intelligence, the startup behind these modified Teslas aims to tackle a straightforward question: How quickly can a company develop autonomous vehicle software today? The startup, making its activities public for the first time, is called HyprLabs. With a team of 17—only eight of whom are full-time—split between Paris and San Francisco, it’s led by Tim Kentley-Klay, a veteran of the autonomous vehicle sector and co-founder of Zoox, who left the company, now owned by Amazon, in 2018. Hypr has raised relatively modest funding of $5.5 million since 2022, but its ambitions are anything but modest. The company eventually plans to create and operate its own robots. “Think of the love child of R2-D2 and Sonic the Hedgehog,” Kentley-Klay says. “It’s going to define a new category that doesn’t currently exist.”
For now, the startup is introducing its software product, Hyprdrive, which it promotes as a significant advancement in how engineers train vehicles to operate autonomously. Innovations in machine learning have sparked a wave of progress in robotics, promising to reduce the costs and labor involved in training autonomous vehicle software. This evolution has revitalized a field that previously sank into a “trough of disillusionment,” as tech developers repeatedly missed deadlines for deploying robots in public. Nowadays, robotaxis are transporting paying passengers in a growing number of cities, and automakers are making bolder claims about delivering self-driving options for personal vehicles.
However, getting from "driving reasonably well" to "driving far more safely than a human" presents its own challenges. “I can’t say to you, hand on heart, that this will work,” Kentley-Klay acknowledges. “But what we’ve built is a really solid signal. It just needs to be scaled up.”
Old Tech, New Tricks
HyprLabs’ approach to software training represents a shift from the conventional techniques used by other robotics startups. To provide some context: For years, the primary debate in autonomous vehicles pitted companies that relied on cameras alone, like Tesla, against those that also used other sensors, like Waymo and Cruise, which added costly lidar and radar systems. However, deeper philosophical differences also existed.
Camera-only proponents like Tesla aimed to minimize costs while planning for a massive fleet of robots; for a decade, CEO Elon Musk has envisioned converting all of his customers’ cars into self-driving vehicles with a simple software update. This strategy also let those companies gather extensive data, as their not-yet-autonomous vehicles captured images during their journeys. That data fed an “end-to-end” machine learning model trained with a reinforcement signal: the system receives images (a bike, say) and produces driving commands (turn the steering wheel left and ease off the accelerator to avoid a collision). “It’s like training a dog,” explains Philip Koopman, an autonomous vehicle software and safety researcher at Carnegie Mellon University. “At the end, you say, ‘Bad dog,’ or ‘Good dog.’”
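To make Koopman’s “bad dog, good dog” framing concrete, here is a heavily simplified policy-gradient sketch in PyTorch. It shows the shape of the end-to-end idea: camera pixels in, steering and throttle out, with a single scalar reward nudging the network. Everything in it (the tiny DrivingPolicy network, the toy_reward function, the image size) is a hypothetical illustration, not code from HyprLabs, Tesla, or any production system.

```python
import torch
import torch.nn as nn

class DrivingPolicy(nn.Module):
    """End-to-end policy: a camera frame in, [steering, throttle] out."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(            # tiny stand-in for a vision backbone
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)             # -> steering angle, throttle

    def forward(self, image):
        # tanh keeps both commands in [-1, 1]
        return torch.tanh(self.head(self.encoder(image)))

def toy_reward(action):
    # Hypothetical "good dog / bad dog" signal: reward steering left and
    # easing off the accelerator, echoing the bike example above.
    steering, throttle = action[:, 0], action[:, 1]
    return -steering - throttle

policy = DrivingPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

frame = torch.rand(1, 3, 96, 96)                 # placeholder camera image
mean = policy(frame)                             # predicted command
dist = torch.distributions.Normal(mean, 0.1)     # add exploration noise
action = dist.sample()
reward = toy_reward(action)

# REINFORCE-style update: make rewarded actions more likely next time
loss = -(dist.log_prob(action).sum(dim=1) * reward).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Production systems are vastly larger and trained on enormous amounts of logged driving data, but the loop is the same: act, get scored, adjust.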
