AEye Hopes Its Lidar System Is Ahead of the Curve

If you learned everything you know about self-driving cars from Elon Musk’s Twitter feed, you’re missing out on a lot. Companies such as Waymo (part of Google) and Zoox have unveiled autonomous shuttle concepts, as have traditional manufacturers including Cadillac and Toyota (it was one of Toyota’s shuttles that collided with a visually impaired athlete during the Tokyo Paralympics last month). And Waymo currently allows the general public to hail its driverless shuttles in Phoenix. There is also a world of smaller companies working to build the hardware that will help driverless cars, shuttles, and delivery bots make sense of our world. One of those companies is AEye, a California-based lidar (Light Detection and Ranging) firm with a sensor that can detect obstacles at both short and long distances, even when mounted on a car.

We saw the current iteration of AEye’s technology at work during testing in late June at a facility in Michigan. A Ford Fusion with AEye lidar mounted on top was parked near a tunnel entrance, the kind you might come across on an urban freeway. The road curved as it entered the tunnel, and AEye reps had placed various obstacles in the shadows created by the overhang. Two humanoid dummies and a canine dummy were set up 361 feet away, and five big bricks were scattered on the road 33 feet ahead of the dummies.

From where we were standing, under a tent next to the car, none of the obstacles were clearly visible. AEye had intended to demonstrate the system’s capabilities in both good and bad weather, installing a rainmaker between the car and the tunnel. But at the time of testing, natural rain was falling so hard that AEye’s engineers had to raise their voices to be heard over the water pounding on the roof of the tent.

AEye lidar installed on top of a car


Weather matters here because of how lidar works: the unit sends out laser pulses, a receiver senses the light reflected off any obstacle, and, aided by a lot of code, that information is used to determine the location and type of obstacles in the car’s path. The kind of driving rain we were experiencing during AEye’s test could theoretically flummox a lidar system. The water can absorb some of the light sent out by the laser, so less light, and therefore less information, returns to the system’s sensors.
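The core of that pulse-and-return process is simple time-of-flight arithmetic, sketched below in Python. This is a hedged illustration of the general principle, not AEye’s software; the constant and function names are my own.

```python
# Time-of-flight sketch: a lidar estimates distance from the round-trip
# time of a reflected laser pulse.

C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(seconds: float) -> float:
    """Distance to an obstacle given a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path the light covers.
    """
    return C * seconds / 2.0

# A return arriving roughly 733 nanoseconds after the pulse left
# corresponds to an obstacle about 110 meters (roughly 361 feet) away.
print(distance_from_round_trip(733e-9))
```

Rain complicates this picture only by weakening the returning signal; the arithmetic itself is unchanged.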


However, AEye’s system handled the rain well. With the car’s lidar turned on, a display on the outside of the car, set up specifically for the demonstration, showed the system’s interpretation of the feedback from its sensors. We could identify roughly equidistant points tracing the outline of the tunnel walls and the road surface, plus a few lines of closely spaced dots where the sheet of rain was pouring down at the tunnel entrance. Three groups of dots showed that the lidar system was registering the dummies placed in its path, and a few more groups of points represented the bricks on the ground.

“I’ve never seen a demo like this before: a real-world scenario in inclement weather, behind a windshield, while still being able to achieve that distance and detection,” said Sam Abuelsamid, a principal research analyst at Guidehouse Insights, who was also present for the demonstration. “What we saw was really impressive,” he said.

Stephen Lambright, AEye’s chief marketing officer, said part of the company’s edge lies in its decision to separate the part of the module that sends the laser pulses from the part that receives them. Other companies, Lambright said, integrate both functions into a single component, which means the laser can’t send a new pulse until the feedback from the previous one has returned. AEye’s two-part solution allows the laser to send more pulses in less time, which means more data. AEye has also programmed its system so that when a returning pulse points to an object in the car’s path, the laser sends multiple repeat pulses at the same area to fill in the picture of the object in question.
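The throughput argument Lambright makes can be put in rough numbers. The sketch below is a back-of-the-envelope illustration under my own assumptions (a 300-meter range and a 200-nanosecond firing interval are illustrative figures, not AEye specifications):

```python
C = 299_792_458.0  # speed of light, m/s

def shared_part_pulse_rate(max_range_m: float) -> float:
    """Max pulses/second when one part both sends and receives:
    each new pulse must wait for the previous round trip to finish."""
    round_trip_s = 2.0 * max_range_m / C
    return 1.0 / round_trip_s

def split_part_pulse_rate(firing_interval_s: float) -> float:
    """Max pulses/second with a separate receiver: limited only by
    how quickly the laser itself can fire, with multiple pulses in
    flight at once."""
    return 1.0 / firing_interval_s

# At a 300 m maximum range, a combined send/receive part tops out near
# 500,000 pulses per second; a laser firing every 200 nanoseconds into
# a separate receiver manages 5,000,000.
print(shared_part_pulse_rate(300.0), split_part_pulse_rate(200e-9))
```

The point of the comparison is the scaling, not the specific figures: decoupling transmit from receive removes the round-trip wait as the bottleneck.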

AEye also stands out from the crowd in its choice of lasers. Its lidar uses a 1550-nanometer laser (the measurement refers to the wavelength of the light the laser emits) as opposed to the cheaper 905-nanometer lasers preferred by many others in the field. Light in the 905-nanometer range can damage the retina, so those lasers are subject to regulations that limit their power, a necessary safety step that also limits the distance at which they can detect obstacles. The lasers AEye uses (Volvo’s lidar partner, Luminar, also uses 1550-nanometer lasers) are safer for the eyes, and they can send out pulses that travel farther than those of the shorter-wavelength lasers.
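One small piece of the underlying physics can be shown with a quick calculation. This is a sketch, not a safety analysis (the larger reason 1550-nanometer light is gentler on the eye is that the eye’s fluid absorbs it before it reaches the retina), but photon energy does scale inversely with wavelength:

```python
PLANCK_H = 6.62607015e-34  # Planck constant, J*s
C = 299_792_458.0          # speed of light, m/s

def photon_energy_joules(wavelength_nm: float) -> float:
    """Energy of a single photon: E = h * c / wavelength."""
    return PLANCK_H * C / (wavelength_nm * 1e-9)

# Per photon, 905 nm light carries about 1.71x the energy of 1550 nm light.
ratio = photon_energy_joules(905.0) / photon_energy_joules(1550.0)
print(f"{ratio:.2f}")
```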

If the Fusion on display had actually been driving itself, it would have picked up the obstacles in the road well before a human could, especially given the heavy rain. But seeing trouble on the horizon is only half of being a good driver; the other half is knowing what to do next. Those problems will be someone else’s. AEye designs and engineers lidar components but does not build the self-driving systems they will eventually feed, so the burden of programming a car’s driver-assistance systems to avoid obstacles will fall on its partners. Continental will manufacture and license AEye’s systems for the automotive market. Other applications include aerospace, construction, mining, and smart-city projects.

AEye still faces challenges on its path to widespread adoption, with cost perhaps chief among them. Lambright says AEye is currently assuming a per-car cost of less than $1,000 for its lidar setup and that the company is on track to eventually sell its modules for around $100 apiece. But AEye’s technology is still in the early stages of development, trajectories can change, and competitor Luminar says it already has a production-ready unit that costs just $500.

There may be another obstacle. The federal agencies overseeing autonomous-vehicle testing and transportation safety are still working out how to regulate self-driving cars, and Tesla is now the subject of an investigation covering a large number of Autopilot-equipped vehicles. So don’t expect to see AEye modules in your next new car. But the next time you see Elon Musk talking about artificial intelligence on Twitter, remember that he’s not the only one working on self-driving cars.


