AEye Secures $40m for Self-Driving Car Sensor that Sees Better than Humans

AEye is a San Francisco Bay Area-based computer vision startup convinced that everyone else is getting solid-state LiDAR wrong.

Existing LiDAR systems are expensive, slow and mechanical.

Even with numerous companies racing to mass-produce a low-cost solid-state LiDAR product, none is on the market yet.

What’s more, the products closest to market can only see a maximum of about 250 meters, and only in ideal conditions.

Although that range may be adequate for self-driving cars operating at low speeds in heavily mapped settings, it may not give a car traveling at high speed enough distance to stop in time.

These hurdles are simply part of what keeps self-driving cars from being introduced into the mainstream market.

Nonetheless, AEye claims it can fix this situation.

Its AE100 solid-state LiDAR is not just a detection system; it is a computer vision tool that physically integrates LiDAR and a low-light, high-resolution camera into a single unit and uses AI to process the data.

The LiDAR voxels and camera pixels that form a 3D point cloud of the scanned scene are combined in real time to create “dynamic vixels.”
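
AEye has not published the internal format of a dynamic vixel, but the underlying idea of pairing each LiDAR return with the camera pixel it projects onto can be sketched roughly as follows. The DynamicVixel structure, the calibration inputs and the field names below are illustrative assumptions, not AEye’s actual data format.

```python
# Illustrative sketch only: pair each LiDAR return with the camera pixel it
# projects onto, producing one fused record ("dynamic vixel"). The structure
# and calibration inputs are assumptions, not AEye's actual format.
from dataclasses import dataclass
import numpy as np

@dataclass
class DynamicVixel:
    xyz: np.ndarray      # 3D position of the LiDAR return (meters)
    intensity: float     # LiDAR return intensity
    rgb: np.ndarray      # color of the co-registered camera pixel
    timestamp: float     # capture time, so the fusion stays real time

def fuse(points, intensities, image, K, T_cam_from_lidar, timestamp):
    """Project LiDAR points into the camera image and attach pixel color."""
    vixels = []
    for point, intensity in zip(points, intensities):
        # Move the point into the camera frame, then project with intrinsics K.
        p_cam = (T_cam_from_lidar @ np.append(point, 1.0))[:3]
        if p_cam[2] <= 0:          # point is behind the camera
            continue
        u, v, _ = K @ (p_cam / p_cam[2])
        u, v = int(round(u)), int(round(v))
        if 0 <= v < image.shape[0] and 0 <= u < image.shape[1]:
            vixels.append(DynamicVixel(point, intensity, image[v, u], timestamp))
    return vixels
```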

A feedback loop focuses on objects judged important in the field of view, in a manner that imitates the human visual cortex.

“The ability of the 3D component of sensing to actually change its pattern and mechanically be merged with the camera allows for intelligence,” explains AEye Chief of Staff Blair LaCorte. “Just like your eye, it can simultaneously see things in its periphery and put more energy on things it’s focused on.”

There is an ongoing debate over whether self-driving systems can synthesize and react to critical situations as well as, or better than, human beings.

The human eye typically processes information at about 27 Hz, whereas most LiDAR systems work at 10 Hz.

Humans hold this edge because they combine multiple senses and draw on past experience to rapidly assess their environment, deciding what matters in the scene and disregarding the rest in order to react.

AEye has built a LiDAR system that does something similar.

It detects objects up to 1,000 meters away, operates at 100 Hz and uses a feedback loop to send and receive information, prioritizing the most important data almost instantaneously to inform the car’s path planning.

“We’re searching 3,000 times faster and better than any system anywhere, but we’re also predicting where we want to interrogate,” says LaCorte in a phone interview. “So when I enter an intersection, it may focus on the entry points of the intersection, but it may also put extra energy in the direction I’m turning on an unprotected left turn.”

The LiDAR system combines the ability to both search broadly and focus on specific objects with feedback from the car’s path-planning system, so the sensor knows what the car is doing and why.
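
AEye has not disclosed how the sensor actually allocates its scan energy, but the concept of letting path planning bias where the LiDAR looks can be sketched in a few lines. The region labels, weights and the prioritize_regions function below are hypothetical illustrations of the idea, not AEye’s implementation.

```python
# Illustrative sketch only: weight LiDAR scan energy toward regions the path
# planner flags as important (e.g. the direction of an unprotected left turn).
# Region labels, weights and this API are assumptions, not AEye's design.

BASE_WEIGHT = 1.0  # energy given to ordinary background scanning

def prioritize_regions(field_of_view, planner_hints):
    """Return a normalized scan-energy budget for each region in view.

    field_of_view : list of region labels the sensor can steer toward
    planner_hints : dict mapping region label -> extra priority from planning
    """
    weights = {}
    for region in field_of_view:
        # Regions the planner cares about get extra dwell time / pulse energy.
        weights[region] = BASE_WEIGHT + planner_hints.get(region, 0.0)
    total = sum(weights.values())
    return {region: weight / total for region, weight in weights.items()}

# Example: approaching an unprotected left turn, the planner asks the sensor
# to spend more of its budget on the turn path and the oncoming lane.
fov = ["left_turn_path", "oncoming_lane", "crosswalk", "periphery"]
hints = {"left_turn_path": 3.0, "oncoming_lane": 2.0, "crosswalk": 1.0}
print(prioritize_regions(fov, hints))
```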

In doing so, AEye has built an agile solid-state LiDAR with intelligence designed into its architecture.

The result could be a LiDAR system with the potential to process information the way people do.

AEye tested its LiDAR system’s ability to track a 20-foot moving truck and found that it followed the vehicle consistently, continuously scanning the length of an airport runway and beyond while also detecting signs and markers placed along the route.

By setting a new bar for automotive LiDAR, AEye has captured the attention of leading Silicon Valley investors, including Airbus Ventures and Intel Capital.

One of AEye’s co-founders, Jordan Greene, was also recently named to the 2019 Forbes 30 Under 30 list for manufacturing.

The company recently announced that it had closed its Series B financing, a $40 million round led by Taiwania Capital.

AEye plans to use the funds to scale the business and move into production.

The company anticipates that its AE200 line will begin low-volume production in the first half of 2019.

The company’s AE100, aimed at “robo-taxi” operators such as Waymo, will start production in the second half of 2019.

The AE100 unit is expected to retail for $5,000, while the AE200 will sell for about $1,000, depending on the system configuration.

AEye says that roughly 30 to 60 cars are currently road-testing its LiDAR system.
