
Velodyne releases revolutionary LiDAR sensor

Velodyne’s new VLS-128 (right) is significantly smaller than the old HDL-64 (left), with 10x the resolution. Image courtesy of Velodyne LiDAR.

Now AVs can navigate at highway speeds with LiDAR alone

This morning Velodyne LiDAR, the sensor company based in San Jose, CA, announced the launch of its latest sensor, the VLS-128, which represents a transformative advance on previous navigation systems for autonomous vehicles. LiDAR—which stands for Light Detection And Ranging—is one of the main types of sensors used by autonomous vehicles to map their surroundings and detect, identify, and track objects and potential obstacles. FreightWaves reported on the companies competing for LiDAR dominance earlier this year.

The VLS-128 succeeds Velodyne’s previous product, the HDL-64, which was itself an industry-leading sensor. The new sensor uses 128 laser channels—double the number of the previous sensor’s channels—to map its surroundings at 10x the resolution of the HDL-64. Anand Gopalan, Velodyne’s Chief Technology Officer, said that the official range of the VLS-128 is 200 meters, but that the sensor is still effective at up to 300 meters. The HDL-64 had a range of 120 meters. The new sensor is 70% smaller than the HDL-64, but gathers billions more data points. The difference in resolution between the two sensors can be seen in the image below—the top image generated by the VLS-128 shows a much greater density of data points than the bottom image, generated by the HDL-64.

Comparison of resolution of the VLS-128 (top) and HDL-64 (bottom). Image courtesy of Velodyne LiDAR.

Real-world highway driving, where vehicles travel at 70 mph or faster, demands much higher resolution than slow-moving cars navigating congested city streets. Each of the old HDL-64's beams was separated by an angle of 0.4°, which could resolve objects up to 120 m away; beyond that, the beams were too far apart to reliably recognize obstacles. A car traveling 70 mph covers 120 m in about 4 seconds, which leaves little time for an autonomous system, or a human driver for that matter, to brake and maneuver around an obstacle. The new VLS-128 can resolve objects at more than twice that distance, which means autonomous cars can finally navigate roadways at highway speeds by LiDAR alone, without additional sensors like radar or optical cameras.
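The geometry behind these numbers is easy to check. A rough sketch: at range r, beams separated by an angle θ are roughly r·θ (in radians) apart, and the warning time is just distance over speed. The small-angle approximation and helper names below are illustrative, not anything from Velodyne's spec sheet.

```python
import math

def beam_gap_m(range_m: float, angular_spacing_deg: float) -> float:
    """Approximate gap between adjacent laser beams at a given range
    (small-angle approximation: arc length = r * theta)."""
    return range_m * math.radians(angular_spacing_deg)

def time_to_cover_s(distance_m: float, speed_mph: float) -> float:
    """Seconds for a vehicle at speed_mph to travel distance_m
    (1 mph = 0.44704 m/s exactly)."""
    return distance_m / (speed_mph * 0.44704)

# HDL-64: beams 0.4 degrees apart, at its 120 m effective range
print(round(beam_gap_m(120, 0.4), 2))      # ~0.84 m between beams
# At 70 mph, an object first seen at 120 m gives roughly:
print(round(time_to_cover_s(120, 70), 1))  # ~3.8 s of warning
```

At 120 m the HDL-64's beams are already nearly a meter apart, so a pedestrian or small obstacle can slip between scan lines; doubling both the channel count and the effective range is what closes that gap at highway speeds.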

“The VLS-128 is the best LiDAR sensor on the planet, delivering the most advanced real-time 3D vision for safe driving,” said Mike Jellen, President of Velodyne LiDAR. “Automotive OEMs and new tech entrants in the autonomous space have been hoping and waiting for this breakthrough.”

Velodyne plans to build the new sensor at scale at the company’s highly automated 200,000 square foot Megafactory, which opened in January of 2017. The facility has enough room for mass production as well as the precise distance and ranging alignment process for the sensors as they come off the assembly line. 

President Marta Hall said that the price will be significantly lower than that of the previous sensor, which cost as much as $70,000, but did not specify a figure. "We are getting the cost down," Hall said. "It is already dramatically reduced, and more so when ordered at higher volumes."

Last month Velodyne announced that the new plant had allowed the company to increase its sensor production by 400% to meet global demand. Only recently has Velodyne been able to offer immediate availability to clients across Europe, East Asia, and North America. Velodyne plans to build up to 1 million sensors at the Megafactory in 2018. 

“We have been demonstrating the product for the first time to customers and they can’t wait to get their hands on them. We will be shipping the VLS-128 by the end of 2017,” said David Hall, Velodyne’s founder and CEO. “With this product, we are redefining the limits of LiDAR and we will be able to see things no one has ever been able to see with this technology,” added Hall.


John Paul Hampstead

John Paul conducts research on multimodal freight markets and holds a Ph.D. in English literature from the University of Michigan. Prior to building a research team at FreightWaves, JP spent two years on the editorial side covering trucking markets, freight brokerage, and M&A.