Professors Building Self-Locating Autonomous Vehicles

Feb. 21, 2017
This deep learning system could allow self-driving cars to navigate, maneuver and respond to changing road conditions by matching data from onboard sensors to maps.

Researchers at the NYU Tandon School of Engineering are developing a deep learning system that will allow self-driving cars to navigate, maneuver and respond to changing road conditions by matching data from onboard sensors to information on HERE HD Live Map, a cloud-based service for automated driving.

Self-driving cars could account for 21 million new vehicles sold every year by 2035. Over the next decade alone, such vehicles—and vehicles with assisted-driving technology—could deliver $1 trillion in societal and consumer benefits due to their improved safety.

The researchers said that for autonomous vehicles to make good on that promise, they will need onboard artificial intelligence technology able to link to highly detailed maps that reflect every change in the status of lanes, hazards, obstacles, and speed limits in real time.

Yi Fang, a research assistant professor in the Department of Electrical and Computer Engineering and a faculty member at NYU Abu Dhabi, and Edward K. Wong, an associate professor in the NYU Tandon Department of Computer Science and Engineering, are leading the project. The NYU Multimedia and Visual Computing Lab directed by Professor Fang will house the collaborative project.

Fang and Wong recently received gift funding from HERE, a global leader in mapping and location-based services owned by Audi, BMW, Daimler and Intel, with Tencent and NavInfo of China and GIC of Singapore also poised to become investors during 2017. NYU Tandon is one of HERE's first university research and development partners in HERE HD Live Map.

High-definition (HD) maps meant for machine-to-machine communication must be accurate to within 10-20 centimeters. Self-driving vehicles need to continuously update, or register, their location on these maps with an equally high degree of accuracy, according to Fang, who said that the goal of the collaborative research is to enhance car-to-map precision to within 10 centimeters.

"Our work involves employing computer vision techniques to refine the vehicle's ability to continually locate itself with respect to HERE's cloud-based service," said Wong. "That requires real-time images of the street and surrounding objects derived from cameras, LiDAR [a laser-based range-finding technology], and other on-board sensors."  

The researchers added that this precision is also important because automobiles connected to HERE's HD Live Map service will deliver data to the cloud on road conditions, traffic, weather, obstacles, speed limits, and other variables, allowing the service to update in near real time to reflect changing conditions.
