Researcher says autonomous vehicles that ‘talk’ to each other could solve industry's problems

Jim Rogers

Photo: Golf cart converted into a vehicle for testing

While autonomous vehicle technology has advanced significantly over the last decade, one persistent problem remains: perception. Like human drivers, autonomous vehicles can only react to the information they collect. But one UNT researcher believes he can make data collection and processing more efficient and robust.

“Consensus is growing among engineers and researchers that self-driving cars aren’t yet perceptive enough to make them safe to drive on public roads,” said Qing Yang, an assistant professor in the Department of Computer Science and Engineering. “But, what if we linked two cars, or three or hundreds together?”

Autonomous vehicles collect sensor data and process it with machine learning models that work in stages, layer by layer. Yang’s research involves separating the processed data from the raw data and sharing only the processed data with other vehicles. He says that method would reduce the amount of data to be transmitted and decrease the amount of processing needed at the receiving end, while still providing the most important information.

“Rather than one vehicle sending enormous amounts of raw data to another vehicle that is already processing its own enormous amounts of raw data, I believe that we should use the data already processed by the machine learning programs,” he said. “This would allow autonomous vehicles to process and understand data much faster.”
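To make the idea concrete, the sketch below shows the difference in size between a raw camera frame and an intermediate feature map produced by the early layers of a perception network. The network architecture, layer sizes and shapes here are purely illustrative assumptions, not Yang’s actual models; the point is only that the already-processed tensor a vehicle would transmit is far smaller than the raw frame it came from.

```python
# Minimal sketch: share intermediate features instead of raw sensor frames.
# All model shapes and names here are illustrative, not Yang's actual system.
import torch
import torch.nn as nn

class PerceptionEncoder(nn.Module):
    """First few layers of a hypothetical perception network."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),   # downsample
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # downsample again
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((32, 32)),                         # fixed-size feature map
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.encoder(frame)

# A raw camera frame (batch of 1, RGB, 1280x720) vs. the feature map derived from it.
raw_frame = torch.rand(1, 3, 720, 1280)
features = PerceptionEncoder()(raw_frame)

print("raw bytes:    ", raw_frame.numel() * raw_frame.element_size())
print("feature bytes:", features.numel() * features.element_size())
# The feature tensor is what a vehicle would transmit; the receiver feeds it
# into the remaining layers of its own network instead of reprocessing raw pixels.
```

In this toy setup the feature map is roughly two orders of magnitude smaller than the raw frame, which is the kind of saving that makes vehicle-to-vehicle sharing practical on a real wireless link.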

For example, Yang explained that a vehicle’s lidar sensor, a distance-detecting device that uses lasers, may detect an object on the road ahead and provide a rough shape and distance, whereas a vehicle’s camera could provide a clear picture of the object and perhaps a rough distance. But the data from the two sensors must be combined to provide the best picture of what’s ahead.

“This data must be collected, coordinated and acted upon. Machine learning programs will utilize the collected data to build an informational map and use the end result to make a decision about what actions to take regarding the object on the road,” he said. “Why send everything when only the final map is needed?”
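The short sketch below illustrates that kind of fusion: a lidar return supplies accurate geometry, a camera detection supplies the object’s identity, and the two are merged into one estimate. The field names and the simple weighting rule are assumptions made for illustration, not the specific fusion method used in Yang’s work.

```python
# Minimal sketch of combining one lidar return with one camera detection into a
# single object estimate. Field names and the weighting rule are illustrative.
from dataclasses import dataclass

@dataclass
class LidarDetection:
    distance_m: float      # precise range from the laser return
    bbox_size_m: float     # rough physical extent of the object

@dataclass
class CameraDetection:
    label: str             # e.g. "pedestrian", from an image classifier
    distance_m: float      # coarse range estimated from the image
    confidence: float      # classifier confidence in [0, 1]

@dataclass
class FusedObject:
    label: str
    distance_m: float
    size_m: float

def fuse(lidar: LidarDetection, camera: CameraDetection) -> FusedObject:
    # Trust lidar for geometry and the camera for identity; average the two
    # ranges, weighting lidar heavily since its distance measurement is better.
    distance = 0.9 * lidar.distance_m + 0.1 * camera.distance_m
    return FusedObject(label=camera.label, distance_m=distance, size_m=lidar.bbox_size_m)

obstacle = fuse(
    LidarDetection(distance_m=42.3, bbox_size_m=0.6),
    CameraDetection(label="pedestrian", distance_m=45.0, confidence=0.93),
)
print(obstacle)  # FusedObject(label='pedestrian', distance_m=42.57, size_m=0.6)
```

Under Yang’s proposal, it is the fused result, not the underlying lidar point cloud or camera frames, that a vehicle would pass along to its neighbors.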

At this time, there is no centralized design for this process. Automakers use different systems to collect and process information. For a cooperative system to be successful, all vehicle data must be understood by all vehicles. Part of Yang’s research involves developing a standard format that could be used by the entire industry, as in the sketch below. He is also working on a communication system that would allow for connectivity between vehicles in non-line-of-sight positions.
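A vendor-neutral format of that kind might look like the minimal message schema sketched here: a versioned, self-describing envelope around the processed detections. The schema, version number, and field names are assumptions for illustration only, not an existing industry standard or Yang’s proposed format.

```python
# Minimal sketch of a vendor-neutral message that could carry processed
# perception data between vehicles. The schema and field names are assumptions
# for illustration, not an industry standard or Yang's proposed format.
import json
import time

def make_perception_message(sender_id: str, objects: list[dict]) -> str:
    """Serialize a list of fused object estimates into a shareable message."""
    message = {
        "schema_version": "0.1",     # lets receivers reject unknown formats
        "sender_id": sender_id,      # which vehicle produced the data
        "timestamp_s": time.time(),  # when the perception result was produced
        "objects": objects,          # processed detections, not raw sensor data
    }
    return json.dumps(message)

def parse_perception_message(payload: str) -> dict:
    message = json.loads(payload)
    if message.get("schema_version") != "0.1":
        raise ValueError("unsupported perception message version")
    return message

wire = make_perception_message(
    sender_id="vehicle-17",
    objects=[{"label": "pedestrian", "distance_m": 42.57, "bearing_deg": 3.0}],
)
received = parse_perception_message(wire)
print(received["objects"][0]["label"])  # pedestrian
```

Whatever the eventual format looks like, the key requirement is the one Yang identifies: every vehicle, regardless of manufacturer, must be able to parse and act on what every other vehicle sends.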

The principal investigator on this project, Yang received a $400,000 grant from the National Science Foundation for his research into “Enabling Machine Learning-Based Cooperative Perception with mmWave Communication for Autonomous Vehicle Safety.” As part of this grant, $150,000 will be provided to the University of Massachusetts at Dartmouth for their help with the project.
