Emphasizing the auto in automobile: A unified approach for automated vehicles
The idea of driverless cars continues to make headlines across the world, including the recent revelation that researchers don car seat costumes to observe how the public interacts with cars that appear driverless. Despite the apparent absurdity of such research techniques, driverless cars are approaching the on-ramp to reality. A team of researchers has proposed an integrated framework to help cars interact without the human touch, quite literally.
The collaborative team includes researchers from Cranfield University in the UK, along with Tsinghua University, the Beijing Institute of Technology, and Xi'an Jiaotong University in China. They published their approach in IEEE/CAA Journal of Automatica Sinica (JAS), a joint publication of the IEEE and the Chinese Association of Automation.
"One of the main challenges associated with connected and automated driving is the lack of a systematic approach to integrate both vehicle connectivity and vehicle automation attributes for maximizing the performance benefits," said Dongpu Cao, a senior lecturer at the Advanced Vehicle Engineering Center at Cranfield University in the United Kingdom, and an author on the paper.
Cao and his team developed a framework that combines cyber, physical, and social systems, built on the understanding that vehicles operate at different levels of automation and that those levels can change depending on the vehicle operator as well as external conditions. A person may switch out of full automation to take over steering and braking when encountering a traffic jam, for example. Another person may prefer full control when driving through a residential area, but allow full automation while zipping down the highway.
How can separate driving approaches, with different levels of automation, come together for smooth and safe vehicle interactions? The researchers propose the use of parallel learning theory. Using machine learning, networked computers can analyze information about the physical vehicle, the human driver, and the driving task itself. These three spaces can be assessed and processed in parallel through a cloud-based learning network.
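The idea of assessing the vehicle, driver, and driving-task spaces concurrently and fusing the results can be sketched in code. The sketch below is purely illustrative: the assessment functions, thresholds, and fusion rule are hypothetical assumptions, not the paper's actual algorithms, and a thread pool stands in for the cloud-based learning network.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical assessors for the three spaces described in the framework;
# the names, inputs, and thresholds are illustrative assumptions only.
def assess_physical(vehicle):
    # Physical space: e.g. is the vehicle within its speed limit?
    return {"speed_ok": vehicle["speed"] <= vehicle["speed_limit"]}

def assess_driver(driver):
    # Driver space: e.g. is the human attentive enough to take over?
    return {"can_take_over": driver["attention"] >= 0.5}

def assess_task(task):
    # Task space: e.g. how complex is the current driving scene?
    return {"high_complexity": task["scene_complexity"] > 0.7}

def parallel_assessment(vehicle, driver, task):
    """Run the three assessments concurrently and fuse the results,
    mimicking (in spirit only) the parallel, cloud-based analysis."""
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = [
            pool.submit(assess_physical, vehicle),
            pool.submit(assess_driver, driver),
            pool.submit(assess_task, task),
        ]
        fused = {}
        for f in futures:
            fused.update(f.result())
    # Toy fusion rule: stay in full automation only when the scene is
    # simple; in a complex scene, hand over if the driver is ready.
    if fused["high_complexity"] and fused["can_take_over"]:
        return "handover_to_driver"
    return "full_automation" if fused["speed_ok"] else "slow_down"
```

In a complex scene with an attentive driver, this toy policy recommends a handover; in a simple scene it keeps full automation. A real system would of course replace these rules with learned models and vehicle-to-cloud communication.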
"The developed parallel driving framework is a ground-breaking approach in both short- and long-term, whose full realization in the long term requires systematic collaborative efforts from multidisciplinary sectors," Cao said.
Cao noted that a simplified version of this cloud-based understanding and communication could be achieved in the short term with relative ease.
"The demonstration system of parallel driving is being developed, which is expected to demonstrate the functions and performance potentials in the real-world driving environment in January 2018," Cao said.
Full text of the paper is available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8039015&tag=1
IEEE/CAA Journal of Automatica Sinica (JAS) is a joint publication of the Institute of Electrical and Electronics Engineers, Inc. (IEEE) and the Chinese Association of Automation. The objective of JAS is high-quality and rapid publication of articles, with a strong focus on new trends, original theoretical and experimental research and developments, emerging technologies, and industrial standards in automation. The coverage of JAS includes but is not limited to: automatic control, artificial intelligence and intelligent control, systems theory and engineering, pattern recognition and intelligent systems, automation engineering and applications, information processing and information systems, network-based automation, robotics, computer-aided technologies for automation systems, sensing and measurement, and navigation, guidance, and control. JAS is indexed by IEEE, ESCI, EI, Inspec, Scopus, SCImago, CSCD, and CNKI. We are pleased to announce that the new 2016 CiteScore (released by Elsevier) is 2.16, ranking in the top 26% of 211 publications in the Control and Systems Engineering category.
To learn more about JAS, please visit: http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6570654