Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATHCAD)
The PATHCAD project aims to provide all-weather sensing for driver assistance and, ultimately, autonomous vehicle operation through the fusion of 3D low-THz radar and video imagery.
The project is part of the £11M Towards Autonomy: Smart and Connected Control (TASCC) programme, jointly funded by EPSRC and Jaguar Land Rover and made up of five consortia.
The University of Birmingham leads the PATHCAD consortium, which consists of Jaguar Land Rover, the University of Edinburgh, with world-class expertise in advanced signal processing and radar imaging, and Heriot-Watt University, with equivalent skills in video analytics, LiDAR and accelerated algorithms.
Our novel system will fuse imagery from an LTHz radar (0.3–0.7 THz) and video cameras to enable 'cross-learning', in which the system is trained to adapt to all weathers.
- LTHz sensing: The 0.3–1 THz spectrum is a novel region for automotive applications; its relatively short wavelength and wide bandwidth, compared with existing automotive radar systems, can bring key additional capabilities.
- Radar signal processing: Enhance azimuth resolution and provide images of near-optical quality for fusion and, ultimately, all-weather automated vehicle guidance.
- 3D radar imaging: The relatively short wavelength and cm-order range resolution will be exploited to develop generic analytical and simulation models for 3D image reconstruction.
- Video processing: High resolution and a large legacy of processing algorithms for interpreting the data, but not sufficient alone in the same conditions that challenge human drivers.
- Classification and identification: A significant improvement in cross-range image resolution, leading to substantially enhanced 'actor' detection and classification.
- Cross-learning: Train the radar algorithms in good weather using optical HD vision sensors so that performance is enhanced when the radar acts alone in low visibility and adverse weather.
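The cm-order range resolution mentioned above follows directly from the wide sweep bandwidth available at low-THz frequencies. A short worked example, using the standard radar relation ΔR = c/2B; the bandwidth figures below are illustrative assumptions, not project specifications:

```python
# Worked example: theoretical radar range resolution from sweep bandwidth.
# dR = c / (2 * B). Bandwidths shown are illustrative, not project specs.
C = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical range resolution in metres for a given sweep bandwidth."""
    return C / (2.0 * bandwidth_hz)

# A conventional automotive radar sweeping 1 GHz resolves ~15 cm,
# while a low-THz radar sweeping, say, 15 GHz resolves ~1 cm.
print(range_resolution(1e9))   # 0.15 (m)
print(range_resolution(15e9))  # 0.01 (m)
```

The same relation explains why the wide bandwidths accessible in the 0.3–1 THz region make cm-order resolution, and hence 3D image reconstruction, feasible.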
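The cross-learning idea above can be sketched in miniature: in good weather, a vision detector supplies labels ("pseudo-labels") for radar returns, and a radar-only classifier is trained on them so it can later operate alone in poor visibility. Everything below is an illustrative toy, a one-feature synthetic dataset and a hand-rolled logistic regression, not the project's actual pipeline or data:

```python
import math
import random

random.seed(0)

# Synthetic radar feature for 200 scenes; in this toy world the feature
# correlates with whether an 'actor' (pedestrian/vehicle) is present.
X = [random.gauss(0.0, 1.0) for _ in range(200)]
actor = [x + 0.1 * random.gauss(0.0, 1.0) > 0 for x in X]

# Stand-in for the good-weather HD vision detector: it labels the scenes,
# providing supervision without any manual annotation of radar data.
pseudo_labels = [1.0 if a else 0.0 for a in actor]

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# Train a logistic-regression radar classifier on the vision labels.
w, b = 0.0, 0.0
for _ in range(500):
    gw = gb = 0.0
    for x, y in zip(X, pseudo_labels):
        err = sigmoid(w * x + b) - y
        gw += err * x
        gb += err
    w -= 0.5 * gw / len(X)
    b -= 0.5 * gb / len(X)

# The radar model now detects actors on its own (e.g. when fog blinds
# the cameras), which is the point of the cross-learning scheme.
pred = [sigmoid(w * x + b) > 0.5 for x in X]
accuracy = sum(p == a for p, a in zip(pred, actor)) / len(X)
```

The design point is that the vision sensor acts as a free labelling oracle in conditions where it is reliable, transferring its competence to the sensor that still works when conditions degrade.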