Submission of Survey Paper on Joint Perception and Prediction for Autonomous Driving
I am excited to share a major milestone in my PhD journey: the submission of our survey paper, titled “Joint Perception and Prediction for Autonomous Driving: A Survey”, to IEEE Transactions on Intelligent Transportation Systems for possible publication. To share our research with the broader community, we have also published a preprint on arXiv, now accessible at the following link: ArXiv Preprint.
The paper explores the integration of perception and prediction in autonomous driving systems. As vehicles navigate increasingly complex environments, the traditional approach of independently optimizing object detection, tracking, and motion prediction has shown critical limitations: inefficient use of compute and model capacity, error accumulation across pipeline stages, and poor propagation of uncertainty from one stage to the next.
To address these challenges, our survey dives deep into the emerging paradigm of joint perception and prediction. This unified framework leverages multi-task learning to combine these tasks into a single cohesive model, enabling richer interpretations of the environment while operating directly on raw sensor data.
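To make the idea concrete, here is a minimal, purely illustrative sketch (not a model from the paper — all function names and numbers are hypothetical stand-ins) of the joint design: one shared encoder consumes the raw sensor data, and task-specific heads for detection and motion prediction read the same features, so no lossy hand-off of discrete tracks sits between them.

```python
# Hypothetical sketch of joint perception and prediction:
# a shared encoder feeds two task heads trained together.
# Real systems use learned networks; plain functions stand in here.

def shared_encoder(sensor_frame):
    """Stand-in backbone: maps raw sensor values to shared features."""
    return [x * 0.5 for x in sensor_frame]

def detection_head(features):
    """Stand-in detection head: flags feature cells above a threshold."""
    return [f > 0.4 for f in features]

def prediction_head(features):
    """Stand-in prediction head: extrapolates each feature one step ahead."""
    return [f + 0.1 for f in features]

def joint_forward(sensor_frame):
    # Both heads read the SAME features, so scene context and
    # detection uncertainty reach the prediction head directly,
    # instead of being collapsed into hard detections first.
    features = shared_encoder(sensor_frame)
    return detection_head(features), prediction_head(features)

detections, forecasts = joint_forward([0.2, 0.9, 1.4])
```

In a trained system, the two heads would share gradients through the encoder via a combined multi-task loss, which is what lets the unified model outperform independently optimized pipeline stages.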
Our paper introduces a novel taxonomy that organizes existing approaches by input representation, scene context modeling, and output representation. We provide both qualitative analyses and quantitative comparisons of state-of-the-art methods, identifying their strengths, limitations, and promising directions for future research.
This work is particularly exciting because it is the first comprehensive survey dedicated to this paradigm. I believe it will serve as a valuable resource for researchers and practitioners striving to advance the capabilities of autonomous systems.
Feel free to check out the paper on arXiv and stay tuned for more updates. The PDF will also be embedded here for easy access. Your feedback and thoughts are always welcome!