
# LaPred: Lane-Aware Prediction of Multi-Modal Future Trajectories of Dynamic Agents

This repository contains the code for *LaPred: Lane-Aware Prediction of Multi-Modal Future Trajectories of Dynamic Agents* (CVPR 2021) by ByeoungDo Kim, Seong Hyeon Park, Seokhwan Lee, Elbek Khoshimjonov, Dongsuk Kum, Junsoo Kim, Jeong Soo Kim, and Jun Won Choi.


```bibtex
@InProceedings{Kim_2021_CVPR,
    author    = {Kim, ByeoungDo and Park, Seong Hyeon and Lee, Seokhwan and Khoshimjonov, Elbek and Kum, Dongsuk and Kim, Junsoo and Kim, Jeong Soo and Choi, Jun Won},
    title     = {LaPred: Lane-Aware Prediction of Multi-Modal Future Trajectories of Dynamic Agents},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {14636-14645}
}
```

## Dataset

### Dataset Preprocessing

- Run the preprocessing script to extract preprocessed samples.
- Provide the path to the downloaded data via the `--path` (`-p`) option (default: `./nuscenes/dataset`).

```shell
python dataset_preprocess.py -p [dataset-path]
```

## Model Training - nuScenes

- To train the LaPred model, run `run.py`:

```shell
python run.py -m Lapred_original
```

## Model Evaluation - nuScenes

- After training, you can evaluate the model with the `--eval` (`-e`) option:

```shell
python run.py -m Lapred_original -e
```
