A Markov Decision Process (MDP) model for activity-based travel demand modeling

wlxiong/PyMarkovActv


Summary

This project proposes a unified modeling framework for a traveller's choice of activity type, timing, and duration. The traveler is assumed to be forward-looking: when making the current choice, the traveler recognizes its impact on future utility and takes that future utility into account. The activity scheduling problem is therefore formulated as a Markov Decision Process. Dynamic programming is used to solve the scheduling problem, and the maximum likelihood method is employed to estimate the model parameters.
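The forward-looking scheduling logic can be illustrated with backward-induction dynamic programming on a toy finite-horizon MDP. Everything below is a minimal sketch: the activity set, the utility table, and the function names are illustrative assumptions, not PyMarkovActv's actual API, and the sketch deliberately omits travel times and state-dependent transition constraints.

```python
# Backward-induction sketch for a finite-horizon activity-scheduling MDP.
# All names and numbers are hypothetical, not taken from PyMarkovActv.

T = 4                                  # planning horizon (discrete time steps)
ACTIVITIES = ["home", "work", "shop"]  # hypothetical choice set

def utility(activity, t):
    """One-step utility of performing `activity` at time t (made-up numbers)."""
    table = {
        "home": [3, 1, 1, 3],
        "work": [1, 4, 4, 1],
        "shop": [0, 2, 3, 2],
    }
    return table[activity][t]

def solve():
    """Compute the value function V[t][a] and a greedy policy by
    backward induction: V[t][a] is the maximum total utility obtainable
    from time t onward, given the current activity a."""
    V = [{a: 0.0 for a in ACTIVITIES} for _ in range(T + 1)]  # V[T] = 0
    policy = [{} for _ in range(T)]
    for t in range(T - 1, -1, -1):
        for a in ACTIVITIES:
            # Forward-looking choice: current utility plus best continuation.
            best_next, best_val = max(
                ((b, utility(b, t) + V[t + 1][b]) for b in ACTIVITIES),
                key=lambda kv: kv[1],
            )
            V[t][a] = best_val
            policy[t][a] = best_next
    return V, policy
```

With these made-up utilities, `solve()` yields the schedule home → work → work → home with total utility 14, showing how the traveler sacrifices a better immediate option when the continuation value favors a different activity.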

Publication

Yiliang Xiong and William H.K. Lam. Modeling Within-day Dynamics in Activity Scheduling: A Markov Decision Process Approach. Journal of the Eastern Asia Society for Transportation Studies, Vol. 9 (2011), pp. 452–467.
