Long short-term memory (LSTM) networks produce promising results in traffic flow prediction. However, LSTMs need large amounts of data to produce satisfactory results. This study therefore investigated the effect of training set size on LSTM performance and the optimum training set size for short-term traffic flow prediction. To this end, the number of samples in the training set was varied between 480 and 2800, and the prediction performance of LSTMs trained on these adjusted training sets was measured. In addition, the LSTM results were compared with those of nonlinear autoregressive (NAR) neural networks trained on the same training sets. Increasing the LSTM training set size was found to improve performance up to a certain point, beyond which performance declined. Three main results emerged from this study: first, using the optimum training set size significantly improves the LSTM's prediction performance; second, LSTM produces better short-term traffic forecasts than NAR; third, LSTM predictions fluctuate less than those of the NAR model after sudden changes in traffic flow.
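
The sketch below illustrates the kind of experiment the abstract describes, not the paper's exact setup: an LSTM is trained on sliding windows of a traffic-flow series at several training set sizes (480 and 2800 come from the abstract; the intermediate sizes, the synthetic data, the window length of 12, the 32-unit layer, and the epoch count are all assumptions), then scored on a fixed test set.

    # Minimal sketch (assumed setup, synthetic data) of varying the LSTM
    # training set size and measuring test error, as in the study above.
    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    def make_windows(series, lag):
        """Turn a 1-D series into (samples, lag, 1) inputs and next-step targets."""
        X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
        y = series[lag:]
        return X[..., np.newaxis], y

    # Synthetic traffic flow: a daily cycle plus noise (stand-in for real counts).
    rng = np.random.default_rng(0)
    t = np.arange(4000)
    flow = 100 + 40 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 5, t.size)

    lag = 12
    X, y = make_windows(flow, lag)
    X_test, y_test = X[-500:], y[-500:]  # fixed held-out test set

    for n_train in (480, 960, 1920, 2800):  # training set sizes to compare
        model = Sequential([LSTM(32, input_shape=(lag, 1)), Dense(1)])
        model.compile(optimizer="adam", loss="mse")
        model.fit(X[:n_train], y[:n_train], epochs=20, batch_size=32, verbose=0)
        mae = np.mean(np.abs(model.predict(X_test, verbose=0).ravel() - y_test))
        print(f"train size {n_train}: test MAE = {mae:.2f}")

Plotting test MAE against n_train would reproduce the qualitative finding reported above: error falls as the training set grows, then worsens past the optimum size.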





    Title:

    ANALYSIS AND COMPARISON OF LONG SHORT-TERM MEMORY NETWORKS SHORT-TERM TRAFFIC PREDICTION PERFORMANCE


    Contributors:
    Erdem DOGAN (author)


    Publication date:

    2020



    Type of media:

    Article (Journal)


    Type of material:

    Electronic Resource


    Language:

    Unknown





    Similar titles:

    A Short-Term Traffic Flow Prediction Method Based on Long Short-Term Memory Network

    Ci, Yusheng / Xiu, Gaoqun / Wu, Lina | Springer Verlag | 2018


    A Short-Term Traffic Flow Prediction Method Based on Long Short-Term Memory Network

    Ci, Yusheng / Xiu, Gaoqun / Wu, Lina | British Library Conference Proceedings | 2019



    PLSTM: Long Short-Term Memory Neural Networks for Propagatable Traffic Congested States Prediction

    Zheng, Yuxin / Liao, Lyuchao / Zou, Fumin et al. | Springer Verlag | 2020