This chapter presents an improved incipient fault detection (FD) method based on the Kullback-Leibler divergence (KLD) within a multivariate statistical analysis (MVA) framework. Unlike traditional MVA-based FD methods, this methodology can detect slight anomalous behaviors by comparing the online probability density function (PDF) with a reference PDF obtained from large-scale offline datasets.
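The abstract outlines the general idea rather than the full algorithm, so the following is only a minimal sketch of a PCA-plus-KLD detection scheme under a Gaussian assumption on the principal-component scores. The function names, the closed-form Gaussian KLD, and the threshold value are illustrative assumptions, not the chapter's exact method.

```python
# Minimal PCA + KL-divergence fault-detection sketch (illustrative only,
# not the authors' exact algorithm). Assumes Gaussian-distributed scores.
import numpy as np

def fit_reference(X_ref, n_components=2):
    """Fit PCA on large-scale offline (fault-free) data and store the
    reference mean/variance of each retained score."""
    mu = X_ref.mean(axis=0)
    Xc = X_ref - mu
    # PCA via SVD of the centered data matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                      # loading matrix
    T = Xc @ P                                   # reference scores
    return {"mu": mu, "P": P,
            "score_mean": T.mean(axis=0),
            "score_var": T.var(axis=0, ddof=1)}

def kld_gaussian(m0, v0, m1, v1):
    """KL divergence KL(N(m0, v0) || N(m1, v1)) for univariate Gaussians."""
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

def detect(model, X_online, threshold=0.05):
    """Project an online window onto the PCA subspace, estimate its score
    PDF (Gaussian assumption), and flag a fault if the summed KLD between
    the online and reference score distributions exceeds the threshold."""
    T = (X_online - model["mu"]) @ model["P"]
    m_on, v_on = T.mean(axis=0), T.var(axis=0, ddof=1)
    d = sum(kld_gaussian(m_on[i], v_on[i],
                         model["score_mean"][i], model["score_var"][i])
            for i in range(T.shape[1]))
    return d, d > threshold
```

Because the KLD compares whole distributions rather than individual samples against control limits, even a small shift in the score mean or variance accumulates in the divergence, which is what makes the approach sensitive to incipient faults.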

Title: PCA and Kullback-Leibler Divergence-Based FDD Methods

Additional title: Lecture Notes in Intelligent Transportation and Infrastructure

Contributors: Chen, Hongtian (author) / Jiang, Bin (author) / Lu, Ningyun (author) / Chen, Wen (author)

Publication date: 2020-04-26

Size: 17 pages

Type of media: Article/Chapter (Book)

Type of material: Electronic Resource

Language: English
Ellipticity and Circularity Measuring via Kullback–Leibler Divergence
Misztal, K. | British Library Online Contents | 2016

Kullback-Leibler Boosting
Liu, C. / Shum, H.-Y. | IEEE | 2003

Kullback-Leibler Boosting
Liu, C. / Shum, H.-Y. | British Library Conference Proceedings | 2003