Image captioning is the process of generating captions, i.e. textual descriptions, for images based on their content. It is an AI task that involves both natural language processing (for generating the text) and computer vision (for understanding the image content). Automatic image captioning is an active and growing research problem, and new methods are introduced regularly to improve results in this field; nevertheless, considerable work remains before machine-generated captions match human quality. This study aims to determine, in a systematic way, which original and recent deep-learning models are used for image captioning, how those models are applied, and which approaches are most likely to yield good results. To do so, a systematic literature review was conducted on studies published between 2015 and 2020, drawn from well-known databases.
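To illustrate the kind of deep-learning captioning models such a review typically covers, the sketch below shows the common encoder-decoder pattern: a CNN encodes the image into a feature vector and a recurrent decoder generates the caption word by word. This is not taken from the paper; the layer sizes, vocabulary size, and toy CNN are illustrative assumptions (real systems usually use a pretrained backbone such as a ResNet).

```python
# Minimal encoder-decoder captioning sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn

class CaptionModel(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=256, hidden_dim=512):
        super().__init__()
        # Toy CNN encoder producing one image feature vector per image.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        # Prepend the image feature as the first "token" of the decoder input.
        feat = self.encoder(images).unsqueeze(1)      # (B, 1, E)
        tokens = self.embed(captions)                 # (B, T, E)
        seq = torch.cat([feat, tokens], dim=1)        # (B, T+1, E)
        hidden, _ = self.decoder(seq)
        return self.out(hidden)                       # (B, T+1, vocab) word logits

# Example usage with random inputs:
# logits = CaptionModel()(torch.randn(2, 3, 64, 64), torch.randint(0, 1000, (2, 12)))
```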


Title: A Systematic Literature Review on Story-Telling for Kids using Image Captioning - Deep Learning
Contributors: Haritha, P. (author) / Vimala, S. (author) / Malathi, S. (author)
Publication date: 2020-11-05
Size: 169258 bytes
Type of media: Conference paper
Type of material: Electronic Resource
Language: English



Unternehmenspotenziale mit "Story Telling" aufdecken und nutzen [Uncovering and exploiting business potential with "story telling"]

    Hesser, Erwin / Thier, Karin | IuD Bahn | 2007


    MPISTE: A Mobile, Personalised, Interactive Story Telling Environment

    Delgado-Mata, C / Velazquez, R / Pooley, R J et al. | IEEE | 2010


    Remote Sensing Image Captioning Using Transformer

    Wang, Binze / Xi, Jiangbo / Wang, Xingrun et al. | British Library Conference Proceedings | 2022


    Remote Sensing Image Captioning Using Transformer

    Wang, Binze / Xi, Jiangbo / Wang, Xingrun et al. | Springer Verlag | 2022