The mixture-of-experts (MoE) architecture aggregates several expert components via an additional gating module, which learns to predict the most suitable weighting of the experts' outputs for each input. An MoE thus does not rely on redundancy alone for increased robustness; we also demonstrate how this architecture can provide additional interpretability while retaining performance similar to that of a standalone network. As an example, we train expert networks to perform semantic segmentation of traffic scenes and combine them into an MoE with an additional gating network. Our experiments with two different expert model architectures (FRRN and DeepLabv3+) reveal that the MoE is able to reach, and for certain data subsets even surpass, the baseline performance, and that it also outperforms a simple aggregation via ensembling. A further advantage of an MoE is its increased interpretability: comparing the pixel-wise predictions of the whole MoE model with those of the participating experts helps to identify regions of high uncertainty in an input.
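
As a rough illustration of the aggregation scheme described above, the following minimal PyTorch sketch combines two per-pixel experts through a learned gate. The names used here (SimpleExpert, MixtureOfExperts) and the image-level gating design are assumptions for illustration, not the paper's implementation; in the paper the experts are full FRRN or DeepLabv3+ segmentation networks.

import torch
import torch.nn as nn

class SimpleExpert(nn.Module):
    # Stand-in for a segmentation expert (e.g. FRRN or DeepLabv3+).
    def __init__(self, in_ch=3, num_classes=19):
        super().__init__()
        self.net = nn.Conv2d(in_ch, num_classes, kernel_size=1)

    def forward(self, x):
        return self.net(x)  # per-pixel class logits, shape (B, C, H, W)

class MixtureOfExperts(nn.Module):
    def __init__(self, experts, in_ch=3):
        super().__init__()
        self.experts = nn.ModuleList(experts)
        # Gating network: one softmax weight per expert for each input image.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_ch, len(experts)),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        weights = self.gate(x)                                      # (B, E)
        outputs = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, C, H, W)
        w = weights.view(*weights.shape, 1, 1, 1)                   # broadcast over C, H, W
        return (w * outputs).sum(dim=1)                             # weighted combination

moe = MixtureOfExperts([SimpleExpert(), SimpleExpert()])
logits = moe(torch.randn(2, 3, 64, 64))                             # (2, 19, 64, 64)

The softmax weights produced by the gate show how strongly each expert contributes to a given input; comparing the combined prediction with the individual experts' predictions is what enables the uncertainty analysis mentioned above.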



    Title: Evaluating Mixture-of-Experts Architectures for Network Aggregation

    Published in:

    Publication date: 2022-06-18

    Size: 19 pages

    Type of media: Article/Chapter (Book)

    Type of material: Electronic Resource

    Language: English





    Traffic speed forecasting by mixture of experts

    Coric, V. / Wang, Zhuang / Vucetic, S. | IEEE | 2011


    A Time Series is Worth Five Experts: Heterogeneous Mixture of Experts for Traffic Flow Prediction

    Wang, Guangyu / Chen, Yujie / Gao, Ming et al. | ArXiv | 2024


    Evaluating alternative air defense architectures

    Gandee, P.L. / Gray, M.D. / Sweet, R. | Tema Archive | 1987


    Interpretable Cascading Mixture-of-Experts for Urban Traffic Congestion Prediction

    Jiang, Wenzhao / Han, Jindong / Liu, Hao et al. | ArXiv | 2024


    Mixture of Experts based Model Integration for Traffic State Prediction

    Chattopadhyay, Rajarshi / Tham, Chen-Khong | IEEE | 2022