Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control

conference contribution
posted on 2024-09-23, 23:39 authored by Y Kim, S Seo, Jihong Park, M Bennis, SL Kim, J Choi
In this work, we compare emergent communication (EC), built upon multi-agent deep reinforcement learning (MADRL), with language-oriented semantic communication (LSC), which is empowered by a pre-trained large language model (LLM) using human language. In a multi-agent remote navigation task with multimodal inputs comprising location and channel maps, we show that EC incurs high training cost and struggles with multimodal data, whereas LSC incurs high inference computation cost due to the LLM's large size. To address their respective bottlenecks, we propose a novel framework of language-guided EC (LEC), which guides EC training with LSC via knowledge distillation (KD). Simulations corroborate that LEC achieves faster travel times while avoiding areas with poor channel conditions, and accelerates MADRL training convergence by up to 61.8% compared to EC.
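
The knowledge-distillation step described in the abstract can be pictured as adding a teacher-matching term to the MADRL training loss. The following is a minimal, hypothetical PyTorch sketch of that idea, assuming the EC speaker emits continuous message embeddings and a frozen LSC (LLM-based) pipeline supplies teacher embeddings; the module names, dimensions, the MSE distillation term, and the weight beta are illustrative assumptions, not the paper's actual implementation.

    # Hypothetical sketch of language-guided EC (LEC): the EC speaker's
    # message embedding is regularized toward an embedding produced by the
    # LSC (LLM) teacher via a knowledge-distillation term. All names and
    # dimensions below are illustrative assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ECSpeaker(nn.Module):
        """Toy emergent-communication speaker: observation -> message embedding."""
        def __init__(self, obs_dim: int = 32, msg_dim: int = 16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, msg_dim)
            )

        def forward(self, obs: torch.Tensor) -> torch.Tensor:
            return self.net(obs)

    def lec_loss(task_loss: torch.Tensor,
                 ec_msg: torch.Tensor,
                 lsc_msg: torch.Tensor,
                 beta: float = 0.5) -> torch.Tensor:
        """Total LEC objective: MADRL task loss plus a KD term pulling the
        EC message toward the (frozen) LSC teacher message."""
        kd = F.mse_loss(ec_msg, lsc_msg.detach())  # teacher is not updated
        return task_loss + beta * kd

    # Usage with random stand-ins for the policy loss and teacher messages.
    speaker = ECSpeaker()
    obs = torch.randn(8, 32)        # batch of agent observations
    lsc_msg = torch.randn(8, 16)    # embeddings from the LSC/LLM teacher
    task_loss = torch.tensor(1.0)   # placeholder for the MADRL policy loss
    loss = lec_loss(task_loss, speaker(obs), lsc_msg)
    loss.backward()

Here the teacher embedding is detached so gradients only update the EC speaker; in the paper's framing, this is what lets LSC guide EC training while the deployed agents run only the lightweight EC modules rather than the large LLM.
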

History

Pagination

2962-2967

Location

Denver, CO.

Open access

No

Start date

2024-06-09

End date

2024-06-13

ISSN

1550-3607

ISBN-13

978-1-7281-9054-9

Language

eng

Publication classification

E1 Full written paper - refereed

Title of proceedings

ICC 2024 : Proceedings of the IEEE International Conference on Communications 2024

Event

IEEE International Conference on Communications (2024 : Denver, CO.)

Publisher

IEEE

Place of publication

Piscataway, NJ.
