Beyond monolingual assumptions: a survey of code-switched NLP in the era of large language models
Source: arXiv
ISSN: 2331-8422
Date Issued: 2025-10
Author(s): Sheth, Rajvee; Sinha, Samridhi Raj; Patil, Mahavir; Beniwal, Himanshu; Singh, Mayank
Abstract
Code-switching (CSW), the alternation of languages and scripts within a single utterance, remains a fundamental challenge for multilingual NLP, even amid the rapid advances of large language models (LLMs). Most LLMs still struggle with mixed-language inputs, while limited CSW datasets and evaluation biases further hinder deployment in multilingual societies. This survey provides the first comprehensive analysis of CSW-aware LLM research, reviewing 308 studies spanning five research areas, 12 NLP tasks, 30+ datasets, and 80+ languages. We classify recent advances by architecture, training strategy, and evaluation methodology, outlining how LLMs have reshaped CSW modeling and which challenges persist. The paper concludes with a roadmap emphasizing the need for inclusive datasets, fair evaluation, and linguistically grounded models to achieve truly multilingual intelligence.
