Abstract: In this letter, we propose a low-complexity universal decoder for product codes based on guessing codeword decoding (GCD). Recognizing that errors typically concentrate at the intersections ...
Abstract: The Mixture of Experts (MoE) model is a promising approach for handling code-switching speech recognition (CS-ASR) tasks. However, the existing CS-ASR work on MoE has yet to leverage the ...