Enhancing Speech Codec Efficiency with Intra-Inter Broad Attention Mechanism
DOI: https://doi.org/10.64758/eht55h36

Keywords: attention mechanisms, dual-branch conformer, bitrate efficiency, redundancy

Abstract
This paper introduces a new approach to speech compression that combines advanced attention mechanisms, LSTM integration, and dual-branch conformer structures to optimize codec efficiency. The study addresses five research questions, concerning intra-inter broad attention, multi-head attention networks, LSTM-based sequence modeling, redundancy elimination, and the comparative performance of IBACodec against traditional codecs. A quantitative methodology is used, with performance metrics including bitrate efficiency and quality evaluation. The results confirm that IBACodec significantly improves context awareness, compression efficiency, sequence modeling, and redundancy elimination relative to existing solutions. These findings position IBACodec as a leading approach to speech compression. Further research is needed to explore real-time applications and broader datasets.
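The abstract names multi-head attention as one of the codec's core components. As a hedged illustration of that general mechanism only (not the paper's actual IBACodec architecture), a minimal NumPy sketch of multi-head self-attention over a sequence of speech feature frames might look like the following; the dimensions, random projection matrices, and the `multi_head_attention` helper are all hypothetical choices for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    # x: (seq_len, d_model) feature frames; projections are random
    # placeholders here, standing in for learned weights.
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Split channels into heads: (num_heads, seq_len, d_head).
    split = lambda t: t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    # Scaled dot-product attention per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # Merge heads back to (seq_len, d_model) and project out.
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

rng = np.random.default_rng(0)
frames = rng.standard_normal((50, 64))  # 50 speech frames, 64-dim features
ctx = multi_head_attention(frames, num_heads=8, rng=rng)
print(ctx.shape)  # -> (50, 64)
```

Each output frame is a weighted mixture of all input frames, which is what lets attention-based codecs capture long-range context that frame-local coding misses.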
