
A SUPERVISED LEARNING APPROACH WITH RESIDUAL ATTENTION CONNECTIONS

journal contribution
posted on 2025-09-08, 01:43 authored by Ali Hamza, Fazal Muhammad, Talha Ali, Fazal E-Wahab, Muhammad Ismail
Our study aims to improve speech quality in the presence of background noise, which often disrupts clear communication, with a focus on models that are both effective and efficient enough to run on resource-limited devices. Drawing inspiration from computational auditory scene analysis, we train our models to separate speech from background noise while keeping computational demands low. We introduce two models: CRN-WRC (Convolutional Recurrent Network without Residual Connections) and CRN-RCAG (Convolutional Recurrent Network with Residual Connections and Attention Gates). Evaluation across a range of noise types and noise levels shows that both models significantly enhance speech quality and intelligibility, with CRN-RCAG consistently outperforming CRN-WRC, particularly on noise types unseen during training. Integrating residual connections and attention gates delivers these gains while preserving computational efficiency.
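The abstract names attention gates on the residual (skip) paths as the key addition in CRN-RCAG, but gives no architectural details. As a rough illustration of the general idea, the sketch below implements a generic additive attention gate in NumPy: a decoder gating signal rescales the encoder skip features before they are merged. All shapes, weights, and function names here are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(skip, gate, W_x, W_g, psi):
    """Generic additive attention gate (illustrative, not the paper's exact design).

    skip : encoder features carried over the skip/residual path, shape (T, F)
    gate : decoder gating signal aligned with `skip`, shape (T, F)
    W_x, W_g : (F, H) projection matrices; psi : (H, 1) scoring vector.

    Each time frame gets an attention weight in (0, 1); the skip features
    are rescaled by that weight, suppressing frames dominated by noise.
    """
    scores = sigmoid(relu(skip @ W_x + gate @ W_g) @ psi)  # shape (T, 1)
    return skip * scores  # broadcast the per-frame weight over features

# Toy example: 4 time frames, 3 features, hidden size 2 (all hypothetical).
rng = np.random.default_rng(0)
skip = rng.standard_normal((4, 3))
gate = rng.standard_normal((4, 3))
out = attention_gate(skip, gate,
                     rng.standard_normal((3, 2)),
                     rng.standard_normal((3, 2)),
                     rng.standard_normal((2, 1)))
```

Because the gate's weights sit in (0, 1), the gated output can only attenuate the skip features, never amplify them — one way such a mechanism can help a network discount unreliable skip information under unseen noise.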

Location

[Türkiye]

Open access

  • Yes

Language

eng

Journal

Journal of Science, Technology and Engineering Research

Pagination

78-85

ISSN

1450-202X

eISSN

1450-202X

Publisher

European Journals Inc.
