Leveraging Mixture of Experts for Improved Speech Deepfake Detection

Welcome to the GitHub repository for the paper "Leveraging Mixture of Experts for Improved Speech Deepfake Detection."

Overview

This repository contains the code and resources for a Mixture of Experts (MoE) architecture that improves speech deepfake detection. The proposed approach leverages the MoE framework to generalize better across unseen datasets and to adapt effectively to the challenges posed by evolving deepfake generation techniques.

Key Features:

  • Mixture of Experts Architecture: A modular approach to handle input variability and improve generalization.
  • Gating Mechanism: A lightweight network that dynamically selects and weights experts to optimize detection performance (see the sketch after this list).
  • Scalable Updates: The modular structure allows easy adaptation to new data and evolving deepfake generation methods.
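
To illustrate how these pieces fit together, below is a minimal, hypothetical sketch of an MoE detector with a gating network, written in PyTorch. The expert architecture, feature dimension, and number of experts are illustrative assumptions and do not reflect the configuration used in the paper; the official implementation will be released in this repository.

import torch
import torch.nn as nn

class MoEDetector(nn.Module):
    """Hypothetical MoE speech deepfake detector (not the paper's model)."""

    def __init__(self, feat_dim: int = 128, num_experts: int = 4):
        super().__init__()
        # Each expert is a small MLP scoring an utterance embedding as
        # real (index 0) vs. fake (index 1).
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(feat_dim, 64),
                    nn.ReLU(),
                    nn.Linear(64, 2),
                )
                for _ in range(num_experts)
            ]
        )
        # Lightweight gating network: one softmax weight per expert,
        # computed from the same input features.
        self.gate = nn.Linear(feat_dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, feat_dim) utterance-level features
        weights = torch.softmax(self.gate(x), dim=-1)           # (batch, E)
        logits = torch.stack([e(x) for e in self.experts], 1)   # (batch, E, 2)
        # Convex combination of expert logits using the gating weights.
        return (weights.unsqueeze(-1) * logits).sum(dim=1)      # (batch, 2)

if __name__ == "__main__":
    model = MoEDetector()
    scores = model(torch.randn(8, 128))  # batch of 8 random feature vectors
    print(scores.shape)                  # torch.Size([8, 2])

Because the gating network in this sketch is a single linear layer, expert selection adds little overhead, and new experts can be appended and the gate retrained as new deepfake generation methods appear.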

Current Status ⚠️🚧

This repository is currently under construction. The code for the Mixture of Experts model and its evaluation will be released soon. Stay tuned for updates!

Citation

If you use this code in your research, please cite the following paper:

Negroni, V., Salvi, D., Mezza, A. I., Bestagini, P., & Tubaro, S. (2024). Leveraging Mixture of Experts for Improved Speech Deepfake Detection. In IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

@inproceedings{Negroni2024,
  title={Leveraging Mixture of Experts for Improved Speech Deepfake Detection},
  author={Negroni, Viola and Salvi, Davide and Mezza, Alessandro Ilic and Bestagini, Paolo and Tubaro, Stefano},
  booktitle={IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2024}
}

Contact

For any inquiries or collaboration requests, please contact the authors.
