Oct 28, 2021 · Semi-Siamese Bi-encoder Neural Ranking Model Using Lightweight Fine-Tuning, by Euna Jung (Seoul National University, Republic of Korea) and 2 other authors.
Mar 2, 2022 · Our semi-Siamese networks are based on a common pre-trained BERT model that is not fine-tuned at all. Instead, a mild differentiation between ...
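The snippet above describes the core idea: both encoder sides share one frozen pre-trained BERT, and only small per-side parameters differentiate the query and document towers. The following is a minimal toy sketch of that structure, not the paper's implementation: `frozen_encoder` is a stand-in for frozen BERT, and the per-side delta vectors stand in for lightweight modules (prefixes, adapters, or similar); all names here are assumptions for illustration.

```python
# Toy sketch of a semi-Siamese bi-encoder (illustrative only):
# one shared, frozen encoder; the two sides differ solely through
# tiny trainable per-side parameter vectors.

DIM = 8

def frozen_encoder(text):
    """Stand-in for a shared, frozen pre-trained BERT:
    deterministic and never updated during training."""
    vec = [0.0] * DIM
    for i, ch in enumerate(text.encode()):
        vec[i % DIM] += ch / 255.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

class SemiSiameseBiEncoder:
    def __init__(self):
        # The ONLY trainable parameters: one small vector per side,
        # standing in for lightweight fine-tuning modules.
        self.query_delta = [0.0] * DIM
        self.doc_delta = [0.0] * DIM

    def encode_query(self, q):
        base = frozen_encoder(q)                      # shared frozen backbone
        return [b + d for b, d in zip(base, self.query_delta)]

    def encode_doc(self, d):
        base = frozen_encoder(d)                      # same frozen backbone
        return [b + dd for b, dd in zip(base, self.doc_delta)]

    def score(self, q, d):
        # Dot-product relevance score between the two sides.
        return sum(a * b for a, b in
                   zip(self.encode_query(q), self.encode_doc(d)))

model = SemiSiameseBiEncoder()
s = model.score("neural ranking", "bi-encoder ranking models")
# "Mild differentiation": nudge only the query-side parameters;
# the frozen backbone is untouched.
model.query_delta[0] = 0.1
```

The point of the sketch is the parameter layout: training touches only `query_delta` and `doc_delta`, so the two sides stay "semi"-Siamese rather than fully tied or fully independent.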
This work shows two approaches for improving the performance of BERT-based bi-encoders by replacing the full fine-tuning step with a lightweight fine-tuning ...
Their experimental results demonstrate that these two methods generally perform on par with, or even outperform, full fine-tuning while tuning less than 1% of ...
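To make the "less than 1%" figure concrete, here is a back-of-the-envelope calculation with typical, assumed numbers (not taken from the paper): BERT-base has roughly 110M parameters, and a prefix-tuning-style method adds a short prefix of key and value vectors at every layer.

```python
# Illustrative arithmetic with assumed, typical values:
# how a lightweight method can stay well under 1% of BERT-base.
bert_base_params = 110_000_000      # ~110M parameters in BERT-base
hidden, layers, prefix_len = 768, 12, 20

# Prefix-tuning-style budget: prefix_len vectors for keys and for
# values, at each of the 12 layers.
tuned = prefix_len * layers * 2 * hidden
fraction = tuned / bert_base_params
print(f"{tuned:,} tuned params = {fraction:.4%} of BERT-base")
```

With these assumed settings the tuned budget is about 0.3% of the model, consistent in spirit with the "less than 1%" claim above.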
This GitHub repository accompanies the paper Semi-Siamese Bi-encoder Neural Ranking Model Using Lightweight Fine-Tuning, published at WWW 2022. We provide code for ...
We propose an innovative use of inversed-contrastive loss, focusing on identifying the negative sentence, and fine-tuning BERT with a dataset generated via ...
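The snippet above mentions an inversed-contrastive loss that focuses on identifying the negative sentence. The exact formulation is not given in these fragments, so the sketch below is one plausible reading, stated as an assumption: a standard contrastive (InfoNCE-style) loss picks out the positive among candidates, and the "inversed" variant flips the similarities so the objective becomes picking out the negative instead.

```python
# Hypothetical sketch of an "inversed" contrastive loss
# (an assumed reading, not the paper's exact formula).
import math

def softmax(xs):
    m = max(xs)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

def contrastive_loss(sims, pos_idx):
    """Standard contrastive objective: identify the positive candidate."""
    return -math.log(softmax(sims)[pos_idx])

def inversed_contrastive_loss(sims, neg_idx):
    """Flipped objective: with negated similarities, the model is
    trained to identify the NEGATIVE candidate instead."""
    return -math.log(softmax([-s for s in sims])[neg_idx])

sims = [0.9, 0.2, -0.5]   # one anchor's similarities to 3 candidates
loss_pos = contrastive_loss(sims, pos_idx=0)
loss_neg = inversed_contrastive_loss(sims, neg_idx=2)
```

Under this reading, the inversed loss decreases as the negative's similarity drops further below the others, which matches the stated focus on identifying the negative sentence.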
Dec 29, 2022 · Improving Bi-encoder Neural Ranking Models using Knowledge Distillation and Lightweight Fine-tuning ... 4 Semi-Siamese Lightweight Fine-tuning
The results confirm that both lightweight fine-tuning and the semi-Siamese design considerably improve BERT-based bi-encoders.