Final published version, 310 KB, PDF document
Available under license: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution to conference - Without ISBN/ISSN › Conference paper › peer-review
TY - CONF
T1 - Contrastive Training with More Data
AU - Mander, Stephen
AU - Piao, Scott
AU - Rahmani, Hossein
PY - 2023/5/1
Y1 - 2023/5/1
N2 - This paper proposes a new method of contrastive training over multiple data points, focusing on the scaling issue present when using in-batch negatives. Our approach compares transformer training with dual encoders versus training with multiple encoders. Our method can provide a feasible approach to improve loss modelling as encoders scale.
AB - This paper proposes a new method of contrastive training over multiple data points, focusing on the scaling issue present when using in-batch negatives. Our approach compares transformer training with dual encoders versus training with multiple encoders. Our method can provide a feasible approach to improve loss modelling as encoders scale.
M3 - Conference paper
T2 - Eleventh International Conference on Learning Representations
Y2 - 1 May 2023
ER -