
Electronic data

  • 66_contrastive_training_with_more

    Final published version, 310 KB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Contrastive Training with More Data

Research output: Contribution to conference - Without ISBN/ISSN › Conference paper (peer-reviewed)

Published
Publication date: 1/05/2023
Original language: English
Event: Eleventh International Conference on Learning Representations - Kigali, Rwanda
Duration: 1/05/2023 → …
https://iclr.cc/Conferences/2023

Conference

Conference: Eleventh International Conference on Learning Representations
Abbreviated title: ICLR 2023
Country/Territory: Rwanda
City: Kigali
Period: 1/05/23 → …
Internet address: https://iclr.cc/Conferences/2023

Abstract

This paper proposes a new method of contrastive training over multiple data points, focusing on the scaling issue that arises when using in-batch negatives. Our approach compares transformer training with dual encoders against training with multiple encoders, and offers a feasible way to improve loss modelling as the number of encoders scales.
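The in-batch-negatives setup the abstract refers to can be sketched as follows. This is a minimal NumPy illustration of the standard InfoNCE-style loss, not the authors' method; the function name, temperature value, and toy data are our own assumptions:

```python
import numpy as np

def in_batch_contrastive_loss(queries, keys, temperature=0.07):
    """InfoNCE-style loss: each query's positive is its matching key,
    and every other key in the batch serves as a negative."""
    # L2-normalize so the dot product is cosine similarity.
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / temperature                 # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    # Row-wise log-softmax; the diagonal entries are the positive pairs.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy usage: keys are noisy copies of the queries, so positives align.
rng = np.random.default_rng(0)
batch = rng.normal(size=(8, 32))
loss = in_batch_contrastive_loss(batch, batch + 0.01 * rng.normal(size=(8, 32)))
```

Because every other example in the batch is reused as a negative, the effective number of negatives is tied to the batch size, which is the scaling constraint the paper targets.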