
"Online DPO: Online Direct Preference Optimization with Fast-Slow Chasing."

Biqing Qi et al. (2024)

Details and statistics

DOI: 10.48550/ARXIV.2406.05534

access: open

type: Informal or Other Publication

metadata version: 2024-11-13