Self-Consuming Generative Models with Curated Data Provably Optimize Human Preferences
Conference paper. Year: 2024


Abstract

The rapid progress in generative models has resulted in impressive leaps in generation quality, blurring the lines between synthetic and real data. Web-scale datasets are now prone to the inevitable contamination by synthetic data, directly impacting the training of future generative models. Already, some theoretical results on self-consuming generative models (a.k.a., iterative retraining) have emerged in the literature, showing that either model collapse or stability is possible depending on the fraction of generated data used at each retraining step. However, in practice, synthetic data is often subject to human feedback and curated by users before being used and uploaded online. For instance, many interfaces of popular text-to-image generative models, such as Stable Diffusion or Midjourney, produce several variations of an image for a given query, which can then be curated by the users. In this paper, we theoretically study the impact of data curation on iterative retraining of generative models and show that it can be seen as an implicit preference optimization mechanism. However, unlike standard preference optimization, the generative model does not have access to the reward function or to the negative samples needed for pairwise comparisons. Moreover, our study does not require access to the density function, only to samples. We prove that, if the data is curated according to a reward model, then the expected reward of the iterative retraining procedure is maximized. We further provide theoretical results on the stability of the retraining loop when a positive fraction of real data is used at each step. Finally, we conduct illustrative experiments on both synthetic datasets and on CIFAR10, showing that such a procedure amplifies biases of the reward model.
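The retraining-with-curation loop described in the abstract can be sketched in a few lines of Python. The example below is not the authors' code: it assumes a toy one-dimensional Gaussian generative model refit by maximum likelihood, a hand-written reward function, and illustrative hyper-parameters (K variations per query, a fraction lam of real data reinjected at each step), purely to make the mechanism concrete.

import numpy as np

rng = np.random.default_rng(0)

def reward(x):
    # Hypothetical reward model: prefers samples close to 2.0.
    return -(x - 2.0) ** 2

def curate(candidates):
    # For each query, keep the candidate with the highest reward among the K
    # generated variations, mimicking a user picking their favourite image.
    return np.array([c[np.argmax(reward(c))] for c in candidates])

def refit(data):
    # "Retraining" step: maximum-likelihood fit of the toy Gaussian model.
    return data.mean(), data.std() + 1e-6

# Real data (the model only ever sees samples, never the density).
real_data = rng.normal(loc=0.0, scale=1.0, size=1000)

mu, sigma = refit(real_data)   # initial model trained on real data only
K = 4                          # number of variations generated per query
lam = 0.5                      # fraction of real data injected at each step

for t in range(10):
    # Generate K candidates for each of 1000 queries, then curate them.
    candidates = rng.normal(mu, sigma, size=(1000, K))
    curated = curate(candidates)

    # Mix a positive fraction of real data with the curated synthetic data.
    n_real = int(lam * len(curated))
    mixed = np.concatenate([rng.choice(real_data, size=n_real), curated])

    mu, sigma = refit(mixed)
    avg_reward = reward(rng.normal(mu, sigma, size=5000)).mean()
    print(f"iteration {t}: mu={mu:.3f}, sigma={sigma:.3f}, expected reward ~ {avg_reward:.3f}")

In this toy setting, the fitted mean drifts from the real-data mean (0) toward the reward maximizer (2), illustrating how curation acts as an implicit preference-optimization signal, while the reinjected fraction of real data moderates the drift, in line with the stability result mentioned in the abstract.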
Main file: 2407.09499v1.pdf (19.54 MB)
Origin: files produced by the author(s)

Dates and versions

hal-04711701, version 1 (27-09-2024)

Identifiers

Cite

Damien Ferbach, Quentin Bertrand, Avishek Joey Bose, Gauthier Gidel. Self-Consuming Generative Models with Curated Data Provably Optimize Human Preferences. NeurIPS, Dec 2024, Vancouver (BC), Canada. pp. 1-27. ⟨hal-04711701⟩