top | item 40591522

countvonbalzac | 1 year ago

Are GANs useful for synthetic data generation for transformer-based models?

rgovostes | 1 year ago

Probably. Apple published a paper back in 2017 about improving synthetic data for the purpose of training models (though not transformers).

The examples they give are for eye and hand tracking -- which not coincidentally are used for navigating the Apple Vision Pro user interface.

https://machinelearning.apple.com/research/gan

Two_hands | 1 year ago

It'd be cool to run some tests where you train a model with data and then supplement the training data with generated stuff.

HanClinto | 1 year ago

Yes, the concept is still powerful and in use today.

As I understand the RLHF method of training LLMs, it involves creating an internal "reward model" -- a secondary model trained to predict the score of an arbitrary generation. This feels very analogous to the "discriminator" half of a GAN: both critique the output produced by the other half of the network, and that score is fed back to train the primary network through positive and negative rewards.

I'm sure it's an oversimplification, but RLHF feels like GANs applied to the newest generation of LLMs -- but I rarely hear people talk about it in these terms.
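To make the analogy concrete, here's a toy 1-D sketch (everything here -- the target value, learning rate, and logistic discriminator -- is made up for illustration, not any real RLHF or GAN implementation): a single generator parameter is trained by ascending the score of a jointly-trained discriminator, the same way a policy is trained against a reward model's score in RLHF.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# "Real" data clusters around 5.0; the generator is one parameter g
# that emits g + noise.
REAL_MEAN = 5.0

def sample_real():
    return REAL_MEAN + random.gauss(0, 0.1)

def sample_fake(g):
    return g + random.gauss(0, 0.1)

# Discriminator: logistic scorer, score(x) ~ P(x is real).
# This plays the role of the critic / reward model.
def score(x, w, b):
    return sigmoid(w * x + b)

def train(steps=2000, lr=0.05):
    g, w, b = 0.0, 0.0, 0.0
    for _ in range(steps):
        real, fake = sample_real(), sample_fake(g)
        # Discriminator: gradient ascent on log D(real) + log(1 - D(fake))
        d_real, d_fake = score(real, w, b), score(fake, w, b)
        w += lr * ((1 - d_real) * real - d_fake * fake)
        b += lr * ((1 - d_real) - d_fake)
        # Generator: treat the discriminator's score as a reward and
        # ascend log D(fake) -- analogous to optimizing a policy
        # against a reward model's score in RLHF.
        fake = sample_fake(g)
        d_fake = score(fake, w, b)
        g += lr * (1 - d_fake) * w  # d/dg log D(g + noise)
    return g, w, b
```

With these toy dynamics the generator parameter drifts toward the real-data mean, then oscillates around it as the discriminator adapts -- the same adversarial push-and-pull the comment above describes.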

Two_hands | 1 year ago

I think diffusion models are useful too -- I'm currently working on a project that uses them to generate medical-type data. Both seem useful since both are targeted at generating data, especially in areas where data is hard to come by. Writing this blog made me wonder about applications in finance too.

HanClinto | 1 year ago

I agree -- I'd love to see diffusion models applied to more types of data, and especially more experiments with text generation, because a diffusion model would have an easier time looking at the "whole text" rather than suffering the myopia that can come from simple next-token prediction.

GaggiX | 1 year ago

Adversarial loss is used in many cases, like when training a VAE, and a VAE can use a transformer architecture.
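As a minimal sketch of what that looks like (the weights, the mean-squared-error reconstruction term, and the 0.1 adversarial weight are all illustrative assumptions, not from any specific paper): an adversarial term is simply added on top of the usual VAE reconstruction + KL objective, penalizing reconstructions the discriminator scores as fake.

```python
import numpy as np

# Hypothetical combined objective for a VAE trained with an extra
# adversarial loss. x is the input, x_hat the VAE's reconstruction,
# (mu, logvar) the encoder's posterior parameters, and d_fake the
# discriminator's probability that x_hat is real.
def vae_adv_loss(x, x_hat, mu, logvar, d_fake, adv_weight=0.1):
    # Standard VAE terms: reconstruction error + KL divergence to N(0, I)
    recon = np.mean((x - x_hat) ** 2)
    kl = -0.5 * np.mean(1 + logvar - mu ** 2 - np.exp(logvar))
    # Adversarial term: non-saturating generator loss, -log D(x_hat),
    # which is small when the discriminator thinks x_hat is real
    adv = -np.mean(np.log(d_fake + 1e-8))
    return recon + kl + adv_weight * adv
```

A perfect reconstruction that also fools the discriminator scores a much lower loss than a poor one the discriminator flags, which is exactly the pressure the adversarial term adds.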