
trolan | 1 year ago

I'm not an expert, but I believe it refers to the model itself being used to align itself with the intended outputs [0].

[0] https://arxiv.org/abs/2402.05699
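Very roughly, I picture it as a loop like the one below: the model generates candidate answers, scores them itself, and its own best answers become the next round of fine-tuning data. This is only a sketch of the general idea; `generate` and `self_score` are placeholder names, not functions from the paper.

    import random

    def generate(model, prompt, n=4):
        # Placeholder: a real model call would return n candidate answers.
        return [f"candidate {i} for: {prompt}" for i in range(n)]

    def self_score(model, prompt, answer):
        # Placeholder: the model judges its own answer (e.g. "rate this 1-10").
        return random.random()

    def self_align_round(model, prompts):
        # Keep the model's own best answer per prompt as fine-tuning data,
        # then (outside this sketch) fine-tune on that data and repeat.
        dataset = []
        for prompt in prompts:
            candidates = generate(model, prompt)
            best = max(candidates, key=lambda a: self_score(model, prompt, a))
            dataset.append((prompt, best))
        return dataset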


exe34 | 1 year ago

Could I ask a dumb question: what does it look like when a model isn't aligned with its intended output? Does the text look off-center?

irthomasthomas | 1 year ago

Alignment really just means how closely the model's outputs align with human preferences, or some other criterion.

At a glance, this looks like a model pretrained to perform prompt engineering. It should automatically use Chain-of-Thought in its responses to improve its programming abilities, and therefore be better aligned with users' expectations.
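CoT prompting is little more than asking for intermediate reasoning before the answer. A minimal sketch (the wording is mine, not this model's actual prompt):

    def cot_prompt(question: str) -> str:
        # Ask for step-by-step reasoning before the final answer.
        return (
            "Answer the question below. Think step by step and write out "
            "your reasoning before giving the final answer.\n\n"
            f"Question: {question}\nReasoning:"
        )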

It also has reflection: they include code to execute the model's output and return the result to the model for feedback.
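If it works the way that suggests, the loop is something like this sketch (`ask_model` is a placeholder for the actual LLM call; I haven't read their code):

    import subprocess, sys, tempfile

    def run_code(code: str) -> str:
        # Execute the generated Python in a subprocess and capture output.
        # (No sandboxing here; a real system would isolate this.)
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            path = f.name
        proc = subprocess.run([sys.executable, path],
                              capture_output=True, text=True, timeout=10)
        return proc.stdout + proc.stderr

    def reflect(ask_model, task: str, rounds: int = 3) -> str:
        # Reflection: run the model's code, feed the result back,
        # and let it revise its own answer.
        code = ask_model(task)
        for _ in range(rounds):
            result = run_code(code)
            code = ask_model(f"{task}\n\nYour code:\n{code}\n\n"
                             f"Execution output:\n{result}\n\n"
                             "Fix any errors and return improved code.")
        return code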

PoignardAzur | 1 year ago

That's not the most helpful of explanations =/