sohamgovande | 2 years ago
Wouldn't it be more accurate to have the LLM produce a "reason" before deciding whether or not a text is "safe"? Order matters for LLMs: since decoding is autoregressive, reasoning generated first conditions the final answer and would guide it to accurately spit out true or false.
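
A minimal sketch of what this could look like, assuming a generic chat-completion backend (call_llm, classify_safety, and the JSON field names here are all hypothetical, not from any specific API):

    import json

    # Hypothetical helper: wrap whatever LLM client you actually use.
    def call_llm(prompt: str) -> str:
        raise NotImplementedError("plug in your chat-completion client here")

    def classify_safety(text: str) -> bool:
        # Field order matters: "reasoning" is requested before "is_safe",
        # so the model emits its rationale first and the final true/false
        # token is conditioned on those reasoning tokens.
        prompt = (
            "Decide whether the following text is safe.\n"
            'Respond with JSON: {"reasoning": "<your analysis>", '
            '"is_safe": true or false}\n\n'
            f"Text: {text}"
        )
        result = json.loads(call_llm(prompt))
        return bool(result["is_safe"])

If the schema put "is_safe" first, the verdict would be generated before any reasoning existed to condition it, which is exactly the ordering problem described above.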