
I haven't tried it in a while, but a known way of jailbreaking an LLM used to be to play with its "emotions."

