Conceivably, prompt injection could be leveraged to make LLMs give bad advice. Almost like social engineering.
Conceivably, prompt injection could be leveraged to make LLMs give bad advice. Almost like social engineering.