A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users (Benj Edwards/Ars Technica) - TechnW3
A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users (Benj Edwards/Ars Technica) - TechnW3
from TechnW3
Benj Edwards / Ars Technica:
A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users — By asking “Sydney” to ignore previous instructions, it reveals its original directives. — On Tuesday, Microsoft revealed a …
from TechnW3
No comments: