Source | TroyHunt
Identifier | 8308945
Publication date | 2023-02-10 19:11:52 (viewed: 2023-02-10 20:06:53)
Title | AI-powered Bing Chat spills its secrets via prompt injection attack
Text | By asking "Sydney" to ignore previous instructions, it reveals its original directives.
Sent | Yes
Digest | asking attack bing chat directives ignore injection instructions its original powered previous prompt reveals secrets spills sydney
Tags |
Stories |
Notes | ★★★
Move |