'Skeleton Key' attack unlocks the worst of AI, says Microsoft

Simple jailbreak prompt can bypass safety guardrails on major models

Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content.…