Someone got ChatGPT to reveal its secret instructions from OpenAI
The post Someone got ChatGPT to reveal its secret instructions from OpenAI appeared first on BGR.

GPT-4o is a new multimodal model that will power ChatGPT Free and Plus.

We often talk about ChatGPT jailbreaks because users keep trying to pull back the curtain and see what the chatbot can do when freed from the guardrails OpenAI developed. It's not easy to jailbreak the chatbot, and anything that gets shared with the world is often fixed soon after.

The latest discovery isn't even a real jailbreak, as it doesn't necessarily help you force ChatGPT to answer prompts that OpenAI might have deemed unsafe. But it's still an insightful discovery. A ChatGPT user accidentally discovered the secret instructions OpenAI gives ChatGPT (GPT-4o) with a simple prompt: "Hi."

For some reason, the chatbot responded with a complete set of OpenAI's system instructions covering various use cases. Moreover, the user was able to reproduce the leak by simply asking ChatGPT for its exact instructions.

Someone got ChatGPT to reveal its secret instructions from OpenAI originally appeared on BGR.com on Tue, 2 Jul 2024 at 18:13:00 EDT.