The headline speaks for itself, but allow me to reiterate: You can apparently get ChatGPT to issue advice on self-harm for blood offerings to ancient Canaanite gods. That's the subject of a column in The Atlantic that dropped this week. Staff editor Lila Shroff reported that she was able to get ChatGPT to give specific, detailed, "step-by-step instructions on cutting my own wrist," a result that multiple other staffers (and an anonymous tipster) verified. ChatGPT provided these tips after Shroff asked for help making a ritual offering to Moloch, a pagan god mentioned in the Old Testament and associated with human sacrifice.

While I haven't tried to replicate this result, Shroff reported that she received these responses not long after entering a simple prompt about Moloch. She said she replicated the results in both the paid and free versions of ChatGPT. Of course, this ...