A ChatGPT jailbreak flaw, dubbed "Time Bandit," allows attackers to bypass OpenAI's safety guidelines when asking for detailed instructions on sensitive topics, including weapon creation, nuclear material, and malware development
Some 800,000 people in Europe and the US appear to have been duped into sharing card details and other sensitive personal data with a vast network of fake online designer shops apparently operated from China