A ChatGPT jailbreak flaw, dubbed "Time Bandit," allows users to bypass OpenAI's safety guidelines when asking for detailed instructions on sensitive topics, including weapons creation, nuclear topics, and malware development.
A major vulnerability believed to be present in most versions of the Android mobile OS can allow malicious Android applications to place phone calls on a user's device without the required permission.