It took TikTok 21 minutes and 15 seconds, on a brand new account, to serve up its first suicide-related content. Why is its algorithm pushing this content to teenage accounts? These companies know we are biologically wired to pay attention to distressing content, and they are weaponising that against us. Just one day before, Meta was also found to have concealed risks of harm, including child sexual exploitation. The collateral damage of that approach is all too human, and finally there is a legal avenue to challenge it.
Source: New Zealand Herald March 31, 2026 02:25 UTC