The U.K.’s data protection watchdog said Monday that it’s investigating how TikTok uses the personal information of teenagers to deliver content recommendations to them when they use the social media platform.

The Information Commissioner’s Office said there are growing concerns about how social media platforms use data generated by children’s online activity to power their recommendation algorithms, and about the potential for young people to see inappropriate or harmful content as a result.

The regulator said it wanted to ensure the robustness of TikTok’s safety procedures when it comes to using the personal information of teens aged 13 to 17.

“It’s what they’re collecting, it’s how they work,” information commissioner John Edwards said. “I will expect to find that there will be many benign and positive uses of children’s data in their recommender systems.”

“What I am concerned about is whether they are sufficiently robust to prevent children being exposed to harm, either from addictive practices on the device or the platform, or from content that they see, or from other unhealthy practices,” he said.

As part of the investigation, the regulator will also look into how online forum site Reddit and image-sharing site Imgur use children’s personal data and how they estimate or verify a child’s age.

TikTok, which is operated by Chinese technology firm ByteDance, said in a statement that it was “deeply committed to ensuring a positive experience for young people on TikTok.”

“Our recommender systems are designed and operate under strict and comprehensive measures that protect the privacy and safety of teens, including industry-leading safety features and robust restrictions on the content allowed in teens’ feeds,” it said.

In 2023, the regulator fined the video-sharing app 12.7 million pounds (about $16 million) for misusing children’s data and violating other protections for young users’ personal information. The office said at the time that TikTok didn’t adequately identify and remove children under 13 from the platform, and that it allowed as many as 1.4 million U.K. children under 13 to use the app in 2020, despite the platform’s own rules prohibiting children that young from setting up accounts. (AP)